Edited, memorised or added to reading queue on 22-Aug-2025 (Fri)

Flashcard 7739306806540

Tags
#feature-engineering #lstm #recurrent-neural-networks #rnn
Question
The LSTM network forms a chain of repeating modules, like any RNN, but the modules, apart from the [...] recurrent function of the RNN, possess an internal recurrence (or self-loop), which lets the gradients flow for long durations without exploding or vanishing.
Answer
external

status: not learned
measured difficulty: 37% [default]
last interval [days]:
repetition number in this series: 0
memorised on:
scheduled repetition:
scheduled repetition interval:
last repetition or drill:

Parent (intermediate) annotation

The LSTM network forms a chain of repeating modules, like any RNN, but the modules, apart from the external recurrent function of the RNN, possess an internal recurrence (or self-loop), which lets the gradients flow for long durations without exploding or vanishing.
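
A minimal NumPy sketch (not from the source) of a single LSTM step may make the distinction concrete: the previous hidden state h_prev drives the external recurrence shared with any RNN, while the cell state c_prev is carried forward through the internal self-loop, updated additively under gate control. The function and parameter names (lstm_step, W, U, b) are illustrative assumptions, not taken from the original document.

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def lstm_step(x_t, h_prev, c_prev, W, U, b):
        # External recurrence: the previous hidden state h_prev re-enters
        # the module exactly as in a plain RNN.
        z = W @ x_t + U @ h_prev + b
        i, f, o, g = np.split(z, 4)          # input, forget, output gates and candidate
        i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
        g = np.tanh(g)
        # Internal recurrence (self-loop): the cell state is updated additively
        # and gated by f, so the gradient along c can flow over long durations
        # without being repeatedly squashed (vanishing) or amplified (exploding).
        c_t = f * c_prev + i * g
        h_t = o * np.tanh(c_t)
        return h_t, c_t

    # Illustrative usage with assumed sizes: hidden size H, input size D.
    H, D = 4, 3
    rng = np.random.default_rng(0)
    W = rng.normal(size=(4 * H, D))
    U = rng.normal(size=(4 * H, H))
    b = np.zeros(4 * H)
    h, c = np.zeros(H), np.zeros(H)
    for x in rng.normal(size=(5, D)):        # unroll over a short input sequence
        h, c = lstm_step(x, h, c, W, U, b)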
