In recurrent neural networks (RNNs), which variant is designed specifically to handle long-term dependencies by maintaining a cell state?
- LSTM (Long Short-Term Memory)
- GRU (Gated Recurrent Unit)
- SRU (Simple Recurrent Unit)
- ESN (Echo State Network)
Long Short-Term Memory (LSTM) is an RNN variant designed to handle long-term dependencies. It maintains a cell state, regulated by input, forget, and output gates, that can carry information across many time steps with little degradation. This ability to store and retrieve information over extended sequences makes LSTMs well suited to tasks with long-range dependencies in sequential data.
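As a sketch of how the cell state works, the standard per-timestep LSTM update can be written out in NumPy. The function and variable names here are illustrative, and an input/forget/candidate/output gate ordering is assumed:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM time step.

    x: input (D,), h_prev/c_prev: previous hidden and cell state (H,).
    W: (4H, D), U: (4H, H), b: (4H,), stacked in i/f/g/o gate order.
    """
    H = h_prev.shape[0]
    z = W @ x + U @ h_prev + b
    i = sigmoid(z[0:H])          # input gate: how much new info to write
    f = sigmoid(z[H:2 * H])      # forget gate: how much old state to keep
    g = np.tanh(z[2 * H:3 * H])  # candidate cell-state update
    o = sigmoid(z[3 * H:4 * H])  # output gate: how much state to expose
    c = f * c_prev + i * g       # cell state carries long-term information
    h = o * np.tanh(c)           # hidden state is a gated view of the cell
    return h, c

# Toy usage: run a few steps with random inputs and small random weights.
rng = np.random.default_rng(0)
D, H = 3, 4
W = rng.standard_normal((4 * H, D)) * 0.1
U = rng.standard_normal((4 * H, H)) * 0.1
b = np.zeros(4 * H)
h, c = np.zeros(H), np.zeros(H)
for t in range(5):
    x = rng.standard_normal(D)
    h, c = lstm_step(x, h, c, W, U, b)
```

Because the cell state `c` is updated additively (`f * c_prev + i * g`) rather than repeatedly squashed through a nonlinearity, gradients can flow across many time steps, which is what lets LSTMs capture long-term dependencies.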