Which variant of RNN is specifically designed to combat the problem of vanishing and exploding gradients?

  • LSTM (Long Short-Term Memory)
  • GRU (Gated Recurrent Unit)
  • Bidirectional RNN
  • Simple RNN (Recurrent Neural Network)
Correct answer: LSTM (Long Short-Term Memory). LSTMs are a variant of RNN specifically designed to mitigate the vanishing and exploding gradient problems. They use gating mechanisms (input, forget, and output gates) together with an additive cell-state update, which lets gradient signals flow across many time steps and makes them well suited to sequences with long-term dependencies.
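To make the gating concrete, here is a minimal sketch of a single scalar LSTM cell step in plain Python. The weight layout and gate names are illustrative assumptions, not a specific library's API; the point is the additive cell update `c = f * c_prev + i * g`, which is what helps gradients survive over long sequences:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h_prev, c_prev, w):
    """One scalar LSTM time step (illustrative sketch).

    w maps each gate name to an assumed (input_weight, hidden_weight, bias) triple.
    """
    i = sigmoid(w["i"][0] * x + w["i"][1] * h_prev + w["i"][2])   # input gate
    f = sigmoid(w["f"][0] * x + w["f"][1] * h_prev + w["f"][2])   # forget gate
    o = sigmoid(w["o"][0] * x + w["o"][1] * h_prev + w["o"][2])   # output gate
    g = math.tanh(w["g"][0] * x + w["g"][1] * h_prev + w["g"][2]) # candidate update
    # Additive cell-state update: old state is scaled by f, not repeatedly
    # multiplied through a weight matrix, so gradients decay far more slowly.
    c = f * c_prev + i * g
    h = o * math.tanh(c)      # output gate controls what the cell exposes
    return h, c

# Run the cell over a short input sequence with placeholder weights.
w = {gate: (0.5, 0.5, 0.0) for gate in "ifog"}
h, c = 0.0, 0.0
for x in [1.0, -1.0, 0.5]:
    h, c = lstm_step(x, h, c, w)
```

A simple RNN, by contrast, multiplies its hidden state by the same weight matrix at every step, which is exactly what causes gradients to vanish or explode over long sequences.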