What is a common problem faced by vanilla RNNs, especially when dealing with long sequences?

  • Overfitting
  • Underfitting
  • Vanishing and Exploding Gradients
  • Lack of Computational Resources
Vanilla RNNs often suffer from vanishing and exploding gradients, which hinder their ability to learn from and retain information over long sequences. During backpropagation through time, the gradient is multiplied by the recurrent Jacobian at every timestep, so over many steps it either shrinks toward zero (vanishing, so early timesteps receive almost no learning signal) or grows without bound (exploding, which destabilizes training). This is a key issue in recurrent neural networks and a major motivation for gated architectures such as LSTMs and GRUs.
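The repeated-multiplication effect can be seen with a minimal sketch: here the recurrent Jacobian is approximated by a fixed random weight matrix (an assumed, simplified setup with no nonlinearity), and the gradient norm is tracked over many backprop-through-time steps for two weight scales.

```python
import numpy as np

rng = np.random.default_rng(0)

def gradient_norm_after(T, scale, size=16):
    """Backpropagate a gradient through T timesteps of a toy linear RNN.

    Each step multiplies the gradient by the transposed recurrent weight
    matrix W, mimicking one step of backpropagation through time.
    """
    # Scale controls the approximate spectral radius of W.
    W = rng.standard_normal((size, size)) * scale / np.sqrt(size)
    grad = np.ones(size)
    for _ in range(T):
        grad = W.T @ grad  # one backprop step through time
    return np.linalg.norm(grad)

T = 50  # sequence length
small = gradient_norm_after(T, scale=0.5)  # spectral radius < 1: vanishes
large = gradient_norm_after(T, scale=2.0)  # spectral radius > 1: explodes
print(f"scale 0.5 after {T} steps: {small:.3e}")
print(f"scale 2.0 after {T} steps: {large:.3e}")
```

With a scale below 1 the gradient norm collapses to a vanishingly small value, and with a scale above 1 it blows up by many orders of magnitude, illustrating why long sequences are hard for vanilla RNNs.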