A common problem in training deep neural networks, in which gradients become extremely small during backpropagation, is known as the _______ problem.

  • Overfitting
  • Vanishing Gradient
  • Exploding Gradient
  • Underfitting
The vanishing gradient problem is a common issue in deep neural networks, especially recurrent ones. During backpropagation, each layer's gradient is scaled by the derivative of its activation function; with saturating activations such as sigmoid (whose derivative is at most 0.25) or tanh, the repeated products shrink exponentially with depth. The early layers then receive almost no learning signal, which makes it hard for the network to learn long-range dependencies and can slow or stall training.
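To make the effect concrete, here is a minimal NumPy sketch (the layer count, width, and weight scale are illustrative assumptions, not part of the original question) that pushes a gradient backward through a deep stack of sigmoid layers and prints its norm shrinking toward zero:

```python
# Minimal sketch of vanishing gradients in a deep stack of sigmoid layers.
# All sizes and scales below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n_layers, width = 30, 64

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Forward pass, caching each layer's output activation.
weights = [rng.standard_normal((width, width)) / np.sqrt(width)
           for _ in range(n_layers)]
activations = [rng.standard_normal(width)]
for W in weights:
    activations.append(sigmoid(W @ activations[-1]))

# Backward pass: at each layer the gradient is scaled by the sigmoid
# derivative a * (1 - a) (at most 0.25), so it shrinks exponentially.
grad = np.ones(width)
for i, (W, a) in enumerate(zip(reversed(weights), reversed(activations[1:]))):
    grad = W.T @ (grad * a * (1.0 - a))
    if i % 5 == 4:
        print(f"{i + 1} layers back: |grad| = {np.linalg.norm(grad):.3e}")
```

Running this typically shows the gradient norm dropping by several orders of magnitude every few layers, which is why remedies such as ReLU activations, residual connections, and gated architectures like LSTMs are commonly used.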