In a task involving the classification of hand-written digits, the model is failing to capture intricate patterns in the data. Adding more layers seems to exacerbate the problem due to a certain issue in training deep networks. What is this issue likely called?

  • Overfitting
  • Vanishing Gradient
  • Underfitting
  • Exploding Gradient
The issue is called the "Vanishing Gradient" problem. During backpropagation, the gradient at each layer is a product of per-layer derivative terms; with many layers, repeated multiplication by small derivatives (for example, the sigmoid's derivative never exceeds 0.25) drives the gradient toward zero. Early layers then receive almost no learning signal, which is why adding depth can make training worse and the network fails to capture intricate patterns in the data.
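The shrinking effect can be seen numerically. The sketch below (a hypothetical illustration, not a real training run) multiplies sigmoid derivatives at randomly chosen pre-activation values across 20 layers and shows how small the resulting gradient factor becomes:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_deriv(x):
    s = sigmoid(x)
    return s * (1.0 - s)  # peaks at 0.25 when x = 0

# During backpropagation, the gradient reaching an early layer includes a
# product of one derivative term per layer. With sigmoid activations each
# term is at most 0.25, so a 20-layer product is at most 0.25 ** 20.
np.random.seed(0)  # arbitrary seed for reproducibility
grad = 1.0
for layer in range(20):  # hypothetical 20-layer network
    pre_activation = np.random.randn()
    grad *= sigmoid_deriv(pre_activation)

print(f"gradient factor after 20 layers: {grad:.3e}")
```

Because every factor is at most 0.25, the product after 20 layers is below 10⁻¹², effectively zero in 32-bit floating point, which is why ReLU activations, residual connections, and careful initialization are the usual remedies.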