Which technique helps prevent overfitting by ignoring certain neurons during training?

  • Batch Normalization
  • Dropout
  • Gradient Clipping
  • ReLU Activation Function
The technique that helps prevent overfitting by ignoring certain neurons during training is 'Dropout.' On each training forward pass it randomly deactivates (zeroes out) a fraction of neurons, preventing units from co-adapting and forcing the network to learn more robust, redundant representations. At inference time dropout is disabled and all neurons are used.
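
For intuition, here is a minimal sketch of dropout in a small feed-forward network, written with PyTorch. The layer sizes, batch shape, and the p=0.5 drop probability are illustrative assumptions, not part of the question.

```python
import torch
import torch.nn as nn

# A small classifier with a dropout layer between the hidden and output layers.
# All dimensions here (784 -> 256 -> 10) are arbitrary example choices.
model = nn.Sequential(
    nn.Linear(784, 256),
    nn.ReLU(),
    nn.Dropout(p=0.5),   # during training, each hidden unit is zeroed with probability 0.5
    nn.Linear(256, 10),
)

x = torch.randn(32, 784)  # dummy batch of 32 inputs

model.train()             # dropout active: a different random mask is applied each forward pass
train_out = model(x)

model.eval()              # dropout disabled: all units participate, output is deterministic
with torch.no_grad():
    eval_out = model(x)
```

Note that `nn.Dropout` uses "inverted" dropout: surviving activations are scaled by 1/(1-p) during training, so no extra scaling is needed at evaluation time.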