To prevent overfitting in neural networks, the _______ technique can be used, which involves dropping out random neurons during training.

  • Normalization
  • L1 Regularization
  • Dropout
  • Batch Normalization

The correct answer is "Dropout." During training, dropout randomly deactivates a fraction of neurons at each step, which prevents the network from relying too heavily on any individual neuron and improves generalization to unseen data. At inference time, dropout is disabled and all neurons are active.
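A minimal NumPy sketch of the common "inverted dropout" variant (the function name and parameters are illustrative, not from any particular library):

```python
import numpy as np

def dropout(x, p=0.5, training=True, rng=None):
    """Inverted dropout: zero each activation with probability p,
    then scale survivors by 1/(1-p) so the expected value is unchanged."""
    if not training or p == 0.0:
        return x  # dropout is a no-op at inference time
    rng = rng or np.random.default_rng()
    mask = rng.random(x.shape) >= p  # keep each unit with probability 1-p
    return x * mask / (1.0 - p)

# Example: kept units are scaled up to 2.0 when p = 0.5
activations = np.ones(8)
print(dropout(activations, p=0.5))
```

Scaling by 1/(1-p) during training (rather than scaling down at inference) is the convention used by most modern frameworks, since it leaves the inference path untouched.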