To avoid overfitting in large neural networks, one might employ a technique known as ________, which involves dropping out random neurons during training.

  • Batch Normalization
  • L2 Regularization
  • Gradient Descent
  • Dropout
The 'Dropout' technique randomly deactivates a fraction of a layer's neurons at each training step, which discourages co-adaptation between units and helps prevent overfitting in large neural networks.
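The idea above can be sketched in a few lines of NumPy. This is a minimal illustration of "inverted" dropout (the variant most frameworks use); the function name and parameters here are illustrative, not from any particular library:

```python
import numpy as np

def dropout(activations, p=0.5, training=True, rng=None):
    """Inverted dropout: zero each unit with probability p during training.

    Surviving units are scaled by 1/(1-p), so at inference time
    (training=False) activations pass through unchanged.
    """
    if not training or p == 0.0:
        return activations
    rng = rng or np.random.default_rng()
    mask = rng.random(activations.shape) >= p   # keep each unit with prob 1-p
    return activations * mask / (1.0 - p)       # rescale to preserve the mean

# During training, roughly half the activations are zeroed (p=0.5);
# the rest are scaled up to 2.0 so the expected value stays 1.0.
x = np.ones((4, 8))
y = dropout(x, p=0.5, training=True)
```

Because a fresh random mask is drawn at every step, the network cannot rely on any single neuron, which is the mechanism behind the regularizing effect.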