________ is a technique where, during training, random subsets of neurons are ignored, helping to make the model more robust.

  • Dropout
  • Regularization
  • Batch Normalization
  • Activation Function
Dropout is a regularization technique that randomly deactivates a fraction of neurons during each training step. Because the network cannot rely on any specific neuron always being present, it learns more redundant, robust representations, which helps prevent overfitting. At inference time all neurons are active. A minimal sketch is shown below.
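As a concrete illustration, here is a minimal sketch of "inverted" dropout in plain NumPy. The function name `dropout_forward`, the rate `p=0.5`, and the toy activations are illustrative choices, not part of the original question:

```python
import numpy as np

def dropout_forward(x, p=0.5, training=True):
    """Inverted dropout: zero each activation with probability p during training.

    Survivors are scaled by 1/(1-p) so the expected activation is unchanged,
    which means no rescaling is needed at inference time.
    """
    if not training or p == 0.0:
        return x  # dropout is a no-op at inference
    mask = (np.random.rand(*x.shape) >= p) / (1.0 - p)
    return x * mask

# Example: apply 50% dropout to a small batch of activations
activations = np.random.randn(4, 8)
print(dropout_forward(activations, p=0.5, training=True))
```

In practice you would use a framework's built-in layer, such as `torch.nn.Dropout` in PyTorch or `tf.keras.layers.Dropout` in Keras, which handle the training/inference switch automatically.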