Which technique involves setting a fraction of input units to 0 at each update during training time, which helps to prevent overfitting?

  • Dropout
  • Batch Normalization
  • Data Augmentation
  • Early Stopping
Dropout sets a fraction of input units to 0 at each training update, which helps prevent overfitting by making the model more robust and reducing its reliance on any specific neurons; at inference time dropout is disabled and all units are used.
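As a minimal sketch of the idea, here is a small network using PyTorch's `nn.Dropout` (the framework, layer sizes, and dropout rate of 0.5 are illustrative assumptions, not part of the question):

```python
import torch
import torch.nn as nn

# Sketch: a small MLP with a dropout layer between the hidden and output layers.
# p=0.5 is a common default, not a prescribed value.
model = nn.Sequential(
    nn.Linear(784, 256),
    nn.ReLU(),
    nn.Dropout(p=0.5),   # zeroes 50% of activations at each training update
    nn.Linear(256, 10),
)

x = torch.randn(32, 784)  # dummy input batch

model.train()             # dropout active: random units are zeroed, the rest scaled by 1/(1-p)
train_out = model(x)

model.eval()              # dropout inactive: all units pass through unchanged
eval_out = model(x)
```

Because different units are dropped on every update, the network cannot depend on any single neuron, which is the regularizing effect the answer describes.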