Which technique involves setting a fraction of input units to 0 at each update during training time, which helps to prevent overfitting?
- Dropout
- Batch Normalization
- Data Augmentation
- Early Stopping
Dropout involves randomly setting a fraction of input units to 0 at each update during training, which helps prevent overfitting by making the model more robust and reducing its reliance on any specific neurons.
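For illustration, here is a minimal sketch of dropout in use, assuming PyTorch; the layer sizes and dropout rate are arbitrary placeholders, not values from the quiz:

```python
import torch
import torch.nn as nn

# A small network with a dropout layer between two linear layers.
model = nn.Sequential(
    nn.Linear(784, 256),
    nn.ReLU(),
    nn.Dropout(p=0.5),   # zeroes a random 50% of activations on each forward pass in training mode
    nn.Linear(256, 10),
)

x = torch.randn(32, 784)

model.train()            # dropout active: a random subset of units is dropped each update
out_train = model(x)

model.eval()             # dropout disabled at inference: all units are kept
out_eval = model(x)
```

Note that dropout is only applied while the model is in training mode; in evaluation mode the layer passes activations through unchanged, which is why the train/eval switch matters.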
Related Quizzes
- Which RNN architecture is more computationally efficient but might not capture all the intricate patterns that its counterpart can: LSTM or GRU?
- The multi-armed bandit problem is a classic problem in which domain?
- What is the primary purpose of a neural network in machine learning?
- Why might one choose to use a deeper neural network architecture over a shallower one, given the increased computational requirements?
- The value at which the sigmoid function outputs a 0.5 probability, thereby determining the decision boundary in logistic regression, is known as the ________.