Which technique helps prevent overfitting by ignoring certain neurons during training?
- Batch Normalization
- Dropout
- Gradient Clipping
- ReLU Activation Function
The technique that helps prevent overfitting by ignoring certain neurons during training is Dropout. During each training step it randomly deactivates a fraction of neurons, which prevents units from co-adapting and forces the network to learn more robust, redundant representations. At inference time all neurons are active, with activations scaled so their expected value matches training.
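The idea can be sketched as "inverted dropout", the common formulation in which survivors are scaled by 1/(1 − p) during training so no rescaling is needed at inference. The function name and signature below are illustrative, not from any particular library:

```python
import numpy as np

def dropout(activations, p=0.5, training=True, rng=None):
    """Inverted dropout sketch.

    During training, each unit is zeroed with probability p, and the
    surviving units are scaled by 1/(1 - p) so the expected activation
    is unchanged. At inference (training=False) the input passes through
    untouched.
    """
    if not training or p == 0.0:
        return activations
    rng = rng if rng is not None else np.random.default_rng()
    # Boolean mask: True = keep this unit, False = drop it.
    mask = rng.random(activations.shape) >= p
    return activations * mask / (1.0 - p)
```

With p=0.5, a kept unit's value is doubled and a dropped unit becomes zero, so the layer's expected output matches the no-dropout case, which is why no extra scaling is needed at test time.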