A type of regularization technique that adds a penalty to the loss function for large weights is called _______.
- Activation Function
- Dropout
- Gradient Descent
- L1 Regularization
The correct answer is 'L1 Regularization.' It adds a penalty proportional to the sum of the absolute values of the weights to the loss function. Because this penalty drives many weights toward zero, it encourages the model to rely on only the most important features and helps prevent overfitting.
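As a minimal sketch of the idea, the snippet below adds an L1 penalty to a plain mean-squared-error loss. All names here (`X`, `y`, `w`, `lam`) are illustrative, not from the quiz itself:

```python
import numpy as np

def l1_penalized_loss(X, y, w, lam):
    """Mean squared error plus an L1 penalty on the weights."""
    mse = np.mean((X @ w - y) ** 2)     # data-fit term
    penalty = lam * np.sum(np.abs(w))   # L1 term: lam * sum of |w_i|
    return mse + penalty

# Illustrative data: the penalty grows with the absolute size of the
# weights, so minimizing this loss pushes weights toward zero (sparsity).
X = np.array([[1.0, 2.0], [3.0, 4.0]])
y = np.array([1.0, 2.0])
w = np.array([0.5, -0.5])

loss = l1_penalized_loss(X, y, w, lam=0.1)
```

Setting `lam=0` recovers the unpenalized loss; larger values of `lam` trade data fit for smaller, sparser weights, which is exactly the overfitting control the answer describes.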