Explain the concept of regularization in Machine Learning. What are some common techniques?
- Increasing complexity, Gradient Boosting
- Increasing complexity, L1/L2
- Reducing complexity, Gradient Descent
- Reducing complexity, L1/L2
Regularization is a technique for reducing overfitting (i.e., reducing model complexity) by adding a penalty term to the loss function. Common techniques include L1 (lasso) regularization, which can drive some coefficients to exactly zero, and L2 (ridge) regularization, which shrinks all coefficients toward zero; both penalize large coefficients in a model.
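The shrinking effect of an L2 penalty can be seen in a minimal sketch, assuming NumPy is available; the toy dataset and the `lam` value are illustrative, not from any particular source:

```python
import numpy as np

# Hypothetical toy data: 5 noisy features, only the first truly matters.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 5))
y = 3.0 * X[:, 0] + rng.normal(scale=0.1, size=50)

def ridge_fit(X, y, lam):
    """Closed-form ridge (L2) regression: minimizes ||y - Xw||^2 + lam * ||w||^2."""
    n_features = X.shape[1]
    # Adding lam to the diagonal penalizes large weights.
    return np.linalg.solve(X.T @ X + lam * np.eye(n_features), X.T @ y)

w_ols = ridge_fit(X, y, lam=0.0)     # ordinary least squares (no penalty)
w_ridge = ridge_fit(X, y, lam=10.0)  # L2 penalty shrinks the coefficients

# The penalized solution has a smaller coefficient norm than the unpenalized one.
print(np.linalg.norm(w_ridge) < np.linalg.norm(w_ols))  # True
```

Increasing `lam` shrinks the coefficient vector further; an L1 penalty behaves similarly but tends to zero out individual coefficients entirely, which is why lasso is also used for feature selection.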