In Gradient Boosting, what is adjusted at each step to minimize the residual errors?
- Learning rate
- Number of trees
- Feature importance
- Maximum depth of trees
In Gradient Boosting, the learning rate is what is adjusted to minimize residual errors: it scales the contribution that each new tree makes at every boosting step. A smaller learning rate makes the model learn more slowly, which often improves generalization and reduces the risk of overfitting.
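The role of the learning rate in the boosting loop can be sketched in a minimal, from-scratch example. This is an illustrative toy (decision stumps on a 1-D feature, squared loss), not a production implementation; real libraries such as scikit-learn use full regression trees, but the shrinkage mechanism is the same.

```python
# Minimal gradient-boosting sketch with decision stumps (squared loss).
# Each round fits a stump to the current residuals; learning_rate shrinks
# its contribution before it is added to the running prediction.

def fit_stump(x, residuals):
    """Find the single threshold split that minimizes squared error."""
    best = None
    xs = sorted(set(x))
    for i in range(len(xs) - 1):
        t = (xs[i] + xs[i + 1]) / 2
        left = [r for xi, r in zip(x, residuals) if xi <= t]
        right = [r for xi, r in zip(x, residuals) if xi > t]
        lmean = sum(left) / len(left)
        rmean = sum(right) / len(right)
        sse = (sum((r - lmean) ** 2 for r in left)
               + sum((r - rmean) ** 2 for r in right))
        if best is None or sse < best[0]:
            best = (sse, t, lmean, rmean)
    _, t, lmean, rmean = best
    return lambda xi: lmean if xi <= t else rmean

def gradient_boost(x, y, n_rounds=50, learning_rate=0.1):
    base = sum(y) / len(y)          # initial prediction: the mean
    pred = [base] * len(y)
    stumps = []
    for _ in range(n_rounds):
        residuals = [yi - pi for yi, pi in zip(y, pred)]
        stump = fit_stump(x, residuals)
        stumps.append(stump)
        # Shrink each stump's contribution by the learning rate.
        pred = [pi + learning_rate * stump(xi) for pi, xi in zip(pred, x)]
    return lambda xi: base + sum(learning_rate * s(xi) for s in stumps)

x = [1, 2, 3, 4, 5, 6]
y = [1.0, 1.2, 1.1, 3.0, 3.2, 3.1]
model = gradient_boost(x, y)
print(model(2), model(5))  # predictions near the two group means
```

With a smaller `learning_rate` you would need more rounds to fit the data, but each individual stump matters less, which is why shrinkage tends to reduce overfitting.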