In the context of Decision Trees, how can overfitting be controlled using pruning techniques?
- By increasing the number of features
- By increasing the tree complexity
- By reducing the training data
- By reducing the tree complexity
Overfitting in Decision Trees can be controlled through pruning, which reduces the tree's complexity. By removing branches that add little predictive power, the model becomes less sensitive to noise in the training data and generalizes better to unseen examples. A sketch of this idea in code follows below.
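As a minimal sketch of how pruning works in practice, the snippet below uses scikit-learn's cost-complexity (post-)pruning via the `ccp_alpha` parameter; the dataset and the particular alpha chosen are illustrative assumptions, not part of the original question.

```python
# Minimal sketch: post-pruning a decision tree with cost-complexity pruning.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Unpruned tree: grown to full depth, so it tends to memorize noise.
full_tree = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)

# Candidate pruning strengths (alphas) computed from the training data;
# picking one mid-range alpha here purely for illustration.
path = full_tree.cost_complexity_pruning_path(X_train, y_train)
alpha = path.ccp_alphas[len(path.ccp_alphas) // 2]

# Pruned tree: branches whose contribution does not justify their
# complexity cost are collapsed, yielding a simpler model.
pruned_tree = DecisionTreeClassifier(random_state=0, ccp_alpha=alpha).fit(X_train, y_train)

print("Unpruned:", full_tree.get_n_leaves(), "leaves, test accuracy =",
      round(full_tree.score(X_test, y_test), 3))
print("Pruned:  ", pruned_tree.get_n_leaves(), "leaves, test accuracy =",
      round(pruned_tree.score(X_test, y_test), 3))
```

In practice the pruning strength would be chosen by cross-validation rather than picked arbitrarily; pre-pruning parameters such as `max_depth` or `min_samples_leaf` achieve a similar effect by limiting complexity while the tree is being grown.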