What are the disadvantages of using backward elimination in feature selection?
- It assumes a linear relationship
- It can be computationally expensive
- It can result in overfitting
- It's sensitive to outliers
Backward elimination starts with all features in the model and iteratively removes the least statistically significant one until a stopping criterion is met. Because the model must be refit after every removal, the process can be computationally expensive, especially on datasets with a large number of features.
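A minimal sketch of the procedure, assuming an ordinary least-squares model and a t-statistic threshold as the significance criterion (the helper name `backward_eliminate` and the threshold value are illustrative, not from any particular library):

```python
import numpy as np

def backward_eliminate(X, y, t_threshold=2.0):
    """Backward elimination: start with all features, repeatedly drop
    the one with the smallest absolute t-statistic until every
    remaining feature's |t| meets t_threshold."""
    keep = list(range(X.shape[1]))
    while len(keep) > 1:
        Xs = X[:, keep]
        # refit OLS on the surviving features (this refit at every
        # step is the source of the computational cost)
        beta, *_ = np.linalg.lstsq(Xs, y, rcond=None)
        resid = y - Xs @ beta
        dof = Xs.shape[0] - Xs.shape[1]
        sigma2 = resid @ resid / dof
        # standard errors from the diagonal of (X'X)^-1
        se = np.sqrt(sigma2 * np.diag(np.linalg.inv(Xs.T @ Xs)))
        t = np.abs(beta / se)
        worst = int(np.argmin(t))
        if t[worst] >= t_threshold:
            break  # all remaining features are significant
        keep.pop(worst)  # drop the least significant feature
    return keep

# Toy data: y depends on columns 0 and 1; column 2 is pure noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + rng.normal(scale=0.5, size=200)
selected = backward_eliminate(X, y)
print(selected)  # the noise column is typically dropped
```

Note that the OLS fit underlying this sketch also illustrates the quiz's other points: the t-test criterion assumes a linear relationship, and repeatedly testing on the same data can overfit the selection to noise.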