What are the disadvantages of using backward elimination in feature selection?

  • It assumes a linear relationship
  • It can be computationally expensive
  • It can result in overfitting
  • It's sensitive to outliers
Backward elimination in feature selection starts with all candidate variables and removes the least significant one at each step. Because every elimination round requires refitting the model for each remaining feature, the process can be computationally expensive, especially on datasets with a large number of features. It also inherits the other drawbacks listed above: the significance tests it relies on typically assume a linear relationship, outliers can distort which feature appears least significant, and repeatedly selecting features on the same data can lead to overfitting.
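As a rough sketch, backward elimination can be run with scikit-learn's `SequentialFeatureSelector` using `direction="backward"`. The dataset below is synthetic and the choice of 10 features with 4 informative ones is an assumption for illustration; note how each round refits the model once per remaining feature, which is where the computational cost comes from.

```python
from sklearn.datasets import make_regression
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.linear_model import LinearRegression

# Synthetic data (illustrative assumption): 10 features, 4 informative
X, y = make_regression(n_samples=200, n_features=10,
                       n_informative=4, random_state=0)

# Backward elimination: start with all 10 features and drop the
# least useful one at a time until 4 remain. Each step cross-validates
# a refit for every remaining candidate, hence the expense.
selector = SequentialFeatureSelector(
    LinearRegression(),
    n_features_to_select=4,
    direction="backward",
    cv=5,
)
selector.fit(X, y)

# Boolean mask marking which of the 10 features were kept
print(selector.get_support())
```

Because the base estimator here is a linear regression, the selection inherits its linearity assumption, which is one of the disadvantages noted above.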