You need to improve the performance of a weak learner. Which boosting algorithm would you select, and why?

  • AdaBoost
  • Any boosting algorithm will suffice
  • Gradient Boosting without considering the loss function
  • Random Boosting
AdaBoost (Adaptive Boosting) is a boosting algorithm designed specifically to improve the performance of weak learners. It increases the weights of misclassified instances so that each subsequent weak learner concentrates on the examples its predecessors got wrong, and the final model combines all the weak learners in a weighted vote. The other options are not sound choices: "Random Boosting" is not a standard algorithm, gradient boosting requires choosing an appropriate loss function, and not every boosting setup will perform equally well.