What are the potential drawbacks of using k-fold Cross-Validation?

  • Higher bias and low variance
  • Increase in computation time and potential leakage of validation into training
  • Lack of statistical estimation properties
  • No drawbacks
k-fold Cross-Validation increases computational time because the model is trained k times, once on each combination of k−1 folds. In addition, improper implementation can leak information from the validation fold into training, for example when preprocessing steps (scaling, feature selection) are fit on the full dataset before splitting. k-fold Cross-Validation generally yields a less biased estimate of generalization performance than a single train/test split, but at the cost of this extra computation.
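As a minimal sketch of why the cost grows with k, the splitting logic below (a hypothetical `kfold_splits` helper, not from any specific library) yields k disjoint train/validation index pairs, so a model would be fit once per fold:

```python
def kfold_splits(n_samples, k):
    """Yield (train_idx, val_idx) index pairs for k-fold cross-validation."""
    indices = list(range(n_samples))
    # Distribute samples as evenly as possible across the k folds.
    fold_sizes = [n_samples // k + (1 if i < n_samples % k else 0) for i in range(k)]
    start = 0
    for size in fold_sizes:
        val = indices[start:start + size]
        train = indices[:start] + indices[start + size:]
        yield train, val
        start += size

fit_count = 0
for train_idx, val_idx in kfold_splits(100, 5):
    # Stand-in for model.fit(...); to avoid leakage, any preprocessing
    # (scaling, feature selection) must also be fit on train_idx only.
    fit_count += 1

print(fit_count)  # 5 trainings for 5 folds
```

Because each of the k iterations retrains from scratch, total training cost is roughly k times that of a single train/test split, which is the computational drawback the answer refers to.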