Explain the Bias-Variance tradeoff in the context of Cross-Validation.

  • Increasing k decreases bias but may increase variance
  • Increasing k decreases both bias and variance
  • Increasing k increases bias but decreases variance
  • Increasing k increases both bias and variance
The correct option is the first: increasing k decreases bias but may increase variance. In k-fold Cross-Validation, the Bias-Variance tradeoff refers to the balance between bias (systematic error in the performance estimate, caused here by each model being trained on less data than the full set) and variance (variability of the estimate across different data splits). Increasing k decreases bias because each fold's model is trained on a larger fraction of the data, so its performance is closer to that of a model trained on the full set. However, it tends to increase variance: the k training folds overlap heavily, so the per-fold estimates are highly correlated, and each validation fold is small, making the individual estimates noisier.
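As a rough illustration, here is a minimal sketch (assuming scikit-learn, a synthetic regression dataset, and a Ridge model purely as placeholders) that compares how the spread of per-fold scores changes as k grows:

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge
from sklearn.model_selection import KFold, cross_val_score

# Synthetic data and a simple model, chosen only for illustration.
X, y = make_regression(n_samples=200, n_features=10, noise=10.0, random_state=0)
model = Ridge(alpha=1.0)

for k in (2, 5, 10, 50):
    cv = KFold(n_splits=k, shuffle=True, random_state=0)
    scores = cross_val_score(model, X, y, cv=cv, scoring="neg_mean_squared_error")
    # Larger k: each model trains on more data (lower bias in the estimate),
    # but the folds' training sets overlap heavily and the validation folds
    # shrink, so the spread (std) of the per-fold scores tends to grow.
    print(f"k={k:2d}  mean MSE={-scores.mean():8.2f}  std={scores.std():8.2f}")
```

The exact numbers depend on the dataset and model, but the general pattern is that the standard deviation of the per-fold scores rises with k while the mean estimate becomes less pessimistic.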