What are the limitations of using R-Squared as the sole metric for evaluating the goodness of fit in a regression model?
- R-Squared always increases with more predictors; doesn't account for bias
- R-Squared always increases with more predictors; doesn't penalize complexity in the model
- R-Squared is sensitive to outliers; doesn't consider the number of predictors
- R-Squared provides absolute error values; not suitable for non-linear models
One major limitation of R-Squared is that it never decreases when more predictors are added, regardless of whether those predictors are relevant. This can lead to overly complex models that fail to generalize. Because R-Squared does not penalize model complexity, an overfitted model can still achieve a high R-Squared value. Complexity-aware alternatives such as adjusted R-Squared, AIC, or BIC are therefore often reported alongside it, and R-Squared alone is rarely the best metric for assessing goodness of fit.
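This behavior is easy to demonstrate numerically. The sketch below (a minimal illustration, assuming a simulated dataset where the response depends on only one predictor) fits ordinary least squares with and without an irrelevant noise predictor and shows that R-Squared does not go down when the noise column is added:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: y depends only on x1; x2 is pure noise.
n = 100
x1 = rng.normal(size=n)
y = 2.0 * x1 + rng.normal(size=n)
x2 = rng.normal(size=n)  # irrelevant predictor

def r_squared(X, y):
    """R^2 from an OLS fit (an intercept column is added here)."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    ss_res = resid @ resid
    ss_tot = ((y - y.mean()) ** 2).sum()
    return 1.0 - ss_res / ss_tot

r2_relevant = r_squared(x1.reshape(-1, 1), y)
r2_with_noise = r_squared(np.column_stack([x1, x2]), y)

print(f"R^2, relevant predictor only:   {r2_relevant:.4f}")
print(f"R^2 after adding noise column:  {r2_with_noise:.4f}")

# Adding any predictor can only leave the residual sum of squares
# the same or smaller, so R^2 never decreases.
assert r2_with_noise >= r2_relevant
```

The comparison relies only on NumPy's least-squares solver; the same effect appears with any OLS implementation, which is why penalized metrics like adjusted R-Squared exist.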