Why might one prefer to use MAE over MSE in evaluating a regression model?
- MAE considers the direction of errors
- MAE gives more weight to larger errors
- MAE is less sensitive to outliers
- MAE is more computationally expensive
One might prefer Mean Absolute Error (MAE) over Mean Squared Error (MSE) because MAE is less sensitive to outliers. MSE squares the differences between predicted and true values, so larger errors receive disproportionately more weight, whereas MAE takes the absolute value of the differences, penalizing every error in proportion to its magnitude. This makes MAE more robust when the data contain outliers, or when one does not want large deviations from the true values to dominate the evaluation.
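A minimal sketch of this effect, using NumPy and hypothetical values where the last observation is an outlier: the squared term makes the outlier dominate MSE, while MAE grows only linearly.

```python
import numpy as np

# Hypothetical true and predicted values; the last true value is an outlier.
y_true = np.array([3.0, 5.0, 2.5, 7.0, 100.0])
y_pred = np.array([2.8, 5.3, 2.9, 6.8, 10.0])

errors = y_true - y_pred

mae = np.mean(np.abs(errors))  # penalizes errors linearly
mse = np.mean(errors ** 2)     # squares errors, so the outlier dominates

print(f"MAE: {mae:.2f}")  # ~18.22
print(f"MSE: {mse:.2f}")  # ~1620.07
```

The single outlier inflates MSE by several orders of magnitude relative to MAE, which is why MAE is often the preferred metric when robustness to outliers matters.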