You have built an SVM for a binary classification problem but the model is overfitting. What changes can you make to the kernel or hyperparameters to improve the model?
- Change the kernel's color
- Change to a simpler kernel or adjust the regularization parameter 'C'
- Ignore overfitting
- Increase the kernel's complexity
Overfitting can be mitigated by switching to a simpler kernel (for example, linear instead of RBF) or by lowering the regularization parameter 'C' to strengthen regularization, striking a better balance between bias and variance.
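A minimal sketch of this idea, assuming scikit-learn and a synthetic dataset (the kernel choices and C values are illustrative, not prescriptive):

```python
# Compare a flexible, weakly regularized RBF SVM with a simpler, more
# regularized configuration (linear kernel, smaller C).
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = make_classification(n_samples=500, n_features=20, random_state=0)

# Flexible kernel with weak regularization (large C) -- prone to overfitting.
flexible = SVC(kernel="rbf", C=100.0, gamma="scale")

# Simpler kernel with stronger regularization (small C).
regularized = SVC(kernel="linear", C=0.1)

for name, model in [("rbf, C=100", flexible), ("linear, C=0.1", regularized)]:
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: mean CV accuracy = {scores.mean():.3f}")
```

Cross-validated accuracy, rather than training accuracy, is what reveals whether the simpler configuration actually generalizes better.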
How does DBSCAN handle outliers compared to other clustering algorithms?
- Considers them as part of existing clusters
- Ignores them completely
- Treats more isolated points as noise
- Treats them as individual clusters
DBSCAN handles outliers by labeling points that fall in low-density regions (too few neighbors within eps) as noise, rather than forcing them into existing clusters or spinning up new single-point clusters. This lets it identify clusters of varying shapes and sizes while ignoring sparse, irrelevant points, making it more robust to noise and outliers than many other clustering methods.
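A small sketch, assuming scikit-learn and synthetic blobs, showing that DBSCAN marks noise points with the label -1 (eps and min_samples values here are illustrative):

```python
# DBSCAN assigns the label -1 to low-density points instead of forcing them
# into a cluster.
import numpy as np
from sklearn.cluster import DBSCAN
from sklearn.datasets import make_blobs

X, _ = make_blobs(n_samples=300, centers=3, cluster_std=0.6, random_state=0)
X = np.vstack([X, [[20, 20], [-20, 20]]])  # add two obvious outliers

labels = DBSCAN(eps=0.8, min_samples=5).fit_predict(X)
print("cluster labels:", set(labels))           # -1 denotes noise
print("points flagged as noise:", (labels == -1).sum())
```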
What could be the potential problems if the assumptions of Simple Linear Regression are not met?
- Model May Become Biased or Inefficient
- Model May Overfit
- Model Will Always Fail
- No Impact on Model
If the assumptions of Simple Linear Regression are not met, the model may become biased or inefficient, leading to unreliable estimates. It may also affect the validity of statistical tests.
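An illustrative sketch (NumPy only, with hypothetical synthetic data) of how violating the constant-variance assumption shows up in the residuals:

```python
# Quick residual check for constant variance, one of the Simple Linear
# Regression assumptions.
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 200)
y = 3 * x + rng.normal(scale=x, size=200)   # noise grows with x (heteroscedastic)

slope, intercept = np.polyfit(x, y, deg=1)
residuals = y - (slope * x + intercept)

# If residual spread changes systematically with x, the constant-variance
# assumption is violated and the usual standard errors become unreliable.
low, high = residuals[x < 5], residuals[x >= 5]
print("residual std (x < 5): ", low.std().round(2))
print("residual std (x >= 5):", high.std().round(2))
```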
Ridge regularization adds a ________ penalty to the loss function, which helps to constrain the coefficients.
- L1
- L1 and L2
- L2
Ridge regularization adds an L2 penalty to the loss function, which helps to reduce the coefficients' magnitude without setting them to zero.
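A brief sketch, assuming scikit-learn and synthetic regression data, comparing ordinary least squares with Ridge (the alpha value is illustrative):

```python
# Ridge minimizes ||y - Xw||^2 + alpha * ||w||^2; the L2 term shrinks the
# coefficients' magnitude without setting them to zero.
from sklearn.datasets import make_regression
from sklearn.linear_model import LinearRegression, Ridge

X, y = make_regression(n_samples=100, n_features=10, noise=10.0, random_state=0)

ols = LinearRegression().fit(X, y)
ridge = Ridge(alpha=10.0).fit(X, y)

print("OLS   max |coef|:", abs(ols.coef_).max().round(2))
print("Ridge max |coef|:", abs(ridge.coef_).max().round(2))  # smaller, but nonzero
```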
Imagine you are working with a large dataset, and the Elbow Method is computationally expensive. What alternative methods might you consider for determining the number of clusters?
- Double the number of centroids
- Gap Statistic, Silhouette Method
- Randomly choose the number of clusters
- Use the Elbow Method with reduced data
Alternatives such as the Gap Statistic and the Silhouette Method can determine a suitable number of clusters when the Elbow Method is computationally expensive. Both evaluate cluster cohesion and separation, and they can be applied to a subsample of the data to keep the cost manageable on large datasets.
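A sketch of the Silhouette Method, assuming scikit-learn; the subsample size and range of k values are arbitrary choices for illustration:

```python
# Choose k by silhouette score, computed on a subsample to limit the
# roughly O(n^2) cost of the silhouette calculation.
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs
from sklearn.metrics import silhouette_score

X, _ = make_blobs(n_samples=2000, centers=4, random_state=0)
sample = X[:500]   # score on a subsample of the data

for k in range(2, 7):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(sample)
    print(f"k={k}: silhouette = {silhouette_score(sample, labels):.3f}")
```

The k with the highest silhouette score indicates the clustering with the best balance of cohesion and separation.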
While R-Squared describes the proportion of variance explained by the model, ________ adjusts this value based on the number of predictors, providing a more nuanced understanding of the model's fit.
- Adjusted R-Squared
- MSE
- R-Squared
- RMSE
Adjusted R-Squared extends R-Squared by penalizing the value for the number of predictors in the model. Because plain R-Squared never decreases when predictors are added, the adjusted version gives a more nuanced picture of the model's fit, particularly when comparing models with different numbers of predictors.
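A short sketch of the adjustment, assuming scikit-learn for the fit; n is the number of observations and p the number of predictors:

```python
# Adjusted R^2 = 1 - (1 - R^2) * (n - 1) / (n - p - 1)
from sklearn.datasets import make_regression
from sklearn.linear_model import LinearRegression

X, y = make_regression(n_samples=200, n_features=5, noise=15.0, random_state=0)
n, p = X.shape

r2 = LinearRegression().fit(X, y).score(X, y)
adj_r2 = 1 - (1 - r2) * (n - 1) / (n - p - 1)

print(f"R^2 = {r2:.3f}, Adjusted R^2 = {adj_r2:.3f}")
```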
You are working on a binary classification problem, and your model is consistently predicting the majority class. What could be causing this issue and how would you approach resolving it?
- Data is corrupted; clean the data
- Ignoring the minority class; use resampling techniques
- Incorrect algorithm; change algorithm
- Too many features; perform feature selection
The issue is typically caused by imbalanced classes. Resampling techniques, such as oversampling the minority class or undersampling the majority class, help balance the training data and improve the model's performance on the minority class.
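One way to sketch the oversampling step, assuming scikit-learn's `resample` utility and hypothetical imbalanced data:

```python
# Oversample the minority class with replacement so both classes are equally
# represented in the training set.
import numpy as np
from sklearn.utils import resample

X = np.random.default_rng(0).normal(size=(1000, 4))
y = np.array([0] * 950 + [1] * 50)          # heavily imbalanced labels

X_min, y_min = X[y == 1], y[y == 1]
X_maj, y_maj = X[y == 0], y[y == 0]

# Sample the minority class with replacement up to the majority-class size.
X_up, y_up = resample(X_min, y_min, replace=True,
                      n_samples=len(y_maj), random_state=0)

X_bal = np.vstack([X_maj, X_up])
y_bal = np.concatenate([y_maj, y_up])
print("class counts after oversampling:", np.bincount(y_bal))
```

Undersampling the majority class works the same way with `replace=False` and a smaller `n_samples`.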
Increasing the regularization parameter in Ridge regression will ________ the coefficients but will not set them to zero.
- Decrease
- Increase
- Maintain
Increasing the regularization parameter in Ridge regression will shrink the coefficients towards zero but will not set them to zero, due to the L2 penalty.
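A sketch of this shrinkage behavior, assuming scikit-learn; the alpha values are arbitrary and only meant to show the trend:

```python
# As alpha grows, Ridge coefficients shrink towards zero but stay nonzero,
# unlike the L1 (Lasso) penalty, which can zero them out entirely.
from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge

X, y = make_regression(n_samples=100, n_features=5, noise=5.0, random_state=0)

for alpha in (0.1, 10.0, 1000.0):
    coef = Ridge(alpha=alpha).fit(X, y).coef_
    print(f"alpha={alpha:>7}: max |coef| = {abs(coef).max():.3f}, "
          f"zero coefficients = {(coef == 0).sum()}")
```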
Balancing the _________ in a training dataset is vital to ensure that the model does not become biased towards one particular outcome.
- classes
- features
- models
- parameters
Balancing the "classes" in a training dataset ensures that the model does not become biased towards one class, leading to a more accurate and fair representation of the data. This is especially crucial in classification tasks.
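Besides resampling, many estimators can reweight the loss by class frequency. A sketch assuming scikit-learn's `class_weight` option on a synthetic imbalanced problem:

```python
# class_weight="balanced" reweights the loss by inverse class frequency,
# an alternative to resampling the training data itself.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import recall_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, weights=[0.95, 0.05], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

for cw in (None, "balanced"):
    model = LogisticRegression(class_weight=cw, max_iter=1000).fit(X_tr, y_tr)
    rec = recall_score(y_te, model.predict(X_te))
    print(f"class_weight={cw}: minority-class recall = {rec:.3f}")
```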
Overfitting in Polynomial Regression can be visualized by a graph where the polynomial curve fits even the _________ in the training data.
- accuracy
- linearity
- noise
- stability
A graph showing overfitting in Polynomial Regression will exhibit the polynomial curve fitting even the noise in the training data, not just the underlying trend.
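A sketch of the effect, assuming scikit-learn and a small noisy sine-wave dataset; the degrees 3 and 15 are illustrative:

```python
# A high-degree polynomial chases the noise in the training points, while a
# low-degree fit follows the underlying trend.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0, 1, 20)).reshape(-1, 1)
y = np.sin(2 * np.pi * x).ravel() + rng.normal(scale=0.2, size=20)

for degree in (3, 15):
    model = make_pipeline(PolynomialFeatures(degree), LinearRegression()).fit(x, y)
    train_error = np.mean((model.predict(x) - y) ** 2)
    print(f"degree {degree}: training MSE = {train_error:.4f}")  # near zero when overfitting
```

Plotting both fitted curves against the scattered training points makes the overfitting visible: the degree-15 curve wiggles through the noise, while the degree-3 curve tracks the trend.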