How does Polynomial Regression differ from Simple Linear Regression?

  • It fits a polynomial curve
  • It fits a straight line
  • It is used only for classification
  • It uses more variables
While Simple Linear Regression fits a straight line to the data, Polynomial Regression fits a polynomial curve, allowing for more flexibility in modeling non-linear relationships.
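The difference can be seen numerically: a minimal sketch (pure Python, no ML libraries assumed; the data points are made-up) fits both a straight line and a degree-2 polynomial to data generated from y = x², showing that only the polynomial captures the curve.

```python
# Sketch: straight-line fit vs degree-2 polynomial fit on quadratic data
# (pure Python; the data is an illustrative example, not from a real dataset).

def fit_least_squares(xs, ys, degree):
    """Fit a polynomial of the given degree by solving the normal
    equations with Gaussian elimination. Returns coefficients
    [c0, c1, ..., cd] for c0 + c1*x + ... + cd*x^d."""
    n = degree + 1
    # Normal-equations matrix X^T X and vector X^T y.
    A = [[sum(x ** (i + j) for x in xs) for j in range(n)] for i in range(n)]
    b = [sum(y * x ** i for x, y in zip(xs, ys)) for i in range(n)]
    # Gaussian elimination with partial pivoting.
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[pivot] = A[pivot], A[col]
        b[col], b[pivot] = b[pivot], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    coefs = [0.0] * n
    for i in range(n - 1, -1, -1):
        coefs[i] = (b[i] - sum(A[i][j] * coefs[j] for j in range(i + 1, n))) / A[i][i]
    return coefs

def predict(coefs, x):
    return sum(c * x ** i for i, c in enumerate(coefs))

xs = [0, 1, 2, 3, 4]
ys = [x * x for x in xs]  # perfectly quadratic data

line = fit_least_squares(xs, ys, degree=1)   # simple linear regression
curve = fit_least_squares(xs, ys, degree=2)  # polynomial regression

sse_line = sum((predict(line, x) - y) ** 2 for x, y in zip(xs, ys))
sse_curve = sum((predict(curve, x) - y) ** 2 for x, y in zip(xs, ys))
```

The straight line leaves a large residual error on curved data, while the degree-2 fit recovers the generating polynomial almost exactly. Note the model is still *linear in its coefficients*; only the features (x, x², ...) are nonlinear.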

Autonomous vehicles rely on Machine Learning algorithms for tasks like ____________ and ____________.

  • Disease Prediction, Weather Forecasting
  • Object Detection, Path Planning
  • Risk Management, Drug Development
  • Text Classification, Fraud Detection
Autonomous vehicles use Machine Learning for Object Detection and Path Planning, recognizing obstacles and determining optimal routes.

In what scenario would the AUC be a more informative metric than simply using Accuracy?

  • When the class distribution is balanced
  • When the class distribution is imbalanced
  • When the model has only one class
The AUC (Area Under the ROC Curve) can be more informative than Accuracy when the class distribution is imbalanced. It measures the model's ability to discriminate between positive and negative classes across all thresholds, whereas Accuracy can look deceptively high when the model simply predicts the majority class.
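A minimal sketch of the imbalance problem (pure Python; the 95/5 split and scores are made-up for illustration): a classifier that outputs a constant score discriminates nothing, yet scores 95% accuracy on a 95%-negative dataset, while AUC correctly reports chance level.

```python
# Sketch: why accuracy misleads on imbalanced data while AUC does not.

def auc(scores, labels):
    """AUC = P(score of a random positive > score of a random negative),
    counting ties as 0.5 (the Mann-Whitney formulation)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Imbalanced data: 95 negatives, 5 positives.
labels = [0] * 95 + [1] * 5

# A "classifier" that outputs the same score for everyone learns nothing...
constant_scores = [0.1] * 100
accuracy = sum(1 for s, y in zip(constant_scores, labels)
               if (s >= 0.5) == (y == 1)) / 100

# ...yet predicting the majority class yields 95% accuracy,
# while AUC reports chance-level discrimination (0.5).
chance_auc = auc(constant_scores, labels)
```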

How is Deep Learning different from traditional Machine Learning techniques?

  • Deep Learning focuses on neural networks with multiple layers
  • Deep Learning requires less data
  • Deep Learning uses shallower models
  • Deep Learning uses simpler algorithms
Deep Learning differs from traditional Machine Learning by using neural networks with multiple layers, enabling the analysis of more complex patterns.
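The "multiple layers" idea can be sketched as a forward pass (pure Python; the weights below are illustrative placeholders, not trained values): each layer is a linear map followed by a nonlinearity, and stacking them lets the network represent patterns no single linear model can.

```python
# Sketch: a two-layer forward pass, the structural core of deep learning.

def relu(v):
    return [max(0.0, x) for x in v]

def dense(inputs, weights, biases):
    """One fully connected layer; weights is a list of per-output-unit
    weight vectors, so output_j = sum_i(inputs_i * weights[j][i]) + biases[j]."""
    return [sum(i * w for i, w in zip(inputs, row)) + b
            for row, b in zip(weights, biases)]

x = [1.0, -2.0]

# Layer 1: 2 inputs -> 3 hidden units, with a ReLU nonlinearity.
h = relu(dense(x, weights=[[0.5, -0.3], [0.2, 0.8], [-0.1, 0.4]],
               biases=[0.0, 0.1, 0.2]))

# Layer 2: 3 hidden units -> 1 output. Deep networks repeat this
# pattern many times; traditional ML models typically stop at one step.
y = dense(h, weights=[[1.0, -1.0, 0.5]], biases=[0.0])
```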

Explain how the ElasticNet regression combines the properties of Ridge and Lasso regression.

  • By alternating between L1 and L2 regularization
  • By using a weighted average of L1 and L2
  • By using both L1 and L2 regularization
  • By using neither L1 nor L2 regularization
ElasticNet regression applies both L1 and L2 regularization simultaneously. This hybrid approach pairs Lasso's ability to perform feature selection with Ridge's ability to handle multicollinearity, and the balance between the two penalties can be fine-tuned using hyperparameters.
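The penalty itself is short enough to write out. A minimal sketch, using the same `alpha` / `l1_ratio` parameterization scikit-learn uses for its `ElasticNet` estimator (the coefficient values are made-up):

```python
# Sketch: the ElasticNet penalty, mixing L1 (Lasso) and L2 (Ridge) terms.
# `alpha` scales the overall penalty strength; `l1_ratio` blends the two.

def elastic_net_penalty(coefs, alpha=1.0, l1_ratio=0.5):
    l1 = sum(abs(c) for c in coefs)   # Lasso term: drives coefficients to exactly 0
    l2 = sum(c * c for c in coefs)    # Ridge term: shrinks correlated coefficients together
    return alpha * (l1_ratio * l1 + (1 - l1_ratio) * 0.5 * l2)

coefs = [3.0, -1.0, 0.0]

pure_lasso = elastic_net_penalty(coefs, l1_ratio=1.0)  # L1 only
pure_ridge = elastic_net_penalty(coefs, l1_ratio=0.0)  # L2 only
blended = elastic_net_penalty(coefs, l1_ratio=0.5)     # both at once
```

Setting `l1_ratio=1.0` recovers Lasso and `l1_ratio=0.0` recovers Ridge, which is exactly the sense in which ElasticNet generalizes both.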

Explain the Variance Inflation Factor (VIF) and its role in detecting multicollinearity.

  • Measure of how much the variance of an estimated coefficient increases when predictors are correlated
  • Measure of model complexity
  • Measure of model's fit
  • Measure of residual errors
VIF quantifies how much the variance of an estimated regression coefficient increases when predictors are correlated. A high VIF indicates multicollinearity, potentially affecting the model's stability.
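The formula is VIF = 1 / (1 − R²), where R² comes from regressing that predictor on the remaining ones. A minimal sketch with two predictors (pure Python; the data is made-up), so the auxiliary regression reduces to simple linear regression:

```python
# Sketch: computing VIF for one predictor from the auxiliary-regression R^2.

def r_squared_simple(x, y):
    """R^2 of a simple linear regression of y on x (squared correlation)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy * sxy / (sxx * syy)

def vif(target, other):
    """VIF = 1 / (1 - R^2) from regressing `target` on `other`."""
    return 1.0 / (1.0 - r_squared_simple(other, target))

x1 = [1.0, 2.0, 3.0, 4.0, 5.0]
x2_correlated = [2.1, 3.9, 6.2, 7.8, 10.1]  # roughly 2 * x1 -> high VIF
x2_unrelated = [5.0, 1.0, 4.0, 2.0, 3.0]    # little relation -> VIF near 1

high_vif = vif(x1, x2_correlated)
low_vif = vif(x1, x2_unrelated)
```

A common rule of thumb flags VIF values above 5 or 10 as a sign of problematic multicollinearity.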

How does linear regression differ from nonlinear regression?

  • They differ in the accuracy of predictions
  • They differ in the complexity of the model
  • They differ in the number of outputs
  • They differ in the number of variables used
Linear regression assumes a linear relationship between the dependent and independent variables, while nonlinear regression can model more complex relationships that are not strictly linear.

How is the R-Squared value used in assessing the performance of a regression model?

  • Measures the error variance
  • Measures the explained variance ratio
  • Measures the model's complexity
  • Measures the total sum of squares
The R-Squared value, also known as the coefficient of determination, measures the ratio of the explained variance to the total variance. It provides a statistical measure of how well the regression line approximates the real data points, with a value between 0 and 1. A higher R-Squared value indicates that more of the variance is captured by the model.
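In formula form, R² = 1 − SS_res / SS_tot. A minimal sketch (pure Python; the actual and predicted values are made-up):

```python
# Sketch: R^2 as the proportion of variance explained by the model.

def r_squared(actual, predicted):
    mean_y = sum(actual) / len(actual)
    ss_res = sum((y - p) ** 2 for y, p in zip(actual, predicted))  # residual sum of squares
    ss_tot = sum((y - mean_y) ** 2 for y in actual)                # total sum of squares
    return 1.0 - ss_res / ss_tot

actual = [3.0, 5.0, 7.0, 9.0]
good_fit = [2.9, 5.2, 6.8, 9.1]   # predictions close to the data -> R^2 near 1
mean_only = [6.0, 6.0, 6.0, 6.0]  # always predicting the mean -> R^2 = 0
```

A model that always predicts the mean explains none of the variance (R² = 0), which makes it a useful baseline when interpreting the score.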

How do you assess the fit of a Logistic Regression model?

  • Accuracy only
  • Precision and recall only
  • R-squared only
  • Using metrics such as AUC-ROC, confusion matrix, log-likelihood, etc.
The fit of a Logistic Regression model can be assessed using various metrics, including the AUC-ROC curve, confusion matrix, log-likelihood, and other classification metrics that consider both the positive and negative classes.
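Several of those metrics fall out of a single confusion matrix. A minimal sketch (pure Python; the label and prediction vectors are made-up examples):

```python
# Sketch: deriving accuracy, precision, and recall from a confusion matrix.

def confusion_counts(actual, predicted):
    tp = sum(1 for a, p in zip(actual, predicted) if a == 1 and p == 1)
    tn = sum(1 for a, p in zip(actual, predicted) if a == 0 and p == 0)
    fp = sum(1 for a, p in zip(actual, predicted) if a == 0 and p == 1)
    fn = sum(1 for a, p in zip(actual, predicted) if a == 1 and p == 0)
    return tp, tn, fp, fn

actual    = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
predicted = [1, 1, 1, 0, 0, 0, 0, 0, 1, 0]

tp, tn, fp, fn = confusion_counts(actual, predicted)
accuracy  = (tp + tn) / len(actual)  # overall correctness
precision = tp / (tp + fp)           # of predicted positives, how many are real
recall    = tp / (tp + fn)           # of real positives, how many were found
```

AUC-ROC and log-likelihood additionally use the model's predicted probabilities rather than hard 0/1 predictions, which is why they capture aspects of fit this matrix alone cannot.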

A business stakeholder asks you to explain the interaction effect found in a Multiple Linear Regression model built for sales prediction. How would you explain this in non-technical terms?

  • Explain that one variable's effect depends on another variable
  • Ignore the question
  • Provide raw data
  • Use technical jargon
You could explain the interaction effect by stating that the effect of one variable on sales depends on the level of another variable. For example, the effect of advertising on sales might depend on the season, and the interaction term captures this dependency in the model.
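The advertising-by-season example can be made concrete. A minimal sketch (pure Python; all coefficient values are illustrative placeholders, not estimates from real data):

```python
# Sketch: an interaction term in a sales model — the effect of advertising
# depends on whether it is the holiday season.

def predicted_sales(advertising, is_holiday_season,
                    base=100.0, b_ad=2.0, b_season=20.0, b_interact=1.5):
    return (base
            + b_ad * advertising
            + b_season * is_holiday_season
            + b_interact * advertising * is_holiday_season)  # interaction term

# Effect of raising ad spend from 10 to 20 units:
off_season_lift = predicted_sales(20, 0) - predicted_sales(10, 0)  # slope 2.0 per unit
holiday_lift    = predicted_sales(20, 1) - predicted_sales(10, 1)  # slope 2.0 + 1.5 per unit
```

The same ten extra units of advertising lift sales by different amounts depending on the season, and that difference is precisely what the interaction coefficient encodes.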