___________ refers to the condition where the variance of the errors or residuals is constant across all levels of the explanatory variables.

  • Autocorrelation
  • Heteroscedasticity
  • Homoscedasticity
  • Multicollinearity
Homoscedasticity is the condition in which the variance of the errors or residuals is constant across all levels of the explanatory variables. It is one of the key assumptions of linear regression.
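As a quick illustration (a minimal Python sketch, assuming numpy and statsmodels are available and using synthetic data), the Breusch-Pagan test checks this assumption: its null hypothesis is that the residual variance is constant.

```python
# Minimal sketch: fit a simple linear regression on synthetic data and run a
# Breusch-Pagan test, whose null hypothesis is homoscedasticity.
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.diagnostic import het_breuschpagan

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=200)
y = 2.0 + 0.5 * x + rng.normal(0, 1.0, size=200)  # constant error variance

X = sm.add_constant(x)
model = sm.OLS(y, X).fit()

lm_stat, lm_pvalue, f_stat, f_pvalue = het_breuschpagan(model.resid, model.model.exog)
print(f"Breusch-Pagan p-value: {lm_pvalue:.3f}")  # large p-value -> no evidence against homoscedasticity
```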

How does adding more predictors to a multiple linear regression model affect its performance?

  • It always improves the model
  • It always makes the model worse
  • It can lead to overfitting
  • It has no effect on the model
Adding more predictors never decreases the R-squared value, so the model can appear to improve even when the new predictors have no real association with the response variable. Including such predictors can lead to overfitting: the model fits noise in the training data and performs poorly on new, unseen data.
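A minimal sketch of this effect (assuming numpy and scikit-learn; the data and predictor counts are illustrative): adding pure-noise predictors raises in-sample R-squared but does not help, and can hurt, R-squared on held-out data.

```python
# Minimal sketch: compare train and test R^2 with and without noise predictors.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 120
signal = rng.normal(size=(n, 2))    # two predictors truly related to y
noise = rng.normal(size=(n, 30))    # thirty irrelevant predictors
y = signal @ np.array([1.5, -2.0]) + rng.normal(size=n)

for X in (signal, np.hstack([signal, noise])):
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=0)
    fit = LinearRegression().fit(X_tr, y_tr)
    print(X.shape[1], "predictors:",
          f"train R^2 = {fit.score(X_tr, y_tr):.2f},",
          f"test R^2 = {fit.score(X_te, y_te):.2f}")
```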

The type of data that describes attributes or characteristics of a group is called ________ data.

  • Continuous
  • Discrete
  • Qualitative
  • Quantitative
The type of data that describes attributes or characteristics of a group is called Qualitative data. These are often non-numeric and may include data types such as text, audio, or video. Examples include a person's gender, eye color, or the make of a car.
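A minimal sketch (assuming pandas; the column names and values are illustrative) of how qualitative and quantitative columns typically appear in a dataset:

```python
# Minimal sketch: qualitative columns are typically stored as text or categorical
# dtypes, while quantitative columns are numeric.
import pandas as pd

df = pd.DataFrame({
    "eye_color": ["brown", "blue", "green"],   # qualitative
    "car_make":  ["Toyota", "Ford", "Honda"],  # qualitative
    "height_cm": [172.0, 165.5, 180.2],        # quantitative (continuous)
    "siblings":  [1, 0, 3],                    # quantitative (discrete)
})
print(df.dtypes)
```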

A Type II error occurs when we fail to reject the null hypothesis, even though it is _______.

  • FALSE
  • Not applicable
  • Not proven
  • TRUE
A Type II error occurs when we fail to reject the null hypothesis, even though it is false. This is also known as a "false negative" error.
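A minimal simulation sketch (assuming numpy and scipy; the effect size, sample size, and significance level are illustrative) that estimates the Type II error rate of a one-sample t-test when the null hypothesis is in fact false:

```python
# Minimal sketch: estimate the Type II error rate (beta) by simulation.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
alpha, true_mean, n, sims = 0.05, 0.3, 20, 5000  # illustrative values

failures_to_reject = 0
for _ in range(sims):
    sample = rng.normal(loc=true_mean, scale=1.0, size=n)  # H0: mean == 0 is false
    _, p = stats.ttest_1samp(sample, popmean=0.0)
    if p >= alpha:
        failures_to_reject += 1  # false negative: H0 not rejected although it is false

print(f"Estimated Type II error rate (beta): {failures_to_reject / sims:.2f}")
```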

Spearman's Rank Correlation is based on the ________ of the data rather than their raw values.

  • Means
  • Medians
  • Modes
  • Ranks
Spearman's Rank Correlation is based on the ranks of the data rather than their raw values, which makes it a non-parametric method.
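A minimal sketch (assuming numpy and scipy) showing that Spearman's rho is simply the Pearson correlation applied to the ranks of the data:

```python
# Minimal sketch: Spearman's rho equals the Pearson correlation of the ranks.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
x = rng.normal(size=50)
y = np.exp(x) + rng.normal(scale=0.1, size=50)  # monotone but non-linear relation

rho, _ = stats.spearmanr(x, y)
pearson_on_ranks, _ = stats.pearsonr(stats.rankdata(x), stats.rankdata(y))
print(f"spearmanr: {rho:.3f}, pearson on ranks: {pearson_on_ranks:.3f}")  # identical
```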

How can a Chi-square test for independence be used in feature selection?

  • It can identify the features that are independent from the target variable
  • It can identify the features that are most correlated with the target variable
  • It can identify the features that have a significant association with the target variable
  • It can identify the features that have the highest variance
A Chi-square test for independence compares the observed joint frequencies of a categorical feature and the target variable with the frequencies expected if the two were independent. Features with a large test statistic (a small p-value) have a significant association with the target and are retained, while features that appear independent of the target can be dropped.
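A minimal sketch (assuming pandas and scipy; the feature names, data, and 0.05 threshold are illustrative) of chi-square-based feature selection with `scipy.stats.chi2_contingency`:

```python
# Minimal sketch: keep categorical features whose chi-square p-value against the
# target falls below a chosen threshold.
import pandas as pd
from scipy.stats import chi2_contingency

df = pd.DataFrame({
    "browser": ["chrome", "firefox", "chrome", "safari", "chrome", "firefox"] * 20,
    "plan":    ["free", "paid", "paid", "free", "free", "paid"] * 20,
    "churned": [0, 1, 0, 1, 0, 1] * 20,
})

selected = []
for feature in ["browser", "plan"]:
    table = pd.crosstab(df[feature], df["churned"])   # observed joint frequencies
    chi2, p, dof, expected = chi2_contingency(table)
    if p < 0.05:  # significant association with the target
        selected.append(feature)

print("Selected features:", selected)
```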

What does it mean if two events are independent in probability?

  • The occurrence of one affects the occurrence of the other
  • The occurrence of one does not affect the occurrence of the other
  • They have the same probability of occurrence
  • They occur at the same time
In probability, two events are independent if the occurrence of one event does not affect the occurrence of the other. This means that the probability of both events occurring is the product of their individual probabilities.
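A small worked example in plain Python (the two-dice setup is illustrative): for a pair of fair dice, "first die is even" and "second die shows a 6" are independent, and the product rule holds exactly.

```python
# Minimal sketch: verify P(A and B) == P(A) * P(B) by enumerating outcomes.
from itertools import product

outcomes = list(product(range(1, 7), repeat=2))          # 36 equally likely pairs
A = [o for o in outcomes if o[0] % 2 == 0]               # first die is even
B = [o for o in outcomes if o[1] == 6]                   # second die shows 6
AB = [o for o in outcomes if o[0] % 2 == 0 and o[1] == 6]

p_A, p_B, p_AB = len(A) / 36, len(B) / 36, len(AB) / 36
print(p_AB, p_A * p_B)  # 0.0833... == 0.5 * 0.1666...
```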

What is the purpose of point estimation in statistics?

  • To calculate the variance of a dataset
  • To compare two different datasets
  • To estimate the range of possible values for an unknown population parameter
  • To give a single best guess of an unknown population parameter
The purpose of point estimation in statistics is to provide a single "best guess" for an unknown population parameter, such as the mean or a proportion. The estimate is one value computed from sample data, in contrast to an interval estimate, which gives a range of plausible values for the parameter.
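A minimal sketch (assuming numpy; the population values are illustrative) of the sample mean used as a point estimate of an unknown population mean:

```python
# Minimal sketch: the sample mean is a single-number estimate of the population mean.
import numpy as np

rng = np.random.default_rng(4)
population_mean = 170.0                                   # unknown in practice
sample = rng.normal(loc=population_mean, scale=10.0, size=50)

point_estimate = sample.mean()
print(f"Point estimate of the mean: {point_estimate:.1f}")
```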

What is the effect of multicollinearity on the power of a statistical test?

  • It decreases the power.
  • It has no effect on the power.
  • It increases the power.
  • It makes the power equal to one.
Multicollinearity can inflate the variance of the regression coefficients, thus widening the confidence intervals and reducing the power of the statistical test.
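A minimal sketch of this effect (assuming numpy and statsmodels; the data are synthetic): adding a predictor that is nearly collinear with x1 inflates the standard error of x1's coefficient, which widens its confidence interval and lowers the power of its t-test.

```python
# Minimal sketch: compare the standard error of x1's coefficient with and without
# a nearly collinear second predictor.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 200
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.05, size=n)    # nearly collinear with x1
y = 1.0 + 0.5 * x1 + rng.normal(size=n)

for exog in (x1.reshape(-1, 1), np.column_stack([x1, x2])):
    fit = sm.OLS(y, sm.add_constant(exog)).fit()
    print(f"std. error of x1's coefficient: {fit.bse[1]:.3f}")
```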

In a multiple linear regression equation, the ________ represents the expected change in the dependent variable for a one-unit change in the corresponding independent variable, holding all other independent variables constant.

  • F-statistic
  • R-squared value
  • regression coefficient
  • residual
In a multiple linear regression equation, the regression coefficient represents the expected change in the dependent variable for a one-unit change in the corresponding independent variable, while holding all other independent variables constant. Its sign gives the direction of the relationship, and its magnitude gives the size of the expected change, expressed in the units of the variables.
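A minimal sketch (assuming numpy and statsmodels; the coefficients used to generate the data are illustrative): the fitted regression coefficients recover the per-unit effects used to simulate the data, holding the other predictor constant.

```python
# Minimal sketch: fit a multiple regression and read off the coefficients,
# which should be close to the generating values of 2.0 (x1) and -3.0 (x2).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(6)
n = 300
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 5.0 + 2.0 * x1 - 3.0 * x2 + rng.normal(size=n)

X = sm.add_constant(np.column_stack([x1, x2]))
fit = sm.OLS(y, X).fit()
print(fit.params)  # [intercept, coef on x1, coef on x2] ~= [5.0, 2.0, -3.0]
```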