Under what circumstances can the mode of a data set be irrelevant or misleading?
- When the data is continuous
- When the data set is large
- When the data set is small
- When there are multiple modes
The mode can be irrelevant or misleading, especially with continuous data. Since the mode is the most frequently occurring value, and continuous measurements rarely repeat exactly (each value typically occurs once), a mode in the traditional sense is often undefined or arbitrary.
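As a quick illustration, here is a minimal sketch with made-up simulated heights: every raw value occurs exactly once, so the "mode" is meaningless, and a common workaround is to bin the data first and report the modal bin (the 10-unit bin width is an arbitrary choice for this example).

```python
# Illustration with simulated (made-up) continuous data.
import random
from collections import Counter

random.seed(0)
heights = [random.gauss(170, 10) for _ in range(1000)]  # continuous measurements

counts = Counter(heights)
most_common_value, freq = counts.most_common(1)[0]
# freq is 1: every value occurs exactly once, so the raw mode is arbitrary.

# Common workaround: bin the data first, then report the modal bin.
binned = Counter(round(h, -1) for h in heights)  # 10-unit bins (arbitrary choice)
modal_bin, modal_count = binned.most_common(1)[0]
```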
A Variance Inflation Factor (VIF) greater than 5 indicates a high degree of _______ among the predictors.
- correlation
- distribution
- multicollinearity
- variance
A VIF greater than 5 is often taken as an indication of high multicollinearity among the predictors in a regression model. This could lead to imprecise and unreliable estimates of the regression coefficients.
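To make this concrete, here is a sketch on simulated data (plain NumPy rather than a stats package) that computes each predictor's VIF as 1 / (1 − R²) from regressing it on the other predictors; the near-duplicate pair `x1`/`x2` is constructed specifically to trip the VIF > 5 rule of thumb.

```python
# Manual VIF computation on made-up data: VIF_j = 1 / (1 - R_j^2),
# where R_j^2 comes from regressing predictor j on all the others.
import numpy as np

rng = np.random.default_rng(42)
n = 200
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.1, size=n)  # nearly collinear with x1 (by construction)
x3 = rng.normal(size=n)                  # independent predictor
X = np.column_stack([x1, x2, x3])

def vif(X, j):
    """VIF of column j via OLS of X[:, j] on the remaining columns."""
    y = X[:, j]
    others = np.delete(X, j, axis=1)
    A = np.column_stack([np.ones(len(y)), others])  # add an intercept
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ beta
    r2 = 1 - resid.var() / y.var()
    return 1.0 / (1.0 - r2)

vifs = [vif(X, j) for j in range(X.shape[1])]
# vifs[0] and vifs[1] are large (collinear pair); vifs[2] stays near 1.
```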
How does the 'elbow method' help in determining the optimal number of clusters in K-means clustering?
- By calculating the average distance between all pairs of clusters
- By comparing the silhouette scores for different numbers of clusters
- By creating a dendrogram of clusters
- By finding the point in the plot of within-cluster sum of squares where the decrease rate sharply shifts
The elbow method involves plotting the within-cluster sum of squares (WCSS) as a function of the number of clusters and picking the 'elbow' of the curve as the number of clusters to use. This elbow is the point at which adding further clusters no longer decreases the WCSS significantly, and it is taken as the optimal number of clusters.
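A minimal sketch of the idea, using a tiny Lloyd's-algorithm k-means on three made-up, well-separated blobs (the deterministic farthest-point initialization is an implementation choice for this example, not part of the elbow method itself):

```python
# Compute WCSS for k = 1..6 and look for the elbow where the drop levels off.
import numpy as np

rng = np.random.default_rng(0)
# Three well-separated 2-D blobs (simulated data).
data = np.vstack([rng.normal(c, 0.3, size=(50, 2))
                  for c in ((0.0, 0.0), (5.0, 5.0), (10.0, 0.0))])

def init_centers(X, k):
    # Deterministic farthest-point initialization (illustrative choice).
    centers = [X[0]]
    for _ in range(k - 1):
        d = np.min(np.linalg.norm(X[:, None, :] - np.asarray(centers)[None, :, :],
                                  axis=2), axis=1)
        centers.append(X[int(d.argmax())])
    return np.asarray(centers, dtype=float)

def kmeans_wcss(X, k, iters=50):
    centers = init_centers(X, k)
    for _ in range(iters):  # Lloyd's algorithm: assign, then update centers
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)  # (n, k)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    diffs = X - centers[labels]
    return float((diffs ** 2).sum())

wcss = [kmeans_wcss(data, k) for k in range(1, 7)]
# WCSS falls steeply up to k = 3 and flattens afterwards: the elbow is at 3.
```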
The bin width (and thus number of categories or ranges) in a histogram can dramatically affect the ________, skewness, and appearance of the histogram.
- Interpretation
- Mean
- Median
- Mode
The bin width and the number of bins in a histogram can dramatically affect the interpretation, skewness, and overall appearance of the histogram. This is because the choice of bin size can influence the level of detail visible in the histogram, potentially either obscuring or highlighting certain patterns in the data.
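For instance, in this sketch a single made-up bimodal sample looks featureless with two bins but clearly two-humped with forty, so the bin count alone changes what a reader would conclude:

```python
# Same simulated bimodal data, two different bin counts.
import numpy as np

rng = np.random.default_rng(1)
sample = np.concatenate([rng.normal(-2, 0.5, 500), rng.normal(2, 0.5, 500)])

coarse, coarse_edges = np.histogram(sample, bins=2)   # two near-equal bars: shape hidden
fine, fine_edges = np.histogram(sample, bins=40)      # the gap near 0 reveals both modes
```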
In PCA, if two variables are similar, they will have _______ loadings on the same component.
- high
- low
- opposite
- random
In PCA, if two variables are similar or highly correlated, they will have high loadings on the same component. This is because PCA identifies the directions (Principal Components) in which the data varies the most, and similar variables will contribute to this variance in the same way.
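A small sketch with simulated data: two variables built from the same underlying signal receive similar, high loadings on the first component, while an unrelated variable barely loads on it (the eigendecomposition of the covariance matrix is one standard way to obtain the components).

```python
# Loadings of correlated vs. independent variables on the first PC (made-up data).
import numpy as np

rng = np.random.default_rng(7)
n = 500
base = rng.normal(size=n)
a = base + rng.normal(scale=0.1, size=n)  # a and b share the same signal
b = base + rng.normal(scale=0.1, size=n)
c = rng.normal(size=n)                    # independent of a and b
X = np.column_stack([a, b, c])

Xc = X - X.mean(axis=0)
cov = np.cov(Xc, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)    # eigenvalues in ascending order
pc1 = eigvecs[:, -1]                      # loadings on the first principal component
# a and b load heavily (with the same sign) on PC1; c's loading is near zero.
```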
What is the impact of heteroscedasticity on a multiple linear regression model?
- It affects the linearity of the model
- It affects the normality of the residuals
- It causes multicollinearity
- It invalidates the statistical inferences that could be made from the model
Heteroscedasticity, or non-constant variance of the error term, can invalidate statistical inferences that could be made from the model because it violates one of the assumptions of multiple linear regression. This could lead to inefficient estimation of the regression coefficients and incorrect standard errors, which in turn affects confidence intervals and hypothesis tests.
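A quick sketch of what this looks like in practice, with simulated data whose noise grows with x by construction: after an ordinary least-squares fit, the residual spread at large x is several times the spread at small x.

```python
# Simulated heteroscedastic data: error standard deviation proportional to x.
import numpy as np

rng = np.random.default_rng(3)
n = 1000
x = rng.uniform(1, 10, n)
y = 2 + 0.5 * x + rng.normal(scale=0.3 * x)  # noise sd grows with x (by construction)

slope, intercept = np.polyfit(x, y, 1)
resid = y - (slope * x + intercept)

low = resid[x < 4].std()    # residual spread where x is small
high = resid[x > 7].std()   # residual spread where x is large
# high is clearly larger than low: the constant-variance assumption fails.
```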
What is the impact of data transformation on the decision to use non-parametric tests?
- A suitable data transformation may make it possible to use a parametric test
- Data transformation always leads to non-parametric tests
- Data transformation always makes data normally distributed
- Data transformation does not affect the choice between parametric and non-parametric tests
A suitable data transformation may make it possible to use a parametric test instead of a non-parametric test. Transformations can help to stabilize variances, normalize the data, or linearize relationships between variables, allowing for the use of parametric tests that might have more statistical power.
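As an example, a log transform of right-skewed (here, made-up lognormal) data brings the sample skewness close to zero, which is one piece of evidence that a parametric test may become appropriate afterwards:

```python
# Skewness before and after a log transform of simulated lognormal data.
import numpy as np

rng = np.random.default_rng(5)
raw = rng.lognormal(mean=0.0, sigma=1.0, size=2000)  # strongly right-skewed

def skewness(a):
    a = np.asarray(a, dtype=float)
    return float(((a - a.mean()) ** 3).mean() / a.std() ** 3)

skew_raw = skewness(raw)          # large positive skew
skew_log = skewness(np.log(raw))  # near zero: the log of a lognormal is normal
```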
A positive Pearson's Correlation Coefficient indicates a ________ relationship between two variables.
- inverse
- linear
- perfect
- positive
A positive Pearson's Correlation Coefficient indicates a positive relationship between two variables. This means that as one variable increases, the other tends to increase as well, and as one decreases, the other tends to decrease.
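A tiny example with made-up study-hours and exam-score data, using `np.corrcoef`: the two variables rise together, so the coefficient comes out strongly positive.

```python
# Pearson's r for two variables that increase together (hypothetical data).
import numpy as np

hours = np.array([1, 2, 3, 4, 5, 6, 7, 8], dtype=float)
score = np.array([52, 55, 61, 60, 68, 70, 74, 79], dtype=float)

r = float(np.corrcoef(hours, score)[0, 1])
# r is close to +1: a strong positive relationship.
```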
What are the assumptions made in simple linear regression?
- Homogeneity, normality, and symmetry
- Independence, homogeneity, and linearity
- Linearity, homoscedasticity, and normality
- Symmetry, linearity, and independence
The assumptions made in simple linear regression include linearity (the relationship between the independent and dependent variables is linear), homoscedasticity (the variance of the residuals is constant across all levels of the independent variable), and normality (the residuals are normally distributed).
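These assumptions can be probed informally from the residuals. A sketch on simulated data that satisfies them (the split-variance ratio and residual skewness used below are rough diagnostics chosen for illustration, not formal tests):

```python
# Informal residual checks after a simple linear fit (simulated data).
import numpy as np

rng = np.random.default_rng(11)
x = rng.uniform(0, 10, 500)
y = 1.0 + 2.0 * x + rng.normal(scale=1.0, size=500)  # linear + constant-variance noise

slope, intercept = np.polyfit(x, y, 1)
resid = y - (slope * x + intercept)

# Homoscedasticity: residual spread should be similar across the range of x.
spread_ratio = resid[x > 5].std() / resid[x < 5].std()  # ~1 if variance is constant

# Normality (rough check): residuals should be roughly symmetric, skewness ~0.
resid_skew = float(((resid - resid.mean()) ** 3).mean() / resid.std() ** 3)
```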
Principal Component Analysis (PCA) is a dimensionality reduction technique that projects the data into a lower dimensional space called the _______.
- eigen space
- feature space
- subspace
- variance space
PCA is a technique that projects the data into a new, lower-dimensional subspace. This subspace consists of principal components which are orthogonal to each other and capture the maximum variance in the data.
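A short sketch of the projection itself: 3-D simulated data that lies almost entirely in a plane is projected onto the subspace spanned by the top two components, retaining nearly all of the variance.

```python
# Projecting 3-D data onto a 2-D principal subspace (made-up data).
import numpy as np

rng = np.random.default_rng(9)
n = 400
u = rng.normal(size=n)
v = rng.normal(size=n)
noise = rng.normal(scale=0.05, size=n)
X = np.column_stack([u, v, u + v + noise])  # third axis nearly determined by first two

Xc = X - X.mean(axis=0)
cov = np.cov(Xc, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues in ascending order
W = eigvecs[:, -2:]                     # top-2 components span the subspace
Z = Xc @ W                              # coordinates of the data in that subspace

explained = eigvals[-2:].sum() / eigvals.sum()  # fraction of variance retained
```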