What type of data is particularly well-suited for analysis using hierarchical linear models?
Nested data
Cross-sectional data
Time series data
Experimental data
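As a concrete illustration of nested data (the kind hierarchical linear models are built for), here is a minimal sketch of a random-intercept model fit with statsmodels; the dataset and the names `score`, `hours`, and `school` are made up for the example.

```python
# Minimal sketch: a random-intercept model for nested data
# (students nested within schools). All names are illustrative.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_schools, n_students = 20, 30
school = np.repeat(np.arange(n_schools), n_students)
school_effect = rng.normal(0, 2, n_schools)[school]  # shared within each school
hours = rng.uniform(0, 10, n_schools * n_students)
score = 50 + 3 * hours + school_effect + rng.normal(0, 5, n_schools * n_students)
df = pd.DataFrame({"score": score, "hours": hours, "school": school})

# A random intercept per school captures the within-group dependence
# that ordinary least squares would ignore.
model = smf.mixedlm("score ~ hours", df, groups=df["school"]).fit()
print(model.summary())
```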
What is a common consequence of autocorrelation in linear regression?
Heteroscedasticity
Inflated standard errors of coefficients
Reduced model fit
Biased coefficient estimates
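A quick way to see autocorrelation in action: the sketch below (assuming statsmodels is available, with synthetic data) simulates AR(1) errors and checks the fitted residuals with the Durbin-Watson statistic, where values near 2 indicate no first-order autocorrelation.

```python
# Sketch: detecting autocorrelated residuals via the Durbin-Watson statistic.
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.stattools import durbin_watson

rng = np.random.default_rng(1)
n = 200
x = np.arange(n, dtype=float)

# AR(1) errors: each error carries over part of the previous one.
e = np.zeros(n)
for t in range(1, n):
    e[t] = 0.8 * e[t - 1] + rng.normal(0, 1)
y = 1.0 + 0.5 * x + e

res = sm.OLS(y, sm.add_constant(x)).fit()
print(durbin_watson(res.resid))  # well below 2 => positive autocorrelation
```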
What is the primary reason multicollinearity poses a problem in linear regression?
It inflates the variance of the regression coefficients, making them unreliable.
It makes the model too complex.
It reduces the model's predictive accuracy on new data.
It violates the assumption of linearity between the dependent and independent variables.
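The variance-inflation effect can be seen directly by simulation; this illustrative sketch (NumPy only, synthetic data) compares the sampling spread of one coefficient estimate with and without a strongly correlated companion predictor.

```python
# Sketch: correlation between predictors inflates the sampling
# variance of OLS coefficient estimates.
import numpy as np

rng = np.random.default_rng(2)

def coef_std(rho, n=100, reps=2000):
    betas = []
    for _ in range(reps):
        x1 = rng.normal(size=n)
        x2 = rho * x1 + np.sqrt(1 - rho**2) * rng.normal(size=n)
        y = 1.0 * x1 + 1.0 * x2 + rng.normal(size=n)
        X = np.column_stack([np.ones(n), x1, x2])
        betas.append(np.linalg.lstsq(X, y, rcond=None)[0][1])  # x1 coefficient
    return np.std(betas)

print(coef_std(rho=0.0))   # baseline spread of the x1 coefficient
print(coef_std(rho=0.95))  # much larger spread under strong collinearity
```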
What is the primary advantage of using Adjusted R-squared over R-squared when evaluating linear regression models?
Adjusted R-squared always increases when new predictors are added.
Adjusted R-squared penalizes the inclusion of irrelevant variables.
Adjusted R-squared is easier to interpret than R-squared.
Adjusted R-squared is less sensitive to outliers compared to R-squared.
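For reference, adjusted R-squared is computed as 1 - (1 - R^2)(n - 1)/(n - p - 1), where n is the sample size and p the number of predictors; a tiny sketch of the penalty at work:

```python
# Adjusted R-squared: same R^2, more predictors => larger penalty.
def adjusted_r2(r2: float, n: int, p: int) -> float:
    return 1 - (1 - r2) * (n - 1) / (n - p - 1)

print(adjusted_r2(0.80, n=100, p=3))   # ~0.794
print(adjusted_r2(0.80, n=100, p=30))  # ~0.713
```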
What does a Variance Inflation Factor (VIF) value greater than 10 generally suggest?
No multicollinearity
Severe multicollinearity
Perfect multicollinearity
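An illustrative VIF computation (assuming statsmodels; the near-duplicate predictor is synthetic) showing how a value far above 10 flags severe multicollinearity:

```python
# Sketch: VIFs above ~10 are the usual rule of thumb for severe
# multicollinearity.
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(3)
n = 200
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.1, size=n)  # nearly a copy of x1
x3 = rng.normal(size=n)
X = sm.add_constant(np.column_stack([x1, x2, x3]))

for i in range(1, X.shape[1]):  # skip the constant column
    print(f"VIF x{i}: {variance_inflation_factor(X, i):.1f}")
```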
Why might centering or scaling independent variables be insufficient to completely resolve multicollinearity?
It only works for linear relationships between variables.
It can make the model more complex and harder to interpret.
It doesn't address the fundamental issue of high correlations between the variables.
It requires a large sample size to be effective.
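A short sketch of why the fix falls short: standardizing two synthetic predictors leaves their correlation, and hence the underlying collinearity, exactly where it was.

```python
# Sketch: centering/scaling is a linear transformation, so it cannot
# change the pairwise correlation between predictors.
import numpy as np

rng = np.random.default_rng(4)
x1 = rng.normal(size=500)
x2 = 0.9 * x1 + 0.1 * rng.normal(size=500)

z1 = (x1 - x1.mean()) / x1.std()
z2 = (x2 - x2.mean()) / x2.std()

print(np.corrcoef(x1, x2)[0, 1])  # raw correlation
print(np.corrcoef(z1, z2)[0, 1])  # identical after centering/scaling
```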
Which of the following is a common method for addressing multicollinearity in multiple linear regression?
Transforming the outcome variable.
Removing one or more of the correlated predictor variables.
Increasing the sample size.
Ignoring the issue, as it has no impact on the model.
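One common remedy is dropping a near-duplicate predictor; a sketch (assuming statsmodels, reusing the synthetic setup from the earlier VIF example) showing that removal brings the remaining VIFs back toward 1:

```python
# Sketch: dropping one of two nearly duplicate predictors
# deflates the remaining VIFs.
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(5)
n = 200
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.1, size=n)
x3 = rng.normal(size=n)

X_full = sm.add_constant(np.column_stack([x1, x2, x3]))
X_reduced = sm.add_constant(np.column_stack([x1, x3]))  # x2 removed

print([round(variance_inflation_factor(X_full, i), 1) for i in range(1, 4)])
print([round(variance_inflation_factor(X_reduced, i), 1) for i in range(1, 3)])
```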
How does the Variance Inflation Factor (VIF) quantify multicollinearity?
By determining the difference between the predicted and actual values of the dependent variable
By measuring the change in R-squared when an independent variable is added to the model
By calculating the proportion of variance in one independent variable explained by all other independent variables
By measuring the correlation between two independent variables
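The auxiliary-regression definition can be made explicit with a minimal sketch (synthetic data, assuming statsmodels): regress one predictor on all the others, take that regression's R-squared, and compute VIF = 1/(1 - R^2).

```python
# Sketch of the definition behind VIF: an auxiliary regression of one
# predictor on the rest, then VIF = 1 / (1 - R^2).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(6)
n = 300
x1 = rng.normal(size=n)
x2 = 0.8 * x1 + 0.6 * rng.normal(size=n)
x3 = rng.normal(size=n)

# Auxiliary regression: x1 explained by the other predictors.
aux = sm.OLS(x1, sm.add_constant(np.column_stack([x2, x3]))).fit()
vif_x1 = 1.0 / (1.0 - aux.rsquared)
print(f"auxiliary R^2 = {aux.rsquared:.3f}, VIF = {vif_x1:.2f}")
```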
If we add more independent variables to a linear regression model, what will happen to the R-squared value?
Always decrease
Remain the same
Depend on the significance of the added variables
Always increase
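A small simulation (synthetic data, assuming statsmodels) shows the behavior directly: appending pure-noise predictors never lowers R-squared, while adjusted R-squared is free to fall.

```python
# Sketch: R^2 is non-decreasing in the number of predictors;
# adjusted R^2 can drop when the additions are pure noise.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)
n = 100
x = rng.normal(size=n)
y = 2 * x + rng.normal(size=n)

X = sm.add_constant(x)
for k in range(5):
    res = sm.OLS(y, X).fit()
    print(f"{X.shape[1] - 1} predictors: R^2={res.rsquared:.4f}, "
          f"adj R^2={res.rsquared_adj:.4f}")
    X = np.column_stack([X, rng.normal(size=n)])  # append a noise column
```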
Which metric penalizes large errors disproportionately more than small ones, making it particularly sensitive to outliers?
R-squared
Root Mean Squared Error (RMSE)
Adjusted R-squared
Mean Absolute Error (MAE)
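A quick numeric sketch (made-up residuals) of why the squaring step makes RMSE outlier-sensitive while MAE moves far less:

```python
# Sketch: one large outlier moves RMSE far more than MAE, because
# squaring weights big errors disproportionately.
import numpy as np

errors = np.array([1.0, -1.0, 2.0, -2.0, 1.5])
with_outlier = np.append(errors, 20.0)

def rmse(e): return np.sqrt(np.mean(e**2))
def mae(e): return np.mean(np.abs(e))

print(f"clean:   RMSE={rmse(errors):.2f}, MAE={mae(errors):.2f}")
print(f"outlier: RMSE={rmse(with_outlier):.2f}, MAE={mae(with_outlier):.2f}")
```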