Why might centering or scaling independent variables be insufficient to completely resolve multicollinearity?
It can make the model more complex and harder to interpret.
It only works for linear relationships between variables.
It requires a large sample size to be effective.
It doesn't address the fundamental issue of high correlation among the predictor variables.
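A minimal sketch of why this is so, on synthetic data (variable names are illustrative): correlation is invariant under centering and scaling, so standardized predictors are exactly as collinear as the raw ones.

```python
# Correlation is unchanged by centering/scaling: the standardized
# predictors are exactly as collinear as the originals.
import numpy as np
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
x1 = rng.normal(size=500)
x2 = 0.95 * x1 + 0.05 * rng.normal(size=500)  # x2 is nearly a copy of x1
X = np.column_stack([x1, x2])

X_scaled = StandardScaler().fit_transform(X)

print(np.corrcoef(X[:, 0], X[:, 1])[0, 1])                # high correlation
print(np.corrcoef(X_scaled[:, 0], X_scaled[:, 1])[0, 1])  # identical value
```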
If a predictor has a p-value of 0.02 in a multiple linear regression model, what can you conclude?
The predictor has a practically significant effect on the outcome.
The predictor explains 2% of the variance in the outcome.
The predictor is not statistically significant.
The predictor is statistically significant at the 0.05 level.
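For context, a sketch of where such a p-value comes from, assuming a statsmodels OLS fit on synthetic data: each coefficient gets its own p-value, and values below 0.05 are conventionally read as statistically significant at that level, which says nothing about the practical size of the effect.

```python
# Minimal sketch (synthetic data): per-coefficient p-values from an OLS fit.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 2))
y = 0.3 * X[:, 0] + rng.normal(size=200)  # x1 has a real but modest effect

model = sm.OLS(y, sm.add_constant(X)).fit()
print(model.pvalues)            # ordered as [const, x1, x2]
print(model.pvalues[1] < 0.05)  # True means significant at the 0.05 level
```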
Which of the following is a common method for addressing multicollinearity in multiple linear regression?
Ignoring the issue, as it has no impact on the model.
Removing one or more of the correlated predictor variables.
Transforming the outcome variable.
Increasing the sample size.
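A sketch of the removal approach on synthetic pandas data, with an illustrative 0.9 cutoff: flag one member of each highly correlated pair and drop it.

```python
# Sketch: flag and drop one member of each highly correlated pair.
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
df = pd.DataFrame({"x1": rng.normal(size=300)})
df["x2"] = 0.97 * df["x1"] + 0.03 * rng.normal(size=300)  # near-duplicate of x1
df["x3"] = rng.normal(size=300)                           # independent predictor

corr = df.corr().abs()
# Keep only the upper triangle so each pair is inspected once.
upper = corr.where(np.triu(np.ones(corr.shape, dtype=bool), k=1))
to_drop = [col for col in upper.columns if (upper[col] > 0.9).any()]

print(to_drop)                      # expected: ['x2']
reduced = df.drop(columns=to_drop)  # model on the reduced predictor set
```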
The performance of the Theil-Sen estimator can be sensitive to which characteristic of the data?
The presence of multicollinearity (high correlation between independent variables)
The non-normality of the residuals
The presence of categorical variables
The presence of heteroscedasticity (unequal variances of errors)
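Theil-Sen aggregates least-squares fits over many subsamples, so strongly correlated predictors can make the individual sub-fits, and hence the combined estimate, unstable. A sketch with synthetic collinear data using scikit-learn's TheilSenRegressor:

```python
# Sketch (synthetic data): Theil-Sen on two nearly collinear predictors.
import numpy as np
from sklearn.linear_model import TheilSenRegressor

rng = np.random.default_rng(3)
x1 = rng.normal(size=200)
x2 = 0.98 * x1 + 0.02 * rng.normal(size=200)  # highly collinear with x1
X = np.column_stack([x1, x2])
y = 2.0 * x1 + 1.0 * x2 + rng.normal(size=200)

model = TheilSenRegressor(random_state=0).fit(X, y)
print(model.coef_)  # individual coefficients can be unstable under collinearity
```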
What does a Variance Inflation Factor (VIF) value greater than 10 generally suggest?
Severe multicollinearity
Heteroscedasticity
No multicollinearity
Perfect multicollinearity
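A sketch of computing VIFs with statsmodels on synthetic data; the value 10 is a conventional rule of thumb, not a hard cutoff.

```python
# Sketch: VIF per predictor; values above ~10 flag severe multicollinearity.
import numpy as np
import pandas as pd
from statsmodels.stats.outliers_influence import variance_inflation_factor
from statsmodels.tools.tools import add_constant

rng = np.random.default_rng(4)
df = pd.DataFrame({"x1": rng.normal(size=300)})
df["x2"] = 0.95 * df["x1"] + 0.05 * rng.normal(size=300)
df["x3"] = rng.normal(size=300)

X = add_constant(df)  # VIFs should be computed with an intercept included
vifs = pd.Series(
    [variance_inflation_factor(X.values, i) for i in range(1, X.shape[1])],
    index=X.columns[1:],
)
print(vifs)  # x1 and x2 should be very large; x3 near 1
```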
What is a potential drawback of removing a highly correlated independent variable to deal with multicollinearity?
It may improve the model's overall fit but reduce its interpretability.
It has no drawbacks and is always the best solution.
It may result in a loss of valuable information and reduce the model's accuracy.
It may lead to an increase in the model's complexity.
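A synthetic illustration of that trade-off: when the correlated predictor still carries signal of its own, dropping it lowers the fit.

```python
# Sketch: dropping a correlated-but-informative predictor lowers the fit.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(5)
x1 = rng.normal(size=500)
x2 = 0.8 * x1 + 0.6 * rng.normal(size=500)  # correlated with x1, yet not redundant
y = x1 + x2 + rng.normal(size=500)

X_full = np.column_stack([x1, x2])
full = LinearRegression().fit(X_full, y)
reduced = LinearRegression().fit(x1.reshape(-1, 1), y)

print(full.score(X_full, y))                # R^2 with both predictors
print(reduced.score(x1.reshape(-1, 1), y))  # lower R^2: x2's signal is lost
```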
Poisson regression, another type of generalized linear model (GLM), is particularly well suited for analyzing which kind of data?
Ordinal data with a specific order
Count data of rare events
Proportions or percentages
Continuous measurements
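A minimal Poisson-regression sketch on synthetic event counts, using statsmodels' GLM with the canonical log link:

```python
# Sketch: Poisson GLM on synthetic event counts (log link).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(6)
X = sm.add_constant(rng.normal(size=(400, 1)))
counts = rng.poisson(lam=np.exp(0.5 + 0.8 * X[:, 1]))  # counts generated on the log scale

model = sm.GLM(counts, X, family=sm.families.Poisson()).fit()
print(model.params)  # intercept and slope, both on the log scale
```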
What is the role of feature selection in Polynomial Regression?
To increase the number of features used in the model to improve accuracy.
To visualize the relationship between the target variable and independent variables.
To reduce the model complexity by identifying and selecting the most relevant features.
To convert categorical variables into numerical variables.
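One common way to do this is to expand the polynomial terms and let an L1 penalty (Lasso) zero out the irrelevant ones; the degree and penalty strength below are illustrative choices on synthetic data.

```python
# Sketch: expand polynomial terms, then let Lasso zero out irrelevant ones.
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import Lasso

rng = np.random.default_rng(7)
X = rng.uniform(-2, 2, size=(300, 1))
y = 1.5 * X[:, 0] ** 2 - X[:, 0] + rng.normal(scale=0.5, size=300)  # truly quadratic

poly = PolynomialFeatures(degree=5, include_bias=False)
X_poly = poly.fit_transform(X)

lasso = Lasso(alpha=0.05).fit(X_poly, y)
for name, coef in zip(poly.get_feature_names_out(), lasso.coef_):
    if abs(coef) > 1e-6:
        print(name, round(coef, 3))  # only the selected (nonzero) terms
```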
What happens to the bias and variance of a linear regression model as the regularization parameter (lambda) increases?
Bias decreases, Variance decreases
Bias increases, Variance increases
Bias increases, Variance decreases
Bias decreases, Variance increases
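A sketch of the mechanism using ridge regression on synthetic data (scikit-learn calls lambda `alpha`): as the penalty grows, coefficients shrink toward zero, trading variance for bias.

```python
# Sketch: ridge coefficients shrink toward zero as the penalty grows.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(8)
X = rng.normal(size=(200, 3))
y = X @ np.array([3.0, -2.0, 1.0]) + rng.normal(size=200)

for alpha in [0.1, 1.0, 10.0, 100.0, 1000.0]:
    coefs = Ridge(alpha=alpha).fit(X, y).coef_
    print(alpha, np.round(coefs, 3))  # magnitudes shrink as alpha increases
```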
When using Principal Component Analysis (PCA) as a remedy for multicollinearity, what is the primary aim?
To create new, uncorrelated variables from the original correlated ones
To introduce non-linearity into the model
To increase the sample size of the dataset
To remove all independent variables from the model
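A sketch on synthetic data: the principal components are orthogonal by construction, so their sample correlations vanish even when the original predictors are highly correlated.

```python
# Sketch: PCA turns correlated predictors into uncorrelated components.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(9)
x1 = rng.normal(size=300)
x2 = 0.9 * x1 + 0.1 * rng.normal(size=300)
X = np.column_stack([x1, x2])

components = PCA(n_components=2).fit_transform(X)
print(np.corrcoef(X.T))                        # large off-diagonal correlation
print(np.round(np.corrcoef(components.T), 6))  # off-diagonals effectively zero
```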