What does the assumption of independence in linear regression refer to?
Independence between the coefficients of the regression model
Independence between the independent and dependent variables
Independence between the errors and the dependent variable
Independence between the observations
In a simple linear regression model, what does a positive coefficient on the independent variable indicate?
As the independent variable increases, the dependent variable tends to decrease.
The independent variable has no impact on the dependent variable.
As the independent variable increases, the dependent variable tends to increase.
There is no relationship between the independent and dependent variables.
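A minimal sketch of the idea behind this question, assuming scikit-learn (the library whose `LinearRegression()` appears later in this quiz); the variable names and data are illustrative:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

hours = np.array([[1.0], [2.0], [3.0], [4.0]])   # independent variable
score = np.array([52.0, 55.0, 61.0, 63.0])       # dependent variable, rising with hours

model = LinearRegression().fit(hours, score)
# A positive fitted coefficient means the predicted score rises as hours increase.
print(model.coef_[0] > 0)  # True
```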
What is the purpose of splitting the dataset into training and testing sets in Linear Regression?
To visualize the relationship between variables.
To reduce the dimensionality of the data.
To evaluate the model's performance on unseen data.
To handle missing values in the dataset.
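A minimal sketch of the train/test workflow this question describes, assuming scikit-learn; the synthetic data is illustrative:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

# Synthetic data: y = 3x + 2 with a little noise
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(100, 1))
y = 3 * X[:, 0] + 2 + rng.normal(0, 0.1, size=100)

# Hold out 25% of rows the model never sees during fitting
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

model = LinearRegression().fit(X_train, y_train)
test_score = model.score(X_test, y_test)  # R^2 measured on unseen data
```

Evaluating on the held-out rows is what estimates performance on unseen data; scoring on the training rows alone would overstate it.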
What does a high R-squared value indicate?
The model is a perfect fit for the data.
A large proportion of the variance in the dependent variable is explained by the independent variables.
The model is not a good fit for the data.
The independent variables are not correlated with the dependent variable.
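A direct NumPy computation of R-squared from its definition (1 minus residual sum of squares over total sum of squares), as a sketch of what the statistic measures; the toy arrays are illustrative:

```python
import numpy as np

def r_squared(y_true, y_pred):
    """R^2 = 1 - SS_res / SS_tot: share of variance in y explained by the model."""
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - np.mean(y_true)) ** 2)
    return 1 - ss_res / ss_tot

y = np.array([1.0, 2.0, 3.0, 4.0])
perfect = y.copy()                    # predictions match exactly
constant = np.full_like(y, y.mean())  # predictions explain none of the variance

print(r_squared(y, perfect))   # 1.0
print(r_squared(y, constant))  # 0.0
```

A high R-squared means SS_res is small relative to SS_tot, i.e. the independent variables account for most of the variance; it does not by itself prove the model is a perfect fit.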
What does the 'fit_intercept' parameter in 'LinearRegression()' control?
Whether to calculate the slope of the line.
Whether to calculate the intercept (bias) of the line.
Whether to normalize the data before fitting.
Whether to use gradient descent for optimization.
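A sketch of the 'fit_intercept' parameter's effect in scikit-learn; the toy data (an exact line y = 2x + 1) is illustrative:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

X = np.array([[1.0], [2.0], [3.0]])
y = np.array([3.0, 5.0, 7.0])  # y = 2x + 1

with_b = LinearRegression(fit_intercept=True).fit(X, y)
no_b = LinearRegression(fit_intercept=False).fit(X, y)

print(with_b.intercept_)  # ~1.0: the bias term is estimated
print(no_b.intercept_)    # 0.0: the line is forced through the origin
```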
What distinguishes simple linear regression from multiple linear regression?
Simple linear regression analyzes categorical data, while multiple linear regression analyzes numerical data.
Simple linear regression uses a curved line, while multiple linear regression uses a straight line.
Simple linear regression has one independent variable, while multiple linear regression has two or more.
There is no difference; the terms are interchangeable.
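The distinction can be sketched in scikit-learn as the number of feature columns passed to the same estimator; the random data is illustrative:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
X_simple = rng.normal(size=(50, 1))  # simple: one independent variable
X_multi = rng.normal(size=(50, 3))   # multiple: two or more independent variables
y = X_multi @ np.array([1.0, -2.0, 0.5]) + 4.0

simple = LinearRegression().fit(X_simple, y)
multi = LinearRegression().fit(X_multi, y)
print(simple.coef_.shape)  # (1,) one slope
print(multi.coef_.shape)   # (3,) one slope per feature
```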
If a Durbin-Watson test statistic is close to 2, what does it suggest about the residuals?
They are independent
They exhibit a linear pattern
They are normally distributed
They are homoscedastic
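A direct NumPy computation of the Durbin-Watson statistic from its standard formula, DW = sum((e_t - e_{t-1})^2) / sum(e_t^2); the two synthetic residual series are illustrative:

```python
import numpy as np

def durbin_watson(residuals):
    """~2 suggests no first-order autocorrelation; near 0 suggests strong positive autocorrelation."""
    diff = np.diff(residuals)
    return np.sum(diff ** 2) / np.sum(residuals ** 2)

rng = np.random.default_rng(0)
independent = rng.normal(size=5000)          # independent residuals -> DW near 2
trending = np.cumsum(rng.normal(size=5000))  # strongly autocorrelated -> DW near 0

print(durbin_watson(independent))
print(durbin_watson(trending))
```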
What is the main difference between forward selection and backward elimination in linear regression?
Forward selection starts with no features and adds one by one, while backward elimination starts with all features and removes one by one.
Forward selection is used for classification, while backward elimination is used for regression.
Forward selection starts with all features and removes one by one, while backward elimination starts with no features and adds one by one.
There is no difference; both techniques achieve the same outcome.
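Both directions can be sketched with scikit-learn's SequentialFeatureSelector; the dataset (only features 0 and 1 carry signal) is illustrative:

```python
import numpy as np
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = 2.0 * X[:, 0] - 3.0 * X[:, 1] + rng.normal(0, 0.1, size=200)

# Forward: start with no features, add the most useful one at a time
fwd = SequentialFeatureSelector(
    LinearRegression(), n_features_to_select=2, direction="forward"
).fit(X, y)
# Backward: start with all five features, drop the least useful one at a time
bwd = SequentialFeatureSelector(
    LinearRegression(), n_features_to_select=2, direction="backward"
).fit(X, y)

print(fwd.get_support())  # boolean mask of the selected features
print(bwd.get_support())
```

On clean data like this both directions land on the same two informative features, but in general they can disagree, since they search the feature space from opposite ends.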
In the context of linear regression, what is an error term?
The variation in the independent variable.
The difference between the observed value of the dependent variable and the predicted value.
A mistake made in collecting or entering data.
The difference between the slope and the intercept of the regression line.
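The error term is estimated by the residuals, observed minus predicted; a minimal NumPy sketch with illustrative data:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])
y_observed = np.array([2.1, 3.9, 6.2, 7.8])

# Fit y = b0 + b1*x by least squares (polyfit returns slope first)
b1, b0 = np.polyfit(x, y_observed, 1)
y_predicted = b0 + b1 * x

# Residuals: observed value minus the value the regression line predicts
residuals = y_observed - y_predicted
print(residuals.sum())  # least-squares residuals sum to ~0
```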
Which of the following is NOT a benefit of feature selection in linear regression?
Improved model interpretability
Potential for better generalization to new data
Reduced computational cost
Increased risk of overfitting