Which assumption of linear regression ensures that the relationship between the independent and dependent variables is linear?
Independence
Linearity
Normality of errors
Homoscedasticity
Which method is used in linear regression to estimate the model parameters by minimizing the sum of squared errors?
Bayesian Estimation
Maximum Likelihood Estimation
Method of Moments
Least Squares Estimation
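For context on this question, least squares estimation can be sketched directly with NumPy via the normal equations; the tiny data set below is made up so the fit is exact:

```python
import numpy as np

# Least squares: find beta minimizing ||y - X @ beta||^2.
# Closed-form solution via the normal equations: (X^T X) beta = X^T y.
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 3.0, 5.0, 7.0])         # exactly y = 1 + 2x

X = np.column_stack([np.ones_like(x), x])  # design matrix with intercept
beta = np.linalg.solve(X.T @ X, X.T @ y)   # [intercept, slope]
print(beta)                                # ≈ [1.0, 2.0]
```

In practice `np.linalg.lstsq` is preferred over forming the normal equations explicitly, since it is numerically more stable for ill-conditioned designs.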
What is the main difference between forward selection and backward elimination in linear regression?
Forward selection is used for classification, while backward elimination is used for regression.
Forward selection starts with no features and adds one by one, while backward elimination starts with all features and removes one by one.
There is no difference; both techniques achieve the same outcome.
Forward selection starts with all features and removes one by one, while backward elimination starts with no features and adds one by one.
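To illustrate the forward-selection direction, here is a minimal greedy sketch: start with no features and add, one at a time, the feature that most reduces the residual sum of squares, stopping when the improvement is small. The feature names, the 10% improvement threshold, and the synthetic data are all illustrative assumptions:

```python
import numpy as np

def rss(cols, y):
    # Residual sum of squares of an OLS fit of y on the given columns
    # plus an intercept.
    X = np.column_stack([np.ones(len(y))] + cols)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    r = y - X @ beta
    return float(r @ r)

rng = np.random.default_rng(0)
n = 50
feats = {"signal": rng.normal(size=n),    # truly related to y
         "noise_a": rng.normal(size=n),   # irrelevant
         "noise_b": rng.normal(size=n)}   # irrelevant
y = 2.0 * feats["signal"] + 0.1 * rng.normal(size=n)

selected, remaining = [], list(feats)
current = rss([], y)                      # intercept-only baseline
while remaining:
    # Candidate to add: the feature giving the lowest RSS.
    best = min(remaining,
               key=lambda f: rss([feats[g] for g in selected + [f]], y))
    best_rss = rss([feats[g] for g in selected + [best]], y)
    if best_rss < 0.9 * current:          # require a meaningful improvement
        selected.append(best)
        remaining.remove(best)
        current = best_rss
    else:
        break

print(selected)   # "signal" is chosen first; noise rarely clears the bar
```

Backward elimination runs the same loop in reverse: begin with every feature in the model and repeatedly drop the one whose removal hurts the fit the least.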
What does a positive coefficient on the independent variable in a simple linear regression model indicate?
As the independent variable increases, the dependent variable tends to increase.
There is no relationship between the independent and dependent variables.
The independent variable has no impact on the dependent variable.
As the independent variable increases, the dependent variable tends to decrease.
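As a quick numeric illustration of coefficient sign, fitting data generated with a positive slope yields a positive estimated coefficient; the data below are made-up values close to y = 1 + 2x:

```python
import numpy as np

# Fit a simple linear regression and inspect the sign of the slope.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.9, 5.1, 7.0, 8.8, 11.2])   # roughly y = 1 + 2x

slope, intercept = np.polyfit(x, y, 1)      # degree-1 least squares fit
print(slope > 0)  # True: y tends to increase as x increases
```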
If the Durbin-Watson test statistic is close to 2, what does it suggest about the residuals?
They are homoscedastic
They exhibit a linear pattern
They are normally distributed
They are independent
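For background, the Durbin-Watson statistic is DW = Σ(e_t − e_{t−1})² / Σe_t², which sits near 2 when residuals have no first-order autocorrelation (near 0 for positive, near 4 for negative autocorrelation). A small sketch on synthetic residuals:

```python
import numpy as np

def durbin_watson(resid):
    # DW = sum of squared successive differences over sum of squares.
    resid = np.asarray(resid, dtype=float)
    return float(np.sum(np.diff(resid) ** 2) / np.sum(resid ** 2))

rng = np.random.default_rng(42)
independent = rng.normal(size=5000)   # serially uncorrelated residuals
dw = durbin_watson(independent)
print(round(dw, 2))                   # close to 2 for independent residuals
```

`statsmodels.stats.stattools.durbin_watson` computes the same quantity if that library is available.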
Who is credited as a pioneer in developing the method of least squares, a foundational element of linear regression?
Carl Friedrich Gauss
Blaise Pascal
Alan Turing
Ada Lovelace
Which matplotlib function is commonly used to plot the regression line along with the scatter plot of the data?
plot()
show()
scatter()
hist()
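To see the functions from this question together: scatter() draws the data points and plot() overlays the fitted regression line. A minimal sketch, saving to a file rather than calling show() so it runs headless (the Agg backend and the output filename are assumptions for illustration):

```python
import matplotlib
matplotlib.use("Agg")            # non-interactive backend, an assumption
import matplotlib.pyplot as plt
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])
slope, intercept = np.polyfit(x, y, 1)   # fit the regression line

fig, ax = plt.subplots()
ax.scatter(x, y, label="data")                           # the points
ax.plot(x, slope * x + intercept, color="red",
        label="regression line")                         # the fitted line
ax.legend()
fig.savefig("regression.png")    # show() would display it interactively
```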
Backward elimination in linear regression involves removing features based on what criterion?
The feature with the highest correlation with the target variable
The feature that results in the smallest decrease in model performance
The feature with the lowest p-value
The feature that contributes the least to multicollinearity
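As a companion to the forward direction, backward elimination can be sketched by starting with all features and repeatedly dropping the one whose removal increases the residual sum of squares the least, stopping when every removal would hurt the fit appreciably. The feature names, the 1.5× stopping threshold, and the synthetic data are illustrative assumptions:

```python
import numpy as np

def rss(cols, y):
    # Residual sum of squares of an OLS fit with an intercept.
    X = np.column_stack([np.ones(len(y))] + cols)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    r = y - X @ beta
    return float(r @ r)

rng = np.random.default_rng(1)
n = 60
feats = {"x1": rng.normal(size=n),     # truly related to y
         "x2": rng.normal(size=n),     # truly related to y
         "junk": rng.normal(size=n)}   # irrelevant
y = 1.5 * feats["x1"] - 2.0 * feats["x2"] + 0.1 * rng.normal(size=n)

kept = list(feats)
while len(kept) > 1:
    full = rss([feats[f] for f in kept], y)
    # Candidate to drop: removing it increases RSS the least.
    drop = min(kept, key=lambda f: rss([feats[g] for g in kept if g != f], y))
    if rss([feats[g] for g in kept if g != drop], y) < 1.5 * full:
        kept.remove(drop)              # removal barely hurts: drop it
    else:
        break                          # every removal degrades the fit

print(sorted(kept))   # the irrelevant "junk" column is eliminated
```

Classical backward elimination often uses p-values instead, dropping the feature with the highest (least significant) p-value each round; the RSS criterion above is the "smallest decrease in model performance" framing from the question.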
Who is credited with developing the foundational principles of linear regression?
Marie Curie
Isaac Newton
Albert Einstein
Sir Francis Galton
Which of the following is NOT a benefit of feature selection in linear regression?
Potential for better generalization to new data
Improved model interpretability
Increased risk of overfitting
Reduced computational cost
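To motivate the overfitting angle of this question, here is a small synthetic demonstration: padding a regression with irrelevant features always lowers training error but tends to raise test error, which is why pruning them can generalize better. All data, sizes, and seeds are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(7)
n_train, n_test, n_noise = 30, 200, 20

def make_data(n):
    # One real predictor plus a block of pure-noise columns.
    signal = rng.normal(size=n)
    noise = rng.normal(size=(n, n_noise))
    y = 2.0 * signal + rng.normal(size=n)
    return signal.reshape(-1, 1), noise, y

def fit_predict(Xtr, ytr, Xte):
    # OLS fit on the training design, predictions on the test design.
    X1 = np.column_stack([np.ones(len(ytr)), Xtr])
    beta, *_ = np.linalg.lstsq(X1, ytr, rcond=None)
    return np.column_stack([np.ones(len(Xte)), Xte]) @ beta

Xtr_s, Xtr_n, ytr = make_data(n_train)
Xte_s, Xte_n, yte = make_data(n_test)

pred_lean = fit_predict(Xtr_s, ytr, Xte_s)                 # 1 real feature
pred_bloat = fit_predict(np.hstack([Xtr_s, Xtr_n]), ytr,
                         np.hstack([Xte_s, Xte_n]))        # + 20 noise cols
mse_lean = float(np.mean((yte - pred_lean) ** 2))
mse_bloat = float(np.mean((yte - pred_bloat) ** 2))
print(mse_lean, mse_bloat)   # the bloated model has the larger test error
```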