What does a residual represent in linear regression?
The predicted value of the dependent variable.
The slope of the regression line.
The intercept of the regression line.
The difference between the actual and predicted values of the dependent variable.
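For illustration, a minimal scikit-learn sketch (the data values here are invented) showing that a residual is the actual value of the dependent variable minus the model's prediction:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Toy data: one feature, five observations (values are made up).
X = np.array([[1.0], [2.0], [3.0], [4.0], [5.0]])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

model = LinearRegression().fit(X, y)
y_pred = model.predict(X)

# Residual = actual value - predicted value, one per observation.
residuals = y - y_pred
print(residuals)
```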
What does a high R-squared value indicate?
The model is not a good fit for the data.
The independent variables are not correlated with the dependent variable.
A large proportion of the variance in the dependent variable is explained by the independent variables.
The model is a perfect fit for the data.
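As a quick sketch (same kind of made-up toy data), R-squared can be read off a fitted scikit-learn model's score; values near 1 mean a large share of the variance in y is explained by the predictors:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

X = np.array([[1.0], [2.0], [3.0], [4.0], [5.0]])  # made-up data
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

model = LinearRegression().fit(X, y)
# score() returns R-squared: the proportion of variance in y explained by X.
print(model.score(X, y))
```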
What does the 'fit_intercept' parameter in 'LinearRegression()' control?
Whether to normalize the data before fitting.
Whether to calculate the intercept (bias) of the line.
Whether to calculate the slope of the line.
Whether to use gradient descent for optimization.
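A minimal sketch of the parameter in question; the data is invented, but fit_intercept is scikit-learn's actual keyword argument for LinearRegression:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([1.0, 3.0, 5.0, 7.0])

# fit_intercept=True (the default): the model estimates an intercept (bias) term.
with_bias = LinearRegression(fit_intercept=True).fit(X, y)
print(with_bias.intercept_, with_bias.coef_)   # intercept ~1.0, slope ~2.0

# fit_intercept=False: the line is forced through the origin.
no_bias = LinearRegression(fit_intercept=False).fit(X, y)
print(no_bias.intercept_, no_bias.coef_)       # intercept_ is 0.0
```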
Why is normality of errors an important assumption in linear regression?
It ensures the linearity of the relationship between variables.
It guarantees the homoscedasticity of the errors.
It is necessary for the calculation of the regression coefficients.
It validates the use of hypothesis testing for the model's coefficients.
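A brief statsmodels sketch (simulated data) of why the assumption matters: the t-statistics and p-values reported for each coefficient rely on the errors being normally distributed:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
x = rng.normal(size=100)
y = 1.5 + 2.0 * x + rng.normal(scale=0.5, size=100)  # simulated data

X = sm.add_constant(x)            # add the intercept column
result = sm.OLS(y, X).fit()
# The t-tests and p-values in this summary assume normally distributed errors.
print(result.summary())
```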
What is the purpose of the coefficient of determination (R-squared) in linear regression?
To measure the proportion of variation in the dependent variable explained by the independent variable(s).
To assess the linearity assumption of the model.
To identify the presence of outliers in the data.
To determine the statistical significance of the model.
If a Durbin-Watson test statistic is close to 2, what does it suggest about the residuals?
They are normally distributed.
They are independent.
They are homoscedastic.
They exhibit a linear pattern.
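A small sketch (simulated data) of computing the statistic with statsmodels; a value close to 2 indicates no autocorrelation, i.e. independent residuals:

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.stattools import durbin_watson

rng = np.random.default_rng(1)
x = rng.normal(size=200)
y = 0.5 + 1.2 * x + rng.normal(size=200)   # independent errors by construction

result = sm.OLS(y, sm.add_constant(x)).fit()
# A statistic near 2 suggests the residuals are not autocorrelated.
print(durbin_watson(result.resid))
```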
Which of the following is the general equation for a simple linear regression model?
y = b0 + b1*x1 + b2*x2 + ... + bn*xn
y = b0 * x^b1
y = b0 + b1*x + e
y = e^(b0 + b1*x)
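For reference, the simple model y = b0 + b1*x + e can be fit directly by least squares; a numpy sketch with made-up numbers:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])   # made-up data
y = np.array([2.2, 4.1, 5.9, 8.2, 9.9])

# Least-squares fit of y = b0 + b1*x; polyfit returns [b1, b0] (highest degree first).
b1, b0 = np.polyfit(x, y, 1)
print(b0, b1)
```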
Who is credited as a pioneer in developing the method of least squares, a foundational element of linear regression?
Carl Friedrich Gauss
Alan Turing
Blaise Pascal
Ada Lovelace
Who is credited with developing the foundational principles of linear regression?
Albert Einstein
Isaac Newton
Marie Curie
Sir Francis Galton
Which of the following indicates a strong positive correlation between two variables?
Correlation coefficient (r) close to 0.
Correlation coefficient (r) close to 1.
Correlation coefficient (r) close to -1.
A p-value greater than 0.05.
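A quick numpy sketch (invented data) showing r close to 1 for two variables that rise together:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.0, 4.1, 5.9, 8.2, 10.1])   # increases with x

# corrcoef returns the correlation matrix; the off-diagonal entry is r.
r = np.corrcoef(x, y)[0, 1]
print(r)   # close to 1 -> strong positive correlation
```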