What does the 'fit_intercept' parameter in 'LinearRegression()' control?
Whether to calculate the intercept (bias) of the line.
Whether to calculate the slope of the line.
Whether to normalize the data before fitting.
Whether to use gradient descent for optimization.
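A minimal sketch illustrating what this parameter does, using toy data whose true intercept is 5 (the data values are made up for the example):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Toy data lying exactly on y = 2x + 5, so the true intercept is 5.
X = np.array([[1.0], [2.0], [3.0], [4.0]])
y = 2.0 * X.ravel() + 5.0

# Default: fit_intercept=True estimates the intercept from the data.
with_intercept = LinearRegression(fit_intercept=True).fit(X, y)

# fit_intercept=False forces the fitted line through the origin.
through_origin = LinearRegression(fit_intercept=False).fit(X, y)

print(with_intercept.intercept_)   # close to 5.0
print(through_origin.intercept_)   # exactly 0.0
```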
Why is normality of errors an important assumption in linear regression?
It validates the use of hypothesis testing for the model's coefficients.
It ensures the linearity of the relationship between variables.
It is necessary for the calculation of the regression coefficients.
It guarantees the homoscedasticity of the errors.
What does the linearity assumption in linear regression imply?
The data points are evenly distributed around the regression line.
The independent variables are unrelated to each other.
The relationship between the dependent and independent variables can be best represented by a straight line.
The dependent variable must have a normal distribution.
What does a positive coefficient on the independent variable indicate in a simple linear regression model?
As the independent variable increases, the dependent variable tends to increase.
As the independent variable increases, the dependent variable tends to decrease.
There is no relationship between the independent and dependent variables.
The independent variable has no impact on the dependent variable.
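A short sketch of this interpretation: fitting a line to hypothetical data where y tends to rise with x yields a positive estimated coefficient.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical data where y tends to increase as x increases.
X = np.array([[1.0], [2.0], [3.0], [4.0], [5.0]])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

model = LinearRegression().fit(X, y)

# A positive slope: as x increases, the predicted y increases.
print(model.coef_[0] > 0)   # True
```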
Which Python library is primarily used for numerical computing and provides support for arrays and matrices, essential for Linear Regression calculations?
matplotlib
NumPy
pandas
scikit-learn
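As a sketch of the array and matrix support in question, the ordinary least squares coefficients can be computed directly from the normal equations using NumPy alone (the data here is made up and fits y = 1 + 2x exactly):

```python
import numpy as np

# Normal equations: solve (X^T X) beta = X^T y for beta.
X = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])  # first column of ones = intercept term
y = np.array([3.0, 5.0, 7.0])                        # exactly y = 1 + 2x

beta = np.linalg.solve(X.T @ X, X.T @ y)
print(beta)   # approximately [1.0, 2.0]
```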
Which of the following indicates a strong positive correlation between two variables?
Correlation coefficient (r) close to 0
Correlation coefficient (r) close to 1
A p-value greater than 0.05
Correlation coefficient (r) close to -1
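A minimal sketch with hypothetical data: two variables that rise together produce a correlation coefficient near 1.

```python
import numpy as np

# Hypothetical data where y rises almost linearly with x.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.0, 4.1, 5.9, 8.2, 10.1])

# np.corrcoef returns the correlation matrix; [0, 1] is r between x and y.
r = np.corrcoef(x, y)[0, 1]
print(r)   # close to 1: strong positive correlation
```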
If the coefficient of determination (R-squared) for a linear regression model is 0.8, what does this indicate?
20% of the variation in the dependent variable is explained by the independent variable.
80% of the variation in the dependent variable is explained by the independent variable.
The model is a poor fit for the data.
There is a weak relationship between the independent and dependent variables.
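A brief sketch of computing R-squared for a fitted model, using made-up noisy linear data:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

# Hypothetical noisy linear data.
X = np.array([[1.0], [2.0], [3.0], [4.0], [5.0]])
y = np.array([2.0, 4.5, 5.5, 8.5, 9.5])

model = LinearRegression().fit(X, y)

# R-squared: the fraction of the variance in y explained by the model.
r2 = r2_score(y, model.predict(X))
print(r2)
```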
If a Durbin-Watson test statistic is close to 2, what does it suggest about the residuals?
They are normally distributed.
They are independent.
They exhibit a linear pattern.
They are homoscedastic.
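The statistic can be sketched from its definition (sum of squared successive differences of the residuals over their sum of squares); the helper below is a hand-rolled illustration, not a library API. Independent residuals give a value near 2, while strongly autocorrelated residuals push it toward 0.

```python
import numpy as np

def durbin_watson(residuals):
    # DW = sum of squared successive differences / sum of squared residuals.
    e = np.asarray(residuals)
    return np.sum(np.diff(e) ** 2) / np.sum(e ** 2)

rng = np.random.default_rng(0)

# Uncorrelated (independent) residuals: statistic close to 2.
independent = rng.normal(size=1000)
dw_indep = durbin_watson(independent)
print(dw_indep)

# A random walk is strongly autocorrelated: statistic well below 2.
autocorrelated = np.cumsum(rng.normal(size=1000))
dw_auto = durbin_watson(autocorrelated)
print(dw_auto)
```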
In the context of linear regression, what is an error term?
The difference between the slope and the intercept of the regression line.
The variation in the independent variable.
A mistake made in collecting or entering data.
The difference between the observed value of the dependent variable and the predicted value.
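A minimal sketch of computing these differences (residuals) for a fitted model, on made-up data; note that for OLS with an intercept the residuals sum to zero.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical observations.
X = np.array([[1.0], [2.0], [3.0], [4.0]])
y = np.array([2.0, 4.2, 5.8, 8.1])

model = LinearRegression().fit(X, y)

# Residual = observed value of y minus the model's predicted value.
residuals = y - model.predict(X)
print(residuals)
```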
How does the Mean Squared Error (MSE) penalize larger errors compared to smaller errors?
It squares the errors, giving more weight to larger deviations.
It uses a logarithmic scale to compress larger errors.
It doesn't; all errors are penalized equally.
It takes the absolute value of the errors, ignoring the sign.
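A quick numerical sketch of the squaring effect: one error of 4 contributes as much to the MSE as sixteen errors of 1, whereas the mean absolute error treats it as only four times as costly.

```python
import numpy as np

# Two small errors and one large one (hypothetical values).
errors = np.array([1.0, 1.0, 4.0])

mse = np.mean(errors ** 2)        # (1 + 1 + 16) / 3 = 6.0
mae = np.mean(np.abs(errors))     # (1 + 1 + 4) / 3 = 2.0

print(mse, mae)   # the large error dominates the MSE far more than the MAE
```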