Simple Regression Analysis
Ceteris paribus: other relevant factors being equal; hold all else constant
Simple Regression Model
Two-variable linear regression model or bivariate linear regression model
y=β0 + β1x + u
Dependent variable, explained variable, response variable, predicted variable, or regressand (y)
Independent variable, explanatory variable, control variable, predictor variable, regressor, covariate (x)
Error term, disturbance term; represents all factors other than x that affect y (u)
The slope parameter β1 in the relationship between y and x, holding the other factors in u fixed.
If the other factors in u are held fixed, so that Δu = 0, then x has a linear effect on y: Δy = β1Δx. A one-unit change in x has the same effect on y regardless of the initial value of x.
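A minimal sketch of the model above: we simulate y = β0 + β1x + u with made-up parameter values (β0 = 1, β1 = 2; nothing here comes from the text) and check that, holding u fixed, a one-unit change in x moves y by exactly β1.

```python
# Hypothetical simulation of the simple regression model y = b0 + b1*x + u.
# Parameter values are illustrative, not from the text.
import random

random.seed(0)
b0, b1 = 1.0, 2.0                                # intercept and slope parameters
x = [random.uniform(0, 10) for _ in range(100)]  # observed explanatory variable
u = [random.gauss(0, 1) for _ in range(100)]     # unobserved error term
y = [b0 + b1 * xi + ui for xi, ui in zip(x, u)]

# Holding u fixed (delta u = 0), a one-unit change in x changes y by b1:
delta_y = (b0 + b1 * (x[0] + 1) + u[0]) - y[0]
```

Because u cancels in the difference, `delta_y` equals β1 regardless of the starting value of x.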
Assumption 2.5 (Expected value of u)
E(u) = 0. A statement about the distribution of the unobservables in the population: the average value of u in the population is zero. On its own this says nothing about how u relates to x; as long as an intercept is included, it is just a normalization.
Assumption 2.6 Zero Conditional mean assumption
E(u|x) = E(u) = 0. The average value of the unobservables is the same across all slices of the population determined by the value of x. Example: average ability must be the same for all education levels.
Population Regression Function
(2.8) E(y|x) = β0 + β1x: E(y|x) is a linear function of x. The PRF gives the relationship between the average level of y at different levels of x.
β0 + β1x is called the systematic part of y, the part explained by x. The unsystematic part, u, is the part not explained by x.
Σ(xi − x̄)² > 0: the sum of squared deviations of x must be positive (the xi must vary in the sample) for the OLS slope estimate to exist.
β̂1: the sample covariance between x and y divided by the sample variance of x
The residual for observation i is the difference between the actual yi and its fitted value: ûi = yi − ŷi = yi − β̂0 − β̂1xi.
First Order Conditions for the OLS estimates
(2.14) and (2.15); the solutions to the OLS first-order conditions are given by (2.17) and (2.19). "Ordinary least squares" comes from the fact that these estimates minimize the sum of squared residuals (an optimization problem solved with calculus).
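The solutions to the first-order conditions can be sketched from scratch: the slope is the sample covariance of x and y over the sample variance of x, and the intercept follows from ȳ = β̂0 + β̂1x̄. The data points below are made up so the true line (y = 3 + 2x, zero residuals) is known.

```python
# A minimal sketch of the OLS solutions (2.17, 2.19) for simple regression.
def ols(x, y):
    n = len(x)
    xbar = sum(x) / n
    ybar = sum(y) / n
    # slope: sample covariance of (x, y) over sample variance of x
    b1 = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / \
         sum((xi - xbar) ** 2 for xi in x)
    # intercept: forces the line through (xbar, ybar)
    b0 = ybar - b1 * xbar
    return b0, b1

# Illustrative data lying exactly on y = 3 + 2x:
b0_hat, b1_hat = ols([1, 2, 3, 4], [5, 7, 9, 11])
```

Since the points lie exactly on a line, OLS recovers the intercept 3 and slope 2 with every residual equal to zero.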
OLS Regression Line
Emphasizes that this estimated regression function estimates the fixed population regression function. The slope estimate can be written as β̂1 = Δŷ/Δx.
Fitted values and residuals
ûi = yi − ŷi (actual y value minus predicted y value)
If ûi > 0, the line underpredicts yi.
If ûi < 0, the line overpredicts yi.
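The sign convention for residuals can be checked on a toy example; the fitted line ŷ = 1 + 0.5x and the data below are made up for illustration, not taken from the text.

```python
# Sketch: fitted values and residuals for an assumed fitted line yhat = 1 + 0.5x.
b0_hat, b1_hat = 1.0, 0.5          # hypothetical OLS estimates
x = [2, 4, 6]
y = [2.5, 2.0, 4.5]

yhat = [b0_hat + b1_hat * xi for xi in x]    # fitted values: [2.0, 3.0, 4.0]
uhat = [yi - yh for yi, yh in zip(y, yhat)]  # residuals

# uhat > 0: the line underpredicts y; uhat < 0: it overpredicts.
signs = ["under" if r > 0 else "over" for r in uhat]
```

The first and third points sit above the line (positive residual, underprediction); the second sits below it (negative residual, overprediction).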
Sum of OLS residuals
The sum of the OLS residuals is zero; the OLS estimates are chosen precisely so that this property holds (it is a first-order condition).
Covariance of OLS Residual
The sample covariance between the regressors and the OLS residuals is zero.
x̄, ȳ
The point (x̄, ȳ) is always on the OLS regression line.
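The three algebraic properties above can be verified numerically on any sample; the data below are made up for illustration.

```python
# Sketch verifying the algebraic properties of OLS on a small made-up sample:
# (1) the residuals sum to zero, (2) the sample covariance between x and the
# residuals is zero, (3) the point (xbar, ybar) lies on the fitted line.
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.0, 2.5, 2.1, 3.8, 4.0]

n = len(x)
xbar, ybar = sum(x) / n, sum(y) / n
b1 = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / \
     sum((xi - xbar) ** 2 for xi in x)
b0 = ybar - b1 * xbar
uhat = [yi - (b0 + b1 * xi) for xi, yi in zip(x, y)]

sum_resid = sum(uhat)                                            # property (1)
cov_x_resid = sum((xi - xbar) * ui for xi, ui in zip(x, uhat))   # property (2)
on_line = b0 + b1 * xbar                                         # property (3): equals ybar
```

All three hold exactly (up to floating-point error) because they are consequences of the first-order conditions, not of any assumption about the data.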
SST (total sum of squares)
The sum of squared differences between the yi and the average y value. Measures the total sample variation (how spread out the yi are in the sample).
SSE (explained sum of squares)
The sum of squared differences between the fitted values ŷi and the average y value. Measures the sample variation in the ŷi.
SSR (residual sum of squares)
The sum of squared OLS residuals. Measures the sample variation in the ûi.
Goodness of fit
The R-squared of the regression is the ratio of the explained variation to the total variation: R² = SSE/SST = 1 − SSR/SST. It is the fraction of the sample variation in y that is explained by x.
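The decomposition SST = SSE + SSR and both forms of R² can be checked directly; the data below are illustrative.

```python
# Sketch of the goodness-of-fit decomposition SST = SSE + SSR and R-squared,
# computed from scratch on made-up data.
x = [1.0, 2.0, 3.0, 4.0]
y = [1.1, 1.9, 3.2, 3.8]

n = len(x)
xbar, ybar = sum(x) / n, sum(y) / n
b1 = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / \
     sum((xi - xbar) ** 2 for xi in x)
b0 = ybar - b1 * xbar
yhat = [b0 + b1 * xi for xi in x]

SST = sum((yi - ybar) ** 2 for yi in y)                     # total variation
SSE = sum((yh - ybar) ** 2 for yh in yhat)                  # explained variation
SSR = sum((yi - yh) ** 2 for yi, yh in zip(y, yhat))        # residual variation
r2 = SSE / SST                                              # equals 1 - SSR/SST
```

The identity SST = SSE + SSR holds because the cross term vanishes by the OLS first-order conditions, which is why the two formulas for R² agree.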
Incorporating Nonlinearities in simple regression (Constant elasticity model)
The log model gives an approximately constant percentage effect: log(wage) = β0 + β1educ + u
log(wage) = 0.584 + 0.083 educ: an increase of 1 year of education yields approximately an 8.3% increase in wage
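A quick check of the semi-elasticity interpretation: with the coefficient .083 on educ from the wage example, one more year of education raises log(wage) by .083, which is approximately an 8.3% increase in wage; the exact proportional change is exp(.083) − 1, slightly larger.

```python
# Sketch of the percentage interpretation of a log-level coefficient.
import math

b1 = 0.083                      # coefficient on educ from the example
approx_pct = b1                 # approximate proportional effect (8.3%)
exact_pct = math.exp(b1) - 1    # exact proportional effect (~8.65%)
```

The 100·β1 approximation works well for small coefficients and degrades as the coefficient grows.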
Meaning of "linear" regression
"Linear" places no restrictions on how y and x relate to the original explained and explanatory variables of interest; the model need only be linear in the parameters. Examples of models that are not linear in the parameters: those involving β1² or 1/β1.
Unbiasedness of OLS (assumption SLR 1):
Linear in Parameters
Linear in parameters: in the population model, the dependent variable is related to the independent variable and the error term as y = β0 + β1x + u
Unbiasedness of OLS (Assumption SLR 2) Random Sampling
We use a random sample of size n from the population model
Unbiasedness of OLS (Assumption SLR3) Zero conditional mean
E(u|x) = 0. Random sampling allows us to derive the statistical properties of the OLS estimators conditional on the values of the xi in our sample; the assumption means ui is mean-independent of xi.
Unbiasedness of OLS (Assumption SLR4) Variation in IV
In the sample, the values of the independent variable xi are not all equal to the same constant; this requires some variation in x in the population.
Unbiasedness of OLS (Assumption SLR 5) Homoskedasticity
Var(u|x) = σ²: the error variance is the same for all values of x
Under this assumption σ² is also the unconditional variance of u, so σ² is often called the error variance or disturbance variance. A larger σ² means that the distribution of the unobservables affecting y is more spread out.
Estimating Error variance
σ̂² = SSR/(n − 2), where n − 2 is the sample size minus the number of parameters being estimated (β̂0 and β̂1).
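The error-variance estimator can be computed directly from the OLS residuals; the data below are made up for illustration.

```python
# Sketch of the error-variance estimator sigma2_hat = SSR / (n - 2); the
# divisor n - 2 reflects the two estimated parameters (b0_hat and b1_hat).
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [1.2, 1.9, 3.3, 3.7, 5.1]

n = len(x)
xbar, ybar = sum(x) / n, sum(y) / n
b1 = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / \
     sum((xi - xbar) ** 2 for xi in x)
b0 = ybar - b1 * xbar
uhat = [yi - (b0 + b1 * xi) for xi, yi in zip(x, y)]

SSR = sum(ui ** 2 for ui in uhat)
sigma2_hat = SSR / (n - 2)       # unbiased estimator of the error variance
```

Dividing by n rather than n − 2 would be biased downward, because the residuals satisfy two restrictions (they sum to zero and are uncorrelated with x).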