Week 5 - Introduction to Autocorrelation
Terms in this set (14)
E(u) = 0
var(u) = ....
var(u) = Ω
if the Gauss-Markov assumptions hold, then Ω = .....
Ω = σ^2 I
Ω = σ^2 I
What does the variance covariance matrix look like?
var(u1), ..., var(uT) on the diagonal
cov(u1,u2), ..., cov(u1,uT) on the off-diagonals
i.e. variances on the diagonal, the covariance at each lag on the off-diagonals
with heteroskedasticity, Ω looks like .....
var(u1), ..., var(uT) on the diagonal (unequal variances)
0s on the off-diagonals
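As a quick sketch of the two Ω shapes above (the sample size and variance values are illustrative, not from the course), both can be built with numpy:

```python
import numpy as np

T = 4  # small sample size so the matrices are easy to print

# Gauss-Markov case: homoskedastic, no autocorrelation -> Omega = sigma^2 * I
sigma2 = 2.0
omega_gm = sigma2 * np.eye(T)

# Heteroskedasticity only: unequal variances on the diagonal,
# 0s on the off-diagonals (errors still uncorrelated)
variances = np.array([1.0, 2.5, 0.5, 3.0])  # var(u1), ..., var(uT)
omega_het = np.diag(variances)

print(omega_gm)
print(omega_het)
```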
What is autocorrelation?
autocorrelation in u is a breach of which assumption?
TS5 (no serial correlation)
cov(ut, ut-s | xt, xt-s) ≠ 0
What is autocorrelation?
cov(ut, ut-s | xt, xt-s) ≠ 0
what does autocorrelation imply for the variance covariance matrix Ω?
var(u1), ..., var(uT) on the diagonal
the cov(ui,uj) terms do not fall away to 0 (off-diagonals)
What is autocorrelation?
- autocorrelation implies var(u1), ..., var(uT) on the diagonal,
with cov(ui,uj) terms that do not fall away to 0 (off-diagonals)
- therefore, Ω ≠ ....
Ω ≠ σ^2 I
What is autocorrelation?
Ω ≠ σ^2 I
this implies what? (2)
1. β^OLS is no longer efficient (no longer BLUE)
2. the standard variance formula for β^ is invalid
Ω ≠ σ^2 I
this implies 1. β^OLS is no longer efficient (no longer BLUE)
2. the standard variance formula for β^ is invalid
- we therefore need to create a structure for Ω that overcomes these two issues
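The second point can be seen from the general OLS variance (a standard sandwich result, stated here for context rather than taken from the cards):

```latex
\operatorname{var}(\hat\beta \mid X) = (X'X)^{-1} X'\,\Omega\, X\,(X'X)^{-1}
```

Only when Ω = σ^2 I does this collapse to the usual σ^2 (X'X)^{-1}; with autocorrelation it does not, so the usual standard errors are wrong.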
Structure of Ω -
assume ut follows an AR(1) process
ut = 0 + ρut-1 + εt,   εt ~ iid(0, σ^2ε)
* for an AR(1) process, E(ut) = .........
E(ut) = 0 / (1-ρ) = 0 (the intercept is 0, so the mean is 0)
Structure of Ω -
assume ut follows an AR(1) process
ut = 0 + ρut-1 + εt,   εt ~ iid(0, σ^2ε)
* for an AR(1) process,
var(ut) = ........
var(ut) = σ^2 = σ^2ε / (1-ρ^2)
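A one-line derivation of this variance (using stationarity, so var(ut) = var(ut-1) = σ^2, independence of εt from ut-1, and |ρ| < 1):

```latex
\operatorname{var}(u_t) = \rho^2 \operatorname{var}(u_{t-1}) + \sigma^2_\varepsilon
\;\Rightarrow\; \sigma^2 = \rho^2\sigma^2 + \sigma^2_\varepsilon
\;\Rightarrow\; \sigma^2 = \frac{\sigma^2_\varepsilon}{1-\rho^2}
```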
Structure of Ω -
assume ut follows an AR(1) process
ut = 0 + ρut-1 + εt,   εt ~ iid(0, σ^2ε)
* for an AR(1) process,
corr(ut,ut-1) = ....
corr(ut,ut-2) = .....
cov(ut,ut-1) = ....
cov(ut,ut-2) = .....
corr(ut,ut-1) = ρ
corr(ut,ut-2) = ρ^2
cov(ut,ut-1) = ρ * σ^2
cov(ut,ut-2) = ρ^2 * σ^2
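These AR(1) moments can be checked by simulation (the values of ρ, σ^2ε, and the sample length are illustrative choices, not from the course):

```python
import numpy as np

rng = np.random.default_rng(0)
rho, sigma_eps = 0.6, 1.0
T = 200_000  # long sample so sample moments sit close to theory

# simulate u_t = rho * u_{t-1} + eps_t,  eps_t ~ iid N(0, sigma_eps^2)
eps = rng.normal(0.0, sigma_eps, T)
u = np.empty(T)
u[0] = eps[0]
for t in range(1, T):
    u[t] = rho * u[t - 1] + eps[t]
u = u[1000:]  # drop a burn-in so the series is approximately stationary

var_u = u.var()
corr1 = np.corrcoef(u[1:], u[:-1])[0, 1]
corr2 = np.corrcoef(u[2:], u[:-2])[0, 1]

print(var_u)   # close to sigma_eps^2 / (1 - rho^2)
print(corr1)   # close to rho
print(corr2)   # close to rho^2
```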
Structure of Ω -
assume ut follows an AR(1) process
ut = 0 + ρut-1 + εt,   εt ~ iid(0, σ^2ε)
* for an AR(1) process,
corr(ut,ut-1) = ρ
corr(ut,ut-2) = ρ^2
cov(ut,ut-1) = ρ * σ^2
cov(ut,ut-2) = ρ^2 * σ^2
what does the variance covariance matrix of Ω look like?
Ω = (σ^2 on the diagonal; ρσ^2, ρ^2σ^2, ... on the off-diagonals) - variances on the diagonal, the covariance at each lag on the off-diagonals
Ω = (σ^2 on the diagonal; ρσ^2, ρ^2σ^2, ... on the off-diagonals)
how can this be simplified?
Ω = σ^2 × (1s on the diagonal; ρ, ρ^2, ... on the off-diagonals)
Ω = σ^2 × (1s on the diagonal; ρ, ρ^2, ... on the off-diagonals)
this assumes no heteroskedasticity (constant variance) and is the basis of designing what?
designing autocorrelation-robust inference
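The simplified Ω above is just the matrix with (i,j) entry σ^2 ρ^|i-j|, which can be built directly (parameter values illustrative, not from the course):

```python
import numpy as np

rho, sigma_eps = 0.6, 1.0
sigma2 = sigma_eps**2 / (1 - rho**2)  # var(u_t) for a stationary AR(1)
T = 5

# Omega[i, j] = sigma^2 * rho^|i-j|: constant variance on the diagonal,
# geometrically decaying covariances on the off-diagonals
i = np.arange(T)
omega = sigma2 * rho ** np.abs(i[:, None] - i[None, :])

print(np.round(omega, 3))
```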