Autocorrelation
● In the regression context, the classical linear regression model assumes that there is no correlation between the disturbance terms. Symbolically:

$$\mathrm{Cov}(u_i, u_j) = E(u_i u_j) = 0, \quad i \neq j$$
● If there is dependence between the error terms, we have autocorrelation. Symbolically:

$$\mathrm{Cov}(u_i, u_j) \neq 0, \quad i \neq j$$
● Forms of autocorrelation:
(1) first-order autocorrelation: $u_t = f(u_{t-1})$
(2) higher-order autocorrelation: $u_t = f(u_{t-1}, u_{t-2}, \ldots)$
● Usually we assume that the autocorrelation between the error terms is linear, namely:

$$u_t = a_1 u_{t-1} + v_t$$

where $v_t$ satisfies all the CLRM assumptions.
● We concentrate on first-order autocorrelation: $u_t = a_1 u_{t-1} + v_t$. The estimation formula for $a_1$ is:

$$\hat{a}_1 = \frac{\sum_{t=2}^{T} u_t u_{t-1}}{\sum_{t=2}^{T} u_{t-1}^2}$$

The coefficient of autocorrelation is:

$$\hat{\rho} = \frac{\sum_{t=2}^{T} u_t u_{t-1}}{\sum_{t=1}^{T} u_t^2} \approx \hat{a}_1$$
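The two estimation formulas above can be checked numerically. The sketch below simulates an AR(1) disturbance series and applies both formulas; the coefficient value $a_1 = 0.6$, the sample size, and the use of NumPy are illustrative choices, not from the text.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate an AR(1) disturbance series u_t = a1 * u_{t-1} + v_t
# (a1 = 0.6 is an arbitrary illustrative value).
a1, T = 0.6, 5000
v = rng.normal(0.0, 1.0, T)
u = np.zeros(T)
for t in range(1, T):
    u[t] = a1 * u[t - 1] + v[t]

# Estimation formula for a1: sum u_t u_{t-1} / sum u_{t-1}^2  (t = 2..T)
a1_hat = np.sum(u[1:] * u[:-1]) / np.sum(u[:-1] ** 2)

# Coefficient of autocorrelation: sum u_t u_{t-1} / sum u_t^2  (t = 1..T)
rho_hat = np.sum(u[1:] * u[:-1]) / np.sum(u ** 2)

print(a1_hat, rho_hat)  # both close to the true a1 = 0.6
```

With a long series the two estimates are nearly identical, which is why the notes treat $\hat{\rho}$ and $\hat{a}_1$ interchangeably.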
● The first-order autocorrelation of $u$ can be expressed as:

$$u_t = \rho u_{t-1} + v_t$$
● $\rho$ takes values in $[-1, 1]$: when $\rho > 0$, the errors are positively autocorrelated; when $\rho < 0$, negatively autocorrelated.
● If $u_t$ is a first-order autoregressive (AR(1)) process, its mean, variance, and covariances are as follows:
$$E(u_t) = E(\rho u_{t-1} + v_t) = \rho E(u_{t-1}) + E(v_t) = 0$$
$$\mathrm{Var}(u_t) = E(u_t^2) = E(\rho u_{t-1} + v_t)^2 = \rho^2 \mathrm{Var}(u_{t-1}) + \sigma_v^2 \quad\Rightarrow\quad \sigma_u^2 = \frac{\sigma_v^2}{1 - \rho^2}$$
$$\mathrm{Cov}(u_t, u_{t-1}) = E(u_t u_{t-1}) = E[(\rho u_{t-1} + v_t) u_{t-1}] = \rho \sigma_u^2$$

$$\mathrm{Cov}(u_t, u_{t-s}) = \rho^s \sigma_u^2$$

● The covariance matrix of the disturbances is therefore:

$$E(UU') = \sigma_u^2 \begin{pmatrix} 1 & \rho & \rho^2 & \cdots & \rho^{T-1} \\ \rho & 1 & \rho & \cdots & \rho^{T-2} \\ \rho^2 & \rho & 1 & \cdots & \rho^{T-3} \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ \rho^{T-1} & \rho^{T-2} & \rho^{T-3} & \cdots & 1 \end{pmatrix}$$
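A quick simulation can confirm these AR(1) moment formulas. The parameter values $\rho = 0.5$, $\sigma_v = 1$, the burn-in, and the series length below are illustrative choices for this sketch, not from the text.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate a long AR(1) series u_t = rho * u_{t-1} + v_t
rho, sigma_v, T = 0.5, 1.0, 100_000
v = rng.normal(0.0, sigma_v, T)
u = np.zeros(T)
for t in range(1, T):
    u[t] = rho * u[t - 1] + v[t]
u = u[1000:]  # drop a burn-in so the series is close to stationary

sigma_u2 = sigma_v**2 / (1 - rho**2)   # theoretical Var(u_t) = 4/3
cov1 = np.mean(u[1:] * u[:-1])         # sample Cov(u_t, u_{t-1})
cov2 = np.mean(u[2:] * u[:-2])         # sample Cov(u_t, u_{t-2})

print(np.var(u), sigma_u2)             # sample vs. sigma_v^2 / (1 - rho^2)
print(cov1, rho * sigma_u2)            # sample vs. rho * sigma_u^2
print(cov2, rho**2 * sigma_u2)         # sample vs. rho^2 * sigma_u^2
```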
OLS estimation in the presence of autocorrelation
● The OLS estimator is still unbiased
● The variance of the OLS estimator $\hat{\beta}_1$ is not minimum any more
● For the model in deviation form (with $x_t = X_t - \bar{X}$):

$$\mathrm{Var}(\hat{\beta}_1) = E(\hat{\beta}_1 - \beta_1)^2 = E\left(\frac{\sum_{t=1}^{T} x_t u_t}{\sum_{t=1}^{T} x_t^2}\right)^2 = \frac{1}{\left(\sum_{t=1}^{T} x_t^2\right)^2} E\left(x_1 u_1 + x_2 u_2 + \cdots + x_T u_T\right)^2$$

$$= \frac{\sum_{t=1}^{T} x_t^2 E(u_t^2) + \sum_{t \neq s} x_t x_s E(u_t u_s)}{\left(\sum_{t=1}^{T} x_t^2\right)^2} = \frac{\sigma^2}{\sum_{t=1}^{T} x_t^2} + \frac{\sum_{t \neq s} x_t x_s E(u_t u_s)}{\left(\sum_{t=1}^{T} x_t^2\right)^2}$$
● When $u_t$ and $X_t$ are both positively autocorrelated, the variance of $\hat{\beta}_1$ is underestimated; the OLS estimator does not have minimum variance.
Detection of autocorrelation
1. Graphical method
(1) run the regression on the given data, calculate the residuals $\hat{u}_t$, and draw the figure for the residuals
(2) analyze the residual figures
2. DW (Durbin-Watson) test
● The Durbin-Watson d statistic:

$$d = \frac{\sum_{t=2}^{n} (\hat{u}_t - \hat{u}_{t-1})^2}{\sum_{t=1}^{n} \hat{u}_t^2}$$
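The d statistic is straightforward to compute from a residual series. A minimal sketch (NumPy and the example autocorrelation value $\rho = 0.7$ are my assumptions):

```python
import numpy as np

def durbin_watson(resid):
    """d = sum of squared successive differences over the RSS."""
    resid = np.asarray(resid, dtype=float)
    return np.sum(np.diff(resid) ** 2) / np.sum(resid ** 2)

rng = np.random.default_rng(3)

# Independent residuals: d sits near 2
e = rng.normal(size=10_000)
d_iid = durbin_watson(e)

# Positively autocorrelated residuals (rho = 0.7, illustrative):
# d sits near 2 * (1 - 0.7) = 0.6
u = np.zeros(10_000)
v = rng.normal(size=10_000)
for t in range(1, 10_000):
    u[t] = 0.7 * u[t - 1] + v[t]
d_pos = durbin_watson(u)

print(d_iid, d_pos)
```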
● The d statistic is simply the ratio of the sum of squared differences in successive residuals to the RSS. A great advantage of the d statistic is its simplicity.
● Assumptions underlying the d statistic:
(1) The regression model includes an intercept term
(2) The explanatory variables, the X's, are non-stochastic, or fixed in repeated sampling
(3) The disturbances $u_t$ are generated by the first-order autoregressive scheme: $u_t = \rho u_{t-1} + \varepsilon_t$
(4) The error term $u_t$ is assumed to be normally distributed
(5) The regression model does not include lagged values of the dependent variable as one of the explanatory variables
(6) There are no missing observations in the data
● The probability distribution of the d statistic depends in a complicated way on the X values present in a given sample
● There is no unique critical value that will lead to the rejection or acceptance of the null hypothesis that there is no first-order serial correlation in the disturbances $u_t$
● DW critical value: expanding the numerator,

$$d = \frac{\sum (\hat{u}_t - \hat{u}_{t-1})^2}{\sum \hat{u}_t^2} = \frac{\sum \hat{u}_t^2 + \sum \hat{u}_{t-1}^2 - 2 \sum \hat{u}_t \hat{u}_{t-1}}{\sum \hat{u}_t^2}$$

and, since $\sum \hat{u}_t^2 \approx \sum \hat{u}_{t-1}^2$ for large samples,

$$d \approx 2\left(1 - \frac{\sum \hat{u}_t \hat{u}_{t-1}}{\sum \hat{u}_t^2}\right)$$
● Decision regions for the d statistic:

  0 < d < d_L:            positive serial correlation
  d_L ≤ d ≤ d_U:          inconclusive
  d_U < d < 4 - d_U:      no serial correlation
  4 - d_U ≤ d ≤ 4 - d_L:  inconclusive
  4 - d_L < d < 4:        negative serial correlation
● Now let's define

$$\hat{\rho} = \frac{\sum \hat{u}_t \hat{u}_{t-1}}{\sum \hat{u}_t^2}$$

so that $d \approx 2(1 - \hat{\rho})$ and, since $-1 \le \hat{\rho} \le 1$, we have $0 \le d \le 4$.
● If $\hat{\rho} = 0$, then $d = 2$: there is no serial correlation.
● The closer d is to 0, the greater the evidence of positive serial correlation
● The closer d is to 4, the greater the evidence of negative serial correlation
● The mechanics of the Durbin-Watson test are as follows: (1) run the OLS regression and obtain the residuals; (2) compute the d statistic; (3) for the given sample size and given number of explanatory variables, find the critical $d_L$ and $d_U$ values; (4) follow the decision rules to reject or accept the null.
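The four steps can be sketched as follows. The critical values $d_L$ and $d_U$ passed below are placeholders for illustration only; in practice they come from the Durbin-Watson tables for the given sample size and number of explanatory variables. The data-generating values are also my choices.

```python
import numpy as np

def dw_decision(d, d_L, d_U):
    """Map a d statistic onto the decision regions described above."""
    if d < d_L:
        return "positive serial correlation"
    if d < d_U:
        return "inconclusive"
    if d <= 4 - d_U:
        return "no serial correlation"
    if d <= 4 - d_L:
        return "inconclusive"
    return "negative serial correlation"

# (1) run an OLS regression on simulated data and obtain the residuals
rng = np.random.default_rng(4)
T = 40
x = rng.normal(size=T)
y = 1.0 + 2.0 * x + rng.normal(size=T)      # independent errors in this demo
X = np.column_stack([np.ones(T), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta

# (2) compute the d statistic
d = np.sum(np.diff(resid) ** 2) / np.sum(resid ** 2)

# (3)-(4) compare with critical values (placeholders, not real table values)
print(d, dw_decision(d, d_L=1.44, d_U=1.54))
```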
● The d test has a great drawback: if it falls in the indecisive zone, one cannot conclude that first-order autocorrelation does or does not exist.