Exercises # 6

6.1. Let $\{X_t\}$ be a stationary process with mean 0. Define $P_n = P_{L(X_1,\dots,X_n)}$ and $\hat X_{n+1} = P_n X_{n+1}$.
(a) Show that, for $h > 1$, $P_n X_{n+h} = P_n \hat X_{n+h}$.
(b) Show that
\[ P_n X_{n+h} = \sum_{j=h}^{n+h-1} \vartheta_{n+h-1,j}\,\bigl(X_{n+h-j} - \hat X_{n+h-j}\bigr), \]
where the $\vartheta_{n,j}$ are the coefficients determined by the innovations algorithm.
(c) Let $\{X_t\}$ be a causal ARMA(p, q) process, and let $n > \max\{p, q\}$. Show that
\[ P_n X_{n+h} = \sum_{j=1}^{h-1} \phi_j\, P_n X_{n+h-j} + \sum_{j=h}^{p} \phi_j X_{n+h-j} + \sum_{j=h}^{q} \vartheta_{n+h-1,j}\,\bigl(X_{n+h-j} - \hat X_{n+h-j}\bigr), \]
where a sum is considered empty if its lower index is larger than its upper index, and $\phi_j \equiv 0$ for $j > p$.
(d) The previous formula gives a recursive method for computing $P_n X_{n+h}$, starting from $h = 1$.
Write explicitly the formula for $P_n X_{n+2}$ in the case of an ARMA(1,1) process.
(e) Define $\sigma^2(h) = E(X_{n+h} - P_n X_{n+h})^2$, the $h$-step prediction error. Prove that, for an ARMA(p, q) process,
\[ \sigma^2(2) = v_{n+1} + (\phi_1 + \vartheta_{n+1,1})^2 v_n \qquad \text{for } n > \max\{p, q\}. \]
Hint: compute $\sigma^2(2)$ by writing $X_{n+2} - P_n X_{n+2} = (X_{n+2} - \hat X_{n+2}) + (\hat X_{n+2} - P_n X_{n+2})$.
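[Not part of the exercise.] The recursion of part (c) can be sketched numerically. The function below is a hypothetical illustration: it assumes the one-step predictors $\hat X_t$ and the coefficients $\vartheta_{m,j}$ of the innovations algorithm have already been computed, and it requires $n > \max\{p, q\}$ for the formula to apply.

```python
import numpy as np

def arma_h_step(phi, H, X, Xhat, Theta):
    """Sketch of the recursion in Exercise 6.1(c), for h = 1..H.
    Assumed (hypothetical) inputs:
      phi   : AR coefficients phi_1..phi_p,
      X     : observations X_1..X_n (so X[k-1] is X_k),
      Xhat  : one-step predictors Xhat_1..Xhat_n from the innovations
              algorithm (Xhat_1 = 0),
      Theta : Theta[m-1, j-1] = theta_{m,j} from the innovations algorithm.
    Valid only for n > max(p, q)."""
    n, p, q = len(X), len(phi), Theta.shape[1]
    pred = {}                          # pred[h] = P_n X_{n+h}
    for h in range(1, H + 1):
        s = 0.0
        for j in range(1, h):          # AR part over predicted values
            if j <= p:                 # phi_j = 0 for j > p
                s += phi[j - 1] * pred[h - j]
        for j in range(h, p + 1):      # AR part over observed values
            s += phi[j - 1] * X[n + h - j - 1]
        for j in range(h, q + 1):      # MA part over innovations
            s += Theta[n + h - 2, j - 1] * (X[n + h - j - 1] - Xhat[n + h - j - 1])
        pred[h] = s
    return pred
```

For a pure AR(1), where all $\vartheta$ terms vanish, the recursion collapses to $P_n X_{n+h} = \phi_1^h X_n$, which is a quick sanity check on the indexing.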
6.2. Let $\tilde P_n$ be the projection on $M_n$, the linear space generated by $\{X_j,\ j \le n\}$.
Assume that $\{X_t\}$ is a causal and invertible ARMA(p, q) process, i.e. we can write
\[ X_t = \sum_{j=0}^{\infty} \psi_j Z_{t-j}, \qquad Z_t = X_t + \sum_{j=1}^{\infty} \pi_j X_{t-j}. \]
(a) Show that $\tilde P_n X_{n+h}$ can be computed recursively from
\[ \tilde P_n X_{n+h} = -\sum_{j=1}^{h-1} \pi_j\, \tilde P_n X_{n+h-j} - \sum_{j=h}^{\infty} \pi_j X_{n+h-j}. \]
(b) Show also that
\[ \tilde P_n X_{n+h} = \sum_{j=h}^{\infty} \psi_j Z_{n+h-j}. \]
(c) Conclude that
\[ \tilde\sigma^2(h) = E(X_{n+h} - \tilde P_n X_{n+h})^2 = \sigma^2 \sum_{j=0}^{h-1} \psi_j^2. \]
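[Not part of the exercise.] The prediction error in (c) is easy to evaluate once the $\psi_j$ are known. The sketch below computes the $\psi$ weights by the standard recursion $\psi_j = \theta_j + \sum_k \phi_k \psi_{j-k}$ (with $\theta_0 = 1$) and then evaluates $\tilde\sigma^2(h)$; the function names are hypothetical.

```python
import numpy as np

def psi_weights(phi, theta, J):
    """Causal MA(infinity) weights psi_0..psi_J of an ARMA(p, q) process,
    via psi_j = theta_j + sum_{k=1}^{min(j,p)} phi_k psi_{j-k},
    with theta_0 = 1 and theta_j = 0 for j > q."""
    p, q = len(phi), len(theta)
    psi = np.zeros(J + 1)
    for j in range(J + 1):
        psi[j] = 1.0 if j == 0 else (theta[j - 1] if j <= q else 0.0)
        for k in range(1, min(j, p) + 1):
            psi[j] += phi[k - 1] * psi[j - k]
    return psi

def hstep_mse(phi, theta, sigma2, h):
    """Ideal h-step prediction error of Exercise 6.2(c):
    sigma_tilde^2(h) = sigma^2 * sum_{j=0}^{h-1} psi_j^2."""
    psi = psi_weights(phi, theta, h - 1)
    return sigma2 * np.sum(psi ** 2)
```

For an ARMA(1,1) with $\phi_1 = 0.5$, $\theta_1 = 0.2$, the recursion gives $\psi_0 = 1$, $\psi_1 = 0.7$, $\psi_2 = 0.35$, so $\tilde\sigma^2(2) = 1.49\,\sigma^2$.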
6.3. Let $\{X_t\}$ be a stationary process with mean 0. Let $u_i(t)$ ($i \le t < n$) be the difference between the value of $X_{n-t+i}$ and its best linear prediction based on the previous $i$ values, i.e.
\[ u_i(t) = X_{n-t+i} - P_{L(X_{n-t},\dots,X_{n-t+i-1})} X_{n-t+i}. \]
Symmetrically, let $v_i(t)$ ($i \le t < n$) be the difference between the value of $X_{n-t}$ and its best linear prediction based on the following $i$ values, i.e.
\[ v_i(t) = X_{n-t} - P_{L(X_{n-t+1},\dots,X_{n-t+i})} X_{n-t}. \]
For $i = 0$, one defines $u_0(t) = v_0(t) = X_{n-t}$.
(a) Show, using the definition of projection, that
\[ u_i(t) = X_{n-t+i} - \sum_{k=1}^{i} \phi_{i,k} X_{n-t+i-k}, \qquad v_i(t) = X_{n-t} - \sum_{k=1}^{i} \phi_{i,k} X_{n-t+k}, \]
and derive the equations satisfied by the coefficients $\phi_{i,j}$.
(b) Recalling that the coefficients $\phi_{i,j}$ can be chosen according to the Durbin–Levinson algorithm, which yields, for $j = 1, \dots, i-1$ ($i > 1$),
\[ \phi_{i,j} = \phi_{i-1,j} - \phi_{i,i}\,\phi_{i-1,i-j}, \]
show that one obtains the recursion
\[ u_i(t) = u_{i-1}(t-1) - \phi_{i,i}\, v_{i-1}(t), \qquad v_i(t) = v_{i-1}(t) - \phi_{i,i}\, u_{i-1}(t-1), \qquad 1 \le i \le t < n. \tag{1} \]
(c) Define
\[ d(i) = \sum_{t=i+1}^{n-1} \bigl(u_i^2(t-1) + v_i^2(t)\bigr) \]
and
\[ \sigma_i^2 = \frac{1}{2(n-1)} \sum_{t=i}^{n-1} \bigl(u_i^2(t) + v_i^2(t)\bigr). \]
Show, using (1), that
\[ \sigma_i^2 = \frac{1}{2(n-1)} \left( d(i-1) - 4\phi_{i,i} \sum_{t=i}^{n-1} u_{i-1}(t-1)\, v_{i-1}(t) + \phi_{i,i}^2\, d(i-1) \right). \]
(d) Compute the minimum of $\sigma_i^2$ with respect to $\phi_{i,i}$, showing that it is attained at
\[ \phi_{i,i}^B = \frac{2}{d(i-1)} \sum_{t=i}^{n-1} u_{i-1}(t-1)\, v_{i-1}(t) \qquad \text{and} \qquad (\sigma_i^2)^B = \frac{1}{2(n-1)}\, d(i-1)\, \bigl(1 - (\phi_{i,i}^B)^2\bigr) \tag{2} \]
[the superscript B stands for Burg].
(e) Show that one finally obtains the iteration
\[ d(i) = d(i-1)\bigl(1 - (\phi_{i,i}^B)^2\bigr) - u_i^2(n-1) - v_i^2(i). \]
(f) Show that $\phi_{1,1}^B$ is very close to $\hat\rho(1)$, and specify the difference. Analogously, show that $\phi_{2,2}^B$ is an estimate of $\alpha(2)$, the partial autocorrelation coefficient.
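[Not part of the exercise.] Parts (b)–(e) together describe Burg's algorithm. A minimal sketch, with the exercise's 1-based indices translated to 0-based Python arrays ($X_{n-t}$ becomes `x[n-t-1]`, so $u_0$ and $v_0$ are the reversed data); the function name and interface are hypothetical.

```python
import numpy as np

def burg(x, p):
    """Burg estimate of an AR(p), organized as in Exercise 6.3:
    recursion (1) for the errors, Eq. (2) for phi_{i,i} and the noise
    variance, and part (e) for the update of d(i).
    Returns (phi_{p,1..p}, (sigma_p^2)^B)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    u = x[::-1].copy()                     # u[t] = u_0(t) = X_{n-t}
    v = x[::-1].copy()                     # v[t] = v_0(t) = X_{n-t}
    d = 2.0 * np.dot(x, x) - u[n - 1] ** 2 - v[0] ** 2   # d(0)
    phi = np.zeros(p + 1)                  # phi[j] = phi_{i,j}; phi[0] unused
    sigma2 = np.dot(x, x) / n
    for i in range(1, p + 1):
        # reflection coefficient and variance estimate, Eq. (2)
        num = sum(u[t - 1] * v[t] for t in range(i, n))
        phi_ii = 2.0 * num / d
        sigma2 = d * (1.0 - phi_ii ** 2) / (2.0 * (n - 1))
        # Durbin-Levinson update of the remaining coefficients
        phi[1:i] = phi[1:i] - phi_ii * phi[i - 1:0:-1]
        phi[i] = phi_ii
        # error recursion (1)
        u_new, v_new = u.copy(), v.copy()
        for t in range(i, n):
            u_new[t] = u[t - 1] - phi_ii * v[t]
            v_new[t] = v[t] - phi_ii * u[t - 1]
        u, v = u_new, v_new
        # update of d(i), part (e)
        d = d * (1.0 - phi_ii ** 2) - u[n - 1] ** 2 - v[i] ** 2
    return phi[1:], sigma2
```

With $p = 1$ this reduces to $\phi_{1,1}^B = 2\sum_{t=1}^{n-1} X_{n-t} X_{n-t+1} \big/ d(0)$ with $d(0) = 2\sum_t X_t^2 - X_1^2 - X_n^2$, which is the quantity compared with $\hat\rho(1)$ in part (f).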
6.4. (5.11 of ITSM) Given two observations $x_1$ and $x_2$ from the causal AR(1) process satisfying
\[ X_t = \phi X_{t-1} + Z_t, \qquad \{Z_t\} \sim WN(0, \sigma^2), \]
and assuming that $|x_1| \neq |x_2|$, find the maximum likelihood estimates of $\phi$ and $\sigma^2$.
6.5. (5.12 of ITSM) Derive a cubic equation for the maximum likelihood estimate of the coefficient $\phi$ of a causal AR(1) process based on the observations $X_1, \dots, X_n$.
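[Not part of the exercises.] Solutions to 6.4 and 6.5 can be checked numerically against the exact Gaussian AR(1) likelihood, using $X_1 \sim N(0, \sigma^2/(1-\phi^2))$ and $X_t \mid X_{t-1} \sim N(\phi X_{t-1}, \sigma^2)$. A minimal sketch (the function name is hypothetical):

```python
import numpy as np

def ar1_neg_loglik(phi, sigma2, x):
    """Exact Gaussian negative log-likelihood of a causal AR(1) with
    |phi| < 1: stationary density for X_1, conditional densities for
    X_2..X_n.  Minimizing this over (phi, sigma2), e.g. on a grid,
    gives a numerical check of the ML estimates."""
    x = np.asarray(x, dtype=float)
    v1 = sigma2 / (1.0 - phi ** 2)               # Var(X_1)
    nll = 0.5 * (np.log(2 * np.pi * v1) + x[0] ** 2 / v1)
    r = x[1:] - phi * x[:-1]                     # one-step residuals
    nll += 0.5 * np.sum(np.log(2 * np.pi * sigma2) + r ** 2 / sigma2)
    return nll
```

Evaluating this on a fine grid of $(\phi, \sigma^2)$ values and locating the minimum should reproduce the closed-form estimates of 6.4 and a root of the cubic of 6.5.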