Statistical models A.Y. 2014/15
Written exam of March 27, 2015.
1. Consider the general linear model Y = Xβ + E, where Y = (y_1, . . . , y_n)^t is a vector in R^n, X is an n × p matrix and E = (ε_1, . . . , ε_n)^t is a vector of random variables with mean 0.
(a) Show that β̂ = (X^t X)^{-1} X^t Y is an unbiased estimator of β. Compute also V(β̂). This choice is generally called the least squares method; what exactly does this mean?
(b) Two possible motivations for the choice of β̂ are maximum likelihood estimation and the Gauss-Markov theorem. State the results precisely (proofs are not necessary but, if time allows, they are welcome), together with the assumptions on the errors E required for each result.
(c) Let ε̂ = Y − Xβ̂ be the observed residuals; prove that V(ε̂_i) = σ²(1 − H_ii), where H = X(X^t X)^{-1} X^t and σ² = V(ε_j) for each j. Explain why, if H_ii is close to 1 for some value of i, this result leads us to expect that ŷ_i will be close to y_i.
(d) Are ε̂_i and ε̂_j independent for i ≠ j? [Hint: check the proof of the previous result.]
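As an aside (not part of the exam), parts (a) and (c) of question 1 can be checked by simulation: averaging β̂ over many replications should recover β, and the empirical variance of each residual ε̂_i should match σ²(1 − H_ii). The sketch below uses arbitrary sizes and parameter values chosen purely for illustration:

```python
import numpy as np

# Simulation check of question 1: unbiasedness of the least squares
# estimator and the residual variance formula V(eps_hat_i) = sigma^2*(1 - H_ii).
# n, p, beta and sigma are arbitrary illustrative choices.
rng = np.random.default_rng(0)
n, p, sigma = 50, 3, 2.0
X = rng.normal(size=(n, p))
beta = np.array([1.0, -2.0, 0.5])

XtX_inv = np.linalg.inv(X.T @ X)
H = X @ XtX_inv @ X.T                    # hat matrix

reps = 20000
E = rng.normal(scale=sigma, size=(reps, n))
Y = E + X @ beta                         # one replication per row
B = Y @ X @ XtX_inv                      # each row is one beta_hat (XtX_inv is symmetric)
R = Y - B @ X.T                          # each row holds the residuals eps_hat

print(B.mean(axis=0))                    # close to beta: unbiasedness
print(R.var(axis=0)[:3])                 # close to sigma^2 * (1 - H_ii)
print((sigma**2 * (1 - np.diag(H)))[:3])
```

The same simulation also hints at part (d): the empirical covariance of R confirms that off-diagonal entries of σ²(I − H) are generally nonzero.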
2. (a) Write down the matrix X and the vector β that express the linear regression y_i = a + b x_i + ε_i, i = 1, . . . , n, as a general linear model Y = Xβ + E.
(b) How can we obtain a confidence interval for the parameter b? Write down the elements of the formula; the explicit computation is not necessary.
(c) When is the matrix X^t X not invertible?
(d) Assume that the number of points is n = 10 and that there exists a qualitative variable Z such that Z_i = A for i = 1, . . . , 4 and Z_i = B for i = 5, . . . , 10. Write down the model

    y_i = μ_A + b x_i + ε_i   if Z_i = A,
    y_i = μ_B + b x_i + ε_i   if Z_i = B,

as a general linear model, specifying the matrix X and the vector β (there are several ways to do this, but one is preferable if we then wish to test the hypothesis μ_A = μ_B).
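As an aside (not part of the exam), one convenient answer to question 2(d) can be sketched numerically. Parameterising β = (μ_A, δ, b)^t with δ = μ_B − μ_A turns the test μ_A = μ_B into a test on the single coefficient δ. The covariate values below are hypothetical, chosen only to show the shape of X:

```python
import numpy as np

# Design matrix for the two-group model with common slope b,
#   y_i = mu_A + b*x_i + eps_i  (i = 1..4),  y_i = mu_B + b*x_i + eps_i  (i = 5..10),
# using the parameterisation beta = (mu_A, delta, b) with delta = mu_B - mu_A.
x = np.arange(1.0, 11.0)                 # hypothetical covariate values
z = np.array(["A"] * 4 + ["B"] * 6)      # the qualitative variable Z

X = np.column_stack([
    np.ones(10),                         # column for mu_A (baseline intercept)
    (z == "B").astype(float),            # dummy column: adds delta for group B
    x,                                   # column for the common slope b
])
print(X)
```

With this coding, testing μ_A = μ_B amounts to testing that the second coefficient is zero, which is exactly the standard t-test on one component of β.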
3. Write down, if possible, the following models as linear models, possibly after some transformation. In all cases the ε_i represent independent and identically distributed error terms with E(ε_i) = 0, and a, b, c, . . . are parameters to be estimated.
(a) y_i = a + b(x_i − 30) + ε_i   if x_i < 30,
    y_i = a + c(30 − x_i) + ε_i   if x_i > 30

(b) y_i = a x_i^b (1 + ε_i)
(c) y_i = (a + b x_i)/(x_i + c) + ε_i
(d) y_i = a cos(ω x_i + ϕ) + b sin(ω x_i + ϕ) + ε_i
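As an aside (not part of the exam), model (b) illustrates the transformation idea: taking logarithms of y_i = a x_i^b (1 + ε_i) gives log y_i = log a + b log x_i + log(1 + ε_i), a linear model in (log a, b) with error term log(1 + ε_i). The sketch below fits the transformed model on simulated data; all numerical values are arbitrary:

```python
import numpy as np

# Linearisation of model (b): y_i = a * x_i^b * (1 + eps_i)
# becomes  log y_i = log a + b * log x_i + log(1 + eps_i).
rng = np.random.default_rng(1)
a, b = 2.0, 1.5                          # true parameters (arbitrary)
x = np.linspace(1.0, 10.0, 200)
y = a * x**b * (1 + rng.normal(scale=0.05, size=x.size))

# Least squares on the log-transformed model
X = np.column_stack([np.ones_like(x), np.log(x)])
coef, *_ = np.linalg.lstsq(X, np.log(y), rcond=None)
log_a_hat, b_hat = coef
print(np.exp(log_a_hat), b_hat)          # estimates of a and b
```

Note that the multiplicative error structure (1 + ε_i) is what makes the log transform work here; an additive error y_i = a x_i^b + ε_i would not linearise this way.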