MAT 3379 - Winter 2016
Introduction to Time Series Analysis
Study Guide for Midterm

You will be allowed to have one A4 sheet (one-sided) of notes.
Date: Monday, February 29, Midterm

1 Topics

1. Evaluate the covariance function in simple models - see Q1, Q2, Q3 in Assignment 1.
2. Check whether an ARMA model is stationary and causal - see Q4 in Assignment 1.
3. Derive the linear representation for AR(1) and ARMA(1,q) - Q1 in Assignment 2; Examples in Lecture Notes. IMPORTANT: "Derive" means that you have to start with the definition of the model and end up with the final formula, cf. Example 2.6 in Lecture Notes.
4. Calculate the autocovariance function for AR(1), AR(2), ARMA(1,1), ARMA(1,2) and MA(q) using the linear representation or the recursive approach - Q1 in Assignment 2.
5. ARMA model identification from graphs of ACF and PACF - see Q2 in Assignment 2.
6. Derive the best linear predictor for AR(1) and AR(2) using the Yule-Walker procedure; calculate the MSPE - Q4 in Assignment 2. IMPORTANT: "Derive" means that you have to start with the definition of the model and end up with the final formula, as I did in Section 3.1 in Lecture Notes in the general case.
7. Find the best linear predictor for AR(1) and AR(2) using the Yule-Walker procedure; calculate the MSPE - Q4 in Assignment 2. IMPORTANT: "Find" means that you can use Eqs. (9) and (10) in Lecture Notes, as I did in Example 3.1.
8. Find the best linear predictor for MA(1), MA(2), ARMA(1,1) and ARMA(1,2) models using the Yule-Walker procedure (for small n) - Q2, Q3 in Assignment 3. IMPORTANT: "Find" means that you can use Eqs. (9) and (10).
9. Simple calculations with the Durbin-Levinson algorithm; PACF - Q1, Q2 in Assignment 3.
10. Derive the formulas for the Yule-Walker estimators in AR(1), AR(2) and ARMA(1,1) models - Q7b) in Assignment 2. IMPORTANT: "Derive" means that you have to start with the definition of the model and end up with the final formula, as I did in Section 6.2 in Lecture Notes.

2 Details

Here I have combined the calculations scattered through the Lecture Notes and Assignments.
2.1 Derive linear representation

2.1.1 AR(1) [Example 2.5 in Lecture Notes]

AR(1) is defined by

    φ(B)X_t = Z_t,    (1)

where φ(z) = 1 - φz. Define

    χ(z) = 1/φ(z).

Now, the function 1/(1 - φz) has the following power series expansion:

    χ(z) = 1/φ(z) = Σ_{j=0}^∞ φ^j z^j.

This expansion makes sense whenever |φ| < 1. Take equation (1) and multiply both sides by χ(B):

    χ(B)φ(B)X_t = χ(B)Z_t,

since χ(z)φ(z) = 1 for all z. That is,

    X_t = χ(B)Z_t = Σ_{j=0}^∞ φ^j B^j Z_t = Σ_{j=0}^∞ φ^j Z_{t-j}.

The above formula gives a linear representation for AR(1) with ψ_j = φ^j. We note that the above computation makes sense whenever |φ| < 1.

2.1.2 ARMA(1,1) [Example 2.6 in Lecture Notes]

ARMA(1,1) is defined by

    φ(B)X_t = θ(B)Z_t,    (2)

where φ(z) = 1 - φz and θ(z) = 1 + θz. Define

    χ(z) = 1/φ(z) = Σ_{j=0}^∞ φ^j z^j.

Take equation (2) and multiply both sides by χ(B):

    χ(B)φ(B)X_t = χ(B)θ(B)Z_t,

since χ(z)φ(z) = 1 for all z. That is,

    X_t = χ(B)θ(B)Z_t = Σ_{j=0}^∞ φ^j B^j (1 + θB) Z_t = Σ_{j=0}^∞ φ^j Z_{t-j} + θ Σ_{j=0}^∞ φ^j Z_{t-j-1}.
Until now everything was almost the same as for AR(1). Now we want X_t to have the form Σ_{j=0}^∞ ψ_j Z_{t-j}. That is,

    Σ_{j=0}^∞ ψ_j Z_{t-j} = Σ_{j=0}^∞ φ^j Z_{t-j} + θ Σ_{j=0}^∞ φ^j Z_{t-j-1}.

Re-write it as

    ψ_0 Z_t + Σ_{j=1}^∞ ψ_j Z_{t-j} = φ^0 Z_t + Σ_{j=1}^∞ (φ^j + θφ^{j-1}) Z_{t-j}.

We can identify the coefficients as

    ψ_0 = 1,    ψ_j = φ^{j-1}(θ + φ),  j ≥ 1.

The above formula gives a linear representation for ARMA(1,1). The formula is obtained under the condition that |φ| < 1. Furthermore, it is also assumed that θ + φ ≠ 0; otherwise X_t = Z_t.

2.1.3 ARMA(1,2)

ARMA(1,2) is given by

    φ(B)X_t = θ(B)Z_t,    (3)

where φ(z) = 1 - φz and θ(z) = 1 + θ_1 z + θ_2 z^2. Define

    χ(z) = 1/φ(z) = Σ_{j=0}^∞ φ^j z^j.

Take equation (3) and multiply both sides by χ(B):

    χ(B)φ(B)X_t = χ(B)θ(B)Z_t,

since χ(z)φ(z) = 1 for all z. That is,

    X_t = χ(B)θ(B)Z_t = Σ_{j=0}^∞ φ^j B^j (1 + θ_1 B + θ_2 B^2) Z_t
        = Σ_{j=0}^∞ φ^j Z_{t-j} + θ_1 Σ_{j=0}^∞ φ^j Z_{t-j-1} + θ_2 Σ_{j=0}^∞ φ^j Z_{t-j-2}.

Now we want X_t to have the form Σ_{j=0}^∞ ψ_j Z_{t-j}. That is,

    Σ_{j=0}^∞ ψ_j Z_{t-j} = Σ_{j=0}^∞ φ^j Z_{t-j} + θ_1 Σ_{j=0}^∞ φ^j Z_{t-j-1} + θ_2 Σ_{j=0}^∞ φ^j Z_{t-j-2}.

Re-write it as

    ψ_0 Z_t + ψ_1 Z_{t-1} + Σ_{j=2}^∞ ψ_j Z_{t-j} = φ^0 Z_t + (φ^1 + θ_1) Z_{t-1} + Σ_{j=2}^∞ (φ^j + θ_1 φ^{j-1} + θ_2 φ^{j-2}) Z_{t-j}.

We can identify the coefficients as

    ψ_0 = 1,    ψ_1 = φ + θ_1,    ψ_j = φ^{j-2}(φ^2 + θ_1 φ + θ_2),  j ≥ 2.    (4)

The above formula gives a linear representation for ARMA(1,2). The formula is obtained under the condition that |φ| < 1.
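Since ψ(z) = χ(z)θ(z), the coefficient identifications above are just polynomial multiplications, so they can be sanity-checked numerically by convolving the truncated power-series coefficients of χ(z) with the coefficients of θ(z). A minimal Python/NumPy sketch (the parameter values φ = 0.5, θ_1 = 0.4, θ_2 = 0.2 are my own illustrative choices, not from the notes):

```python
import numpy as np

# Illustrative (hypothetical) parameter values with |phi| < 1
phi, th1, th2 = 0.5, 0.4, 0.2
J = 15  # truncation order of the power series chi(z) = sum_j phi^j z^j

chi = phi ** np.arange(J)  # coefficients of chi(z)

# ARMA(1,1): psi(z) = chi(z) * (1 + theta z), here with theta = th1
psi11 = np.convolve(chi, [1.0, th1])[:J]
closed11 = np.concatenate(([1.0], phi ** np.arange(J - 1) * (th1 + phi)))
print(np.allclose(psi11, closed11))  # True

# ARMA(1,2): psi(z) = chi(z) * (1 + th1 z + th2 z^2), cf. (4)
psi12 = np.convolve(chi, [1.0, th1, th2])[:J]
closed12 = np.empty(J)
closed12[0] = 1.0
closed12[1] = phi + th1
closed12[2:] = phi ** np.arange(J - 2) * (phi ** 2 + th1 * phi + th2)
print(np.allclose(psi12, closed12))  # True
```

The same convolution idea covers AR(1) (take θ(z) = 1, so ψ_j = φ^j) and extends directly to ARMA(1,q).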
2.1.4 ARMA(1,q)

The same idea as above.

2.2 Derive autocovariance function for ARMA model

2.2.1 AR(1)

Using the linear representation [Example 4.10 in Lecture Notes]: We use the representation X_t = Σ_{j=0}^∞ φ^j Z_{t-j} and the general formula γ_X(h) = σ_Z^2 Σ_{j=0}^∞ ψ_j ψ_{j+h} to obtain

    γ_X(h) = σ_Z^2 φ^h / (1 - φ^2)

(note: there was a typo in this formula in the Lecture Notes).

Using the recursive method [Example 2.12 in Lecture Notes]: Take the AR(1) equation X_t = φX_{t-1} + Z_t. Multiply both sides by X_{t-h} and apply the expected value to get

    E[X_t X_{t-h}] = φ E[X_{t-1} X_{t-h}] + E[Z_t X_{t-h}].

Since E[X_t] = 0, we have E[X_t X_{t-h}] = γ_X(h) and E[X_{t-1} X_{t-h}] = γ_X(h-1). Also, for all h ≥ 1 we can see that Z_t is independent of X_{t-h} = Σ_{j=0}^∞ φ^j Z_{t-h-j}. Hence,

    E[Z_t X_{t-h}] = E[Z_t] E[X_{t-h}] = 0.

(This is the whole trick; if you multiply by X_{t+h} it will not work.) Hence we obtain

    γ_X(h) = φ γ_X(h-1),

or, by induction,

    γ_X(h) = φ^h γ_X(0),  h ≥ 1.

We need to start the recursion by computing γ_X(0) = Var(X_t) = σ_X^2. We have

    Var(X_t) = φ^2 Var(X_{t-1}) + Var(Z_t)

(again, X_{t-1} and Z_t are independent). Since X_t is stationary we get σ_X^2 = φ^2 σ_X^2 + σ_Z^2. Solving for σ_X^2:

    σ_X^2 = σ_Z^2 / (1 - φ^2).

Finally,

    γ_X(h) = φ^h σ_Z^2 / (1 - φ^2).

2.2.2 AR(2)

Using the linear representation: We did not derive the linear representation for this model.
Using the recursive method: Take the AR(2) equation X_t = φ_1 X_{t-1} + φ_2 X_{t-2} + Z_t. Multiply both sides by X_{t-h} and apply the expected value to get

    E[X_t X_{t-h}] = φ_1 E[X_{t-1} X_{t-h}] + φ_2 E[X_{t-2} X_{t-h}] + E[Z_t X_{t-h}].

Since E[X_t] = 0, we have E[X_t X_{t-h}] = γ_X(h), E[X_{t-1} X_{t-h}] = γ_X(h-1) and E[X_{t-2} X_{t-h}] = γ_X(h-2). Also, for all h ≥ 1, Z_t is independent of X_{t-h}. Hence, for h ≥ 1,

    E[Z_t X_{t-h}] = E[Z_t] E[X_{t-h}] = 0.

Hence we obtain

    γ_X(h) = φ_1 γ_X(h-1) + φ_2 γ_X(h-2).    (5)

We need to start the recursion by computing γ_X(0) = Var(X_t) = σ_X^2 and γ_X(1). To get γ_X(1) we use the AR(2) equation again, multiply by X_{t-1} and apply expectation to get

    E[X_t X_{t-1}] = φ_1 E[X_{t-1}^2] + φ_2 E[X_{t-2} X_{t-1}] + E[Z_t X_{t-1}],

where the last term vanishes, so that

    γ_X(1) = φ_1 γ_X(0) + φ_2 γ_X(1),    (6)

and

    γ_X(1) = φ_1/(1 - φ_2) γ_X(0).    (7)

Now we need to get γ_X(0). Take the AR(2) equation, multiply by X_t and apply expectation to get

    E[X_t^2] = φ_1 E[X_{t-1} X_t] + φ_2 E[X_{t-2} X_t] + E[Z_t X_t].

Now,

    E[Z_t X_t] = E[Z_t (φ_1 X_{t-1} + φ_2 X_{t-2} + Z_t)] = σ_Z^2.

Hence,

    γ_X(0) = φ_1 γ_X(1) + φ_2 γ_X(2) + σ_Z^2.    (8)

We already know (equation (5) with h = 2)

    γ_X(2) = φ_1 γ_X(1) + φ_2 γ_X(0).

We plug this expression into (8) to get

    γ_X(0) = φ_1 γ_X(1) + φ_2 {φ_1 γ_X(1) + φ_2 γ_X(0)} + σ_Z^2.    (9)

Solving (7) and (9) we obtain

    γ_X(h) = φ_1 γ_X(h-1) + φ_2 γ_X(h-2),  h ≥ 2,

    γ_X(1) = σ_Z^2 φ_1 / [(1 + φ_2){(1 - φ_2)^2 - φ_1^2}],

    γ_X(0) = σ_Z^2 (1 - φ_2) / [(1 + φ_2){(1 - φ_2)^2 - φ_1^2}].

Note: to check that the last two equations make sense, take φ_2 = 0, φ_1 = φ. Then AR(2) reduces to AR(1) and the last two formulas should reduce to γ_X(0) and γ_X(1) for AR(1).
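The closed-form expressions for γ_X(0) and γ_X(1), including the Note's reduction to AR(1) when φ_2 = 0, can be checked numerically. A short Python sketch; the parameter values (φ_1 = 0.5, φ_2 = 0.3, which satisfy the causality conditions, and φ = 0.6) are my own illustrative choices:

```python
phi1, phi2, sz2 = 0.5, 0.3, 1.0  # illustrative causal AR(2) parameters

# Closed forms for gamma_X(0) and gamma_X(1) derived above
denom = (1 + phi2) * ((1 - phi2) ** 2 - phi1 ** 2)
g0 = sz2 * (1 - phi2) / denom
g1 = sz2 * phi1 / denom

# They must satisfy equations (7) and (9)
assert abs(g1 - phi1 / (1 - phi2) * g0) < 1e-12
assert abs(g0 - (phi1 * g1 + phi2 * (phi1 * g1 + phi2 * g0) + sz2)) < 1e-12

# Reduction check from the Note: phi2 = 0 gives the AR(1) formulas
phi = 0.6
d0 = (1 + 0.0) * ((1 - 0.0) ** 2 - phi ** 2)  # denom with phi2 = 0
assert abs(sz2 * (1 - 0.0) / d0 - sz2 / (1 - phi ** 2)) < 1e-12      # gamma_X(0)
assert abs(sz2 * phi / d0 - sz2 * phi / (1 - phi ** 2)) < 1e-12      # gamma_X(1)
print("AR(2) autocovariance formulas check out")
```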
2.2.3 MA(q)

Trivial.

2.2.4 ARMA(1,1) and ARMA(1,q)

Using the linear representation: For ARMA(1,1) (and in general for ARMA(1,q)) use the linear representation together with the general formula for the covariance of a linear process. Specifically, since ψ_0 = 1 and ψ_j = φ^{j-1}(φ + θ) for j ≥ 1, we have

    γ_X(0) = σ_Z^2 Σ_{j=0}^∞ ψ_j^2 = σ_Z^2 ψ_0^2 + σ_Z^2 Σ_{j=1}^∞ ψ_j^2
           = σ_Z^2 [1 + (φ + θ)^2 Σ_{j=1}^∞ φ^{2(j-1)}] = σ_Z^2 [1 + (θ + φ)^2/(1 - φ^2)].

Similarly,

    γ_X(1) = σ_Z^2 [(θ + φ) + φ (θ + φ)^2/(1 - φ^2)].

You can also obtain similar formulas for γ_X(h). You can also notice that γ_X(h) = φ^{h-1} γ_X(1) for h ≥ 1. That is,

    γ_X(h) = σ_Z^2 φ^{h-1} [(θ + φ) + φ (θ + φ)^2/(1 - φ^2)].

Using the recursive method: You take the defining equation X_t = φX_{t-1} + Z_t + θZ_{t-1}, multiply both sides by X_{t-h} and then try to find a recursive equation, similar to AR(1) or AR(2).

2.3 The best linear predictor for AR(1) and AR(2) using the Yule-Walker procedure

Yule-Walker equation:

    Γ_n a_n = γ(n; k),    (10)

or, equivalently,

    a_n = Γ_n^{-1} γ(n; k).    (11)

Formula for MSPE_n(k):

    MSPE_n(k) = γ_X(0) - a_n^T γ(n; k).

2.3.1 Find P_n X_{n+1} for AR(1) [Example 3.1 in Lecture Notes]

The AR(1) model is given by X_t = φX_{t-1} + Z_t, where the Z_t are iid with mean zero and variance σ_Z^2, and |φ| < 1. Hence µ = E[X_t] = 0. Recall that

    γ_X(h) = σ_Z^2 φ^h / (1 - φ^2),  h ≥ 0.

Then

    γ(n; k) = γ(n; 1) = (γ_X(1), ..., γ_X(n))^T = σ_Z^2/(1 - φ^2) (φ, ..., φ^n)^T.
Equation (10) becomes

    σ_Z^2/(1 - φ^2) ·
    [ 1        φ        φ^2      ...  φ^{n-1} ] [ a_1 ]                       [ φ   ]
    [ φ        1        φ        ...  φ^{n-2} ] [ a_2 ]                       [ φ^2 ]
    [ ...      ...      ...      ...  ...     ] [ ... ]  =  σ_Z^2/(1 - φ^2) · [ ... ]    (12)
    [ φ^{n-1}  φ^{n-2}  φ^{n-3}  ...  1       ] [ a_n ]                       [ φ^n ]

Now, either you invert the matrix on the left-hand side or you guess the solution a_n = (φ, 0, ..., 0)^T. You have to verify that the guessed solution solves (12). Hence, in the AR(1) case the prediction is

    P_n X_{n+1} = φX_n.

2.3.2 Find P_n X_{n+2} for AR(1)

Now we try to guess P_n X_{n+2}. If we happened to have observations X_1, ..., X_{n+1}, then the prediction of the next value X_{n+2} would be φX_{n+1}. However, we have only n observations, so in the latter formula we have to predict X_{n+1}. The prediction of X_{n+1} has the form φX_n. Hence we may guess that

    P_n X_{n+2} = φ(φX_n) = φ^2 X_n.

You have to verify that this is the correct guess.

2.3.3 Find P_n X_{n+1} for AR(2)

The AR(2) model is X_t = φ_1 X_{t-1} + φ_2 X_{t-2} + Z_t. Hence we may guess that the one-step prediction for AR(2) has the form P_n X_{n+1} = φ_1 X_n + φ_2 X_{n-1}, that is, (a_1, a_2, ..., a_n) = (φ_1, φ_2, 0, ..., 0). We verify it by checking the validity of the Yule-Walker equation for one-step prediction:

    [ γ_X(0)    γ_X(1)    γ_X(2)    ...  γ_X(n-1) ] [ a_1 ]   [ γ_X(1) ]
    [ γ_X(1)    γ_X(0)    γ_X(1)    ...  γ_X(n-2) ] [ a_2 ]   [ γ_X(2) ]
    [ ...       ...       ...       ...  ...      ] [ ... ] = [ ...    ]
    [ γ_X(n-1)  γ_X(n-2)  γ_X(n-3)  ...  γ_X(0)   ] [ a_n ]   [ γ_X(n) ]

We have to check whether our choice is correct. For the first and the second row on the left-hand side we get, respectively,

    φ_1 γ_X(0) + φ_2 γ_X(1) = γ_X(1);    φ_1 γ_X(1) + φ_2 γ_X(0) = γ_X(2).

We can recognize the first equation to be (6), while the second one is just the recursive formula for AR(2). You can check that all remaining rows on the left-hand side reduce to the recursive formula for AR(2) as well. That is, we verified that our guess was correct.

2.3.4 MSPE_n(1) for AR(1)

    MSPE_n(1) = γ_X(0) - a_n^T γ(n; 1) = γ_X(0) - φγ_X(1) = σ_Z^2/(1 - φ^2) - φ^2 σ_Z^2/(1 - φ^2) = σ_Z^2.
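The guess-and-verify argument can be cross-checked by brute force: build Γ_n and γ(n; 1) from the AR(1) autocovariances and let a linear solver recover a_n; the resulting MSPE should equal σ_Z^2 as computed above. A Python/NumPy sketch with illustrative values (φ = 0.7, σ_Z^2 = 1, n = 6, all my own choices):

```python
import numpy as np

phi, sz2, n = 0.7, 1.0, 6  # illustrative values, not from the notes

def gamma(h):
    """AR(1) autocovariance gamma_X(h) = sigma_Z^2 phi^|h| / (1 - phi^2)."""
    return sz2 * phi ** np.abs(h) / (1 - phi ** 2)

idx = np.arange(n)
Gamma = gamma(idx[:, None] - idx[None, :])  # Gamma_n, entry (i,j) = gamma_X(|i-j|)
rhs = gamma(idx + 1)                        # gamma(n; 1) = (gamma_X(1), ..., gamma_X(n))^T

a = np.linalg.solve(Gamma, rhs)
print(np.allclose(a, [phi] + [0.0] * (n - 1)))  # True: a_n = (phi, 0, ..., 0)^T

# MSPE_n(1) = gamma_X(0) - a_n^T gamma(n; 1) should equal sigma_Z^2
print(np.isclose(gamma(0) - a @ rhs, sz2))      # True
```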
2.3.5 MSPE_n(2) for AR(1)

    MSPE_n(2) = γ_X(0) - a_n^T γ(n; 2) = γ_X(0) - φ^2 γ_X(2) = σ_Z^2/(1 - φ^2) - σ_Z^2 φ^4/(1 - φ^2) = σ_Z^2 (1 - φ^4)/(1 - φ^2) = σ_Z^2 (1 + φ^2).

2.3.6 MSPE_n(1) for AR(2)

    MSPE_n(1) = γ_X(0) - a_n^T γ(n; 1) = γ_X(0) - (φ_1, φ_2)(γ_X(1), γ_X(2))^T
              = γ_X(0) - φ_1 γ_X(1) - φ_2 γ_X(2)
              = γ_X(0) - φ_1 γ_X(1) - φ_2 (φ_1 γ_X(1) + φ_2 γ_X(0)),

where in the last line I used the recursive formula for AR(2). You can leave it as it is, or you can plug in the messy expressions for the AR(2) covariances.

2.4 Durbin-Levinson algorithm for AR(p)

2.4.1 AR(1)

Find φ_11, φ_22. From Theorem 3.2 in Lecture Notes we have

    φ_11 = ρ_X(1) = γ_X(1)/γ_X(0) = φ.

Furthermore,

    φ_22 = [γ_X(2) - φ_11 γ_X(1)]/v_1 = [γ_X(2) - φγ_X(1)]/v_1.    (13)

We note that for the AR(1) model we have γ_X(2) = φγ_X(1), hence φ_22 = 0.

2.4.2 AR(2)

Find φ_11, φ_22, φ_33. From Theorem 3.2 in Lecture Notes we have

    φ_11 = ρ_X(1) = γ_X(1)/γ_X(0) = φ_1/(1 - φ_2).

Furthermore,

    φ_22 = [γ_X(2) - φ_11 γ_X(1)]/v_1.    (14)

The formulas for the covariances of AR(2) are

    γ_X(h) = φ_1 γ_X(h-1) + φ_2 γ_X(h-2),  h ≥ 2,    (15)

    γ_X(1) = φ_1/(1 - φ_2) γ_X(0).

Use the above formulas (the first one with h = 2) and replace γ_X(2) in (14) to get

    φ_22 = [φ_1 γ_X(1) + φ_2 γ_X(0) - φ_11 γ_X(1)]/v_1    (16)
         = γ_X(0) [φ_1^2/(1 - φ_2) + φ_2 - φ_1^2/(1 - φ_2)^2] / v_1.    (17)

Now, from Theorem 3.2,

    v_1 = v_0 (1 - φ_11^2) = γ_X(0) [1 - (φ_1/(1 - φ_2))^2].    (18)
If you combine (16)-(18) together you will get

    φ_22 = φ_2.    (19)

Now, for φ_33: recall that this value represents the partial autocorrelation at lag 3. It was mentioned in class that for AR(2) the PACF vanishes after lag 2. Hence, we should get φ_33 = 0. We will verify it. From Theorem 3.2 we get

    φ_33 = [γ_X(3) - φ_21 γ_X(2) - φ_22 γ_X(1)]/v_2.

Use (15) with h = 3 to get

    φ_33 = [(φ_1 - φ_21)γ_X(2) + (φ_2 - φ_22)γ_X(1)]/v_2.

Keeping in mind (19), in order to show that φ_33 = 0 it is enough to show that φ_21 = φ_1. We use Theorem 3.2 again:

    φ_21 = φ_11 - φ_22 φ_11 = φ_11 (1 - φ_2) = [φ_1/(1 - φ_2)] (1 - φ_2) = φ_1.

That is, φ_33 = 0.
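The calculations of this section can be automated with the Durbin-Levinson recursion of Theorem 3.2: φ_mm = [γ_X(m) - Σ_{k=1}^{m-1} φ_{m-1,k} γ_X(m-k)]/v_{m-1}, then φ_{m,k} = φ_{m-1,k} - φ_mm φ_{m-1,m-k} and v_m = v_{m-1}(1 - φ_mm^2). The sketch below applies it to AR(2) autocovariances; the function name and the parameter values (φ_1 = 0.5, φ_2 = 0.3) are my own illustrative choices, not from the Lecture Notes:

```python
import numpy as np

def durbin_levinson(gamma):
    """Return [phi_11, phi_22, ..., phi_nn] (the PACF) from
    autocovariances gamma[0], ..., gamma[n] via the Durbin-Levinson recursion."""
    pacf, phi, v = [], np.zeros(0), gamma[0]
    for m in range(1, len(gamma)):
        # phi_mm = (gamma(m) - sum_{k=1}^{m-1} phi_{m-1,k} gamma(m-k)) / v_{m-1}
        phi_mm = (gamma[m] - sum(phi[k] * gamma[m - 1 - k] for k in range(m - 1))) / v
        # phi_{m,k} = phi_{m-1,k} - phi_mm * phi_{m-1,m-k}, then append phi_mm
        phi = np.append(phi - phi_mm * phi[::-1], phi_mm)
        v *= 1 - phi_mm ** 2  # v_m = v_{m-1} (1 - phi_mm^2)
        pacf.append(phi_mm)
    return pacf

# AR(2) with illustrative parameters; autocovariances from the closed forms
# derived in the autocovariance section, extended by the recursion (5)
phi1, phi2, sz2 = 0.5, 0.3, 1.0
denom = (1 + phi2) * ((1 - phi2) ** 2 - phi1 ** 2)
g = [sz2 * (1 - phi2) / denom, sz2 * phi1 / denom]
for h in range(2, 5):
    g.append(phi1 * g[-1] + phi2 * g[-2])

pacf = durbin_levinson(g)
print(np.round(pacf, 6))
```

The output shows φ_11 = φ_1/(1 - φ_2) ≈ 0.714286, φ_22 = φ_2 = 0.3, and (up to rounding) zeros afterwards, matching (19) and the φ_33 = 0 computation above.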