L9 ACF and PACF for Some Time Series Processes
For an \(AR(1)\) process, as derived in L8__Autocorrelation and the Partial Autocorrelation Functions, we can infer \(\gamma_{k} = \phi_{1}\gamma_{k-1}\).
Applying this recursively, we end up with \(\gamma_{k} = \phi_{1}^{k}\gamma_{0}\).
The ACF becomes \(\rho_{k} = \gamma_{k}/\gamma_{0} = \phi_{1}^{k}\).
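The recursion can be checked numerically. A minimal sketch (the value \(\phi_{1}=0.6\) and unit noise variance are assumed example values, not from the derivation itself):

```python
import numpy as np

# AR(1): Y_t = phi1 * Y_{t-1} + e_t, with |phi1| < 1 (example value)
phi1 = 0.6
sigma_e2 = 1.0

# Stationary AR(1) variance: gamma_0 = sigma_e^2 / (1 - phi1^2)
gamma0 = sigma_e2 / (1 - phi1**2)

# Apply gamma_k = phi1 * gamma_{k-1} recursively
gammas = [gamma0]
for k in range(1, 6):
    gammas.append(phi1 * gammas[-1])

# Dividing by gamma_0, the ACF collapses to rho_k = phi1^k
rho = np.array(gammas) / gamma0
print(rho)  # 1, 0.6, 0.36, 0.216, ...
```

The recursion and the closed form \(\phi_{1}^{k}\) agree term by term, which is the point of the derivation.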
- For an \(MA(q)\) process
- \(E(Y_{t}) = c = \mu\)
- \(V(Y_{t}) = V(e_{t}) + \theta_{1}^{2}V(e_{t-1}) + \dots + \theta^{2}_{q}V(e_{t-q})\)
- which is \(\sigma_{e}^{2} \sum_{j=0}^q \theta_{j}^{2}\) where \(\theta_{0}=1\)
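The variance formula can be verified against a simulated series. A sketch assuming an example \(MA(2)\) with hypothetical coefficients \(\theta_{1}=0.5\), \(\theta_{2}=0.3\) and \(\sigma_{e}=2\):

```python
import numpy as np

rng = np.random.default_rng(0)

# Example MA(2) coefficients; theta[0] = theta_0 = 1 by convention
theta = np.array([1.0, 0.5, 0.3])
sigma_e = 2.0

# Simulate Y_t = e_t + theta1*e_{t-1} + theta2*e_{t-2}
e = rng.normal(0, sigma_e, size=200_000 + 2)
y = theta[0] * e[2:] + theta[1] * e[1:-1] + theta[2] * e[:-2]

# Theoretical variance: sigma_e^2 * sum_j theta_j^2
var_theory = sigma_e**2 * np.sum(theta**2)
print(var_theory, y.var())  # the two should be close
```

With 200,000 observations the sample variance lands within a fraction of a percent of \(\sigma_{e}^{2}\sum_{j}\theta_{j}^{2}\).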
Simulated Processes
- For an \(AR(1)\) process with coefficient \(\phi_{1} = 0.6\)
- The plot looks like a typical stationary time-series graph
- The ACF is downward sloping, since \(\phi_{1}^{k}\) shrinks as \(k\) grows
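A short simulation sketch of this \(AR(1)\) with \(\phi_{1}=0.6\); the `sample_acf` helper is a hypothetical name introduced here, not from the notes:

```python
import numpy as np

rng = np.random.default_rng(1)
phi1 = 0.6
n = 100_000

# Simulate AR(1): Y_t = phi1 * Y_{t-1} + e_t
e = rng.normal(size=n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = phi1 * y[t - 1] + e[t]

def sample_acf(x, k):
    """Sample autocorrelation at lag k."""
    x = x - x.mean()
    return np.dot(x[:-k], x[k:]) / np.dot(x, x)

# The sample ACF decays roughly geometrically, tracking phi1^k
for k in (1, 2, 3):
    print(k, sample_acf(y, k), phi1**k)
```

Plotting `sample_acf` against `k` would reproduce the downward-sloping ACF described above.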
- For an \(MA(1)\) process
- \(\gamma_{1} = Cov(Y_{t},Y_{t-1}) = \theta_{1}\sigma_{e}^{2}\)
- \(\gamma_{2} = Cov(Y_{t},Y_{t-2}) = 0\), and in general \(\gamma_{k}=0\) for all \(k \gt 1\)
- For an \(MA(2)\) process
- \(\gamma_{1}=Cov(Y_{t},Y_{t-1}) = \theta_{1}(1+\theta_{2})\sigma_{e}^{2}\)
- \(\gamma_{2} = \theta_{2}\sigma_{e}^{2}\)
- for \(k\gt 2\), \(\gamma_{k}= 0\)
- For an \(MA(q)\) process
- for \(k \gt q\), \(\gamma_{k} = 0\)
- Thus, the lag at which the ACF cuts off (\(\gamma_{k} = 0\) for \(k \gt q\)) can help us identify possible models to fit to a given series.
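The cutoff property can be illustrated by simulation. A sketch with a hypothetical \(MA(2)\) example (\(\theta_{1}=0.7\), \(\theta_{2}=0.4\) are assumed values):

```python
import numpy as np

rng = np.random.default_rng(2)
theta1, theta2 = 0.7, 0.4  # example coefficients
e = rng.normal(size=100_002)

# Simulate MA(2): Y_t = e_t + theta1*e_{t-1} + theta2*e_{t-2}
y = e[2:] + theta1 * e[1:-1] + theta2 * e[:-2]

def sample_acf(x, k):
    """Sample autocorrelation at lag k."""
    x = x - x.mean()
    return np.dot(x[:-k], x[k:]) / np.dot(x, x)

# Theoretical ACF: rho_1 = theta1(1+theta2)/denom, rho_2 = theta2/denom,
# and rho_k = 0 for k > 2, where denom = 1 + theta1^2 + theta2^2
denom = 1 + theta1**2 + theta2**2
print(sample_acf(y, 1), theta1 * (1 + theta2) / denom)
print(sample_acf(y, 2), theta2 / denom)
print(sample_acf(y, 5))  # near zero: the ACF cuts off after lag q = 2
```

The sample ACF is visibly nonzero at lags 1 and 2 and statistically indistinguishable from zero beyond, which is exactly the identification signal described above.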
- For a random walk,
- \(E(Y_{t}) = t\mu\), where \(\mu = E(e_{t})\)
- \(V(Y_{t}) = \gamma_{0}= t\sigma_{e}^{2}\)
- \(\gamma_{k}= (t-k)\sigma_{e}^{2}\)
- \(\rho_{k} = \sqrt{ 1 - \dfrac{k}{t} }\)
- In general, a random walk is non-stationary: both its mean and variance depend on \(t\).
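The linear growth of \(V(Y_{t})\) can be seen by simulating many independent paths. A sketch assuming zero-mean unit-variance steps (so \(\mu = 0\) and \(E(Y_{t}) = 0\) here):

```python
import numpy as np

rng = np.random.default_rng(3)
sigma_e = 1.0
n_paths, T = 20_000, 100

# Each path: Y_t = e_1 + ... + e_t (zero-mean steps in this example)
steps = rng.normal(0, sigma_e, size=(n_paths, T))
paths = steps.cumsum(axis=1)

# V(Y_t) = t * sigma_e^2 grows linearly with t -> non-stationary
for t in (10, 50, 100):
    print(t, paths[:, t - 1].var(), t * sigma_e**2)
```

The cross-sectional variance at time \(t\) tracks \(t\sigma_{e}^{2}\), confirming that the variance (and hence the process) is not stationary.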