L7 Basic Time Series Processes
- These are the building blocks of more advanced models.
White Noise Process
Let \(e_{1},e_{2},\dots\) be a sequence of IID \(N(0,\sigma_{e}^{2})\) random errors, and set \(Y_{t} = e_{t}\).
- Neither the mean nor the variance depends on \(t\)
Why "white noise"?
White light is a mixture of all colours (frequencies) in roughly equal proportion; similarly, this noise fluctuates with equal power at every frequency, hence the name.
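A quick simulation makes the definition concrete. This is a minimal sketch using NumPy, where the seed and \(\sigma_{e} = 1\) are arbitrary assumed values:

```python
import numpy as np

rng = np.random.default_rng(seed=0)  # arbitrary seed for reproducibility
sigma_e = 1.0                        # assumed error standard deviation

# White noise: Y_t = e_t, with e_t ~ IID N(0, sigma_e^2)
e = rng.normal(loc=0.0, scale=sigma_e, size=500)

# Neither the mean nor the variance depends on t, so the sample
# statistics should be close to 0 and sigma_e^2 respectively:
print(e.mean())
print(e.var())
```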
Moving Average Process
Let \(e_{1},e_{2},\dots\) be a sequence of IID \(N(0,\sigma_{e}^{2})\) random errors, and define, for example,
$$Y_{t} = 0.5e_{t} + 0.5e_{t-1}$$
The term \(0.5e_{t-1}\) makes \(Y_{t}\) an average of the two most recent errors; since that average moves along with \(t\), this is called a moving average process.
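A short simulation (a sketch, assuming \(\sigma_{e} = 1\) and the two-term average \(Y_{t} = 0.5e_{t} + 0.5e_{t-1}\)) shows a consequence of the averaging: adjacent values of \(Y_{t}\) share an error term and are therefore correlated:

```python
import numpy as np

rng = np.random.default_rng(seed=1)
e = rng.normal(0.0, 1.0, size=501)   # errors e_0 ... e_500

# Moving average of the two most recent errors:
# Y_t = 0.5*e_t + 0.5*e_{t-1}
Y = 0.5 * e[1:] + 0.5 * e[:-1]

# Y_t and Y_{t-1} both contain e_{t-1}, so the sample lag-1
# autocorrelation should be near the theoretical value 0.5:
print(np.corrcoef(Y[1:], Y[:-1])[0, 1])
```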
Random Walk
Let \(e_{1},e_{2},\dots\) be a sequence of IID \(N(0,\sigma_{e}^{2})\) random errors. Define
$$
\begin{align}
Y_{1} & = e_{1} \\
Y_{2} & = e_{1} + e_{2}
\end{align}
$$
and so on…
It is the sum of all the errors up to time point \(t\).
We can also write \(Y_{t} = Y_{t-1} + e_{t}\):
- The current state is the previous state plus one additional random error.
- Also called the drunkard's walk, because a drunk person doesn't know where the next step will land.
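Both formulations, the running sum of all errors and the one-step recursion, are easy to verify numerically. A minimal sketch, assuming \(\sigma_{e} = 1\):

```python
import numpy as np

rng = np.random.default_rng(seed=2)
e = rng.normal(0.0, 1.0, size=1000)

# Random walk as a cumulative sum: Y_t = e_1 + e_2 + ... + e_t
Y = np.cumsum(e)

# Equivalent recursive form: Y_t = Y_{t-1} + e_t
Y_rec = np.empty_like(e)
Y_rec[0] = e[0]
for t in range(1, len(e)):
    Y_rec[t] = Y_rec[t - 1] + e[t]

# The two constructions produce the same series:
print(np.allclose(Y, Y_rec))
```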
Linear Trend
Let \(e_{1},e_{2},\dots\) be a sequence of IID \(N(0,\sigma_{e}^{2})\) random errors, and define
$$Y_{t} = a + bt + e_{t}$$
where \(a, b\) are constants.
- The mean, \(a + bt\), changes with \(t\): the series drifts steadily up (or down) over time.
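A sketch with hypothetical constants \(a = 2\), \(b = 0.5\) (any values would illustrate the same point) shows the mean drifting over time:

```python
import numpy as np

rng = np.random.default_rng(seed=3)
a, b = 2.0, 0.5                       # hypothetical intercept and slope
t = np.arange(1, 201)
e = rng.normal(0.0, 1.0, size=t.size)

# Linear trend: Y_t = a + b*t + e_t
Y = a + b * t + e

# The mean a + b*t grows with t, so an early segment and a late
# segment of the series have clearly different averages:
print(Y[:50].mean(), Y[-50:].mean())
```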
Autoregressive Process: \(AR(p)\)
Forecast a variable using a linear combination of past values of the variable itself.
- Regress the observation on its own past values (lags).
- Why "auto"? Because only one random variable is involved: \(Y_{t}\) is regressed on its own past.
\(AR(1)\) model
- An \(AR(1)\) model, \(Y_{t} = \delta + \phi_{1} Y_{t-1} + e_{t}\), is stationary if \(\lvert \phi_{1} \rvert \lt 1\)
- e.g. \(Y_{t} = 10 + 0.4 Y_{t-1} + e_{t}\) is stationary, since \(\lvert 0.4 \rvert \lt 1\)
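The example model can be simulated directly. This sketch starts the recursion at \(10/(1 - 0.4) \approx 16.7\), the long-run mean of this particular model, which a long stationary sample should hover around:

```python
import numpy as np

rng = np.random.default_rng(seed=4)
n = 2000
e = rng.normal(0.0, 1.0, size=n)

# AR(1) from the example: Y_t = 10 + 0.4*Y_{t-1} + e_t
Y = np.empty(n)
Y[0] = 10.0 / (1 - 0.4)          # start at the long-run mean 10/(1 - 0.4)
for t in range(1, n):
    Y[t] = 10.0 + 0.4 * Y[t - 1] + e[t]

# Because |0.4| < 1, the series is stationary and its sample mean
# stays near 10/(1 - 0.4) ~ 16.67:
print(Y.mean())
```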
\(AR(2)\) model
- \(AR(2)\) model is stationary if \(\lvert \phi_{2} \rvert \lt 1\), \(\phi_{1} + \phi_{2} \lt 1\), and \(\phi_{2} - \phi_{1} \lt 1\)
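The full set of textbook AR(2) stationarity conditions, \(\phi_{1} + \phi_{2} \lt 1\), \(\phi_{2} - \phi_{1} \lt 1\), and \(\lvert \phi_{2} \rvert \lt 1\) (the "stationarity triangle"), can be packaged into a small check:

```python
def ar2_is_stationary(phi1, phi2):
    """Stationarity-triangle conditions for Y_t = delta + phi1*Y_{t-1} + phi2*Y_{t-2} + e_t."""
    return phi1 + phi2 < 1 and phi2 - phi1 < 1 and abs(phi2) < 1

print(ar2_is_stationary(0.5, 0.3))   # True: all three conditions hold
print(ar2_is_stationary(0.5, 0.6))   # False: phi1 + phi2 = 1.1 >= 1
```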
Moving Average Process: \(MA(q)\)
The MA model uses past errors as regressors: an \(MA(q)\) model uses the last \(q\) errors.
- Each value \(Y_{t}\) can be seen as a weighted moving average of the past forecast errors.
- MA model is suitable for an irregular component, \(I_{t}\).
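As a sketch (with hypothetical weights \(\theta_{1}, \theta_{2}, \theta_{3}\) and \(\sigma_{e} = 1\)), an \(MA(q)\) series can be built directly from the errors. A tell-tale property: its autocorrelation is strong within \(q\) lags but roughly vanishes beyond lag \(q\):

```python
import numpy as np

rng = np.random.default_rng(seed=5)
q = 3
theta = np.array([0.6, 0.3, 0.1])    # hypothetical weights theta_1..theta_q
e = rng.normal(0.0, 1.0, size=1000 + q)

# MA(q): Y_t = e_t + theta_1*e_{t-1} + ... + theta_q*e_{t-q}
Y = e[q:].copy()
for j in range(1, q + 1):
    Y += theta[j - 1] * e[q - j:-j]

def acf_at(k):
    """Sample autocorrelation of Y at lag k."""
    return np.corrcoef(Y[k:], Y[:-k])[0, 1]

# Noticeable correlation at lag 1; near zero beyond lag q:
print(acf_at(1), acf_at(q + 1))
```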