L7 Basic Time Series Processes

  • These are the building blocks of more advanced models.

White Noise Process

Let \(e_{1},e_{2},\dots\) be a sequence of IID \(N(0,\sigma_{e}^{2})\) random errors.

\[ Y_{t} = e_{t} \]
  • Neither the mean nor the variance depends on \(t\)

Why "white noise"?

By analogy with white light, which contains all colors (frequencies) in roughly equal proportion: white noise fluctuates with equal power at all frequencies.
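A minimal simulation sketch of a white noise process (NumPy, with an illustrative \(\sigma_{e} = 1\)):

```python
import numpy as np

rng = np.random.default_rng(42)
sigma_e = 1.0          # illustrative error standard deviation
n = 10_000

# White noise: Y_t = e_t, with e_1, e_2, ... IID N(0, sigma_e^2)
e = rng.normal(0.0, sigma_e, size=n)
Y = e

# Neither the sample mean nor the sample variance depends on t
print(Y.mean(), Y.var())
```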

Moving Average Process

Let \(e_{1},e_{2},\dots\) be a sequence of IID \(N(0,\sigma_{e}^{2})\) random errors.

\[ Y_{t} = e_{t} + 0.5e_{t-1} \]

The term \(0.5e_{t-1}\) makes \(Y_{t}\) a weighted average of recent errors, and since the window of errors being averaged moves along with \(t\), this is called a moving average process.
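A quick simulation sketch of this process (NumPy, illustrative \(\sigma_{e} = 1\)):

```python
import numpy as np

rng = np.random.default_rng(0)
sigma_e = 1.0
n = 10_000

e = rng.normal(0.0, sigma_e, size=n)
# Y_t = e_t + 0.5 * e_{t-1}; start at t = 2 so that e_{t-1} exists
Y = e[1:] + 0.5 * e[:-1]

# Variance is (1 + 0.5**2) * sigma_e**2 = 1.25, constant over t
print(Y.var())
```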

Random Walk

Let \(e_{1},e_{2},\dots\) be a sequence of IID \(N(0,\sigma_{e}^{2})\) random errors.

$$
\begin{align}
Y_{1} & = e_{1} \\
Y_{2} & = e_{1} + e_{2}
\end{align}
$$
and so on…

\[ Y_{t} = \sum_{i=1}^t e_{i} \]

It is the sum of all errors up to time point \(t\).

We can also write,

\[ \boxed{ Y_{t} = Y_{t-1} + e_{t} } \]
  • The current state depends on the previous state plus one new random error.
  • Also called the drunkard's walk, because a drunk person doesn't know where his/her next step will land.
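A sketch showing that the cumulative-sum form and the recursive form produce the same path (NumPy, illustrative \(\sigma_{e} = 1\)):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000
e = rng.normal(0.0, 1.0, size=n)

# Y_t = e_1 + e_2 + ... + e_t, i.e. a cumulative sum of the errors
Y = np.cumsum(e)

# The recursive form Y_t = Y_{t-1} + e_t gives the same path
Y_rec = np.empty(n)
Y_rec[0] = e[0]
for t in range(1, n):
    Y_rec[t] = Y_rec[t - 1] + e[t]
print(np.allclose(Y, Y_rec))  # prints True
```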

Linear Trend

Let \(e_{1},e_{2},\dots\) be a sequence of IID \(N(0,\sigma_{e}^{2})\) random errors.

\[ Y_{t} = a + bt + e_{t} \]

where \(a, b\) are constants.

  • The mean, \(a + bt\), shifts linearly with \(t\); the variance stays at \(\sigma_{e}^{2}\).
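A simulation sketch with illustrative values \(a = 5\), \(b = 0.3\); fitting a least-squares line should recover the trend:

```python
import numpy as np

rng = np.random.default_rng(2)
a, b = 5.0, 0.3        # illustrative intercept and slope
n = 500
t = np.arange(1, n + 1)
e = rng.normal(0.0, 1.0, size=n)

# Y_t = a + b*t + e_t: the mean a + b*t changes with t,
# while the variance stays at sigma_e^2
Y = a + b * t + e

# A least-squares line should recover slope b and intercept a
slope, intercept = np.polyfit(t, Y, 1)
print(slope, intercept)
```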

Autoregressive Process: \(AR(p)\)

Forecast a variable using a linear combination of the past values of the variable itself.

\[ Y_{t} = c + \phi_{1}Y_{t-1} + \phi_{2}Y_{t-2} + \dots + \phi_{p} Y_{t-p} + e_{t} \]
  • Regress the observation on its own past values (lags).
  • Why "auto"? Because the variable is regressed on itself: only the single series \(Y_{t}\) is involved.

\(AR(1)\) model

\[ Y_{t} = c + \phi_{1}Y_{t-1} + e_{t} \]
  • \(AR(1)\) model is stationary if \(\lvert \phi_{1} \rvert \lt 1\)
  • e.g. \(Y_{t} = 10 + 0.4 Y_{t-1} + e_{t}\) is stationary, since \(\lvert 0.4 \rvert \lt 1\)
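A simulation sketch of the example model \(Y_{t} = 10 + 0.4 Y_{t-1} + e_{t}\); for a stationary AR(1) the mean settles at \(c / (1 - \phi_{1})\):

```python
import numpy as np

rng = np.random.default_rng(3)
c, phi1 = 10.0, 0.4    # |phi1| < 1, so the process is stationary
n = 10_000
e = rng.normal(0.0, 1.0, size=n)

Y = np.empty(n)
Y[0] = c / (1 - phi1)  # start at the theoretical stationary mean
for t in range(1, n):
    # Y_t = c + phi1 * Y_{t-1} + e_t
    Y[t] = c + phi1 * Y[t - 1] + e[t]

# The sample mean should be close to c / (1 - phi1)
print(Y.mean())
```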

\(AR(2)\) model

\[ Y_{t} = c + \phi_{1}Y_{t-1} + \phi_{2}Y_{t-2} + e_{t} \]
  • \(AR(2)\) model is stationary if \(\lvert \phi_{2} \rvert \lt 1\), \(\phi_{1} + \phi_{2} \lt 1\), and \(\phi_{2} - \phi_{1} \lt 1\)
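The AR(2) stationarity region is a triangle in the \((\phi_{1}, \phi_{2})\) plane, given by \(\lvert \phi_{2} \rvert \lt 1\), \(\phi_{1} + \phi_{2} \lt 1\), and \(\phi_{2} - \phi_{1} \lt 1\). A small check function (the name `ar2_is_stationary` is illustrative):

```python
def ar2_is_stationary(phi1: float, phi2: float) -> bool:
    """Stationarity (triangle) conditions for an AR(2) model."""
    return abs(phi2) < 1 and phi1 + phi2 < 1 and phi2 - phi1 < 1

print(ar2_is_stationary(0.5, 0.3))   # prints True: inside the triangle
print(ar2_is_stationary(0.5, 0.6))  # prints False: phi1 + phi2 >= 1
```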

Moving Average Process: MA(q)

The MA model uses past errors in a regression: it regresses on the last \(q\) errors.

\[ Y_{t} = c + e_{t} + \theta_{1}e_{t-1} + \dots + \theta_{q}e_{t-q} \]
  • Each value of \(Y_{t}\) can be considered a weighted moving average of past forecast errors.
  • MA model is suitable for an irregular component, \(I_{t}\).
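A simulation sketch of an MA(2) with illustrative values \(c = 2\), \(\theta_{1} = 0.6\), \(\theta_{2} = 0.3\):

```python
import numpy as np

rng = np.random.default_rng(4)
c = 2.0
theta = np.array([0.6, 0.3])   # illustrative theta_1, theta_2 for an MA(2)
q = len(theta)
n = 10_000

e = rng.normal(0.0, 1.0, size=n)
Y = np.empty(n - q)
for i in range(q, n):
    # Y_t = c + e_t + theta_1 * e_{t-1} + ... + theta_q * e_{t-q}
    Y[i - q] = c + e[i] + theta @ e[i - q:i][::-1]

# Mean is c; variance is (1 + sum(theta**2)) * sigma_e^2
print(Y.mean(), Y.var())
```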

\(ARMA(p,q)\)

\[ Y_{t} = \phi_{1}Y_{t-1} + \dots + \phi_{p}Y_{t-p} + \theta_{1}e_{t-1} + \dots + \theta_{q}e_{t-q} + e_{t} \]
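A simulation sketch of the simplest case, an ARMA(1,1), with illustrative coefficients \(\phi_{1} = 0.5\) and \(\theta_{1} = 0.4\):

```python
import numpy as np

rng = np.random.default_rng(5)
phi1, theta1 = 0.5, 0.4   # illustrative ARMA(1,1) coefficients
n = 10_000
e = rng.normal(0.0, 1.0, size=n)

Y = np.empty(n)
Y[0] = e[0]
for t in range(1, n):
    # Y_t = phi1 * Y_{t-1} + theta1 * e_{t-1} + e_t
    Y[t] = phi1 * Y[t - 1] + theta1 * e[t - 1] + e[t]

# With |phi1| < 1 the process is stationary with mean 0
print(Y.mean())
```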