Basic Time Series Processes

These fundamental stochastic processes serve as the building blocks for constructing and understanding advanced time series models.

1. White Noise Process

A white noise process is the simplest form of a stationary time series.
* Definition: Let \(e_{1}, e_{2}, \dots\) be a sequence of random errors such that \(e_{t} \sim \text{IID } N(0, \sigma_{e}^{2})\). The process is defined as:
\(Y_{t} = e_{t}\)
* Stationarity: Neither the mean (\(E[Y_t] = 0\)) nor the variance (\(Var(Y_t) = \sigma_e^2\)) depends on the time point \(t\).
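
As a minimal NumPy sketch (the value \(\sigma_e = 1\) and the seed are illustrative assumptions), white noise can be simulated by drawing independent normal errors:

```python
import numpy as np

rng = np.random.default_rng(seed=42)  # fixed seed so the draw is reproducible

sigma_e = 1.0   # assumed error standard deviation (illustrative)
n = 200         # number of time points

# Y_t = e_t, with e_t ~ IID N(0, sigma_e^2)
Y = rng.normal(loc=0.0, scale=sigma_e, size=n)

# Sample mean and variance should be close to 0 and sigma_e^2
print(Y.mean(), Y.var())
```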

Why "white noise"?

Much like white light contains all visible frequencies of the spectrum with equal intensity, this noise process has a flat spectral density: every frequency contributes equal power. Just as a prism splits white light into the full range of colors with no single color dominating, decomposing white noise by frequency reveals no dominant component.

2. Moving Average (MA) Process

The MA process models the current observation as a function of current and past "shocks" (errors).
* Simple Example: \(Y_{t} = e_{t} + 0.5e_{t-1}\).
* Mechanism: The term \(0.5e_{t-1}\) makes each value a weighted average of the current and previous shock. Because the window of shocks being averaged moves forward with \(t\), this is called a moving average process (see the sketch below).
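
A minimal sketch of this simple example, assuming normal errors and a seed chosen only for reproducibility:

```python
import numpy as np

rng = np.random.default_rng(seed=42)

n = 200
e = rng.normal(size=n + 1)   # one extra shock so e_{t-1} exists at t = 1

# Y_t = e_t + 0.5 * e_{t-1}: each value averages the current and previous shock
Y = e[1:] + 0.5 * e[:-1]
```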

3. Random Walk

The Random Walk represents a process where the current value is the accumulation of all past random shocks.
* Mathematical Form: \(Y_{t} = \sum_{i=1}^t e_{i}\).
* Recursive Form: \(Y_{t} = Y_{t-1} + e_{t}\) (with \(Y_{0} = 0\))
* Characteristics: The current state depends entirely on the previous state plus one additional random error.
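
A sketch showing both forms and verifying they agree (normal shocks and the seed are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(seed=42)

n = 200
e = rng.normal(size=n)

# Summation form: Y_t = e_1 + e_2 + ... + e_t
Y = np.cumsum(e)

# Recursive form: Y_t = Y_{t-1} + e_t (with Y_0 = 0)
Y_rec = np.zeros(n)
Y_rec[0] = e[0]
for t in range(1, n):
    Y_rec[t] = Y_rec[t - 1] + e[t]

assert np.allclose(Y, Y_rec)   # the two forms produce the same path
```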

The Drunkard's Walk

This is often called the "Drunkard's Walk": each step starts from wherever the previous step ended, but its direction is completely random, so the walker's path cannot be predicted.

4. Linear Trend

A linear trend model describes a process where the mean level of the series changes systematically over time.
* Equation: \(Y_{t} = a + bt + e_{t}\) (where \(a\) and \(b\) are constants).
* Behavior: Unlike stationary processes, the mean shifts up (or down) linearly with \(t\).
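
A minimal sketch of a linear trend series (the intercept \(a = 2\) and slope \(b = 0.5\) are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(seed=42)

a, b = 2.0, 0.5          # assumed intercept and slope (illustrative)
n = 200
t = np.arange(1, n + 1)

# Y_t = a + b*t + e_t: the mean level a + b*t drifts linearly with t
Y = a + b * t + rng.normal(size=n)
```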

5. Autoregressive Process: AR(p)

Autoregressive models forecast a variable using a linear combination of its own past values.
* General Form: \(Y_{t} = c + \phi_{1}Y_{t-1} + \phi_{2}Y_{t-2} + \dots + \phi_{p} Y_{t-p} + e_{t}\)
* Concept: This involves regressing the observation against its own "lags." It is "Auto" because \(Y_{t}\) relates only to its own historical values.
* AR(1) Model: \(Y_{t} = c + \phi_{1}Y_{t-1} + e_{t}\).
* Stationarity Condition: An AR(1) process is stationary if and only if \(|\phi_{1}| < 1\).
* Example: \(Y_{t} = 10 + 0.4 Y_{t-1} + e_{t}\) is stationary since \(|0.4| < 1\).
* AR(2) Model: \(Y_{t} = c + \phi_{1}Y_{t-1} + \phi_{2}Y_{t-2} + e_{t}\).
* Stationarity Condition: Requires \(|\phi_{2}| < 1\), \(\phi_{1} + \phi_{2} < 1\), and \(\phi_{2} - \phi_{1} < 1\).
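
A sketch simulating the stationary AR(1) example above, \(Y_{t} = 10 + 0.4 Y_{t-1} + e_{t}\), via its recursive definition:

```python
import numpy as np

rng = np.random.default_rng(seed=42)

c, phi1 = 10.0, 0.4      # parameters from the example above; |phi1| < 1
n = 500

Y = np.zeros(n)
Y[0] = c / (1 - phi1)    # start at the stationary mean c / (1 - phi1)
for t in range(1, n):
    # Y_t = c + phi1 * Y_{t-1} + e_t
    Y[t] = c + phi1 * Y[t - 1] + rng.normal()

print(Y.mean())          # hovers near c / (1 - phi1) ≈ 16.67
```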

6. Moving Average Process: MA(q)

While AR models use past values, MA models use past forecast errors in a regression, looking back \(q\) errors.
* General Form: \(Y_{t} = c + e_{t} + \theta_{1}e_{t-1} + \dots + \theta_{q}e_{t-q}\)
* Utility: Each value of \(Y_{t}\) is essentially a weighted moving average of the current and past \(q\) forecast errors. This model is particularly suitable for modeling the irregular component (\(I_{t}\)) of a time series.
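
A sketch of an MA(2) built by convolving the shocks with the weights \((1, \theta_{1}, \theta_{2})\); the coefficients \(\theta_{1} = 0.6\), \(\theta_{2} = 0.3\) are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(seed=42)

c = 0.0
theta = np.array([0.6, 0.3])   # assumed theta_1, theta_2 for an MA(2)
q = len(theta)
n = 200

e = rng.normal(size=n + q)     # q extra shocks so e_{t-q} exists from the start

# Y_t = c + e_t + theta_1 * e_{t-1} + ... + theta_q * e_{t-q}
weights = np.concatenate(([1.0], theta))
Y = c + np.convolve(e, weights, mode="valid")   # yields exactly n values
```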

7. ARMA(p, q)

The ARMA model combines both Autoregressive and Moving Average components to capture complex dependencies in a parsimonious way.
* Equation: \(Y_{t} = c + \phi_{1}Y_{t-1} + \dots + \phi_{p}Y_{t-p} + e_{t} + \theta_{1}e_{t-1} + \dots + \theta_{q}e_{t-q}\)
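
A sketch generating an ARMA(1, 1) series recursively; the coefficients \(\phi_{1} = 0.5\) and \(\theta_{1} = 0.4\) are illustrative assumptions, with the constant set to zero:

```python
import numpy as np

rng = np.random.default_rng(seed=42)

c, phi1, theta1 = 0.0, 0.5, 0.4   # assumed ARMA(1, 1) parameters
n = 500

e = rng.normal(size=n)
Y = np.zeros(n)
for t in range(1, n):
    # Y_t = c + phi1 * Y_{t-1} + e_t + theta1 * e_{t-1}
    Y[t] = c + phi1 * Y[t - 1] + e[t] + theta1 * e[t - 1]
```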