
Some Specific Multivariate TS models

Multivariate models expand the concepts of univariate time series into a vector-matrix framework, allowing for the modeling of systems where variables influence each other across time.

1. Linear Filtering

A multivariate linear time-invariant filter relates an \(r\)-dimensional input series \(X_{t}\) to a \(k\)-dimensional output series \(Y_{t}\).

\[Y_{t} = \sum_{j=-\infty}^\infty \Psi_{j} X_{t-j}\]

Where \(\Psi_{j}\) are \(k \times r\) matrices for all \(j\).

  • Causality: If \(\Psi_{j}=0\) for all \(j < 0\):
    \[Y_{t} = \sum_{j=0}^\infty \Psi_{j} X_{t-j}\]
    In this case, \(Y_{t}\) is expressible only in terms of the present and past values of the input series \(X_{t}\).
  • Stability: Let \(\lVert A \rVert\) denote the norm of \(A\), defined by \(\lVert A \rVert^{2} = \operatorname{tr}(A'A)\). The linear filter is stable if \(\sum_{j=-\infty}^\infty \lVert \Psi_{j} \rVert < \infty\).
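As a quick numerical sanity check, the sketch below (the matrix \(M\) is invented for illustration) computes the norm \(\lVert A \rVert = \sqrt{\operatorname{tr}(A'A)}\) and confirms that a causal filter with geometrically decaying coefficients \(\Psi_{j} = (0.5)^{j} M\) satisfies the stability condition:

```python
import numpy as np

def mat_norm(A):
    """Norm used in the stability condition: ||A||^2 = tr(A'A) (Frobenius norm)."""
    return np.sqrt(np.trace(A.T @ A))

# Hypothetical causal filter with geometrically decaying coefficients:
# Psi_j = (0.5)**j * M, so sum_j ||Psi_j|| = ||M|| / (1 - 0.5) < inf  ->  stable.
M = np.array([[0.4, 0.1],
              [0.2, 0.3]])
partial_sum = sum(mat_norm(0.5**j * M) for j in range(200))
print(partial_sum)   # plateaus near mat_norm(M) / (1 - 0.5)
```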

Theorem: Convergence and Stationarity

If the linear filter is stable and the input random vectors \(\{ X_{t} \}\) have finite second moments, then:

  1. The filtered series \(Y_{t}\) exists, is unique, and the partial sums converge in mean square:
    \[E\left[\left( Y_{t} - \sum_{j=-n}^n \Psi_{j} X_{t-j} \right)\left( Y_{t} - \sum_{j=-n}^n \Psi_{j}X_{t-j} \right)'\right] \to 0, \quad \text{as } n\to \infty\]

  2. If \(X_{t}\) is stationary with cross covariances \(\Gamma_{x}(l)\), then \(Y_{t}\) is stationary with cross covariances:
    \[\Gamma_{y}(l) = \operatorname{Cov}(Y_{t}, Y_{t+l}) = \sum_{i=-\infty}^\infty \sum_{j=-\infty}^\infty \Psi_{i}\Gamma_{x}(l+i-j)\Psi_{j}'\]
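The covariance formula can be checked numerically. The sketch below uses made-up matrices \(\Psi_{0}, \Psi_{1}\) and a white-noise input (\(\Gamma_{x}(0)=I\), \(\Gamma_{x}(m)=0\) otherwise), evaluates the double sum directly, and compares it to the hand-derived closed forms:

```python
import numpy as np

# Two-tap causal filter Y_t = Psi0 X_t + Psi1 X_{t-1}, with X_t white noise.
# Both coefficient matrices are arbitrary illustrative choices.
Psi = [np.array([[1.0, 0.0], [0.5, 1.0]]),   # Psi_0
       np.array([[0.3, 0.1], [0.0, 0.2]])]   # Psi_1

def gamma_x(m):
    """White-noise input: Gamma_x(0) = I, zero at every other lag."""
    return np.eye(2) if m == 0 else np.zeros((2, 2))

def gamma_y(l):
    """Direct evaluation of Gamma_y(l) = sum_i sum_j Psi_i Gamma_x(l+i-j) Psi_j'."""
    return sum(Psi[i] @ gamma_x(l + i - j) @ Psi[j].T
               for i in range(2) for j in range(2))

# For white noise the double sum collapses to Gamma_y(l) = sum_i Psi_i Psi_{i+l}'.
print(np.allclose(gamma_y(0), Psi[0] @ Psi[0].T + Psi[1] @ Psi[1].T))  # True
print(np.allclose(gamma_y(1), Psi[0] @ Psi[1].T))                      # True
```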


2. Wold (MA) representation of the Series

This represents the "infinite MA representation of a stationary vector process." Let \(\{ Y_{t} \}\) be a multivariate stationary process with mean \(\mu\):

\[Y_{t} - \mu = \sum_{j=0}^\infty \Psi_{j}e_{t-j}\]

\[Y_{t} = \mu + \Psi(B)e_{t}\]

Where \(\{ e_{t} \}\) is a \(k\)-dimensional white-noise sequence and \(\Psi(B)\) is a \(k \times k\) matrix polynomial in the backward shift operator, defined as \(\Psi(B) = \sum_{j=0}^\infty \Psi_{j}B^j\).


3. Vector Autoregressive Moving Average (VARMA)

The \(VARMA(p,q)\) process is defined as:

\[\Phi_{p}(B)(Y_{t}-\mu) = \Theta_{q}(B)e_{t}\]

Where:
* \(\Phi_{p}(B) = \Phi_{0} - \Phi_{1}B - \dots - \Phi_{p}B^p\): the autoregressive matrix polynomial. Note that the \(\Phi_{i}\) are \(k \times k\) matrices, not scalar parameters.
* \(\Theta_{q}(B) = \Theta_{0} - \Theta_{1}B - \dots - \Theta_{q}B^q\): the moving average matrix polynomial; the \(\Theta_{i}\) are likewise \(k \times k\) matrices.
* Special Cases:
* \(q=0 \implies VAR(p)\)
* \(p=0 \implies VMA(q)\)

A process is stationary if we can represent it as a convergent vector moving average process of infinite order:
\[Y_{t} = \mu + e_{t} + \sum_{j=1}^\infty \Psi_{j}e_{t-j} = \mu + \Psi(B)e_{t}\]

Isolating \(Y_{t} - \mu\):
\[Y_{t} - \mu = \Psi(B)e_{t} = \Phi_{p}(B)^{-1}\Theta_{q}(B)e_{t}\]
Thus, \(\Psi(B) = I + \sum_{j=1}^\infty \Psi_{j}B^j = \Phi_{p}(B)^{-1}\Theta_{q}(B)\).
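For a concrete special case, take a VAR(1) with \(\Phi(B) = I - \Phi_{1}B\) (so \(\Phi_{0} = I\) and \(\Theta(B) = I\)); then \(\Psi(B) = (I - \Phi_{1}B)^{-1}\), which gives \(\Psi_{j} = \Phi_{1}^j\). A minimal sketch, with an arbitrary illustrative \(\Phi_{1}\), builds the \(\Psi_{j}\) by the recursion implied by \(\Phi(B)\Psi(B) = I\) and checks them against matrix powers:

```python
import numpy as np
from numpy.linalg import matrix_power

# Illustrative VAR(1) coefficient matrix (eigenvalues inside the unit circle).
Phi1 = np.array([[0.5, 0.1],
                 [0.2, 0.4]])

# Recursion from Phi(B) Psi(B) = I:  Psi_0 = I,  Psi_j = Phi1 @ Psi_{j-1}.
Psi = [np.eye(2)]
for j in range(1, 6):
    Psi.append(Phi1 @ Psi[-1])

# For a VAR(1) this recursion should reproduce Psi_j = Phi1**j exactly.
print(all(np.allclose(Psi[j], matrix_power(Phi1, j)) for j in range(6)))  # True
```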

Note on Stationarity

While the RHS above looks like a VMA (no \(Y_{t-i}\) terms), it is simply a representation of the VARMA process. To check stationarity, we use the property \(\Phi(z)^{-1} = \det(\Phi(z))^{-1}\Phi^*(z)\), where \(\Phi^*(z) = \operatorname{adj}(\Phi(z))\) is the adjugate matrix.
The process is stationary if the power series expansion of \(\det(\Phi(z))^{-1}\) converges for \(|z| \leq 1\); equivalently, if \(\det(\Phi(z)) \neq 0\) for all \(|z| \leq 1\), i.e. all roots of \(\det(\Phi(z)) = 0\) lie outside the unit circle.
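In the VAR(1) special case \(\Phi(z) = I - \Phi_{1}z\), the roots of \(\det(\Phi(z))\) are the reciprocals of the eigenvalues of \(\Phi_{1}\), so the determinant condition reduces to an eigenvalue check. A minimal sketch (both matrices below are illustrative):

```python
import numpy as np

def is_stationary_var1(Phi1):
    """For a VAR(1), det(I - Phi1*z) = 0 exactly at z = 1/lambda for each
    eigenvalue lambda of Phi1, so all roots lie outside the unit circle
    iff every eigenvalue of Phi1 has modulus < 1."""
    return bool(np.max(np.abs(np.linalg.eigvals(Phi1))) < 1)

print(is_stationary_var1(np.array([[0.5, 0.1], [0.2, 0.4]])))   # True
print(is_stationary_var1(np.array([[1.1, 0.0], [0.0, 0.3]])))   # False
```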


4. Vector Moving Average (VMA)

A \(VMA(q)\) process:
\[Y_{t} = \mu + e_{t} + \sum_{j=1}^q \Theta_{j}e_{t-j} = \mu + \Theta(B)e_{t}\]
Where \(\Theta(B) = I_{k} + \Theta_{1}B + \Theta_{2}B^{2} + \dots + \Theta_{q}B^q\).

Vector MA(1) Case

Assume \(\mu = 0\):
\[Y_{t} = e_{t} + \Theta_{1}e_{t-1}\]
By recursive substitution, this can be viewed as an infinite autoregressive process (\(VAR(\infty)\)):
\[e_{t} = Y_{t} - \Theta_{1}e_{t-1} = Y_{t} - \Theta_{1}(Y_{t-1} - \Theta_{1}e_{t-2}) = \dots\]
\[Y_{t} = \Theta_{1}Y_{t-1} - \Theta_{1}^{2}Y_{t-2} + \Theta_{1}^{3}Y_{t-3} - \dots + e_{t}\]

This shows that an invertible vector moving average process (here, one whose \(\Theta_{1}\) has all eigenvalues inside the unit circle, so the expansion converges) can be rewritten as a \(VAR(\infty)\).
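The inversion can be verified numerically: simulate a VMA(1) with an invertible \(\Theta_{1}\) (eigenvalues inside the unit circle; the matrix below is made up), then reconstruct \(e_{t}\) from the truncated \(VAR(\infty)\) with coefficients \((-\Theta_{1})^j\):

```python
import numpy as np

rng = np.random.default_rng(0)
Theta1 = np.array([[0.4, 0.1],
                   [0.0, 0.3]])   # eigenvalues 0.4, 0.3 -> invertible VMA(1)

# Simulate a VMA(1): Y_t = e_t + Theta1 e_{t-1}  (mu = 0).
n = 300
e = rng.standard_normal((n, 2))
Y = e.copy()
Y[1:] += e[:-1] @ Theta1.T

# Truncated VAR(inf) inversion: e_t ~= sum_{j=0}^{J} (-Theta1)^j Y_{t-j},
# with truncation error of order ||Theta1||^(J+1).
J = 30
Pi = [np.linalg.matrix_power(-Theta1, j) for j in range(J + 1)]
t = n - 1
e_hat = sum(Pi[j] @ Y[t - j] for j in range(J + 1))
print(np.max(np.abs(e_hat - e[t])))   # tiny: truncation error ~ 0.4**31
```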


5. Autocovariance Matrix of a Process

In a multivariate context, the autocovariance at lag \(l\) is a \(k \times k\) matrix that captures dependencies across all constituent series:
\[\Gamma(l) = E[(Y_{t}-\mu)(Y_{t+l}-\mu)']\]
Unlike the univariate case, \(\Gamma(l)\) is not symmetric in \(l\); instead, \(\Gamma(-l) = \Gamma(l)'\).
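A sample counterpart of \(\Gamma(l)\) is straightforward to compute. In the helper below, the function name and the \(1/n\) normalization are my own choices (not from the text); it estimates the lag-\(l\) autocovariance matrix from an \((n, k)\) data matrix:

```python
import numpy as np

def sample_autocov(Y, l):
    """Sample version of Gamma(l) = E[(Y_t - mu)(Y_{t+l} - mu)'] for an
    (n, k) array Y and lag l >= 0, using the conventional 1/n normalization."""
    n = Y.shape[0]
    Yc = Y - Y.mean(axis=0)          # center each component series
    return Yc[: n - l].T @ Yc[l:] / n

rng = np.random.default_rng(1)
Y = rng.standard_normal((500, 2))    # illustrative white-noise data

# Gamma(0) is symmetric by construction; Gamma(l) for l > 0 generally is not.
print(np.allclose(sample_autocov(Y, 0), sample_autocov(Y, 0).T))   # True
print(sample_autocov(Y, 1).shape)                                  # (2, 2)
```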