
L33 Some Specific Multivariate Time Series Models

Linear Filtering

  • \(X_{t}\) is an \(r\)-dimensional input series
  • \(Y_{t}\) is a \(k\)-dimensional output series

A multivariate linear (time-invariant) filter relating \(X_{t}\) and \(Y_{t}\):

\[ Y_{t} = \sum_{j=-\infty}^\infty \Psi_{j} X_{t-j} \]

where, \(\Psi_{j}\) are \(k\times r\) matrices for all \(j\).

If \(\Psi_{j}=0\) for all \(j \lt 0\) (a causal filter), then

$$
Y_{t} = \sum_{j=0}^\infty \Psi_{j} X_{t-j}
$$
\(Y_{t}\) is expressible only in terms of present and past values of the input series \(X_{t}\).

  • Let \(\lVert A \rVert^{2} = \operatorname{tr}(A'A)\) denote the squared norm of \(A\)
  • The linear filter is "stable" if \(\sum_{j=0}^\infty \lVert \Psi_{j} \rVert \lt \infty\)
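As a minimal numerical sketch of the stability condition, consider the (hypothetical) filter \(\Psi_{j} = A^j\) for an illustrative matrix \(A\); the partial sums of \(\lVert \Psi_{j} \rVert\) settle to a finite value when the powers of \(A\) decay, so the filter is stable:

```python
import numpy as np

# Illustrative filter Psi_j = A^j for a made-up matrix A whose eigenvalues
# lie inside the unit circle, so its powers decay geometrically.
A = np.array([[0.5, 0.1],
              [0.0, 0.4]])

def filter_norm_partial_sum(A, n_terms):
    """Compute sum_{j=0}^{n_terms-1} ||A^j||, with ||M||^2 = tr(M'M)."""
    total, power = 0.0, np.eye(A.shape[0])
    for _ in range(n_terms):
        total += np.sqrt(np.trace(power.T @ power))
        power = power @ A
    return total

s_50 = filter_norm_partial_sum(A, 50)
s_100 = filter_norm_partial_sum(A, 100)
print(s_50, s_100)  # the partial sums have essentially converged
```

Doubling the number of terms changes the sum only negligibly, which is the numerical signature of \(\sum_{j} \lVert \Psi_{j} \rVert \lt \infty\).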

Theorem

If

  • Linear filter is stable
  • Input random vectors \(\{ X_{t} \}\) have finite second moments

Then

  • The filtering representation exists, is unique, and converges in mean square:
\[ E\left( Y_{t} - \sum_{j=-n}^n \Psi_{j} X_{t-j} \right)\left( Y_{t} - \sum_{j=-n}^n \Psi_{j}X_{t-j} \right)' \to 0,\quad \text{as } n\to \infty \]

If the linear filter is stable and \(X_{t}\) is stationary with cross-covariances \(\Gamma_{x}(l)\), then \(Y_{t}\) is stationary with cross-covariances

\[ \Gamma_{y}(l) = \operatorname{Cov}(Y_{t}, Y_{t+l}) = \sum_{i=-\infty}^\infty \sum_{j=-\infty}^\infty \Psi_{i}\Gamma_{x}(l+i-j)\Psi_{j}' \]
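A small sketch (with made-up \(2\times 2\) filter matrices) evaluating the double sum \(\sum_{i}\sum_{j}\Psi_{i}\Gamma_{x}(l+i-j)\Psi_{j}'\) for a finite causal filter driven by white noise, where \(\Gamma_{x}(0)=\Sigma\) and \(\Gamma_{x}(h)=0\) for \(h\neq 0\):

```python
import numpy as np

# Illustrative finite filter (Psi_0, Psi_1, Psi_2) and noise covariance Sigma.
Psi = [np.eye(2),
       np.array([[0.5, 0.2], [0.1, 0.3]]),
       np.array([[0.25, 0.0], [0.0, 0.1]])]
Sigma = np.array([[1.0, 0.3], [0.3, 2.0]])

def gamma_x(h):
    # White-noise input: Gamma_x(0) = Sigma, zero at all other lags.
    return Sigma if h == 0 else np.zeros((2, 2))

def gamma_y(l):
    # Gamma_y(l) = sum_i sum_j Psi_i Gamma_x(l + i - j) Psi_j'
    g = np.zeros((2, 2))
    for i, Pi in enumerate(Psi):
        for j, Pj in enumerate(Psi):
            g += Pi @ gamma_x(l + i - j) @ Pj.T
    return g

# Sanity check implied by the formula: Gamma_y(-l) = Gamma_y(l)'.
print(np.allclose(gamma_y(-1), gamma_y(1).T))
```

The symmetry \(\Gamma_{y}(-l)=\Gamma_{y}(l)'\) falls out of the formula directly and is a useful check on any implementation.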

Wold (MA) representation of the Series

  • "infinite MA representation of a stationary vector process"

Let \(\{ Y_{t} \}\) be a multivariate stationary process with mean \(\mu\)

$$
Y_{t} - \mu = \sum_{j=0}^\infty \Psi_{j}e_{t-j}
$$
$$
Y_{t} = \mu + \Psi(B)e_{t}
$$
where \(\Psi(B) = \sum_{j=0}^\infty \Psi_{j}B^j\) is a \(k \times k\) matrix power series in the backward shift operator \(B\), with \(\Psi_{0} = I_{k}\) and \(\{e_{t}\}\) white noise.

Vector Autoregressive Moving Average (VARMA)

\(VARMA(p,q)\) process

\[ \Phi_{p}(B)(Y_{t}-\mu) = \Theta_{q}(B)e_{t} \]

where,

  • \(\Phi_{p}(B)= I_{k} - \Phi_{1}B - \dots - \Phi_{p}B^p\)
    • the \(\Phi_{i}\) here are \(k\times k\) matrices, not scalar parameters
  • \(\Theta_{q}(B)= I_{k} + \Theta_{1}B + \dots + \Theta_{q}B^q\)
    • the \(\Theta_{i}\) here are \(k\times k\) matrices, not scalar parameters
  • \(q=0\) \(\implies\) \(VAR(p)\)
  • \(p=0\) \(\implies\) \(VMA(q)\)
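A minimal simulation sketch of a VARMA(1,1) process (all parameter matrices here are illustrative): \((I - \Phi_{1}B)(Y_{t}-\mu) = (I + \Theta_{1}B)e_{t}\), i.e. \(Y_{t} = \mu + \Phi_{1}(Y_{t-1}-\mu) + e_{t} + \Theta_{1}e_{t-1}\):

```python
import numpy as np

rng = np.random.default_rng(0)
k = 2
mu = np.array([1.0, -1.0])
Phi1 = np.array([[0.5, 0.1], [0.0, 0.4]])    # eigenvalues inside unit circle
Theta1 = np.array([[0.3, 0.0], [0.2, 0.2]])

n = 5000
e = rng.standard_normal((n, k))
Y = np.zeros((n, k))
Y[0] = mu + e[0]
for t in range(1, n):
    # Y_t = mu + Phi1 (Y_{t-1} - mu) + e_t + Theta1 e_{t-1}
    Y[t] = mu + Phi1 @ (Y[t - 1] - mu) + e[t] + Theta1 @ e[t - 1]

print(Y.mean(axis=0))  # close to mu, as expected for a stationary process
```

Setting `Theta1` to zero gives a pure VAR(1) simulation; setting `Phi1` to zero gives a VMA(1).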

A VARMA process is stationary if it can be represented as a convergent vector moving average process of infinite order:

$$
Y_{t} = \mu + e_{t} + \sum_{j=1}^\infty \Psi_{j}e_{t-j} = \mu + \Psi(B)e_{t}
$$
Taking \(Y_{t}-\mu\) to one side, we get

\[ Y_{t} - \mu = \Psi(B)e_{t} = \Phi_{p}(B)^{-1}\Theta_{q}(B)e_{t} \]

And so,

$$
\Psi(B) = I_{k} + \sum_{j=1}^\infty \Psi_{j}B^j = \Phi_{p}(B)^{-1}\Theta_{q}(B)
$$
  • This is not in VARMA form, because there are no \(Y_{t-i}\) terms on the right-hand side; it is a pure VMA representation.
  • If the process can be written as a convergent VMA, it is stationary.
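For a VARMA(1,1) the VMA(\(\infty\)) weights \(\Psi_{j}\) can be found by matching coefficients in \((I - \Phi_{1}B)\Psi(B) = I + \Theta_{1}B\), which gives \(\Psi_{0}=I\), \(\Psi_{1}=\Phi_{1}+\Theta_{1}\), and \(\Psi_{j}=\Phi_{1}\Psi_{j-1}\) for \(j\ge 2\). A sketch with illustrative matrices:

```python
import numpy as np

Phi1 = np.array([[0.5, 0.1], [0.0, 0.4]])
Theta1 = np.array([[0.3, 0.0], [0.2, 0.2]])

def vma_weights(Phi1, Theta1, n_terms):
    """VMA(inf) weights of a VARMA(1,1) by coefficient matching:
    Psi_0 = I, Psi_1 = Phi1 + Theta1, Psi_j = Phi1 Psi_{j-1} (j >= 2)."""
    k = Phi1.shape[0]
    Psi = [np.eye(k), Phi1 + Theta1]
    for _ in range(2, n_terms):
        Psi.append(Phi1 @ Psi[-1])
    return Psi

Psi = vma_weights(Phi1, Theta1, 30)
norms = [np.sqrt(np.trace(P.T @ P)) for P in Psi]
print(norms[-1])  # weights decay geometrically -> convergent VMA, stationary
```

The geometric decay of \(\lVert \Psi_{j} \rVert\) is exactly the convergence the stationarity argument above requires.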

\[ \Phi(z)^{-1} = \det(\Phi(z))^{-1}\Phi^*(z) \]
  • where \(\Phi^*(z) = \operatorname{adj}(\Phi(z))\), the adjugate of \(\Phi(z)\)

Thus the process is

\[ Y_{t} = \mu + \det(\Phi(B))^{-1} \Phi^*(B) \Theta_{q}(B)e_{t} \]

The process is stationary if the power series \(\det(\Phi(z))^{-1}\) converges for \(\lvert z \rvert \le 1\), i.e. if all roots of \(\det(\Phi(z)) = 0\) lie outside the unit circle.
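A sketch of this root check for a VAR(1) with an illustrative \(\Phi_{1}\): here \(\det(\Phi(z)) = \det(I - \Phi_{1}z)\) vanishes exactly at \(z = 1/\lambda\) for each eigenvalue \(\lambda\) of \(\Phi_{1}\), so all roots lie outside the unit circle iff every eigenvalue of \(\Phi_{1}\) lies strictly inside it:

```python
import numpy as np

Phi1 = np.array([[0.5, 0.1], [0.0, 0.4]])

# det(I - Phi1 z) = 0  <=>  z = 1 / lambda for each eigenvalue lambda of Phi1.
eigvals = np.linalg.eigvals(Phi1)
roots = 1.0 / eigvals
stationary = np.all(np.abs(eigvals) < 1.0)
print(stationary, np.abs(roots))
```

For higher-order VAR(\(p\)) models the same check is usually applied to the eigenvalues of the companion matrix, which plays the role of \(\Phi_{1}\) here.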

VMA (order \(q\))

\[ Y_{t} = \mu + e_{t} + \sum_{j=1}^q \Theta_{j}e_{t-j} = \mu + \Theta(B)e_{t} \]

where

\[ \Theta(B) = I_{k} + \Theta_{1}B + \Theta_{2}B^{2} + \dots + \Theta_{q}B^q \]

Vector MA(1)

Assume, \(\mu = 0\)

\[ Y_{t} = e_{t} + \Theta_{1}e_{t-1} \]

By recursive substitution (using \(e_{t-1} = Y_{t-1} - \Theta_{1}e_{t-2}\) repeatedly),

\[ Y_{t} = \Theta_{1}Y_{t-1} - \Theta_{1}^{2}Y_{t-2} + \Theta_{1}^{3}Y_{t-3} - \dots + e_{t} \equiv VAR(\infty) \]

Thus an invertible vector MA(1) process (eigenvalues of \(\Theta_{1}\) inside the unit circle) can be rewritten as a vector autoregressive process of infinite order.
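The inversion can be verified numerically (with an illustrative invertible \(\Theta_{1}\)): simulate \(Y_{t}=e_{t}+\Theta_{1}e_{t-1}\), then check that a long truncation of the alternating VAR(\(\infty\)) sum recovers \(Y_{t}\) from its own past plus \(e_{t}\):

```python
import numpy as np

rng = np.random.default_rng(1)
Theta1 = np.array([[0.4, 0.1], [0.0, 0.3]])  # eigenvalues inside unit circle

n, k = 500, 2
e = rng.standard_normal((n, k))
Y = e.copy()
Y[1:] += e[:-1] @ Theta1.T          # Y_t = e_t + Theta1 e_{t-1}

# Truncated VAR(infinity): Y_t ~ sum_{j>=1} (-1)^{j+1} Theta1^j Y_{t-j} + e_t
t, J = n - 1, 50
ar_sum = np.zeros(k)
power = np.eye(k)
for j in range(1, J + 1):
    power = power @ Theta1          # Theta1^j
    ar_sum += (-1) ** (j + 1) * power @ Y[t - j]

resid = Y[t] - ar_sum - e[t]
print(np.abs(resid).max())  # ~0 up to the geometric truncation error
```

The leftover term is \((-\Theta_{1})^{J+1}e_{t-J-1}\), which is negligible for \(J=50\) because \(\Theta_{1}\)'s powers decay geometrically.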

Autocovariance Matrix of a Process

\[ \Gamma(l) = \operatorname{Cov}(Y_{t}, Y_{t+l}) = E\left[ (Y_{t}-\mu)(Y_{t+l}-\mu)' \right] \]
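A sketch of estimating \(\Gamma(l)\) from data, checked against the exact values for a simulated VMA(1) with unit-variance noise (illustrative \(\Theta_{1}\)), for which \(\Gamma(0)=I+\Theta_{1}\Theta_{1}'\) and \(\Gamma(1)=\Theta_{1}'\):

```python
import numpy as np

rng = np.random.default_rng(2)
Theta1 = np.array([[0.4, 0.1], [0.0, 0.3]])

n, k = 200_000, 2
e = rng.standard_normal((n, k))
Y = e.copy()
Y[1:] += e[:-1] @ Theta1.T          # Y_t = e_t + Theta1 e_{t-1}

def sample_gamma(Y, l):
    """Sample autocovariance: (1/n) sum_t (Y_t - Ybar)(Y_{t+l} - Ybar)'."""
    Yc = Y - Y.mean(axis=0)
    return (Yc[:len(Y) - l].T @ Yc[l:]) / len(Y)

G0_hat, G1_hat = sample_gamma(Y, 0), sample_gamma(Y, 1)
print(np.round(G0_hat, 2))  # close to I + Theta1 Theta1'
print(np.round(G1_hat, 2))  # close to Theta1'
```

With this convention \(\hat\Gamma(1)\) estimates \(E[(Y_{t}-\mu)(Y_{t+1}-\mu)']\), matching the definition above; note \(\Gamma(-l)=\Gamma(l)'\), so only non-negative lags need to be computed.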