L17 Model Estimation

  • After specifying the model orders, estimate the underlying parameters \(\phi_{1},\phi_{2},\dots,\theta_{1},\theta_{2},\dots\)
  • Assume
    • We know the order, \((p,q)\)
    • Model has zero mean (if not, subtract \(\bar{Y}\), fit a zero-mean ARMA process \(X_{t}\), and recover \(Y_{t} = X_{t} + \bar{Y}\)); see the sketch below
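
A minimal sketch of this demean-and-refit step in Python with NumPy (the series `y` is simulated here purely as a placeholder):

```python
import numpy as np

rng = np.random.default_rng(0)
y = 5.0 + rng.normal(size=200)   # stand-in series with nonzero mean

y_bar = y.mean()                 # sample mean \bar{Y}
x = y - y_bar                    # zero-mean series X_t; fit the ARMA model to this

# any fitted or forecast value x_hat from the zero-mean model maps back via
# y_hat = x_hat + y_bar
```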

Estimation Techniques

Method of Moments (MoME)

  • a.k.a. "Yule-Walker estimation"
  • works well only for AR models, and only for large \(n\)
  • not statistically efficient compared to MLE

Equate population moments to sample moments and solve for parameters.

  • Yule-Walker estimation for AR(1) (see the sketch after this list)
    • \(\rho_{1} = \phi\), so equating the sample and population lag-1 autocorrelations gives \(\hat{\phi} = r_{1}\)
    • the error variance \(\sigma_{e}^{2}\) follows from its relationship with the autocovariances: \(\gamma_{0} = \phi\gamma_{1} + \sigma_{e}^{2}\), so \(\hat{\sigma}_{e}^{2} = \hat{\gamma}_{0} - \hat{\phi}\hat{\gamma}_{1} = (1-\hat{\phi}^{2})\hat{\gamma}_{0}\)
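
A minimal sketch of Yule-Walker estimation for a zero-mean AR(1) in Python with NumPy; the helper name `yule_walker_ar1` and the simulated series are illustrative, not from the notes:

```python
import numpy as np

def yule_walker_ar1(y):
    """Method-of-moments (Yule-Walker) estimates for a zero-mean AR(1)."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    gamma0 = np.dot(y, y) / n                  # sample variance (zero-mean form)
    gamma1 = np.dot(y[1:], y[:-1]) / n         # sample lag-1 autocovariance
    phi_hat = gamma1 / gamma0                  # r_1, since rho_1 = phi
    sigma2_hat = gamma0 - phi_hat * gamma1     # = (1 - phi_hat**2) * gamma0
    return phi_hat, sigma2_hat

# quick check on a simulated AR(1) with phi = 0.6 and sigma_e^2 = 1
rng = np.random.default_rng(1)
e = rng.normal(size=5000)
y = np.empty_like(e)
y[0] = e[0]
for t in range(1, len(e)):
    y[t] = 0.6 * y[t - 1] + e[t]
print(yule_walker_ar1(y))                      # roughly (0.6, 1.0)
```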

Maximum Likelihood Methodology (MLE)

Assume \(e_{t} \sim \mathcal{N}(0,\sigma_{e}^{2})\) IID

  • We cannot use the joint density of the actual time series observations, \(f(y_{1}, y_{2},\dots,y_{n})\), since it cannot be broken down into a product of marginals (the observations are not independent here). Thus we make use of the errors,
\[ f(e_{1},e_{2},\dots,e_{n}) = f(e_{1})\times\dots \times f(e_{n}) \]
  • which does factor, since all the errors are independent by assumption.
  • maximize the conditional¹ log-likelihood function
\[ \ln L(\mu, \phi, \theta, \sigma_{e}^{2}) = -\dfrac{n}{2} \ln(2\pi \sigma_{e}^{2}) - \dfrac{S_{*}(\mu,\phi, \theta)}{2\sigma_{e}^{2}} \]
  • there is no closed-form maximizer in general, so numerical optimization techniques are required (see the sketch after this list).
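
A minimal sketch of the numerical step for a zero-mean Gaussian AR(1), conditioning on the first observation and minimizing the negative of the log-likelihood above with SciPy's Nelder-Mead optimizer (the helper `neg_cond_loglik` and the simulated data are illustrative assumptions):

```python
import numpy as np
from scipy.optimize import minimize

def neg_cond_loglik(params, y):
    """Negative conditional log-likelihood of a zero-mean Gaussian AR(1)."""
    phi, log_sigma2 = params
    sigma2 = np.exp(log_sigma2)        # log-parameterization keeps sigma_e^2 > 0
    e = y[1:] - phi * y[:-1]           # residuals e_t = Y_t - phi * Y_{t-1}
    n = len(e)
    s_star = np.sum(e ** 2)            # S_*(phi), the conditional sum of squares
    return 0.5 * n * np.log(2 * np.pi * sigma2) + s_star / (2 * sigma2)

# simulate an AR(1) with phi = 0.6 and sigma_e^2 = 1, then estimate
rng = np.random.default_rng(2)
e = rng.normal(size=5000)
y = np.empty_like(e)
y[0] = e[0]
for t in range(1, len(e)):
    y[t] = 0.6 * y[t - 1] + e[t]

res = minimize(neg_cond_loglik, x0=[0.0, 0.0], args=(y,), method="Nelder-Mead")
phi_hat, sigma2_hat = res.x[0], np.exp(res.x[1])
print(phi_hat, sigma2_hat)             # roughly 0.6 and 1.0
```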

MLE has many advantages: it uses all of the information in the data (not just the first two sample moments), and the resulting estimators are asymptotically efficient.

Conditional Least Squares Technique

For an AR(1) model,

\[ e_{t} = Y_{t} - \phi Y_{t-1} \]

Conditioning on \(Y_{1}\), the sum of squared errors is \(SSE(\phi) = \sum_{t=2}^{n} e_{t}^{2} = \sum_{t=2}^{n} (Y_{t} - \phi Y_{t-1})^{2}\). Setting \(\partial SSE/\partial \phi = -2\sum_{t=2}^{n} Y_{t-1}(Y_{t} - \phi Y_{t-1}) = 0\) and solving, we finally get

\[ \hat{\phi} = \dfrac{\sum_{t=2}^{n} Y_{t}Y_{t-1}}{\sum_{t=2}^{n} Y_{t-1}^{2}} \]
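
A minimal sketch of the closed-form conditional least squares estimate in Python with NumPy (the helper `cls_ar1` and the simulated series are illustrative):

```python
import numpy as np

def cls_ar1(y):
    """Conditional least squares estimate of phi for a zero-mean AR(1):
    phi_hat = sum_{t=2}^n Y_t * Y_{t-1} / sum_{t=2}^n Y_{t-1}^2."""
    y = np.asarray(y, dtype=float)
    return np.dot(y[1:], y[:-1]) / np.dot(y[:-1], y[:-1])

# check against a simulated AR(1) with phi = 0.6
rng = np.random.default_rng(3)
e = rng.normal(size=5000)
y = np.empty_like(e)
y[0] = e[0]
for t in range(1, len(e)):
    y[t] = 0.6 * y[t - 1] + e[t]
print(cls_ar1(y))                      # roughly 0.6
```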

  1. Because numerical approximation techniques require initial (starting) values, the likelihood function is conditional on them.