Model Estimation

Once the model orders \((p, q)\) are specified, the next step is to estimate the underlying parameters \(\phi_{1}, \phi_{2}, \dots\) and \(\theta_{1}, \theta_{2}, \dots\).

Assumptions for Estimation

To simplify the estimation process, we generally assume:
* Known Orders: The model orders \((p, q)\) have already been identified.
* Zero Mean: The model is assumed to have a zero mean. If the original series has a non-zero mean, we subtract the sample mean (\(\bar{Y}\)) to fit a zero-mean ARMA process (\(X_{t}\)) and later transform it back using \(Y_{t} = X_{t} + \bar{Y}\).
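The mean adjustment can be sketched in a few lines; the series below is simulated purely for illustration, so the mean of 5.0 and the sample size are assumptions, not values from the text:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical series with a non-zero mean (values chosen for illustration).
y = 5.0 + rng.standard_normal(200)

y_bar = y.mean()      # sample mean Y-bar
x = y - y_bar         # zero-mean series X_t used for fitting
y_back = x + y_bar    # transform back: Y_t = X_t + Y-bar
```

The fitted zero-mean process `x` has sample mean exactly zero, and adding `y_bar` back recovers the original series.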


Estimation Techniques

1. Method of Moments (MoM)

Also known as Yule-Walker Estimation, this technique involves equating population moments to sample moments and solving for the parameters.
* Scope: It primarily works for AR models when the sample size \(n\) is large.
* Efficiency: It is generally considered not very efficient compared to other methods.
* Example (AR(1)): For an \(AR(1)\) process, the parameter estimate is simply \(\hat{\phi} = r_{1}\), the lag-1 sample autocorrelation. The error variance (\(\sigma_{e}^{2}\)) is then recovered from its relationship with the auto-covariance function, \(\gamma_{0} = \sigma_{e}^{2}/(1-\phi^{2})\), giving \(\hat{\sigma}_{e}^{2} = \hat{\gamma}_{0}(1-\hat{\phi}^{2})\).
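A minimal sketch of the AR(1) method-of-moments estimate, assuming a simulated Gaussian series (the true parameter 0.6, noise variance 1.0, and sample size are all illustrative choices, not values from the text):

```python
import numpy as np

rng = np.random.default_rng(1)
phi_true, sigma_e, n = 0.6, 1.0, 5000  # assumed simulation settings

# Simulate a zero-mean AR(1): X_t = phi * X_{t-1} + e_t.
x = np.zeros(n)
e = rng.normal(0.0, sigma_e, size=n)
for t in range(1, n):
    x[t] = phi_true * x[t - 1] + e[t]

# Method-of-moments (Yule-Walker) estimate: phi-hat = r_1.
gamma0_hat = np.mean(x * x)            # lag-0 sample autocovariance
gamma1_hat = np.mean(x[1:] * x[:-1])   # lag-1 sample autocovariance
phi_hat = gamma1_hat / gamma0_hat      # lag-1 sample autocorrelation r_1

# Error variance via gamma_0 = sigma_e^2 / (1 - phi^2).
sigma_e2_hat = gamma0_hat * (1.0 - phi_hat**2)
```

With a large `n`, both `phi_hat` and `sigma_e2_hat` land close to the simulation's true values, consistent with the note that MoM is reliable mainly for large samples.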

2. Maximum Likelihood Estimation (MLE)

MLE is a powerful technique that assumes the residuals are IID and follow a specific distribution, typically \(e_{t} \sim \mathcal{N}(0, \sigma_{e}^{2})\).
* Residual-Based Approach: Since time series observations \((y_{1}, \dots, y_{n})\) are dependent, their joint density cannot be broken into simple marginal products. Instead, we utilize the joint density of the independent residuals: \(f(e_{1}, \dots, e_{n}) = f(e_{1}) \times \dots \times f(e_{n})\).
* Conditional Log-Likelihood: Because numerical approximation techniques require initial values, a conditional log-likelihood function is used:
\[\ln L(\mu, \phi, \theta, \sigma_{e}^{2}) = -\dfrac{n}{2} \ln(2\pi \sigma_{e}^{2}) - \dfrac{S_{*}(\mu, \phi, \theta)}{2\sigma_{e}^{2}}\]
* Computation: This method often requires numerical approximation techniques to find the maximum.
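A sketch of this numerical maximization for a zero-mean AR(1), using a crude grid search as a stand-in for a proper optimizer (the simulation settings and grid resolution are assumptions made for illustration). Here \(S_{*}\) is the conditional sum of squares, conditioning on the first observation, and \(\sigma_{e}^{2}\) is profiled out at its maximizing value:

```python
import numpy as np

rng = np.random.default_rng(2)
phi_true, n = 0.5, 2000  # assumed simulation settings

# Simulate a zero-mean Gaussian AR(1).
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi_true * x[t - 1] + rng.normal()

def neg_cond_loglik(phi):
    """Negative conditional log-likelihood, conditioning on x_1.

    S_*(phi) is the conditional sum of squares; sigma_e^2 is replaced
    by its maximizing value S_* / m (profiling).
    """
    resid = x[1:] - phi * x[:-1]     # conditional residuals e_t
    s_star = np.sum(resid**2)        # S_*(phi)
    m = len(resid)
    sigma2 = s_star / m              # profiled error variance
    return 0.5 * m * np.log(2 * np.pi * sigma2) + s_star / (2 * sigma2)

# Crude numerical maximization: minimize the negative log-likelihood
# over a grid of candidate phi values in the stationary region.
phis = np.linspace(-0.99, 0.99, 397)
phi_mle = phis[np.argmin([neg_cond_loglik(p) for p in phis])]
```

In practice a gradient-based optimizer would replace the grid, but the grid keeps the sketch self-contained and makes the "numerical approximation" step explicit.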

3. Conditional Least Squares Technique

This technique focuses on minimizing the Sum of Squared Errors (SSE).
* Example (AR(1)): Since the error is defined as \(e_{t} = Y_{t} - \phi_{1}Y_{t-1}\), we calculate the SSE as \(\sum e_{t}^{2}\).
* Resulting Estimator: The parameter estimate for an \(AR(1)\) model is:
\[\hat{\phi} = \dfrac{\sum_{t=2}^{n} Y_{t}Y_{t-1}}{\sum_{t=2}^{n} Y_{t-1}^{2}}\]
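Because the estimator has a closed form, the sketch is a single line once the series is available; the simulated series and its parameters are again assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)
phi_true, n = 0.7, 3000  # assumed simulation settings

# Simulate a zero-mean AR(1).
y = np.zeros(n)
for t in range(1, n):
    y[t] = phi_true * y[t - 1] + rng.normal()

# Conditional least squares: minimize SSE = sum (y_t - phi * y_{t-1})^2.
# Setting d(SSE)/d(phi) = 0 yields the closed-form ratio below.
phi_hat = np.sum(y[1:] * y[:-1]) / np.sum(y[:-1] ** 2)
```

Note this is the same formula as an ordinary least-squares regression of \(Y_{t}\) on \(Y_{t-1}\) without an intercept.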