L17 Model Estimation
- After specifying the model orders, estimate the underlying parameters \(\phi_{1},\phi_{2},\dots,\theta_{1},\theta_{2},\dots\)
- Assume:
    - we know the order, \((p,q)\)
    - the model has zero mean (if not, subtract \(\bar{Y}\), fit a zero-mean ARMA process \(X_{t}\), and recover \(Y_{t} = X_{t} + \bar{Y}\); see the sketch after this list)
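A minimal sketch of this demean-and-recover step, assuming the observed series sits in a NumPy array `y` (the name and the simulated data are illustrative, not from the notes):

```python
import numpy as np

# Stand-in series with a nonzero mean; `y` and its values are illustrative only.
rng = np.random.default_rng(0)
y = 5.0 + rng.normal(size=200)

y_bar = y.mean()    # sample mean of the observed series
x = y - y_bar       # zero-mean series X_t; fit the zero-mean ARMA model to this
# ... fit a zero-mean ARMA(p, q) to x ...
y_back = x + y_bar  # recover the original scale: Y_t = X_t + Y-bar
```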
Estimation Techniques
Method of Moments (MoME)
- a.k.a. "Yule–Walker estimation"
- works well only for AR models, and requires large \(n\)
- not efficient: the estimates generally have larger sampling variance than MLE, especially when MA terms are present
Equate population moments to sample moments and solve for parameters.
- Yule–Walker estimation for AR(1) (see the NumPy sketch after this list):
    - \(\rho_{1} = \phi\), so set \(\hat{\phi} = r_{1}\), the lag-1 sample autocorrelation
    - the error variance \(\sigma_{e}^{2}\) follows from its relationship with \(\gamma_{0}\): since \(\gamma_{0} = \sigma_{e}^{2}/(1-\phi^{2})\), we get \(\hat{\sigma}_{e}^{2} = (1-\hat{\phi}^{2})\,\hat{\gamma}_{0} = (1 - r_{1}^{2})\,s^{2}\)
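A minimal NumPy sketch of these AR(1) moment estimates; the simulated series, seed, and true parameter values are illustrative assumptions:

```python
import numpy as np

# Simulate an AR(1) series for illustration (true phi = 0.6, unit error variance).
rng = np.random.default_rng(42)
n, phi_true = 500, 0.6
e = rng.normal(size=n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = phi_true * y[t - 1] + e[t]

# Sample moments (the mean divisors differ from the textbook n divisor only negligibly).
y_c = y - y.mean()
gamma0_hat = np.mean(y_c**2)              # sample variance, estimates gamma_0
gamma1_hat = np.mean(y_c[1:] * y_c[:-1])  # lag-1 sample autocovariance
r1 = gamma1_hat / gamma0_hat              # lag-1 sample autocorrelation

# Method-of-moments (Yule-Walker) estimates for AR(1):
phi_hat = r1                                # from rho_1 = phi
sigma2_hat = (1 - phi_hat**2) * gamma0_hat  # from gamma_0 = sigma_e^2 / (1 - phi^2)
print(phi_hat, sigma2_hat)
```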
Maximum Likelihood Estimation (MLE)
Assume the errors \(e_{t}\) are i.i.d. \(\mathcal{N}(0,\sigma_{e}^{2})\)
- We cannot use the joint density of the actual time series observations, \(f(y_{1}, y_{2},\dots,y_{n})\), since it cannot be factored into a product of marginals (the observations are not independent here). Thus we make use of the residuals,
- since all the errors are independent under the assumption above.
- Take the conditional[^1] log-likelihood function.
- The likelihood has no closed-form maximizer in general, so we have to use some numerical approximation techniques.
MLE has many advantages: it uses all of the distributional information in the data, and for large \(n\) the estimators are approximately unbiased, approximately normal, and have the smallest possible variance. A statsmodels sketch follows.
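As a rough sketch (not from the notes), the `ARIMA` class in statsmodels can carry out this numerical maximization of the Gaussian likelihood; the simulated series, seed, and order below are illustrative assumptions:

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

# Illustrative zero-mean AR(1) data; any zero-mean series would do here.
rng = np.random.default_rng(1)
n = 500
e = rng.normal(size=n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.6 * y[t - 1] + e[t]

# ARIMA(p, 0, q) with trend="n" fits a zero-mean ARMA(p, q); .fit() maximizes
# the Gaussian likelihood numerically, as described above.
result = ARIMA(y, order=(1, 0, 0), trend="n").fit()
print(result.params)  # estimated phi and error variance (sigma2)
print(result.llf)     # maximized log-likelihood
```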
Conditional Least Squares Technique
For an AR(1) model, \(Y_{t} = \phi Y_{t-1} + e_{t}\), so the residual is \(e_{t} = Y_{t} - \phi Y_{t-1}\).
Thus, conditioning on the first observation, \(SSE = \sum_{t=2}^{n} e_{t}^{2} = \sum_{t=2}^{n} \left(Y_{t} - \phi Y_{t-1}\right)^{2}\), and minimizing over \(\phi\) we finally get \(\hat{\phi} = \dfrac{\sum_{t=2}^{n} Y_{t} Y_{t-1}}{\sum_{t=2}^{n} Y_{t-1}^{2}}\).
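A short sketch of this closed-form conditional least squares estimate, again on illustrative simulated data:

```python
import numpy as np

# Illustrative AR(1) data, as in the earlier sketches.
rng = np.random.default_rng(7)
n = 500
e = rng.normal(size=n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.6 * y[t - 1] + e[t]

# Conditional least squares: minimize SSE(phi) = sum_{t>=2} (Y_t - phi * Y_{t-1})^2.
# Setting the derivative with respect to phi to zero gives the closed form below.
phi_hat = np.sum(y[1:] * y[:-1]) / np.sum(y[:-1] ** 2)

# Error-variance estimate from the conditional residuals
# (SSE / (n - 2) here, one common convention for AR(1)).
resid = y[1:] - phi_hat * y[:-1]
sigma2_hat = np.sum(resid**2) / (n - 2)
print(phi_hat, sigma2_hat)
```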
[^1]: Because the numerical approximation techniques require initial values, the likelihood function is conditional.