ARCH & GARCH Models¶
Mean Variance Framework¶
- Measurement of risk $$ \sigma_{t}^2 = Var(R_{e}) $$
- The higher the variance, the greater the volatility, i.e. the more the returns deviate from their mean, and hence the greater the risk. Unpredictable fluctuation = Volatility.
- In finance, volatility often plays a more important role than $E(R)$, since financial decisions depend on the amount of risk associated with an investment. We always ask the question "Is this a risky asset or not?", which is a question about its volatility.
Economic Time Series: The Stylized Facts¶
- Characteristics or salient features of financial time series variables (i.e. specialized properties) are known as Stylized facts.
- Most financial time series exhibit excess kurtosis.
Important characteristics of data¶
There are four important characteristics of data
- Mean: 1st moment = $\mu_{1}$
- Variance: 2nd moment = $\mu_{2}$
- Skewness: 3rd moment = $\beta_{1}= \dfrac{\mu_{3}^2}{\mu_{2}^3}$
- Kurtosis: 4th moment = $\beta_{2} = \dfrac{\mu_{4}}{\mu_{2}^{2}}$
where,
- $\mu_{1} = E(Y_{t}) =\dfrac{1}{T} \sum Y_{t}$
- $\mu_{2} = E(Y_{t}-E(Y_{t}))^2= \dfrac{1}{T}\sum(Y_{t}-\bar{Y})^{2}$
- $\mu_{3} = E(Y_{t}- E(Y_{t}))^{3} = \dfrac{1}{T}\sum(Y_{t}-\bar{Y})^3$
- $\mu_{4}=E(Y_{t}-E(Y_{t}))^{4} = \dfrac{1}{T}\sum(Y_{t}-\bar{Y})^{4}$
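A minimal sketch (numpy assumed, synthetic fat-tailed data in place of actual returns) of computing these sample moments:

```python
# Sketch: sample central moments of a return series (synthetic data for illustration)
import numpy as np

rng = np.random.default_rng(0)
y = rng.standard_t(df=5, size=1000)   # fat-tailed stand-in for a daily return series

T = y.size
mu1 = y.mean()                        # 1st moment (mean)
dev = y - mu1
mu2 = (dev**2).mean()                 # 2nd central moment (variance)
mu3 = (dev**3).mean()                 # 3rd central moment
mu4 = (dev**4).mean()                 # 4th central moment

beta1 = mu3**2 / mu2**3               # skewness measure
beta2 = mu4 / mu2**2                  # kurtosis measure
print(mu1, mu2, beta1, beta2)
```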
Skewness¶
- Skewness = "lack of symmetry"
- So, $\beta_{1} = 0 \implies$ Symmetric distribution
- Else, Skewed distribution (since $\beta_{1}$ itself is non-negative, the direction of skew is read off the sign of $\mu_{3}$)
- $\mu_{3} \gt 0 \implies$ Positively (right) Skewed
- $\mu_{3} \lt 0 \implies$ Negatively (left) Skewed
- Skewness = horizontal deviation
- Kurtosis = vertical deviation = "Peakedness"
- $\beta_2 = 3$ = Mesokurtic (normal curve)
- $\beta_2 \gt 3$ = Leptokurtic (Peaked, thick-tailed)
- $\beta_2 \lt 3$ = Platykurtic (Flat)
- Most financial time series are Leptokurtic (thick tailed)
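A quick check of these classifications (scipy assumed; the simulated series is leptokurtic by construction, as most financial returns are):

```python
# Sketch: classify a series using Pearson's kurtosis beta_2 = mu_4 / mu_2^2
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
returns = rng.standard_t(df=4, size=5000)        # fat-tailed stand-in for returns

g1 = stats.skew(returns)                         # signed skewness (sqrt of beta_1, sign from mu_3)
beta2 = stats.kurtosis(returns, fisher=False)    # Pearson kurtosis, normal curve = 3

if beta2 > 3:
    label = "leptokurtic (peaked, thick tails)"
elif beta2 < 3:
    label = "platykurtic (flat)"
else:
    label = "mesokurtic (normal curve)"
print(f"skewness = {g1:.3f}, kurtosis = {beta2:.3f} -> {label}")
```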
Stylized Facts¶
Leverage Effect¶
- Negative correlation between an asset's returns and its future volatility
- When $P_{A}$ falls, $Var(P_{A})$ increases, and vice versa
- Financial Leverage Hypothesis
- Total value of company: $\text{Assets}=\text{Equity}+\text{Debt}$
- When stock price falls, Equity falls, but value of Debt (e.g. outstanding bonds) remains relatively constant in the short term.
- Thus, Debt-to-Equity (financial leverage) increases (see the numeric sketch after this list)
- $\implies$ Company's stock is riskier from a shareholder's perspective $\implies$ Higher volatility for the stock
- Volatility Feedback Hypothesis
- An anticipated increase in volatility (risk) $\implies$ Demand higher future expected return
- For a higher expected future return, stock's current price must drop.
- Increase in volatility $\implies$ immediate negative return.
- Asymmetric nature: The magnitude of volatility increase for a price drop $\gt$ the volatility decrease for a price rise.
- Models like EGARCH (Exponential GARCH) are used to capture this effect
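A quick numeric illustration of the financial leverage mechanism above (the figures are hypothetical):
$$ \begin{align*} \text{Before the shock: } & \text{Assets}=100,\ \text{Equity}=60,\ \text{Debt}=40 \implies \frac{D}{E}=\frac{40}{60}\approx 0.67 \\ \text{After a price fall: } & \text{Assets}=80,\ \text{Equity}=40,\ \text{Debt}=40 \implies \frac{D}{E}=\frac{40}{40}=1.0 \end{align*} $$
The same debt is now backed by less equity, so shareholders bear proportionally more of the firm's risk, which is consistent with higher volatility after the fall.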
ARCH processes¶
Till now, we have assumed homoskedasticity for all models,
$$ \begin{align*} V(\epsilon_{t}) & = E(\epsilon_{t}^2) - E(\epsilon_{t})^2 \\ & = E(\epsilon_{t}^2) = \sigma^{2} \end{align*} $$
But as the stylized facts suggest, financial time series data is conditionally heteroskedastic: its variance, conditional on past information, is time dependent ($V(R_{t}|F_{t-1})$ varies with $t$).
Thus, we use an ARCH process to model this heteroskedastic variance.
- AR $\to$ Autoregressive: today's volatility depends on the (squared) shocks of the previous lags
- C $\to$ Conditional: the variance is conditional on past information, i.e. it varies with time
- Heteroskedastic $\to$ $V(\epsilon_{t})\neq \sigma^2$ for all $t$
- Say, we found the best possible model for our application
- Then, as a diagnostic test, we plot the residuals of our regression and find out that the variance of the residuals varies with data, i.e. there are patterns in the residual variance that can still be accounted for in the model to make our predictions better.
- Thus, we use an ARCH(p) model
The formulation of the variance for an ARCH(1) model would be,
$$ Var(\epsilon_{t}) = \sigma_{t}^2 = \alpha_{0}+\alpha_{1}\epsilon_{t-1}^2 $$
Thus, we can model our residuals like so
$$ \epsilon_{t} = \omega_{t}\sqrt{ \alpha_{0} +\alpha_{1} \epsilon_{t-1}^2 } $$
where $\omega_{t}$ is a white noise process with unit variance. (This works since $\epsilon_{t}|F_{t-1} \sim N(0,\sigma_{t}^2)$, so we are essentially just saying that $\epsilon_{t}=\omega_{t}\sigma_{t}$.)
An ARCH(2) model would be
$$ \epsilon_{t} = \omega_{t} \sqrt{ \alpha_{0}+\alpha_{1}\epsilon_{t-1}^2 + \alpha_{2} \epsilon_{t-2}^{2} } $$
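A minimal simulation sketch of an ARCH(1) process (illustrative parameter values, numpy assumed), showing how the conditional variance reacts to the previous period's shock:

```python
# Sketch: simulate eps_t = w_t * sqrt(alpha0 + alpha1 * eps_{t-1}^2), w_t ~ N(0, 1)
import numpy as np

rng = np.random.default_rng(42)
T, alpha0, alpha1 = 1000, 0.2, 0.7            # illustrative parameters

eps = np.zeros(T)
sigma2 = np.zeros(T)
sigma2[0] = alpha0 / (1 - alpha1)             # start at the unconditional variance
eps[0] = rng.normal() * np.sqrt(sigma2[0])

for t in range(1, T):
    sigma2[t] = alpha0 + alpha1 * eps[t - 1] ** 2   # conditional variance
    eps[t] = rng.normal() * np.sqrt(sigma2[t])      # innovation with time-varying variance

# Volatility clustering shows up as short bursts of large |eps| values
print(eps[:5], sigma2[:5])
```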
Testing for ARCH(p) using Correlograms¶
We can take the square of the residual model (for an ARCH(2) process)
$$ \epsilon_{t}^2 = \omega_{t}^{2} (\alpha_{0} +\alpha_{1}\epsilon^2_{t-1} + \alpha_{2}\epsilon_{t-2}^2) $$ to show that it behaves like an autoregressive equation in the squared residuals (since $E(\omega_{t}^{2})=1$).
Thus, when we look at the PACF of the squared residuals
we will find that the first two lags are significant. (Higher lags are not considered because of the sharp drop after lag 2.)
Thus, we can try fitting an ARCH(2) model first.
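A sketch of this check (statsmodels and matplotlib assumed; `eps` here is whatever residual series you are diagnosing, e.g. the simulated series above):

```python
# Sketch: PACF of the squared residuals as a guide to the ARCH order
import matplotlib.pyplot as plt
from statsmodels.graphics.tsaplots import plot_pacf

plot_pacf(eps**2, lags=20)        # significant early lags suggest ARCH effects of that order
plt.title("PACF of squared residuals")
plt.show()
```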
Lagrange Multiplier (LM) test for ARCH (Engle's Test)¶
A statistical procedure used to determine if the residuals of a time series model exhibit heteroscedasticity that can be modeled by an ARCH process.
Process¶
- Estimate the Conditional Mean equation: Fit a suitable model for the conditional mean $\mu_{t}$ of $y_{t}$ (e.g. an ARMA model)
- Obtain the residuals from this model $\to$ $\hat{\epsilon}_{t} = y_{t}- \hat{\mu}_{t}$
- Run the Auxiliary regression on the squared residuals
- Regressed on a constant and their own lagged values up to order $m$.
- $\hat{\epsilon}_{t}^2 =\alpha_{0}+\alpha_{1}\hat{\epsilon}_{t-1}^{2} +\dots+ \alpha_{m}\hat{\epsilon}_{t-m}^2 + u_{t}$
- Formulate Hypothesis and Compute the Test Statistic
$$ \begin{align*} H_{0} &: \alpha_{1} = \alpha_{2} = \dots = \alpha_{m} = 0\text{ (No ARCH effects)} \\ H_{1} &: \text{At least one } \alpha_{i} \neq 0 \end{align*} $$ Test statistic is calculated using the sample size $T$ and the $R^2$ from the auxiliary regression
$$ LM = T \times R^2 \sim \chi^2(m) $$
Rejection of $H_{0}$ indicates that ARCH effects are present in the residuals, suggesting that an ARCH or GARCH model would be appropriate to model time-varying volatility.
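A sketch of the test computed by hand, following the auxiliary-regression recipe above (`resid` is a hypothetical residual series from the mean equation; numpy and scipy assumed):

```python
# Sketch: Engle's ARCH-LM test, LM = T * R^2 from the auxiliary regression
import numpy as np
from scipy import stats

def arch_lm_test(resid, m=4, alpha=0.05):
    e2 = resid**2
    Y = e2[m:]                                                        # epsilon_t^2, t = m..T-1
    X = np.column_stack([e2[m - j : -j] for j in range(1, m + 1)])    # m lags of epsilon^2
    X = np.column_stack([np.ones(len(Y)), X])                         # add the constant
    beta, *_ = np.linalg.lstsq(X, Y, rcond=None)                      # OLS on the auxiliary regression
    u = Y - X @ beta
    r2 = 1 - u.var() / Y.var()                                        # R^2 of the auxiliary regression
    lm = len(Y) * r2                                                  # T here = number of usable observations
    crit = stats.chi2.ppf(1 - alpha, df=m)
    return lm, crit, lm > crit                                        # True -> reject H0: ARCH effects present
```

statsmodels also ships a packaged version of this test (`statsmodels.stats.diagnostic.het_arch`), which can be used as a cross-check.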
GARCH processes¶
Bollerslev (1986) extended the ARCH model to GARCH (Generalized ARCH)
- The issue with an ARCH model is that it is "bursty", i.e. its volatility is not persistent. (It stays near a constant level, spikes suddenly, and quickly returns to normal levels.)
- This is problematic if we want to model a process which has persistent volatility.
Here, we use the GARCH model
$$ \epsilon_{t} = \omega_{t} \sqrt{ \alpha_{0} + \alpha_{1} \epsilon_{t-1}^{2} + \beta_{1} \sigma_{t-1}^{2} } = \omega_{t}\sigma_{t} $$
This helps us obtain that persistent volatility, because we are not just looking at the value of the residual/innovation ($\epsilon_{t-1}$) from the previous period, but also at the volatility of the previous period ($\sigma_{t-1}^{2}$).
- If the series moved far from its average yesterday, that deviation has an effect on today's volatility.
- The volatility propagates over a period of time.
- A GARCH(p,q) model
- $p$ lags of the residuals
- $q$ lags of the volatility
Most financial time series are observed to be described well by a GARCH(1,1) model.
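A minimal GARCH(1,1) simulation sketch (illustrative parameters, numpy assumed); compare the slowly decaying volatility here with the bursty ARCH(1) simulation earlier:

```python
# Sketch: simulate sigma_t^2 = a0 + a1*eps_{t-1}^2 + b1*sigma_{t-1}^2
import numpy as np

rng = np.random.default_rng(7)
T, a0, a1, b1 = 2000, 0.05, 0.10, 0.85      # a1 + b1 close to 1 -> persistent volatility

eps = np.zeros(T)
sigma2 = np.zeros(T)
sigma2[0] = a0 / (1 - a1 - b1)              # unconditional variance as the starting value
eps[0] = rng.normal() * np.sqrt(sigma2[0])

for t in range(1, T):
    sigma2[t] = a0 + a1 * eps[t - 1] ** 2 + b1 * sigma2[t - 1]
    eps[t] = rng.normal() * np.sqrt(sigma2[t])

# Shocks to sigma2 now decay slowly instead of vanishing after one period
```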
Testing for significance
- OLS estimation won't be BLUE for ARCH or GARCH models (they are non-linear in the parameters), thus we estimate by MLE, which requires assuming a distribution for the innovations (i.e. making a new assumption).
- Also, the time series is leptokurtic, so the usual alternatives to the normal are to
- use a $t$-distribution, which is thick-tailed
- use a GED (Generalized Error Distribution) with shape parameter $\nu \lt 2$ (which gives a leptokurtic distribution)
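A sketch of such an MLE fit using the `arch` package (assumed installed), with Student-$t$ innovations; `returns` is a hypothetical series of (percentage) returns:

```python
# Sketch: GARCH(1,1) estimated by MLE with a fat-tailed t innovation distribution
from arch import arch_model

am = arch_model(returns, mean="Constant", vol="GARCH", p=1, q=1, dist="t")
res = am.fit(disp="off")
print(res.summary())        # reports alpha, beta and the estimated t degrees of freedom
```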
Asymmetric GARCH¶
TARCH & EGARCH¶
"Bad" news has a more pronounced effect on the volatility of asset prices than "good " news.
Reason: The debt-to-equity value rises as there is a negative stock price shock. And D2E ratio is a measure of riskiness of holding the stock. Thus, the volatility increase. On the other hand, when the returns increase, the volatility falls. This is known as the leverage effect.
"New information" is measured by the size of $\epsilon_t$.
To allow good and bad news to affect volatility differently, we have the threshold GARCH (TARCH) process
$$ \sigma^2_{t} = a_{0}+a_{1}\epsilon_{t-1}^2 + \lambda_{1}d_{t-1}\epsilon_{t-1}^2 + \beta_{1}\sigma^2_{t-1} $$
where $d_{t-1}$ is a dummy variable
$$ d_{t-1} = \begin{cases} 1 & \epsilon_{t-1} \lt 0 \\ 0 & \epsilon_{t-1} \geq 0 \end{cases} $$
Another model that allows for this is the Exponential GARCH (EGARCH) model
- Has the ability to account for the "leverage effect" baked into its specification
- Prices become more volatile if they decrease
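A sketch of fitting asymmetric specifications with the `arch` package (assumed installed); `returns` is again a hypothetical return series, and `o=1` adds one asymmetry/leverage term (the GJR form fitted here is a close relative of the TARCH equation above):

```python
# Sketch: asymmetric volatility models; a significant asymmetry term indicates the leverage effect
from arch import arch_model

# GJR/threshold-style GARCH(1,1) with one asymmetry term
gjr = arch_model(returns, vol="GARCH", p=1, o=1, q=1, dist="t").fit(disp="off")

# EGARCH(1,1) with an asymmetry term: models log sigma_t^2, so leverage is built in
egarch = arch_model(returns, vol="EGARCH", p=1, o=1, q=1, dist="t").fit(disp="off")

print(gjr.params)
print(egarch.params)
```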