Unstable GARCH model with unexpected coefficients

Hi! I am using a dataset (attached) of daily observations on S&P 500 returns, the OVX, and the VIX covering approximately 10 years, to run a GARCH model testing the impact of oil-price uncertainty on US stock-market volatility. I take the following steps:
1. After declaring the data as a daily time series and generating the required variables, I define the parameter constraint:

   constraint define 1 [ARCH]_b[L1.arch] + [ARCH]_b[L1.garch] + [HET]_b[_cons] = 1
2. I run the GARCH model (the full command sequence is sketched after the output below):

   arch retsp lagreturn, arch(1/1) garch(1/1) constraints(1) het(ovxsqrlag vixsqrlag)

   where retsp is the log return on the S&P 500, lagreturn is the one-period lag of retsp, ovxsqrlag is the one-period-lagged squared OVX (the oil-price implied volatility index), and vixsqrlag is the one-period-lagged squared VIX (the S&P 500 implied volatility index).
3. The output I get is as follows:

   [arch estimation output attached as a screenshot]
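For completeness, the full sequence of commands looks roughly like this (a schematic sketch: the variable-construction lines and the names date, sp500, ovx, and vix are illustrative placeholders, not my exact do-file):

   * declare the data as a daily time series (date is the daily date variable)
   tsset date

   * construct the variables described above (illustrative)
   generate retsp     = ln(sp500 / L.sp500)    // log return on the S&P 500
   generate lagreturn = L.retsp                // one-period lag of the return
   generate ovxsqrlag = (L.ovx)^2              // one-period-lagged squared OVX
   generate vixsqrlag = (L.vix)^2              // one-period-lagged squared VIX

   * impose the persistence constraint on the variance equation
   constraint define 1 [ARCH]_b[L1.arch] + [ARCH]_b[L1.garch] + [HET]_b[_cons] = 1

   * GARCH(1,1) with exogenous regressors in the variance equation
   arch retsp lagreturn, arch(1/1) garch(1/1) constraints(1) het(ovxsqrlag vixsqrlag)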
As can be seen in the attached output, the constant term of the variance equation is negative, and so is the coefficient on the lagged squared OVX, both of which run counter to the theoretical prediction. Could you please help me figure out what may be causing these results?
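For reference, the variance equation I have in mind is the standard GARCH(1,1)-X form (my reading of the specification; Stata's het() option may parameterize the exogenous terms differently):

   h_t = ω + α·ε²_{t-1} + β·h_{t-1} + γ1·OVX²_{t-1} + γ2·VIX²_{t-1}

where the constraint in step 1 imposes α + β + ω = 1. In this linear form, negative estimates of ω and γ1 are hard to interpret, since they can push the fitted conditional variance negative.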

    Thank you in advance!