Multivariate volatility forecasting, part 2 – equicorrelation

Last time we showed how to estimate CCC and DCC volatility models. Here I describe an extension developed by Engle and Kelly (2012) under the name Dynamic Equicorrelation (DECO). The idea is nice and the paper is well written.

Picking up where the previous post ended, once we have (say) the DCC estimates, instead of leaving the variance-covariance matrix as is, we impose some structure by averaging the correlations across assets. Generally speaking, correlation estimates are noisy even without any breaks in the dynamics, so I think imposing some structure is for the better.

In essence, the authors propose to let:

(1)   R_t = (1-\rho_t) I_n + \rho_t J_n,

where I_n denotes the n-dimensional identity matrix (in our case, with 3 ETFs, n = 3), and J_n is the n \times n matrix of ones. Convince yourself that this is nothing more than fixing the free off-diagonal elements of R_t at some restricted value \rho_t. It is of course natural to choose this value to be the average cross correlation at each point in time.
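To fix ideas, here is a minimal R sketch building R_t from equation (1) for a given \rho_t; the names n, rho and R_t are mine, and the value of rho is purely illustrative:

# Equicorrelation matrix R_t = (1 - rho) * I_n + rho * J_n
n   <- 3
rho <- 0.5                                         # illustrative value only
R_t <- (1 - rho) * diag(n) + rho * matrix(1, n, n)
R_t                                                # unit diagonal, rho elsewhere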

Demonstration

We use the same assets from last time. We fit a DCC model and then fix the off-diagonal entries of the correlation matrix at their cross-sectional average. Last time we used the univariate GARCH package rugarch for illustration purposes, but since the focus has now moved to the multivariate setting, we will use the quicker multivariate rmgarch package.
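A minimal sketch of such a fit, assuming dat is a T \times 3 matrix of daily returns. The GJR-GARCH(1,1) spec with a Student-t distribution mirrors the one quoted in the comments below; the object names are mine, apart from garchdccfit, which is referenced later:

library(rugarch)
library(rmgarch)

# Univariate GJR-GARCH(1,1) spec, identical for all three series
uspec <- ugarchspec(mean.model = list(armaOrder = c(0, 0)),
                    variance.model = list(model = "gjrGARCH"),
                    distribution.model = "std")
mspec <- multispec(replicate(3, uspec))
# DCC(1,1) on top of the univariate fits
dspec <- dccspec(uspec = mspec, dccOrder = c(1, 1), distribution = "mvt")
garchdccfit <- dccfit(dspec, data = dat)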

What is important now is to get the covariance matrix over time, so as to build a portfolio. We can extract an array of size 3 \times 3 \times T from the fitted object garchdccfit: the covariance matrix sits in the first two dimensions, with time running along the third:
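For example, using the rcov and rcor extractors from rmgarch (dcccov is the name used in the code referenced in the comments below; dcccor is mine):

dcccov <- rcov(garchdccfit)   # 3 x 3 x T conditional covariances
dcccor <- rcor(garchdccfit)   # 3 x 3 x T conditional correlations
dim(dcccov)                   # 3 3 T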

We now have a look at how the correlations evolve over time, and at how the average of those correlations looks over time:
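Something along these lines, assuming dcccor from above and the column order SPY, TLT, IEF (the variable names are mine):

TT          <- dim(dcccor)[3]
cor_spy_tlt <- dcccor[1, 2, ]
cor_spy_ief <- dcccor[1, 3, ]
cor_tlt_ief <- dcccor[2, 3, ]
avg_cor     <- (cor_spy_tlt + cor_spy_ief + cor_tlt_ief) / 3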
[Figure: pairwise correlations over time, with their cross-sectional average in blue]

We see the correlation between the two fixed-income ETFs is extremely high and quite stable. The correlation with SPY is less so, shifting between zero and quite negative values. The picture is very much related to the correlation-structure post. We see how recently, with the increased volatility in the market, we have moved quickly into negative-correlation territory. What is nice about looking at correlations is that from looking at the series themselves alone you may not realize this is happening; it is not as if SPY is tanking while TLT is booming. We can see it more clearly here because the correlation computation accounts for the volatility of TLT. That is to say, fixed income does not need to boom in order to provide the diversification benefits it is supposed to provide.

The blue line is simply the average of the other three lines. At each time point, instead of letting each off-diagonal entry of the correlation matrix be free, we can set it to the value of the blue line. The next few lines construct a portfolio from two different estimates of the covariance matrix: one from the DCC, and the other using the idea promoted by Engle and Kelly. After that we plot the weights of the two different portfolios, each aimed at minimizing the volatility of the portfolio.
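A sketch of the construction, assuming dcccov and dcccor from above. The line dec_cov[,,i] <- dt %*% eq_cor %*% dt is the one referenced in the comments below; the remaining names are mine. The minimum-variance weights come from the usual closed form w = \Sigma^{-1}\iota / (\iota' \Sigma^{-1} \iota):

n    <- 3
TT   <- dim(dcccov)[3]
ones <- rep(1, n)
dec_cov <- array(NA, dim = c(n, n, TT))
w_dcc   <- w_deco <- matrix(NA, TT, n)
for (i in 1:TT) {
  # average off-diagonal correlation at time i
  rho_i  <- mean(dcccor[, , i][lower.tri(dcccor[, , i])])
  eq_cor <- (1 - rho_i) * diag(n) + rho_i * matrix(1, n, n)
  dt <- diag(sqrt(diag(dcccov[, , i])))     # conditional standard deviations
  dec_cov[, , i] <- dt %*% eq_cor %*% dt    # force the DECO structure
  # global minimum-variance weights under each covariance estimate
  tmp1 <- solve(dcccov[, , i], ones);  w_dcc[i, ]  <- tmp1 / sum(tmp1)
  tmp2 <- solve(dec_cov[, , i], ones); w_deco[i, ] <- tmp2 / sum(tmp2)
}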

[Figure: minimum-variance portfolio weights based on the DCC and the DECO covariance estimates]
– At first glance the weights based on the DECO covariance look more volatile, but they are in fact less so (mind the Y-scale).
– The weights have become more balanced, with less extreme shorts. This is an important insight. We can apply this shrinkage idea to achieve a long-only portfolio in a smart way, rather than optimizing under explicit constraints.
– Note how the weight of SPY has decreased and the weight of IEF has increased, as expected given recent events.
——————————————————-
This is the most vanilla implementation possible, included only to fix ideas. The first extension you might think of is to restrict only part of the entries of the covariance matrix, not all of them as we have done here. I can imagine that fixing all entries when you construct a portfolio of 1K assets is idiotic, but restricting entries according to clusters (industries?) should be useful. As a final note, why the average \rho? Other correlation measures, perhaps based on some side analysis, may provide further refinement.

Code for plots
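A minimal sketch for the "correlation over time" plot, assuming the objects from the snippets above; the colors and labels are my choices:

matplot(cbind(cor_spy_tlt, cor_spy_ief, cor_tlt_ief, avg_cor),
        type = "l", lty = 1,
        col = c("black", "red", "darkgreen", "blue"),
        xlab = "Time", ylab = "Correlation")
legend("bottomleft", bty = "n", lty = 1,
       col = c("black", "red", "darkgreen", "blue"),
       legend = c("SPY-TLT", "SPY-IEF", "TLT-IEF", "Average"))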

References and further readings

  • Robert Engle & Bryan Kelly, 2012. Dynamic Equicorrelation, Journal of Business & Economic Statistics, vol. 30(2), pages 212-228.
  • Alexios Ghalanos (2015). rmgarch: Multivariate GARCH models. R package version 1.2-9.
  • Jeffrey A. Ryan (2015). quantmod: Quantitative Financial Modelling Framework. R package version 0.4-4.

Comments

    1. Hi Eran,
      Did you assume the equicorrelation structure in the example R codes? Thanks.
      Yongli

      1. Consider equicorrelation as an add-on structure over your covariance/correlation matrix estimate. dcccov is without the additional equicorrelation structure, while in this line of code: dec_cov[,,i] <- dt %*% eq_cor %*% dt we force this structure.

    2. Hi Eran,

      How do you deal with missing values? If I have the data already, is it necessary to convert it to the same class (“xts”/“zoo”)?

      Thank you.

      1. Personally speaking, I don’t like converting objects to classes outside the base package (so I try to stick with data.frame and matrix). It is not necessary to convert to xts in order to run the code.

        Regarding missing values: you can interpolate them yourself, using the built-in approx function, or use one of the excellent imputation packages available (Amelia comes to mind) for good NA interpolation.
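        A small sketch of the approx route, with made-up data (x and x_filled are hypothetical names):

        x <- c(1, NA, 3, NA, NA, 6)
        x_filled <- approx(x = seq_along(x), y = x, xout = seq_along(x))$y
        x_filled   # 1 2 3 4 5 6 (NAs linearly interpolated)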

    3. Hello Eran,
      Would you mind sharing the code for plotting the “correlation over time” graph?

      Thanks.

    4. Hi Eran,
      Great post. I’ve tried to run your code and it keeps returning:
      Error in dat[1:n, , i] <- dat0 :
      number of items to replace is not a multiple of replacement length
      Do you know why this might be? Sorry, I'm very new to R.
      Many thanks

      1. Make sure n equals NROW(dat0). dat[1:n, , i] is a matrix with n rows and exactly as many columns as dat0.

    5. Dear Professor,

      Why do you choose armaOrder=c(0,0)?
      According to your comment, your intention is GJR GARCH(1,1).
      Anyway, what is a typical armaOrder for fitting stock-market data, say 212 S&P 500 stocks (1993-01-01 to 1996-01-01)?

      gjrtspec <- ugarchspec(mean.model=list(armaOrder=c(0,0)), variance.model =list(model = "gjrGARCH"),distribution="std")
      # dcc specification – GARCH(1,1) for conditional correlations
