A vector autoregression (VAR) process can be represented in a couple of ways. The usual form is as follows:

y_t = c + A y_{t-1} + ε_t,

where y_t is the vector of variables at time t, c is a vector of intercepts, A is the coefficient matrix and ε_t is the error term.
The above (the AR form) is what we most often see and use in practice. However, lately I see the moving average (MA) representation of the process more and more. I wish it were not called moving average, since it is easily confused with the moving average from technical analysis, which is unrelated. Anyway, it is written as:

y_t = μ + Σ_{j=0}^{∞} A^j ε_{t-j},  with μ = (I − A)^{−1} c.
Apparently, the MA form is sometimes more convenient to work with, for proving results or as a variance decomposition tool, since you can work directly with the errors and see what a shock in one variable does to another. The unintuitive look of the infinite sum made me code this simple example:

options(digits = 3)
tt <- 100
arcoef <- .6   # AR coefficient
arcept <- 2    # intercept
y <- 5         # start at the unconditional mean: arcept/(1 - arcoef) = 5
r <- 0         # no shock in the first period
for (i in 2:tt){
  r[i] <- rnorm(1, 0, 5)   # random N(0, sd = 5) innovation
  y[i] <- arcept + arcoef*y[i-1] + r[i]
}
plot(y, type = "b")

# Now create the MA representation:
movrep <- function(J){
  aj <- NULL
  for (k in 1:J){
    aj[k] <- arcoef^(k-1)   # MA coefficients: 1, a, a^2, ...
  }
  mu <- sum(aj[1:J])*arcept      # accumulated intercept
  au <- (arcoef^J)*y[1]          # contribution of the initial value
  sr <- aj[1:J] %*% rev(r[1:J])  # weighted shocks; note the reverse...
  return(mu + au + sr)
}

ymov <- NULL
for (i in 1:tt){
  ymov[i] <- movrep(i)
}

ht <- function(x, h = 6){
  cat("Head: ", "\n"); print(head(x, h))
  cat("\n", "Tail: ", "\n"); print(tail(x, h))
}
ht(cbind(y, ymov), h = 4)
# See that the two forms give the same results:

Head:
        y  ymov
[1,]  5.00  5.00
[2,] -2.41 -2.41
[3,]  3.42  3.42
[4,] -1.97 -1.97

Tail:
          y  ymov
[97,]  15.76 15.76
[98,]  13.23 13.23
[99,]   9.32  9.32
[100,] -2.80 -2.80
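The same MA weights double as an impulse-response function: a unit shock to the error moves y by a^h after h periods, which is exactly the "see what a shock does" point above. A minimal sketch with the same coefficient as in the code (the names horizon and irf are mine, for illustration):

```r
arcoef <- .6
horizon <- 0:10
irf <- arcoef^horizon  # effect on y of a unit shock, h periods later
round(irf, 3)          # geometric decay toward zero
```

For an actual VAR the scalar power becomes a matrix power, and entry (i, j) of A^h reads off what a shock to variable j does to variable i after h periods.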

This simple code can be extended, if needed, to an actual VAR instead of an AR, and to a higher lag order, since a VAR(p) can be written as a VAR(1) with some manipulation (see references).
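As a small sketch of that manipulation (my own illustration, with made-up coefficients phi1 and phi2): an AR(2) can be stacked into a VAR(1) by defining z_t = (y_t, y_{t-1})' with a companion matrix A, and the two recursions trace out the same series:

```r
set.seed(1)
phi1 <- 0.5; phi2 <- 0.3; n <- 50
e <- rnorm(n)

# Direct AR(2) simulation: y_t = phi1*y_{t-1} + phi2*y_{t-2} + e_t
y <- numeric(n)
for (t in 3:n) y[t] <- phi1*y[t-1] + phi2*y[t-2] + e[t]

# Same process in VAR(1) companion form: z_t = A z_{t-1} + u_t
A <- rbind(c(phi1, phi2),
           c(1,    0))
z <- matrix(0, 2, n)
for (t in 3:n) z[, t] <- A %*% z[, t-1] + c(e[t], 0)

all.equal(y, z[1, ])    # the two representations coincide
Mod(eigen(A)$values)    # stationary iff all eigenvalues lie inside the unit circle
```

Once in VAR(1) form, the MA representation above applies directly with powers of the companion matrix.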

Some references:

Little Book of R for Time Series