Lesson 6.6: From One Series to Many: Vector Autoregression (VAR) Models
We now make a monumental leap from modeling single time series to modeling entire systems of interconnected variables. This lesson introduces the Vector Autoregression (VAR) model, the multivariate generalization of the AR model. VAR allows us to capture the rich, dynamic interplay between multiple time series, moving beyond simple correlation to analyze concepts like causality and shock propagation.
Part 1: The Limits of Univariate Models
The ARIMA and GARCH models we mastered in Module 5 are powerful tools for understanding a single time series in isolation. However, economic and financial variables do not exist in a vacuum. The price of oil is not independent of the value of the US dollar. A central bank's interest rate decisions affect stock market returns, which in turn affect consumer spending.
Treating these as separate univariate problems ignores the fundamental feedback loops that govern the system. A simple regression might tell us that $X$ influences $Y$, but what if $Y$ also influences $X$? This bidirectional relationship is beyond the scope of the models we have learned so far.
The Core Problem: An Interconnected Financial System
Consider a simple system of two variables: the daily returns of the S&P 500 ($r_t$) and the daily change in the VIX volatility index ($\Delta v_t$).
- A univariate AR model for S&P 500 returns would only use past S&P 500 returns to forecast the future: $r_t = c + \phi_1 r_{t-1} + \dots + \phi_p r_{t-p} + \varepsilon_t$.
- A univariate model for VIX changes would only use past VIX changes.
- This approach completely misses the well-known dynamic: a large negative stock return today often leads to a spike in the VIX tomorrow, and a high VIX level today often signals nervous markets and can influence future stock returns.
To capture these cross-variable dynamics and feedback loops, we need to model the variables together as a single, unified system. This is the purpose of the Vector Autoregression model.
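Before formalizing the model, it is worth eyeballing this lead-lag pattern in real data. The short sketch below does so under the assumption that the yfinance package is installed and that the Yahoo Finance tickers ^GSPC and ^VIX are available; any daily data source for the two series would work just as well.
import yfinance as yf  # assumed available; any daily data source works
# Download daily closes for the S&P 500 and the VIX
px = yf.download(['^GSPC', '^VIX'], start='2015-01-01')['Close'].dropna()
sp_ret = px['^GSPC'].pct_change()   # daily S&P 500 returns (r_t)
vix_chg = px['^VIX'].diff()         # daily changes in the VIX level (delta v_t)
# Compare the contemporaneous correlation with the one-day-ahead correlation
print("corr(r_t, dVIX_t):    ", round(sp_ret.corr(vix_chg), 3))
print("corr(r_t, dVIX_{t+1}):", round(sp_ret.corr(vix_chg.shift(-1)), 3))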
Part 2: The VAR(p) Model Specification
A VAR model is a system of equations where each variable is regressed on its own lagged values and the lagged values of all other variables in the system. The "Vector" in its name comes from the use of vector and matrix algebra to represent this system compactly.
Let's consider a system with two variables, $y_{1,t}$ and $y_{2,t}$. A VAR(1) model consists of two distinct regression equations:
Equation 1: Predicts $y_{1,t}$ using past values of both $y_1$ and $y_2$:
$$y_{1,t} = c_1 + \phi_{11} y_{1,t-1} + \phi_{12} y_{2,t-1} + \varepsilon_{1,t}$$
Equation 2: Predicts $y_{2,t}$ using past values of both $y_1$ and $y_2$:
$$y_{2,t} = c_2 + \phi_{21} y_{1,t-1} + \phi_{22} y_{2,t-1} + \varepsilon_{2,t}$$
- The coefficients $\phi_{11}$ and $\phi_{22}$ capture each variable's own-lag dynamics (the AR part).
- The cross coefficients $\phi_{ij}$ (where $i \neq j$) are the crucial new terms. They capture the influence of the lagged value of variable $j$ on the current value of variable $i$.
- The error terms $\varepsilon_{1,t}$ and $\varepsilon_{2,t}$ are assumed to be white noise, but they can be contemporaneously correlated, $\operatorname{Cov}(\varepsilon_{1,t}, \varepsilon_{2,t}) \neq 0$.
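To make the notation concrete, here is a minimal sketch that simulates a bivariate VAR(1) with hypothetical parameter values (the intercepts, the matrix Phi, and the error covariance below are chosen purely for illustration) and then fits statsmodels' VAR to show that the estimates recover them.
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR
# Hypothetical parameters, chosen only for illustration
c   = np.array([0.1, -0.2])            # intercepts c_1, c_2
Phi = np.array([[0.5,  0.2],           # phi_11, phi_12
                [-0.3, 0.4]])          # phi_21, phi_22
rng = np.random.default_rng(42)
T = 1000
eps = rng.multivariate_normal([0, 0], [[1.0, 0.3], [0.3, 1.0]], size=T)  # correlated errors
y = np.zeros((T, 2))
for t in range(1, T):
    y[t] = c + Phi @ y[t - 1] + eps[t]
sim = pd.DataFrame(y, columns=['y1', 'y2'])
res = VAR(sim).fit(1)
# Each column of res.params holds one equation: the 'y1' column should be close
# to (c_1, phi_11, phi_12), the 'y2' column to (c_2, phi_21, phi_22).
print(res.params.round(2))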
The VAR(p) Model in Matrix Form
We can write this system much more elegantly using matrices. Let $\mathbf{y}_t = (y_{1,t}, \dots, y_{k,t})'$ be the vector of our $k$ variables at time $t$. The VAR(p) model is then
$$\mathbf{y}_t = \mathbf{c} + \Phi_1 \mathbf{y}_{t-1} + \Phi_2 \mathbf{y}_{t-2} + \dots + \Phi_p \mathbf{y}_{t-p} + \boldsymbol{\varepsilon}_t$$
where $\mathbf{c}$ is a vector of intercepts, each $\Phi_i$ is a $k \times k$ coefficient matrix, and $\boldsymbol{\varepsilon}_t$ is a vector of white-noise errors with covariance matrix $\Sigma$.
For our bivariate VAR(1) example:
$$\begin{pmatrix} y_{1,t} \\ y_{2,t} \end{pmatrix} = \begin{pmatrix} c_1 \\ c_2 \end{pmatrix} + \begin{pmatrix} \phi_{11} & \phi_{12} \\ \phi_{21} & \phi_{22} \end{pmatrix} \begin{pmatrix} y_{1,t-1} \\ y_{2,t-1} \end{pmatrix} + \begin{pmatrix} \varepsilon_{1,t} \\ \varepsilon_{2,t} \end{pmatrix}$$
Key Assumptions: For a VAR model to be useful, all variables in the system must be stationary. If they are non-stationary (I(1)), we must either difference them or use the more advanced VECM model we will learn about later.
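A quick way to check this assumption in practice is an Augmented Dickey-Fuller test on each series, in levels and in first differences. The sketch below uses the same statsmodels macro dataset as the implementation in Part 4; it is a minimal illustration, not a full unit-root analysis.
import statsmodels.api as sm
from statsmodels.tsa.stattools import adfuller
macro = sm.datasets.macrodata.load_pandas().data[['infl', 'unemp', 'tbilrate']]
for col in macro.columns:
    p_level = adfuller(macro[col].dropna())[1]          # p-value in levels
    p_diff  = adfuller(macro[col].diff().dropna())[1]   # p-value after differencing
    print(f"{col}: ADF p-value level={p_level:.3f}, first difference={p_diff:.3f}")
# A large p-value in levels combined with a small p-value after differencing
# suggests the series is I(1) and should be differenced before entering the VAR.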
Part 3: The New Toolkit - Granger Causality and IRFs
A fitted VAR model is a dense system of coefficients, which can be difficult to interpret directly. The real power of VAR analysis comes from two specialized tools that allow us to understand the model's dynamics.
Tool 1: Granger Causality
Granger causality is a statistical, not a philosophical, definition of causality. It asks a simple question: "Does knowing the past of variable $X$ help me make a better forecast of variable $Y$, even after I already know the entire past of $Y$?"
In our bivariate VAR(1), we say that $y_2$ **Granger-causes** $y_1$ if the coefficient $\phi_{12}$ is statistically significantly different from zero. We test this with a simple F-test on the relevant coefficients.
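Part 4 below runs this test through the fitted VAR object, but statsmodels also provides a standalone helper, grangercausalitytests, that works directly on a two-column array. The short sketch below applies it to the same macro data used later, purely as a preview.
import statsmodels.api as sm
from statsmodels.tsa.stattools import grangercausalitytests
# grangercausalitytests checks whether the SECOND column Granger-causes the FIRST.
macro = sm.datasets.macrodata.load_pandas().data[['infl', 'unemp']].diff().dropna()
# Does (differenced) unemployment help forecast (differenced) inflation?
# For each lag up to maxlag, an F-test ('ssr_ftest') and its p-value are reported;
# small p-values indicate that lagged unemployment improves the inflation forecast.
gc_results = grangercausalitytests(macro[['infl', 'unemp']], maxlag=4)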
Tool 2: Impulse Response Functions (IRFs)
An Impulse Response Function (IRF) is the most important tool for VAR analysis. It traces out the dynamic impact of a one-time shock to one of the variables on the future paths of all variables in the system.
It answers the question: "If a shock of one standard deviation hits variable $j$ today, what is the effect on variable $i$ over the next 10 periods?"
The Problem of Identification: A raw shock to $y_1$ might be correlated with a shock to $y_2$. To trace out a "pure" shock to one variable, we need to make an ordering assumption. The standard method for this is the **Cholesky decomposition** of the covariance matrix of the errors, which assumes that the first variable in the ordering contemporaneously affects all others, the second affects all but the first, and so on. The ordering of variables in a VAR model matters for the IRF.
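The following sketch makes the ordering point concrete with the same macro data used in Part 4: fitting the identical VAR under two different column orderings leaves the plain impulse responses unchanged up to relabelling, but produces genuinely different orthogonalized (Cholesky) responses.
import statsmodels.api as sm
from statsmodels.tsa.api import VAR
macro = sm.datasets.macrodata.load_pandas().data[['infl', 'unemp', 'tbilrate']].diff().dropna()
# Same model, two Cholesky orderings (the column order IS the ordering assumption)
irf_a = VAR(macro[['infl', 'unemp', 'tbilrate']]).fit(1).irf(10)
irf_b = VAR(macro[['tbilrate', 'unemp', 'infl']]).fit(1).irf(10)
# Reordering merely relabels the plain responses (irf.irfs), but it changes the
# orthogonalized responses (irf.orth_irfs) because the Cholesky factor changes.
print(irf_a.orth_irfs[1].round(4))   # orthogonalized responses one period after the shock
print(irf_b.orth_irfs[1].round(4))   # different values, not just a permutation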
Part 4: Python Implementation - A Macroeconomic VAR
Analyzing a Macro System with a VAR Model
import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
import statsmodels.api as sm
from statsmodels.tsa.api import VAR
# --- 1. Load and Prepare Data ---
# Use the famous Macroeconomic Data from statsmodels
data = sm.datasets.macrodata.load_pandas().data
data = data[['infl', 'unemp', 'tbilrate']].copy()
# Take the first difference to ensure stationarity
data_diff = data.diff().dropna()
# --- 2. Select the VAR Order (p) ---
# We can use information criteria to select the optimal lag order
model_for_selection = VAR(data_diff)
results_sel = model_for_selection.select_order(maxlags=10)
print(results_sel.summary())  # compare AIC/BIC/HQIC/FPE; for this lesson we proceed with p=1
# --- 3. Fit the VAR(1) Model ---
model = VAR(data_diff)
results = model.fit(1) # Fit with 1 lag
print(results.summary())
# --- 4. Granger Causality Test ---
# Does unemployment Granger-cause inflation?
gc_test = results.test_causality('infl', ['unemp'], kind='f')
print("\nGranger Causality Test (unemp -> infl):")
print(gc_test.summary())
# A low p-value suggests it does.
# --- 5. Impulse Response Function (IRF) ---
# How does a shock to the treasury bill rate affect unemployment?
irf = results.irf(periods=20)
# Plot the response of unemployment to an orthogonalized (Cholesky) shock to the T-bill rate
irf.plot(orth=True, response='unemp', impulse='tbilrate')
plt.title('Response of Unemployment to a T-Bill Rate Shock')
plt.xlabel('Periods after shock')
plt.ylabel('Change in Unemployment')
plt.grid(True, alpha=0.3)
plt.show()
# --- 6. Forecast ---
# Forecast the next 10 quarters
lag_order = results.k_ar
forecast = results.forecast(data_diff.values[-lag_order:], steps=10)
# The output is an array of forecasts for each variable
print("\n10-Quarter Forecast (Differenced Data):")
print(forecast)
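Because the model was fit on first differences, the forecasts above are also in differences. A minimal way to express them back in the original levels, continuing directly from the script above, is to cumulate the forecast changes and add the last observed level of each series:
# Convert the differenced forecasts back to levels
forecast_df = pd.DataFrame(forecast, columns=data_diff.columns)
levels_forecast = data.iloc[-1] + forecast_df.cumsum()
print("\n10-Quarter Forecast (Levels):")
print(levels_forecast)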
What's Next? Finding Long-Run Balance
The VAR model is a powerful tool for analyzing the short-run dynamics of stationary systems. We built it by differencing our non-stationary macro data. But in doing so, we may have thrown away valuable information about the **long-run relationships** between the levels of the variables.
For example, economic theory suggests that two interest rates with different maturities should not drift arbitrarily far apart from each other forever; there is a long-run equilibrium relationship that anchors them. Differencing the data ignores this anchor.
In the next lesson, we will introduce the concept of **Cointegration**—a statistical test for the existence of these stable, long-run equilibrium relationships between non-stationary time series.