Applications of Eigen-analysis

Markov Chains & Dynamic Systems

We've developed the complete theory of eigenvalues and eigenvectors (Av = \lambda v) and the powerful computational tool of diagonalization (A = PDP^{-1}).

Now, we put it to work. The true power of eigen-analysis is in understanding systems that evolve over time. These are called dynamic systems. We will explore one of the most famous and useful examples: Markov Chains.

What is a Dynamic System?

A dynamic system is any system that changes from one state to another over discrete time steps according to a fixed rule. If that rule is a linear transformation, we can model it with a matrix.

Let x_k be the state of a system at time step k. The state at the next time step, x_{k+1}, is given by:

x_{k+1} = A x_k
  • x_0 is our initial state.
  • x_1 = A x_0
  • x_2 = A x_1 = A(A x_0) = A^2 x_0
  • In general: x_k = A^k x_0

The state of the system at any future time k is determined by the k-th power of the transformation matrix A applied to the initial state x_0.
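This iteration is easy to simulate. A minimal NumPy sketch (the matrix and initial state are the streaming example used later in this section; the helper `state_at` is ours):

```python
import numpy as np

# Sample 2x2 transition matrix and initial state (values from the
# streaming-service example later in this section).
A = np.array([[0.9, 0.2],
              [0.1, 0.8]])
x0 = np.array([0.6, 0.4])

def state_at(A, x0, k):
    """Compute x_k = A^k x_0 by repeated matrix-vector products."""
    x = x0
    for _ in range(k):
        x = A @ x
    return x

# The loop agrees with the closed form using the matrix power A^k.
assert np.allclose(state_at(A, x0, 5),
                   np.linalg.matrix_power(A, 5) @ x0)
```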

Application: Markov Chains
A Markov Chain is a specific type of dynamic system used to model probabilities. It describes the movement of something between a finite number of states.

The Rules of a Markov Matrix (or Transition Matrix):

  1. All entries are between 0 and 1 (they represent probabilities).
  2. The entries in each column sum to 1 (the total probability of moving from a given state must be 100%).
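Both rules are easy to check programmatically. A small sketch (the helper `is_markov_matrix` is our own name, not a library function):

```python
import numpy as np

def is_markov_matrix(A, tol=1e-9):
    """Check the two rules for a column-stochastic Markov matrix."""
    A = np.asarray(A, dtype=float)
    entries_ok = np.all((A >= -tol) & (A <= 1 + tol))  # rule 1: probabilities
    columns_ok = np.allclose(A.sum(axis=0), 1.0)       # rule 2: columns sum to 1
    return bool(entries_ok and columns_ok)

assert is_markov_matrix([[0.9, 0.2], [0.1, 0.8]])
assert not is_markov_matrix([[0.9, 0.5], [0.2, 0.5]])  # first column sums to 1.1
```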

Example: A Simple Market Model

Imagine a city where people choose between two streaming services, "StreamFlix" and "ConnectPlus." Each month, some people switch.

  • StreamFlix keeps 90% of its customers, and 10% switch to ConnectPlus.
  • ConnectPlus keeps 80% of its customers, and 20% switch to StreamFlix.

We model this with a transition matrix A:

A = \begin{bmatrix} 0.9 & 0.2 \\ 0.1 & 0.8 \end{bmatrix}

Let x_0 = [0.6, 0.4]^T be the initial market share (60% StreamFlix, 40% ConnectPlus). After one month:

x_1 = A x_0 = \begin{bmatrix} 0.9 & 0.2 \\ 0.1 & 0.8 \end{bmatrix} \begin{bmatrix} 0.6 \\ 0.4 \end{bmatrix} = \begin{bmatrix} 0.62 \\ 0.38 \end{bmatrix}

StreamFlix will have 62% and ConnectPlus will have 38%.
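The arithmetic is quick to verify with NumPy:

```python
import numpy as np

A = np.array([[0.9, 0.2],
              [0.1, 0.8]])
x0 = np.array([0.6, 0.4])

x1 = A @ x0  # one month of customer switching
assert np.allclose(x1, [0.62, 0.38])
```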

The Big Question: What Happens in the Long Run?
Because each column of a Markov matrix sums to 1, \lambda = 1 is always an eigenvalue. The **steady state** of the system is the corresponding eigenvector: a probability vector x_{ss} with A x_{ss} = x_{ss}, so the distribution no longer changes from month to month.

We find the eigenvector for \lambda = 1 by solving (A - I)v = 0.

A - I = \begin{bmatrix} 0.9-1 & 0.2 \\ 0.1 & 0.8-1 \end{bmatrix} = \begin{bmatrix} -0.1 & 0.2 \\ 0.1 & -0.2 \end{bmatrix}

Writing v = [s, c]^T for the StreamFlix and ConnectPlus shares, the first row gives -0.1s + 0.2c = 0, which simplifies to s = 2c. Eigenvectors therefore have the form [2c, c]^T; choosing `c = 1` gives a basis vector v_1 = [2, 1]^T.

To make this a probability vector (components sum to 1), we normalize it:

x_{ss} = \begin{bmatrix} 2/3 \\ 1/3 \end{bmatrix} \approx \begin{bmatrix} 0.667 \\ 0.333 \end{bmatrix}

Conclusion: The market will stabilize with StreamFlix holding 2/3 of the market and ConnectPlus holding 1/3, regardless of the initial state.
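Both the steady state and the "regardless of the initial state" claim can be confirmed numerically. A sketch using `np.linalg.eig` (the scale and sign of eigenvectors returned by `eig` are arbitrary, hence the normalization):

```python
import numpy as np

A = np.array([[0.9, 0.2],
              [0.1, 0.8]])

# Find the eigenvector whose eigenvalue is (numerically) 1.
eigenvalues, eigenvectors = np.linalg.eig(A)
idx = np.argmin(np.abs(eigenvalues - 1.0))
v = eigenvectors[:, idx].real

# Rescale so the components sum to 1 (a probability vector).
steady_state = v / v.sum()
assert np.allclose(steady_state, [2/3, 1/3])

# Iterating from a very different start converges to the same vector.
x = np.array([0.0, 1.0])
for _ in range(200):
    x = A @ x
assert np.allclose(x, steady_state, atol=1e-6)
```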

Summary: Eigen-analysis in Action
  • Dynamic systems x_{k+1} = A x_k can be understood by analyzing the matrix `A`.
  • Diagonalization (A^k = P D^k P^{-1}) is key to computing long-term states.
  • Markov Chains model probabilistic state changes.
  • The steady state of a Markov chain is the eigenvector for the eigenvalue \lambda = 1.
  • Other eigenvalues with magnitude less than 1 determine the rate of convergence to the steady state.
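The diagonalization bullet can be checked directly on the streaming example: its eigenvalues are 1 and 0.7, and the 0.7^k term shrinking toward zero is what sets the convergence rate. A NumPy sketch:

```python
import numpy as np

A = np.array([[0.9, 0.2],
              [0.1, 0.8]])

# Diagonalize: A = P D P^{-1}, with the eigenvalues on D's diagonal.
eigenvalues, P = np.linalg.eig(A)
D = np.diag(eigenvalues)

# A^k computed via P D^k P^{-1} matches the direct matrix power.
k = 10
Ak_diag = P @ np.linalg.matrix_power(D, k) @ np.linalg.inv(P)
assert np.allclose(Ak_diag, np.linalg.matrix_power(A, k))
```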

**Up Next:** We will explore the special case of **Symmetric Matrices** and the **Spectral Theorem**.