For the past eight lessons, we have been on a journey. We started with the simple problem `Ax=b` and, in our quest to understand it, we have uncovered a hidden world of interconnected structures: the Column Space, the Null Space, the Row Space, and the Left Null Space.
We have treated them as separate characters in our story. Today, we bring them all on stage for the final act. The Fundamental Theorem of Linear Algebra is not a new calculation. It is a statement of profound, beautiful truth that connects all these ideas into a single, cohesive picture. It is the grand unified theory of what a matrix *is*.
Let's consider an `m x n` matrix `A` with rank `r`. This means:
- `A` takes input vectors from `n`-dimensional space (ℝⁿ).
- `A` produces output vectors in `m`-dimensional space (ℝᵐ).
- The "true" dimension of the action is `r` (the number of pivots).
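As a quick sanity check on those three numbers, here is a minimal SymPy sketch (the example matrix is my own, chosen only for illustration): it reads `m` and `n` off the shape and `r` off the pivot count.

```python
from sympy import Matrix

# A hypothetical 2x3 example, used only to illustrate m, n, and r.
A = Matrix([[1, 2, 3],
            [2, 4, 6]])    # second row is twice the first, so only one pivot

m, n = A.shape             # m = 2 outputs, n = 3 inputs
r = A.rank()               # r = 1 (the number of pivots in the RREF)

print(m, n, r)             # 2 3 1
```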
The Decomposition of Spaces & Dimensions
The theorem tells us that the input and output spaces can each be split perfectly into two orthogonal subspaces.
- Input Space (ℝⁿ): split into the Row Space and the Null Space. They are orthogonal complements, and `dim(Row Space) + dim(Null Space) = r + (n - r) = n`.
- Output Space (ℝᵐ): split into the Column Space and the Left Null Space. They are orthogonal complements, and `dim(Column Space) + dim(Left Null Space) = r + (m - r) = m`.
This diagram is the most important picture in all of linear algebra. It shows that `A` acts as a perfect, one-to-one mapping from its `r`-dimensional Row Space onto its `r`-dimensional Column Space, while the entire `(n-r)`-dimensional Null Space is collapsed onto the zero vector.
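You can watch both halves of that picture happen numerically. The sketch below (SymPy again, reusing the same small illustrative matrix as above rather than anything from this lesson) sends null-space basis vectors to the zero vector and row-space basis vectors to nonzero vectors inside the Column Space.

```python
from sympy import Matrix

# Hypothetical 2x3 example with rank 1, used only for illustration.
A = Matrix([[1, 2, 3],
            [2, 4, 6]])

# The Null Space is collapsed onto the zero vector of R^m.
for x in A.nullspace():
    print((A * x).T)       # Matrix([[0, 0]]) for every null-space vector

# Row Space vectors are carried to nonzero vectors inside the Column Space.
for v in A.rowspace():
    print((A * v.T).T)     # e.g. Matrix([[14, 28]]), a multiple of the column [1, 2]^T
```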
A Complete Walkthrough
Let's analyze a `3x4` matrix (`m=3, n=4`) with rank `r=2`.
Let `A = [[1, 2, 3, 4], [2, 4, 7, 9], [3, 6, 8, 11]]` (rows listed top to bottom), which has RREF `R = [[1, 2, 0, 1], [0, 0, 1, 1], [0, 0, 0, 0]]`.
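If you want to verify that reduction yourself, here is a minimal SymPy sketch (any CAS, or elimination by hand, works just as well):

```python
from sympy import Matrix

A = Matrix([[1, 2, 3,  4],
            [2, 4, 7,  9],
            [3, 6, 8, 11]])

R, pivot_cols = A.rref()   # reduced row echelon form plus the pivot column indices
print(R)                   # Matrix([[1, 2, 0, 1], [0, 0, 1, 1], [0, 0, 0, 0]])
print(pivot_cols)          # (0, 2)  -> two pivots, so rank r = 2
```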
| Subspace | Dimension | Basis |
| --- | --- | --- |
| Column Space `C(A)` | `r = 2` (a plane in ℝ³) | {[1, 2, 3]ᵀ, [3, 7, 8]ᵀ} |
| Null Space `N(A)` | `n - r = 4 - 2 = 2` (a plane in ℝ⁴) | {[-2, 1, 0, 0]ᵀ, [-1, 0, -1, 1]ᵀ} |
| Row Space `C(Aᵀ)` | `r = 2` (a plane in ℝ⁴) | {[1, 2, 0, 1], [0, 0, 1, 1]} |
| Left Null Space `N(Aᵀ)` | `m - r = 3 - 2 = 1` (a line in ℝ³) | {[-5, 1, 1]ᵀ} (found by reducing `Aᵀ`) |
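To double-check every row of the table, here is a short SymPy sketch (the library choice is mine): the pivot columns of `A` give the Column Space basis, the nonzero rows of `R` give the Row Space basis, and the two null spaces come from SymPy's `nullspace` helper.

```python
from sympy import Matrix

A = Matrix([[1, 2, 3,  4],
            [2, 4, 7,  9],
            [3, 6, 8, 11]])

R, pivots = A.rref()                                  # pivots == (0, 2)

col_basis  = [A.col(j) for j in pivots]               # pivot columns of A itself
row_basis  = [R.row(i) for i in range(len(pivots))]   # nonzero rows of R
null_basis = A.nullspace()                            # special solutions of Ax = 0
left_basis = A.T.nullspace()                          # null space of A transpose

print(col_basis)    # [1, 2, 3]^T and [3, 7, 8]^T
print(row_basis)    # [1, 2, 0, 1] and [0, 0, 1, 1]
print(null_basis)   # [-2, 1, 0, 0]^T and [-1, 0, -1, 1]^T
print(left_basis)   # [-5, 1, 1]^T

# Dimension counts from the theorem: r + (n - r) = n and r + (m - r) = m.
print(len(row_basis) + len(null_basis),   # 4, which is n
      len(col_basis) + len(left_basis))   # 3, which is m
```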
Orthogonality Check: Take a basis vector from the Row Space, `v = [1, 2, 0, 1]`, and a basis vector from the Null Space, `x = [-2, 1, 0, 0]`. Their dot product is `(1)(-2) + (2)(1) + (0)(0) + (1)(0) = 0`. They are orthogonal, as the theorem guarantees.
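To run that check for every pair at once (and the matching check on the output side), here is a small SymPy sketch with the basis vectors copied straight from the table above:

```python
from sympy import Matrix

row_basis  = [Matrix([1, 2, 0, 1]), Matrix([0, 0, 1, 1])]       # Row Space basis
null_basis = [Matrix([-2, 1, 0, 0]), Matrix([-1, 0, -1, 1])]    # Null Space basis
col_basis  = [Matrix([1, 2, 3]), Matrix([3, 7, 8])]             # Column Space basis
left_basis = [Matrix([-5, 1, 1])]                               # Left Null Space basis

# Input side: every Row Space basis vector is orthogonal to every Null Space basis vector.
print([v.dot(x) for v in row_basis for x in null_basis])   # [0, 0, 0, 0]

# Output side: every Column Space basis vector is orthogonal to the Left Null Space.
print([c.dot(y) for c in col_basis for y in left_basis])   # [0, 0]
```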