Completing the Picture of the Four Fundamental Subspaces
We have spent a great deal of time focusing on the columns of our matrix `A`. The columns gave us the **Column Space** (`C(A)`), which told us about the possible outputs of our transformation. The relationship between the columns gave us the **Null Space** (`N(A)`), which told us about the inputs that get lost.
But what about the **rows**? The rows of a matrix are vectors too. They must have a story to tell. By exploring the world of the rows, we will uncover the final two pieces of our puzzle: the **Row Space** and the **Left Null Space**.
The Row Space: The World of "Effective" Inputs
The **Row Space** of a matrix `A`, written as `C(Aᵀ)`, is the **span of the row vectors of `A`**.
This seems straightforward, but what does it *mean*? If the Column Space is the space of all possible outputs, the Row Space represents the space of **"effective" inputs**. It is the part of the input space that the matrix `A` actually "pays attention to" and transforms into the output space. Any part of an input vector that is orthogonal to the Row Space will be "ignored" by the transformation and sent to zero. In fact, the Row Space is the **orthogonal complement** of the Null Space.
Key Idea
The matrix `A` acts as a perfect, one-to-one mapping from its Row Space to its Column Space.
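To see this splitting of the input in action, here is a minimal sketch in Python using sympy (my own illustration, not part of the lesson's derivation). It projects an input vector onto the Row Space and shows that `A` sends the leftover Null Space component to zero, so only the Row Space part of the input ever reaches the Column Space. It uses the 3×4 example matrix `A` that appears later in this section.

```python
# A minimal sketch (sympy): split an input vector into its Row Space and
# Null Space components and watch A ignore the Null Space part.
from sympy import Matrix

A = Matrix([[1, 2, 3, 4],
            [2, 4, 7, 9],
            [3, 6, 8, 11]])

x = Matrix([1, 1, 1, 1])                 # an arbitrary input vector

# The Row Space of A is the column space of A^T.
row_basis = A.T.columnspace()            # basis vectors of C(A^T)
B = Matrix.hstack(*row_basis)            # columns = Row Space basis
x_row = B * (B.T * B).inv() * B.T * x    # orthogonal projection onto the Row Space
x_null = x - x_row                       # the leftover part lies in the Null Space

print(A * x_null)                        # -> zero vector: the Null Space part is annihilated
print(A * x == A * x_row)                # -> True: only the Row Space part matters
```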
Finding a Basis for the Row Space
This is where Gaussian Elimination shines once again, and in a much simpler way than for the other subspaces. Because each row operation only replaces a row with a linear combination of the rows, elimination never changes the Row Space. The **non-zero rows of the Row Echelon Form (REF)** of `A` therefore form a basis for the Row Space of `A`.
Example:
Let's use our familiar matrix `A`:
A =
[ 1  2  3  4 ]
[ 2  4  7  9 ]
[ 3  6  8 11 ]
We already found its REF, which we called `U`:
U =
[ 1  2  3  4 ]
[ 0  0  1  1 ]
[ 0  0  0  0 ]
The non-zero rows of `U` are `[1, 2, 3, 4]` and `[0, 0, 1, 1]`. A basis for the Row Space of `A` is:
{ [1, 2, 3, 4], [0, 0, 1, 1] }
The dimension of the Row Space is **two**. What was the dimension of the Column Space (the rank)? Also **two**.
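If you want to check this by machine, here is a small sympy sketch (my own verification, not from the text). Note that sympy's `rref()` returns the fully *reduced* echelon form, so its first non-zero row is `[1, 2, 0, 1]` rather than `[1, 2, 3, 4]`; it is a different basis, but it spans exactly the same Row Space.

```python
# A minimal sketch (sympy): read a Row Space basis off the echelon form.
from sympy import Matrix

A = Matrix([[1, 2, 3, 4],
            [2, 4, 7, 9],
            [3, 6, 8, 11]])

R, pivots = A.rref()                              # reduced row echelon form, pivot columns
basis = [R.row(i) for i in range(len(pivots))]    # the non-zero rows of R

print(basis)       # two independent rows -> dim(Row Space) = 2
print(A.rank())    # 2, matching the dimension of the Column Space
```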
Profound Result
The dimension of the Row Space is always equal to the dimension of the Column Space:

dim(Row Space) = dim(Column Space) = rank(A)
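A one-line check of this result, again with sympy (my own addition):

```python
# rank(A) counts independent columns; rank(A^T) counts independent rows.
from sympy import Matrix

A = Matrix([[1, 2, 3, 4],
            [2, 4, 7, 9],
            [3, 6, 8, 11]])

print(A.rank(), A.T.rank())   # -> 2 2 : dim(Row Space) = dim(Column Space)
```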
The Left Null Space: The Final Piece
To find the fourth space, we follow the pattern. The **Left Null Space** of `A` is the Null Space of `Aᵀ`. It is the set of all vectors `y` such that `Aᵀy = 0`.
It's called the "Left" Null Space because if we transpose the equation `Aᵀy = 0`, we get `(Aᵀy)ᵀ = 0ᵀ`, which simplifies to `yᵀA = 0`. The vector `y` is on the *left* of `A`.
The Left Null Space is the orthogonal complement to the Column Space. It is the set of all vectors that are perpendicular to every single vector in the Column Space. If the Column Space is a plane in 3D, the Left Null Space is the line (the normal vector) that is perpendicular to that plane.
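Here is a short sympy sketch (my own illustration) that computes a basis for the Left Null Space as `N(Aᵀ)` and confirms that its vector is perpendicular to every column of `A`:

```python
# A minimal sketch (sympy): the Left Null Space is the Null Space of A^T.
from sympy import Matrix

A = Matrix([[1, 2, 3, 4],
            [2, 4, 7, 9],
            [3, 6, 8, 11]])

left_null = A.T.nullspace()     # basis for N(A^T)
y = left_null[0]                # here m - r = 3 - 2 = 1, so a single basis vector

print(y.T * A)                                     # -> the zero row: y^T A = 0
print([y.dot(A.col(j)) for j in range(A.cols)])    # y is orthogonal to every column of A
```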
The Grand Unification: The Fundamental Theorem of Linear Algebra
For any `m x n` matrix `A` with rank `r`, the four fundamental subspaces have the following properties and relationships.
The **Row Space** (dimension `r`) and the **Null Space** (dimension `n - r`) both live in `Rⁿ`, and each is the **orthogonal complement** of the other.
The **Column Space** (dimension `r`) and the **Left Null Space** (dimension `m - r`) both live in `Rᵐ`, and each is the **orthogonal complement** of the other.
This is the big picture. This is the complete, elegant, geometric theory of what any matrix `A` *is* and *does*. It takes the entire input space, splits it into two perpendicular subspaces (Row Space and Null Space), and transforms one into the Column Space while annihilating the other.
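As a closing sanity check, here is a sympy sketch (my own, not the author's) that tallies the dimensions of all four subspaces for this section's matrix `A` and confirms both orthogonal-complement relationships:

```python
# A minimal sketch (sympy): dimension counts and orthogonality for the
# four fundamental subspaces of the example matrix A.
from sympy import Matrix

A = Matrix([[1, 2, 3, 4],
            [2, 4, 7, 9],
            [3, 6, 8, 11]])
m, n = A.shape                      # 3 x 4
r = A.rank()                        # 2

row_space  = A.T.columnspace()      # dim r      (lives in R^n)
null_space = A.nullspace()          # dim n - r  (lives in R^n)
col_space  = A.columnspace()        # dim r      (lives in R^m)
left_null  = A.T.nullspace()        # dim m - r  (lives in R^m)

print(len(row_space), len(null_space), len(col_space), len(left_null))   # 2 2 2 1

# Row Space ⟂ Null Space, Column Space ⟂ Left Null Space
print(all(u.dot(v) == 0 for u in row_space for v in null_space))   # True
print(all(u.dot(v) == 0 for u in col_space for v in left_null))    # True
```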
**Up Next:** We have now completed the entire theoretical core of solving `Ax=b` and understanding the structure of `A`. We will now pivot to a new, major topic: **Determinants**.