Linear Algebra: Row Space and Left Null Space
Understanding the four fundamental subspaces of a matrix is like having a complete map of a linear transformation's domain and codomain. While the column space and null space are often discussed first, the row space and left null space complete the picture, revealing powerful orthogonality relationships and dimension constraints that are critical in engineering applications, from solving underdetermined systems to data compression and network analysis.
Defining the Row Space
The row space of a matrix is the set of all possible linear combinations of its row vectors. Formally, if $A$ is an $m \times n$ matrix, its row space is a subspace of $\mathbb{R}^n$. It consists of all vectors of the form $yA$, where $y$ is any row vector of appropriate length ($1 \times m$), though a more practical characterization is that it contains exactly the vectors $b$ for which the system $A^T x = b$ is consistent. Crucially, the row space of $A$ is identical to the column space of $A^T$.
The most reliable method for finding a basis for the row space is to compute the reduced row echelon form (RREF) of $A$. The non-zero rows of the RREF form a basis for the row space. For example, consider the matrix
$$A = \begin{pmatrix} 1 & 2 & 3 \\ 2 & 5 & 7 \end{pmatrix}.$$
Its RREF is
$$R = \begin{pmatrix} 1 & 0 & 1 \\ 0 & 1 & 1 \end{pmatrix}.$$
The two non-zero rows, $(1, 0, 1)$ and $(0, 1, 1)$, form a basis for the row space of $A$. This method works because elementary row operations do not change the row space; they only change the spanning set. A key insight is that while the RREF is unique, the basis you obtain is just one of infinitely many possible bases for the same subspace.
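The RREF method can be checked directly with SymPy; the following is a minimal sketch using an illustrative $2 \times 3$ matrix (the entries are chosen purely for illustration):

```python
from sympy import Matrix

# Illustrative 2x3 matrix; any matrix works the same way.
A = Matrix([[1, 2, 3],
            [2, 5, 7]])

# rref() returns the reduced row echelon form and the pivot column indices.
R, pivots = A.rref()

# The non-zero rows of R form a basis for the row space of A.
row_space_basis = [R.row(i) for i in range(len(pivots))]
```

Because SymPy works over exact rationals, the RREF is computed without the floating-point pitfalls a numerical rank computation would face.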
Understanding the Left Null Space
The left null space of a matrix $A$ is the set of all vectors $y$ such that $y^T A = 0^T$. In other words, it is the null space of $A^T$. If $A$ is $m \times n$, then the left null space is a subspace of $\mathbb{R}^m$. The name "left null space" comes from the fact that the null vector $y^T$ multiplies $A$ on the left.
To find a basis for the left null space, you can find the null space of $A^T$. A systematic computational approach involves appending the $m \times m$ identity matrix to $A$ and row reducing the augmented matrix $[\,A \mid I\,]$: the rows of the transformed identity block that correspond to zero rows of the RREF form a basis for the left null space. Equivalently, we find the special solutions to $A^T y = 0$. For the specific matrix $A$ above, the transpose $A^T$ has rank 2, so its null space (the left null space of $A$) has dimension $m - r = 2 - 2 = 0$. It is trivial, containing only the zero vector. A non-trivial example would be a matrix with linearly dependent rows.
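In SymPy the left null space can be read off as the null space of the transpose. Here is a sketch using a hypothetical matrix $B$ (not from the text) whose third row is the sum of the first two, so the left null space is non-trivial:

```python
from sympy import Matrix

# Hypothetical matrix with dependent rows: row3 = row1 + row2.
B = Matrix([[1, 2, 3],
            [2, 5, 7],
            [3, 7, 10]])

# The left null space of B is the null space of B^T.
left_null_basis = B.T.nullspace()

# Each basis vector y satisfies y^T B = 0^T.
for y in left_null_basis:
    assert y.T * B == Matrix.zeros(1, 3)
```

The single basis vector $(-1, -1, 1)$ encodes exactly the row dependency: $-\text{row}_1 - \text{row}_2 + \text{row}_3 = 0$.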
The Orthogonality Relationships
The true power of the four subspaces emerges from their orthogonal relationships in $\mathbb{R}^n$ and $\mathbb{R}^m$, a concept formalized by the Fundamental Theorem of Linear Algebra.
For an $m \times n$ matrix $A$ with rank $r$:
- In $\mathbb{R}^n$: The row space of $A$ and the null space of $A$ are orthogonal complements. Every vector in the row space is orthogonal to every vector in the null space. Furthermore, together they span the entire $\mathbb{R}^n$, meaning any vector in $\mathbb{R}^n$ can be written uniquely as the sum of a row space vector and a null space vector.
- In $\mathbb{R}^m$: The column space of $A$ and the left null space of $A$ are orthogonal complements. Every vector in the column space is orthogonal to every vector in the left null space. Together, they span $\mathbb{R}^m$.
This orthogonality can be expressed as:
- Row space $\perp$ Null space: $C(A^T) = N(A)^\perp$.
- Column space $\perp$ Left null space: $C(A) = N(A^T)^\perp$.
This duality is perfect: the vectors that kill $A$ on the right (the null space) are orthogonal to the rows, and the vectors that kill $A$ on the left (the left null space) are orthogonal to the columns.
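Both orthogonality statements can be verified numerically; a minimal sketch, assuming an illustrative $2 \times 3$ matrix:

```python
from sympy import Matrix

A = Matrix([[1, 2, 3],
            [2, 5, 7]])

# Every null space vector is orthogonal to every row of A.
for x in A.nullspace():
    for i in range(A.rows):
        assert A.row(i).dot(x) == 0

# Every left null space vector is orthogonal to every column of A.
# (For this full-row-rank A the left null space is trivial, so the
# inner assertion never fires; the statement holds vacuously.)
for y in A.T.nullspace():
    for j in range(A.cols):
        assert A.col(j).dot(y) == 0
```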
Dimension Relationships and the Complete Picture
The dimensions of the four subspaces are governed by the rank of the matrix $A$, which is the bridge connecting all of them. Let $A$ be an $m \times n$ matrix of rank $r$.
- Row Space Dimension: $\dim C(A^T) = r$. The rank is the dimension of the row space.
- Column Space Dimension: $\dim C(A) = r$. The rank is also the dimension of the column space.
- Null Space Dimension: $\dim N(A) = n - r$. This is the number of free variables in $Ax = 0$.
- Left Null Space Dimension: $\dim N(A^T) = m - r$. This is the number of free variables in $A^T y = 0$.
These relationships give us the complete picture of matrix subspace geometry. The matrix $A$ acts as a linear transformation from $\mathbb{R}^n$ to $\mathbb{R}^m$. Within $\mathbb{R}^n$, the domain splits into two perpendicular parts: the row space (dimension $r$) and the null space (dimension $n - r$). The matrix sends the entire null space to the zero vector in $\mathbb{R}^m$. It maps the row space in a one-to-one fashion onto the column space in $\mathbb{R}^m$. In the codomain $\mathbb{R}^m$, the column space (dimension $r$) is itself orthogonal to the left null space (dimension $m - r$). This elegant structure explains why $Ax = b$ is solvable only when $b$ is in the column space, or equivalently, when $b$ is orthogonal to every vector in the left null space.
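The dimension counts can be confirmed directly; a sketch assuming a hypothetical $3 \times 4$ matrix of rank 2:

```python
from sympy import Matrix

# Hypothetical 3x4 matrix; row3 = row1 + row2, so the rank r is 2.
A = Matrix([[1, 0, 2, 1],
            [0, 1, 1, 1],
            [1, 1, 3, 2]])
m, n = A.shape
r = A.rank()

assert len(A.rref()[1]) == r          # dim(row space)       = r
assert len(A.nullspace()) == n - r    # dim(null space)      = n - r
assert len(A.T.nullspace()) == m - r  # dim(left null space) = m - r
```

The dimensions split exactly as the theorem predicts: $r + (n - r) = n$ in the domain and $r + (m - r) = m$ in the codomain.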
Common Pitfalls
- Confusing Row Space with Column Space Basis: Students often mistakenly take the pivot columns of the RREF as a basis for the column space. Remember: the non-zero rows of the RREF form the basis for the row space. The column space basis comes from the pivot columns of the original matrix $A$.
- Misidentifying the Left Null Space Vector Space: For an $m \times n$ matrix $A$, the left null space lives in $\mathbb{R}^m$, not $\mathbb{R}^n$. A quick check: if $y$ is in the left null space, the equation $y^T A = 0^T$ must be dimensionally consistent, forcing $y$ to have $m$ components.
- Overlooking the Trivial Case: A matrix with full row rank ($r = m$) has a left null space of dimension zero (just the zero vector). Forgetting this can lead to unnecessary calculations searching for non-existent basis vectors.
- Assuming Orthogonality Without the Correct Inner Product: The orthogonality relationships hold under the standard dot product (Euclidean inner product). If you are working in a vector space with a different inner product, these relationships must be re-evaluated in that context.
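The first pitfall can be demonstrated concretely: for an illustrative matrix with dependent rows (chosen for this sketch, not from the text), a pivot column of the RREF need not even lie in the column space of the original matrix.

```python
from sympy import Matrix

# Illustrative 3x3 matrix with dependent rows: row3 = row1 + row2.
A = Matrix([[1, 2, 3],
            [2, 5, 7],
            [3, 7, 10]])
R, pivots = A.rref()

# Appending the first RREF pivot column to A raises the rank, proving
# that column is NOT in the column space of A.
assert Matrix.hstack(A, R[:, 0]).rank() > A.rank()

# The correct column space basis uses the pivot columns of A itself.
col_basis = [A[:, j] for j in pivots]
```

Here the RREF pivot column is $(1, 0, 0)$, which no combination of the columns of $A$ can produce, while the pivot columns of $A$ itself span the column space.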
Summary
- The row space of $A$ is the span of its rows; a basis is found from the non-zero rows of its RREF.
- The left null space of $A$ is the set of all $y$ such that $y^T A = 0^T$; it is the null space of $A^T$.
- The row space and null space are orthogonal complements in $\mathbb{R}^n$, while the column space and left null space are orthogonal complements in $\mathbb{R}^m$.
- The dimensions are perfectly balanced: row space and column space both have dimension $r$ (the rank), while the null space has dimension $n - r$ and the left null space has dimension $m - r$.
- Together, these four subspaces provide a complete geometric decomposition of the domain and codomain of the linear transformation defined by $A$, a foundational concept for advanced engineering mathematics.