Feb 9

Linear Algebra: Eigenvalues and Eigenvectors

Mindli AI


Eigenvalues and eigenvectors sit at the center of linear algebra because they reveal how a linear transformation acts in its “natural” directions. Instead of tracking how a matrix scrambles every vector, eigenvectors identify special directions that the matrix only stretches, compresses, or flips, and eigenvalues quantify that scaling. This viewpoint powers major tools in differential equations, vibration analysis, quantum mechanics, and modern data analysis such as principal component analysis (PCA).

What eigenvalues and eigenvectors mean

Let $A$ be an $n \times n$ matrix. A nonzero vector $v$ is an eigenvector of $A$ if applying $A$ to $v$ produces a scalar multiple of $v$:

$$Av = \lambda v$$

The scalar $\lambda$ is the corresponding eigenvalue. Geometrically, $A$ maps the line spanned by $v$ back onto itself. The transformation may stretch ($\lambda > 1$), shrink ($0 < \lambda < 1$), reverse direction ($\lambda < 0$), or collapse ($\lambda = 0$).

Two practical points matter immediately:

  • Eigenvectors are defined only up to scaling. If $v$ is an eigenvector, so is $cv$ for any nonzero scalar $c$ (see the sketch after this list).
  • Not every $n \times n$ matrix has $n$ linearly independent eigenvectors. When it does, the matrix becomes much easier to analyze.
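
A quick numerical check of the definition, using NumPy and a small symmetric matrix chosen purely for illustration: each returned eigenpair satisfies $Av = \lambda v$, and rescaling an eigenvector gives another eigenvector for the same eigenvalue.

```python
import numpy as np

# Illustrative 2x2 matrix (my choice): eigenvalues are 3 and 1.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigvals, eigvecs = np.linalg.eig(A)   # columns of eigvecs are eigenvectors

for lam, v in zip(eigvals, eigvecs.T):
    # Check the defining equation A v = lambda v (up to floating-point error).
    print(lam, np.allclose(A @ v, lam * v))

# Scaling an eigenvector gives another eigenvector for the same eigenvalue.
v0 = eigvecs[:, 0]
print(np.allclose(A @ (5 * v0), eigvals[0] * (5 * v0)))
```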

Finding eigenvalues: the characteristic polynomial

To find eigenvalues, rewrite the eigenvector equation $Av = \lambda v$ as:

$$(A - \lambda I)v = 0$$

A nonzero solution $v$ exists only when the matrix $A - \lambda I$ is singular, meaning:

$$\det(A - \lambda I) = 0$$

The expression $p(\lambda) = \det(A - \lambda I)$ is the characteristic polynomial. Its roots are the eigenvalues of $A$. For an $n \times n$ matrix, $p(\lambda)$ has degree $n$, so there are $n$ eigenvalues counting multiplicity (over the complex numbers). Even when $A$ has only real entries, some of its eigenvalues may be complex.
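
For small matrices the characteristic-polynomial route can be carried out directly. The sketch below (matrix chosen for illustration) gets the coefficients of $\det(A - \lambda I)$ with NumPy and confirms that the polynomial's roots match the eigenvalues; as discussed in the numerical-pitfalls section later, this is not how large problems are solved in practice.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Coefficients of det(A - lambda*I), leading coefficient 1.
# (NumPy derives these from the eigenvalues internally, so this is illustrative.)
coeffs = np.poly(A)            # here: [1, -4, 3]  ->  lambda^2 - 4*lambda + 3
roots = np.roots(coeffs)       # eigenvalues as roots of the characteristic polynomial

print(coeffs)
print(np.sort(roots), np.sort(np.linalg.eigvals(A)))  # both: [1., 3.]
```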

Algebraic and geometric multiplicity

When an eigenvalue $\lambda$ repeats as a root of $p(\lambda)$, it has algebraic multiplicity greater than 1. The set of eigenvectors associated with $\lambda$, plus the zero vector, forms a subspace called the eigenspace:

$$E_\lambda = \{\, v : (A - \lambda I)v = 0 \,\}$$

Its dimension is the geometric multiplicity. Always:

$$1 \le \text{geometric multiplicity} \le \text{algebraic multiplicity}$$

This inequality is the key to understanding why some matrices resist diagonalization.

Computing eigenvectors: solving a null space

Once an eigenvalue $\lambda$ is known, eigenvectors come from solving the homogeneous linear system:

$$(A - \lambda I)v = 0$$

This is a standard null-space computation: row-reduce $A - \lambda I$ and solve for the free variables. In practice, one typically finds a basis for each eigenspace rather than a single eigenvector, because each eigenspace can have dimension larger than 1.
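
A minimal sketch of this computation, assuming SciPy is available and using a symmetric $3 \times 3$ example chosen so that one eigenspace is two-dimensional:

```python
import numpy as np
from scipy.linalg import null_space

# Symmetric example (my choice): eigenvalues are 5 (simple) and 2 (repeated).
A = np.array([[3.0, 1.0, 1.0],
              [1.0, 3.0, 1.0],
              [1.0, 1.0, 3.0]])

for lam in (5.0, 2.0):
    # Eigenvectors for lam are the nonzero vectors in the null space of A - lam*I.
    basis = null_space(A - lam * np.eye(3))   # columns form an orthonormal basis
    print(lam, basis.shape[1])                # eigenspace dimensions: 1 and 2
    print(np.allclose(A @ basis, lam * basis))  # every basis vector satisfies A v = lam v
```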

Diagonalization: why eigenvectors matter

A matrix $A$ is diagonalizable if it can be written as:

$$A = PDP^{-1}$$

where $D$ is diagonal and $P$ is invertible. The columns of $P$ are eigenvectors of $A$, and the diagonal entries of $D$ are the corresponding eigenvalues.

Diagonalization is powerful because it turns many matrix operations into simple scalar operations. For example:

$$A^k = P D^k P^{-1}$$

and $D^k$ is trivial to compute, since it just raises each diagonal entry to the $k$th power.
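
A short NumPy sketch of this identity, with an example matrix of my choosing: the eigendecomposition reproduces $A^k$ computed by repeated multiplication.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigvals, P = np.linalg.eig(A)     # A = P D P^{-1} when A is diagonalizable
k = 10

# D^k just raises each diagonal entry (eigenvalue) to the k-th power.
A_power = P @ np.diag(eigvals ** k) @ np.linalg.inv(P)

print(np.allclose(A_power, np.linalg.matrix_power(A, k)))  # True
```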

When is a matrix diagonalizable?

A sufficient (not necessary) condition is that $A$ has $n$ distinct eigenvalues. More generally, $A$ is diagonalizable exactly when it has a basis of eigenvectors, meaning the total number of linearly independent eigenvectors across all eigenvalues is $n$.

If an eigenvalue’s geometric multiplicity is too small compared to its algebraic multiplicity, you do not get enough eigenvectors. In that case, $A$ cannot be diagonalized (though it may still be put into Jordan normal form over the complex numbers).
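
The classic $2 \times 2$ example below illustrates the failure mode: the eigenvalue 2 has algebraic multiplicity 2 but a one-dimensional eigenspace, so no basis of eigenvectors exists.

```python
import numpy as np
from scipy.linalg import null_space

# A defective matrix: algebraic multiplicity 2, geometric multiplicity 1.
A = np.array([[2.0, 1.0],
              [0.0, 2.0]])

print(np.linalg.eigvals(A))              # [2., 2.]  (repeated root)

# The eigenspace for lambda = 2 is only one-dimensional ...
E = null_space(A - 2.0 * np.eye(2))
print(E.shape[1])                        # 1

# ... so there is no basis of eigenvectors, and A cannot be diagonalized.
```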

Differential equations: decoupling linear systems

Consider the linear system of differential equations:

$$x' = Ax$$

If $A$ is diagonalizable, write $A = PDP^{-1}$ and change variables with $y = P^{-1}x$. Then:

$$y' = P^{-1}x' = P^{-1}Ax = P^{-1}(PDP^{-1})x = Dy$$

This decouples the system into independent scalar differential equations:

$$y_i' = \lambda_i y_i$$

with solutions $y_i(t) = y_i(0)\,e^{\lambda_i t}$. Transforming back gives:

$$x(t) = Py(t) = \sum_i y_i(0)\,e^{\lambda_i t}\,v_i$$

where the $v_i$ are the columns of $P$ (the eigenvectors of $A$).

Eigenvalues control stability and long-term behavior. If all eigenvalues have negative real parts, solutions decay to zero. If any eigenvalue has positive real part, solutions grow in some direction. Complex eigenvalues introduce oscillations.
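
A small sketch of this recipe, with an illustrative matrix whose eigenvalues both have negative real part, so the solution of $x' = Ax$ decays exactly as described above.

```python
import numpy as np

# Illustrative system (my choice): eigenvalues -1 and -3, both negative.
A = np.array([[-1.0,  2.0],
              [ 0.0, -3.0]])

eigvals, P = np.linalg.eig(A)
x0 = np.array([1.0, 1.0])

def x(t):
    # x(t) = P diag(exp(lambda_i t)) P^{-1} x0, valid when A is diagonalizable.
    return (P @ np.diag(np.exp(eigvals * t)) @ np.linalg.inv(P) @ x0).real

print(eigvals)           # [-1., -3.]: negative real parts => stable
print(x(0.0), x(5.0))    # x(5) is already very close to the zero vector
```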

Vibrations and normal modes

In mechanical and structural engineering, eigenvalues and eigenvectors describe normal modes of vibration. A common idealized model is:

$$M\ddot{x} + Kx = 0$$

where $M$ is a mass matrix and $K$ a stiffness matrix. Seeking solutions of the form $x(t) = v\cos(\omega t)$ (or $x(t) = v\,e^{i\omega t}$) leads to a generalized eigenvalue problem:

$$Kv = \omega^2 M v$$

Here, eigenvectors $v$ represent mode shapes: patterns of motion where every part oscillates with the same frequency. The eigenvalues $\omega^2$ determine the natural frequencies. This is not just mathematical elegance; in design and safety, avoiding resonance means understanding where these eigenvalues lie and how mode shapes couple to external forcing.
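
A minimal sketch of the generalized problem, assuming SciPy is available and using a hypothetical two-mass, three-spring chain with unit masses and stiffnesses:

```python
import numpy as np
from scipy.linalg import eigh

# Hypothetical two-mass, three-spring chain with unit masses and stiffnesses.
M = np.diag([1.0, 1.0])                 # mass matrix
K = np.array([[ 2.0, -1.0],
              [-1.0,  2.0]])            # stiffness matrix

# Solve K v = omega^2 M v; eigh handles the symmetric generalized problem.
omega_sq, modes = eigh(K, M)

print(np.sqrt(omega_sq))   # natural frequencies: [1.0, 1.732...]
print(modes)               # columns are mode shapes: in-phase and out-of-phase motion
```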

Quantum mechanics: observables and eigenstates

In quantum mechanics, physical observables are represented by operators that, in finite-dimensional settings, appear as matrices. Measurement outcomes correspond to eigenvalues, and the states that yield definite outcomes correspond to eigenvectors (eigenstates).

A central requirement is that observable operators be Hermitian (complex analog of symmetric), which guarantees real eigenvalues and an orthonormal basis of eigenvectors. This structure makes spectral decomposition possible: the operator can be analyzed in terms of its eigenvalues and eigenvectors, and probabilities of measurement outcomes relate to how a state decomposes into that eigenbasis.
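
A toy numerical illustration (the observable and state are choices made here, not taken from the article): a Hermitian matrix has real eigenvalues, and the squared overlaps of a state with its eigenvectors give measurement probabilities that sum to 1.

```python
import numpy as np

# A Hermitian 2x2 observable (the Pauli-y matrix) and a simple normalized state.
H = np.array([[0.0, -1.0j],
              [1.0j,  0.0]])
psi = np.array([1.0, 0.0], dtype=complex)

# eigh is designed for Hermitian matrices: real eigenvalues, orthonormal eigenvectors.
eigvals, eigvecs = np.linalg.eigh(H)
print(eigvals)                            # [-1.,  1.]  (real)

# Probability of each outcome: squared magnitude of the overlap between the
# state and the corresponding eigenvector.
probs = np.abs(eigvecs.conj().T @ psi) ** 2
print(probs, probs.sum())                 # [0.5, 0.5], sums to 1
```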

Data analysis and PCA: directions of maximum variance

Principal component analysis uses eigenvalues and eigenvectors to identify dominant patterns in high-dimensional data. For centered data, the sample covariance matrix $C$ is symmetric. PCA finds orthonormal eigenvectors of $C$:

$$Cv_i = \lambda_i v_i, \qquad \lambda_1 \ge \lambda_2 \ge \cdots \ge 0$$

The eigenvectors $v_i$ (principal components) give directions in feature space. The eigenvalues $\lambda_i$ measure the variance captured along each direction. Sorting eigenvalues from largest to smallest ranks components by importance. Keeping the top $k$ components produces a low-dimensional representation that preserves as much variance as possible, often improving visualization, compression, and noise reduction.

A practical interpretation: if $\lambda_1$ is much larger than the rest, the data is largely one-dimensional in structure, even if recorded in many variables.
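
A compact PCA sketch on synthetic data (the data-generating choices below are illustrative): build the covariance matrix, take its eigendecomposition, sort by eigenvalue, and project onto the leading component.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 2-D data that is "almost one-dimensional": one strong direction plus noise.
t = rng.normal(size=(500, 1))
X = np.hstack([t, 0.5 * t]) + 0.05 * rng.normal(size=(500, 2))

Xc = X - X.mean(axis=0)                   # center the data
C = Xc.T @ Xc / (len(Xc) - 1)             # sample covariance matrix (symmetric)

eigvals, eigvecs = np.linalg.eigh(C)      # eigh returns eigenvalues in ascending order
order = np.argsort(eigvals)[::-1]         # sort from largest to smallest variance
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

print(eigvals)                            # lambda_1 >> lambda_2
scores = Xc @ eigvecs[:, :1]              # project onto the top principal component
print(scores.shape)                       # (500, 1): a 1-D summary of 2-D data
```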

Practical considerations and common pitfalls

Numerical computation

In real applications, eigenvalues are usually computed numerically, especially for large matrices. Small perturbations in data can shift eigenvalues, and nearly repeated eigenvalues can make eigenvectors unstable. This is one reason practitioners pay attention to conditioning and use robust algorithms (such as QR methods) rather than symbolic characteristic polynomials for large problems.

Symmetric matrices behave better

Real symmetric matrices (and complex Hermitian matrices) are a best-case scenario: eigenvalues are real, eigenvectors can be chosen orthonormal, and diagonalization is guaranteed with an orthogonal (or unitary) matrix. This stability is why covariance matrices in PCA are so convenient.
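
A quick check of these guarantees with a random symmetric matrix (example of my choosing): the symmetric eigensolver returns real eigenvalues and an orthonormal eigenvector matrix $Q$ with $S = QDQ^\top$.

```python
import numpy as np

rng = np.random.default_rng(1)
B = rng.normal(size=(4, 4))
S = (B + B.T) / 2                      # build a random symmetric matrix

eigvals, Q = np.linalg.eigh(S)         # eigh: routine specialized for symmetric/Hermitian input

print(np.all(np.isreal(eigvals)))                   # True: real eigenvalues
print(np.allclose(Q.T @ Q, np.eye(4)))              # True: orthonormal eigenvectors
print(np.allclose(Q @ np.diag(eigvals) @ Q.T, S))   # True: S = Q D Q^T
```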

Not every matrix is diagonalizable

It is tempting to assume eigenvectors always form a basis, but defective matrices exist, even with real entries. When diagonalization fails, one may need Jordan forms, generalized eigenvectors, or alternative decompositions (Schur, SVD) depending on the application.

Why eigenvalues and eigenvectors keep showing up

Eigenvalues and eigenvectors connect algebra, geometry, and computation. They convert a complicated linear transformation into understandable actions along a few key directions, and they do so in a way that scales from classroom examples to industrial simulations and modern machine learning workflows. Whether you are solving $x' = Ax$, predicting resonant frequencies, interpreting quantum measurements, or compressing data with PCA, the same core idea repeats: find the directions that a system preserves, and read the system’s behavior off the scalars that multiply them.
