Matrix powers, differential equations, Markov chains, and principal component analysis.
Overview
Eigenvalues and eigenvectors have countless applications across mathematics, science, and engineering. They simplify complex problems by revealing the fundamental modes of a system.
Matrix Powers
If A is diagonalizable with A = PDP⁻¹, then:
Aⁿ = PDⁿP⁻¹
where Dⁿ is diagonal, with the eigenvalue powers on the diagonal:
Dⁿ = diag(λ₁ⁿ, λ₂ⁿ, …, λₙⁿ)
Example
A = [4 2; 1 3],  P = [2 1; 1 −1],  D = [5 0; 0 2]
A³ = PD³P⁻¹, where D³ = [125 0; 0 8]
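As a quick check, the example can be reproduced with NumPy (a sketch; np.linalg.eig returns normalized eigenvectors, so its P differs from the hand-picked one only by column scaling):

```python
import numpy as np

# Example from above: A has eigenvalues 5 and 2
A = np.array([[4.0, 2.0],
              [1.0, 3.0]])

eigvals, P = np.linalg.eig(A)                      # columns of P are eigenvectors
A3 = P @ np.diag(eigvals**3) @ np.linalg.inv(P)    # A^3 = P D^3 P^{-1}

print(np.round(A3, 6))                             # matches np.linalg.matrix_power(A, 3)
```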
Fibonacci Numbers
The Fibonacci sequence F_{n+2} = F_{n+1} + F_n can be expressed as:
[F_{n+1}; F_n] = [1 1; 1 0]ⁿ [1; 0]
Eigenvalues of [1 1; 1 0]:
λ₁ = (1 + √5)/2 ≈ 1.618 (the golden ratio φ)
λ₂ = (1 − √5)/2 ≈ −0.618 (denoted ψ)
Closed form (Binet's formula):
F_n = (φⁿ − ψⁿ)/√5
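A minimal sketch of both approaches, computing F_n from the matrix power and checking it against Binet's formula (object dtype keeps the integer arithmetic exact):

```python
import numpy as np

# Fibonacci via the matrix power [1 1; 1 0]^n
Q = np.array([[1, 1],
              [1, 0]], dtype=object)

def fib(n):
    return np.linalg.matrix_power(Q, n)[0, 1]   # (Q^n)[0, 1] = F_n

# Binet's formula (floating point; exact after rounding for moderate n)
phi = (1 + 5**0.5) / 2
psi = (1 - 5**0.5) / 2

def binet(n):
    return round((phi**n - psi**n) / 5**0.5)

print([fib(n) for n in range(1, 11)])   # [1, 1, 2, 3, 5, 8, 13, 21, 34, 55]
```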
Markov Chains
For a stochastic transition matrix P:
λ = 1 is always an eigenvalue
The stationary distribution π is the eigenvector for λ = 1 (for row-stochastic P, the left eigenvector: πP = π)
Long-run behavior is governed by the λ = 1 eigenvector; the magnitude of the second-largest eigenvalue sets how fast the chain converges to it
Example
P = [0.8 0.2; 0.3 0.7]  (transition probabilities; rows sum to 1)
Stationary: solve πP = π, i.e. (Pᵀ − I)πᵀ = 0, normalizing π to sum to 1
π = [0.6 0.4]  (60% state 1, 40% state 2)
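The stationary distribution can be sketched in NumPy by taking the eigenvector of Pᵀ for eigenvalue 1:

```python
import numpy as np

# Row-stochastic transition matrix from the example above
P = np.array([[0.8, 0.2],
              [0.3, 0.7]])

# The stationary distribution is the left eigenvector of P for lambda = 1,
# i.e. a right eigenvector of P.T
eigvals, eigvecs = np.linalg.eig(P.T)
i = np.argmin(np.abs(eigvals - 1.0))    # locate the lambda = 1 eigenvalue
pi = np.real(eigvecs[:, i])
pi = pi / pi.sum()                      # normalize to a probability vector

print(np.round(pi, 3))                  # [0.6 0.4]
```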
Differential Equations
For dx/dt = Ax:
Solution:
x(t) = c₁e^{λ₁t}v₁ + c₂e^{λ₂t}v₂ + ⋯ + cₙe^{λₙt}vₙ
Each term corresponds to an eigenmode.
Stability Analysis
All eigenvalues have negative real parts → stable (solutions decay)
Any eigenvalue has a positive real part → unstable (solutions grow)
Purely imaginary eigenvalues → sustained oscillation (marginally stable)
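A sketch of the eigenmode solution for a hypothetical stable matrix (eigenvalues −1 and −3), assuming A has a full set of independent eigenvectors:

```python
import numpy as np

# Hypothetical system matrix with eigenvalues -1 and -3 (both negative -> stable)
A = np.array([[-1.0,  2.0],
              [ 0.0, -3.0]])
x0 = np.array([1.0, 1.0])               # initial condition x(0)

eigvals, V = np.linalg.eig(A)
c = np.linalg.solve(V, x0)              # coefficients from x(0) = c1*v1 + c2*v2

def x(t):
    # x(t) = sum_i c_i e^{lambda_i t} v_i
    return np.real(V @ (c * np.exp(eigvals * t)))

print(x(0.0))                           # recovers x0
print(np.linalg.norm(x(10.0)))          # decays toward 0
```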
Principal Component Analysis (PCA)
Given data covariance matrix C:
Eigenvalues = variance in each principal direction
Eigenvectors = principal directions
Sort by eigenvalue for dimensionality reduction
Variance of the data along vᵢ = λᵢ
Total variance = λ₁ + λ₂ + ⋯ + λₙ
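A minimal PCA sketch on synthetic data (the data matrix and its scaling are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
# Illustrative 2-D data: much more variance along the first axis
X = rng.normal(size=(500, 2)) * np.array([3.0, 0.5])
X = X - X.mean(axis=0)                  # center the data

C = np.cov(X, rowvar=False)             # covariance matrix
eigvals, eigvecs = np.linalg.eigh(C)    # eigh: C is symmetric
order = np.argsort(eigvals)[::-1]       # sort by variance, descending
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

X_reduced = X @ eigvecs[:, :1]          # keep only the top principal component

print(np.round(eigvals, 2))             # variances along the principal directions
```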
Vibration Analysis
For mechanical systems:
Eigenvalues → natural frequencies (ω² = λ)
Eigenvectors → mode shapes
Mass–spring system: Mẍ = −Kx
Eigenvalue problem: Kv = ω²Mv
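For a hypothetical two-mass chain with unit masses and unit spring constants, the generalized eigenproblem Kv = ω²Mv can be sketched by solving eig(M⁻¹K):

```python
import numpy as np

# Hypothetical two-mass, three-spring chain (unit masses, unit spring constants)
M = np.eye(2)                           # mass matrix
K = np.array([[ 2.0, -1.0],
              [-1.0,  2.0]])            # stiffness matrix

# Generalized eigenproblem K v = omega^2 M v, solved here as eig(M^{-1} K)
lam, modes = np.linalg.eig(np.linalg.solve(M, K))
omega = np.sort(np.sqrt(lam))           # natural frequencies omega = sqrt(lambda)

print(omega)                            # [1.0, sqrt(3) ~ 1.732]
```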
Image Compression
Singular Value Decomposition (SVD) uses eigenvalues:
A = UΣVᵀ
Keep the largest singular values (σᵢ² are the eigenvalues of AᵀA) for compression.
Google PageRank
Web page importance is given by the dominant eigenvector (λ = 1) of the link matrix.
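A power-iteration sketch on a tiny hypothetical four-page web, using the standard damping factor 0.85:

```python
import numpy as np

# Hypothetical 4-page web: links[i] lists the pages that page i links to
links = {0: [1, 2], 1: [2], 2: [0], 3: [0, 2]}
n = len(links)

# Column-stochastic link matrix: L[j, i] = 1/outdegree(i) when i links to j
L = np.zeros((n, n))
for i, outs in links.items():
    for j in outs:
        L[j, i] = 1.0 / len(outs)

# Damped "Google matrix": guarantees a unique dominant eigenvector with lambda = 1
d = 0.85
G = d * L + (1 - d) / n * np.ones((n, n))

# Power iteration converges to the lambda = 1 eigenvector (the PageRank vector)
r = np.ones(n) / n
for _ in range(100):
    r = G @ r

print(np.round(r, 3))                   # importance scores, summing to 1
```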