Eigenvalues and eigenvectors
Adapted from Wikipedia
In linear algebra, an eigenvector or characteristic vector is a special kind of vector that does not change its direction when a certain mathematical operation, called a linear transformation, is applied to it. Instead, the eigenvector is simply stretched or shrunk by a certain amount. This amount is called the eigenvalue. If the eigenvalue is negative, the eigenvector's direction is flipped to the opposite way.
Think of vectors as arrows that have both size and direction. A linear transformation might twist, stretch, or skew these arrows. However, the eigenvectors are the special arrows that only get longer or shorter without changing which way they point. The eigenvalue tells us exactly how much the eigenvector is stretched or shrunk.
Eigenvectors and eigenvalues help us understand many different areas, from geology to quantum mechanics. They are especially important when a system repeats the same operation many times. In such cases, the largest eigenvalue shows us what will happen to the system after a long time, and the matching eigenvector shows us the stable state that the system will end up in.
Matrices
When we multiply a nonzero vector v by an ordinary number, called a scalar, we get the same vector v stretched or shrunk. If multiplying a square matrix A by v has exactly this effect, that is, the same as multiplying v by some scalar, then v is called an eigenvector of A, and that scalar is called the eigenvalue.
In other words, for a square matrix A and a vector v, if multiplying A by v simply stretches or shrinks v without changing its direction, then v is an eigenvector, and the amount of stretching or shrinking is the eigenvalue. This relationship can be written as A v = λ v, where λ is the eigenvalue.
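As a small check of the relation A v = λ v, here is a sketch in Python using the numpy library; the matrix is a hypothetical example chosen so the numbers come out whole:

```python
import numpy as np

# A hypothetical 2x2 matrix chosen for illustration.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# v = (1, 1) is an eigenvector of A: A @ v points the same way as v.
v = np.array([1.0, 1.0])
Av = A @ v  # gives (3, 3), which is 3 times v

# The eigenvalue is the stretch factor, here 3.
lam = Av[0] / v[0]
print(lam)  # 3.0
```

Multiplying by A stretched v by a factor of 3 without turning it, so 3 is an eigenvalue of this matrix.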
Overview
Eigenvalues and eigenvectors are important ideas in the study of linear transformations. The name comes from the German word eigen, meaning "own" or "characteristic". They are used in many areas, like studying how things spin, how they vibrate, and even in facial recognition systems.
An eigenvector is a special vector that does not change direction when a linear transformation is applied to it. Instead, it is simply stretched or shrunk by a certain amount. This amount is called an eigenvalue. For example, if you have a picture and move parts of it in a certain way, some lines might stay pointing in the same direction but get longer or shorter — these lines are eigenvectors.
History
Eigenvalues and eigenvectors were first studied in the 18th century when mathematicians looked at how objects spin and move. They discovered special directions that objects turn around, called principal axes, which are connected to eigenvectors.
In the 1800s, more mathematicians built on this idea, solving important math problems and creating new ways to understand these special values. By the early 1900s, the German word eigen, meaning "own", was attached to these ideas, giving us the term "eigenvalue" we use today. The first numerical method for computing eigenvalues, the power method, was published in 1929.
Eigenvalues and eigenvectors of matrices
See also: Euclidean vector and Matrix (mathematics)
Eigenvalues and eigenvectors are important ideas in linear algebra, often studied when learning about matrices. Linear transformations, which change vectors in space, can be represented by matrices, making this concept very useful in many areas of math and science.
To understand eigenvectors, think of a vector that keeps its direction when a certain math operation is applied to it — it might just get longer or shorter. This operation is called a linear transformation, and the amount it stretches or shrinks the vector is called an eigenvalue. In simple terms, an eigenvector is a special vector that only changes in length (or direction if the eigenvalue is negative) when a matrix multiplies it. The number by which it is scaled is the eigenvalue.
This idea helps solve many problems in physics, computer graphics, and more, by showing which directions are unchanged by certain transformations.
$$A\mathbf{v} = \lambda \mathbf{v}, \tag{1}$$

$$\left(A - \lambda I\right)\mathbf{v} = \mathbf{0}, \tag{2}$$

$$\det(A - \lambda I) = 0 \tag{3}$$

$$\det(A - \lambda I) = (\lambda_1 - \lambda)(\lambda_2 - \lambda)\cdots(\lambda_n - \lambda), \tag{4}$$
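This chain of equations can be tried out numerically. The sketch below (numpy assumed available; the matrix is an illustrative choice) finds the roots of the characteristic polynomial by hand and compares them with numpy's built-in eigenvalue routine:

```python
import numpy as np

# Illustrative matrix; det(A - λI) = (2-λ)^2 - 1 = λ^2 - 4λ + 3.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Roots of the characteristic polynomial λ^2 - 4λ + 3 = (λ-1)(λ-3).
roots = np.roots([1.0, -4.0, 3.0])

# numpy's eigenvalue routine finds the same values directly.
eigvals = np.linalg.eigvals(A)

print(sorted(roots), sorted(eigvals))  # both contain 1 and 3
```

Both approaches agree: the eigenvalues of this matrix are 1 and 3.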
Eigenvalues and eigenfunctions of differential operators
Main article: Eigenfunction
In linear algebra, eigenvectors and eigenvalues can also apply to special math rules called differential operators, which work on spaces with infinitely many dimensions. These operators act on functions rather than simple vectors.
For example, consider the rule that finds how a function changes at each point, called the derivative operator. If this operator acts on a function and the result is just the same function multiplied by a constant, that function is called an eigenfunction. The constant it is multiplied by is the eigenvalue. The solutions to this are special functions called exponentials, which grow or shrink at a steady rate.
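For instance, f(x) = e^(2x) satisfies f′(x) = 2 f(x), so it is an eigenfunction of the derivative operator with eigenvalue 2. A rough numerical check (numpy assumed; the interval and step size are arbitrary choices):

```python
import numpy as np

# Check numerically that f(x) = e^(2x) is an eigenfunction of d/dx
# with eigenvalue 2: its derivative equals 2 * f everywhere.
x = np.linspace(0.0, 1.0, 1001)
f = np.exp(2.0 * x)

# Finite-difference approximation of the derivative.
df = np.gradient(f, x)

# Away from the endpoints, the ratio df / f stays close to 2,
# which is the eigenvalue.
ratio = df[1:-1] / f[1:-1]
print(ratio.min(), ratio.max())  # both close to 2
```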
General definition
An eigenvector is a special nonzero vector that, when a linear transformation is applied to it, changes only by a scaling factor. This scaling factor is called an eigenvalue. In simple terms, if you apply the transformation to the eigenvector, it either stretches or shrinks but doesn’t change its direction.
For example, if v is an eigenvector and T is the transformation, then applying T to v gives us a new vector that is just the original v multiplied by a number λ. This relationship can be written as T(v) = λv. Here, λ is the eigenvalue that tells us how much the eigenvector is scaled up or down.
$$T(\mathbf{v}) = \lambda \mathbf{v}. \tag{5}$$
Dynamic equations
Difference equations describe how things change over time in discrete steps. The simplest ones define a sequence in which each term depends on the terms before it. To solve them, we form a characteristic equation and find its roots; these roots determine the overall pattern of change.
We can also use a similar idea for differential equations, which describe smooth changes over time. By finding special numbers, we can solve these equations and predict how things will behave in the future.
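As a sketch of the characteristic-equation method: the Fibonacci numbers obey F(n) = F(n-1) + F(n-2), and the roots of x² = x + 1 (the golden ratio and its conjugate) give a closed-form solution that matches direct iteration:

```python
import math

# Roots of the characteristic equation x^2 = x + 1.
phi = (1 + math.sqrt(5)) / 2  # golden ratio
psi = (1 - math.sqrt(5)) / 2  # its conjugate

def fib_closed(n: int) -> int:
    """Closed-form solution built from the characteristic roots
    (Binet's formula); round() removes floating-point error."""
    return round((phi**n - psi**n) / math.sqrt(5))

def fib_recurrence(n: int) -> int:
    """Direct iteration of the difference equation."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

print([fib_closed(n) for n in range(10)])
# [0, 1, 1, 2, 3, 5, 8, 13, 21, 34], same as the recurrence
```

The larger root phi dominates as n grows, which is exactly the "largest eigenvalue decides the long-run behavior" idea from earlier.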
Calculation
Main article: Eigenvalue algorithm
Finding eigenvalues and eigenvectors can be tricky: the clean formulas of the theory are quite different from the numerical methods used in practice.
Classical method
First, we find the eigenvalues, then we use them to find eigenvectors. This method works well for small matrices but becomes difficult for larger ones.
Eigenvalues
For a matrix, eigenvalues are found by solving its characteristic polynomial. This is easy for small matrices but gets harder as they grow. Exact algebraic formulas exist for the roots of polynomials of degree four or less, but by the Abel–Ruffini theorem there is no such general formula for degree five or higher, so for larger matrices we must use numerical computer methods.
Eigenvectors
Once we know an eigenvalue λ, we can find its eigenvectors by solving the linear system (A − λI)v = 0. For example, if we know that 6 is an eigenvalue of a matrix A, the eigenvectors for 6 are all the nonzero solutions of (A − 6I)v = 0.
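A minimal sketch of this step in Python (numpy assumed; the matrix is a hypothetical example built to have 6 as an eigenvalue):

```python
import numpy as np

# Hypothetical matrix with eigenvalues 3 and 6 (trace 9, determinant 18).
A = np.array([[5.0, 1.0],
              [2.0, 4.0]])
lam = 6.0

# Eigenvectors for λ = 6 are the nonzero solutions of (A - 6I) v = 0,
# i.e. the null space of A - 6I.
M = A - lam * np.eye(2)  # [[-1, 1], [2, -2]]: the rows are proportional

# From the first row, -v1 + v2 = 0, so v = (1, 1) up to scaling.
v = np.array([1.0, 1.0])

print(A @ v)  # equals 6 * v, confirming v is an eigenvector
```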
Simple iterative methods
Main article: Power iteration
Another way is to start with a random vector and repeatedly multiply it by the matrix. Over time, this vector gets closer and closer to an eigenvector. There are also tricks to make this process converge faster.
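A minimal power-iteration sketch (numpy assumed; the matrix, starting seed, and step count are illustrative choices):

```python
import numpy as np

def power_iteration(A, num_steps=100):
    """Repeatedly multiply a vector by A and renormalize; for most
    starting vectors it converges to an eigenvector of the eigenvalue
    with the largest magnitude."""
    rng = np.random.default_rng(0)
    v = rng.standard_normal(A.shape[0])
    for _ in range(num_steps):
        v = A @ v
        v = v / np.linalg.norm(v)
    # The Rayleigh quotient estimates the corresponding eigenvalue.
    lam = v @ A @ v
    return lam, v

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
lam, v = power_iteration(A)
print(lam)  # close to 3, the dominant eigenvalue of this matrix
```

Renormalizing at every step keeps the numbers from overflowing; the direction of v is what converges.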
Modern methods
Good computer methods for this problem weren't developed until 1961 with the QR algorithm. Since then, scientists have created even better methods, especially for very large matrices that have many zeros. These modern methods usually find both eigenvalues and eigenvectors together.
Applications
Eigenvectors and eigenvalues help us understand how linear transformations affect geometric shapes. For example, they can show how a square becomes a rectangle while keeping the same area. They also help in principal component analysis, a method used to simplify large sets of data, such as those in bioinformatics.
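A sketch of the eigenvector step behind principal component analysis, run on synthetic data (numpy assumed; the data shape, scaling, and rotation are arbitrary choices for this example):

```python
import numpy as np

# Synthetic 2-D data stretched mostly along the (1, 1) direction.
rng = np.random.default_rng(42)
base = rng.standard_normal((500, 2))
X = base @ np.array([[3.0, 0.0], [0.0, 0.5]])          # unequal spread
X = X @ (np.array([[1.0, 1.0], [-1.0, 1.0]]) / np.sqrt(2))  # rotate 45°

# The eigenvectors of the covariance matrix give the directions of
# greatest variance in the data.
cov = np.cov(X, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues in ascending order

# The eigenvector of the largest eigenvalue is the first principal
# component; here it should point roughly along (1, 1) / sqrt(2).
pc1 = eigvecs[:, -1]
print(pc1)
```

Keeping only the components with the largest eigenvalues is how PCA compresses high-dimensional data.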
In graph theory, eigenvalues help measure the importance of nodes, as in Google’s PageRank algorithm. They also play a role in studying vibrations, where they represent natural frequencies and movement patterns of structures. Eigenvectors and eigenvalues are also important in quantum mechanics, describing particle states and energy levels.
| | Scaling | Unequal scaling | Rotation | Horizontal shear | Hyperbolic rotation |
|---|---|---|---|---|---|
| Matrix | $\begin{bmatrix}k&0\\0&k\end{bmatrix}$ | $\begin{bmatrix}k_1&0\\0&k_2\end{bmatrix}$ | $\begin{bmatrix}\cos\theta&-\sin\theta\\\sin\theta&\cos\theta\end{bmatrix}$ | $\begin{bmatrix}1&k\\0&1\end{bmatrix}$ | $\begin{bmatrix}\cosh\varphi&\sinh\varphi\\\sinh\varphi&\cosh\varphi\end{bmatrix}$ |
| Characteristic polynomial | $(\lambda-k)^2$ | $(\lambda-k_1)(\lambda-k_2)$ | $\lambda^2-2\cos(\theta)\,\lambda+1$ | $(\lambda-1)^2$ | $\lambda^2-2\cosh(\varphi)\,\lambda+1$ |
| Eigenvalues, $\lambda_i$ | $\lambda_1=\lambda_2=k$ | $\lambda_1=k_1,\ \lambda_2=k_2$ | $\lambda_{1,2}=e^{\pm i\theta}=\cos\theta\pm i\sin\theta$ | $\lambda_1=\lambda_2=1$ | $\lambda_{1,2}=e^{\pm\varphi}=\cosh\varphi\pm\sinh\varphi$ |
| Algebraic mult., $\mu_i=\mu(\lambda_i)$ | $\mu_1=2$ | $\mu_1=\mu_2=1$ | $\mu_1=\mu_2=1$ | $\mu_1=2$ | $\mu_1=\mu_2=1$ |
| Geometric mult., $\gamma_i=\gamma(\lambda_i)$ | $\gamma_1=2$ | $\gamma_1=\gamma_2=1$ | $\gamma_1=\gamma_2=1$ | $\gamma_1=1$ | $\gamma_1=\gamma_2=1$ |
| Eigenvectors | All nonzero vectors | $\mathbf{u}_1=\begin{bmatrix}1\\0\end{bmatrix},\ \mathbf{u}_2=\begin{bmatrix}0\\1\end{bmatrix}$ | $\mathbf{u}_1=\begin{bmatrix}1\\-i\end{bmatrix},\ \mathbf{u}_2=\begin{bmatrix}1\\i\end{bmatrix}$ | $\mathbf{u}_1=\begin{bmatrix}1\\0\end{bmatrix}$ | $\mathbf{u}_1=\begin{bmatrix}1\\1\end{bmatrix},\ \mathbf{u}_2=\begin{bmatrix}1\\-1\end{bmatrix}$ |
This article is a child-friendly adaptation of the Wikipedia article on Eigenvalues and eigenvectors, available under CC BY-SA 4.0.