When a matrix transformation is applied to a vector, an eigenvector is a non-zero vector whose direction is left unchanged: it is only stretched or shrunk, and the stretching factor is the eigenvalue. In other words, if A is a square matrix of order n x n and v is a non-zero column vector of order n x 1 such that Av = λv (which indicates that the product of A and v is just a scalar multiple of v), then the scalar λ corresponding to the eigenvector v is called an eigenvalue of the matrix A.
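To see the definition in action, here is a minimal Python sketch (using NumPy and a small example matrix chosen purely for illustration) that checks Av = λv for one particular vector:

```python
import numpy as np

# Illustrative matrix; v = [3, 1] happens to be an eigenvector of it.
A = np.array([[5.0, 3.0],
              [1.0, 3.0]])
v = np.array([3.0, 1.0])

Av = A @ v
print(Av)                      # [18.  6.], which is exactly 6 * v
print(np.allclose(Av, 6 * v))  # True: Av = 6v, so 6 is an eigenvalue of A and v an eigenvector
```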
The word “eigen” comes from the German language and means “own,” “proper,” or “characteristic.” Eigenvalues are also referred to as “characteristic values,” “characteristic roots,” “proper values,” and so on.
How to Find Eigenvalues?
Here are the steps to find the eigenvalue of a matrix:
If λ is an eigenvalue of a square matrix A, then according to the concept of eigenvalues,
Av = λv
If I is the identity matrix of the same order as A, then the above equation can be written as
Av = λ (Iv) (because v = Iv)
Av – λ (Iv) = 0
Taking v out as a common factor,
(A – λI) v = 0
Since v is a non-zero column vector, this equation can hold only if A – λI is a singular (non-invertible) matrix.
A singular matrix has determinant zero,
i.e., |A – λI| = 0
This equation is also known as the characteristic equation (where |A – λI| is called the characteristic polynomial), and by solving this for λ, we get the eigenvalues. Here is the step-by-step process used to find the eigenvalues of a square matrix A.
- Take the identity matrix I, whose order is the same as A.
- Multiply every element of I by λ to get λI.
- Subtract λI from A to get A – λI.
- Find its determinant.
- Set the determinant to zero and solve for λ.
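As a concrete sketch of these steps, the snippet below (Python with NumPy, reusing the illustrative matrix from above) builds the characteristic polynomial det(A – λI) and solves it for λ; for this matrix the characteristic equation works out to λ^2 – 8λ + 12 = 0, with roots 6 and 2.

```python
import numpy as np

A = np.array([[5.0, 3.0],
              [1.0, 3.0]])

# Steps 1-4: np.poly returns the coefficients of the characteristic polynomial |A - lambda*I|.
coeffs = np.poly(A)             # [1., -8., 12.]  ->  lambda^2 - 8*lambda + 12
# Step 5: set the polynomial to zero and solve for lambda.
eigenvalues = np.roots(coeffs)  # roots 6 and 2

print("characteristic polynomial coefficients:", coeffs)
print("eigenvalues:", eigenvalues)

# Cross-check against NumPy's direct eigenvalue routine.
print("eigenvalues via np.linalg.eigvals:", np.linalg.eigvals(A))
```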
Properties of Eigenvalues
Now that we know the steps to find the eigenvalues of a matrix, let us discuss their properties in detail!
- A square matrix of order n has at most n distinct eigenvalues.
- Every eigenvalue of an identity matrix is equal to 1.
- The eigenvalues of triangular matrices and diagonal matrices are the elements of the main diagonal.
- The sum of matrix A’s eigenvalues equals the sum of its diagonal elements.
- The determinant of matrix A is equal to the product of its eigenvalues.
- Hermitian and symmetric matrices have real eigenvalues.
- The eigenvalues of skew-Hermitian and skew-symmetric matrices are either zero or purely imaginary.
- The eigenvalues of a matrix and its transpose are the same.
- If A and B are two square matrices of the same order, then the eigenvalues of AB and BA are the same.
- The eigenvalues of an orthogonal matrix have absolute value 1; its real eigenvalues can only be 1 or -1.
- If λ is an eigenvalue of A, then kλ is an eigenvalue of kA, where k is any scalar.
- If λ is an eigenvalue of A, then λ^k is an eigenvalue of A^k.
- If λ is an eigenvalue of A, then 1/λ is an eigenvalue of A^-1 (if the inverse of A exists).
- If λ is an eigenvalue of A, then |A| / λ is an eigenvalue of the adjoint of A.
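Several of these properties are easy to verify numerically. The sketch below (Python with NumPy, again using the illustrative matrix from earlier, whose eigenvalues are 6 and 2) checks the trace, determinant, transpose, power, and inverse properties:

```python
import numpy as np

A = np.array([[5.0, 3.0],
              [1.0, 3.0]])
lam = np.linalg.eigvals(A)   # eigenvalues of A: 6 and 2

# Sum of eigenvalues equals the trace (sum of the diagonal elements).
print(np.isclose(lam.sum(), np.trace(A)))                             # True

# Product of eigenvalues equals the determinant.
print(np.isclose(lam.prod(), np.linalg.det(A)))                       # True

# A and its transpose have the same eigenvalues.
print(np.allclose(np.sort(lam), np.sort(np.linalg.eigvals(A.T))))     # True

# Eigenvalues of A^k are lambda^k (here k = 3).
A3 = np.linalg.matrix_power(A, 3)
print(np.allclose(np.sort(lam**3), np.sort(np.linalg.eigvals(A3))))   # True

# Eigenvalues of the inverse are 1/lambda (A is invertible here, since its determinant is 12).
A_inv = np.linalg.inv(A)
print(np.allclose(np.sort(1 / lam), np.sort(np.linalg.eigvals(A_inv))))  # True
```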
Aside from these facts, there is a theorem about eigenvalues known as the “Cayley-Hamilton Theorem”: every square matrix satisfies its own characteristic equation. That is, if A is a square matrix, substituting A for λ in its characteristic polynomial |A – λI| (with the constant term multiplied by the identity matrix) gives the zero matrix. For example, if the characteristic equation of a square matrix A is λ^2 – 8λ + 12 = 0, then A^2 – 8A + 12I = 0.
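The theorem is just as easy to check numerically. Below is a small sketch (Python with NumPy, for the same illustrative matrix, whose characteristic equation is λ^2 – 8λ + 12 = 0) confirming that substituting A into its characteristic polynomial gives the zero matrix:

```python
import numpy as np

A = np.array([[5.0, 3.0],
              [1.0, 3.0]])
I = np.eye(2)

# Cayley-Hamilton: A satisfies its own characteristic equation,
# so A^2 - 8A + 12I should be the zero matrix.
residual = A @ A - 8 * A + 12 * I
print(residual)
print(np.allclose(residual, np.zeros((2, 2))))   # True
```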
Applications of Eigenvalues
- Eigenvalues are employed in various fields, including electric circuits, quantum physics, and control theory.
- Car stereo systems are designed with them in mind.
- They are also used in bridge design.
- It should come as no surprise that eigenvalues are used to compute Google’s PageRank.
- Geometric transformations rely on them.
Conclusion
As stated at the outset of this piece, eigenvalues and eigenvectors are often used in methodologies and circumstances dealing with system evolution and differential equations. Principal component analysis, factor analysis, and cluster analysis are all methodologies that rely on eigenvalues and eigenvectors, which makes knowing how to find the eigenvalues of a matrix a genuinely useful skill.