An eigenvalue is a scalar that is found from a matrix together with its eigenvectors. In linear algebra, eigenvalues and eigenvectors are mainly used in the analysis of linear transformations. Eigenvalues are usually obtained from a matrix through a set of linear equations, i.e. a matrix equation. Eigenvalues are also known as “characteristic roots”.
Definition Of Eigenvalue
An eigenvalue is a special type of scalar associated with a system of linear equations, usually represented in matrix form. The word “eigen” in German means “characteristic” or “proper”. An eigenvalue is the scalar by which the corresponding eigenvector is scaled under the transformation. It is defined by the following equation.
AX = λX
The λ used in the equation is a scalar value, the eigenvalue of A corresponding to the eigenvector X. If an eigenvalue is negative, the transformation reverses the direction of its eigenvector. Every square matrix of order n has n eigenvalues, counted with multiplicity, although some of them may be complex.
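As a minimal sketch (assuming NumPy is available; the matrix below is purely illustrative and not taken from this article), the eigenvalues of a matrix can be computed as follows.

```python
import numpy as np

# A small symmetric matrix, chosen purely for illustration.
A = np.array([[4.0, 1.0],
              [1.0, 4.0]])

# np.linalg.eig returns the scalars λ and vectors X satisfying AX = λX.
eigenvalues, eigenvectors = np.linalg.eig(A)
print(eigenvalues)   # approximately [5. 3.] (order may vary)
```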
Definition Of Eigenvector
Eigenvectors are the non-zero vectors whose direction does not change under a linear transformation; the transformation only scales them by the corresponding eigenvalue.
The eigenspace of an eigenvalue is the set of all eigenvectors associated with that eigenvalue, together with the zero vector. If A is a square matrix of order n×n, λ is an eigenvalue of A, and X is a non-zero vector that is an eigenvector for λ, then they satisfy the relation
AX = λX.
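The relation AX = λX can be checked numerically. The sketch below (again assuming NumPy, with an illustrative matrix) verifies it for every eigenpair returned by the solver.

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 1.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)

# Each column of `eigenvectors` is an eigenvector X for the eigenvalue
# in the same position, so A @ X equals λ * X up to rounding error.
for lam, X in zip(eigenvalues, eigenvectors.T):
    print(np.allclose(A @ X, lam * X))   # True for every pair
```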
Eigenvalue Decomposition
It is the process of factorising a matrix into its canonical form; only diagonalizable matrices can be factored in this way.
If A is a diagonalizable square matrix of order n×n, then A can be diagonalized using the formula
D = B⁻¹AB
where D is the diagonal matrix whose entries are the eigenvalues of A, and B is the matrix whose columns are the corresponding eigenvectors.
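A short sketch of this factorisation, assuming NumPy and an illustrative symmetric (hence diagonalizable) matrix: B is built from the eigenvectors and B⁻¹AB is checked to be diagonal.

```python
import numpy as np

A = np.array([[6.0, 2.0],
              [2.0, 3.0]])

# The columns of B are the eigenvectors of A.
eigenvalues, B = np.linalg.eig(A)

# D = B⁻¹AB is (numerically) diagonal, with the eigenvalues of A
# on its diagonal.
D = np.linalg.inv(B) @ A @ B
print(np.round(D, 10))
print(eigenvalues)
```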
Singular Value Decomposition
The singular value decomposition (SVD) factorises a matrix A into three matrices U, Σ and V. The columns of U are eigenvectors of AAᵀ, and the columns of V are eigenvectors of AᵀA. Σ is a diagonal matrix whose entries, the singular values, are the square roots of the non-zero eigenvalues of AᵀA. Thus the singular value decomposition can be represented by the formula
A = UΣVᵀ
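The following sketch (assuming NumPy; the rectangular matrix is illustrative) computes the SVD and checks that the singular values are the square roots of the eigenvalues of AᵀA.

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 3.0],
              [0.0, 2.0]])

# full_matrices=False gives the "economy" SVD, so A = U @ diag(s) @ Vt.
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# The singular values are the square roots of the eigenvalues of AᵀA.
eigvals_AtA = np.linalg.eigvalsh(A.T @ A)        # ascending order
print(np.sort(s**2), eigvals_AtA)                # the same values
print(np.allclose(A, U @ np.diag(s) @ Vt))       # True
```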
Signature In Eigenvalue Decomposition
Signature = number of positive terms − number of negative terms
S = 2p − r
where p, the index, is the number of positive terms in the canonical form and r is the rank, i.e. the total number of non-zero terms.
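As an illustration (NumPy assumed; the diagonal matrix below is a made-up example of a quadratic form), the index, rank and signature can be read off the eigenvalues.

```python
import numpy as np

# Symmetric matrix of a quadratic form, chosen for illustration.
A = np.array([[2.0,  0.0, 0.0],
              [0.0, -1.0, 0.0],
              [0.0,  0.0, 3.0]])

eigenvalues = np.linalg.eigvalsh(A)

p = int(np.sum(eigenvalues > 0))           # index: number of positive terms
r = int(np.sum(np.abs(eigenvalues) > 0))   # rank: number of non-zero terms
signature = 2 * p - r                      # equals (#positive) - (#negative)
print(p, r, signature)                     # 2 3 1
```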
Nature of the matrix based on the signs of its eigenvalues:
Positive-definite
The nature of the matrix is said to be positive definite if all of its eigenvalues are positive.
Eg: eigenvalues = 1, 3, 6.
Negative definite
The nature of the matrix is said to be negative definite if all of its eigenvalues are negative.
Eg: eigenvalues = -1, -3, -6.
Positive semidefinite
The nature of the matrix is said to be positive semi-definite if all of its eigenvalues are non-negative and at least one of them is zero.
Eg: eigenvalues = 1, 0, 5.
Negative semidefinite
The nature of the matrix is said to be negative semi-definite if all of its eigenvalues are non-positive and at least one of them is zero.
Eg: eigenvalues = -1, 0, -3.
Indefinite
The nature of a matrix is said to be indefinite if it has both positive and negative eigenvalues.
Eg: eigenvalues = 1, -1, -9.
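These cases can be summarised in a small helper. The sketch below (a hypothetical function named classify, assuming NumPy) decides the nature of a matrix from the signs of its eigenvalues, using the example sets of eigenvalues given above.

```python
import numpy as np

def classify(eigenvalues, tol=1e-10):
    """Classify a symmetric matrix by the signs of its eigenvalues."""
    pos = np.any(eigenvalues > tol)
    neg = np.any(eigenvalues < -tol)
    zero = np.any(np.abs(eigenvalues) <= tol)
    if pos and neg:
        return "indefinite"
    if pos:
        return "positive semidefinite" if zero else "positive definite"
    if neg:
        return "negative semidefinite" if zero else "negative definite"
    return "zero matrix"

print(classify(np.array([1.0, 3.0, 6.0])))    # positive definite
print(classify(np.array([1.0, 0.0, 5.0])))    # positive semidefinite
print(classify(np.array([1.0, -1.0, -9.0])))  # indefinite
```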
Example Of Eigenvalue Decomposition
Let’s take a 2×2 matrix. Let the matrix be named A.
The elements of the matrix are
a(1,1) = 2, a(1,2) = 1,
a(2,1) = 1, a(2,2) = 2.
Solving the characteristic equation det(A − λI) = 0 for this matrix gives the eigenvalues 1 and 3.
At eigenvalue λ = 1, an eigenvector of the matrix A is (1, -1).
At eigenvalue λ = 3, an eigenvector of matrix A is (1, 1).
Thus, by the method of diagonalization,
A = PDP⁻¹
where P is the matrix whose columns are the eigenvectors listed above and D is the diagonal matrix of the corresponding eigenvalues.
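A quick numerical check of this worked example (NumPy assumed), using the matrix, eigenvalues and eigenvectors found above.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Eigenvectors from the worked example: (1, -1) for λ = 1 and (1, 1) for λ = 3.
P = np.array([[ 1.0, 1.0],
              [-1.0, 1.0]])
D = np.diag([1.0, 3.0])

# A should equal P D P⁻¹.
print(np.allclose(A, P @ D @ np.linalg.inv(P)))   # True
```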
Conclusion
In this article, we have seen how to find the eigenvalues and eigenvectors of a matrix and how to use them to decompose the matrix.