An eigenvalue is a scalar that is found together with its eigenvectors. In linear algebra, eigenvalues and eigenvectors are mainly used in the analysis of linear transformations. Eigenvalues are usually obtained from a matrix, or from a linear system of equations written as a matrix equation. Eigenvalues are also known as “characteristic roots”, and eigenvectors as “characteristic vectors”.
Definition Of Eigenvalue
An eigenvalue is a special scalar associated with a system of linear equations, and it is usually obtained from a matrix. The German word “eigen” means “own” or “characteristic”. The eigenvalue is the scalar factor by which the corresponding eigenvector is scaled under the transformation. It is defined by the following equation.
AX = λX
Here λ is a scalar value: an eigenvalue of A, and X is the corresponding eigenvector. If an eigenvalue is negative, the transformation reverses the direction of the eigenvector. Every square matrix has eigenvalues, although they may be complex.
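As a minimal sketch of this definition, the eigenvalues and eigenvectors of a small matrix can be computed with NumPy; the matrix A below is only an illustrative choice, not one taken from the text.

```python
import numpy as np

# An illustrative 2x2 matrix (arbitrary example choice).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.linalg.eig returns the eigenvalues and a matrix whose
# columns are the corresponding eigenvectors.
eigenvalues, eigenvectors = np.linalg.eig(A)
print("eigenvalues:", eigenvalues)       # e.g. [3. 1.]
print("eigenvectors (columns):")
print(eigenvectors)
```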
Definition Of Eigenvector
An eigenvector is a non-zero vector whose direction does not change under the linear transformation: it is only scaled, and the scaling factor is the eigenvalue.
The eigenspace of an eigenvalue λ is the set of all eigenvectors associated with λ, together with the zero vector. If A is a square matrix of order n×n, λ is an eigenvalue of A, and X is a non-zero eigenvector, then they satisfy the relation
AX = λX.
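This relation can be verified numerically for every eigenpair that NumPy returns; the matrix A here is again an arbitrary example choice.

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)

# For each eigenpair (lam, x), A @ x should equal lam * x:
# the direction of x is unchanged and only its length is scaled.
for lam, x in zip(eigenvalues, eigenvectors.T):
    print(np.allclose(A @ x, lam * x))   # True for every pair
```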
Eigenvalues In A Square Matrix
If A is a square matrix of order n×n and I is the identity matrix of the same order, then A−λI is called the characteristic matrix of A, where λ is an unknown scalar. The determinant of this matrix is the expression |A−λI|, and setting it to zero, |A−λI| = 0, gives the characteristic equation (or eigen equation) of the matrix. The roots of the characteristic equation are the eigenvalues, also called the characteristic roots.
The eigenvalues of a triangular or diagonal matrix are simply its principal diagonal entries, and every eigenvalue of a scalar matrix kI is equal to the scalar k.
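As a rough illustration, NumPy's np.poly returns the coefficients of the characteristic polynomial of a square matrix, and a triangular example shows the eigenvalues sitting on the diagonal; both matrices below are arbitrary choices.

```python
import numpy as np

# Characteristic polynomial of A: np.poly(A) gives the coefficients
# of det(lambda*I - A), highest degree first.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
print(np.poly(A))            # [ 1. -4.  3.]  ->  lambda^2 - 4*lambda + 3
print(np.roots(np.poly(A)))  # its roots are the eigenvalues, e.g. [3. 1.]

# For a triangular matrix, the eigenvalues are the diagonal entries
# (the two printed arrays contain the same values, order may differ).
T = np.array([[5.0, 7.0, 1.0],
              [0.0, 2.0, 4.0],
              [0.0, 0.0, 3.0]])
print(np.linalg.eigvals(T))
print(np.diag(T))
```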
Difference Between Eigenvalues And Eigenvectors
Eigenvectors give the directions along which a linear transformation acts by flipping, stretching or compressing.
Eigenvalues give the strength of the transformation in the direction of the corresponding eigenvector, i.e. the factor by which it stretches or compresses.
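A small sketch of this difference, using an arbitrarily chosen diagonal matrix: an eigenvector keeps its direction and is only scaled by the eigenvalue, while a generic vector changes direction.

```python
import numpy as np

# Stretch by 3 along the x-axis and compress by 0.5 along the y-axis.
A = np.array([[3.0, 0.0],
              [0.0, 0.5]])

eigvec = np.array([1.0, 0.0])   # eigenvector of A (eigenvalue 3)
other  = np.array([1.0, 1.0])   # not an eigenvector

print(A @ eigvec)   # [3. 0.]   -- same direction, stretched by the eigenvalue 3
print(A @ other)    # [3.  0.5] -- no longer parallel to [1, 1]
```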
Properties Of Eigenvalues
- Eigenvalues of real symmetric matrices are always real.
- The eigenvalues of real skew-symmetric matrices are either purely imaginary or zero.
- The eigenvalues of unitary and orthogonal matrices always satisfy |λ| = 1.
- If A and B are two square matrices of the same order (in rows and columns), then AB and BA have the same eigenvalues.
- The sum of the eigenvalues of A is always equal to trace(A). (A few of these properties are checked numerically in the sketch after this list.)
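Here is a minimal NumPy sketch that spot-checks three of these properties; the random matrices and their size are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Real symmetric matrix -> real eigenvalues.
S = rng.standard_normal((4, 4))
S = S + S.T
print(np.allclose(np.linalg.eigvals(S).imag, 0))             # True

# Sum of eigenvalues equals the trace.
A = rng.standard_normal((4, 4))
print(np.isclose(np.linalg.eigvals(A).sum(), np.trace(A)))   # True

# AB and BA have the same eigenvalues when A and B have the same order.
B = rng.standard_normal((4, 4))
print(np.allclose(np.sort_complex(np.linalg.eigvals(A @ B)),
                  np.sort_complex(np.linalg.eigvals(B @ A))))  # True
```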
Properties Of Eigenvectors
- Eigenvectors that correspond to distinct eigenvalues are always linearly independent.
- The zero matrix, and more generally any singular matrix, always has zero as an eigenvalue.
- If A is a non-singular (invertible) square matrix, then λ = 0 cannot be an eigenvalue of A.
- If λ is an eigenvalue of a square matrix A, then kλ is an eigenvalue of kA.
- If A is a square matrix of order n×n and λ is an eigenvalue of A, then for every integer k ≥ 0, λ^k is an eigenvalue of A^k.
- If A is a square matrix of order n×n, λ is an eigenvalue of A and p(x) is a polynomial, then p(λ) is an eigenvalue of the matrix p(A).
- If A is an invertible square matrix of order n×n and λ is an eigenvalue of A, then 1/λ is an eigenvalue of A⁻¹.
- If A is a square matrix of order n×n and λ is an eigenvalue of A, then λ is also an eigenvalue of the transpose of A. (The sketch after this list spot-checks several of these properties.)
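A small NumPy sketch that spot-checks several of these properties on an arbitrarily chosen matrix with real eigenvalues (5 and 2).

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
eigvals = np.linalg.eigvals

# k*lambda are the eigenvalues of k*A (here k = 3).
print(np.allclose(np.sort(eigvals(3 * A)), np.sort(3 * eigvals(A))))      # True

# lambda^k are the eigenvalues of A^k (here k = 3).
print(np.allclose(np.sort(eigvals(np.linalg.matrix_power(A, 3))),
                  np.sort(eigvals(A) ** 3)))                              # True

# 1/lambda are the eigenvalues of the inverse of A.
print(np.allclose(np.sort(eigvals(np.linalg.inv(A))),
                  np.sort(1 / eigvals(A))))                               # True

# A and its transpose share the same eigenvalues.
print(np.allclose(np.sort(eigvals(A.T)), np.sort(eigvals(A))))            # True
```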
Conclusions
Eigenvalues and eigenvectors thus play an important role in many practical applications and computational processes, so the procedure for finding them is an important one.