Matrices have a long history of application in solving systems of linear algebraic equations. Before the 1800s, a matrix was simply called an array. The term 'matrix' was coined by James Joseph Sylvester in 1850. He understood a matrix as an object giving rise to several determinants, called minors, which are the determinants of the smaller matrices obtained by deleting some of its rows and columns.
The elements are often numbers but can be any mathematical objects that can be added and multiplied. A matrix with m rows and n columns is called an m × n matrix, where m and n are the matrix dimensions.
What is a matrix in algebra?
A matrix is a set of numbers arranged in rows and columns to form a rectangular or square array. The numbers are called the elements of the matrix.
For example, matrix A below has two rows and three columns:
A = | 2 5 8 |
    | 4 7 1 |
This is a 2 × 3 matrix. A matrix with n rows and n columns is called a square matrix of order n, while a matrix with m rows and n columns, where m ≠ n, is called a rectangular matrix. An ordinary number can be regarded as a 1 × 1 matrix.
In general, aij denotes the element in the ith row and jth column of matrix A. If A is the 2 × 3 matrix shown above, then a11 = 2, a12 = 5, a13 = 8, a21 = 4, a22 = 7, and a23 = 1.
Operations of matrices
The basic operations are
Addition of matrices
Subtraction of matrices
Scalar multiplication of matrices
Multiplication of matrices
Addition of matrices
If A = [aij]m×n and B = [bij]m×n are two matrices of the same order, their sum A + B is the matrix obtained by adding the corresponding elements,
i.e. A + B = [aij + bij]m×n
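As a minimal sketch (assuming NumPy is available; matrix B is made up to match the 2 × 3 example above), the element-wise definition can be checked directly:

```python
import numpy as np

# Two matrices of the same order (2 x 3); A is the example from the text, B is arbitrary
A = np.array([[2, 5, 8],
              [4, 7, 1]])
B = np.array([[1, 0, 3],
              [2, 2, 2]])

# Element-wise sum, following A + B = [aij + bij]
print(A + B)        # [[ 3  5 11]
                    #  [ 6  9  3]]
```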
Properties of addition of matrices
If A, B, and C are the matrices of the same order, then
(a) Commutative Law: A + B = B + A
(b) Associative Law: (A + B) + C = A + (B + C)
(c) Identity of the matrix: A + O = O + A = A,
where O is zero matrix which is the additive identity of the matrix.
(d) Additive inverse: A + (-A) = 0 = (-A) + A,
where (-A) is obtained by changing the sign of every element of A, which is the additive inverse of the matrix.
(e) Cancellation Law: if A + B = A + C, then B = C; similarly, if B + A = C + A, then B = C.
(f) tr(A ± B) = tr(A) ± tr(B), where tr denotes the trace of a square matrix (the sum of its diagonal elements).
(g) If A + B = O = B + A, then B is called the additive inverse of A, and A is called the additive inverse of B.
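These properties are easy to verify numerically. A brief sketch (assuming NumPy, with arbitrary square matrices) checks the commutative law and the trace rule:

```python
import numpy as np

A = np.array([[1, 2], [3, 4]])
B = np.array([[5, 6], [7, 8]])

# (a) Commutative law: A + B equals B + A
print(np.array_equal(A + B, B + A))                    # True

# (f) Trace rule: tr(A + B) = tr(A) + tr(B)
print(np.trace(A + B) == np.trace(A) + np.trace(B))    # True
```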
Subtraction of matrices
If A and B are two matrices of the same order, then we define A−B= A + (−B).
For instance, for two 2 × 2 matrices A and B, each element of B is subtracted from the corresponding element of A. In general, A − B = [aij − bij]m×n.
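A minimal sketch (assuming NumPy; the 2 × 2 matrices are arbitrary) shows that subtraction is element-wise and agrees with A + (−B):

```python
import numpy as np

A = np.array([[9, 4], [6, 2]])
B = np.array([[3, 1], [5, 7]])

# Element-wise difference, following A - B = [aij - bij]
print(A - B)                              # [[ 6  3]
                                          #  [ 1 -5]]
print(np.array_equal(A - B, A + (-B)))    # True: A - B = A + (-B)
```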
Scalar multiplication of matrices
If A = [aij]m×n is a given matrix and k is any number, then the matrix obtained by multiplying every element of A by k is called the scalar multiple of A by k, and it is denoted by:
k Am×n = Am×n k = [ k aij ]m×n
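A minimal sketch (assuming NumPy; the matrix and scalar are arbitrary) shows that scalar multiplication multiplies every element:

```python
import numpy as np

A = np.array([[2, 5, 8],
              [4, 7, 1]])
k = 3

# kA multiplies each element of A by k, following [k aij]
print(k * A)                         # [[ 6 15 24]
                                     #  [12 21  3]]
print(np.array_equal(k * A, A * k))  # True: kA = Ak
```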
Properties of scalar multiplication of matrices
If A and B are two matrices of the same order and λ and μ are any two scalars, then:
(a) λ (A + B) = λ A + λ B
(b) (λ + μ) A = λ A + μ A
(c) λ (μ A) = (λ μ) A = μ (λ A)
(d) (−λ) A = − (λ A) = λ (−A)
(e) tr (kA)=k tr(A)
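These identities can also be checked numerically; a brief sketch (assuming NumPy, with arbitrary matrices and scalars) verifies (a), (b), and (e):

```python
import numpy as np

A = np.array([[1, 2], [3, 4]])
B = np.array([[5, 6], [7, 8]])
lam, mu = 2, 5

print(np.array_equal(lam * (A + B), lam * A + lam * B))   # (a) True
print(np.array_equal((lam + mu) * A, lam * A + mu * A))   # (b) True
print(np.trace(lam * A) == lam * np.trace(A))             # (e) True
```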
Multiplication of matrices
If A and B are two matrices such that the number of columns of A equals the number of rows of B, then their product AB is defined.
If A = [aij]m×n and B = [bij]n×p, then their product AB = C = [cij]m×p is a matrix of order m × p, where (AB)ij = cij = Σ (r = 1 to n) air brj, i.e. cij = ai1 b1j + ai2 b2j + … + ain bnj.
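Under this definition the product can be computed with three nested loops. A minimal sketch (assuming NumPy; the matrices are arbitrary, with A of order 2 × 3 and B of order 3 × 2) checks the loop version against NumPy's built-in product:

```python
import numpy as np

A = np.array([[2, 5, 8],
              [4, 7, 1]])
B = np.array([[1, 0],
              [2, 3],
              [4, 5]])

m, n = A.shape          # A is m x n
p = B.shape[1]          # B is n x p, so C = AB is m x p
C = np.zeros((m, p), dtype=int)

# cij = sum over r of air * brj, following the definition above
for i in range(m):
    for j in range(p):
        for r in range(n):
            C[i, j] += A[i, r] * B[r, j]

print(C)                            # [[44 55]
                                    #  [22 26]]
print(np.array_equal(C, A @ B))     # True: matches NumPy's matrix product
```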
Properties of matrix multiplication
(a) Matrix multiplication is not commutative, i.e. in general AB≠BA.
(b) Matrix multiplication is associative, i.e. (AB) C = A (BC).
(c) Matrix multiplication is distributive over matrix addition, i.e.,
A(B + C) = AB + AC and (A + B)C = AC + BC.
(d) If A is an m × n matrix, then
Im A = A = A In.
(e) The product of two matrices can be a null matrix, i.e. AB = 0, but it is not necessary that matrix A = 0 or B = 0.
(f) If A is an m × n matrix and O is a null matrix, then Am×n · On×p = Om×p; that is, the product of a matrix with a null matrix is always a null matrix.
(g) AB = 0 does not mean that A = 0 or B = 0; sometimes the product of two non-zero matrices is a zero matrix.
(h) AB = AC does not necessarily imply B = C (the cancellation law is not applicable).
(i) tr (AB)=tr (BA).
(j) Every square matrix A has a multiplicative identity I such that AI = IA = A.
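Several of these properties can be seen with small examples. A brief sketch (assuming NumPy; the matrices are arbitrary) illustrates (a), (d), and (e)/(g):

```python
import numpy as np

A = np.array([[1, 2], [3, 4]])
B = np.array([[0, 1], [1, 0]])

# (a) Multiplication is generally not commutative
print(np.array_equal(A @ B, B @ A))     # False

# (d) Multiplying by the identity matrix leaves A unchanged
I = np.eye(2, dtype=int)
print(np.array_equal(I @ A, A) and np.array_equal(A @ I, A))   # True

# (e)/(g) Two non-zero matrices whose product is the zero matrix
C = np.array([[1, 1], [1, 1]])
D = np.array([[1, -1], [-1, 1]])
print(C @ D)                            # [[0 0]
                                        #  [0 0]]
```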
What are the different types of matrices?
Different types of matrices are used in mathematics, engineering, and science. The most commonly used types in linear algebra are listed here; a short sketch after this list shows how a few of them can be constructed:
Row matrix
Column matrix
Singular matrix
Rectangular matrix
Square matrix
Identity matrix
Matrix of ones
Zero matrix
Diagonal matrix
Null matrix
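A brief sketch (assuming NumPy; the entries are arbitrary) constructs a few of these types:

```python
import numpy as np

row      = np.array([[1, 2, 3]])        # row matrix (1 x 3)
column   = np.array([[1], [2], [3]])    # column matrix (3 x 1)
square   = np.array([[1, 2], [3, 4]])   # square matrix of order 2
identity = np.eye(3)                    # identity matrix I3
zero     = np.zeros((2, 3))             # zero (null) matrix
ones     = np.ones((2, 2))              # matrix of ones
diagonal = np.diag([1, 5, 9])           # diagonal matrix

# A singular matrix has determinant 0 (its rows are linearly dependent)
singular = np.array([[2, 4], [1, 2]])
print(np.linalg.det(singular))          # 0.0 (up to rounding)
```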
Applications of matrices in various fields
Matrices have applications in various fields, such as the following.
Cryptography
Cryptography is the technique of encrypting data so that only the intended recipient can read the information. One matrix-based approach encrypts a message by multiplying it with an invertible key matrix; without the inverse of that key, the encrypted message cannot be turned back into its original form.
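As a toy illustration (a simplified, Hill-cipher-style scheme with a made-up key matrix, and without the modular arithmetic a real cipher would use), a message vector is encrypted with an invertible key matrix and recovered with its inverse:

```python
import numpy as np

# Made-up key matrix with determinant 1, so its inverse also has integer entries
K = np.array([[2, 3],
              [1, 2]])
K_inv = np.array([[ 2, -3],
                  [-1,  2]])            # K @ K_inv is the identity matrix

message = np.array([7, 4])              # e.g. letters encoded as numbers

encrypted = K @ message                 # only someone holding K_inv can undo this
decrypted = K_inv @ encrypted

print(encrypted)                        # [26 15]
print(decrypted)                        # [7 4] -- the original message
```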
Wireless communication
Matrices are used to model wireless signals and to optimise them; they are also used for the detection, extraction, and processing of the information embedded in those signals.
Digital image processing
Matrices help in representing and processing digital images.
Computer graphics
Using matrices to manipulate points is a common mathematical approach in video game graphics. Matrices are also used to represent graphs. Transformations such as translation, rotation, and scaling are carried out as matrix operations, as sketched below.
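As a small sketch (assuming NumPy; the point and angle are arbitrary), rotation and scaling of a 2D point reduce to matrix-vector products:

```python
import numpy as np

point = np.array([1.0, 0.0])            # a 2D point on the x-axis

# Rotation by 90 degrees counter-clockwise
theta = np.pi / 2
rotation = np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])

# Uniform scaling by a factor of 2
scaling = np.array([[2.0, 0.0],
                    [0.0, 2.0]])

print(rotation @ point)                  # ~[0, 1]: the point rotated onto the y-axis
print(scaling @ point)                   # [2, 0]: the point scaled away from the origin
```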
Conclusion
Matrices are very useful and powerful tools in mathematical analysis and data collection, and they have many applications. The characteristics of matrices make them useful for programming and for storing data, and many algebraic equations can be solved with them. Matrix algebra provides a system of operations on well-ordered sets of numbers; the common operations are addition, subtraction, and multiplication. The most significant use of matrix algebra is in solving systems of many simultaneous linear equations.