In mathematics, a matrix is a rectangular array or table of numbers, symbols, or expressions arranged in rows and columns, used to represent a mathematical object or a property of such an object. A matrix can be constructed with any number of rows and columns; the plural of matrix is matrices. Matrix operations follow specific rules: addition and subtraction, for example, can be performed only on matrices that have the same number of rows and columns.
For multiplication, by contrast, the number of columns in the first matrix must equal the number of rows in the second. The individual numbers or entries in a matrix are called its elements. Horizontal lines of entries are called rows, and vertical lines of entries are called columns. The size of a matrix is determined by its number of rows and columns.
Notation of Matrices
Suppose a matrix has M rows and N columns; it then contains M × N entries. Matrices are generally denoted by uppercase letters such as A, while the entries inside a matrix are denoted by lowercase letters such as a. Each entry carries two subscripts, one giving its row position and the other its column position: the entry aij sits in row i and column j.
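The notation above can be sketched with NumPy, using a small hypothetical matrix. Note that mathematical notation counts rows and columns from 1, while NumPy indexing is zero-based.

```python
import numpy as np

# A 2 x 3 matrix: M = 2 rows and N = 3 columns, so 2 * 3 = 6 entries.
A = np.array([[1, 2, 3],
              [4, 5, 6]])

m, n = A.shape
print(m, n)        # 2 3

# The entry a_ij sits in row i, column j. NumPy is zero-based,
# so the math entry a_21 (row 2, column 1) is A[1, 0].
print(A[1, 0])     # 4
```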
Matrix multiplication:
In linear algebra, matrix multiplication is a binary operation that produces a matrix from two matrices. It was first described by the French mathematician Jacques Philippe Marie Binet in 1812, and it represents the composition of the linear maps that the matrices represent. Matrix multiplication is a basic tool used throughout mathematics and in applied fields such as statistics, physics, economics, and engineering.
Fundamental applications
In linear algebra, matrix multiplication facilitates and clarifies computations, and together with matrix addition and scalar multiplication it plays an important role in mathematics as well as in engineering, physics, computer science, and chemistry.
Linear maps
If a vector space has a finite basis, each vector is represented uniquely by a finite sequence of scalars, called a coordinate vector. These coordinate vectors form a new vector space that is isomorphic to the original one. A coordinate vector is commonly written as a column matrix (also termed a column vector), meaning a matrix with only one column. A column vector can thus represent both a coordinate vector and a vector of the original vector space.
General properties on multiplying matrices
Matrix multiplication shares a few properties with the usual multiplication of numbers. It obeys laws such as distributivity and associativity, but it is not commutative in general. The product of two matrices is defined only if the number of columns of the first matrix equals the number of rows of the second.
Non-commutativity Law:
The operation would be commutative if, for any two elements A and B for which the product AB is defined, BA were also defined and equal to AB. Matrix multiplication does not satisfy this in general.
Suppose A is of size m × n and B is of size p × q. Then AB is defined only if n = p, and BA is defined only if m = q. Hence, if one product is defined, the other is generally not. Even when both products are defined, as for square matrices of the same size, AB and BA are usually different.
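A minimal NumPy check, with two hypothetical square matrices so that both AB and BA are defined, yet the products still differ:

```python
import numpy as np

# Both A and B are 2 x 2, so AB and BA are both defined.
A = np.array([[1, 2],
              [3, 4]])
B = np.array([[0, 1],
              [1, 0]])

AB = A @ B   # [[2, 1], [4, 3]]
BA = B @ A   # [[3, 4], [1, 2]]

# Matrix multiplication is not commutative: AB != BA here.
print(np.array_equal(AB, BA))  # False
```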
Distributivity Law:
Concerning matrix addition, matrix multiplication is distributive. That is, if A, B, C, and D are matrices of respective sizes m × n, n × p, n × p, and p × q, one has (left distributivity)
A(B + C) = AB + AC
and (right distributivity)
(B + C)D = BD + CD
Transpose Law:
If the scalars commute, the transpose of a product of matrices is the product, in the reverse order, of the transposes of the factors. That is,
Transpose of (AB) = Transpose of B × Transpose of A
For noncommutative entries, this identity does not generally hold, since the order of the entries of A and B is reversed when one expands the definition of the matrix product.
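The reversal of order matters even for the sizes to match, as this NumPy sketch with hypothetical matrices shows:

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6]])    # 2 x 3
B = np.array([[1, 0],
              [0, 1],
              [2, 2]])       # 3 x 2

# (AB)^T equals B^T A^T -- the order of the factors reverses.
assert np.array_equal((A @ B).T, B.T @ A.T)

# The naive A^T B^T is not even the right size here:
# A.T @ B.T is 3 x 3, while (A @ B).T is 2 x 2.
print((A @ B).T.shape, (A.T @ B.T).shape)
```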
Associativity law:
For three matrices A, B, and C, the products A(BC) and (AB)C are defined if and only if the number of columns of A equals the number of rows of B and the number of columns of B equals the number of rows of C (in particular, if one of the products is defined, so is the other). In that case, matrix multiplication is associative:
(AB)C = A(BC)
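A quick numerical check of associativity with NumPy, using hypothetical matrices whose sizes chain together:

```python
import numpy as np

A = np.ones((2, 3))                     # 2 x 3
B = np.arange(12).reshape(3, 4)         # 3 x 4
C = np.ones((4, 5))                     # 4 x 5

# (AB)C and A(BC) are both 2 x 5 and agree (up to floating-point rounding).
assert np.allclose((A @ B) @ C, A @ (B @ C))
```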
Abstract algebra
The matrix product can also be defined for entries belonging to a semiring, and this definition does not require multiplication of entries in the semiring to be commutative. The identity matrices (square matrices with 1 on the main diagonal and 0 elsewhere) are identity elements of the matrix product.
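The identity-element property can be illustrated with NumPy's `eye`, using a hypothetical 2 × 2 matrix:

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
I = np.eye(2, dtype=int)   # identity: 1 on the main diagonal, 0 elsewhere

# The identity matrix is a two-sided identity for the matrix product.
assert np.array_equal(I @ A, A)
assert np.array_equal(A @ I, A)
```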
Conclusion
In mathematics, a matrix is a rectangular array or table of numbers, symbols, or expressions arranged in rows and columns, representing a mathematical object or a property of such an object. Matrix operations depend on specific rules, and matrix addition, subtraction, and scalar multiplication share some important properties.