A matrix A is a rectangular array of numbers, coefficients or variables. An (n × m) matrix has n rows and m columns. A vector x is a particular type of matrix with only one column, i.e. if x is an (n × 1) matrix, then x is an n-dimensional vector. Thus:
    [ a_{11}  a_{12}  ....  a_{1m} ]
A = [ a_{21}  a_{22}  ....  a_{2m} ]
    [ ....    ....    ....  ....   ]
    [ a_{n1}  a_{n2}  ....  a_{nm} ]

and:

    [ x_{1} ]
x = [ x_{2} ]
    [ ...   ]
    [ x_{n} ]
are examples of a matrix (A) and a vector (x). We can denote a matrix A by its typical element, thus A = [a_{ij}], where i = 1, ..., n; j = 1, ..., m. We now turn to some definitions and properties of matrices and operations on matrices.
Square matrix: If A has the same number of rows as columns, we refer to it as a square matrix.
Matrix addition: A + B - elements are added correspondingly, so A + B = [a_{ij} + b_{ij}]. This requires that A and B have the same dimensions. It is clear that the following properties hold:
(i) A + B = B + A
(ii) (A + B) + C = A + (B + C).
Scalar multiplication: αA - every element is multiplied by the scalar α, thus αA = [αa_{ij}].
Matrix multiplication: AB - a typical element of AB, call it c_{ij}, is obtained by summing the products of the elements of the ith row of A (call it a_{i}) with those of the jth column of B (call it b_{j}). Thus, c_{ij} is the inner product of a_{i} and b_{j}, i.e. c_{ij} = ⟨a_{i}, b_{j}⟩ = ∑_{k=1}^{m} a_{ik}b_{kj}. For conformability to multiplication, the number of columns in A must be equal to the number of rows in B. The resulting matrix AB has the same number of rows as A and the same number of columns as B. In particular, note that if x is an (m × 1) vector and A is an (n × m) matrix, then Ax is an (n × 1) vector. Note that in general, AB ≠ BA, even if both are defined. Assuming conformability, then the following properties hold:
(i) A(x + y) = Ax + Ay
(ii) A(αx) = αAx.
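The operations above can be illustrated numerically; a minimal sketch using NumPy (the matrices here are arbitrary examples, not taken from the text):

```python
# Matrix addition, scalar multiplication and matrix multiplication,
# illustrating the properties listed above.
import numpy as np

A = np.array([[1, 2], [3, 4]])
B = np.array([[0, 1], [1, 0]])
x = np.array([5, 6])
y = np.array([1, 1])
alpha = 3

# Addition is element-wise and commutative: A + B = B + A.
assert (A + B == B + A).all()

# Scalar multiplication scales every element: alpha*A = [alpha*a_ij].
assert (alpha * A == np.array([[3, 6], [9, 12]])).all()

# Matrix multiplication: (AB)_{ij} is the inner product of row i of A
# with column j of B.  In general AB != BA:
print(A @ B)   # [[2 1], [4 3]]
print(B @ A)   # [[3 4], [1 2]]

# A is (2 x 2) and x is a 2-vector, so Ax is a 2-vector, and
# A(x + y) = Ax + Ay and A(alpha*x) = alpha*Ax.
assert (A @ (x + y) == A @ x + A @ y).all()
assert (A @ (alpha * x) == alpha * (A @ x)).all()
```

Note that `@` is NumPy's matrix-multiplication operator, while `*` is element-wise.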
Transpose: if we interchange the rows and columns of A, the resulting matrix is called the transpose of A and we denote it A′. Properties:
(i) (AB)′ = B′A′
(ii) (A + B)′ = A′ + B′.
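The transpose properties are easy to verify numerically; a quick check with NumPy's `.T` attribute (example matrices are arbitrary):

```python
# Check (AB)' = B'A' and (A + B)' = A' + B'.
import numpy as np

A = np.array([[1, 2, 3], [4, 5, 6]])    # (2 x 3)
B = np.array([[1, 0], [0, 1], [2, 2]])  # (3 x 2), conformable for AB
C = np.array([[1, 1, 1], [2, 2, 2]])    # same dimensions as A

# Note the reversal of order under transposition of a product.
assert ((A @ B).T == B.T @ A.T).all()
assert ((A + C).T == A.T + C.T).all()
```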
Symmetric: a matrix A is symmetric if A = A′.
Skew-Symmetric: a matrix A is skew-symmetric if A = -A′.
Idempotent: A is idempotent if A^{2} = A.
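A familiar example of an idempotent matrix (well known from least-squares regression, though the particular design matrix below is our own illustration) is the projection matrix P = X(X′X)^{-1}X′:

```python
# The projection matrix P = X(X'X)^{-1}X' satisfies P^2 = P.
import numpy as np

X = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])   # an arbitrary (3 x 2) matrix of full column rank

P = X @ np.linalg.inv(X.T @ X) @ X.T

assert np.allclose(P @ P, P)   # idempotency: P^2 = P
assert np.allclose(P, P.T)     # this P is also symmetric
```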
Null: O is a null matrix if all its elements are zero, i.e. a_{ij} = 0 for all i, j. Its main property is that AO = O for any conformable A.
Diagonal matrix: A is a diagonal matrix if all its elements are zero except those on the principal diagonal, i.e. A = [a_{ij}] where a_{ij} = 0 for all i ≠ j.
Identity matrix: I is an identity matrix if it is a diagonal matrix whose principal diagonal is composed entirely of ones, i.e. I = [δ_{ij}] where δ_{ii} = 1 for all i and δ_{ij} = 0 for all i ≠ j. Its main property is that AI = IA = A for any A, where I has dimension conformable for multiplication with A.
Linear independence: a matrix A is linearly independent if ∑_{i=1}^{m} λ_{i}a_{i} = 0 implies λ_{i} = 0 for all i, where a_{i} is the ith column (row) of A. A is linearly dependent if it is not linearly independent. Any matrix which contains a null vector 0 as one of its rows or columns will be linearly dependent.
Rank: the rank of a matrix A, often denoted r(A), is the number of linearly independent rows or columns of that matrix. It is easily shown that the row rank of a matrix is equal to the column rank of a matrix. Properties:
(i) r(AB) ≤ min [r(A), r(B)]
(ii) r(A) = r(A′)
(iii) r(A + B) ≤ r(A) + r(B).
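The rank properties can be checked with NumPy's `np.linalg.matrix_rank`; a small sketch with arbitrary example matrices:

```python
# Checking the three rank properties numerically.
import numpy as np

A = np.array([[1, 2], [2, 4], [0, 1]])  # (3 x 2): columns independent, rank 2
B = np.array([[1, 0], [1, 0]])          # (2 x 2): identical rows, rank 1

rA = np.linalg.matrix_rank(A)
rB = np.linalg.matrix_rank(B)
assert rA == 2 and rB == 1

# (i) r(AB) <= min[r(A), r(B)]
assert np.linalg.matrix_rank(A @ B) <= min(rA, rB)

# (ii) r(A) = r(A')
assert np.linalg.matrix_rank(A.T) == rA

# (iii) r(C + D) <= r(C) + r(D): here two rank-1 matrices sum to rank 2.
C = np.array([[1, 0], [0, 0]])
D = np.array([[0, 0], [0, 1]])
assert np.linalg.matrix_rank(C + D) <= np.linalg.matrix_rank(C) + np.linalg.matrix_rank(D)
```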
Inverse: if A is a square (n × n) matrix, then A^{-1} is an "inverse matrix" of A if A^{-1}A = I or AA^{-1} = I. When it exists, the inverse matrix is unique. A matrix A is "singular" if it has no inverse; it is "non-singular" if it has an inverse. A is invertible if and only if r(A) = n, thus the rows and columns of A must be linearly independent. To find an inverse, it can be shown that A^{-1} = adjA/|A|, where adjA is the adjoint matrix, defined as adjA = [|C_{ij}|]′, the transpose of the matrix of cofactors of A (see below). Properties:
(i) (A^{-1})^{-1} = A
(ii) (A′)^{-1} = (A^{-1})′
(iii) (AB)^{-1} = B^{-1}A^{-1}
(iv) |A^{-1}| = 1/|A|.
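The adjoint formula A^{-1} = adjA/|A| can be implemented directly and compared with NumPy's built-in inverse; a sketch (the helper function name is ours, and the example matrix is arbitrary):

```python
# Inverse by the adjoint formula, checked against np.linalg.inv.
import numpy as np

def cofactor_matrix(M):
    """Matrix of cofactors: C_ij = (-1)^(i+j) |M_ij|, where |M_ij| is the
    minor obtained by deleting row i and column j."""
    n = M.shape[0]
    C = np.empty_like(M)
    for i in range(n):
        for j in range(n):
            minor = np.delete(np.delete(M, i, axis=0), j, axis=1)
            C[i, j] = (-1) ** (i + j) * np.linalg.det(minor)
    return C

A = np.array([[2.0, 1.0],
              [5.0, 3.0]])            # |A| = 2*3 - 1*5 = 1, so non-singular

adjA = cofactor_matrix(A).T           # adjoint = transpose of cofactor matrix
A_inv = adjA / np.linalg.det(A)

assert np.allclose(A_inv, np.linalg.inv(A))
assert np.allclose(A @ A_inv, np.eye(2))          # AA^{-1} = I
assert np.isclose(np.linalg.det(A_inv), 1 / np.linalg.det(A))  # property (iv)
```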
Orthogonality: A is orthogonal if A′A = I. Property: A^{-1} = A′.
Minor: The ij minor of an n × n matrix A, denoted |M_{ij}|, is the determinant of the (n-1) × (n-1) matrix obtained by deleting the ith row and the jth column of A.
Cofactor: The ij cofactor of A is denoted |C_{ij}| and is the ij minor of A if i+j is even and the negative of the ij minor of A if i+j is odd, i.e. |C_{ij}| = (-1)^{i+j}|M_{ij}|.
Determinant: if A is a square matrix, then the "determinant" |A| is a scalar associated with that matrix which can be obtained by a Laplace cofactor-expansion process of the elements of the matrix. Expanding by the ith row: |A| = ∑_{j=1}^{n} a_{ij}|C_{ij}|; equivalently, expanding by the jth column: |A| = ∑_{i=1}^{n} a_{ij}|C_{ij}|. Some properties follow:
(i) in the 2 × 2 case, |A| is merely the product of the principal diagonal minus the product of the off-diagonal, i.e. |A| = a_{11}a_{22} - a_{12}a_{21};
(ii) if A is singular, then |A| = 0; if A is non-singular, then |A| ≠ 0 (thus |A| ≠ 0 if all columns and rows of A are linearly independent of each other);
(iii) |A′| = |A|
(iv) |AB| = |A||B|
(v) if A is a diagonal matrix, then |A| = ∏_{i=1}^{n} a_{ii};
(vi) multiplication of any one column (or row) by a scalar α will change the value of the determinant α-fold;
(vii) addition of a multiple of any column (or row) to any other column (or row) will leave the determinant unchanged.
(viii) |A| = 0 if any of its columns (or rows) is linearly dependent on any other of its columns (or rows);
(ix) |A^{-1}| = |A|^{-1}.
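The cofactor-expansion definition translates directly into a short recursive function; a sketch expanding along the first row (exponential-time, so for illustration only, with NumPy's `np.linalg.det` as a check):

```python
# Laplace cofactor expansion along the first row:
# |A| = sum_j a_{1j} C_{1j}, where C_{1j} = (-1)^{1+j} |M_{1j}|.
import numpy as np

def det(M):
    n = len(M)
    if n == 1:
        return M[0][0]
    total = 0.0
    for j in range(n):
        # Minor M_{1j}: delete the first row and the jth column.
        minor = [row[:j] + row[j+1:] for row in M[1:]]
        # (-1)**j is the cofactor sign (-1)^{1+j} in 0-indexed form.
        total += (-1) ** j * M[0][j] * det(minor)
    return total

A = [[2.0, 1.0, 0.0],
     [1.0, 3.0, 1.0],
     [0.0, 1.0, 2.0]]      # an arbitrary example; |A| = 8

assert np.isclose(det(A), np.linalg.det(np.array(A)))
```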
Trace: the trace of a matrix A (denoted trA) is the sum of the elements on the principal diagonal, i.e. trA = ∑_{i=1}^{n} a_{ii}.
Principal Leading Minor: the kth order principal leading minor of an n × n matrix A, denoted |M_{k}|, is the determinant of the submatrix formed by the first k rows and columns of A.
Permutation: P is a permutation matrix if in each row and column of P there is an element equal to 1 and the rest of the elements are 0.
Decomposability: A is decomposable if its rows and columns can be renumbered such that A is transformed to:
    [ A_{1}  A_{12} ]
    [ 0      A_{2}  ]
where A_{1} and A_{2} are square submatrices. A is decomposable if and only if there is a permutation matrix P such that P^{-1}AP yields the transformed matrix noted above.
Diagonalization: A is "diagonalizable" if there exists a matrix Z such that Z^{-1}AZ is a diagonal matrix.
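When A has linearly independent eigenvectors, a diagonalizing Z can be built from them; a sketch using NumPy's `np.linalg.eig` (the example matrix is arbitrary, with distinct eigenvalues 5 and 2):

```python
# Diagonalization: the columns of Z are eigenvectors of A, and
# Z^{-1} A Z is then the diagonal matrix of eigenvalues.
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])          # eigenvalues 5 and 2 (trace 7, det 10)

eigvals, Z = np.linalg.eig(A)
D = np.linalg.inv(Z) @ A @ Z

assert np.allclose(D, np.diag(eigvals))      # Z^{-1}AZ is diagonal
assert np.allclose(sorted(eigvals), [2.0, 5.0])
```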
P-Matrix: (Gale and Nikaido, 1965) A is a P-matrix if all principal minors of A are positive.
N-Matrix: (Inada, 1966) A is an N-matrix if all the principal minors of A are negative.
N-P Matrix: (Nikaido, 1968) A is an N-P matrix if all its principal minors of odd order are negative and all those of even order positive (and a P-N matrix if this is reversed).
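A P-matrix test follows directly from the definition: enumerate every principal minor (the determinant of the submatrix on each index set of rows and matching columns) and check that each is positive. A sketch (the function name is ours, not standard terminology):

```python
# Brute-force P-matrix test: all principal minors must be positive.
import numpy as np
from itertools import combinations

def is_P_matrix(A):
    n = A.shape[0]
    for k in range(1, n + 1):
        for S in combinations(range(n), k):
            # Principal submatrix on the index set S (same rows and columns).
            if np.linalg.det(A[np.ix_(S, S)]) <= 0:
                return False
    return True

A = np.array([[2.0, -1.0],
              [-1.0, 2.0]])   # principal minors: 2, 2 and det = 3, all > 0
assert is_P_matrix(A)
assert not is_P_matrix(-A)    # -A has negative first-order minors
```

The same enumeration, with the sign tests changed, checks the N-matrix and N-P matrix conditions.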