Consider the following assertion: if A is a square matrix, then there is a scalar λ and a vector x such that:
Ax = λx
λ is then known as an eigenvalue (or "characteristic root") of A while x is known as the associated eigenvector (or "characteristic vector") of A. To find the eigenvalues and eigenvectors which solve the system, we can proceed as follows. We can rewrite the system as:
(A - λI)x = 0
so we now have a homogeneous system. As we know, either the trivial solution holds (i.e. x = 0) or the determinant vanishes, i.e. |A - λI| = 0.
Consider now the last possibility. The vanishing determinant can be re-expressed as an nth degree polynomial equation in λ known as the "characteristic equation". In a 2 × 2 case, where:
A = | a_{11}  a_{12} |
    | a_{21}  a_{22} |
then:
A - λI = | a_{11} - λ   a_{12}     |
         | a_{21}       a_{22} - λ |
so the characteristic equation is:
|A - λI| = (a_{11} - λ)(a_{22} - λ) - a_{21}a_{12} = 0
or simply:
|A - λI| = λ^{2} - (a_{11} + a_{22})λ + (a_{11}a_{22} - a_{21}a_{12}) = 0
which is a simple quadratic equation. Notice that the coefficient attached to λ is merely the negative of the trace of the original matrix A, i.e. -(a_{11} + a_{22}) = -tr A, while the last term is merely the determinant of the original matrix, i.e. (a_{11}a_{22} - a_{21}a_{12}) = |A|. Thus, we can write:
|A - λI| = λ^{2} - (tr A)λ + |A| = 0
This is generally true for all two-dimensional systems. A quadratic equation always has two solutions, and these can be obtained from the familiar quadratic formula, in this case:
λ_{1}, λ_{2} = [tr A ± √((tr A)^{2} - 4|A|)]/2
where the eigenvalues λ_{1}, λ_{2} are real if (tr A)^{2} ≥ 4|A|; otherwise they are complex conjugates.
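As a numerical check on the formula above, the two eigenvalues of a 2 × 2 matrix can be computed directly from its trace and determinant. A minimal sketch (the function name eig2x2 is our own, and it assumes real eigenvalues, i.e. (tr A)^{2} ≥ 4|A|):

```python
import math

def eig2x2(a11, a12, a21, a22):
    """Eigenvalues of a 2x2 matrix via lambda^2 - (tr A)*lambda + |A| = 0."""
    tr = a11 + a22                        # trace of A
    det = a11 * a22 - a21 * a12           # determinant of A
    disc = math.sqrt(tr ** 2 - 4 * det)   # assumes real roots
    return (tr + disc) / 2, (tr - disc) / 2

# Example: tr A = 5, |A| = 6 gives the roots of lambda^2 - 5*lambda + 6 = 0
print(eig2x2(2, 0, 0, 3))  # -> (3.0, 2.0)
```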
For higher-dimensional systems, the polynomial is, of course, different. In fact, an n × n matrix A will have n eigenvalues λ (counted with multiplicity) with associated eigenvectors x which solve the system Ax = λx. Nonetheless, some general rules apply. For instance:
(i) the sum of the eigenvalues of a square matrix is equal to its trace, i.e. Σ_{i=1}^{n} λ_{i} = tr A
(ii) the product of the eigenvalues of a square matrix is equal to its determinant, i.e. Π_{i=1}^{n} λ_{i} = |A|
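Rules (i) and (ii) are easy to verify numerically for any square matrix; here is a sketch using NumPy on an arbitrary 3 × 3 example:

```python
import numpy as np

# An arbitrary 3x3 matrix; the identities hold for any square A
A = np.array([[1.0, 2.0, 0.0],
              [3.0, -1.0, 4.0],
              [0.0, 2.0, 5.0]])
eigvals = np.linalg.eigvals(A)

# (i) sum of eigenvalues = tr A; (ii) product of eigenvalues = |A|
print(np.isclose(eigvals.sum(), np.trace(A)))        # True
print(np.isclose(eigvals.prod(), np.linalg.det(A)))  # True
```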
Having established the eigenvalues, the question now turns to the associated eigenvectors. These are obtained by plugging one of the eigenvalues into the system and solving for the ratios between the x_{i}s in x. For instance, if we take the first eigenvalue λ_{1}, then our equation system is:
(A - λ_{1}I)x = 0
For the 2 × 2 case, we can write this out as a system of two equations:
(a_{11} - λ_{1})x_{1} + a_{12}x_{2} = 0
a_{21}x_{1} + (a_{22} - λ_{1})x_{2} = 0
Although it might not be obvious, the first equation is merely a scalar multiple of the second, so the ratio x_{1}/x_{2} will be the same regardless of which equation we solve, i.e.
-a_{12}/(a_{11} - λ_{1}) = x_{1}/x_{2} = -(a_{22} - λ_{1})/a_{21}
Once x_{1}/x_{2} is obtained, the only thing that remains in order to obtain actual levels of x_{1} and x_{2} is to normalize the system, e.g. we could take x_{2} = 1, or impose x_{1} + x_{2} = 1 or x_{1}^{2} + x_{2}^{2} = 1 as a normalization device. From this we would thus obtain the vector x = [x_{1} x_{2}]′. This x is the eigenvector associated with the eigenvalue λ_{1}. If we then took the second eigenvalue λ_{2}, we would find another eigenvector associated with it by the same means. In an n-dimensional system, we would have n eigenvalues with associated eigenvectors.
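The ratio-then-normalize recipe can be sketched in a few lines of Python (the helper name eigvec2x2 is our own; it takes the ratio from the first equation, so it assumes a_{11} ≠ λ):

```python
import math

def eigvec2x2(a11, a12, a21, a22, lam, norm="x2"):
    """Eigenvector for eigenvalue lam of a 2x2 matrix, up to normalization.
    Uses x1/x2 = -a12/(a11 - lam), so assumes a11 != lam."""
    x1, x2 = -a12 / (a11 - lam), 1.0   # default normalization: x2 = 1
    if norm == "sum":                  # impose x1 + x2 = 1
        s = x1 + x2
        x1, x2 = x1 / s, x2 / s
    elif norm == "unit":               # impose x1^2 + x2^2 = 1
        n = math.hypot(x1, x2)
        x1, x2 = x1 / n, x2 / n
    return x1, x2

print(eigvec2x2(2, 2, 2, -1, 3.0))  # -> (2.0, 1.0)
```

Different `norm` choices return scalar multiples of the same vector, as the text notes.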
An Example:
Consider the following matrix:
A = | 2   2 |
    | 2  -1 |
Imposing Ax = λx, we wish to find the λs such that |A - λI| = 0. This reduces to the polynomial:
λ^{2} - λ - 6 = 0
which yields two values, λ_{1} = 3 and λ_{2} = -2. Plugging the first eigenvalue into the system (A - λ_{1}I)x = 0, we obtain the two-equation system:
-x_{1} + 2x_{2} = 0
2x_{1} - 4x_{2} = 0
which are obviously linearly dependent, thus the solution in either case is x_{1}/x_{2} = 2. If we normalize x_{2} = 1, then x = [2 1]′; normalizing x_{1} = 1, then x = [1 1/2]′; normalizing x_{1} + x_{2} = 1, then x = [2/3 1/3]′; normalizing x_{1}^{2} + x_{2}^{2} = 1, then x = [2/√5 1/√5]′; and so on for other normalizations. Note that whatever normalization we choose, the resulting eigenvector will be a scalar multiple of any other obtained by a different normalization device.
To obtain the second eigenvector, we plug the second eigenvalue λ_{2} = -2 into our system (A - λ_{2}I)x = 0. This yields:
4x_{1} + 2x_{2} = 0
2x_{1} + x_{2} = 0
where the solution in either case is x_{1}/x_{2} = -1/2. If we normalize x_{2} = 1, for instance, then x = [-1/2 1]′ is the eigenvector associated with λ_{2}.
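The whole example can be cross-checked with a standard routine; numpy.linalg.eig returns unit-length eigenvectors as the columns of its second output, so they differ from the ones above only by a scalar multiple:

```python
import numpy as np

A = np.array([[2.0, 2.0],
              [2.0, -1.0]])
vals, vecs = np.linalg.eig(A)   # eigenvalues 3 and -2 (order may vary)

for i, lam in enumerate(vals):
    v = vecs[:, i]
    # the ratio x1/x2 should be 2 for lam = 3 and -1/2 for lam = -2
    print(round(lam, 6), round(v[0] / v[1], 6))
```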