Stable Matrices


Consider the system of ordinary differential equations $x'(t) = Ax(t) + b$. Let $x(t) = F(t)c + x_p$, where $x_p$ is a particular solution to the system and $F(t)c$ is the general solution of the homogeneous case $x' = Ax$, with $F(t)$ the fundamental matrix and $c$ a vector of constants. We define the matrix $A$ as "stable" iff $x(t)$ is stable, that is:

Stable Matrix: The matrix $A$ is stable if all solutions of $x'(t) = Ax(t)$ converge to zero as $t$ goes to infinity; that is, $F(t)c \to 0$ as $t \to \infty$.

We now prove a quite standard theorem:

Theorem: (Negative Eigenvalues) The matrix A is stable if and only if all its eigenvalues have negative real parts.

Proof: Suppose $A$ is stable and let $\lambda_i = u_i + iw_i$ be an eigenvalue (complex or real) of $A$, with associated eigenvector $v_i$, so that $Av_i = \lambda_i v_i$. Assuming for simplicity that $A$ has $n$ linearly independent eigenvectors, the fundamental matrix is $F(t) = [v_1 e^{\lambda_1 t}, v_2 e^{\lambda_2 t}, \ldots, v_n e^{\lambda_n t}]$; thus, defining $\varphi_i(t) = v_i e^{\lambda_i t}$, then:

$$F(t) = [\varphi_1(t), \varphi_2(t), \ldots, \varphi_n(t)]$$

Now, as $\varphi_i(t) = v_i e^{\lambda_i t}$, taking the first derivative gives $\varphi_i'(t) = \lambda_i v_i e^{\lambda_i t} = A v_i e^{\lambda_i t} = A\varphi_i(t)$. Thus $\varphi_i(t)$ is a solution to the homogeneous system $x' = Ax$. By definition, $A$ is stable iff $F(t)c \to 0$ as $t \to \infty$, so it must be that $\varphi_i(t) \to 0$ for all $i = 1, \ldots, n$ if $A$ is stable. We now show that this implies that the real parts of the $\lambda_i$ must be negative for all $i$, i.e. $u_i < 0$ for all $i$. Consider the $k$th coordinate of $v_i$ and thus of $\varphi_i(t)$, i.e. $\varphi_{ik}(t) = v_{ik} e^{\lambda_i t}$. Since $\lim_{t \to \infty} \varphi_i(t) = 0$, it must be that $\lim_{t \to \infty} \varphi_{ik}(t) = 0$, i.e. $\lim_{t \to \infty} |v_{ik} e^{\lambda_i t}| = 0$. Dropping the subscript $i$ for cleaner notation, so that $\lambda = u + iw$ is a representative eigenvalue, we have $|v_k e^{\lambda t}| = |v_k| \, |e^{(u+iw)t}| = |v_k| \, e^{ut} \, |e^{iwt}| = |v_k| \, e^{ut}$, since $|e^{iwt}| = 1$. As an eigenvector is nonzero, $v_k \neq 0$ for some $k$; for that coordinate, $|v_k| e^{ut} \to 0$ requires $e^{ut} \to 0$, hence $u < 0$, i.e. $\mathrm{Re}(\lambda) < 0$. Since this is true for any $\lambda$ (i.e. for all $\lambda_i$, $i = 1, \ldots, n$), it follows that if $A$ is stable, then $\mathrm{Re}(\lambda_i) = u_i < 0$ for all $i = 1, \ldots, n$, i.e. the real parts of all eigenvalues must be negative.

Conversely, suppose that all eigenvalues $\lambda_i = u_i + iw_i$ have negative real parts, $\mathrm{Re}(\lambda_i) = u_i < 0$ for all $i = 1, \ldots, n$. Let $x(t)$ be a solution to $x' = Ax$ and let $x_k(t)$ be the $k$th coordinate of $x(t)$. Then $x_k(t) = \sum_{j=1}^{n} a_{kj} e^{\lambda_j t}$, where the $a_{kj}$ are coefficients (assuming distinct eigenvalues; repeated eigenvalues contribute terms of the form $t^m e^{\lambda_j t}$, which also vanish when $u_j < 0$). Thus, by the triangle inequality and $|e^{iw_j t}| = 1$:

$$|x_k(t)| = \Big| \sum_{j=1}^{n} a_{kj} e^{(u_j + iw_j)t} \Big| \leq \sum_{j=1}^{n} |a_{kj}| \, e^{u_j t} \, |e^{iw_j t}| = \sum_{j=1}^{n} |a_{kj}| \, e^{u_j t}$$

Since $u_j < 0$ for all $j = 1, 2, \ldots, n$, each term $|a_{kj}| e^{u_j t} \to 0$ as $t \to \infty$, so $\lim_{t \to \infty} |x_k(t)| = 0$, i.e. $\lim_{t \to \infty} x_k(t) = 0$. Since this is true for every $k = 1, 2, \ldots, n$, the solution $x(t)$ converges to zero as $t$ goes to infinity. Thus, the matrix $A$ is stable.
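To make the criterion concrete, here is a minimal numerical sketch in Python (NumPy assumed; the helper `is_stable` is ours, purely for illustration):

```python
import numpy as np

def is_stable(A):
    """Eigenvalue criterion: A is stable iff every eigenvalue
    of A has a strictly negative real part."""
    return bool(np.all(np.linalg.eigvals(A).real < 0))

A = np.array([[-2.0, 1.0],
              [0.5, -3.0]])
print(np.linalg.eigvals(A))  # approx. -1.63 and -3.37: both negative
print(is_stable(A))          # True
```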

Under what conditions do all eigenvalues have negative real parts? The Routh-Hurwitz conditions on the (degree-$n$) characteristic polynomial of $A$ are necessary and sufficient for all its roots to have negative real parts. Murata's (1977: p. 92) "modified Routh-Hurwitz" conditions can be simpler to use.
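As an illustration, for $n = 3$ the Routh-Hurwitz conditions on the monic characteristic polynomial $\lambda^3 + a_2 \lambda^2 + a_1 \lambda + a_0$ reduce to $a_2 > 0$, $a_0 > 0$ and $a_2 a_1 > a_0$. A sketch of the check in Python (this cubic special case is a standard result, not taken from Murata):

```python
import numpy as np

def routh_hurwitz_3x3(A):
    """For a 3x3 matrix with characteristic polynomial
    l^3 + a2*l^2 + a1*l + a0, all roots have negative real
    parts iff a2 > 0, a0 > 0 and a2*a1 > a0."""
    _, a2, a1, a0 = np.poly(A)  # np.poly returns [1, a2, a1, a0]
    return a2 > 0 and a0 > 0 and a2 * a1 > a0

A = np.array([[-1.0, 1.0, 0.0],
              [0.0, -2.0, 1.0],
              [0.0, 0.0, -3.0]])
print(routh_hurwitz_3x3(A))       # True
print(np.linalg.eigvals(A).real)  # -1, -2, -3
```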

Nonetheless, we can still look for properties of A that will yield a stable A. The following are straightforward:

Theorem: (Symmetric Case) The symmetric matrix $A$ is stable iff $(-1)^k |A_k| > 0$ for all $k = 1, 2, \ldots, n$, where $|A_k|$ is the $k$th-order leading principal minor of $A$ (the determinant of the submatrix of $A$ obtained by deleting the last $n - k$ rows and columns).

Proof: From the theory of quadratic forms, a real symmetric matrix $A$ is negative definite if and only if all its eigenvalues are negative; since the eigenvalues of a symmetric matrix are real, this is the same as all eigenvalues having negative real parts. We also know that a symmetric $A$ is negative definite if and only if $(-1)^k |A_k| > 0$ for all $k$.
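A quick numerical check of the alternating-minors test (the function name is ours):

```python
import numpy as np

def alternating_minors(A):
    """Check (-1)^k |A_k| > 0 for every k-th leading principal minor."""
    n = A.shape[0]
    return all((-1) ** k * np.linalg.det(A[:k, :k]) > 0
               for k in range(1, n + 1))

A = np.array([[-2.0, 1.0],
              [1.0, -3.0]])   # symmetric
print(alternating_minors(A))  # True: -(-2) = 2 > 0 and |A| = 5 > 0
print(np.linalg.eigvalsh(A))  # real, negative eigenvalues
```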

Theorem: (Necessity) If an $n \times n$ matrix $A$ with real elements is stable, then its trace is negative and its determinant has the same sign as $(-1)^n$; that is, $A$ stable implies $\mathrm{tr}\, A = \sum_{i=1}^{n} a_{ii} < 0$ and $(-1)^n |A| > 0$.

Proof: This is elementary. Recall that $\mathrm{tr}\, A = \sum_{i=1}^{n} \lambda_i$ and $|A| = \prod_{i=1}^{n} \lambda_i$. Complex eigenvalues of a real matrix come in conjugate pairs, so $\mathrm{tr}\, A = \sum_{i=1}^{n} u_i < 0$ when all real parts $u_i$ are negative. Similarly, each conjugate pair contributes the positive factor $|\lambda_i|^2$ to $|A|$, while each real (hence negative) eigenvalue contributes a negative factor; the sign of $|A|$ is thus $(-1)^m$ where $m$ is the number of real eigenvalues, and since $n - m$ is even this equals $(-1)^n$, i.e. $(-1)^n |A| > 0$.

The above theorem is actually sufficient if $A$ is a $2 \times 2$ matrix. However, for higher dimensions this is no longer the case, so we must look elsewhere for sufficient conditions. The following are a few.
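A numerical illustration: in the $2 \times 2$ case the trace/determinant test settles stability, while the $3 \times 3$ diagonal matrix below passes both necessary conditions yet is unstable (an easy counterexample we supply, not one from the text):

```python
import numpy as np

# 2 x 2: trace < 0 and det > 0 suffice for stability.
A2 = np.array([[-1.0, 2.0],
               [0.5, -2.0]])
print(np.trace(A2) < 0, np.linalg.det(A2) > 0)  # True True
print(np.linalg.eigvals(A2).real)               # both negative

# 3 x 3: tr A = -1 < 0 and (-1)^3 |A| = 3 > 0, yet two eigenvalues are +1.
A3 = np.diag([1.0, 1.0, -3.0])
print(np.trace(A3), (-1) ** 3 * np.linalg.det(A3))
print(np.linalg.eigvals(A3).real)               # 1, 1, -3: not stable
```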

Theorem: (Lyapunov) A real $n \times n$ matrix $A$ is a stable matrix if and only if there exists a positive definite matrix $H$ such that $A'H + HA$ is negative definite.

Proof: Given elsewhere via Lyapunov's method.
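A minimal sketch of the criterion using SciPy's `scipy.linalg.solve_continuous_lyapunov`, which solves $aX + Xa^T = q$; passing $A'$ as the first argument yields $A'H + HA = -Q$:

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

A = np.array([[-1.0, 2.0],
              [0.0, -3.0]])
Q = np.eye(2)  # any positive definite right-hand side works

# Solve A'H + HA = -Q for H, then test H for positive definiteness.
H = solve_continuous_lyapunov(A.T, -Q)
print(np.all(np.linalg.eigvalsh(H) > 0))      # H positive definite
print(np.all(np.linalg.eigvals(A).real < 0))  # consistent: A is stable
```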

Theorem: (Quasi-Negative Case) A real $n \times n$ matrix $A$ is stable if $A$ is quasi-negative definite (i.e. $x'Ax < 0$ for all $x \neq 0$, without $A$ necessarily being symmetric).

Proof: This follows from Lyapunov's theorem. Namely, if $A$ is quasi-negative definite, then $A' + A$ is negative definite. Thus, $A'I + IA$ is negative definite. Letting $H = I$, we see this is merely a special case of the previous theorem.

Theorem: (Symmetric Negative Case) A real $n \times n$ matrix $A$ is stable if $A$ is symmetric and negative definite.

Proof: If $A$ is symmetric and negative definite, then $A' + A = 2A$ is negative definite, and Lyapunov's theorem applies with $H = I$.

A more interesting set of sufficiency conditions has been derived in the course of the analysis of the local stability of a Walrasian tatonnement process in general equilibrium. In short, we have the following:

D-Stability: A matrix A is "D-Stable" if BA is stable for any positive diagonal matrix B.

It is obvious that if A is D-stable, then A is stable (just set B = I). Sufficient conditions for D-stability are the following:

Theorem: (Arrow-McManus) A matrix $A$ is D-stable if there exists a positive diagonal matrix $C$ such that $A'C + CA$ is negative definite.

Proof: (Arrow and McManus, 1958). We must prove that $BA$ is stable for any positive diagonal $B$. By Lyapunov's theorem, it suffices to find a positive definite $H$ such that $(BA)'H + H(BA)$ is negative definite. Let $H = CB^{-1}$. As $B$ is a positive diagonal matrix, $(B^{-1})' = B^{-1}$ and $B' = B$; and as $C$ is a positive diagonal matrix, $H = CB^{-1}$ is also a positive diagonal (hence positive definite) matrix. Substituting in for $H$, the condition becomes $(BA)'H + H(BA) = A'B'CB^{-1} + CB^{-1}BA = A'C + CA$, where $B'CB^{-1} = C$ because diagonal matrices commute. Since we assumed that $A'C + CA$ is negative definite, it follows that $(BA)'H + H(BA)$ is negative definite, so $BA$ is stable and thus $A$ is D-stable.

Notice that if we let $C = I$, we obtain the condition that $A' + A$ is negative definite. We know that if (i) $A$ is quasi-negative definite or (ii) $A$ is symmetric and negative definite, then $A' + A$ is negative definite. Thus (i) and (ii) are also sufficient conditions for D-stability; a numerical sketch follows.
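A sketch of this specialization (with $C = I$; the example matrix is ours):

```python
import numpy as np

A = np.array([[-2.0, 1.0],
              [-1.0, -1.0]])  # A' + A = diag(-4, -2): negative definite
print(np.all(np.linalg.eigvalsh(A.T + A) < 0))

rng = np.random.default_rng(0)
for _ in range(5):            # BA stays stable for random positive diagonal B
    B = np.diag(rng.uniform(0.1, 10.0, size=2))
    print(np.all(np.linalg.eigvals(B @ A).real < 0))
```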

A more interesting set of results is derived from the property of "diagonal dominance" introduced into economics by Lionel McKenzie (1960).

Diagonal Dominance: An $n \times n$ matrix $A$ with real elements is dominant diagonal (dd) if there are $n$ real numbers $d_j > 0$, $j = 1, 2, \ldots, n$, such that

$$d_j |a_{jj}| > \sum_{i \neq j} d_i |a_{ij}|$$

for $j = 1, 2, \ldots, n$.
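Verifying the definition for a given weight vector is straightforward (finding valid weights in general is a separate problem); a sketch with an illustrative helper:

```python
import numpy as np

def is_column_dd(A, d):
    """Column diagonal dominance with weights d > 0:
    d_j * |a_jj| > sum over i != j of d_i * |a_ij|, for every column j."""
    n = A.shape[0]
    return all(
        d[j] * abs(A[j, j]) > sum(d[i] * abs(A[i, j]) for i in range(n) if i != j)
        for j in range(n)
    )

A = np.array([[-4.0, 1.0],
              [2.0, -5.0]])
print(is_column_dd(A, d=np.ones(2)))  # True: 4 > 2 and 5 > 1
```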

This is also known as "column" diagonal dominance. "Row" diagonal dominance is defined analogously as the existence of $d_i > 0$ such that $d_i |a_{ii}| > \sum_{j \neq i} d_j |a_{ij}|$ for all $i = 1, 2, \ldots, n$. It can be shown that a matrix which is column dd is also row dd (for possibly different weights). This follows from the Hawkins-Simon condition: namely, letting $\hat{A}$ be the matrix with entries $\hat{a}_{ij} = -|a_{ij}|$ for $i \neq j$ and $\hat{a}_{ii} = |a_{ii}|$, the existence of $y > 0$ such that $y'\hat{A} > 0$ implies the existence of $x > 0$ such that $\hat{A}x > 0$.

Hadamard: If the $n \times n$ matrix $A$ is dominant diagonal with $d_j = 1$ for all $j = 1, 2, \ldots, n$, i.e.

$$|a_{jj}| > \sum_{i \neq j} |a_{ij}|$$

for $j = 1, 2, \ldots, n$, then $A$ is a "Hadamard" matrix.

Alternatively, if we form the diagonal matrix $D = \mathrm{diag}(d_1, \ldots, d_n)$ from the weights in the dd definition, and $A$ is dd, then obviously $DA$ is a Hadamard matrix.

Theorem: (McKenzie) If $A$ is dominant diagonal, then $|A| \neq 0$.

Proof: (McKenzie, 1960) Suppose not, i.e. suppose $|A| = 0$. Then $|DA| = 0$ as well, so there is an $x \neq 0$ such that $x'DA = 0$, i.e. $d_j a_{jj} x_j + \sum_{i \neq j} d_i a_{ij} x_i = 0$ for all $j = 1, 2, \ldots, n$. In absolute-value terms, using the triangle inequality, this implies for all $j$:

$$d_j |a_{jj}| \, |x_j| = \Big| \sum_{i \neq j} d_i a_{ij} x_i \Big| \leq \sum_{i \neq j} d_i |a_{ij}| \, |x_i|$$

Let $J$ be an index set such that if $j \in J$, then $|x_j| \geq |x_i|$ for all $i = 1, \ldots, n$; since $x \neq 0$, $|x_j| > 0$ for $j \in J$. Then, for $j \in J$:

$$d_j |a_{jj}| \, |x_j| \leq \sum_{i \neq j} d_i |a_{ij}| \, |x_i| \leq \sum_{i \neq j} d_i |a_{ij}| \, |x_j|$$

which, dividing through by $|x_j| > 0$, implies:

$$d_j |a_{jj}| \leq \sum_{i \neq j} d_i |a_{ij}|$$

for all $j \in J$, which contradicts diagonal dominance.
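A quick numerical confirmation of the nonsingularity result on the dd matrix used above:

```python
import numpy as np

A = np.array([[-4.0, 1.0],
              [2.0, -5.0]])  # column dd with d = (1, 1)
print(np.linalg.det(A))      # 18.0, nonzero as the theorem requires
```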

The following theorem, due to McKenzie (1960), establishes that if A has a negative dominant diagonal, then A is stable:

Theorem: (Sufficiency) If an $n \times n$ matrix $A$ is dominant diagonal and the diagonal is composed of negative elements ($a_{ii} < 0$ for all $i = 1, \ldots, n$), then the real parts of all its eigenvalues are negative, i.e. $A$ is stable.

Proof: (McKenzie, 1960) Suppose that $A$ has an eigenvalue $\lambda = u + iv$ with a non-negative real part, i.e. $u \geq 0$. As $a_{ii} < 0$:

$$|\lambda - a_{ii}| = \sqrt{(u - a_{ii})^2 + v^2} \geq u - a_{ii} \geq |a_{ii}|$$

Since $A$ has a dominant diagonal, there exist $d_j > 0$ ($j = 1, \ldots, n$) such that $d_j |a_{jj}| > \sum_{i \neq j} d_i |a_{ij}|$, thus:

$$d_j |\lambda - a_{jj}| \geq d_j |a_{jj}| > \sum_{i \neq j} d_i |a_{ij}|$$

But the off-diagonal entries of $\lambda I - A$ are $-a_{ij}$, with the same absolute values as those of $A$, so this implies that $\lambda I - A$ has a dominant diagonal as well. By our previous theorem, a dd matrix is non-singular, so it must be that $|\lambda I - A| \neq 0$. But that contradicts the postulate that $\lambda$ is an eigenvalue of $A$. Thus, there is no eigenvalue $\lambda$ with a non-negative real part, i.e. the real parts of all eigenvalues of $A$ are negative ($u_i < 0$ for all $i$), thus $A$ is stable.
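A numerical illustration with a matrix that has a negative dominant diagonal for $d = (1, 1, 1)$ (our example):

```python
import numpy as np

# Each |a_jj| exceeds the sum of the other absolute entries in column j,
# and every diagonal entry is negative.
A = np.array([[-5.0, 1.0, 2.0],
              [1.0, -4.0, 1.0],
              [2.0, 1.0, -6.0]])
print(np.linalg.eigvals(A).real)  # all negative, as the theorem predicts
```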

Finally, note the following corollary:

Corollary: If A has negative diagonal dominance, then it is D-stable.

Proof: We have to prove that $BA$ is stable, where $B$ is a positive diagonal matrix ($b_{ii} > 0$ for all $i$). If $A$ has negative diagonal dominance, then $a_{jj} < 0$ and there are $d_j > 0$ ($j = 1, \ldots, n$) such that $d_j |a_{jj}| > \sum_{i \neq j} d_i |a_{ij}|$. Dividing and multiplying by the relevant $b_{jj} > 0$, we obtain:

$$\frac{d_j}{b_{jj}} |b_{jj} a_{jj}| > \sum_{i \neq j} \frac{d_i}{b_{ii}} |b_{ii} a_{ij}| \quad \text{for all } j = 1, 2, \ldots, n.$$

Since the $ij$th entry of $BA$ is $b_{ii} a_{ij}$, the weights $d_j / b_{jj} > 0$ satisfy diagonal dominance for the matrix $BA$. Finally, as $a_{jj} < 0$ and $b_{jj} > 0$ for all $j$, then $b_{jj} a_{jj} < 0$, so $BA$ has a negative dominant diagonal and thus, by our previous theorem, is stable. Thus, $A$ is D-stable.
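A closing sketch of the corollary, reusing the negative dominant diagonal matrix above and sampling random positive diagonal matrices $B$:

```python
import numpy as np

A = np.array([[-5.0, 1.0, 2.0],
              [1.0, -4.0, 1.0],
              [2.0, 1.0, -6.0]])
rng = np.random.default_rng(1)
for _ in range(5):
    B = np.diag(rng.uniform(0.1, 10.0, size=3))
    assert np.all(np.linalg.eigvals(B @ A).real < 0)  # BA stable each time
print("BA stable for every sampled positive diagonal B")
```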
