## References for generalized eigenvectors, the matrix exponential, and the SN decomposition

One correction, 30 Aug 2008, per Brian Hall’s comment below; see “edit” in this post.

First, let’s talk about books which are already in the bibliography. It’s pretty easy to find statements that for matrix exponentials,

exp(A+B) = exp(A) exp(B)

if A and B commute; edit: it’s a little rarer to find the full (and true) “if and only if” A and B commute.

Correct is:

$e^{(A+B)t} = e^{At}\,e^{Bt}\ \text{for all } t \text{ if and only if } AB = BA$

(Golub & van Loan, “Matrix Computations”, 2nd ed. p. 559)

end edit
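As a numerical sanity check of that statement (a sketch using `scipy.linalg.expm`; the matrices are my own toy choices, not from Golub & van Loan), a commuting pair factors for every t, while a non-commuting pair already fails at t = 1:

```python
import numpy as np
from scipy.linalg import expm

# Commuting pair: upper triangular with equal diagonals
A = np.array([[1.0, 2.0], [0.0, 1.0]])
B = np.array([[1.0, 3.0], [0.0, 1.0]])
assert np.allclose(A @ B, B @ A)

for t in (0.5, 1.0, 2.0):
    # e^{(A+B)t} = e^{At} e^{Bt} holds for every t when AB = BA
    assert np.allclose(expm((A + B) * t), expm(A * t) @ expm(B * t))

# Non-commuting pair: the factorization fails already at t = 1
C = np.array([[0.0, 1.0], [0.0, 0.0]])
D = np.array([[0.0, 0.0], [1.0, 0.0]])
assert not np.allclose(C @ D, D @ C)
assert not np.allclose(expm(C + D), expm(C) @ expm(D))
```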

What’s very rare is the SN decomposition,

A = N + S,

where N is nilpotent, S is diagonable, and N and S commute.
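Here is a concrete instance (my own toy example, not taken from any of the books below). For a triangular matrix whose equal eigenvalues are grouped into blocks, peeling off the diagonal gives S, and what remains is the nilpotent N; in general you would have to go through the Jordan form, since for an arbitrary triangular matrix the diagonal and the remainder need not commute.

```python
import numpy as np
from scipy.linalg import expm

# A 3x3 matrix already in Jordan-like form (equal eigenvalues grouped)
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 3.0]])

S = np.diag(np.diag(A))   # the diagonal (hence diagonable) part
N = A - S                 # the remainder

assert np.allclose(N @ N, 0)      # N is nilpotent (here N^2 = 0)
assert np.allclose(S @ N, N @ S)  # S and N commute
assert np.allclose(A, S + N)

# The payoff: since S and N commute, exp(A) = exp(S) exp(N),
# and exp(N) is the finite sum I + N because N^2 = 0.
assert np.allclose(expm(A), expm(S) @ (np.eye(3) + N))
```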

Halmos’ “Finite Dimensional Vector Spaces” has very little about the matrix exponential, but it includes existence and uniqueness of the SN decomposition as an exercise! More generally, Halmos has three consecutive sections on triangular form, nilpotence, and Jordan form, i.e. on the underlying nature of eigenspaces.

Strang’s “Linear Algebra and Its Applications” has an appendix on Jordan form, but I don’t believe he mentions generalized eigenvectors. On the other hand, Strang illustrates the use of the matrix exponential for solving systems of linear differential equations.

Stewart’s “Introduction to Matrix Computations” has a different definition of generalized eigenvectors.

I’ll be looking in all three of those to finally make sense of generalized eigenvectors. (It would be so easy if repeated eigenvalues always led to generalized eigenvectors, but they don’t; and I will plead that it all looks more reasonable after I know that generalized eigenvectors exist: I was long out of graduate school before I heard of them.)
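To make that concrete (a sketch of my own): the matrix 2I has eigenvalue 2 repeated, yet its eigenspace is all of the plane, so no generalized eigenvectors are needed. It is the defective case, where the eigenspace is too small, that forces them on us; there a generalized eigenvector v solves (A − λI)v = w for an ordinary eigenvector w.

```python
import numpy as np

# Repeated eigenvalue, but NOT defective: 2I has eigenvalue 2 twice,
# and its eigenspace is all of R^2 -- no generalized eigenvectors needed.
full = 2.0 * np.eye(2)
assert np.linalg.matrix_rank(full - 2.0 * np.eye(2)) == 0

# Defective case: eigenvalue 2 twice, but only a 1-dimensional eigenspace.
A = np.array([[2.0, 1.0],
              [0.0, 2.0]])
M = A - 2.0 * np.eye(2)
assert np.linalg.matrix_rank(M) == 1   # only one independent eigenvector

w = np.array([1.0, 0.0])               # the ordinary eigenvector: M w = 0
assert np.allclose(M @ w, 0)

# A generalized eigenvector v solves (A - 2I) v = w; here v = (0, 1) works.
v = np.array([0.0, 1.0])
assert np.allclose(M @ v, w)
assert np.allclose(M @ (M @ v), 0)     # (A - 2I)^2 v = 0
```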

In addition to linear algebra texts, we may find the matrix exponential in dynamical systems texts and in Lie algebra / Lie group texts.

## The following books have been added to the bibliography.

The classic text by Hirsch & Smale, “Differential Equations, Dynamical Systems, and Linear Algebra”, has the SN decomposition. This is one of “the” books. I’ll confess that their examples were a lot easier to follow than their theorems. Unfortunately, it’s out of print, so you need to seek a used copy; or go with Perko, below.

Be careful. There is a 2nd edition, with an additional author (i.e. Hirsch, Smale, Devaney) and a changed title: “Differential Equations, Dynamical Systems, & An Introduction to Chaos, Second Edition.” The back cover advertises “Simplified treatment of linear algebra”. It does not contain the SN decomposition.

Fortunately, there is another book on dynamical systems which has the SN decomposition, and it’s a book I like a lot: Perko’s “Differential Equations and Dynamical Systems”. In addition to having the SN decomposition, this has plenty of problems involving generalized eigenvectors, and I recommend it specifically for that, too.

Finally, one of my introductory texts on Lie stuff, Hall’s “Lie Groups, Lie Algebras, and Representations”, also has the SN decomposition. Any book on matrix Lie groups has to have the matrix exponential, but this is the only one in which I have found the SN decomposition.

Hall, Brian C. Lie Groups, Lie Algebras, and Representations: An Elementary Introduction. Springer, 2003.
ISBN 0 387 40122 9.
[lie groups, lie algebras, representations; 25 May 2008]
This is a math book (for this subject, there are a lot of physics books). It is readable: it discusses its theorems and definitions. It is primarily about matrix groups rather than Lie groups in general, but it has Lie groups in the appendices. As its title promises, it treats Lie algebras and representations. This is one of my favorite introductions to the subject.

Hirsch, Morris W., Smale, Stephen, and Devaney, Robert L. Differential Equations, Dynamical Systems & An Introduction to Chaos. Elsevier Academic Press, 2004 (2nd ed).
ISBN 0 12 349703 5.
[applied linear algebra, differential equations, dynamical systems, discrete systems, chaos; 25 May 2008]
This is a far different book from the 1st edition. It looks like a rather fine introduction to the material of the first edition, with a good supply of examples.

Hirsch, Morris W., Smale, Stephen. Differential Equations, Dynamical Systems, and Linear Algebra, Academic Press, 1974.
ISBN 0 12 349550 4.
[applied linear algebra, differential equations, dynamical systems; 25 May 2008]
This is one of “the” books, even if it is out of print! (And Smale is a Fields medallist.) And yet, I find their examples a lot more informative than their theorems.

Perko, Lawrence. Differential Equations and Dynamical Systems. Springer, 2001 (3rd ed).
ISBN 0 387 95116 4
[applied linear algebra, differential equations, dynamical systems; 25 May 2008]
I really like this book: it’s readable, it’s got lots of examples and pictures, and it covers a lot of ground. It is an upper division / beginning graduate math book.


### 6 Responses to “books added – 25 May”

1. Brian Hall Says:

Rip, thanks for the comments about my book. I have a comment about the “if and only if” in the result about exponentials of sums of commuting matrices. If exp(tA+tB) is equal to exp(tA)exp(tB) for all real numbers t, then indeed A and B must commute. But if we merely know that exp(A+B)=exp(A)exp(B), then I don’t think this is sufficient to conclude that A and B commute. One can produce examples where exp(A), exp(B), and exp(A+B) are all equal to the Identity matrix. As an example, try A={{6pi i, 0},{0,-6pi i}} and B={{0,8pi},{-8pi,0}}. To compute the exponentials, use the formula for the exponential of a 2×2 matrix with trace zero—or ask Mathematica. And you may compute that the two matrices do NOT commute!
Cheers, Brian

• Anonymous Says:

B should be {{0, 8pi} , {8pi , 0}}

• rip Says:

Actually… no. The counterexample works. But thank you very much for your skepticism. I should have checked this a long time ago – but it seems to be right.

Here are A and B and A+B:

$A = \left(\begin{array}{cc} 6 i \pi & 0 \\ 0 & -6 i \pi \end{array}\right)$

$B = \left(\begin{array}{cc} 0 & 8 \pi \\ -8 \pi & 0\end{array}\right)$

$A + B = \left(\begin{array}{cc} 6 i \pi & 8 \pi \\ -8 \pi & -6 i \pi \end{array}\right)$

Here are the matrix exponentials, all equal to the identity (so we have exp(A+B) = exp(A) exp(B)):

$MatrixExp[A] = \left(\begin{array}{cc} 1 & 0 \\ 0 & 1\end{array}\right)$

$MatrixExp[B] = \left(\begin{array}{cc} 1 & 0 \\ 0 & 1\end{array}\right)$

$MatrixExp[A+B] = \left(\begin{array}{cc} 1 & 0 \\ 0 & 1\end{array}\right)$

And here are the products AB and BA, which are not equal:

$A.B = \left(\begin{array}{cc} 0 & 48 i \pi ^2 \\ 48 i \pi ^2 & 0\end{array}\right)$

$B.A = \left(\begin{array}{cc} 0 & -48 i \pi ^2 \\ -48 i \pi ^2 & 0\end{array}\right)$
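For anyone who wants to reproduce this without Mathematica, here is a quick numerical check with `scipy.linalg.expm` (complex matrices are fine there):

```python
import numpy as np
from scipy.linalg import expm

pi = np.pi
A = np.array([[6j * pi, 0], [0, -6j * pi]])
B = np.array([[0, 8 * pi], [-8 * pi, 0]], dtype=complex)

I = np.eye(2)
# All three exponentials are the identity...
assert np.allclose(expm(A), I)
assert np.allclose(expm(B), I)
assert np.allclose(expm(A + B), I)
# ...so exp(A+B) = exp(A) exp(B), and yet A and B do NOT commute.
assert not np.allclose(A @ B, B @ A)
```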

2. rip Says:

hi Brian,
i’m delighted to see your comment. I don’t doubt your counterexample, and I’m grateful for it. Unfortunately I’m in the middle of the workweek, and about all I have time for right now is, “Thank you. I’ll get to this as soon as I can.”

vale,
rip

3. rip Says:

Hi all,
I have corrected this post and the Matrix Exponential 1 of 3 in linear algebra, per Brian Hall’s comment and counterexample. I think those are the only two places where I asserted the “if and only if” and I got it wrong in both places. (At least I was consistent.)

It was just plain careless reading on my part: the only place I saw it – as far as I can tell in retrospect – seems to be Golub and van Loan, and they had it right. What was I thinking?

I’m sorry about that, but I doubt that it will be the last of my mistakes. Don’t you mistake my posts for gospel; read them critically.

Vale,
Rip