Two mathematical events during this past week were deeply satisfying.

One of them occurred Tuesday morning as I was waking up. I do not use an alarm clock, so I wake up gradually. I was… mulling over is a good verb for it… mulling over Monday’s post. By the time I was fully awake, I knew that I had made one mistake in terminology, and that I wanted to point out the distinction between a vector norm and a matrix norm.

This is a far cry from waking up knowing new mathematics (as Ramanujan did)… and, were I perfect, I would never need to wake up knowing I had made a mathematical mistake. But I’m not perfect, and I do make mistakes… and I was glad that my subconscious had pointed it out to my conscious mind.

So, although I did not deliberately set out to “sleep on” Monday’s post, I did in fact do so and it was worthwhile. I made an edit and added two comments to the post on Tuesday morning before work.

The other event occurred Wednesday evening: I now know how the eigenstructure table is computed in Mathematica®.

I had asked on the Mathematica newsgroup, for the second time, how it was computed. A week had gone by with no answers at all. Wednesday morning I found an e-mail from a Wolfram employee.

Perhaps my question wasn’t sufficiently clear. He addressed my concerns about the documentation, but did not explain the computation.

But he did invite me to send him a Mathematica notebook. And so I worked on one Wednesday evening after work.

I finished it at 8:30 PM. I thought it was good enough to send, but I thought I would sleep on it. After all, I had just had a reminder of how good it could be to let something simmer overnight.

Anyway, at 8:30 PM I flopped into the recliner in front of my TV… I was free to unwind and “Criminal Minds” would start in half an hour.

10 minutes later it suddenly occurred to me that I needed to apply this algorithm to that matrix. It might not work out, but I had to try it.

It did in fact work out. I had gotten Mathematica’s eigenstructure table for the Hald data. I love it when I do that! And it was just before 9 PM, so I had not even missed the beginning of my show.

Can you guess what this Monday’s post will be? (Life permitting, of course.)

It turns out that I do not yet understand the calculation. Yes, I know exactly how to compute it – but I do not know why.

Interestingly, the calculation reminds me of a couple of things I saw when we were doing PCA (principal component analysis): one, we saw in Bartholomew et al. that computing scores involved dividing by, rather than multiplying by, the square roots of the eigenvalues; two, we saw Basilevsky normalize the rows, instead of the columns, of an eigenvector matrix.

Well, this calculation divides by the eigenvalues (rather than by their square roots, but it divides), and it normalizes the rows of an eigenvector matrix.
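Just to make those two operations concrete, here is a minimal sketch in NumPy (not Mathematica, and emphatically not the actual eigenstructure-table algorithm — I don't yet know what that is). The toy matrix, the choice to square the eigenvector entries, and the order of the steps are all my own assumptions; the sketch only illustrates "divide by the eigenvalues" and "normalize the rows":

```python
import numpy as np

# Hypothetical toy design matrix -- purely illustrative, not the Hald data.
X = np.array([[4.0, 2.0, 0.60],
              [4.2, 2.1, 0.59],
              [3.9, 2.0, 0.58],
              [4.3, 2.1, 0.62],
              [4.1, 2.2, 0.63]])

# Eigendecomposition of X'X; columns of V are unit eigenvectors.
eigvals, V = np.linalg.eigh(X.T @ X)

# "Divides by the eigenvalues": scale the squared eigenvector
# entries by 1/eigenvalue, column by column (not by square roots).
phi = V**2 / eigvals

# "Normalizes the rows": rescale each row so it sums to 1.
table = phi / phi.sum(axis=1, keepdims=True)

print(table)  # each row sums to 1
```

The interesting contrast with PCA is right there in the code: scores would multiply or divide by *square roots* of eigenvalues, while this divides by the eigenvalues themselves, and the normalization runs across rows rather than down columns.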

Let me see if I can figure out what’s going on. But I will save that for another post… Monday’s post will already have two examples in it, and I think that will be sufficient for a day’s work.

One of my alter-egos has done his bit this morning – the kid felt like looking at vibration in structures. The grad student will bend his nose to the grindstone of abstract algebra after I get this post out. Then I think I’ll try to get Monday’s post written – after all, the notebook I was going to send to Wolfram about the eigenstructure table puts the post at the beginning of what I call Stage IV: it contains precisely the mathematics for the post, no more and no less… so I just have to surround the math with words and then turn the result into a post.

Hmm. The kid does whatever he wants… the grad student slaves away at homework… the teacher writes posts… poor Rip tries to math whenever they leave him some time… and now my subconscious has been throwing in his two cents. Well, it seems to work.

And yes, I deliberately left a typo in the previous paragraph, using “to math” as a verb.
