(This has nothing to do with the covariance matrix of X; we’re back to the SVD of X.)
In the course of computing the reciprocal basis for my cut-down X,
I came up with the following matrix:
Now, this matrix is very well behaved. Its two columns are orthogonal to each other:
And each column of it is orthogonal to one of the basis vectors…
and each column has dot product 1 with the other of those vectors:
I know perfectly well that I could have simply said that the relevant product is diagonal, for my cut-down X.
I wanted to write out the orthogonality relations explicitly. You see, there’s just one tiny problem: I know that this matrix is wrong. It is not the reciprocal basis.
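Those dot-product relations can be sketched in a few lines of NumPy. The matrix `B` below is a made-up 3×2 basis, not the one from this post, and `R` is a candidate reciprocal basis built from the pseudoinverse:

```python
import numpy as np

# Hypothetical 3x2 basis: two vectors in 3D (not the matrix from the post).
B = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])

# One candidate reciprocal basis: the transpose of the pseudoinverse.
# It satisfies R.T @ B = I, i.e. each column of R is orthogonal to one
# column of B and has dot product 1 with the other.
R = np.linalg.pinv(B).T
print(np.allclose(R.T @ B, np.eye(2)))   # True
```

As we’re about to see, passing this check alone is not enough to make `R` *the* reciprocal basis.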
What could be wrong? How can it be so well-behaved but wrong? On a hunch, I computed the normals to the planes spanned by the columns of B, by the columns of the alleged reciprocal basis, and by the (first two) columns of v. In 3D we just use the cross product of two vectors to find their normal.
The cross product of the first two columns of v is:
That’s a normal to the plane spanned by the first two columns of v. It’s also (anti-)parallel to the third column of v:
Of course: the eigenvector basis described by v is orthogonal, so the third one is normal to the plane spanned by the first two.
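That property is easy to confirm numerically for any matrix with orthonormal columns (a random orthogonal `Q` below stands in for v):

```python
import numpy as np

# Q has orthonormal columns, like the eigenvector matrix v.
Q, _ = np.linalg.qr(np.random.default_rng(0).standard_normal((3, 3)))

# Normal to the plane of the first two columns:
c = np.cross(Q[:, 0], Q[:, 1])

# c is (anti-)parallel to the third column, so |c . q3| = 1 for unit vectors.
print(abs(c @ Q[:, 2]))   # 1.0 (up to roundoff)
```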
Similarly, the cross product of the first two columns of B is:
(How convenient that the normal vectors have all components equal!) That, too, is (anti-)parallel to the third column of v, so it’s also normal to the plane spanned by the first two columns of v.
And the cross product of the columns of the alleged reciprocal basis?
Well, shiver my timbers. That is not like the other two cross products, hence not normal to the plane spanned by the first two columns of v.
In contrast to the two vectors of B, the two vectors of the alleged reciprocal basis do not lie in the same plane as the v vectors. They do not span the same plane. They are not a reciprocal basis for the same plane.
It is a very nice 2D basis. It just happens to span a plane other than the one spanned by B. The plane it spans is not normal to the third column of v.
It isn’t sufficient to require the biorthogonality relations (each reciprocal vector orthogonal to one basis vector, with dot product 1 with the other); we also need to require that the columns of the reciprocal basis span the same space – in this case, the same plane – as the columns of B!
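Here’s a sketch of why the extra requirement matters, with a made-up basis `B` (not the post’s matrices): adding multiples of the plane’s normal to the columns of a true reciprocal basis preserves biorthogonality, but breaks the span condition.

```python
import numpy as np

# Hypothetical 3x2 basis (not the post's matrix).
B = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
n = np.cross(B[:, 0], B[:, 1])               # normal to the plane spanned by B

R_good = np.linalg.pinv(B).T                 # reciprocal basis lying in that plane
R_bad = R_good + np.outer(n, [0.3, -0.7])    # shift its columns out of the plane

# Both satisfy the biorthogonality relations...
print(np.allclose(R_good.T @ B, np.eye(2)),
      np.allclose(R_bad.T @ B, np.eye(2)))   # True True

# ...but only R_good spans the same plane as B: the cross product of its
# columns is parallel to n, while R_bad's is not.
print(np.allclose(np.cross(np.cross(R_good[:, 0], R_good[:, 1]), n), 0),
      np.allclose(np.cross(np.cross(R_bad[:, 0], R_bad[:, 1]), n), 0))   # True False
```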
And how would we realize that our alleged reciprocal basis wasn’t one? Well, the projections of X onto the reciprocal basis should be the data values. If they’re not, something’s wrong.
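A sketch of that sanity check, again with hypothetical numbers: `x` plays the role of one observation, with known in-plane coefficients plus a component off the plane of `B`, as real data generally has.

```python
import numpy as np

B = np.array([[1.0, 0.0],          # hypothetical 3x2 basis, not the post's
              [0.0, 1.0],
              [1.0, 1.0]])
n = np.cross(B[:, 0], B[:, 1])     # normal to the plane of B

R_good = np.linalg.pinv(B).T                 # true reciprocal basis
R_bad = R_good + np.outer(n, [0.3, -0.7])    # biorthogonal, but wrong plane

# A data vector with in-plane coefficients [2, -1] plus an off-plane part:
x = B @ np.array([2.0, -1.0]) + 0.5 * n

print(R_good.T @ x)   # recovers [2, -1] (up to roundoff)
print(R_bad.T @ x)    # does not -- that's the giveaway
```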
(Guess what? Now you know how I discovered that my matrix was wrong.)
I would hate weighted eigenvector matrices when there are eigenvalues equal to zero, except that I think I can handle them. This is when I decided to replace zero eigenvalues by 1s, take the inverse transpose, and then take a sub-basis (i.e. keep the appropriate number of columns).
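Here’s a minimal sketch of that recipe, assuming the weighted eigenvector matrix scales the columns of an orthogonal v by the square roots of the eigenvalues (my reading; adjust if the weighting differs). The numbers are made up:

```python
import numpy as np

# Stand-in orthogonal eigenvector matrix and eigenvalues with a zero.
v, _ = np.linalg.qr(np.random.default_rng(1).standard_normal((3, 3)))
lam = np.array([4.0, 1.0, 0.0])

lam_fixed = np.where(lam == 0, 1.0, lam)   # replace zero eigenvalues by 1s
W = v * np.sqrt(lam_fixed)                 # weighted eigenvectors (scaled columns)
R = np.linalg.inv(W).T[:, :2]              # inverse transpose, then sub-basis

# R is biorthogonal to the kept columns of W...
print(np.allclose(R.T @ W[:, :2], np.eye(2)))                          # True
# ...and spans the same plane: the cross product of its columns is
# parallel to v's third column.
print(np.allclose(np.cross(np.cross(R[:, 0], R[:, 1]), v[:, 2]), 0))   # True
```

Replacing the zero eigenvalues by 1 is harmless here because the corresponding columns are discarded when we take the sub-basis; it only serves to keep the full matrix invertible along the way.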
My own list of things to do now includes constructing the reciprocal basis in precisely that fashion. I choose to construct it that way because it corresponds to the cut-down basis, and lets me compute the data values for that basis.