## PCA / FA example 4: Davis. Reciprocal basis 4.

(this has nothing to do with the covariance matrix of X; we’re back to the $A^R$ from the SVD of X.)

in the course of computing the reciprocal basis for my cut-down $A^R$

$\left(\begin{array}{ll} 7.48331 & 0. \\ -3.74166 & -2.44949 \\ -3.74166 & 2.44949\end{array}\right)$

i came up with the following matrix:

$\beta = \left(\begin{array}{ll} 0.0445435 & 0 \\ -0.0890871 & -0.204124 \\ -0.0890871 & 0.204124\end{array}\right)$

now, $\beta$ is very well behaved. Its two columns are orthogonal to each other:

$\left(0.0445435,-0.0890871,-0.0890871\right)$ $\left(\begin{array}{l} 0 \\ -0.204124 \\ 0.204124\end{array}\right) = 0$

And each column of $\beta$ is orthogonal to one of the $A^R$ vectors…

$\left(0.0445435,-0.0890871,-0.0890871\right)$ $\left(\begin{array}{l} 0. \\ -2.44949 \\ 2.44949\end{array}\right) = 0$

$\left(0,-0.204124,0.204124\right)$ $\left(\begin{array}{l} 7.48331 \\ -3.74166 \\ -3.74166\end{array}\right) = 0$

and each column of $\beta$ has dot product 1 with the other of the $A^R$ vectors:

$\left(0.0445435,-0.0890871,-0.0890871\right)$ $\left(\begin{array}{l} 7.48331 \\ -3.74166 \\ -3.74166\end{array}\right) = 1$

$\left(0,-0.204124,0.204124\right)$ $\left(\begin{array}{l} 0. \\ -2.44949 \\ 2.44949\end{array}\right) = 1$
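All four of those dot products can be checked at once; this is just numpy applied to the two matrices as printed above, nothing new:

```python
import numpy as np

# The cut-down A^R and the candidate reciprocal basis beta, exactly as
# printed above.
AR = np.array([[ 7.48331,  0.     ],
               [-3.74166, -2.44949],
               [-3.74166,  2.44949]])
beta = np.array([[ 0.0445435,  0.      ],
                 [-0.0890871, -0.204124],
                 [-0.0890871,  0.204124]])

# All four dot products at once: beta^T A^R should be the 2x2 identity,
# and (to five decimals) it is.
print(np.round(beta.T @ AR, 5))
```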

i know perfectly well that i could have simply said that $\beta^T\ A^R$ is diagonal, in fact that $\beta^T\ A^R = I$, for my cut-down $A^R$.

I wanted to write out the orthogonality relations for $\beta$ explicitly. you see, there’s just one tiny problem: i know that $\beta$ is wrong. It is not the reciprocal basis for $A^R$.

What could be wrong? how can it be so well-behaved but wrong? on a hunch, i computed the normals to the planes spanned by the columns of B, by the columns of $\beta$, and by the (first two) columns of $A^R$. in 3D we can just take the cross product of two vectors to get a normal to the plane they span.

The cross product of the first two columns of $A^R$ is:

$\{-18.3303,-18.3303,-18.3303\}$

That’s a normal to the plane spanned by the first two columns of $A^R$. it’s also (anti-)parallel to the third column of v:

$\{0.57735,0.57735,0.57735\}$.

Of course: the eigenvector basis described by v is orthogonal, so the third column is normal to the plane spanned by the first two.

Similarly, the cross product of the first two columns of B is:

$\{-0.0181848,-0.0181848,-0.0181848\}$.

(how convenient that the normal vectors have all components equal!) that, too, is (anti-)parallel to the third column of v, so it’s also normal to the plane spanned by the first two columns of $A^R$.

and the cross product of the columns of $\beta$?

$\{-0.0363696,-0.00909241,-0.00909241\}$.

well, shiver my timbers. that is not parallel to the other two cross products, hence it is not normal to the plane spanned by the first two columns of $A^R$.

In contrast to the two vectors of B, the two vectors of $\beta$ do not lie in the same plane as the $A^R$ vectors. They do not span the same plane. They are not a reciprocal basis for the same plane.
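The cross-product comparison is easy to reproduce, again using the matrices as printed above:

```python
import numpy as np

# Columns of A^R and of the candidate beta, as printed above.
AR = np.array([[ 7.48331,  0.     ],
               [-3.74166, -2.44949],
               [-3.74166,  2.44949]])
beta = np.array([[ 0.0445435,  0.      ],
                 [-0.0890871, -0.204124],
                 [-0.0890871,  0.204124]])

n_AR   = np.cross(AR[:, 0], AR[:, 1])      # normal to the A^R plane
n_beta = np.cross(beta[:, 0], beta[:, 1])  # normal to the beta plane

print(n_AR)    # all three components equal, i.e. parallel to (1,1,1)
print(n_beta)  # components not all equal: beta spans a different plane
```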

$\beta$ is a very nice 2D basis. it just happens to span a plane other than the one spanned by $A^R$: its plane is not normal to the third column of v.

It isn’t sufficient to require $\beta^T\ A^R = I$; we also need to require that the columns of $\beta$ span the same space – in this case, the same plane – as the columns of $A^R$!

And how would we realize that our alleged reciprocal basis $\beta$ wasn’t one? well, the projections of X onto the reciprocal basis should reproduce $S^R$; if they don’t, something’s wrong.
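That failure can be sketched numerically. The scores below are hypothetical stand-ins (not the Davis data), and the sketch assumes the factorization $X = S^R\,(A^R)^T$ plus a small component of X along v3, the normal to the $A^R$ plane; note that if X lay exactly in that plane, even the wrong $\beta$ would reproduce $S^R$, so it is the out-of-plane part that exposes it:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins (not the Davis data): scores S_R, the cut-down
# A_R, and data X = S_R A_R^T plus a small component along v3.
S_R = rng.standard_normal((5, 2))
A_R = np.array([[ 7.48331,  0.     ],
                [-3.74166, -2.44949],
                [-3.74166,  2.44949]])
v3 = np.full(3, 1 / np.sqrt(3))
X = S_R @ A_R.T + 0.1 * rng.standard_normal((5, 1)) * v3

# A genuine reciprocal basis: the transposed pseudo-inverse of A_R; its
# columns lie in the column space of A_R, hence orthogonal to v3.
recip = np.linalg.pinv(A_R).T

beta = np.array([[ 0.0445435,  0.      ],
                 [-0.0890871, -0.204124],
                 [-0.0890871,  0.204124]])

print(np.allclose(X @ recip, S_R))  # True: projections reproduce S_R
print(np.allclose(X @ beta,  S_R))  # False: beta picks up the v3 part
```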

(guess what? now you know how i discovered that $\beta$ was wrong.)

i would hate weighted eigenvector matrices when there are eigenvalues equal to zero, except that i think i can handle them: replace the zero eigenvalues by 1s, take the inverse transpose, and then take a sub-basis (i.e. keep the appropriate number of columns).
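That recipe can be sketched as code. The eigenvector matrix v and the weights here are reconstructed from the cut-down $A^R$ above (its normalized columns and their norms), with v3 = (1,1,1)/√3 and a zero appended for the missing third weight; treat this as an illustration of the procedure, not the document’s actual computation:

```python
import numpy as np

# Reconstructed from the cut-down A^R: normalizing its columns gives
# v1 and v2, the column norms sqrt(84) and sqrt(12) are the nonzero
# weights, v3 = (1,1,1)/sqrt(3), and the third weight is zero.
v = np.column_stack([
    np.array([2, -1, -1]) / np.sqrt(6),
    np.array([0, -1,  1]) / np.sqrt(2),
    np.array([1,  1,  1]) / np.sqrt(3),
])
w = np.array([np.sqrt(84), np.sqrt(12), 0.0])

# The recipe: replace zero weights by 1, weight the eigenvectors, take
# the inverse transpose, then keep a sub-basis (the first 2 columns).
w_safe = np.where(w == 0, 1.0, w)
recip = np.linalg.inv(v @ np.diag(w_safe)).T[:, :2]

print(np.round(recip, 6))
```

The resulting columns satisfy $\beta^T\ A^R = I$ *and* are orthogonal to v3, i.e. they lie in the plane of $A^R$.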

My own list of things to do now includes constructing the reciprocal basis in precisely that fashion. i choose to construct it because it corresponds to $S^R$, and lets me compute the data values for the $A^R$ basis.