## PCA / FA example 4: Davis. Scores & reciprocal basis 1.

The reciprocal basis for $A^R$, using matrix multiplication.

i keep emphasizing that $A^Q$ is the new data wrt the orthogonal eigenvector matrix v; and i hope i’ve made it clear that the R-mode scores $S^R = X\ A^R$ are not the new data wrt the weighted eigenvector matrix $A^R$. In a very real sense, the R-mode scores $S^R$ do not correspond to the R-mode loadings $A^R$. This is important enough that i want to work out the linear algebra. In fact, i’ll work it out thrice. (And i’m going to do it differently from the original demonstration that $A^Q$ is the new data wrt v.)

Of course, people will expect us to compute the scores $S^R$, and I’ll be perfectly happy to. I just won’t let anyone tell me they correspond to $A^R$.

First off, we have been viewing the original data as vectors wrt an orthonormal basis. Recall the design matrix: $\left(\begin{array}{lll} -6 & 3 & 3 \\ 2 & 1 & -3 \\ 0 & -1 & 1 \\ 4 & -3 & -1\end{array}\right)$

Let’s take just the second observation: $X_2 = (2,\ 1,\ -3)$

Those are the components of a vector.

Second, we have an orthonormal basis defined by the orthogonal eigenvector matrix v. Recall $A^Q$: $\left(\begin{array}{lll} -7.34847 & 0. & 0. \\ 2.44949 & -2.82843 & 0. \\ 0. & 1.41421 & 0. \\ 4.89898 & 1.41421 & 0.\end{array}\right)$

For the second row, in particular, we have the new components… $A^{Q}_2 = (2.44949,\ -2.82843,\ 0.)$

We’ve already seen that the old and new components ($X_2$ and $A^{Q}_2$, respectively) are related by $X_2 = v\ A^{Q}_2$,

although we wrote the whole thing as $A^Q = X\ v$. Yes? We confirm it simply by computing… $v\ A^{Q}_2 = \left(\begin{array}{lll} 0.816497&0&0.57735\\ -0.408248&-0.707107&0.57735\\ -0.408248&0.707107&0.57735\end{array}\right) \times \left(\begin{array}{l} 2.44949\\ -2.82843\\ 0.\end{array}\right)$

= $\left(\begin{array}{l} 2.\\ 1.\\ -3.\end{array}\right)$

which is $X_2$.

The second row of $A^Q$ represents the same vector as the second row of X, but the components are different because the basis vectors are different.
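That check is quick to script. Here’s a minimal numpy sketch (the matrix values are copied from the text; the variable names are mine, not the original notation):

```python
import numpy as np

# Davis design matrix: the centered data from the text
X = np.array([[-6.,  3.,  3.],
              [ 2.,  1., -3.],
              [ 0., -1.,  1.],
              [ 4., -3., -1.]])

# orthogonal eigenvector matrix v (columns are unit eigenvectors)
v = np.array([[ 0.816497,  0.      ,  0.57735],
              [-0.408248, -0.707107,  0.57735],
              [-0.408248,  0.707107,  0.57735]])

# new components of the data wrt the basis v
AQ = X @ v

# second observation: old components recovered from new ones, X_2 = v AQ_2
x2 = v @ AQ[1]
print(np.round(x2, 4))   # recovers (2, 1, -3)
```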

All that was review, albeit from a slightly different point of view. Let’s move on to the weighted eigenvector matrix $A^R$; it is not an orthonormal basis because the basis vectors are not of length 1, although they are orthogonal to each other. Here’s my $A^R$ from the SVD: $\left(\begin{array}{llll} 7.48331&0.&0.&0.\\ -3.74166&-2.44949&0.&0.\\ -3.74166&2.44949&0.&0.\end{array}\right)$

i just want the first two columns. Here’s my $A^R$ cut down… $\left(\begin{array}{ll} 7.48331&0.\\ -3.74166&-2.44949\\ -3.74166&2.44949\end{array}\right)$

and my $S^R$ cut down: $\left(\begin{array}{ll} -67.3498&0.\\ 22.4499&-9.79796\\ 0.&4.89898\\ 44.8999&4.89898\end{array}\right)$
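Both cut-down matrices can be reproduced in a few lines. A sketch, assuming numpy and using the singular values from the text (9.16515 and 3.4641, the square roots of the eigenvalues of $X^T X$):

```python
import numpy as np

X = np.array([[-6.,  3.,  3.],
              [ 2.,  1., -3.],
              [ 0., -1.,  1.],
              [ 4., -3., -1.]])

v = np.array([[ 0.816497,  0.      ,  0.57735],
              [-0.408248, -0.707107,  0.57735],
              [-0.408248,  0.707107,  0.57735]])

# the two nonzero singular values of X (square roots of the eigenvalues)
sv = np.array([9.16515, 3.4641])

# weighted eigenvector matrix, cut down: scale each kept column of v up by its singular value
AR = v[:, :2] * sv

# R-mode scores: projections of the data onto the columns of A^R
SR = X @ AR
print(np.round(SR, 3))
```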

I’ve been implying, if not saying, that the projections of X onto $A^R$, namely $X\ A^R$ (which are the R-mode scores $S^R$), are the components of the data wrt a basis other than $A^R$. It’s what may best be called the reciprocal basis.

(if you haven’t looked at “the reciprocal basis” in https://rip94550.wordpress.com/2008/03/21/transpose-matrix-adjoint-operator-2/, you might want to.)

Briefly, a reciprocal basis is constructed to make the dot product of vectors work out right when we have a non-orthonormal basis. What we do is get the components of one vector wrt the basis, the other vector wrt the reciprocal basis, and then we can compute their dot product in the usual way, as the sum of products of corresponding entries; i.e. as a row vector times a column vector.

If we have a transition matrix A, then the transition matrix B for the reciprocal basis is $B = A^{-T}$. (Because it gives us $B^T\ A = A^{-1}\ A = I$, which says that the dot product of a row of $B^T$ with a column of A is 1 for the corresponding pair and 0 otherwise. It’s equivalent to saying that the dot product of a column of B with a column of A is 1 or 0 in the same pattern; either way, each column of B is orthogonal to every non-corresponding column of A.)
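For an invertible transition matrix the dot-product property is easy to check numerically. A small sketch, with an arbitrary invertible A of my own choosing (not from the Davis example):

```python
import numpy as np

A = np.array([[2., 1.],
              [0., 1.]])        # an arbitrary invertible transition matrix (my example)
B = np.linalg.inv(A).T          # transition matrix for the reciprocal basis, B = A^{-T}

# B^T A = A^{-1} A = I
assert np.allclose(B.T @ A, np.eye(2))

u = np.array([3., -1.])
w = np.array([2., 5.])

a = np.linalg.solve(A, u)   # components of u wrt the basis A (new = A^{-1} old)
b = A.T @ w                 # components of w wrt the reciprocal basis (B^{-1} = A^T)

# the ordinary sum-of-products of the mixed components gives the true dot product
print(np.dot(a, b), np.dot(u, w))   # both equal 1.0
```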

We have a problem, however: our matrix $A^R$ is not invertible (it isn’t even square). Another damned dragon to cope with.

We could look for a matrix B such that $B^T\ A^R = I$. Unfortunately, such matrices are not unique.

Let’s construct the reciprocal basis geometrically. As $A^R$ was constructed from v by multiplying its columns by $\sqrt{\text{eigenvalue}}$, we will construct B by dividing the columns of v by $\sqrt{\text{eigenvalue}}$, but we’ll just do it for the first two columns. The quickest way to accomplish that is to post-multiply by a diagonal matrix whose entries are $\frac{1}{\sqrt{\text{eigenvalue}}}$ when possible, and 1 otherwise. The diagonal matrix of square roots of the eigenvalues (i.e. the singular values) is… $\left(\begin{array}{lll} 9.16515&0.&0.\\ 0.&3.4641&0.\\ 0.&0.&0.\end{array}\right)$

So i form a new diagonal matrix $\left(\begin{array}{lll} 0.109109&0.&0.\\ 0.&0.288675&0.\\ 0.&0.&1.\end{array}\right)$

Note the 1 instead of 0 in the (3, 3) slot.

Recall v… $\left(\begin{array}{lll} 0.816497&0&0.57735\\ -0.408248&-0.707107&0.57735\\ -0.408248&0.707107&0.57735\end{array}\right)$

and compute the product of v and the new diagonal matrix $\left(\begin{array}{lll} 0.816497&0&0.57735\\ -0.408248&-0.707107&0.57735\\ -0.408248&0.707107&0.57735\end{array}\right) \times \left(\begin{array}{lll} 0.109109&0.&0.\\ 0.&0.288675&0.\\ 0.&0.&1.\end{array}\right)$

= $\left(\begin{array}{lll} 0.0890871&0.&0.57735\\ -0.0445435&-0.204124&0.57735\\ -0.0445435&0.204124&0.57735\end{array}\right)$

and since i only want the first two columns, i cut that down to $B = \left(\begin{array}{ll} 0.0890871&0.\\ -0.0445435&-0.204124\\ -0.0445435&0.204124\end{array}\right)$

and call it B.

It is true that $B^T\ A^R = I$. (So each column of B has dot product 1 with the corresponding column of $A^R$, and is orthogonal to the other column.)
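That identity is worth confirming numerically. A sketch with the same numbers as above (variable names mine):

```python
import numpy as np

v = np.array([[ 0.816497,  0.      ,  0.57735],
              [-0.408248, -0.707107,  0.57735],
              [-0.408248,  0.707107,  0.57735]])
sv = np.array([9.16515, 3.4641])   # the two nonzero singular values

AR = v[:, :2] * sv   # columns of v scaled up by the singular values
B  = v[:, :2] / sv   # columns of v scaled down: the reciprocal basis

# B^T A^R is the 2x2 identity (to rounding):
# corresponding columns dot to 1, the cross pair to 0
print(np.round(B.T @ AR, 4))
```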

Now, B is the transition matrix for the reciprocal basis of $A^R$. $S^R$ was defined as $X\ A^R$, the projections of the data onto the $A^R$ basis. i claim that the rows of $S^R$ are the components of the data wrt B, the reciprocal basis for $A^R$.

i want to confirm this just for the second data vector, namely that $X_2 = B\ S^{R}_2$,

i.e. the old components are found by applying the transition matrix B to the new components. The RHS is $B\ S^{R}_2 = \left(\begin{array}{ll} 0.0890871&0.\\ -0.0445435&-0.204124\\ -0.0445435&0.204124\end{array}\right) \times \left(\begin{array}{l} 22.4499\\ -9.79796\end{array}\right)$

= $\left(\begin{array}{l} 2\\ 1\\ -3\end{array}\right)$

and, once again, we have found the old components of the second observation. More importantly, we have exhibited a basis $B \ne A^R$ wrt which the R-mode scores are the new components of the data.
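The same check can be run for every observation at once: since each row of X has zero component along the third eigenvector (the third column of $A^Q$ is all zeros), the whole data matrix is recovered as $X = S^R\,B^T$. A sketch, pulling the pieces together:

```python
import numpy as np

X = np.array([[-6.,  3.,  3.],
              [ 2.,  1., -3.],
              [ 0., -1.,  1.],
              [ 4., -3., -1.]])
v = np.array([[ 0.816497,  0.      ,  0.57735],
              [-0.408248, -0.707107,  0.57735],
              [-0.408248,  0.707107,  0.57735]])
sv = np.array([9.16515, 3.4641])

AR = v[:, :2] * sv   # weighted eigenvector matrix (cut down)
B  = v[:, :2] / sv   # its reciprocal basis
SR = X @ AR          # R-mode scores

# second observation: old components from the scores, X_2 = B SR_2
print(np.round(B @ SR[1], 4))              # recovers (2, 1, -3)

# all observations at once: the scores are the components wrt B
print(np.allclose(SR @ B.T, X, atol=1e-3))   # True
```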