PCA / FA example 4: Davis. review (2)

Edited 16 Aug 2008 to mention the post of 16 Aug 2008; see "edit:" below.

Let's take up the questions about the third set of definitions of the A's and S's. Recall: we computed 4 matrices, A^R, S^R, A^Q, S^Q. Just as importantly, we computed them 3 different ways, and got – strictly speaking – 3 different answers, because the sizes were different. The 3 different ways were the SVD, the full eigendecompositions of X\ X^T and X^T\ X, and Davis' cut-down versions, which used an invertible 2×2 matrix of \sqrt{\text{eigenvalues}}.

For two of these alternatives – the SVD and Davis' cut-down version – I showed that Q-mode and R-mode were related by equations of the form

S^Q = A^R\ \sqrt{\text{eigenvalues}}

and

S^R = A^Q\ \sqrt{\text{eigenvalues}},

where the exact form of \sqrt{\text{eigenvalues}} depends on which of the 2 alternatives we used.

  • Is this [duality] true for the 3rd alternative (the full eigendecompositions)?
  • Should I work it out?
  • Does it affect my recommendation that we use either the SVD or the full eigendecompositions in preference to Davis' cut-down version?
  • We should take a careful look at the relationship between the eigendecompositions and the SVD.

Does duality hold for the 3rd alternative, the full eigendecompositions? (Yes.)

My primary set of definitions comes from the SVD and gives me some very convenient equations:

A^R := v\ w^T.

A^Q := u\ w.

S^R := X\ A^R ( = A^Q\ w^T,\ \text{duality})

S^Q := X^T\ A^Q ( = A^R\ w,\ \text{duality})

The duality between Q-mode and R-mode is clear and easy if we use the SVD. What we lose, compared to the alternatives, is that A^R is not the same size as v, and A^Q is not the same size as u.
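Here's a minimal numpy sketch of these SVD-based definitions and their duality. The matrix X is Davis' centered 4×3 data matrix as I've reconstructed it from the matrices printed later in this post, so treat it as an assumption and check it against your own copy; note also that numpy may pick different signs for some columns of u and v than I printed.

```python
import numpy as np

# Davis' centered 4x3 data matrix, reconstructed from the matrices
# printed in this post -- an assumption, so check it yourself.
X = np.array([[-6.,  3.,  3.],
              [ 2.,  1., -3.],
              [ 0., -1.,  1.],
              [ 4., -3., -1.]])

# full SVD: u is 4x4, v is 3x3, and w is 4x3 with the singular
# values on its rectangular diagonal
u, s, vt = np.linalg.svd(X, full_matrices=True)
v = vt.T
w = np.zeros_like(X)
np.fill_diagonal(w, s)

AR = v @ w.T    # 3x4 -- not the same size as v
AQ = u @ w      # 4x3 -- not the same size as u
SR = X @ AR     # 4x4
SQ = X.T @ AQ   # 3x3

# the duality equations hold regardless of the sign conventions
# the SVD routine picks for the columns of u and v
print(np.allclose(SR, AQ @ w.T))   # True
print(np.allclose(SQ, AR @ w))     # True
```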

Davis' definitions required that we throw away all the columns and rows of zeroes. I believe that this is equivalent to the cut-down SVD, where u and v have orthonormal columns but are not orthogonal matrices. (See the SVD category.) We saw the duality here, too.

For the remaining alternative – the full eigendecompositions, where I had more columns of zeroes than Davis but fewer than the SVD – I had to use the 3×3 and 4×4 \sqrt{\text{eigenvalue}} matrices, i.e. square matrices. (Sorry, I've changed notation because I can't remember the old one: D for diagonal, 3 or 4 for the size.)

D3 = \left(\begin{array}{lll} 9.16515&0.&0.\\ 0.&3.4641&0.\\ 0.&0.&0.\end{array}\right)

D4 = \left(\begin{array}{llll} 9.16515&0.&0.&0.\\ 0.&3.4641&0.&0.\\ 0.&0.&0.&0.\\ 0.&0.&0.&0.\end{array}\right)

Then, for example, A^R is a \sqrt{\text{eigenvalue}}-weighted eigenvector matrix (the weights are D3 because the orthogonal eigenvector matrix v is 3×3):

A^R := v\ D3

S^R := X\ A^R

A^Q := u\ D4

S^Q := X^T\ A^Q.

We can write D3 and D4 in terms of w: they are

D3 = \sqrt{w^T\ w}

D4 = \sqrt{w\ w^T}.
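Here's a sketch of this alternative in numpy, under the same assumed X as before. It builds D3 and D4 from w and cross-checks D3 against the eigendecomposition of X^T\ X.

```python
import numpy as np

X = np.array([[-6., 3., 3.], [2., 1., -3.], [0., -1., 1.], [4., -3., -1.]])
u, s, vt = np.linalg.svd(X, full_matrices=True)
v = vt.T
w = np.zeros_like(X)
np.fill_diagonal(w, s)

# D3 and D4 as defined above; elementwise sqrt is fine here because
# w^T w and w w^T are diagonal with nonnegative entries
D3 = np.sqrt(w.T @ w)   # 3x3: diag(9.16515, 3.4641, 0)
D4 = np.sqrt(w @ w.T)   # 4x4: the same nonzero entries, padded with zeros

AR = v @ D3     # 3x3 -- the same size as v this time
AQ = u @ D4     # 4x4 -- the same size as u
SR = X @ AR     # 4x3
SQ = X.T @ AQ   # 3x4

# cross-check D3 against the eigendecomposition of X^T X: eigh
# returns ascending eigenvalues, so reverse them; clip guards
# against tiny negative round-off where the exact eigenvalue is 0
evals, _ = np.linalg.eigh(X.T @ X)
evals = np.clip(evals[::-1], 0.0, None)
print(np.allclose(np.sqrt(evals), np.diag(D3)))   # True
```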

Should I work it out? I don't think so, but here's the answer:

From there, we could show that duality holds, although the exact form of the equations is different. In fact,

S^R = A^Q\ w\ (\text{instead of }A^Q\ w^T)

S^Q = A^R\ w^T\ (\text{instead of }A^R\ w)

It may seem weird, but it's just a consequence of changing the sizes of A^R and A^Q. All these different forms of w carry the same two nonzero \sqrt{\text{eigenvalues}} in different sizes of matrices. If you want to work this out, bear in mind that diagonal matrices commute; see the sketch below.
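The crucial step is that the rectangular w slides past the square diagonal matrices: w\ D3 = D4\ w. A quick numerical confirmation, using the same assumed X:

```python
import numpy as np

X = np.array([[-6., 3., 3.], [2., 1., -3.], [0., -1., 1.], [4., -3., -1.]])
u, s, vt = np.linalg.svd(X, full_matrices=True)
w = np.zeros_like(X)
np.fill_diagonal(w, s)
D3 = np.sqrt(w.T @ w)
D4 = np.sqrt(w @ w.T)

# w D3 scales the j-th column of w by the j-th sqrt-eigenvalue,
# and D4 w scales the i-th row by the i-th one: the same nonzero
# entries land in the same places
print(np.allclose(w @ D3, D4 @ w))       # True
# transposing gives the form used in the second duality equation
print(np.allclose(D3 @ w.T, w.T @ D4))   # True
```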

Does it affect my recommendation that we use either the SVD or the full eigendecompositions in preference to Davis' cut-down version? No.

Just make sure you have the correct duality equations for whichever of these 3 alternatives you choose to use.

My personal preference is overwhelmingly to use the SVD, but I consider it one of three valid choices. I dislike one of those choices (the cut-down SVD), but that too may be considered just a burr under my saddle.

Let me check the duality for myself (i.e. make sure I wrote the last two duality equations correctly). First, S^R = A^Q\ w:

A^Q\ w = \left(\begin{array}{llll} -7.34847&0.&0.&0.\\ 2.44949&-2.82843&0.&0.\\ 0.&1.41421&0.&0.\\ 4.89898&1.41421&0.&0.\end{array}\right) \times \left(\begin{array}{lll} 9.16515&0.&0.\\ 0.&3.4641&0.\\ 0.&0.&0.\\ 0.&0.&0.\end{array}\right)

= \left(\begin{array}{lll} -67.3498&0.&0.\\ 22.4499&-9.79796&0.\\ 0.&4.89898&0.\\ 44.8999&4.89898&0.\end{array}\right)

S^R = \left(\begin{array}{lll} -67.3498&0&0\\ 22.4499&-9.79796&0\\ 0.&4.89898&0\\ 44.8999&4.89898&0\end{array}\right)

That’s good.

Second, S^Q = A^R\ w^T:

A^R\ w^T = \left(\begin{array}{lll} 7.48331&0.&0.\\ -3.74166&-2.44949&0.\\ -3.74166&2.44949&0.\end{array}\right) \times \left(\begin{array}{llll} 9.16515&0.&0.&0.\\ 0.&3.4641&0.&0.\\ 0.&0.&0.&0.\end{array}\right)

= \left(\begin{array}{llll} 68.5857&0.&0.&0.\\ -34.2929&-8.48528&0.&0.\\ -34.2929&8.48528&0.&0.\end{array}\right)

S^Q = \left(\begin{array}{llll} 68.5857&0&0&0\\ -34.2929&-8.48528&0&0\\ -34.2929&8.48528&0&0\end{array}\right)

And that’s good.
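For what it's worth, here are the same two checks in numpy, with the same assumed X; your SVD routine may flip the signs of some columns relative to the matrices printed above, but the duality equations hold either way.

```python
import numpy as np

X = np.array([[-6., 3., 3.], [2., 1., -3.], [0., -1., 1.], [4., -3., -1.]])
u, s, vt = np.linalg.svd(X, full_matrices=True)
v = vt.T
w = np.zeros_like(X)
np.fill_diagonal(w, s)
D3, D4 = np.sqrt(w.T @ w), np.sqrt(w @ w.T)
AR, AQ = v @ D3, u @ D4
SR, SQ = X @ AR, X.T @ AQ

print(np.allclose(SR, AQ @ w))     # S^R = A^Q w   : True
print(np.allclose(SQ, AR @ w.T))   # S^Q = A^R w^T : True
```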

Does the duality between Q-mode and R-mode depend on having centered data? Certainly not for the SVD. I believe that what's affected is the interpretation of X^T\ X and X\ X^T as being proportional to the covariance matrices \frac{X^T\ X}{N-1} and \frac{X\ X^T}{k-1}.

We should take a careful look at the relationship between the eigendecompositions and the SVD. I suspect I had something specific in mind when I wrote that, but I don't recall it. Let's refresh our memories.

Let's recall where the \sqrt{\text{eigenvalue}} matrices D3 and D4 came from. From the SVD

X = u\ w\ v^T,

we compute

X^T\ X = v\ w^T\ u^T\ u\ w\ v^T = v\ w^T\ w\ v^T = v\ {D_3}^2\ v^T

and

X\ X^T = u\ w\ v^T\ v\ w^T\ u^T = u\ w\ w^T\ u^T = u\ {D_4}^2\ u^T

That is, the matrix v of the SVD of X is an eigenvector matrix for X^T\ X, and u is an eigenvector matrix for X\ X^T. The diagonal eigenvalue matrices of X^T\ X and X\ X^T are {D_3}^2 and {D_4}^2, respectively. The nonzero entries in D3, D4, and w are the same. In other words, the nonzero eigenvalues of X^T\ X and X\ X^T are the squared nonzero singular values in w.
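In numbers, for the assumed X (eigvalsh returns ascending eigenvalues, so I reverse them):

```python
import numpy as np

X = np.array([[-6., 3., 3.], [2., 1., -3.], [0., -1., 1.], [4., -3., -1.]])
s = np.linalg.svd(X, compute_uv=False)

print(s**2)                                # ~ [84, 12, 0]
print(np.linalg.eigvalsh(X.T @ X)[::-1])   # ~ [84, 12, 0], up to round-off
print(np.linalg.eigvalsh(X @ X.T)[::-1])   # ~ [84, 12, 0, 0]
```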

If the matrix X were of full rank (for Davis' example, that would have been rank 3), then there would have been 3 nonzero eigenvalues (equivalently, 3 nonzero singular values in w). Both X^T\ X and X\ X^T would have been of rank 3, but since X\ X^T is 4×4, its fourth eigenvalue would still have to be zero.

To me, the most important thing is that we know the data is, in fact, two dimensional. How do we know it? Any number of ways:

  1. There are only two nonzero eigenvalues.
  2. There are only two nonzero singular values.
  3. Every one of A^R, S^R, A^Q, S^Q has only two nonzero columns.
  4. Every row of the design matrix X adds up to zero.

I have to say that I find all but (3) completely convincing, although (3) should be convincing. If all someone gave me was (3), I'd feel more comfortable looking at the eigenvalues and counting how many were nonzero, making sure that someone hadn't thrown away an eigenvector with a small but nonzero eigenvalue, or replaced a small but nonzero eigenvalue by zero.
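That caution is easy to automate: count the singular values against a small tolerance rather than against exact zero. A sketch, using the same tolerance recipe that numpy's matrix_rank uses by default:

```python
import numpy as np

X = np.array([[-6., 3., 3.], [2., 1., -3.], [0., -1., 1.], [4., -3., -1.]])
s = np.linalg.svd(X, compute_uv=False)

# count a singular value as nonzero only if it exceeds a tolerance
# scaled by the largest singular value, the matrix size, and
# machine epsilon
tol = max(X.shape) * np.finfo(X.dtype).eps * s[0]
print(int(np.sum(s > tol)))   # 2 -- the data really is two dimensional
```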

OK, we know it's true. Why is it true? Why is the design matrix two dimensional? Simply because every row of the design matrix adds up to zero. Our 3D vectors satisfy a linear restriction; they are linearly dependent.

In particular, the same thing would happen if every row added up to 1, or to 100 (per cent). (It would happen if every row summed to any common value; but 0, 1, and 100 are numbers we might well have required of our data.)

Edit: not only do I have to be more precise, but here be dragons, too. See the post of 16 Aug 2008, "PCA / FA tricky preprocessing". Setting the row sums to zero will reduce the rank. Setting the row sums to 1 or 100 will not; but following that by centering the data will reduce the rank.
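Here's a small sketch of that distinction, with made-up rows summing to 1: the rank stays at 3 until the columns are centered, and then it drops to 2.

```python
import numpy as np

# made-up 4x3 data with every row summing to 1 (e.g. proportions);
# the values are hypothetical, just to illustrate the point
Y = np.array([[0.5, 0.25, 0.25],
              [0.1, 0.6,  0.3 ],
              [0.2, 0.2,  0.6 ],
              [0.7, 0.1,  0.2 ]])
print(np.linalg.matrix_rank(Y))    # 3: row sums of 1 do not reduce the rank

# center each column; every row of Yc then sums to zero,
# and the rank drops
Yc = Y - Y.mean(axis=0)
print(np.linalg.matrix_rank(Yc))   # 2
```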

To recap:

  • Is this [duality] true for the 3rd alternative (the full eigendecompositions)? Yes.
  • Should I work it out? I chose not to, but gave you the answer.
  • Does it affect my recommendation that we use either the SVD or the full eigendecompositions in preference to Davis' cut-down version? No. If anything, I'm rather blasé about which alternative anyone uses; just realize that the duality equations are different.
  • We should take a careful look at the relationship between the eigendecompositions and the SVD. Done, but I wonder if I had something more specific in mind.