## The familiar part

This is the second post about Example 7. We confirm their analysis, but we work with the computed correlation matrix rather than their published rounded-off correlation matrix.

I am pretty well standardizing my notation. V is an orthogonal eigenvector matrix from an eigendecomposition; λ is the associated eigenvalues, possibly as a list, possibly as a square diagonal matrix; √λ is the square roots of λ, possibly as a list, possibly as a square diagonal matrix, and possibly as a matrix with rows of zeroes appended to make it the same shape as w (below).
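To make that notation concrete, here is a minimal NumPy sketch of the eigendecomposition pieces V, λ, and √λ. The small symmetric matrix A is purely hypothetical, standing in for a correlation matrix:

```python
import numpy as np

# Hypothetical 2x2 symmetric matrix, standing in for a correlation matrix.
A = np.array([[1.0, 0.5],
              [0.5, 1.0]])

# Eigendecomposition of a symmetric matrix: the columns of V are
# orthonormal eigenvectors; lam is the eigenvalues "as a list" (1-D array).
lam, V = np.linalg.eigh(A)

# The eigenvalues as a square diagonal matrix.
Lam = np.diag(lam)

# The square roots of the eigenvalues, as a list and as a diagonal matrix.
sqrt_lam = np.sqrt(lam)
sqrt_Lam = np.diag(sqrt_lam)

# Sanity checks: V is orthogonal, and V Lam V^T reconstructs A.
assert np.allclose(V @ V.T, np.eye(2))
assert np.allclose(V @ Lam @ V.T, A)
```

Note that `np.linalg.eigh` is the right call for a symmetric (or Hermitian) matrix; it guarantees real eigenvalues and an orthogonal V, which the general `np.linalg.eig` does not.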

Ah, X is a data matrix with observations in rows. Its transpose is Xᵀ.

The (full) singular value decomposition (SVD) is the product X = u w vᵀ, with u and v orthogonal; w is generally rectangular, as it must be the same shape as X.
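A quick sketch of the full SVD in NumPy, again on a made-up data matrix X. Note that `np.linalg.svd` returns the singular values as a list, so we must build the rectangular w ourselves, padding with rows of zeroes so it matches the shape of X:

```python
import numpy as np

# Hypothetical 4x2 data matrix X, observations in rows.
X = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0],
              [7.0, 8.0]])

# Full SVD: X = u @ w @ v^T, with u (4x4) and v (2x2) orthogonal.
u, s, vT = np.linalg.svd(X, full_matrices=True)

# Build the rectangular w, the same shape as X: the singular values on
# the diagonal of the top block, rows of zeroes appended below.
w = np.zeros_like(X)
w[:len(s), :len(s)] = np.diag(s)

# Sanity checks: u and v orthogonal, and u w v^T reconstructs X.
assert np.allclose(u @ u.T, np.eye(4))
assert np.allclose(vT @ vT.T, np.eye(2))
assert np.allclose(u @ w @ vT, X)
```

With `full_matrices=False` NumPy would instead return the thin SVD, where u is the same shape as X and w is square; the full form above is the one matching the shape convention in this post.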
