The familiar part

This is the second post about Example 7. We confirm their analysis, but we work with the computed correlation matrix rather than the rounded-off correlation matrix they published.

I am pretty well standardizing my notation. V is an orthogonal eigenvector matrix from an eigendecomposition; $\lambda$ is the associated eigenvalues, possibly as a list, possibly as a square diagonal matrix. $\Lambda$ is the square roots of $\lambda$, possibly as a list, possibly as a square diagonal matrix, and possibly as a matrix with rows of zeroes appended to make it the same shape as w (below).
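That notation can be sketched numerically. This is a minimal illustration in NumPy, with a made-up correlation matrix standing in for the one computed from the data; the variable names mirror the notation above.

```python
import numpy as np

# Illustrative data only: a small random data matrix X,
# observations in rows (10 observations, 3 variables).
rng = np.random.default_rng(0)
X = rng.standard_normal((10, 3))

# Correlation matrix of the columns of X.
R = np.corrcoef(X, rowvar=False)

# Eigendecomposition: V is an orthogonal eigenvector matrix,
# lam the associated eigenvalues (here as a list).
lam, V = np.linalg.eigh(R)

# Lambda: the square roots of lam, as a list or a square diagonal matrix.
Lam = np.sqrt(lam)
Lam_mat = np.diag(Lam)

# Check the decomposition: R = V diag(lam) V^T, and V is orthogonal.
assert np.allclose(V @ np.diag(lam) @ V.T, R)
assert np.allclose(V.T @ V, np.eye(3))
```

Note that `np.linalg.eigh` returns the eigenvalues in ascending order; they can be reordered (largest first) if that matches the convention in the worked example.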

Ah, X is a data matrix with observations in rows. Its transpose is $Z = X^T$.

The (full) singular value decomposition (SVD) is the product $u\, w\, v^T$, with u and v orthogonal and w generally rectangular, since w must be the same shape as X.
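The shapes in the full SVD can be checked directly. A small sketch, again with an illustrative random X rather than the example's data: NumPy's `svd` returns the singular values as a list, so the rectangular w has to be built by embedding them in a matrix the same shape as X.

```python
import numpy as np

# Illustrative data matrix X, observations in rows.
rng = np.random.default_rng(0)
X = rng.standard_normal((10, 3))

# Full SVD: full_matrices=True makes u 10x10 and v 3x3 (both orthogonal).
u, s, vT = np.linalg.svd(X, full_matrices=True)

# Build the rectangular w: singular values on the diagonal,
# rows of zeroes appended so w has the same shape as X.
w = np.zeros_like(X)
w[:len(s), :len(s)] = np.diag(s)

# Check: u w v^T reconstructs X, and u, v are orthogonal.
assert np.allclose(u @ w @ vT, X)
assert np.allclose(u.T @ u, np.eye(10))
assert np.allclose(vT @ vT.T, np.eye(3))
```

The singular values in `s` are the square roots of the eigenvalues of $X^T X$, which is the connection between w here and $\Lambda$ above.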