Well, I’ve managed to do a little reading without computing things… and I’ve managed to do a few simple computations that make sense.

I’ve been reading a very little bit about ridge regression and principal component regression. It appears that either or both may be used when the independent variables are multicollinear. Principal component regression just might show up here relatively soon; it will, however, probably be quite a while before I tackle ridge regression.

(My major concern about principal component regression is to make sure that it really is exactly what it sounds like. I know perfectly well how to do principal components and how to do regression. I just need to know that the terminology really refers to the obvious computation: find the principal components of the data and use some or all of them as the independent variables, instead of the original data.)
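Just to pin down what I mean by "the obvious computation", here is a minimal sketch of it in Python, using only NumPy and hypothetical made-up data; whether the literature's "principal component regression" is exactly this is the very question I want to settle.

```python
import numpy as np

# Hypothetical data: two nearly identical predictors (multicollinear) plus one more.
rng = np.random.default_rng(0)
n = 50
x1 = rng.normal(size=n)
X = np.column_stack([x1, x1 + 0.01 * rng.normal(size=n), rng.normal(size=n)])
y = X @ np.array([1.0, 2.0, 0.5]) + 0.1 * rng.normal(size=n)

# Principal components: eigenvectors of the covariance matrix of centered X.
Xc = X - X.mean(axis=0)
eigvals, eigvecs = np.linalg.eigh(Xc.T @ Xc / (n - 1))
order = np.argsort(eigvals)[::-1]   # largest variance first
V = eigvecs[:, order]

# Keep the first k components and regress y on the scores Z = Xc @ V_k
# instead of on the original (collinear) columns of X.
k = 2
Z = Xc @ V[:, :k]
design = np.column_stack([np.ones(n), Z])
beta, *_ = np.linalg.lstsq(design, y, rcond=None)
yhat = design @ beta
```

The discarded component here is the tiny-variance direction (the difference between the two near-duplicate predictors), which is exactly the direction that makes ordinary least squares unstable.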

I also found a new – to me – definition of “deletion residuals”… and they are used to compute a new – to me – measure of the best fitting regression. I did take a little time to confirm the definitions and to verify the computations on one example. These should both show up in technical posts soon.

The next regression post, however, may not be the “single deletion properties” that I intended… because I think I know how to calculate both the “mean prediction” and the “single prediction” bands which Mathematica® provides. I have verified some calculations for a 1-variable case – that is, the properties which Mathematica prints match a numerical example – but I still need to verify them for a multiple-variable case, and then I need to work out the derivations and confirm the calculations by hand.
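For the record, here are the standard textbook formulas for those two bands; that these are exactly what Mathematica computes is an assumption on my part, still to be verified. For a fit $\hat y = X\hat\beta$ with $n$ observations, $p$ parameters, and $s^2 = \mathrm{RSS}/(n-p)$, the bands at a new point $x_0$ are

```latex
\text{mean prediction:}\quad
  x_0^T\hat\beta \;\pm\; t_{\alpha/2,\,n-p}\; s\,\sqrt{x_0^T (X^T X)^{-1} x_0}

\text{single prediction:}\quad
  x_0^T\hat\beta \;\pm\; t_{\alpha/2,\,n-p}\; s\,\sqrt{1 + x_0^T (X^T X)^{-1} x_0}
```

The single-prediction band is wider because it accounts for the noise in a new observation as well as the uncertainty in the fitted mean.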

That is, of the three major properties which I did not understand last weekend, I think I do now understand two of those properties… and it seems appropriate to explain them right after the post in which I failed to explain them.

I’ve also found what has started out as a very readable book on Bayesian methodology in statistics. I’m not likely to abandon my habitual practice in regression, but maybe I had better post it before this book makes me an apostate.

I won’t go into details, but you might simply look up Bayes’ Theorem. Wikipedia has a nice statement and some nice examples.
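As a teaser, here is the classic medical-test illustration of Bayes’ Theorem, with hypothetical numbers of my own choosing (not taken from the book or from Wikipedia):

```python
# Bayes' theorem: P(A|B) = P(B|A) P(A) / P(B).
# Hypothetical numbers: a rare disease and a fairly accurate test.
p_disease = 0.01             # prior P(D)
p_pos_given_disease = 0.99   # sensitivity, P(+ | D)
p_pos_given_healthy = 0.05   # false-positive rate, P(+ | not D)

# Total probability of a positive test.
p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))

# Posterior probability of disease given a positive test.
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
print(round(p_disease_given_pos, 4))  # 0.1667
```

Even with a 99% sensitive test, a positive result here means only about a 1-in-6 chance of disease, because the disease is rare to begin with. That counter-intuitive flip is what makes the theorem worth looking up.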

Now I’m going to turn my kid loose to see what mathematics looks like pure fun this morning… and then I’ll probably go to work on those regression bands.
