I finished playing a very long game of Ascendancy in the middle of the week. My alter ego the kid is now reading Körner’s “The Pleasures of Counting”. My alter ego the undergraduate managed to put in a couple of hours on circuit theory (trying to understand the use of complex impedance)… he even started making headway in an old Dover book, Kron’s “Tensors for Circuits”… the last time I looked at it, I was lost.

I have some mathematics done for a post for this Monday… but I keep thinking about other things I might add to it. If I can’t stabilize the content, I may find it difficult to put the post out on time. Duh.

So who’s in charge? The managing editor wants to publish… but the mathematician isn’t ready to call it quits. We’ll see what happens.

While looking for an online reference to a particular formula – I’m horrified that I couldn’t find it in my own library! – I found the following site. It appears to deal primarily with undergraduate numerical mathematics. (And it did have the formula I was seeking.)

There are two points I need to emphasize. One, it has an index to a collection of YouTube videos – mathematics, of course.

Two, I have only looked at five of the videos… and I have found a mistake – two of them, actually – in one “slide”. Unfortunately, I did not find a blog post corresponding to the video… and I didn’t see any way to attach comments to the video. Ah, I did just send an email.

Here’s a freeze-frame:

The full video is here.

The lecturer asserts that a steeper slope implies a higher R^2 because the vertical distance between the data and the fitted line will be larger. Yes, but the vertical distance between the data and its mean value is larger by the same factor, so the ratio is unchanged. The R^2 and the adjusted R^2 will not change. What will change? The estimated variance.

He also said that if the x values were more spread out, then the R^2 would be higher. That’s interesting, because if the x values are more spread out, then the computed slope would be lower… and according to the first point, the R^2 would be lower. In fact, the R^2 will be the same.

It is conceivable that I have completely misunderstood what he means, so let me be explicit. Take a 2-variable data set, x and y, and fit a regression. Now multiply y by 10 and fit another regression: you will get the same R^2 and adjusted R^2; the estimated variance will be 100 times larger.

Now multiply x by 10 (still using 10 times y) and fit a third regression: you will get the same R^2 and adjusted R^2, and the same estimated variance as in the second case.
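Here is a quick numerical check of all three fits – a sketch in Python with NumPy, on made-up data (the data set, seed, and noise level are my own, purely for illustration):

```python
import numpy as np

def fit_stats(x, y):
    """OLS fit of y = b0 + b1*x; return R^2, adjusted R^2, and the
    estimated variance s^2 = SSE/(n - 2)."""
    n = len(y)
    X = np.column_stack([np.ones(n), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sse = resid @ resid                       # residual sum of squares
    sst = np.sum((y - y.mean()) ** 2)         # total sum of squares
    r2 = 1 - sse / sst
    adj_r2 = 1 - (1 - r2) * (n - 1) / (n - 2) # 2 fitted parameters
    s2 = sse / (n - 2)                        # estimated variance
    return r2, adj_r2, s2

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 20)
y = 3 + 2 * x + rng.normal(0, 0.1, 20)

r2a, ar2a, s2a = fit_stats(x, y)              # original fit
r2b, ar2b, s2b = fit_stats(x, 10 * y)         # vertical scaling: y -> 10y
r2c, ar2c, s2c = fit_stats(10 * x, 10 * y)    # horizontal scaling too

print(np.isclose(r2a, r2b), np.isclose(s2b, 100 * s2a))  # True True
print(np.isclose(r2b, r2c), np.isclose(s2b, s2c))        # True True
```

Scaling y by 10 scales both the residual sum of squares and the total sum of squares by 100, so their ratio – and hence R^2 – survives; only the estimated variance picks up the factor of 100. Scaling x merely rescales the slope, leaving the fitted values untouched.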

So I’m saying that the R^2 and adjusted R^2 are not affected by vertical or horizontal scaling of the data. (We’ve seen that the R^2 and adjusted R^2 are the same for the Hald data and the standardized Hald data – and standardizing data is a change of scale. And I standardized everything, including the dependent variable, so I also had a change of vertical scale.)

(The numerical stability might be affected by changes of scale! We’ve just seen that taking powers of x = 1.986, 1.987, …, 1.993 leads to horrendously inaccurate inversions of X’X, while centering the x values (i.e. rescaling to -7/2, -5/2, …, 7/2) eliminates the numerical inaccuracies.)
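That instability is easy to exhibit with condition numbers – again a sketch in Python with NumPy; the cubic degree is my assumption for illustration, since any powers of the raw x values show the same effect:

```python
import numpy as np

x  = 1.986 + 0.001 * np.arange(8)   # 1.986, 1.987, ..., 1.993
xc = (x - x.mean()) / 0.001         # centered and rescaled: -7/2, -5/2, ..., 7/2

# Design matrices with powers 0..3 of x (a cubic fit)
X  = np.vander(x,  4, increasing=True)
Xc = np.vander(xc, 4, increasing=True)

# The condition number measures how much inverting X'X amplifies error:
print(np.linalg.cond(X.T @ X))      # astronomically large: inversion is hopeless
print(np.linalg.cond(Xc.T @ Xc))    # modest: inversion is routine
```

The raw x values differ only in the third decimal place, so the columns 1, x, x^2, x^3 are nearly parallel and X’X is nearly singular; centering spreads the columns apart without changing the fit.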

In other words, just as I tell you to be careful reading my posts, I tell you to be careful reading his, or watching his videos. But you might well find something of interest among them….

I started this before I turned my kid and undergraduate loose, so let me get about their play and work… before I turn to Monday’s post.