## Compressed Sensing: the L1 norm finds sparse solutions

I’ve just finished watching three hours of lectures on something called compressed sensing. I had fully expected to put out a post showing how to compute time from position on an elliptical orbit. But I really want to show you a key piece of mathematics. This was utterly new to me.

I want to show you, via a small example, that minimizing the L1 norm often returns sparse solutions.

Hey, what?

Imagine you have a one-megapixel photograph. More likely, the raw file created by your digital camera is 8 to 15 megapixels… but if you save it as a JPEG, you may compress it by a factor of – what? – 20? (I’m too busy to work it out.)

The key is that there is some basis in which the photograph has a sparse representation… that is, the data can be represented with good accuracy by fewer numbers than we started with. We see something similar when we represent a sound recording by its Fourier coefficients, or a computed tomography scan by its Radon transform.

Let me get to the simple example.
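Before the jump, here is a quick sketch of the claim in Python – my own toy example, not the one from the lectures, and it assumes NumPy and SciPy are available. We solve an underdetermined system A x = b by minimizing the L1 norm of x (posed as a linear program via the standard split x = p − q with p, q ≥ 0), and compare with the minimum-L2-norm solution from the pseudoinverse.

```python
import numpy as np
from scipy.optimize import linprog

# An underdetermined system A x = b: 3 equations, 6 unknowns,
# with a planted sparse solution (one nonzero entry).
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 6))
x_true = np.zeros(6)
x_true[2] = 1.5
b = A @ x_true

# Minimize ||x||_1 subject to A x = b, as a linear program:
# split x = p - q with p, q >= 0, so ||x||_1 = sum(p) + sum(q).
n = A.shape[1]
c = np.ones(2 * n)
res = linprog(c, A_eq=np.hstack([A, -A]), b_eq=b,
              bounds=[(0, None)] * (2 * n))
x_l1 = res.x[:n] - res.x[n:]

# For comparison, the minimum-L2-norm solution (pseudoinverse)
# spreads the "energy" over all six entries.
x_l2 = np.linalg.pinv(A) @ b
print("L1 solution:", np.round(x_l1, 3))
print("L2 solution:", np.round(x_l2, 3))
```

Typically the L1 solution comes back with most entries at or near zero, while the least-squares solution has every entry nonzero – which, in one toy example, is the whole point.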
Read the rest of this entry »

## Happenings – 2011 Mar 26

During the week I came across a link to a sequence of three old movies about the Lagrangian and Eulerian viewpoints in fluid mechanics. Here’s a direct YouTube link to the first:

and, just in case you’re interested, here’s the link I found first, which is on a fluid mechanic’s blog. (Those of us who have seen “Forbidden Planet” know that a man can be “the best quantum mechanic in the system”… I see no reason not to call someone a fluid mechanic.)

And speaking of fluid mechanics…
Read the rest of this entry »

## Draw Poker – Improving the Hand

We need a technical term. I have a source against which I can check my answers, so I want to compute the “odds against” improving a hand. The odds against something are computed as the number of bad outcomes divided by the number of good outcomes.

For example: we saw in the previous poker post that the probability of being dealt at least a pair is the good outcomes (“poker” in the previous post) divided by the total outcomes (“hands”):

1,296,420 / 2,598,960 ≈ 0.499.

OK, the probability of being dealt at least a pair is almost 50%.

The odds against being dealt at least a pair are bad outcomes divided by good outcomes:

1,302,540 / 1,296,420 ≈ 1.005.

That is, the odds against being dealt at least a pair are just barely greater than 1; so close, in fact, that we might as well call it “even odds”.
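Those numbers are easy to check with a few lines of Python, using the standard counts: C(52,5) total hands, and “nothing” hands counted as five distinct ranks that don’t form a straight, in suit patterns that aren’t a flush.

```python
from math import comb

hands = comb(52, 5)                       # 2,598,960 five-card hands

# "Nothing" (high-card) hands: 5 distinct ranks that are not a straight,
# in suit assignments that are not all the same suit.
straights = 10                            # 5-high (A2345) through ace-high
nothing = (comb(13, 5) - straights) * (4 ** 5 - 4)
good = hands - nothing                    # at least a pair (or better)

print(good / hands)                       # probability: a bit under 0.5
print(nothing / good)                     # odds against: a bit over 1
```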

Now let’s suppose you’ve been dealt at least a pair – or that you’ve been dealt four cards toward a straight or flush. (Worthless as they stand, they have great potential.)
Read the rest of this entry »

## Happenings – 2011 Mar 19

Monday 21 March. One edit; as usual, search on “edit”.

Good afternoon.

First, something of a time-sensitive nature.

While I was channel surfing last night – since “Blue Bloods” wasn’t on – I discovered something interesting. Tonight is a full moon… but not just any full moon. The moon will be at its closest approach to Earth: perigee (the generic term is periapse; perilune, periselene, and pericynthion would name the point nearest the Moon, which is not what we mean here).

As the second of the following two links put it, there’s a full moon every month and the moon goes through perigee every month – but they don’t often happen at the same time.

Here are a couple of links… more things will undoubtedly appear on the Internet as this day goes on.

http://articles.cnn.com/2011-03-18/us/nasa.moon_1_low-hanging-moons-astronomers-or-psychologists-celestial-event?_s=PM:US

http://blog.al.com/space-news/2011/03/full_moon_tonight_will_be_the.html

I’ve got another book on order… differential geometry this time, rather than multicollinearity. What the heck? It’s less than \$35 – and my history, especially the recent purchase of a completely unknown “book” that turned out to be a small collection of printed and bound Wikipedia articles – suggests that I will spend \$35 on a book sight unseen. But this wasn’t a random book chosen for its title.
Read the rest of this entry »

## Introduction

Draw poker is played with a standard deck of 52 cards: 13 ranks (ace, 2 through 10, jack, queen, king) in 4 suits. (An ace usually counts as higher than a king, unless it is being used as a “1” in a straight.) Each player is dealt 5 cards.

Before I ever played poker – at home – I was required to learn a mantra: one pair, two pair, three of a kind… straight, flush, full house… four of a kind, straight flush. From least to best, those are the possible winning hands. (In fact, I will distinguish a royal straight flush from other straight flushes.)

What is the probability of being dealt each one of them?

That’s the first question I’ll answer, in several parts.
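As a preview of the sort of counting involved – these are the standard combinatorial formulas, which may or may not be the route the posts take – here are a few of the counts in Python:

```python
from math import comb

hands = comb(52, 5)                             # 2,598,960 total hands

four_kind  = 13 * 48                            # quad rank, then any 5th card
full_house = 13 * comb(4, 3) * 12 * comb(4, 2)  # trip rank & suits, pair rank & suits
flush      = 4 * (comb(13, 5) - 10)             # one suit, minus straight flushes
straight   = 10 * (4 ** 5 - 4)                  # 10 top ranks, minus flushes

for name, n in [("four of a kind", four_kind), ("full house", full_house),
                ("flush", flush), ("straight", straight)]:
    print(f"{name:15s} {n:6d} hands, probability {n / hands:.6f}")
```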

After the cards have been dealt and bet on, each player may discard some and get replacements. In a second post I will ask and answer: what are the odds against improving any given hand?

There is an excellent wiki article out there. In fact, the nice way to calculate the probabilities comes from it (especially for one pair, and for nothing).
Read the rest of this entry »

## Happenings – 2011 Mar 12

It’s been the kind of week where I looked briefly at a lot of things.

First, however, let me share something that I forgot to post last Saturday. It is especially relevant to the post about Freeman Dyson’s mental compartmentalization, physicist versus pure mathematician.

The following comes from “Empirical Model-Building and Response Surfaces” (p. 219), by Box and Draper (ISBN 0-471-81033-9):

> We have sometimes been dismayed to find that the engineer newly introduced to statistical methods may put statistics and engineering in different compartments of his mind and feel that when he is doing statistics, he no longer needs to be an engineer. This is of course not so. All his engineering knowledge, hunches, and cunning concerning such matters as choice of variables and transformation of variables must still be employed in conjunction with statistical design and analysis. Statistical methods used with good engineering know-how make a powerful combination, but poor engineering combined with mechanical and unimaginative use of inadequately understood statistical methods can be disastrous.

## Introduction and Review

In the previous post, we investigated multicollinearity in the Hald data as given. We used the singular value decomposition (SVD) of the design matrix X

X = u w v’.

In particular, we generalized from our experience with linear dependence: if the columns of X are linearly dependent, then the rightmost columns of v span the null space of X… that is, the rightmost columns of X.v are 0. If, instead, we have merely near-linear dependence, i.e. multicollinearity, then the rightmost columns of X.v are small rather than zero.

I will say a little more about X.v shortly.
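Here is a sketch of that behavior in Python, with a made-up design matrix rather than the actual Hald data: a constant column plus four variables whose sum is nearly constant, so the matrix is multicollinear by construction.

```python
import numpy as np

# Made-up design: a column of ones plus four variables x1..x4
# whose sum is nearly constant (about 98), in the spirit of the Hald data.
rng = np.random.default_rng(1)
x123 = rng.uniform(5, 30, size=(13, 3))
x4 = 98.0 - x123.sum(axis=1) + rng.normal(0, 0.1, size=13)
X = np.column_stack([np.ones(13), x123, x4])

u, w, vt = np.linalg.svd(X, full_matrices=False)
v = vt.T

print(w)                                  # the last singular value is tiny
# The column lengths of X.v are exactly the singular values, so a small
# rightmost column of X.v is the same thing as a small last singular value.
print(np.linalg.norm(X @ v, axis=0))
# The last column of v names the near dependence: up to sign and scale,
# it is roughly proportional to (98, -1, -1, -1, -1).
print(v[:, -1])
```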

In contrast to looking at all subsets of the columns of X, we investigated multicollinearity for four specific regressions: the best 1-, 2-, and 3-variable regressions, and for one additional regression which I had found interesting.

We learned that the most significant multicollinearity involved all four independent variables: their sum was very nearly constant, and slightly less than 100. Further reading informed me that these four variables were a subset of the original measurements — which were percentages adding up to 100.

It is noteworthy — even crucial — that the correlation matrix of the data does not reveal this multicollinearity.
Read the rest of this entry »