this is the third post of its kind. as i said in the first happenings post, i had three breakthroughs last weekend: in PCA, controls, and quantum mechanics. the controls one was in post (2), the PCA is coming; this is the quantum mechanics one.

at the beginning of what is now last weekend, i had a breakthrough in quantum mechanics. i found a simple example of one of the things i was trying to calculate. this came from McMahon’s “Quantum Mechanics Demystified”. he is also the author of “Relativity Demystified”. both books have some really excellent examples in them, things not usually found in introductory books. the bad news is, his quantum mechanics (henceforth QM) text is marred by an awful lot of typos. the good news is, it’s cheap.

i can summarize my overall reaction as follows. i just checked for a 2nd edition which might have fixed the typos. there isn’t one. but he does have “quantum field theory demystified” and i’m going to order it, just because he’s the author. i’ll take my chances with typos. i’m expecting to see some informative examples anyway.

ok, i’ve ordered it, and a few other books on my shopping list. oh, and there’s a “string theory demystified” by mcmahon not yet published. for under $15, how can i go wrong?

for more on this kind of stuff, you want volume III of “The Feynman Lectures on Physics”, and possibly Dirac’s “Quantum Mechanics”. for this, you want bras and kets, rather than wave functions and the Schrödinger equation.

i’m not going to try to teach you quantum mechanics, but maybe i’ll whet your appetite. i’m also not going to prove everything in what follows.

consider the following matrix:

Jz = ħ [ 1  0  0 ]
       [ 0  0  0 ]
       [ 0  0 -1 ]

it is diagonal. the diagonal entries must be the eigenvalues (ħ, 0, -ħ), and the identity matrix is an orthogonal eigenvector matrix for Jz. (go ahead, work that out if you need to.)
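this claim is easy to check numerically. a minimal sketch in numpy, assuming the standard spin-1 matrix Jz = ħ·diag(1, 0, -1) and setting ħ = 1 for convenience:

```python
import numpy as np

hbar = 1.0  # work in units where hbar = 1

# Jz for a spin-1 system: diagonal in its own eigenbasis
Jz = hbar * np.diag([1.0, 0.0, -1.0])

# eigh returns eigenvalues in ascending order for a symmetric matrix
vals, vecs = np.linalg.eigh(Jz)
assert np.allclose(vals, [-hbar, 0.0, hbar])          # the diagonal entries
assert np.allclose(Jz @ vecs, vecs @ np.diag(vals))   # eigen-decomposition holds
assert np.allclose(vecs.T @ vecs, np.eye(3))          # eigenvector matrix is orthogonal
```

(eigh orders the eigenvalues ascending, so the identity columns come back permuted, but they are still an orthogonal eigenvector matrix.)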

we interpret Jz as a QM operator which can produce one of 3 measurements of “spin”, namely 0 or ±ħ (which we usually think of as 0 or ±1, in units of ħ). and after a measurement the system is in a “state” described by the eigenvector associated with the eigenvalue we obtained from our measurement.

that is, if i say that we measured Jz and got +ħ, then the system is in the state

u = (1, 0, 0)^T

which is the eigenvector associated to +ħ.

now, we measure Jx, the component of spin along the x-axis. for now, i accept the author’s claim that the operator matrix for Jx is given by

Jx = (ħ/√2) [ 0  1  0 ]
            [ 1  0  1 ]
            [ 0  1  0 ]

wrt the Jz eigenvector basis. i know how to work it out in principle, but i need to go back to Lie algebras. what i want to shout about is the following calculations.

we believe that a measurement of Jx must be one of the eigenvalues of Jx, and that after such a measurement, the system will be in a state described by one of the Jx eigenvectors.

we ask two questions: what are the possible values of Jx, “the spin along the x-axis”? and what is the probability of obtaining each one of those values?

well, we need to get the eigendecomposition of Jx. mathematica returns an eigenvector matrix that is not orthonormal. “make it so.” we get the following orthonormal eigenvector matrix, which is also a transition matrix to the eigenvector basis of Jx:

U = [  1/√2   1/2    1/2  ]
    [  0     -1/√2   1/√2 ]
    [ -1/√2   1/2    1/2  ]

yo, look at the eigenvalues:

(0, -ħ, +ħ)

i have them in non-customary order. the 1st eigenvector is for the 0 state, the 3rd is for the +1 state. it’s very good that the eigenvalues are 0 and ±ħ, the same possibilities as for Jz. to be specific, for example, the first eigenvalue is 0 and the first eigenvector is

(1/√2, 0, -1/√2)^T

so that is the eigenvector (wrt the Jz basis) associated to a measurement of 0 of Jx.
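mathematica’s Eigenvectors does not normalize, but numpy’s eigh does return an orthonormal eigenvector set for a symmetric matrix, so the “make it so” step comes for free there. a sketch (ħ = 1, Jx as given above):

```python
import numpy as np

hbar = 1.0

# Jx for spin-1, written wrt the Jz eigenvector basis
Jx = (hbar / np.sqrt(2.0)) * np.array([[0.0, 1.0, 0.0],
                                       [1.0, 0.0, 1.0],
                                       [0.0, 1.0, 0.0]])

# eigh: ascending eigenvalues, orthonormal eigenvector columns
vals, vecs = np.linalg.eigh(Jx)
assert np.allclose(vals, [-hbar, 0.0, hbar])   # same possibilities as Jz
assert np.allclose(vecs.T @ vecs, np.eye(3))   # already orthonormal
```

note that eigh’s ascending ordering (-ħ, 0, +ħ) differs from the non-customary ordering used above; the eigenvector columns just come back permuted accordingly.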

what we have in hand, after measuring Jz = +ħ, is a vector u

u = (1, 0, 0)^T

in the Jz (original) basis. what are its components v wrt the new basis? from

u = U v, we get v = U^T u

(U is orthogonal, i.e. the inverse is the transpose).

so we compute

[  1/√2   0    -1/√2 ]   [ 1 ]   [ 1/√2 ]
[  1/2   -1/√2  1/2  ] x [ 0 ] = [ 1/2  ]
[  1/2    1/√2  1/2  ]   [ 0 ]   [ 1/2  ]

so our system, in the state (1, 0, 0)^T after a measurement of Jz, is in the state

(1/√2, 1/2, 1/2)^T

with respect to the Jx-eigenvector basis (not wrt the Jz basis). the result of measuring Jx, however, must be one of the eigenvalues of Jx, and the state of the system after the measurement must be one of the eigenvectors of Jx.

(incidentally, you may not recognize my computation using the transition matrix in McMahon’s solution. he got my answer, but in a more round-about way.)
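the transition-matrix computation is a one-liner to check. a sketch in numpy (ħ = 1), with the columns of U ordered (0, -ħ, +ħ) as above:

```python
import numpy as np

s = 1.0 / np.sqrt(2.0)

# orthonormal Jx eigenvectors as columns, ordered (0, -hbar, +hbar)
U = np.array([[ s,    0.5,  0.5],
              [ 0.0, -s,    s  ],
              [-s,    0.5,  0.5]])

# the state after measuring Jz = +hbar, written in the Jz basis
u = np.array([1.0, 0.0, 0.0])

# U is orthogonal, so its inverse is its transpose: v = U^T u
v = U.T @ u
assert np.allclose(U.T @ U, np.eye(3))   # sanity check: U really is orthogonal
assert np.allclose(v, [s, 0.5, 0.5])     # (1/sqrt(2), 1/2, 1/2)
```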

how do we get from a non-eigenvector to an eigenvector? QM tells us that the probability of finding the system in a particular eigenvector state is given by the squared magnitude of the component wrt that eigenvector.

we take the state

(1/√2, 1/2, 1/2)^T

and square each component, getting

(1/2, 1/4, 1/4).

those are the probabilities of finding the system in the associated eigenvector state. given my ordering for the eigenvalues,

(0, -ħ, +ħ),

the probabilities must be read off as 1/2 for 0, and 1/4 for each of -ħ and +ħ.

yes! that’s his answer.

(because all the elements were real, i simply squared the components; in general, we could have complex entries and we would need to compute the squared magnitude of complex numbers.)
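the probability rule itself can be sketched the same way; using np.abs keeps it correct even when the amplitudes are complex:

```python
import numpy as np

s = 1.0 / np.sqrt(2.0)

# the state wrt the Jx eigenbasis, ordered (0, -hbar, +hbar)
v = np.array([s, 0.5, 0.5])

# probability of each eigenvector state = squared magnitude of the component
probs = np.abs(v) ** 2
assert np.allclose(probs, [0.5, 0.25, 0.25])   # 1/2 for 0, 1/4 each for -hbar, +hbar
assert np.isclose(probs.sum(), 1.0)            # probabilities sum to 1
```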
