Gamow, Bread Rationing, and the Normal Distribution

"angelic" baker & suspicious professorYou might be familiar with George Gamow, the mid-20th-century physicist who, with his student Ralph Alpher, came up with the Big Bang Theory long before there was any experimental evidence. Gamow is also the best writer for the layperson on science, and one of the best on math, I’ve ever read. In particular, his book One, Two, Three… Infinity is a masterpiece, jammed with fascinating ideas presented with absolute clarity. (Though much of it — mostly the non-math stuff — is kind of out-of-date now; the revised edition came out in 1961.)

A few years back, my friend Doug Hofstadter sent me a short article of Gamow’s about how the normal distribution was once used to expose a dishonest baker — apparently a true story. But true or not, it’s a fascinating story of mathematical probability in real life, one that I think would interest even a lot of apathetic middle-school and high-school students! A PDF of the story is available at
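
I won’t spoil Gamow’s telling, but the general idea, catching a cheat from a whole record of weighings rather than from any single loaf, is easy to simulate. Here’s a quick Python sketch of my own; the advertised weight, the spread, and the size of the cheat are all numbers I invented for illustration.

```python
import random
import statistics

ADVERTISED = 1000  # grams; every number in this sketch is invented for illustration

def loaf(mean):
    """One day's loaf: honest baking makes weights vary normally around the mean."""
    return random.gauss(mean, 50)

for label, mean in [("honest baker", ADVERTISED), ("cheating baker", ADVERTISED - 60)]:
    # A year of buying one loaf a day and writing down its weight.
    weights = [loaf(mean) for _ in range(365)]
    print(f"{label}: average {statistics.mean(weights):.0f} g, "
          f"{sum(w < ADVERTISED for w in weights)} of 365 loaves underweight")

# An honest baker's loaves form a bell curve centered at 1000 g, so roughly
# half fall below it by pure chance. A cheat's whole curve sits lower, and a
# year of records makes that gap impossible to explain away.
```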

Terms, Notations, and (Mostly Needless) Confusions

I wanted to call this “Logician General’s Warning: Confusion about Terminology is Hazardous to Your Understanding”, but it takes too much space…


Needless Confusion Over Terminology and Notation

A friend of mine who has a degree in statistics commented a few years ago that he couldn’t understand why people were confused about the terms random variable, probabilistic variable, and stochastic variable; after all, they all mean the same thing. I instantly realized that I myself had been confused because I didn’t know that. Or, quite likely, I once knew but had totally forgotten! I’ve seen confusion — usually needless confusion — over terminology cause serious problems many times, both inside and outside the classroom.

And while I’m talking about probabilistic things, how about Bernoulli “processes”, Markov “chains”, and Hidden Markov “models”? In my experience, those are the usual terms for the three phenomena; but they’re all “processes”!
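
They are, in fact, three flavors of the same mathematical object: a sequence of random states unfolding step by step. Here’s a minimal Python sketch of all three; the weather states, the umbrella emissions, and every probability in it are made up purely for illustration.

```python
import random

def bernoulli_process(p, n):
    """A Bernoulli "process": n independent trials, each a 1 with probability p."""
    return [1 if random.random() < p else 0 for _ in range(n)]

def markov_chain(transitions, start, n):
    """A Markov "chain": each state depends only on the one before it.
    transitions maps a state to a list of (next_state, probability) pairs."""
    state, path = start, [start]
    for _ in range(n - 1):
        states, probs = zip(*transitions[state])
        state = random.choices(states, weights=probs)[0]
        path.append(state)
    return path

def hidden_markov_model(transitions, emissions, start, n):
    """A hidden Markov "model": a Markov chain we never see directly,
    plus a visible symbol emitted from each hidden state."""
    hidden = markov_chain(transitions, start, n)
    observed = []
    for state in hidden:
        symbols, probs = zip(*emissions[state])
        observed.append(random.choices(symbols, weights=probs)[0])
    return hidden, observed

print(bernoulli_process(0.5, 10))          # e.g. [1, 0, 0, 1, 1, ...]
weather = {"sunny": [("sunny", 0.8), ("rainy", 0.2)],
           "rainy": [("sunny", 0.4), ("rainy", 0.6)]}
print(markov_chain(weather, "sunny", 10))
gear = {"sunny": [("no umbrella", 0.9), ("umbrella", 0.1)],
        "rainy": [("no umbrella", 0.2), ("umbrella", 0.8)]}
hidden, observed = hidden_markov_model(weather, gear, "sunny", 10)
print(observed)  # all we'd get to see; the weather itself stays hidden
```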

The same thing happens with notation. I was guest-teaching a lesson on Zeno’s paradox of Achilles and the Tortoise to a high-school math “exploration” class (see my post about it). As an example of a convergent infinite series, I wrote on the board

1/2 + 1/4 + 1/8 + … + 1/(2^n) + … = 1

A lot of students had trouble with the 1/(2^n) part until their regular teacher pointed out that it means the same thing as (1/2)^n — a more familiar notation to them. And I probably would have used the latter form if it had even occurred to me it might make a difference 😦.
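
Incidentally, the two forms really are interchangeable, as a few lines of Python confirm; this is just my own sanity check, not something from the lesson.

```python
# The two notations name the same number: 1/(2^n) equals (1/2)^n.
for n in range(1, 6):
    assert 1 / 2**n == (1 / 2)**n

# And the partial sums of 1/2 + 1/4 + 1/8 + ... creep toward 1, Zeno-style.
total = 0.0
for n in range(1, 21):
    total += 1 / 2**n
print(total)  # 0.9999990463256836, exactly 1/(2^20) short of 1
```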

Hard-to-Avoid Confusion Over Terminology and Notation

How many students confuse quadratic expressions, quadratic equations, and quadratic functions? Many of my own students certainly did, but at least the terms are as consistent as possible. I’d say the situation with the two common notations for derivatives — dy/dx and y’ — is somewhere between “Needless” and “Hard-to-Avoid”. There’s some justification for both notations, but I wonder if it’s worth it.

It’s vitally important that students understand and remember the terms and notation we throw at them. If they’re mechanically following rules but they confuse widgets and wodgets, they’re dead; even if they’re really going for understanding, confusion about terms and notation can waste a lot of their time, and ours.