An image burbled up in my social-media feed the other day, purporting to be a list of “17 Equations that Changed the World.” It’s actually been circulating for a while (since early 2014) and claims to summarize Ian Stewart’s book of that name. The list is typo-ridden, historically inaccurate and generally indicative of a lousy knowledge-distribution process that lets us down at every stage, from background research to fact-checking to copy-editing.

Continue reading 17 Equations that Clogged My Social-Media Timeline

# Category Archives: Classical Mechanics

# Delayed Gratification

A post today by PZ Myers nicely expresses something which has been frustrating me about people who, in arguing over what can be a legitimate subject of “scientific” study, play the “untestable claim” card.

Their ideal is the experiment that, in one session, shoots down a claim cleanly and neatly. So let’s bring in dowsers who claim to be able to detect water flowing underground, set up control pipes and water-filled pipes, put them through their paces, and see if they meet reasonable statistical criteria. That’s science, it works, it effectively addresses an individual’s very specific claim, and I’m not saying that’s wrong; that’s a perfectly legitimate scientific experiment.

I’m saying that’s not the whole operating paradigm of all of science.

Plenty of scientific ideas are not immediately testable, or directly testable, or testable in isolation. For example: the planets in our solar system aren’t moving the way Newton’s laws say they should. Are Newton’s laws of gravity wrong, or are there other gravitational influences which satisfy the Newtonian equations but which we don’t know about? Once, it turned out to be the latter (the discovery of Neptune), and once, it turned out to be the former (the precession of Mercury’s orbit, which required Einstein’s general relativity to explain).

There are different mathematical formulations of the same subject which give the same predictions for the outcomes of experiments, but which suggest different *new* ideas for directions to explore. (E.g., Newtonian, Lagrangian and Hamiltonian mechanics; or density matrices and SIC-POVMs.) There are ideas which are proposed for good reason but hang around for *decades* awaiting a direct experimental test—perhaps one which could barely have been imagined when the idea first came up. Take *directed percolation*: a simple conceptual model for fluid flow through a randomized porous medium. It was first proposed in 1957. The mathematics necessary to treat it cleverly was invented (or, rather, adapted from a different area of physics) in the 1970s…and then forgotten…and then rediscovered by somebody else…connections with other subjects were made… Experiments were carried out on systems which *almost* behaved like the idealization, but always turned out to differ in some way… until 2007, when the behaviour was finally caught in the wild. And the experiment which finally observed a directed-percolation-class phase transition with quantitative exactness used a liquid crystal substance which wasn’t synthesized until 1969.
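Directed percolation is also easy to play with on a computer, even if pinning it down in the laboratory took half a century. Here is a minimal sketch (my own toy version of the standard bond-DP lattice model, not anything from the original papers; the critical probability quoted is the known 1+1-dimensional result):

```python
import random

def directed_percolation(p, width=200, steps=200, seed=1):
    """1+1D bond directed percolation, started from a fully active row.

    Each active site independently tries to activate each of its two
    downstream neighbours with probability p.  Returns the density of
    active sites after each time step, stopping early if activity dies out.
    """
    rng = random.Random(seed)
    active = [True] * width
    history = []
    for _ in range(steps):
        nxt = [False] * width
        for i, is_active in enumerate(active):
            if is_active:
                for j in (i, (i + 1) % width):  # two downstream neighbours
                    if rng.random() < p:
                        nxt[j] = True
        active = nxt
        history.append(sum(active) / width)
        if not any(active):
            break
    return history

# Below the critical probability (p_c ≈ 0.6447 for this lattice), activity
# always dies out; above it, a finite density of active sites survives.
subcritical = directed_percolation(p=0.45)
supercritical = directed_percolation(p=0.80)
```

The sharp transition between the two regimes, and the universal way quantities behave near it, is what the 2007 liquid-crystal experiment finally caught in the wild.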

You don’t need to go dashing off to quantum gravity to find examples of ideas which are hard to test in the laboratory, or where mathematics long preceded experiment. (And if you do, don’t forget the other applications being developed for the mathematics invented in that search.) Just think very hard about the water dripping through coffee grounds to make your breakfast.

# “More Decimal Digits”

On occasion, somebody voices the idea that in year $N$, physicists thought they had everything basically figured out, and that all they had to do was compute more decimal digits. I won’t pretend to know whether this is *actually* true for any values of $N$ — when did one old man’s grumpiness become the definitive statement about a scientific age? — but it’s interesting that not every physicist with an interest in history has supported the claim.

One classic illustration of how the old guys with the beards knew their understanding of physics was incomplete involves the *specific heats of gases.* How much does a gas warm up when a given amount of energy is poured into it? The physics of the 1890s was unable to resolve this problem. The solution, achieved in the next century, required quantum mechanics, but the problem was far from unknown in the years before 1900. Quoting Richard Feynman’s *Lectures on Physics* (1964), volume 1, chapter 40, with hyperlinks added by me:

Continue reading “More Decimal Digits”

# Textbook Cardboard and Physicist’s History

By the way, what I have just outlined is what I call a “physicist’s history of physics,” which is never correct. What I am telling you is a sort of conventionalized myth-story that the physicists tell to their students, and those students tell to their students, and is not necessarily related to the actual historical development, which I do not really know!

Back when Brian Switek was a college student, he took on the unenviable task of pointing out when his professors were indulging in “scientist’s history of science”: attributing discoveries to the wrong person, oversimplifying the development of an idea, retelling anecdotes which are more amusing than true, and generally chewing on the textbook cardboard. The typical response? “That’s interesting, but I’m still right.”

Now, he’s a palaeontology person, and I’m a physics boffin, so you’d think I could get away with pretending that we don’t have that problem in *this* Department, but I started this note by quoting Feynman’s *QED: The Strange Theory of Light and Matter* (1986), so that’s not really a pretence worth keeping up. When it comes to formal education, I only have systematic experience with one field; oh, I took classes in pure mathematics and neuroscience and environmental politics and literature and film studies, but I won’t presume to speak in depth about how those subjects are taught.

So, with all those caveats stated, I can at least sketch what I suspect to be a contributing factor (which other sciences might encounter to a lesser extent or in a different way).

Suppose I want to teach a classful of college sophomores the fundamentals of quantum mechanics. There’s a standard “physicist’s history” which goes along with this, which touches on a familiar litany of famous names: Max Planck, Albert Einstein, Niels Bohr, Louis de Broglie, Werner Heisenberg, Erwin Schrödinger. We *like* to go back to the early days and follow the development forward, because the science was *simpler* when it got started, right?

The problem is that all of these men were highly trained, professional physicists who were thoroughly conversant with the knowledge of their time — well, naturally! But this means that any one of them knew more *classical* physics than a modern college sophomore. They would have known Hamiltonian and Lagrangian mechanics, for example, in addition to techniques of statistical physics (calculating entropy and such). Unless you know what they knew, you can’t really follow their thought processes, and we don’t teach big chunks of what they knew until after we’ve tried to teach what they figured out! For example, if you don’t know thermodynamics and statistical mechanics pretty well, you won’t be able to follow why Max Planck proposed the blackbody radiation law he did, which was a key step in the development of quantum theory.
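To make that last example concrete (my own summary, in modern notation): the law Planck proposed,

[tex] u(\nu, T) = \frac{8\pi h \nu^3}{c^3} \, \frac{1}{e^{h\nu / k_B T} - 1}, [/tex]

only makes sense as the resolution of a problem if you already know the classical equipartition answer it has to reduce to at low frequencies,

[tex] u_{\mathrm{RJ}}(\nu, T) = \frac{8\pi \nu^2}{c^3} \, k_B T, [/tex]

and knowing where *that* formula comes from requires precisely the statistical mechanics the sophomore hasn’t seen yet.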

Consequently, any “historical” treatment at the introductory level will probably end up “conventionalized.” One has to step extremely carefully! Strip the history down to the point that students just starting to learn the science can follow it, and you might not be portraying the way the people actually did their work. That’s not so bad, as far as learning the facts and formulæ is concerned, but you open yourself up to all sorts of troubles when you get to talking about the *process* of science. Are we doing physics differently than folks did *N* or 2*N* years ago? If we are, or if we aren’t, is that a problem? Well, we sure aren’t doing it like they did in chapter 1 of this textbook here. . . .

# Freak Waves

No, they’re not just sonic effects from early Frank Zappa recordings. As Dr. SkySkull explains, *rogue* or *freak waves* are remarkable — and sometimes dangerous — phenomena in optics and on the high seas.

# Because the World Needs Nightmares

You know what the Scientifick Blogohedron needs more of? Well, besides introductions to basic subjects, so that we can be more than chatterbots reacting to whatever news story incenses us the most?

Gosh, you people are demanding.

No, I’m talking about *nightmare fuel!*

And as only children’s television can deliver. You remember *Square One TV*, right? It came on PBS in the afternoons, after *Reading Rainbow* and before *Where in the World is Carmen Sandiego?*. Like every other aspect of my generation’s formative years, it can be relived via the video tubes. Our lives have already been uploaded: the Singularity came and went, and we were all too busy arguing to notice.

Looking back, Reimy the Estimator Girl was fairly cute, and the “Angle Dance” is somewhat frightening in that in-1983-this-was-the-future way, but one bit of sheer irrational terror stands out. I refer, of course, to the mask which Reg E. Cathey wears in the title role of “Archimedes”:

**LYRICS WITH LINKY GOODNESS:**

Archimedes!

Archimedes!

A mathematician and scientist

Born in 287 BC

He lived in the city of Syracuse

On the island of Sicily

He said he could move the world

If he only had a place to stand

A fulcrum and a lever long

And the strength of an average man

He solved the problems of his days

Using math in amazing ways

His great work lives on today

Archimedes!

Archimedes!

Continue reading Because the World Needs Nightmares

# The Laplace-Runge-Lenz Vector

Greetings from The Amaz!ng Meeting!

This is the happiest I have yet been, here in the city I am learning to hate, the city with nothing to offer me and nothing to enjoy — except Joshua, Rebecca, PZ, Phil and nine hundred other friends. Yes, Gentle Reader, you know me well enough to guess that my most truly recreational experience here in Las Vegas has been sleeping late, eating a Toblerone in bed, tidying up a blag post from my drafts pile, and missing Michael Shermer’s talk. Joshua just “texted” me, as the kids say, with the following message: “Ok. Shermer needs his Powerpoint privileges revoked. How many text-dump slides is he going to use?”

Call me psychic, Gentle Reader, or at least give me a little credit for remembering what I read. . . .

Anyway, because I’m here to have *fun,* it’s time to talk physics. With equations.

When we studied the hydrogen atom, we found that an interaction potential which fell off inversely with distance was shape-invariant, implying all sorts of nice symmetry properties of the hydrogen atom’s state space. The classical analogue of this situation would be two objects interacting via an *inverse-square* force (remember that force is given by the derivative of the potential). Having recently taken a rather madcap tour of the history of classical mechanics, we can probe a little more deeply and investigate one item in more technical detail. Today’s subject will be defining and appreciating the Laplace-Runge-Lenz vector, which as we said earlier was not discovered by Laplace, Runge or Lenz. After finding out that this vector quantity is conserved, we’ll take a quick look at equations which define ellipses and then show that an inverse-square law of gravity can yield elliptical orbits. If any portion of this post is in error, please return the unused portion for a full refund.
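For those keeping score at home, the advertised punchline in symbols (my notation, which may differ from what follows below the fold): for the attractive potential [tex] V(r) = -k/r [/tex], the vector

[tex] \vec{A} = \vec{p} \times \vec{L} - mk\hat{r} [/tex]

is a constant of the motion. It points along the major axis of the orbit, its magnitude is [tex] mke [/tex] where [tex] e [/tex] is the eccentricity, and the resulting orbit equation,

[tex] r(\theta) = \frac{L^2/mk}{1 + e\cos\theta}, [/tex]

describes an ellipse whenever [tex] e < 1 [/tex].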

Continue reading The Laplace-Runge-Lenz Vector

# The Necessity of Mathematics

Today, everything from international finance to teenage sexuality flows on a global computer network which depends upon semiconductor technology which, in turn, could not have been developed without knowledge of the quantum principles of solid-state physics. Today, we are damaging our environment in ways which require all our fortitude and ingenuity just to comprehend, let alone resolve. More and more people are becoming convinced that our civilization requires wisdom in order to survive, the sort of wisdom which can only come from scientific literacy; thus, an increasing number of observers are trying to figure out why science has been taught so poorly and how to fix that state of affairs. Charles Simonyi draws a distinction between those who merely “popularize” a science and those who promote the public understanding of it. We might more generously speak of bad popularizers and good ones, but the distinction between superficiality and depth is a real one, and we would do well to consider what criteria separate the two.

Opinions on how to communicate science are as diverse as the communicators. In this Network age, anyone with a Web browser and a little free time can join the conversation and become part of the problem — or part of the solution, if you take an optimistic view of these newfangled media. Certain themes recur, and tend to drive people into one or another loose camp of like-minded fellows: what do you do when scientific discoveries clash with someone’s religious beliefs? Why do news stories sensationalize or distort scientific findings, and what can we do about it? What can we do when the truth, as best we can discern it, is simply not politic?

Rather than trying to find a new and juicy angle on these oft-repeated questions, this essay will attempt to explore another direction, one which I believe has received insufficient attention. We might grandiosely call this a foray into the philosophy of science popularization. The topic I wish to explore is the role mathematics plays in understanding and doing science, and how we disable ourselves if our “explanations” of science do not include mathematics. The fact that too many people don’t know statistics has already been mourned, but the problem runs deeper than that. To make my point clear, I’d like to focus on a specific example, one drawn from classical physics. Once we’ve explored the idea in question, extensions to other fields of inquiry will be easier to make. To make life as easy as possible, we’re going to step back a few centuries and look at a development which occurred when the modern approach to natural science was in its infancy.

Our thesis will be the following: that if one does not understand or refuses to deal with mathematics, one has fatally impaired one’s ability to follow the physics, because not only are the *ideas* of the physics expressed in mathematical form, but also the *relationships* among those ideas are established with mathematical reasoning.

This is a strong assertion, and a rather pessimistic one, so we turn to a concrete example to investigate what it means. Our example comes from the study of planetary motion and begins with Kepler’s Three Laws.

**KEPLER’S THREE LAWS**

Johannes Kepler (1571–1630) discovered three rules which described the motions of the planets. He distilled them from the years’ worth of data collected by his contemporary, the Danish astronomer Tycho Brahe (1546–1601). The story of their professional relationship is one of clashing personalities, set against a backdrop of aristocracy, ruin and war. From that drama, we boil away the biography and extract some items of geometry:

Continue reading The Necessity of Mathematics

# Physics versus Fear

In the Channel 4 programme *Breaking the Science Barrier* (1996), Richard Dawkins faces down a pendulum:

This is a classic example of a “put your money (or nose) where your mouth is” physics demonstration. It also appears in Carl Sagan’s novel *Contact* (1985), for example, and Feynman did it during the freshman physics lectures he gave at Caltech.
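The reason the demonstrator’s nose is safe is one line of freshman physics (my gloss, with symbols not in the original): release the bob from rest at nose height [tex] h_0 [/tex], and conservation of energy,

[tex] \tfrac{1}{2}mv^2 + mgh = E = mgh_0, [/tex]

combined with the fact that kinetic energy can’t be negative, means the bob can never swing back to a height above [tex] h_0 [/tex]. Air resistance and friction at the pivot only lower that bound. The demonstration fails only if you lean forward or give the bob a push.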

Later in the same show, Dawkins interviews Douglas Adams, and gives a few comparisons to help understand the depths of evolutionary time:

Continue reading Physics versus Fear

# Nutating on National TV

I know a guy who knows Conan O’Brien. I mean, this is pretty freaky: how often does a piece of lab equipment from the place where you did your undergrad thesis get mentioned on TV?

Here is Conan, setting up the problem:

# Physics from Open Yale Courses

Yale has started putting course material online in a systematic way, following in the grand tradition of MIT’s OpenCourseWare. Among the handful they’ve uploaded so far, the two which catch my eye most strongly are Fundamentals of Physics and Frontiers and Controversies in Astrophysics. These classes come with QuickTime video of the lectures, and all material is licensed under Creative Commons BY-NC-SA.

Hat tip: Peter Suber.

# Pop Science in *Die Zeit* and Spike TV

For today, I’m going to let I Postdoc, therefore I am voice my complaints for me:

What bothers me, is that both journalists and the public seem to be so much more interested in, ahem, improbable science than in the usual garden variety.

Via Doug Natelson, who adds,

It is a shame that sometimes the media can’t tell the difference between good science or engineering and crackpottery. On a plane last week I had someone (who realized I was a physicist from my reading material) ask me about the guy who can get hydrogen from seawater by hitting it with microwaves. Kind of cool, yes. Source of energy? Of course not — it takes more microwave power to break the water into hydrogen and oxygen than you can get back by burning the resulting hydrogen. It’s called thermodynamics.

Spoilsport!

There go my plans for funding my all-consuming chocolate habit by selling the secret of seawater power to Big Oil so that the industry could suppress it. . . .

At this point, I should also mention the depressing and macabre quote of the day, which comes from Prof. Clifford Johnson, string theorist at USC. He describes what happened to the guest appearance he made on Spike TV’s show *MANswers.* While the whole post is quite worth reading, I’ll quote just a tiny bit:

Continue reading Pop Science in *Die Zeit* and Spike TV

# James Burke, Weightless

I grew up on James Burke’s books and television series, a taste I inherited from my father. All the way through high school, I had a reputation as a know-it-all, a walking encyclopaedia (who was, at least, a helpful guy). I kept telling people that if they’d watched *Connections* — and read *The Cartoon History of the Universe* — my know-it-all-itude would be a great deal less impressive.

I saw James Burke live, once, at an aerospace conference back in 2000. I had to run off and do something else right after his talk, so I didn’t get a chance to have a conversation or even get a book autographed (one more reason I’m glad I was able to say thanks to James Gleick). I do, however, remember a story he told about his days in the BBC, covering the Apollo program.

He was responsible for explaining the scientific motivations for going to the Moon, the motivations which stay valid after you beat the Russians there. As he reports in the *Connections* book, the British media put more emphasis on this than the Americans did, so British interest remained relatively high during the later missions, when NASA’s TV ratings in the United States were dropping. At the time, the significance of all this was probably less apparent, and Burke was busy enough just trying to translate NASA tech-speak into something people could understand.

Part of this job involved reading through NASA’s manuals for the equipment which would be used on the Moon. These machines were not simple devices, nor were their instructions straightforward. Each manual had to have a full description of possible failure modes, with contingency plans for each eventuality, and all written with such detail that both the astronauts and the Mission Control people could handle all possible malfunctions without going back to the original engineers. The result, Burke said, was something like this: “If X-Y-Zed, then gobbledygook. If X-Y-Zed-Beta, then gobbledygook squared.”

Finally, after a whole page of bullet-pointed technical jargon, came the last contingency plan:

“If all else fails, kick with lunar boot.”

And because this is the multimedia era, here’s a clip from BBC Four, in which James Burke demonstrates how the KC-135 “Vomit Comet” was used for weightless training.

# Nine Minutes of Science

OK, this is too good to pass up. Jim Blinn, the computer-graphics expert responsible for the *Mechanical Universe* animations — and therefore, responsible for filling my childhood with arrows — summarizes *The Mechanical Universe* in nine minutes. Watch all of first-year physics packed in a single morsel:

Blinn also worked on Caltech’s *Project MATHEMATICS!* series. I’m a little surprised that so few of the *Project MATHEMATICS!* videos have found their way onto the Intertubes yet. Here’s a “teaser trailer” of sorts, made from clips of “The Story of π”:

Continue reading Nine Minutes of Science

# Rosenhouse on Amanda Shaw

Following up on his previous post, “Is Math a Gift From God?” — calculus students say, “No!” — Jason Rosenhouse has a new essay for your delectation, “Is God Like an Imaginary Number?” Again, the short answer is, “Nope.” The longer answer will take us into the history of mathematics, the role of mysticism in theology and the relationship between science and verbal description.

Rosenhouse sets himself the task of fisking an essay in the religious periodical *First Things,* by a “Junior Fellow” of that publication named Amanda Shaw. Shaw’s thesis is that the notion of God is akin to that of an imaginary number, and moreover that the same closed-minded orthodoxy which rejected the latter from mathematics for oh so many years is unjustly keeping the former out of science. I find this stance to be, in a word, ironical: if you’re looking for dogmatism and condemnations of the heterodox, your search will be much more rewarding if you look among the people who reject scientific discoveries because they are inconsistent with a Bronze Age folk tale than if you search through science itself!

Still, it’s a fun chance to talk about history and mathematics.

**PART A: COMPLEX NUMBERS**

As I described earlier, “imaginary” and “complex” numbers arise naturally when you think about the ordinary, humdrum “real numbers” — you know, fractions, decimals and all those guys — as *lengths on a number line.* In this picture, adding two numbers corresponds to sticking line segments end-to-end, multiplication means *stretching* or *squishing* (in general, *scaling*) line segments, and negation means flipping a segment over to lie on the opposite side of zero. Complex numbers appear when you ask the question, “What operation, when performed twice in succession upon a line segment, is equivalent to a negation?” Answer: *rotating* by a quarter-turn!
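That claim is easy to check with the complex numbers built into any modern programming language (my illustration, not from the original post):

```python
# Multiplying by i rotates a point of the complex plane a quarter-turn
# counterclockwise; doing it twice therefore negates the point.
z = 3 + 2j           # an arbitrary point, standing in for a line segment
once = 1j * z        # one quarter-turn: (3 + 2j) -> (-2 + 3j)
twice = 1j * once    # a second quarter-turn: -> (-3 - 2j)
print(twice == -z)   # prints True: two quarter-turns make a negation
```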

Historically, mathematicians started getting into complex numbers when they tried to find better and better ways to solve real-number equations. Girolamo Cardano (1501–1576), also known as Jerome Cardan, posed the following problem:

If some one says to you, divide 10 into two parts, one of which multiplied into the other shall produce […] 40, it is evident that this case or question is impossible. Nevertheless, we shall solve it in this fashion.

In more modern algebraic notation, this is like saying [tex] x + y = 10 [/tex] and [tex] xy = 40 [/tex], which we can combine into one equation by solving for [tex] y [/tex], thusly:

[tex] xy = x(10 - x) = 40.[/tex]

In turn, shuffling the symbols around gives

[tex] x^2 - 10x + 40 = 0,[/tex]

which, plugged into ye old quadratic formula, yields

[tex] x = \frac{10 \pm \sqrt{100 - 160}}{2}, [/tex]

or, boiling it down,

[tex] x = 5 \pm \sqrt{-15}. [/tex]

Totally loony! Taking the square root of a *negative number?* Forsooth, thy brains are bubbled! Oh, wait, didn’t we just realize that we could maybe handle the square root of a negative number by moving into a two-dimensional plane of numbers? Yes, we did: that’s the prize our talk of flips and rotations won us!
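And Cardano’s “impossible” pair of numbers behaves exactly as demanded, as a quick check with Python’s complex arithmetic shows (my addition, not part of the original post):

```python
import cmath

x = 5 + cmath.sqrt(-15)   # 5 + i*sqrt(15)
y = 5 - cmath.sqrt(-15)   # 5 - i*sqrt(15)

print(x + y)   # 10 exactly: the imaginary parts cancel
print(x * y)   # 40 (up to rounding): 25 - (i*sqrt(15))**2 = 25 + 15
```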

Continue reading Rosenhouse on Amanda Shaw

# Dangerous Ideas

I must admit that when I hear somebody talking about “dangerous ideas,” one of my eyebrows will — without voluntary intervention on my part — lift upwards, Spock-style. Such talk invariably reminds me of my old film-studies professor, David Thorburn, who said, paraphrasing the acerbic Gerald Graff, “if the self-preening metaphors of peril, subversion and ideological danger in the literary theorists’ account of their work were taken seriously, their insurance costs would match those for firefighters, Grand Prix drivers and war correspondents.”

Still, when Bee at Backreaction says something is interesting, I take a look. Today’s topic is the *Edge* annual question for 2006, “What is your Dangerous Idea?” Up goes the eyebrow. I don’t want to go near the Susskind/Greene spat about “anthropic” reasoning; frankly, without technical details far beyond the level of an *Edge* essay, “anthropic” talk rapidly devolves into inanities which resemble the assertion, “Hitler *had* to lose the war, because otherwise we wouldn’t be sitting around talking about why Hitler lost the war.” Suffice to say that neither Susskind nor Greene mentions NP-complete problems or proton decay.

So, moving on, let’s get to what Bee calls “the more bizarre pieces.” I was particularly drawn to and repelled from (yeah, it was a weird feeling) the essays of Rupert Sheldrake and Rudy Rucker. The latter goes off about “panpsychism,” which sounds like a fantastic opportunity to ramble about quantum mechanics, the inner lives of seashells and the dictionary of Humpty Dumpty, in which words mean exactly what the speaker wants them to mean, reason and usage notwithstanding.

Hey, “consciousness” is just one tiny part of what living things do, and life is a teensy fraction of what the Universe does. Why not give the rest of the biosphere a little attention and support “panphotosynthesism” instead?

Continue reading Dangerous Ideas