17 Equations that Clogged My Social-Media Timeline

An image burbled up in my social-media feed the other day, purporting to be a list of “17 Equations that Changed the World.” It’s actually been circulating for a while (since early 2014) and claims to summarize the book of that name by Ian Stewart. The list is typo-ridden, historically inaccurate, and generally indicative of a lousy knowledge-distribution process that lets us down at every stage, from background research to fact-checking to copy-editing.

'17 Equations that Changed the World' (errors in image detailed in the post text)

The following comments are meant to be representative, not exhaustive.

It’s not known whether Pythagoras proved the theorem we named for him—or if any of the stories about him are more than legends, really. When you go back that far, the history of mathematics and science becomes semi-legendary. The best one can typically do for “evidence” is a fragment of a lost book quoted in another book that happened to survive, and all of it dating to decades or centuries after the events ostensibly being chronicled. Did Pythagoras actually prove the theorem we named after him, or did he merely observe that it held true in a few special cases, like the 3-4-5 right triangle? Tough to say, but the latter would have been easier, and it would seem to appeal to a number mystic, for whom it’s all about the Benjamins, er, successive whole numbers. Pythagoras himself probably wrote nothing, and nothing in his own words survives. It’s not clear whether his contemporaries viewed him as a mathematician or primarily as a propounder of an ethical code. (Even only 150 years after the time he purportedly lived, the ancient authorities disagreed about whether Pythagoras was a vegetarian, with Aristoxenus saying no and Eudoxus yes.) If Pythagoras had never lived, and a cult had attributed their work to that name in ritual self-denial; if the stories of his visiting Egypt and being the son of a Tyrian corn merchant began as parables and were later taken as biography—it would be hard to tell the result from what we have today. (And, in fact, groups of mathematicians do sometimes publish under a collective pseudonym: witness the Bourbaki collective.)

Typical, really: Indian and Chinese people do the actual work, and the white guy who likely didn’t gets all the credit.

I’ll outsource the criticism of the “logarithms” part:

Once again [the] simple attribution to John Napier is exactly that, simplistic and historically misleading. We can find the principle on which logarithms are based in the work of several earlier mathematicians. We can find forms of proto-logarithms in both Babylonian and Indian mathematics and also in the system that Archimedes invented to describe very large numbers. In the fifteenth-century Triparty of the French mathematician Nicolas Chuquet, we find the comparison between the arithmetical and geometrical progressions that underlay the concept of logarithms, but whether Chuquet ever took the next step is not clear. In the sixteenth century the German mathematician Michael Stifel studied the same comparison of progressions in his Arithmetica integra and did take the next step, outlining the principle of logarithms, but doesn’t seem to have developed the idea further.

It was in fact John Napier who took the final step and published the first set of logarithmic tables in his book Mirifici Logarithmorum Canonis Descriptio in 1614. However, the Swiss clockmaker and mathematician Jost Bürgi developed logarithms independently of Napier during the same period, although his book of tables, Arithmetische und Geometrische Progress Tabulen, was first published in 1620.
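The “comparison of progressions” described in the passage above is, in modern notation, just the exponent law: terms of a geometric progression multiply while their indices in the matching arithmetic progression add. A quick illustration (my notation, not Chuquet’s or Stifel’s):

$$q^m \cdot q^n = q^{m+n}, \qquad \text{equivalently} \qquad \log_q(ab) = \log_q a + \log_q b.$$

Tables of logarithms turned every multiplication into an addition, which is why they mattered so much for practical computation.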

The “calculus” line is a mess. For starters, in at least one version circulating online, it’s got an extra “=” thrown in, which makes the whole thing gibberish. The $df$ over $dt$ notation is due to Leibniz, but the list attributes it to Newton, his bitter enemy (and a pretty bitter guy overall, by many accounts). Pierre de Fermat understood quite a bit of the subject before Newton worked on it, getting as far as computing the maxima and minima of curves by finding where their tangent lines are horizontal. And the philosophy of setting up the subject of calculus using limits is really a nineteenth-century approach to its foundations.
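In modern language, Fermat’s tangent criterion is the familiar first-derivative test. A worked example (mine, not Fermat’s, who reasoned via “adequality” rather than derivatives): to find the maximum of $f(x) = x(1-x)$,

$$f'(x) = 1 - 2x = 0 \;\Longrightarrow\; x = \tfrac{1}{2}, \qquad f\!\left(\tfrac{1}{2}\right) = \tfrac{1}{4},$$

the point where the tangent line to the curve is horizontal.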

Inverse-square gravity was considered before Newton, and imaginary numbers before Euler.

Credit for the normal distribution should also go to de Moivre (earlier than Gauss) and Laplace (contemporaneous).
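For reference, the density being credited is

$$f(x) = \frac{1}{\sigma\sqrt{2\pi}}\, e^{-(x-\mu)^2 / 2\sigma^2},$$

with mean $\mu$ and standard deviation $\sigma$. De Moivre arrived at this curve as a limiting approximation to binomial probabilities, the context that Laplace later generalized.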

Maxwell never wrote his equations in that manner; that notation came later, with Heaviside, Gibbs, Hertz, and vector calculus. The simplification provided by the vector calculus is really nothing short of astonishing.
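For comparison, here is the compact vector-calculus form (in SI units, one common convention, not Heaviside’s original presentation):

$$\nabla \cdot \mathbf{E} = \frac{\rho}{\varepsilon_0}, \qquad \nabla \cdot \mathbf{B} = 0, \qquad \nabla \times \mathbf{E} = -\frac{\partial \mathbf{B}}{\partial t}, \qquad \nabla \times \mathbf{B} = \mu_0 \mathbf{J} + \mu_0 \varepsilon_0 \frac{\partial \mathbf{E}}{\partial t}.$$

Maxwell’s own formulation, written out in components, ran to twenty equations.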

The idea of entropy came via Clausius, who found inspiration in the work of Carnot. The statement that entropy either increases or stays the same, which we could write as $dS \geq 0$, predates Boltzmann. What Boltzmann provided was an understanding of how entropy arises in statistical physics, the study of systems with zillions of pieces whose behavior we can’t study individually, but only in the aggregate. If you want to attribute an equation to Boltzmann in recognition of his accomplishments, it’d be better to use the one that is actually carved on his tombstone,
$$S = k \log W,$$
where $k$ is Boltzmann’s constant and $W$ counts the microscopic arrangements consistent with a given macroscopic state.

I am not sure that $E = mc^2$ is the proper way to encapsulate the essence of relativity theory. It is a consequence, not a postulate or a premise. The Lorentz transformation equations would do a better job at cutting to the heart of the subject. Note that these formulae are named after Lorentz, not Einstein; to put the history very, very briefly, Lorentz wrote the equations down first, but Einstein understood what they meant. (And the prehistory of $E = mc^2$ is pretty fascinating, too.)
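For the record, the Lorentz transformations for a boost with speed $v$ along the $x$-axis are

$$x' = \gamma\,(x - vt), \qquad t' = \gamma\left(t - \frac{vx}{c^2}\right), \qquad \gamma = \frac{1}{\sqrt{1 - v^2/c^2}}.$$

Taking $v \ll c$ sends $\gamma \to 1$ and recovers the everyday Galilean rule $x' = x - vt$, $t' = t$; everything strange about special relativity lives in the departure of $\gamma$ from unity.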

Plucking out the Schrödinger equation (the list omits the umlaut because careless) does a disservice to the history of quantum mechanics. There are ways of doing quantum physics without invoking the Schrödinger equation: Heisenberg’s matrix mechanics, the Dirac–Feynman path integral, and the one it’s my day job to work on. In fact, not only did Heisenberg’s formulation come first, but we didn’t know what Schrödinger’s work meant until Max Born clarified that the square of the size of Schrödinger’s complex number $\Psi$ is a probability.
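Born’s statistical interpretation, stated in modern notation: if $\Psi(x,t)$ solves the Schrödinger equation, the probability density for finding the particle at position $x$ at time $t$ is

$$p(x,t) = |\Psi(x,t)|^2.$$

Without this postulate, the equation by itself tells you how $\Psi$ evolves but not what $\Psi$ means.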

The number of names in that last paragraph—and I wasn’t even trying—is a clue that factoids and bullet points are not a good way of learning physics.

Yes, Robert May did write about the logistic map,
$$x_{t+1} = k x_t(1-x_t),$$
but he was hardly the first to poke at it. In his influential paper “Simple mathematical models with very complicated dynamics,” there’s a moment that expresses pretty well how science happens sometimes:

How are these various cycles arranged along the interval of relevant parameter values? This question has to my knowledge been answered independently by at least 6 groups of people, who have seen the problem in the context of combinatorial theory, numerical analysis, population biology, and dynamical systems theory (broadly defined).
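The behavior May surveyed is easy to see numerically. A minimal sketch of iterating the map (the parameter values and starting point below are my illustrative choices, not taken from May’s paper):

```python
# Iterate the logistic map x_{t+1} = k * x_t * (1 - x_t), discard a
# transient, and report what the orbit settles into.

def logistic_orbit(k, x0=0.2, skip=500, keep=8):
    """Run `skip` warm-up iterations, then return the next `keep` values,
    rounded so that a settled cycle shows up as repeated numbers."""
    x = x0
    for _ in range(skip):
        x = k * x * (1 - x)
    orbit = []
    for _ in range(keep):
        x = k * x * (1 - x)
        orbit.append(round(x, 4))
    return orbit

for k in (2.5, 3.2, 3.9):
    print(k, logistic_orbit(k))
```

At $k = 2.5$ the orbit settles onto the fixed point $1 - 1/k = 0.6$; at $k = 3.2$ it alternates between two values; at $k = 3.9$ it wanders without repeating any short cycle, the “very complicated dynamics” of the paper’s title.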

Also, d’Alembert was not named “d’Almbert.”