# Know Thy Audience?

D. W. Logan et al. have an editorial in PLoS Computational Biology giving advice for scientists who want to become active Wikipedia contributors. I was one, for a couple of years (cue the “I got better”); judging from my personal experience, most of their advice is pretty good, save for item four:

Wikipedia is not primarily aimed at experts; therefore, the level of technical detail in its articles must be balanced against the ability of non-experts to understand those details. When contributing scientific content, imagine you have been tasked with writing a comprehensive scientific review for a high school audience. It can be surprisingly challenging explaining complex ideas in an accessible, jargon-free manner. But it is worth the perseverance. You will reap the benefits when it comes to writing your next manuscript or teaching an undergraduate class.

Come again?

Whether Wikipedia as a whole is “primarily aimed at experts” or not is irrelevant for the scientist wishing to edit the article on a particular technical subject. Plenty of articles — e.g., Kerr/CFT correspondence or Zamolodchikov c-theorem — have vanishingly little relevance to a “high school audience.” Even advanced-placement high-school physics doesn’t introduce quantum field theory, let alone renormalization-group methods, centrally extended Virasoro algebras and the current frontiers of gauge/gravity duality research. Popularizing these topics may be possible, although even the basic ideas like critical points and universality have been surprisingly poorly served in that department so far. While it’s pretty darn evident for these examples, the same problem holds true more generally. If you do try to set about that task, the sheer amount of new invention necessary — the cooking-up of new analogies and metaphors, the construction of new simplifications and toy examples, etc. — will run you slap-bang into Wikipedia’s No Original Research policy.

Popularization is hard. When you make a serious effort at it, let yourself get some credit.

Know Thy Audience, indeed: sometimes, your reader won’t be a high-school sophomore looking for homework help; far more often, it will be a fellow researcher checking to see where the minus signs go in a particular equation, or a graduate student looking to catch up on the historical highlights of their lab group’s research topic. Vulgarized vagueness helps the latter readers not at all, and gives the former only a gentle illusion of learning. Precalculus students would benefit more if we professional science people worked on making articles like Trigonometric functions truly excellent than if we puttered around making up borderline Original Research about our own abstruse pet projects.

ARTICLE COMMENTED UPON

• Logan DW, Sandal M, Gardner PP, Manske M, Bateman A, 2010 Ten Simple Rules for Editing Wikipedia. PLoS Comput Biol 6(9): e1000941. doi:10.1371/journal.pcbi.1000941

# Complexity Swag

By Gad, the future is an amazing place to live.

Where else could you buy this?

Or this?

(Via Clauset and Shalizi, naturally.)

I have a confession to make: Once, when I had to give a talk on network theory to a seminar full of management people, I wrote a genetic algorithm to optimize the Newman-Girvan Q index and divide the Zachary Karate Club network into modules before their very eyes. I made Movie Science happen in the real world; peccavi.
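The stunt is easy to reproduce in miniature. Here’s a toy reconstruction (not the code from that talk): a bare-bones elitist genetic algorithm that breeds node-to-module assignments to maximize the Newman-Girvan Q on a little two-clique graph. Swap in the Karate Club edge list and you’ve got the movie.

```python
import random

# Toy graph: two 4-node cliques joined by one bridge edge (nodes 0-3 and 4-7).
edges = [(0, 1), (0, 2), (0, 3), (1, 2), (1, 3), (2, 3),
         (4, 5), (4, 6), (4, 7), (5, 6), (5, 7), (6, 7),
         (3, 4)]
n, m = 8, len(edges)
degree = [0] * n
for u, v in edges:
    degree[u] += 1
    degree[v] += 1

def modularity(assign):
    """Newman-Girvan Q for a partition given as a node -> module label list."""
    q = 0.0
    for c in set(assign):
        e_c = sum(1 for u, v in edges if assign[u] == c == assign[v])
        d_c = sum(degree[i] for i in range(n) if assign[i] == c)
        q += e_c / m - (d_c / (2 * m)) ** 2
    return q

def evolve(pop_size=40, generations=200, k=2, seed=1):
    """Elitist GA: keep the fitter half, mutate one node's module in each survivor."""
    rng = random.Random(seed)
    pop = [[rng.randrange(k) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=modularity, reverse=True)
        survivors = pop[:pop_size // 2]
        children = []
        for parent in survivors:
            child = parent[:]
            child[rng.randrange(n)] = rng.randrange(k)  # point mutation
            children.append(child)
        pop = survivors + children
    return max(pop, key=modularity)

best = evolve()
print(best, round(modularity(best), 3))  # finds the two cliques, Q = 0.423
```

For a graph this small the GA is pure theater, of course; exhaustive search over the 256 bipartitions would do. That was rather the point.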

# Textbook Cardboard and Physicist’s History

By the way, what I have just outlined is what I call a “physicist’s history of physics,” which is never correct. What I am telling you is a sort of conventionalized myth-story that the physicists tell to their students, and those students tell to their students, and is not necessarily related to the actual historical development, which I do not really know!

Richard Feynman

Back when Brian Switek was a college student, he took on the unenviable task of pointing out when his professors were indulging in “scientist’s history of science”: attributing discoveries to the wrong person, oversimplifying the development of an idea, retelling anecdotes which are more amusing than true, and generally chewing on the textbook cardboard. The typical response? “That’s interesting, but I’m still right.”

Now, he’s a palaeontology person, and I’m a physics boffin, so you’d think I could get away with pretending that we don’t have that problem in this Department, but I started this note by quoting Feynman’s QED: The Strange Theory of Light and Matter (1986), so that’s not really a pretence worth keeping up. When it comes to formal education, I only have systematic experience with one field; oh, I took classes in pure mathematics and neuroscience and environmental politics and literature and film studies, but I won’t presume to speak in depth about how those subjects are taught.

So, with all those caveats stated, I can at least sketch what I suspect to be a contributing factor (which other sciences might encounter to a lesser extent or in a different way).

Suppose I want to teach a classful of college sophomores the fundamentals of quantum mechanics. There’s a standard “physicist’s history” which goes along with this, which touches on a familiar litany of famous names: Max Planck, Albert Einstein, Niels Bohr, Louis de Broglie, Werner Heisenberg, Erwin Schrödinger. We like to go back to the early days and follow the development forward, because the science was simpler when it got started, right?

The problem is that all of these men were highly trained, professional physicists who were thoroughly conversant with the knowledge of their time — well, naturally! But this means that any one of them knew more classical physics than a modern college sophomore. They would have known Hamiltonian and Lagrangian mechanics, for example, in addition to techniques of statistical physics (calculating entropy and such). Unless you know what they knew, you can’t really follow their thought processes, and we don’t teach big chunks of what they knew until after we’ve tried to teach what they figured out! For example, if you don’t know thermodynamics and statistical mechanics pretty well, you won’t be able to follow why Max Planck proposed the blackbody radiation law he did, which was a key step in the development of quantum theory.
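(For reference, the law in question, in its modern form:

$$ u(\nu, T) = \frac{8\pi h \nu^3}{c^3} \, \frac{1}{e^{h\nu / k_B T} - 1}, $$

which Planck reached by interpolating between the entropy expressions implied by the Wien and Rayleigh-Jeans limits: precisely the kind of thermodynamic reasoning a sophomore hasn’t yet been taught.)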

Consequently, any “historical” treatment at the introductory level will probably end up “conventionalized.” One has to step extremely carefully! Strip the history down to the point that students just starting to learn the science can follow it, and you might not be portraying the way the people actually did their work. That’s not so bad, as far as learning the facts and formulæ is concerned, but you open yourself up to all sorts of troubles when you get to talking about the process of science. Are we doing physics differently than folks did N or 2N years ago? If we are, or if we aren’t, is that a problem? Well, we sure aren’t doing it like they did in chapter 1 of this textbook here. . . .

I noticed this one when it first hit the arXivotubes a while back; now that it’s been officially published, it caught my eye again.

G. Rozhnova and A. Nunes, “Population dynamics on random networks: simulations and analytical models” Eur. Phys. J. B 74, 2 (2010): 235–42. arXiv:0907.0335.

Abstract: We study the phase diagram of the standard pair approximation equations for two different models in population dynamics, the susceptible-infective-recovered-susceptible model of infection spread and a predator-prey interaction model, on a network of homogeneous degree $$k$$. These models have similar phase diagrams and represent two classes of systems for which noisy oscillations, still largely unexplained, are observed in nature. We show that for a certain range of the parameter $$k$$ both models exhibit an oscillatory phase in a region of parameter space that corresponds to weak driving. This oscillatory phase, however, disappears when $$k$$ is large. For $$k=3, 4$$, we compare the phase diagram of the standard pair approximation equations of both models with the results of simulations on regular random graphs of the same degree. We show that for parameter values in the oscillatory phase, and even for large system sizes, the simulations either die out or exhibit damped oscillations, depending on the initial conditions. We discuss this failure of the standard pair approximation model to capture even the qualitative behavior of the simulations on large regular random graphs and the relevance of the oscillatory phase in the pair approximation diagrams to explain the cycling behavior found in real populations.
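For readers who want to poke at this themselves, a stochastic SIRS simulation on a degree-3 regular random graph can be sketched in a few dozen lines. This is a minimal illustration with arbitrarily chosen rates, not the authors’ code:

```python
import random

def random_regular_graph(n, k, rng):
    """Configuration-model pairing, retried until the graph is simple."""
    while True:
        stubs = [v for v in range(n) for _ in range(k)]
        rng.shuffle(stubs)
        pairs = list(zip(stubs[::2], stubs[1::2]))
        if any(u == v for u, v in pairs):
            continue  # self-loop: try again
        if len({frozenset(p) for p in pairs}) < len(pairs):
            continue  # multi-edge: try again
        adj = [[] for _ in range(n)]
        for u, v in pairs:
            adj[u].append(v)
            adj[v].append(u)
        return adj

def sirs_step(state, adj, beta, delta, gamma, rng):
    """Synchronous update: S->I (prob. beta per infected neighbor), I->R, R->S."""
    new = state[:]
    for v, s in enumerate(state):
        if s == 'S':
            infected = sum(1 for u in adj[v] if state[u] == 'I')
            if rng.random() < 1 - (1 - beta) ** infected:
                new[v] = 'I'
        elif s == 'I' and rng.random() < delta:
            new[v] = 'R'
        elif s == 'R' and rng.random() < gamma:
            new[v] = 'S'
    return new

rng = random.Random(42)
adj = random_regular_graph(200, 3, rng)
state = ['I' if rng.random() < 0.1 else 'S' for _ in range(200)]
for t in range(300):
    state = sirs_step(state, adj, beta=0.5, delta=0.3, gamma=0.05, rng=rng)
print('infected fraction:', state.count('I') / len(state))
```

Depending on the parameters and the initial seeding, runs like this either settle toward an endemic level, oscillate and damp out, or die off entirely, which is just the sensitivity the abstract describes.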

# Interlude, with Cat

Want to know why I never get anything done? It’s not just because I find myself volunteered to write a one-act musical entitled Harry Crocker and the Plot of Holes. It’s also because Sean Carroll linked to a whole bunch of physics blogs, mine included, thereby obligating me to read through all their archives, and in the backblog of High Energy Mayhem I found a pointer to a talk by Krishna Rajagopal (my professor for third-term quantum — small world) on applying gauge/gravity duality to strongly coupled liquids like RHIC’s quark-gluon soups and cold fermionic atoms tuned to a Feshbach resonance. It still counts as “work” if the videos I’m watching online are about science, right? Look, if you use the “Flash presentation” option, it plays the video in one box and shows the slides in another! (Seriously, that’s a simple idea which is a very cool thing.)

Anyway, while I stuff my head with ideas I barely have the background to understand, and while I’m revising a paper so that it (superficially) meets PNAS standards, and while I try to re-learn the kinetic theory I forgot after that exam a few years back. . . Here’s a cat!

(This one is for Zeno, and was recaptioned from here.)

# String Theory and Atomic Physics

Physics, as Clifford Johnson recently reminded us, has a strongly pragmatic side to its personality: “If that ten dimensional scenario describes your four dimensional physics and helps you understand your experiments, and there’s no sign of something simpler that’s doing as good a job, what do you care?” As that “ten dimensional” bit might suggest, the particular subject in question involves string theory, and whether tools from that field can be applied in places where they were not originally expected to work. From one perspective, this is almost like payback time: the first investigations of string theory, back in the 1970s, were trying to understand nuclear physics, and only later were their results discovered to be useful in attacking the quantum gravity problem. Now that the mathematical results of quantum-gravity research have been turned around and applied to nuclear physics again, it’s like coming home — déjà vu, with a twist.

This is quintessential science history: tangled up, twisted around and downright weird. Naturally, I love it.

Shamit Kachru (Stanford University) has an article on this issue in the American Physical Society’s new online publication, called simply Physics, a journal intended to track trends and illustrate highlights of interdisciplinary research. Kachru’s essay, “Glimmers of a connection between string theory and atomic physics,” does not focus on the nuclear physics applications currently being investigated, but rather explores a more recent line of inquiry: the application of string theory to phase transitions in big aggregates of atoms. Screwing around with lithium atoms in a magnetic trap is, by most standards, considerably more convenient than building a giant particle accelerator, so if you can get your math to cough up predictions, you can test them with a tabletop experiment.

(Well, maybe you’ll need a large table.)

If you’ve grown used to hearing string theory advertised as a way to solve quantum gravity, this might sound like cheating. Justification-by-spinoff is always a risky approach. It’s as if NASA said, “We’re still stalled on that going-to-the-Moon business, but — hey — here’s TANG!” But, if your spinoff involves something like understanding high-temperature superconductivity, one might argue that a better analogy would be trying for the Moon and getting weather satellites and GPS along the way.

Moreover, one should not forget that without Tang, we could not have invented the Buzzed Aldrin.

# Meanwhile, on the Intertubes

The evilutionary superscientist P-Zed has been trying to drive the riffraff away from his website by writing about biology. First we had “Epigenetics,” and now we’ve got “Snake segmentation.” Meanwhile, Clifford Johnson is telling us about “Atoms and Strings in the Laboratory” (with bonus musical accompaniment). Stick around for stupid questions from me in the comments!

(Everything I know is really just the sum total of answers I’ve received for stupid questions.)

Juan A. Bonachela, Haye Hinrichsen, Miguel A. Munoz, “Entropy estimates of small data sets” J. Phys. A: Math. Theor. 41 (2008). arXiv:0804.4561.

Estimating entropies from limited data series is known to be a non-trivial task. Naive estimations are plagued with both systematic (bias) and statistical errors. Here, we present a new “balanced estimator” for entropy functionals (Shannon, Rényi and Tsallis) specially devised to provide a compromise between low bias and small statistical errors, for short data series. This new estimator out-performs other currently available ones when the data sets are small and the probabilities of the possible outputs of the random variable are not close to zero. Otherwise, other well-known estimators remain a better choice. The potential range of applicability of this estimator is quite broad specially for biological and digital data series.

As an exercise, discuss the relation of this approach to the coincidence-based methods of Ma, Bialas et al.
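To see the bias problem the abstract is talking about, here’s a quick numerical experiment with the naive plug-in estimator and the classic Miller-Madow correction (standard illustrations, not the paper’s balanced estimator): for short series, the plug-in estimate is systematically low.

```python
import math
import random
from collections import Counter

def naive_entropy(sample):
    """Plug-in (maximum-likelihood) Shannon entropy, in nats."""
    n = len(sample)
    return -sum((c / n) * math.log(c / n) for c in Counter(sample).values())

def miller_madow(sample):
    """Plug-in estimate plus the first-order bias correction (K - 1) / (2N)."""
    return naive_entropy(sample) + (len(set(sample)) - 1) / (2 * len(sample))

rng = random.Random(0)
true_h = math.log(8)  # uniform distribution over 8 symbols
trials, n_short = 2000, 20
naive_avg = mm_avg = 0.0
for _ in range(trials):
    sample = [rng.randrange(8) for _ in range(n_short)]  # a short data series
    naive_avg += naive_entropy(sample) / trials
    mm_avg += miller_madow(sample) / trials
print(f"true {true_h:.3f}   naive {naive_avg:.3f}   Miller-Madow {mm_avg:.3f}")
```

With only twenty samples over eight symbols, the plug-in estimator undershoots by roughly (K − 1)/2N nats; the correction claws most of that back.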

# An Unusual Occurrence

So there I was, quietly standing in Lobby 10, queuing to buy myself and a few friends advance tickets to Neil Gaiman’s forthcoming speech at MIT, when a strange odor obtruded onto my awareness. “That’s odd,” thought I, “it smells like backstage at my high school’s auditorium. [snif snif] Or the bathroom at Quiz Bowl state finals. . . And it’s not even 4:20. Something very unusual is going on, here on this university campus.”

I became aware of a, well, perhaps a presence would be the best way to describe it: the sort of feeling which people report when their temporal and parietal lobes are stimulated by magnetic fields. Something tall and imposing was standing. . . just. . . over. . . my. . . right. . . shoulder! But when I turned to see, I saw nothing there.

Feeling a little perturbed, I bought my tickets and tried to shrug it off. Not wanting to deal with the wet and yucky weather currently sticking down upon Cambridge, I descended the nearest staircase and began to work my way eastward through MIT’s tunnel system, progressing through the “zeroth floors” of the classroom and laboratory buildings, heading for Kendall Square and the T station. Putting my unusual experience in the ticket queue out of my mind, I returned to contemplating the junction of physics and neuroscience:

“So, based on the power-law behavior of cortical avalanches, we’d guess that the cortex is positioned at a phase transition, a critical point between, well, let’s call them quiescence and epileptic madness. This would allow the cortex to sustain a wide variety of functional patterns. . . but at a critical point, the Wilson-Cowan equations should yield a conformal field theory in two spatial dimensions. . . .

“But if you reinterpret the classical partition function as a quantum path integral, a field theory in 2D becomes a quantum field theory in one spatial and one temporal dimension. And the central charge of the quantum conformal field theory is equal to the normalized entropy density. . . so we should be able to apply gauge/gravity duality and model the cortex as a black hole in anti-de Sitter spacetime —”

Suddenly, a tentacle wrapped around my chest, and constricted, and pulled, and lifted — not up, but in a direction I had never moved before. Like a square knocked out of Flatland, I had been displaced.

# Physics on the Brain, Part 1

Can physics tell us about ourselves?

To phrase the question more narrowly: can the statistical tools which physicists have developed to understand the collective motion of large agglutinations of particles help us figure out what our brains are doing?

If Jack Cowan and his colleagues are correct, ideas from statistical physics can tell us important facts about our own brains. By studying the recurring motifs of hallucinations, we can construct a geometry of the mind.

“Honeycomb” form constant, from Bressloff, Cowan et al. (2002)

It’s hard to imagine any sort of regularity in a phenomenon as eccentric as visual hallucinations. Our culture is brimming with psychedelia, music and art produced “under the influence” of one or another infamous chemical. Yet the very fact that we can label artwork as “psychedelic” suggests that the effects of those mind-bending substances have a certain predictability. In the 1920s, long before the days of review boards and modern regulations for human experimentation, the neurologist Heinrich Klüver ingested mescaline and recorded his observations. He reported visual hallucinations of four distinct types, which he called “form constants.” These form constants included tunnels and funnels, spirals, honeycomb-like lattices and cobweb patterns. Similar structures have been reported with other drugs, like LSD; these same form constants also appear during migraines, in “hypnogogic” (falling asleep) and “hypnopompic” (waking up) states, when pressure is applied to closed eyes, and even in ancient cave paintings.

If the same hallucinatory images appear from many causes, might they be indicative of some more general property of brain structure?

# Quote of the Moment

The quark gluon plasma studied at RHIC is the least viscous fluid known to man.

Just how non-viscous is it? Well, the folks in the STAR Collaboration tell us that for the quark-gluon plasma, the ratio of shear viscosity to entropy density, $$\eta / s$$, is more than one hundred times smaller than that of water.
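A back-of-envelope check, using rough textbook numbers for water at room temperature (about 0.9 mPa·s for the viscosity and 70 J/(mol·K) for the molar entropy; these values are my own assumptions, not the STAR Collaboration’s): in natural units, water sits a few hundred times above the conjectured KSS lower bound $$\eta / s \geq \hbar / (4 \pi k_B)$$, so a quark-gluon plasma near that bound is indeed over a hundred times less viscous.

```python
import math

hbar = 1.054571817e-34   # J s
k_B = 1.380649e-23       # J / K

# Rough room-temperature numbers for water (assumed, not from the post):
eta_water = 0.89e-3                # shear viscosity, Pa s
s_water = 70.0 * (1000.0 / 0.018)  # ~70 J/(mol K) times ~55,500 mol/m^3

ratio_water = (eta_water / s_water) / (hbar / k_B)  # eta/s in units of hbar/k_B
kss_bound = 1 / (4 * math.pi)                       # conjectured lower bound

print(f"water eta/s ~ {ratio_water:.0f} in units of hbar/k_B")
print(f"that is ~ {ratio_water / kss_bound:.0f} times the KSS bound")
```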

Barton Zwiebach has some videos on this subject which may serve as a good introduction for those with a moderate physics background.

# Anti-Matters

Starting your own journal is a time-honored way to make pseudoscience and outright antiscience look more respectable. Known loonball Paul Cameron did it in order to bolster his homophobia, and the walking dishonesty generators known as “creationists” have done it several times. (Ad hominem? Sure, if you like. But creationism is the morally bankrupt pursuit of the factually wrong, and I lost patience with it quite some time ago.) Now, Jason Rosenhouse reports, they’ve gone and done it again.

These journals invariably founder on their inability to find any scientists willing to write for them. Remember Progress in Complexity, Information and Design? It’s been moribund since November 2005. Or how about Origins and Design? That one went belly-up around the turn of the century.

The latest representative of the genre is Anti-Matters. It bills itself as “A quarterly open-access journal addressing issues in science and the humanities from non-materialistic perspectives.”

Apparently, “non-materialistic” science means attacking evolutionary biology with — wait for it — the Second Law of Thermodynamics! Some bad arguments just never die.

Man, I’m on an emotional roller-coaster here. “Ooh, Jason Rosenhouse has a new post! Joy joy. I wonder what it’s about.” Then I read, and: “Oh. Creationists blathering about the Second Law.”

Oliver Johnson, Christophe Vignat (2006). Some results concerning maximum Renyi entropy distributions.

We consider the Student-t and Student-r distributions, which maximise Renyi entropy under a covariance condition. We show that they have information-theoretic properties which mirror those of the Gaussian distributions, which maximise Shannon entropy under the same condition. We introduce a convolution which preserves the Renyi maximising family, and show that the Renyi maximisers are the case of equality in a version of the Entropy Power Inequality. Further, we show that the Renyi maximisers satisfy a version of the heat equation, motivating the definition of a generalized Fisher information.

Luciano da F. Costa, Francisco A. Rodrigues, Gonzalo Travieso, P. R. Villas Boas (2006). Characterization of complex networks: A survey of measurements.

# ICCS: Emergence in Particle Systems 1

I typed the following notes during Hiroki Sayama’s presentation on “Phase separation and dynamic pattern formation in heterogeneous self-propelled particle systems.” Unfortunately, I couldn’t get a WiFi signal in the room where Sayama gave his talk, so I’m falling short of the gonzo science ideal, posting about the talk after the fact instead of as it happened.

Sayama is speaking about particle swarm systems, and the phase-separation and dynamic pattern formation behaviors they exhibit. He adds the novel feature of heterogeneity to the particle system. Research on self-propelled particles goes back to Reynolds (1987), Vicsek et al. (1995), Aldana et al. (2003), Chuang et al. (2006), etc. Reynolds was a computer scientist who created a method for simulating bird flocking, which later grew into the software behind the bat swarms in the otherwise unremarkable Batman Returns. Vicsek and Aldana were physicists.

These systems show collective behaviors such as random clustering, coherent motions and milling. The same system can exhibit all of these behaviors, depending upon the input parameters. Cranking up the noise can induce phase transitions. Almost all of this work focused on homogeneous particle systems, in which all particles share the same kinetic parameters. What, then, would happen if two or more types of self-propelled particles were mixed together?
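For concreteness, the homogeneous baseline can be sketched in a few dozen lines. The following is a bare-bones Vicsek-style model (my own illustrative parameters, not Sayama’s Swarm Chemistry): each particle adopts the mean heading of its neighbors plus noise, and cranking up the noise destroys the coherently moving phase.

```python
import math
import random

def vicsek(n=120, L=10.0, r=1.0, v=0.1, eta=0.5, steps=150, seed=0):
    """Minimal Vicsek model: each step, every particle takes the mean heading
    of neighbors within radius r (itself included) plus uniform angular noise
    in [-eta/2, eta/2], then moves at speed v with periodic boundaries.
    Returns the polar order parameter (1 = coherent flock, ~0 = disorder)."""
    rng = random.Random(seed)
    xs = [rng.uniform(0, L) for _ in range(n)]
    ys = [rng.uniform(0, L) for _ in range(n)]
    th = [rng.uniform(-math.pi, math.pi) for _ in range(n)]
    for _ in range(steps):
        new_th = []
        for i in range(n):
            sx = sy = 0.0
            for j in range(n):
                dx = (xs[i] - xs[j] + L / 2) % L - L / 2  # periodic distance
                dy = (ys[i] - ys[j] + L / 2) % L - L / 2
                if dx * dx + dy * dy < r * r:
                    sx += math.cos(th[j])
                    sy += math.sin(th[j])
            new_th.append(math.atan2(sy, sx) + rng.uniform(-eta / 2, eta / 2))
        th = new_th
        for i in range(n):
            xs[i] = (xs[i] + v * math.cos(th[i])) % L
            ys[i] = (ys[i] + v * math.sin(th[i])) % L
    vx = sum(math.cos(t) for t in th) / n
    vy = sum(math.sin(t) for t in th) / n
    return math.hypot(vx, vy)

low_noise = vicsek(eta=0.3)
high_noise = vicsek(eta=5.5)
print(f"order parameter, low noise:  {low_noise:.2f}")
print(f"order parameter, high noise: {high_noise:.2f}")
```

Heterogeneity, in this framework, would mean giving different sub-populations different values of r, v and eta and letting them interact, which is where the phase separation comes in.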

Sayama works in a framework he calls Swarm Chemistry, which is implemented as a Java applet that can be run online.

# Nine Minutes of Science

OK, this is too good to pass up. Jim Blinn, the computer-graphics expert responsible for the Mechanical Universe animations — and therefore, responsible for filling my childhood with arrows — summarizes The Mechanical Universe in nine minutes. Watch all of first-year physics packed in a single morsel:

Blinn also worked on Caltech’s Project MATHEMATICS! series. I’m a little surprised that so few of the Project MATHEMATICS! videos have found their way onto the Intertubes yet. Here’s a “teaser trailer” of sorts, made from clips of “The Story of π”: