Framing is back. Sheril Kirshenbaum is writing guest posts over at Chris Mooney’s place (1, 2 and 3 so far). In her first three posts, she’s talking sense, though her writing isn’t exactly rocking my geological column.
(That sounds a little dirtier than I intended. Ah, well, I’m not an old fossil yet.)
The interesting thing is that nothing of what Kirshenbaum has written involves deep anthropological foundations. You could have said exactly the same things before the framing kerfuffle and with no knowledge of Lakoffian whosiewhatsits. Now that the subject has been called back to my mind, I think I can offer an executive summary of what bothers me about the whole “framing” business.
First, any attempt to convince a person of a scientific fact on grounds other than the evidence is morally suspect. Second, one must demonstrate that such a trick is even effective. (Can a simple dollars-and-cents accounting convince people to buy compact fluorescent bulbs, without talking about any real environmental reasons to use them?) Arguing for science on the grounds that science means jobs and money forces you to compete with all the other arguments which key on jobs and money. (Can’t we also improve the local economy by opening a chicken-processing plant?) Third, people who genuinely care about fact and truth will not receive the message that science can offer them fact and truth.
Fourth, the whole debate has been bereft of practical suggestions. Where’s the handbook for scientists who want to talk to reporters, and the complementary book for journalists trying to write good science copy? The only halfway solidified suggestion to come out of the goop has been to push those uppity atheists back into obscurity — a proposition which is dubious on all levels. It ignores the way reform movements have historically achieved success (well-behaved infidels seldom make history). It deprives the media of an appealing hook which can keep critical thought in the public eye. It not-so-tacitly acknowledges that religion is above criticism, which is simply a loathsome idea. (If we can’t critique religion, on what grounds can we criticize astrology? Both are ancient traditions, both are omnipresent in our culture, and both are — let’s face it — factually wrong, unless we elevate them through learned talk to such an exalted plane that they no longer contact our daily reality. The stars in the sky only symbolize the stars which guide our lives; a perfect God does not need to perform miracles, and thus is indistinguishable from no God at all. It may be emotionally distasteful to tell a human being, “Your religion is not grounded in fact,” but is that discomfort any worse than the other products of skepticism? “Your sister is wasting money paying for Reiki massage.” “Those horoscopes your mother reads are nonsense.” We agree that such messages are appropriate when the subject is homeopathy or astrology, and that acknowledging the human frailties which lead to credulity is a wise move. Why isn’t the same message with the same acknowledgments appropriate for religion?)
Fifth, the argument that “scientists frame all the time” or, more broadly, that “all communication is framing” is not tenable. In the latter case, one might as well say “all communication is Wakalixication”; the insertion of the extra word brings no new information, and framing becomes like a Silly Putty God who moves in mysterious ways and can be twisted to fit any set of observations. The former, more restricted statement is a slippery slope argument in disguise.
Years ago, my father — who was usually a very talented cook — tried to make Swedish meatballs. Following a train of thought I find difficult to comprehend, he decided that if a little nutmeg was good, a lot of nutmeg must be incredibly great.
Neither I, nor my mother, nor my father, nor Ace the family dog was able to eat those Swedish meatballs.
The lesson, I hope, is clear: one must be careful when extending a proposition across regimes. When lecturing to a freshman biology class or writing a grant proposal, a scientist may be forced to constrain her statements and tailor them to her audience in a way she finds distasteful (and would not employ when speaking to a professional colleague). However, there are limits to the stretching and distortion tolerated, and more importantly, there exist error-correcting feedback mechanisms which keep the hype from getting out of hand and prevent the science from being dumbed down too far.
One should also mention that when giving an introductory lecture, a professor is typically interested in presenting the material as accurately as she can, given the limited background of the students, and under the assumption that the students will be using the knowledge later. This in itself holds back the amount of simplification the professor can employ. But pop science is not a continuing education: it lives and dies in the breadth of a newspaper story and presumes no continuity of material building upon prerequisites.
Sixth, the proponents of “framing” have not — in the material I’ve read — addressed the real divisions of human psychology involved in this situation. To appreciate this point, go now and read Bob Altemeyer’s The Authoritarians (2007).
OK, now you know what I mean when I talk about “authoritarian components” in the human psyche.
Authoritarian followers usually support the established authorities in their society, such as government officials and traditional religious leaders. Such people have historically been the “proper” authorities in life, the time-honored, entitled, customary leaders, and that means a lot to most authoritarians. Psychologically these followers have personalities featuring:
1) a high degree of submission to the established, legitimate authorities in their society;
2) high levels of aggression in the name of their authorities; and
3) a high level of conventionalism.
Because the submission occurs to traditional authority, I call these followers right-wing authoritarians. I’m using the word “right” in one of its earliest meanings, for in Old English “riht” (pronounced “writ”) as an adjective meant lawful, proper, correct, doing what the authorities said. (And when someone did the lawful thing back then, maybe the authorities said, with a John Wayne drawl, “You got that riht, pilgrim!”)
As Altemeyer explains (and illustrates with statistical evidence), a person’s score on the “RWA” scale indicates to a considerable extent how they acquire and evaluate ideas. Naturally enough, “authoritarian followers” are very good at accepting what their trusted authorities say (well, who wouldn’t?), but the phenomenon goes deeper:
authoritarians’ ideas are poorly integrated with one another. It’s as if each idea is stored in a file that can be called up and used when the authoritarian wishes, even though another of his ideas — stored in a different file — basically contradicts it. We all have some inconsistencies in our thinking, but authoritarians can stupefy you with the inconsistency of their ideas. Thus they may say they are proud to live in a country that guarantees freedom of speech, but another file holds, “My country, love it or leave it.” The ideas were copied from trusted sources, often as sayings, but the authoritarian has never “merged files” to see how well they all fit together.
We all do this, to some extent. In fact, one ability which scientists in particular hone and sharpen is the skill of holding two or more contradictory ideas simultaneously in mind as alternative working hypotheses. However, this is a means to a goal, namely the determination of a solid fact which can trump the other possibilities. This is quite a different approach from authoritarian doublethink (or n-tuplethink), which implies holding a clutch of contradictory ideas in the mind’s filing cabinet indefinitely!
High doses of authoritarianism also mean that the distinction between “is” and “ought” is blurred or obliterated outright. The listener judges the factual correctness of a statement by its moral character. “Homosexuality is a sin,” they’ll say, and so the only studies which can be deemed “scientifically sound” are the ones which correlate homosexuality with broken families, pedophilia and so forth. Sherlock Holmes warned us against this kind of bad judgment, but we didn’t listen. Rather than learning all the facts, guessing the outcome of an action and evaluating that outcome against our moral principles, a Bronze Age morality is allowed to determine the acceptable facts.
So, that’s the frame within which science must be squeezed! A tight fit, isn’t it?
Seventh, distinguishing between short-term and long-term efforts is a good idea, but to suggest something slightly heretical, the side of reason is already doing pretty well on the short-term end (not perfectly, judging by indices like court cases, but pretty well). The problem we then face is one of induction: winning the first N battles is no guarantee that you’ll win battle N + 1, so long-term thinking is vitally necessary.
Today, you have to “frame” evolution. Next year, you’ll have to “frame” nuclear power. In five years, you’ll be “framing” gene therapy. A decade after that, you’ll have to “frame” nanotechnology. Isn’t that an awfully long time to be repeating the mantra, “It’s good for jobs. It’s good for the economy. It’s good for America”?
If you want to hire a PR firm to repeat that mantra to politicians and lobbyists, that’s fine. In the short term, it might work. But isn’t it a good idea to invest some resources in long-term thinking? (Why, that’s exactly what people are saying we should do for the environment! How shocking.) You might be grateful a few years down the pike.
Here’s a trick. Science is a body of facts and a method for discovering facts, so every time you see a disagreement about science policy, why not replace the word “science” with the word “fact”? The question “does science support a particular political position” sounds deep and troubling, but after that substitution, it parses to the question, “Do facts matter in politics?”
This suggests that the scientific community should bend itself to the current political situation only insofar as we are changing that situation to reflect actual fact. Science is not just another pork barrel project. It’s not just a trendy metanarrative. It’s our path to the truths we need to improve our lives. Knowledge gives birth to technology, and technology — be it Napster or the atom bomb — has a nasty habit of changing the economic playing field. If we don’t put serious effort into making our political processes reflect this, we’re going to have ourselves an extremely unhappy century.
Previous Sunclipse posts on “framing”:
- “I Was Framed!” (10 April 2007)
- “Interlude: Framing” (16 April 2007)
- “In Soviet Russia, Evidence Frames You!” (17 April 2007)
- “Addiction” (20 April 2007)
- “Blaggregation at Darwin’s” (27 April 2007)
I also have a crude taxonomy of bad science journalism, entitled “All the News that Fits, We Print” (22 April 2007).