I nearly sprayed my breakfast across my friend’s new flatscreen monitor when I saw the latest from Michael Egnor:
Clearly the brain, as a material substance, causes movement of the body, which is also a material substance. The links are nerves and muscles. But there is no material link between our ideas and our brains, because ideas aren’t material.
Mr. Spock, are your sensors detecting any signs of intelligent life?
As I said just a few days ago, this sort of woo is obvious on its face. When I haven’t eaten, a sensation exists in my mind which I label “hunger.” Eating makes this sensation go away. Ingesting a material called “aspirin” (which sounds more provocative if you call it “acetylsalicylic acid”) makes the pain perceived by my consciousness go away. Four centuries before Herod, Hippocrates was already saying, “Men ought to know that from nothing else but thence [from the brain] come joys, delights, laughter and sports, and sorrows, griefs, despondency, and lamentations.” If you read Carl Zimmer’s excellent Soul Made Flesh (2004), you can get the juicy details on how the ghost was pushed into the gaps: by Newton’s time, Thomas Willis had already divided the human mentality into a “rational soul,” which was the immortal and immaterial part responsible for the highest functions of reasoning, and the “sensitive soul,” which processed sensory input and did all the things which Willis found could be affected by physically perturbing the brain. When interviewed on All in the Mind, Zimmer summarized the story, saying that Willis was able to place memory, learning, language, emotions, dreams and other complex behaviors in the sensitive soul, the one made out of matter. “So,” he said, “you kind of wonder after a while, well, what’s the rational soul there to do anymore.”
If you’d like a digestible history of neuroscience, check out Neurevolution’s chronicle of History’s Top Insights into Brain Computation. (Tip o’ the EEG helmet to Mind Hacks for finding the link.) The very first item is that, as I said, Hippocrates placed the mind firmly within the brain twenty-four centuries ago. And our data has only become more colorful since: everybody remembers Phineas Gage, the railway construction gang foreman who took a 13¼-pound iron rod through the head. After his frontal lobes had been injured, Gage’s personality changed, and he became by all accounts much worse company.
Back in 1996, Frank Vertosick, Jr. wrote a story for Discover Magazine’s “Vital Signs” column, in which he described a teenage boy who accomplished the opposite of Gage’s transformation by shooting himself in the head with his father’s .22. Vertosick suctioned bone and metal from the boy’s brain with a thin rubber hose, washed the wound with antibiotics and sutured up the dura mater. Thanks to this medical care, the boy lived.
Now comes the interesting part. Before the prank had gone horribly awry, “Stephen” had been what some folks call a “wayward youth,” a child of divorced parents, already treated for alcoholism and the proud owner of a juvenile arrest record. After he “performed his own perfect lobotomy,” Stephen’s personality changed remarkably.
A year later, Stephen came to our outpatient clinic. It was the first time I had seen him since he left the hospital. His speech had returned and his mind seemed intact. He had returned to school and was even doing reasonably well, at least when his record was compared with his dismal scholastic performance of the past. To his father’s delight and to my surprise, Stephen’s behavior had never been better. He no longer argued with his parents or teachers or had fits of rage. In fact, he showed little desire to do anything outside the house, good or bad, and that suited his father just fine. Even Stephen’s occasional urinary incontinence didn’t seem to trouble either the father or the son.
Vertosick asked the father about the changes in his son’s character. Was he the son that the father had known before?
The question took him slightly aback.
“No, not at all. His personality is completely different; his sense of humor isn’t what it was. No, he isn’t like he was before,” he replied, adding in haste, “but I like him better this way.”
“Well,” thought the surgeon to himself, “whether you like him or not, the new Stephen is here to stay.”
Compare this hard-won clinical knowledge with the musings of Deepak Chopra:

The brain contains no sights, sounds, smells, or tastes. It is a dark, semi-solid mass about the consistency of cold oatmeal. And yet this conglomeration of inert atoms somehow produces the entire visible, tangible world. If this metamorphosis can be explained, then we may find out how the brain might create subtler worlds, the kind traditionally known as heaven. If the secret lies not in brain chemistry but in awareness itself, the afterlife may turn out to be an extension of our present life, not a faraway mystical world.
As PZ Myers laconically remarked, if Chopra’s brain consists of inert atoms and cold oatmeal, we may have a clue into his problem. Egnor demonstrates that one can venture still deeper into Choprawoo than the master himself:
The remarkable thing about materialistic neuroscience, as applied to the study of the mind-brain problem, is how unscientific it is. Scientific materialism as a method in science intrinsically requires that a material cause and its effect share properties that link the cause to the effect. Materialistic scientists rightfully scorn pseudoscience like telekinesis, yet the view that ideas are caused by brain matter is merely a mirror image of the claims made on behalf of telekinesis. We know that Uri Geller can’t really bend a spoon just by thinking about it, because the thought ‘I’m bending this spoon’ and the spoon itself share no properties in common. They’re not connected. But the disconnection between matter and thought works both ways. It makes no more sense to assert that matter alone ‘moves’ ideas than it makes sense to assert that ideas alone move matter.
No, we know that Uri Geller is a fraud because (a) James Randi — and anybody else who puts forth a little effort — can do the same thing without “psychic powers,” and (b) when Johnny Carson took some simple precautions, Geller flopped completely on national TV.
Scientists are familiar with “immaterial” causes having “material” effects. I wiggle a magnet here, a compass needle moves over there. Surprise! Nothing “material” in between. We describe what’s going on with a magnetic field, whose strength and directionality we can calculate using precise mathematics. We aren’t concerned whether the equations of electromagnetism describe fields which aren’t “material”; the important thing is that we can test the predictions of those equations against the reality we observe. If the predictions didn’t pan out, we would throw the equations out (or try to patch them up, depending on how bad the discrepancy was), but in fact the Maxwell Equations for electromagnetism work astonishingly well.
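That "precise mathematics" is no idle boast: the field of a small bar magnet falls off in a calculable way, and you can check with a few lines of arithmetic that it comfortably overwhelms the Earth's field at a nearby compass. This is a minimal sketch; the magnet's moment (1 A·m², typical for a small bar magnet) and the 10 cm separation are illustrative assumptions, not figures from the text:

```python
import math

MU0 = 4 * math.pi * 1e-7  # vacuum permeability, in T*m/A


def dipole_field_on_axis(moment, r):
    """Magnitude of B (tesla) on the axis of a magnetic dipole.

    moment: dipole moment in A*m^2; r: distance in meters.
    Standard dipole formula: B = mu0 * m / (2 * pi * r**3).
    """
    return MU0 * moment / (2 * math.pi * r**3)


# Field of a 1 A*m^2 magnet at a compass 10 cm away,
# compared with Earth's field (~50 microtesla).
b = dipole_field_on_axis(moment=1.0, r=0.10)
earth = 50e-6
print(f"B at 10 cm: {b * 1e6:.0f} microtesla ({b / earth:.0f}x Earth's field)")
```

Nothing "material" crosses the gap, yet the needle swings exactly as the equation says it should.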
We can debunk telekinesis, but we confirm the electrochemistry of the brain, and that makes all the difference in the world.
Furthermore, over the last hundred years or so, we have learned that matter itself is not “material,” in the old-fashioned warm-and-fuzzy sense of the word. Democritus taught that nothing exists save atoms and the void; today, we know that atoms themselves consist largely of the void. Most of an atom is empty space: a clutch of light electrons whirling about in a probability cloud described by the counterintuitive rules of quantum mechanics, surrounding a dense nucleus which is also irrevocably quantum in character. That solid ground beneath your feet? Mostly nothingness. Why don’t we fall straight through the floor? Pauli exclusion: the tendency of electrons not to like each other’s company.
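"Mostly nothingness" is easy to quantify with a back-of-the-envelope calculation: compare the volume of a hydrogen atom (set by the Bohr radius) with the volume of its proton. The numbers below are the standard measured values, not figures from the text:

```python
# Fraction of a hydrogen atom's volume occupied by its nucleus.
BOHR_RADIUS = 5.29e-11    # m, radius of the electron's probability cloud
PROTON_RADIUS = 0.84e-15  # m, measured proton charge radius

# Volumes scale as the cube of the radius, so the ratio of volumes
# is the cube of the ratio of radii.
fraction = (PROTON_RADius := PROTON_RADIUS) and (PROTON_RADIUS / BOHR_RADIUS) ** 3
print(f"Fraction of the atom filled by the nucleus: {fraction:.1e}")
```

The answer comes out around 10⁻¹⁵: the "solid" part of an atom is a few parts per quadrillion of its volume.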
Either ethereal quantities like the electromagnetic field must be called “material,” or material things like rocks and stars and people must be acknowledged to be ethereal at root. Nature will be the way Nature is, and the natural world is by no means constrained to fit the labels we in our ignorance invent so boldly.
Egnor’s inversion of reason and sense shines forth brightly in this passage:
Yet many things in the world, including our ideas and even our theories about the world, are not matter or energy. Altruism is obviously something very real; many people’s lives depend on it. We don’t know exactly what it is, but we know, by its properties, what it’s not. It’s not material. It shares no properties in common with matter. It can’t be caused by a piece of the brain.
By the same “logic,” we could deem the Wall Street stock exchange a supernatural phenomenon. It is obviously something real, upon which many lives (or at least livelihoods) depend. Yet the fluxes and variations of the financial world are not material: they are numbers on a ticker, whose rises and falls bring elation or despair. They share but few properties with, say, the physical and chemical nature of gold.
I conclude with the following excerpt from Egnor’s diatribe:
There is no shared property yet identified by science through which brain matter can cause mental acts like altruism. Material substances have mass and energy. Ideas have purpose and judgment. There is no commonality. The association between brain function and ideas is fascinating, and the association of ideas with regions of the brain is a proper object of scientific study. But where there is no commonality of properties, association cannot be causation. Ideas must be caused by substances that have properties common to ideas — such as purpose and judgment.
I think we can take this as evidence that Egnor once courted young ladies by standing on their doorsteps and offering them glass vases full of shit.
Because, don’t you know, roses can only be caused by rose-like things.
For a parallel discussion and a broader perspective on how neuroscience affects creationism, see PZ Myers’s essay, “Egnor’s machine is uninhabited by any ghost”, which I saw appear on Pharyngula while I was writing this post. As Kenneth S. Kosik wrote in a letter to Nature, accepting that the mind is a product of the brain challenges belief much more deeply than siding with evolution instead of creationism. “The matter now stands at an intellectual impasse,” he writes, “waiting for an issue around which polarized views will crystallize. We can expect some heady days.”