GrrlScientist has injected Encephalon 32 into the InterTubes. Among the very nice posts collected there is The Neurocritic’s take on that political brain study. And in the “Uh-oh” department, I note that the Neurocritic also mentions that Ray Kurzweil has a movie coming out next spring, entitled The Singularity Is Near (2008).
I get a weird vibe from the writings of Ray Kurzweil. (I emphasize that I’m talking about the writings here; I’ve only met the man once, at a speakers’ panel six years ago, which incidentally was moderated by Christopher Lydon — yes, the one from the Dresden Dolls song.) On the one hand, I think that all the evidence supports a materialist view of thought, consciousness and all that; like somebody once quipped, if the lungs breathe and the kidneys filter, then the brain minds. Furthermore, it seems to me that once you take that step, you have to consider the possibility that a mind could be implemented on a different material substrate. While it might be an insuperably difficult problem in practical terms, nothing in science as I understand it forbids “strong AI” in principle.
Yeah, yeah, we can argue about “Chinese rooms” some other time. My point today is that “strong AI” is not nearly as kooky an idea as some of the claims Kurzweil has made, particularly in The Age of Spiritual Machines (1999). Machines that think and dream and emote are a possibility one must consider once one adopts a materialist stance on neuroscience, but predicting when those machines will be built by “extrapolating” the growth of computer power into the 2020s is the futurist’s version of deducing the age of the Universe by adding up the ages in the Old Testament.
And, as PZ Myers pointed out long ago, Kurzweil’s attempt to anchor his “Law of Accelerating Returns” in the geological past is almost painfully silly. It depends upon conflating disparate past events and applying vague or inconsistent standards for what even counts as a data point — and, of course, fitting straight lines to log-log plots is a risky business.