Bloggers of the 1930s

I wonder if it’s true that you can only be a starry-eyed youngster once. I discovered the science fiction of Isaac Asimov when I was a ninth-grader, so far out of the social whirl that I didn’t even suspect I was missing anything. I had my books and my BASIC compiler, and a few friends who shared my tastes in both, so what need had I for cheap beer? I collected the whole “Greater Foundation” series, from the Robot stories to the last Foundation book; Robots and Empire (1985) was the hardest to get, only turning up in my relatives’ used-book store in Seldovia, Alaska. Looking back through them in more recent years, big chunks don’t hold up that well, but I find myself inclined to view even the clunky and dated parts of, say, Pebble in the Sky (1950) with favor. Is it just sentiment at work? Do we each get a quota of one author we read through rose-tinted spectacles, just because they were the first we discovered?

When I try to put sentiment aside, I find that some stories still work, chief among them The Caves of Steel (1954). Still, if I want a book to read for “comfort food,” or to give myself an emotional pick-me-up, I’m more likely to turn to Asimov’s nonfiction, from which there is plenty to choose. (I’d actually discovered the nonfiction first, several years before beginning my love affair with the Robot stories. For years, the only Asimov fiction I knew was the English-language screenplay of Gandahar (1988).) Among his best nonfiction is, funnily enough, his autobiography, of which there is also no shortage: it comes in two strictly chronological volumes carrying his story through the 1970s, followed by a memoir published after his death and a book of his letters published after that.

All this is added value to make the passage I’m quoting today somewhat more admissible under “Fair Use” law. One of the fascinating things about Asimov’s autobiography is that it begins with his family history in Czarist Russia, on the borders of Belarus, then follows his nuclear family through Ellis Island into slum living in Brooklyn, then public school during the Depression, on through World War II. . . right the way to Watergate and, in the third volume, to glasnost and the TRS-80. One human life can cover a great deal of territory. Then come the odd moments of synchrony, when a bit of 1930s New York springs out at you and gains a strange relevance. To that end, here is In Memory Yet Green (1979), p. 209, describing the background to the Greater New York Science Fiction Club’s splitting into the Queens Science Fiction Club and the Futurian Science Literary Society, all the way back in 1938:

Though science-fiction clubs were small, they were contentious. The membership tended to consist of intelligent, articulate, argumentative, short-tempered, and opinionated young men (plus a few women) who got into tremendous power struggles.

You might wonder how power struggles can possibly arise in small clubs devoted to something as arcane as science fiction, and I wonder, too — but it happens. There are arguments over what happened to the thirty-five cents in the treasury, who is to run the fanzine, and other equally momentous problems. I believe there were even arguments as to how best to “control fandom,” or, on a lesser scale, the world.

When the arguments overflowed the possibilities of word-of-mouth, letters flew from fanzine to fanzine — long, articulate, venomous, libelous letters, which often degenerated into threats of lawsuit that never materialized (largely because no lawsuit could ever result in substantial damages when no one being sued was worth more than $1.65, clothes, pocket change, blood chemicals, and all).

Naturally, it didn’t take a club long to split up into two clubs, with each then proceeding to put out competing fanzines. The main task of each fanzine was to vilify the other group with an intensity and a linguistic fluency that Hitler might have studied with profit.

This may sound as though I’m exaggerating but, honestly, I’m not. If anything, I lack the words (competent writer though I am) to describe the intensity of the tempests brewed in the microscopic teapots of science-fiction fandom.

Let me refer you instead to something else. Back in 1954, Sam Moskowitz, one of the most active of the fans of the 1930s (and a dear friend of mine for many years), recalled those days and wrote a book the subtitle of which was A History of Science-fiction Fandom. It dealt with the period from 1935 to 1938 chiefly, and yet Sam found enough to say to fill a closely printed book of 250 pages.

In that book, endlessly and (forgive me, Sam) unreadably detailed, are all the feuds and quarrels of the period among people known only to themselves, over issues unexplainable to others. The title Sam gave the book, without any intent of satire at all, I believe, was The Immortal Storm.

I can only imagine that if someone like Bora Zivkovic wrote a history of science blogging, the outcome would be much the same.

Asimov goes on to explain how his friend Sprague de Camp had a hypothesis about human factionalism, which ran something like this: a band of humans (or proto-humans) fifty members strong couldn’t cover any more territory than a band of twenty-five, so beyond some size the extra food brought in by additional members can’t keep pace with the extra mouths to feed, and the larger band starves while the smaller one survives. Therefore, evolution favors the splitting of large groups into smaller ones, and a certain fractiousness in human nature. (Exercise: restate the hypothesis in terms of kin selection.) Whether true or not, this hypothesis matters here because, according to Asimov, the text de Camp suggested as required reading for the study of human contentiousness was The Immortal Storm.
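
Just to make the arithmetic concrete, here is a toy rendering of that logic in Python; every number in it (the per-forager haul, the territory cap, the daily ration) is my own invention for illustration’s sake, and no part of de Camp’s or Asimov’s account.

```python
# A toy version of de Camp's argument. All numbers are invented for
# illustration: food gathered is capped by the band's territory, while
# food required grows with every extra mouth.

def food_gathered(band_size, per_forager=1.5, territory_cap=40.0):
    """Rations per day the band can collect; limited by territory, not headcount."""
    return min(band_size * per_forager, territory_cap)

def food_required(band_size, ration=1.0):
    """Rations per day needed to feed everyone."""
    return band_size * ration

for size in (25, 50):
    surplus = food_gathered(size) - food_required(size)
    verdict = "survives" if surplus >= 0 else "starves"
    print(f"band of {size}: surplus {surplus:+.1f} rations/day -- {verdict}")

# Output:
#   band of 25: surplus +12.5 rations/day -- survives
#   band of 50: surplus -10.0 rations/day -- starves
```

The particular values are beside the point; the mechanism is just a hard cap on gathering combined with consumption that grows with every member, which is enough to leave the bigger band the hungrier one.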

7 thoughts on “Bloggers of the 1930s”

  1. I’ll just note: I discovered Asimov and cheap beer in 9th grade (and lost the taste for both soon after).

    Incidentally, the magic number for having a cohesive social group is thought by many to be around 150 (sometimes referred to as Dunbar’s number), which is probably larger than most science fiction clubs.

    Also (I won’t try to source this, since I’ve forgotten where I got it), the idea definitely entered my head from somewhere that humans (especially men) are extremely fond of dominance hierarchies, and that not being high in a hierarchy of some sort (even one of SF fans) is very stressful, for reasons that are a bit hard to express. This certainly fits with my personal view of human interactions, but maybe that’s just selection bias.

  2. Well, authoritarianism varies from person to person, in both respects: not everybody has the same willingness to follow or desire to lead. Maybe a bunch of “low RWAs” would split into smaller groups while “high RWAs” could form a hierarchy encompassing more people. Naturally, my mind runs to the idea of simulating this process — as if I didn’t have enough to code already. . . .
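
    If I ever did code it up, it might start as small as the sketch below; every parameter in it (the RWA scores, the size-tolerance formula, the split-in-half rule) is pulled out of thin air just to see the qualitative behavior, not taken from any actual study.

    ```python
    # A back-of-the-envelope sketch, not a serious model: every number is made up.
    # Each member gets an "RWA" score in [0, 1]; a group holds together only while
    # its size stays under a tolerance that grows with the members' average RWA
    # (the hypothetical premise being that high RWAs put up with bigger hierarchies).
    import random

    def tolerated_size(group, base=10, scale=40):
        """Largest size this particular membership will put up with (toy formula)."""
        return base + scale * (sum(group) / len(group))

    def settle(groups):
        """Split any oversized group in two until every group is stable."""
        stable = []
        while groups:
            g = groups.pop()
            if len(g) <= tolerated_size(g):
                stable.append(g)
            else:
                random.shuffle(g)
                mid = len(g) // 2
                groups.extend([g[:mid], g[mid:]])
        return stable

    random.seed(1938)  # 1938, in honor of the original schism; any seed will do
    low_rwa = [[random.uniform(0.0, 0.3) for _ in range(100)]]
    high_rwa = [[random.uniform(0.7, 1.0) for _ in range(100)]]
    print("low-RWA club sizes: ", sorted(len(g) for g in settle(low_rwa)))
    print("high-RWA club sizes:", sorted(len(g) for g in settle(high_rwa)))
    # Expected pattern: the low-RWA crowd shatters into many small clubs,
    # while the high-RWA crowd ends up in a few larger ones.
    ```

    (Splitting in half at random is the crudest possible rule; a real model would presumably let the would-be ringleaders pick their own factions.)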

  3. Just makes me think: if we had been able to start out on Python rather than BASIC back then, we would probably all be much better programmers now. :)

  4. Yeah — but there’s something else that bugs me now and then. Does it ever seem like the “barrier to entry” for programming has gotten higher, while the languages themselves have become generally better? We have Python instead of BASIC, but “back in the day,” we just had to pop a cartridge into the Atari 400 and the interpreter would glow the color of summer skies on our TV tubes. In the age of MS-DOS 5, we just had to type qbasic at the command prompt. Now, to get started with one of these wonderful, elegant languages, you have to find a Web site and download it. Last month, I had the privilege of helping a roomful of business and industry people do just that. . . and it was not an experience I’m eager to repeat.

    Sure, *nix comes with Python, but if you’re a Linux hacker, this isn’t about you anyway.

    And, back in ye olden times of Blade Runner’s first theatrical release, when TAB flowed like water, school textbooks made at least a token effort to include BASIC programs in their math lessons. Maybe we lost a promising start in one direction while being all happy about our progress in another.

  5. That’s a really good point. Computers today invite exploration a lot less than did the old C:\DOS directory full of tasty executables.

    This makes me think of JavaScript. It has lots of crappy aspects (I should know…), but everyone has an interpreter. I mean, it’s definitely less exposed to the user than Atari BASIC was, and it’s a different creature from a general-purpose programming language, but it’s there.

    If I were trying to put together a pedagogical tool to get kids to explore programming for fun, I might take JavaScript plus some libraries, have a really simple server-side component to save programs and data and all that, and wrap all that in some sort of IDE in a Web page. But again, I get your point that it’s not what’s available but what’s exposed by default that’s diminished, and a nifty learning website doesn’t really get at that.

    Incidentally, the One Laptop per Child folks built a “view source” feature into their interface, so kids can see the source of pieces of their UI.

  6. Another curious data point in support of what you’re saying: I just got a cheapish Linux-based Eee PC in an uncharacteristic splurge. On the plus side, I was impressed that the makers seemed to have nailed ease of use; a simple UI presents OpenOffice, Firefox, and the few other programs most users truly need. But they pretty well hid the way you get to a real package manager or a terminal; OS X makes that stuff more reachable. So, in effect, it’s Linux reaching more people but without the exposed hackability that helped make it so awesome in the first place.
