Sunday, October 28, 2012

American Culture, American Character

Culture is the stories we tell for and about ourselves. Those stories in the past have taken the form of novels, magazine articles, song lyrics, movies, jokes, and shared anecdotes. Today we must include Twitter feeds, Facebook posts, blogs, and—to use the new word—“memes.” Behind those stories and memes, shaping and driving them, is a commonly held set of values, expectations, and dreams. And if you can understand what a person or a people holds dear, expects to happen, and dreams about, then you have solid clues to that person’s—or the people’s—character, to what they will choose to do and how they will react to both stresses and opportunities.

Call the next 2,500 words my love letter to the American culture and character. That’s a big subject for a small space, so expect some prodigious thumbnailing.

First, our culture—or rather, the “popular culture,” expressed in the most obtrusive song lyrics, action movies, and bestselling novels—is going through some kind of disruptive transition right now. Raunchy song lyrics, blatantly pornographic imagery, violent explosions, adolescent defiance, and brutal disdain for the old societal values are all underlying currents. It seems that the perceived relevance of a novel or movie plot today is directly proportional to how close the world comes to blowing up and how that situation frees the main characters to be violently destructive, abandoned, and vicious. The dominant image: the bulbous red lips with tongue sticking out, à la the Rolling Stones. The dominant challenge and response: “What are you rebelling against?” and “Whaddaya got?” à la Marlon Brando in The Wild One.

This disdainful hedonism has been sixty years—two, maybe three generations—in the making, if we take the year of the Brando movie, 1953, as some kind of starting point. That’s almost my entire lifespan. I think the roots of this wildness can be traced to the sudden release of pressure, like popping a champagne cork, when the weight and strain of the Great Depression and World War II were lifted from the American psyche. We suddenly found ourselves on top of the world with our new industrial power providing plenty all around and no one left to beat except the Soviets and Red Chinese—and they were a grim, gray, robot- and antlike people safely half a world away. The fizz of this sudden release might have died out in a single generation if two scientific developments hadn’t intruded. One was the prospect of global nuclear war, with the promise that tomorrow might never come.1 The other was the invention and marketing of the Pill, with the promise that uninhibited sex need have no real consequences. Both developments rocked the psychic and cultural landscape.

But in the longer term—the sweep of history, if you will—the culture of youthful rebellion and apocalyptic abandon is still an aberration, a diversion, a departure from our true natures. If we count from the Plymouth and Jamestown colonies in the early 1600s up to the end of World War II and its aftermath in the 1940s, this nation had three and a half centuries to form a distinct culture. And that culture wasn’t based on “It feels good, and the world’s ending anyway, so let’s do it!”

North America under colonization by the English, Dutch, and French was a vast, empty, rich, and violent landscape.2 The colonization effort created stories, dreams, and aspects of character that had never before existed in the world. The promise of endless land and opportunity experienced during that colonization—and the state that developed after those early colonies broke away from the mother country—shaped the nature of the people who settled here.

1. We All Came from Someplace Else

Every human on this continent—even the “native” Americans—either in person or at some point in his or her ancestry, arrived in North America from someplace else. Some came as seekers, strivers, and dreamers. Some came to escape persecution or personal defeat. Some came following a game trail across the Bering land bridge. And some came in chains as slaves.

But whatever our motivations, none of us found exactly what we were expecting. Except for the very few who stepped ashore carrying a royal charter and a bag of gold, everyone who came here had to struggle and adapt, had to learn to survive, had to make his or her way in a strange land that was never quite as advertised. We are all either the immediate products of that struggle, or we carry the genes of people who struggled and survived.3

Americans carry a selection advantage. We are not among the frogs who will wait quietly in the pot for the water to boil. We have a shorter than average fuse and a built-in tendency to say, “This sucks! I’m outta here!” We carry—if not in actual memory, then as part of our cultural heritage—the idea that things can get better, and that it’s worth the risk to move on and find that better time and place somewhere else. We do not worship tradition. We do not submit gracefully. We are hopeful. We are strong. We are not afraid. And we’re pretty much free of envy and jealousy.

With this heritage, we have a penchant for, and a fascination with, the resilient character, the cat who lands on his feet, the taker of chances, the jack of all trades, the artful confidence trickster, the Huckleberry Finn, Jay Gatsby, Al Capone, and yes, even Bernie Madoff. We don’t necessarily want to be them, much less become their prey, but we understand and tolerate them. We recognize their nature, at least a bit of it, in ourselves.

2. We Have No Natural Peasant Class, No Aristocracy

In most of the rest of the world—in Europe and Asia and, yes, even in South America—the culture sustains unconscious awareness of a large, background population of people who do not count, whose days are presumed to pass in torpid normalcy, whose dreams and aspirations can be dismissed, because their lives will never change. These are the peasants, the rural dullards, the stay-at-homes from which society raises its army of taxpayers and cannon fodder.

The existence of a natural peasant class also implies the existence of a naturally superior aristocracy, the better families and hereditary landowners, the people with an inborn nobility and unearned gifts, from which society develops its statesmen, churchmen, scholars, cavalrymen, and officer corps.

America never had this duality.4 In the extreme East you might hear some reference to “Boston Brahmins” and the “Old Knickerbockers.” Elsewhere you hear derisive comments about “Mr. and Mrs. Gotrocks”—who are usually the nouveau riche and not an established, hereditary class. They return the compliment by talking about “the great unwashed”—but with the uneasy feeling that they themselves are only about three wrong moves away from the trailer park and vocational education.

Everywhere, in every city and town and countryside, America does have its rich and poor. But the lines are fluid, and no one naturally “belongs” to either class. For every wealthy family whose sons now go to Harvard and whose grandfathers made money in furs or oil or newspapers, you can count half a dozen more whose grandfathers were small farmers or shopkeepers and whose sons just made a killing with a new app in Silicon Valley or with a new way of leveraging finance on Wall Street. And for every family clinging to a bit of hard-scrabble land up in the hills or a patch worked out on the plains, you can find a dozen more opening car dealerships and managing restaurants. Our mantra is “You can’t keep a good man down.” We value enterprise, spirit, opportunity, and initiative.

America has no disposable people who can be dismissed as naturally less worthy, less sensitive, with less future potential than anyone else. And we have no naturally valuable people who can boast presumed gifts of talent, knowledge, and gentility. Everyone is exactly what he or she can make out of life, given the opportunities at hand and his or her own imagination and dreams. And so as a culture we value providing readily available tools and remedies: free public school education and usually affordable higher education; easy access to capital through credit, loans, and mortgages; recovery from personal disaster through health and property insurance and bankruptcy protections; freedom to travel, to think and speak, and to try something different. It’s not a coincidence that Americans write, publish, buy, read, and follow the lion’s share of the world’s self-help books and guides.

3. We Come from Small Towns

In the rest of the world, the life of the country is played out in the capital. Englishmen go to London to make their name and fortune, as the French go to Paris, the Russians to Moscow, and the Chinese to Beijing or Shanghai or Hong Kong. The provinces are known as a backwater if not a place of exile.

In America, at least until the latter part of the 20th century, people lived in small towns. And since World War II, with the rise of the automobile, even people who work in the cities have chosen to move out to the surrounding suburbs. Yes, aspiring writers and actors and hopeful lawyers and bankers have all followed their dreams to the cultural and financial capitals of New York and, lately, Los Angeles. But our nation’s political capital, Washington, DC, has never been an object of pilgrimage—at least until the late 20th century, when the national government, its bureaucracy, and its influence started to balloon.

Small-town life is different from city life. For one thing, you know your neighbors and feel connected to them. You volunteer to help them raise a barn or serve at a pancake breakfast. You join the business and charitable clubs, the fellowships, the church choir, and the volunteer fire department. If you feel isolated and alienated in the big city, the problem is likely your surroundings. If you feel disconnected in a small town, the problem is probably your personality.

It’s only in the 20th century, starting with the Great Depression—or perhaps a bit earlier, with the temperance advocates for Prohibition and the reform-minded journalists, called “muckrakers,” who worked for social justice—that Americans have looked to Washington to solve their problems on a national level. For three centuries before that, the Federal government was far away and usually not very effective at solving local problems. Americans looked to their neighbors for safety and a helping hand.

4. We Respect a Person’s Privacy

While everyone knows what’s going on in a small town, Americans tend to draw a curtain over matters that are considered private. These include—or used to—a person’s religion, politics, financial situation, historical antecedents, and hereditary background. The spirit that a person can move on to a new place and make something new and different of himself depends on this sense of delicacy.

We wrote into our founding documents that a person can believe and worship as he chooses, say what he thinks, publish what he believes without retribution, and act in the expectation that before he can be punished, the case against him has to be proven, in public, with a jury of his peers. These are not dusty sentiments written on parchment two hundred years ago and largely forgotten today. They are values that we are still actively debating, testing in the courts, and redefining in new ways as the world changes around us. Also, they are not mystical, hieratic concepts that we common folk leave for our judicial betters to decide: everyone has an opinion, a point of view, and a personal sensitivity about where the needle should lie along the spectrum between casual tolerance and social control.

In this atmosphere, the first question an American asks about a man or woman is rarely “What religion does he follow?” “What party does he belong to?” “How much money does he have?” “Where do his people come from?” or “What race does he belong to?”5 With our rough-and-ready frontier heritage, we tend to ask instead whether he’s reliable in a crisis, able to pay his way, hold his liquor, provide a steady hand with the horses, and willing to coach Little League. We ask if she’s a gossip or a minx, raises responsible children, and whether she can stitch up a wound and supply a cake for the bake sale.6

The American character—and the cultural stories we once told about ourselves and should be telling again—is practical and pragmatic, impatient with formalism, ambitious for success and advancement, independent, tilted toward fairness, respectful of education and demonstrated worth, willing to trust proven competence, small-D democratic, and grounded in common sense. We value what works, because we’ve had to depend on it. We’re willing to try something new, because you never know until you try.

We’re not the best people in the world, but we’re pretty damn good. We still think there’s a future worth working toward and a world of opportunity waiting for us just over the horizon. We might entertain ourselves on Saturday night with movies about churlish adolescents lusting after supermodels and blowing up the world, but such stuff barely touches our minds and hearts. Come Monday morning, we roll up our sleeves and get back to work.

1. See Never the Devil You Know from October 14, 2012, for thoughts on nuclear war and other apocalyptic thinking.

2. As to being “empty,” well yes, perhaps 7 to 10 million native Americans were already living in this vast open space. But for the European settlers, they were a grim, gray, Stone Age people out beyond the edge of the woods—an impediment rather than actual, rightful landowners. The American colonization, at least from the European viewpoint, was not at all like the Vandals and Goths coming into the settled areas of the Roman Empire, or the Mongol hordes invading the plains and developed civilizations of India, the Middle East, and Eastern Europe.
       As to being “violent,” the forests of North America were a wild and stony wood, as yet untouched by ax or plow. The Europeans, who had long ago harvested their native forests for timber and fuel and now lived among trees that were carefully nurtured in “plantations,” had never seen anything like the dense forests of the eastern part of the North American continent. And when they got to the Great Plains—with their sudden tornados, torrential floods from thaws and storms far beyond the horizon, and herds of millions of stampeding bison—they marveled at the raw, untamed power of the land.

3. See We Get the Smart Ones from November 28, 2010.

4. Well, yes, in the Old South the black African slave was dismissed as a person of no account with no future by the slave owner, who thought of himself as a naturally superior being. But that was not a common view throughout the country. In fact, enough of the population abhorred the notion that we fought a war over it, one that killed 600,000 Americans.

5. Or not, thank God, so much anymore. In the past, skin color and physiological features have meant more, especially when dealing with those we considered “the Other.” Today, we—or at least, the best among us—tend to listen for tone of voice, reliable values, sensible advice, breadth of spirit, and level of education more than we look for gene pool and country of origin.

6. Yes, I know, these are rather sexist examples—and throughout this essay I’ve been writing “he” when I should have been using “he or she,” or liberally substituting “she” for the third person pronoun. But I’m reflecting three centuries of attitudes here, and it’s just recently that men and women have started to shrug off their past sexual stereotypes. Only when we can casually ask if a mother will coach Little League and a father provide a cake for the bake sale will we be totally free of gender roles.

Sunday, October 21, 2012

In the Palm of Your Hand

Back when I was at the university, in the late ’60s, if you had told me that one day my telephone would also be a camera, a record player, a television set, a compass, a bookshelf, a newspaper, a reference library, a stock ticker, a weather report, something called a “video recorder,” something else called a “global positioning system” that could find my exact spot on a local map and guide me to all sorts of nearby stores and services, and, oh yes, also a “picturephone” like the one that was demonstrated at the New York World’s Fair in 1964 and played a cameo role in 2001: A Space Odyssey … I’d have said you were crazy.1

I wouldn’t have said this because I was a bumpkin with no imagination. I had been reading science fiction since grade school (the glory years of Asimov, Bradbury, and Heinlein), had watched every sci-fi-oriented television show (Captain Video, The Twilight Zone, The Outer Limits, and later Star Trek), and had listened to radio plays (X Minus One). While still in high school I wrote a full-length science fiction novel (not a very good one—I was a late bloomer). And as an undergraduate I took a three-credit course on the future (including various methods of prediction through analysis of historical, economic, and cultural trends). So I lived through my school years with one foot in the future, but I never saw the smartphone coming.2

It’s not hard to see why. At the time, all of these tools and services required vastly different technologies. The telephone ran on wires through a dedicated switching system. Cameras used film impregnated with light-sensitive dyes to produce flat images. Records created sounds by running a needle along wavering grooves impressed into vinyl disks. Television came through the air with pictures broken up into scan lines beamed across a phosphorescent screen. Compasses were metal needles mounted on pins. Books, newspapers, and libraries were heavy wads of paper. The stock ticker was essentially a telegraph, and weather had to be drawn out on large maps. Video recording was done in a studio with a camera the size of an ice chest and a tape machine as big as a desk. And locating places on the Earth by triangulating signals from satellites was, at best, a gleam in some defense analyst’s eye.3

If you had asked an engineer of the time to pack all these devices into one machine and then to make it portable, the result would have been a rig something like a one-man band’s. These self-styled musicians usually have a drum hanging off their shoulders, cymbals between their knees, harmonica on a bracket at mouth level, and washboard, guitar, and accordion slung across their chests, all held together with cinch straps and a belt. In the engineer’s vision of a 1960s “smartphone,” you would wear a helmet with earphones and a microphone plus a pair of rabbit ears for TV reception, a chest pack sporting various lenses for the cameras and spools or cassettes for film, a shelf at the back with a platter and tone arm for playing records, a cathode-ray tube bracketed at eye level for television, and a shelf with reading table across the stomach for those books and newspapers. It would trail long connecting wires that you could plug in for telephone service. Oh, and it would run on really big batteries or need to plug in somewhere else for power.

All of these devices and services—except for books and newspapers, and the film camera—had only one thing in common: electricity. But conceptually their use of that energy was still diverse: low-voltage diaphragms and carbon granules to run the telephone; electric motor and vacuum tubes to run the record player and its amplifier; high-voltage cathode-ray tube for the television; and so on. These were “electronics” in the primitive sense of analog signal propagation and simple amplifying circuits. And for the camera and the books, nothing.

The 1960s was still an analog world. Voices, sounds, and pictures were all captured in real time, stored physically, and recalled and played out as wavelengths. The rising and falling pitch of a human voice or the musical notes from an instrument were presented in infinitely variable gradations from loud to soft and from high to low. And these were captured as deeper and faster—or shallower and slower—zigs and zags in the endless spiral cut into a vinyl disk, or as stronger and more rapid—or weaker and more widely spaced—magnetic pulses in the iron oxide coating on a piece of tape. The light and dark areas that make up a picture were captured as more or fewer grains of dye held in an emulsion on film and turned sideways—or not—when struck by photons from the visible spectrum. The light and dark of a television image depended on the strength of an electron beam that scanned across a glass screen coated with phosphorescent dyes.

Signal strength—and the various ways to capture it, record it, and play it back—ruled the day. And what makes the smartphone and its many functions now possible is, of course, digital technology. Or rather, all of the various digital technologies that have come together in the years since the 1960s.

Sounds, for example—whether telephone voice transmission, music, or anything a microphone can pick up—instead of being captured as an amplitude-dependent analog signal, are now chopped into digital values representing pitch of voice and speed of delivery. Those binary values4 can be transmitted through a variety of signal formats (wire, microwave, radio), or stored electronically in a memory chip or on a magnetic disk, or manipulated through mathematical algorithms to compress the speed of transmission, filter out background noise, or even completely alter the signal content. Images—whether photographs, maps, or video frames—instead of being captured as pulses representing analog signal strength in each of three primary colors (the red, green, and blue of emitted photons) are now interpreted as an array of picture elements coded for hue and brightness, with more codes for coordinating the line scans and refresh rates in video. Simple book text is stored and sent as binary codes, with more codes for coordinating typeface, treatments, layouts, placement of illustrations, display of footnotes, and the reader’s current position in the text stream.
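The sampling idea described above can be sketched in a few lines of code. Here a smoothly varying “analog” sine wave is measured at regular intervals and each measurement is quantized to an 8-bit value. The sample rate and bit depth are arbitrary choices for the illustration (real digital telephone audio, for comparison, samples 8,000 times per second):

```python
# A minimal sketch of analog-to-digital conversion: sample a continuous
# waveform at regular intervals, then quantize each sample to 8 bits.
import math

SAMPLES_PER_CYCLE = 8      # how often we measure the wave (illustrative)
LEVELS = 256               # 8-bit quantization: values 0 through 255

def digitize(cycles: int = 1) -> list[int]:
    """Turn a continuous sine wave into a list of 8-bit sample values."""
    samples = []
    for n in range(cycles * SAMPLES_PER_CYCLE):
        analog = math.sin(2 * math.pi * n / SAMPLES_PER_CYCLE)  # -1.0 .. 1.0
        digital = round((analog + 1) / 2 * (LEVELS - 1))        # 0 .. 255
        samples.append(digital)
    return samples

print(digitize())   # [128, 218, 255, 218, 128, 37, 0, 37]
```

Once the wave exists as this list of small integers, everything the essay describes—transmission over any medium, storage on a chip, mathematical filtering or compression—becomes ordinary arithmetic on numbers.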

Computers existed in the 1960s, of course. And large amounts of data—mostly words and numbers—were already being interpreted, stored, and manipulated digitally. But the computers themselves were still huge installations, with core memories the size of walk-in closets and storage on banks of spinning tape drives or magnetic disks the size of washing machines. The cost and complexity of these assets restricted their use to large institutions, with access closely controlled and limited to highly trained operators.5 These users typed information on keyboards, which usually transcribed lines of data or code onto Hollerith or “punch” cards, which reading machines would then feed in stacks into the computer’s memory. Users received their output on reams of fanfold paper. Graphical user interfaces, with a high-resolution screen to display data and a mouse for selecting items on it, were only just being invented. Even the science fiction of the time envisioned computers as massive, requiring dedicated rooms full of machinery. Robert Heinlein’s The Moon Is a Harsh Mistress is a classic example.

It wasn’t until the next decade that microprocessors—“computers on a chip”—brought computing power down to something you could pack into a suitcase (the Osborne 1) or a typewriter frame (the Apple II). The designers of these original computing chips were not aiming at general-purpose, portable, personal computers but rather controllers for automated machinery. The computer that controls the fuel injection in a modern car or the clock functions of an automatic coffee maker6 is a microprocessor.

From the 1970s onward, we’ve seen digital technology advance and converge with the expanding capacity of computer chips, until today you can hold in your palm a telephone that’s also a camera, a television set, a record player, and so on. It all seems natural and logical in hindsight. But from the perspective of the 1960s, this combination of handheld devices was unimaginable. Even to contemplate it bordered on magical thinking—or madness.

The point of this meditation is simple. What other advances similar to the digitization of analog signals are out there, perhaps already on our intellectual horizon, that will make the world of 2050 unimaginable or magical to the average person of today? Up until this summer, I would have placed large bets on biological advances through manipulation of individual genomes and cellular functions. But with the announced discovery of the Higgs boson in July, I think we may also see a comparable leap in physics and possible applications through the manipulation of gravity, space, and time.

As I’ve noted before, scientific exploration and understanding have put humankind on an express escalator to a future we can’t even imagine. Fasten your seatbelts, it’s going to be a wonderful ride!

1. And if you had told me that one day I would like peanut butter on my grilled chicken and enjoy raw fish wrapped in cold rice and seaweed, I’d also have said you were crazy. But then, I was a suburban lad brought up on the East Coast, and the specialties of Thai and Japanese cuisine were still unknown to me.

2. Still, I and everyone else who read the Sunday supplements knew that by 1990 our cars would fly like George Jetson’s. We all seriously misunderstood the amount of energy and the sophistication of control required for that little trick.

3. The first global positioning satellite, Navstar 1, wouldn’t be launched until 1978.

4. Quick overview of the technology: All a computer knows and can handle is a two-value system—a switch at a certain location is either “on” (for “1”) or it’s off (for “0”). By selecting either 1 or 0 in each placeholder, and recognizing an orderly progression of placeholders, you can count up to any number, in the same way that you can select from among the ten symbols of Arabic numerals for 0 to 9 in one place, move a place to the left and count 10 to 99, move a place further and count 100 to 999, and so on.
       Binary counting, in base two, originally used eight placeholders (that is, eight binary digits—or “bits”—each of which can be either “1” or “0”) to represent 256 distinct values, 0 through 255. A group of eight bits is generally recognized as a “byte.” By assigning each of those 256 possible values as code for an upper- or lowercase letter, Arabic numeral, punctuation mark, or other commonly agreed-upon representation, you can turn a string of eight binary digits into a letter, number, or symbol.
       You can then make groups of those bytes into words, sentences, and paragraphs. Or, with a different interpretation of the 1’s and 0’s, those bytes can become the pitch and delivery timing for a sound fragment in a conversation or piece of music; or the color and intensity for a picture element (or “pixel”) in a photograph or drawing; or a location reference on a map; or anything else that you and a community of programmers can agree upon. The programmers can then write software that manipulates those values mathematically in any way that human beings can imagine and absorb.
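The idea that the same byte means different things under different agreed-upon interpretations can be shown concretely. The fragment below uses ASCII, one such commonly agreed-upon character code, as its example:

```python
# One byte = eight bits = 2**8 = 256 possible values (0 through 255).
assert 2 ** 8 == 256

# The same bit pattern means different things under different codes.
value = 0b01000001                # eight binary digits, written out
print(value)                      # interpreted as a number: 65
print(chr(value))                 # interpreted as an ASCII character: A

# A string of bytes becomes a word under the character interpretation...
message = bytes([72, 101, 108, 108, 111])
print(message.decode("ascii"))    # Hello

# ...or gray levels, sound amplitudes, or map coordinates under another.
# The bits never change; only the agreed-upon interpretation does.
gray_levels = list(message)       # [72, 101, 108, 108, 111]
```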
       Side note: When I was growing up, a long-distance telephone call was very expensive. This was because telephones using analog signals required human operators or mechanical switches to align various lengths of wire between one place and another into a single circuit—a connected span of wire—all the way from the telephone of the person making the call to the phone of the person receiving it, and then hold that circuit open while the two people talked back and forth. It took a lot of effort to arrange a call and dedicated a lot of expensive wire to carrying that single conversation back and forth.
       These days, using digital technology, a computer in the telephone itself (or alternatively in the local switching office) chops the conversation into convenient digital packets, attaches an identifying address to each packet, and sends it out across the system, which is a vast web of circuits, some carried by wire, some by microwave, and some by radio. Automatic switches along the route pass the addressed packets to the receiving office or phone, where they are reassembled into human speech or whatever else went into the handset microphone. Those digitized packets are chopped, sent, and reassembled by computers operating so fast that the process is virtually undetectable to the humans participating in the conversation.
       Those packets can also be shunted and stored on a memory chip (“voice mail”) or sampled and stored by government agencies (“wire tap”) just as easily as they can be reinterpreted through a loudspeaker for the human ear. Digital technology opens vast possibilities for the use and abuse of human speech or any other signal.
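The chop-address-reassemble cycle described in this side note can be sketched as a toy program. The packet format here, a simple (sequence number, fragment) pair, is invented for the illustration; real telephone networks use far more elaborate protocols:

```python
# A toy sketch of packet switching: split a message into small addressed
# packets, let them arrive in any order, and reassemble them by address.
import random

def chop(message: str, size: int = 4) -> list[tuple[int, str]]:
    """Split a message into (sequence-number, fragment) packets."""
    return [(i, message[i:i + size]) for i in range(0, len(message), size)]

def reassemble(packets: list[tuple[int, str]]) -> str:
    """Sort packets by sequence number and rejoin the fragments."""
    return "".join(fragment for _, fragment in sorted(packets))

speech = "Hello, operator? Get me the main office, please."
packets = chop(speech)
random.shuffle(packets)   # packets may travel different routes, in any order
assert reassemble(packets) == speech
```

Because each packet carries its own address, no dedicated end-to-end circuit ever needs to be held open, which is why the long-distance call stopped being expensive.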

5. My alma mater, Penn State, ran most of the university’s accounting and academic recordkeeping with an IBM System/360, located in the heavily air-conditioned basement of the Computer Center. System capacity at the time was laughable by today’s standards: core memory in the thousands of bytes and tape or disk storage in terms of millions of bytes per mount. But then, my entire personal file and transcript—name, gender, home address, campus address, telephone numbers, courses taken, grades received, and payment history—would probably fit into the space of 2,000 bytes. So storing and retrieving the files on 26,000 active students was entirely feasible, although high-resolution photographs were not included.

6. Another question that someone from the 1960s would ask in wonder: “Why does my coffee pot need a clock?”

Sunday, October 14, 2012

Never the Devil You Know

We live in the age of The Scream. Edvard Munch’s iconic melting person, paralyzed and destroyed by a nameless terror—that’s us, folks. This psychological condition, which seems to pervade the subconscious of all of western civilization, is more than simple alienation, malaise, or a free-floating anxiety brought on by the increasing complexity of technology and stresses of modern life. It’s the innate sense that the sky will soon rip open and the fire rain down upon us.

I’ve lived with this sense my entire life. I was born in the lull between the fission bomb based on uranium and the much more powerful fusion bomb based on hydrogen. My birth year coincides with the death of Gandhi and the birth of the State of Israel—both momentous psychological events. In the first grade, at six years of age, we practiced “duck and cover” as a defense against a nuclear explosion. We knew people who dug fall-out shelters in the backyard—not bomb shelters, because no one was expected to survive the initial blast, but places where you could live for two weeks or so while the radioactive dust blew over.1 But, of course, Nevil Shute’s novel On the Beach, and the Gregory Peck movie made from it, proved that no one could survive the cloud of radiation that would ultimately sweep across the planet.

Ask anyone of my generation how the world was supposed to end, and the immediate answer would be “Nuclear fire!” We all knew the U.S. and the U.S.S.R. were just sparring in Europe and were itching to obliterate the world many times over. We all went on some kind of twenty-four-hour warning over the Cuban Missile Crisis in 1962—after hearing Kennedy’s nationwide address on October 22, I remember at age fourteen thinking that this was it, we were all going to die before morning—and only narrowly drew back from the brink. The danger of nuclear holocaust remained right through the fall of the Berlin Wall in 1989.2

But even before the collapse of global Communism, a new end-of-world scenario had raised its head. Through my work at a public utility, I attended Energy Daily’s conference on national energy issues in 1987 and heard one of the early presentations by James Hansen of the NASA Goddard Institute for Space Studies on global warming. The amount of carbon dioxide being pumped into the atmosphere by the human energy infrastructure was going to tip the planet into a runaway greenhouse effect with catastrophic results. By the middle of the next decade, Hansen said—meaning the mid-1990s—the change in climate would be obvious to the man in the street. He also said that the immediate effect might not be rising temperatures all over but falling temperatures in some places, as the Earth’s climate went through an initial period of disruption.3

But even before the catastrophe of global warming played out in the 21st century, we had the impending doom of Y2K. Because generations of business programmers writing in COBOL allowed only two digits for recording the year in transaction dates (after all, it was going to be 1900 forever, wasn’t it?), all sorts of financial and commercial software installations, and the businesses that depended on them, were going to collapse at the stroke of midnight in the year 2000. Computers would be unable to distinguish between new transactions and those booked a century earlier. Calculations of interest earned and owed, rents due, and future deliveries to be met would go haywire. Civilization would fall.4
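The failure mode is easy to sketch. What follows is a hypothetical Python illustration, not actual COBOL; the function names and the “window” cutoff of 50 are invented for the example, though windowing itself was one of the common real-world patches.

```python
# Sketch of the Y2K two-digit-year problem. Dates stored as "YY"
# lose the century, so interval arithmetic breaks at the rollover.

def years_between(start_yy: int, end_yy: int) -> int:
    """Naive elapsed-years calculation on two-digit years,
    as much legacy business code effectively did."""
    return end_yy - start_yy

# A loan booked in 1998 ("98") and evaluated in 2001 ("01"):
print(years_between(98, 1))   # -97, not 3: interest goes haywire

# The common "windowing" patch: pick a pivot year and expand
# two-digit years on either side of it into full four-digit years.
def expand(yy: int) -> int:
    """Treat YY < 50 as 20YY, otherwise as 19YY (pivot of 50 is
    an arbitrary choice for this sketch)."""
    return 2000 + yy if yy < 50 else 1900 + yy

print(expand(1) - expand(98))  # 3, as intended
```

Windowing only postponed the problem, of course—any fixed pivot fails again once dates straddle it—which is why the more thorough fixes expanded the stored fields to four digits.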

A couple of years later we had a spate of movies, two appearing almost back to back, about a comet hitting the Earth and wiping out humanity just as the Chicxulub asteroid took out the dinosaurs. In both movies, we benefitted fictionally from having a fully developed space presence and the ability to go and deal with the comet—something we don’t, in reality, happen to possess. Still, our ability to avert catastrophe before the closing credits was a near thing.5

Today we have the prospect of financial collapse, “the fiscal cliff.” And, as if the implosion of the Housing Bubble in 2007 and the five years of Recession-That-Might-Be-Depression since then were not bad enough, we are generally offered a future in which the democracies of western civilization, chronically unable to live within their means, haplessly tax and borrow and spend themselves into bankruptcy, penury, and dissolution. No one knows exactly what the doom will be. It’s not as easy to predict the effects of a financial collapse as it is to imagine nuclear fire and rising sea levels. But it will be grim: 1932 squared, selling apples and picking rags, perhaps aggravated by a hyperinflation like the Weimar Republic’s. Money will deflate, or perhaps it will inflate. Banks will fail, or maybe they’ll end up owning everything.

The apocalypse foretold in movies, novels, and comic books—er, graphic novels—has been part of the culture for so long that a writer today barely needs to provide any backstory at all. What do you care? Things went haywire. Money was no longer any good. Electricity—even if you could get it—became all hand cranks and windmills. Everybody broke out the guns and went tribal. But the upside is that nobody has to pay a mortgage or show up at the office on Monday—and you get to kill anyone who looks at you cross-eyed. You might miss the clean food, digital movies, and hot showers we enjoy today, but you’ll absolutely love the manly thrill of freedom and adventure.

Yeah, the world is going to end and it will be terrible—except for the bits that will be exciting and wonderful.

It’s said that western civilization went through something like this before. As the year 1000 AD approached, people anticipated the Second Coming of Christ, Judgment Day, the rise of the Antichrist, and the collapse of public order with riot and fire. Whether or not the rioting and the penitence actually occurred—because the first written accounts of this millennial fever didn’t actually surface until about 1500 AD6—the notion of millennial panic is one of our civilization’s memes. And in truth, going back beyond Christianity and its dating system, the pattern of human history has been that civilizations really do fall. Various empires—Sumerians, Akkadians, Babylonians, Persians—chased each other up and down the Fertile Crescent. Egypt’s dynasties collapsed several times—as did China’s. Rome grew, flowered, dominated the known world, and collapsed. Why shouldn’t the collective democracies and commercial superpowers of Europe and the New World follow Rome down the rat hole of history?

Just as stories of a worldwide flood—found in almost all cultures and apparently representing some kind of species memory of rising sea levels after the last Ice Age about 10,000 years ago—persist across civilizations, so the human consciousness expects that, whenever things get too good for too long, a collapse is not far away. The 19th century began with the wars of Napoleon but quickly became, for the western world, a time of peace, prosperity, and technological advancement. That glide path toward a better future was interrupted in the early 20th century by two world wars with a global depression in between them, but the inexorable rise of technology and improvement in living standards proceeded without missing a beat. Now, like people who’ve been partying for twenty hours straight and have yet to see the lights go dim and the booze run out, we’re wondering what comes next.

Nuclear fire, global warming, financial collapse … each bogeyman rises up, waves its all-too-real arms in our face, and then subsides. I’ve seen that pattern repeated often enough in my own life that I just don’t believe it anymore. The world might end one day. It will surely end in about five billion years when the sun proceeds along its inevitable Hertzsprung-Russell life cycle and balloons into a red giant, and humans will go with it unless we find a way to get ourselves off this rock. Until then, I’m betting that the premature end of the world and life as we know it won’t be from any pattern we can predict and subsequently spend dozens of years and millions or billions of dollars agonizing about.

It’s never the devil you know who shakes your hand. Perhaps the world will end in a burst of gamma rays from a supernova exploding too close by. Perhaps we’ll freeze in another Ice Age brought about by another fading of the sunspot cycles.7 In any case, and until then, the best course both for individuals and for society as a whole is to pay our mortgages, show up at the office, and live each day with patience and fortitude.

1. I remember seeing shelter designs that had no elaborate seals to keep out the dust but did have a short cinder block entrance tunnel that made a single right-angle turn—presumably so the radiation, which followed straight lines, could not penetrate the living quarters. Pure magical thinking.

2. The psychological pressure remained, despite the fact that no one had ever exploded two nuclear bombs simultaneously, let alone hundreds of them over one hemisphere. The theory of fratricide—that either the electromagnetic pulse of one bomb would destroy the firing circuits of any others above its horizon, or that the initial wave of radiation would change the fissionable characteristics of the nuclear material in their triggers—suggested that a full-scale nuclear war was physically impossible. As a theory, it was never tested. But the risk was averted by changes in geopolitics and socioeconomics, rather than strategic considerations. While isolated nuclear exchanges from one country to another might still take place, a rain of fire across all the world remains only a distant possibility.

3. And yes, we’ve had some hot summers as well as some cold snaps. The Arctic sea ice is melting, while the Antarctic land-based glaciers appear to be growing. (I’m reminded that we’ve only been watching the world’s weather—and the icepack in particular—in real time with geosynchronous and polar-orbiting satellites for about forty years. Hardly time to establish long-term trends.) It’s a wonderful prediction that is to be proven by winds that blow both hot and cold.

4. And yes, a lot of programmers and accountants worked long into the night in the two or three years before 2000 to apply patches and workarounds. Did spending all that time and money save the world? We’ll never know.

5. I contributed my own bit to this futuristic folklore with my first novel, The Doomsday Effect, first published in 1986.

6. For a general overview of the controversy, see this article from the New England Skeptical Society about the more recent source, and debunking, of millennial panic.

7. Solar Cycle 24—which began in 2008, after a prolonged sunspot minimum, and will peak in 2013—is expected to be one of the weakest cycles in a hundred years. Since the sun’s energy output is greater during periods of high sunspot activity, this doesn’t bode well for a warm climate. It was the utter lack of sunspots during the Maunder Minimum, roughly 1645 to 1715, that brought on the Little Ice Age in Europe and the freezing of the Thames. One might propose that the increasing temperatures since then were—at least in part—a product of the renewed and increasing solar activity that coincided with the Industrial Revolution.