Sunday, July 31, 2011

Labels as a Way of Life

I recently got into an online tiff in the comments section of a friend’s Facebook posting. Another commenter had mentioned the demotion of Pluto as a planet,1 and I responded that I think we can get hung up and waste intellectual energy on labels. A flurry of responses greeted this observation. One person pointed out that, without labels—terms for things that we can all agree upon—communication is impossible. Another quoted Plato on the necessity of having experts assign the names for things. At that point, I yielded the floor in disgrace. Now, I’d like to continue the thought in the form of an exploration.2

Clearly, people care whether Pluto is a planet or not. The new definition of “dwarf planet” seems to hinge on the fact that Pluto has not cleared out small objects and debris in the neighborhood around its orbit, as more massive bodies tend to do.3 To the extent that this new definition helps us understand something about the mechanics of planetary systems—and perhaps will help us interpret what we’re going to see happening around other stars—I applaud it. To the extent that people are unhappy because they grew up in a solar system of nine planets, learned mnemonics to remember all their names in order, and treasure each planetary discovery as a human achievement, then the label is a time waster.

Labels can indeed be useful. In the biblical creation story—or at least one of the creation stories4—God parades the animals he has made and lets Adam name them: cow, horse, sheep, goat. It’s very handy to be able to walk into a stable, point to a stall, and not have to tell the stable boy: “Bring me that animal with four legs, but with rounded hooves and no horns, for I want to go riding today.”

Things become less clear when you reach back up the evolutionary tree and delve into cladistics, or the grouping of animals based on their common heritage. For a long time, we all knew what a mammal was: warm-blooded, covered with hair, bearing live young, not poisonous, not scaly, not feathered. However, a traveling science exhibit called “Extreme Mammals”5 shows all the possible variations in mammals; some of them are hard to fit into the basic definition. Of course, there are also marsupials, who fit all the criteria of mammals but bear young that are hardly viable unless tended in a pouch, and the platypus, which looks a lot like a mammal but lays eggs and has poison spurs. And current thinking supposes dinosaurs to be as warm-blooded as birds. On the other hand, whales and dolphins are not noticeably hairy.

You can still use the term “mammal” in general, non-specific conversation. But at some point, trying to classify everything you see as either mammalian or not becomes a time waster. You either end up with a hopelessly complex label, full of subclauses and exceptions, or with a handful of animals that just don’t fit all the criteria. The newer classification systems build trees of descent from common ancestors, originally by comparing physical features, now by comparing genetic material. The old kingdom-phylum-class-order-family-genus-species organization is rapidly becoming obsolete. Today we know animals by how many genes they have in common and what SNPs they share.6

Moving beyond the rigors of scientific definition, labels become even more problematic. The Zen warn us to be wary of labels, because they lead us to think we know all about something and so keep us from really looking at it.7

In my youth, I tended to drink too much. It didn’t really bother me—of course, it never does—but I noticed that I was beginning to read and remember the definitions in self-help guides about what constitutes an alcoholic and how he or she acts. I would read about some contributing behavior, such as taking a drink in the morning to ward off a hangover, and think, “Well, I don’t do that, so I guess I’m not an alcoholic.” Avoiding putting a label on what I was doing, which would force me to take some kind of action, seemed important at the time. So long as I could wiggle around the terms of the definition, I could keep on drinking. Only when I put aside worrying about whether or not I fit the label “alcoholic,” could I then begin to examine my behavior directly and take action.

Labels have always dominated our political discourse. One adopts a party affiliation, the label “Republican” or “Democrat,” as shorthand for a selection of views and definitions. One feels more at home in one camp or the other. But, like taking a suit off the rack and wearing it, there are usually places where the label doesn’t fit. It tugs against or sags across what we really believe.

Most Republicans, for example, believe more or less in personal responsibility, libertarian rights,8 self-reliance, fiscal conservatism, shareholder capitalism, and free market economics. Although a lot of Republicans are strongly believing Christians, and some are dogmatic in their views on gays and abortion, among other things, I’m not one of them. And I don’t know any Republicans who would—according to the label used by some Democrats—abolish all social programs and safety nets in favor of some kind of extreme social Darwinism that would increase the position and power of the rich.

On the other hand, I can intuit—although I don’t know for certain, because I don’t vote that way—that most Democrats believe in fairness and personal equality, the power of community and social cohesion, group rights based on affinities like union membership or racial identity, the need for states and institutions to support those who can’t take care of themselves, and the wisdom of disinterested government employees and public servants in regulating the activities of people driven solely by self-interest. Although many Democrats strongly believe in the overall power of the state and in organizing principles like Marxism and socialism, and some are dogmatic in their animosity toward wealth and property, among other things, I know that not all of them so believe. And I don’t know any Democrats who would follow the example of the Cultural Revolution and the Khmer Rouge in moving academics and city dwellers out to the countryside so they can learn peasant values at the point of a gun.

There was a time, when I was young, that individuals could appreciate and find common cause with those across the aisle. Certainly an Adlai Stevenson, Everett Dirksen, and Daniel Patrick Moynihan had friends and admirers in the opposing camp. That doesn’t seem possible these days. The extremes in each party are pulling the center of mass away from the center on each side. Some people cannot say the word “Republican” without thinking “Nazi” and “evangelical crazy” and wanting to spit. Some people—I try not to be one of them—cannot say “Democrat” without thinking “socialist” and “Stalin apologist” and wanting to run screaming.

Of course, the plurality of voters in this country are wholeheartedly neither. They stand in the middle, and for them the labels don’t fit well at all. They’ll vote for a Republican or Democrat depending on what the candidate says and whether it makes sense. And, of course, when the party in power begins to act as if that vote was a mandate to pursue all of its pet policies, reasonable or not, we get a sudden turnover. Johnson and the Great Society beget a Nixon. Nixon and self-interest beget a Carter. Carter and international apologetics beget a Reagan. And so on, with repercussions, right down to today.9

It’s only the inherent power of our two-party system, and the infrastructure that supports it, which keep us from devolving into the tiny, fragmented political parties of Europe. “We’re not socialists but social democrats!” And then, to get anything done, they have to form coalitions that vote with bartered uniformity in their parliaments.

Political labels also have a tremendous power to cloud our vision of what a proposal actually is and how it will help or hurt the country. The current debate over debt limits, spending trends, tax rates, default, and the federal deficit—which I hope will be resolved by the time you read this—is an obvious example of such obfuscation.

Labels help us communicate, indeed, but they also can keep us from seeing what, exactly, is going on. Calling Pluto a “planet” blinded us all to the subtlety of its relationship with its environment. Calling someone by a political label can keep you from seeing the subtlety and humanity of his or her political thinking. Use labels for communication but, as the Zen advise, be wary of them.


1. The International Astronomical Union so ruled in August 2006. See, for example, the article at www.space.com.

2. Anymore, the only way I can think about a subject is to write about it. The act of writing makes me put my thoughts in order, examine and challenge my arguments, take them to a logical conclusion, and check my facts and references. In fact, writing a book is simply the exploration of a large, inviting, but vaguely known territory. Writers are explorers of strange landscapes.

3. I guess this is why biggish, spherical rocks like Ceres are not planets, even though they orbit the Sun as do Earth and Mars.

4. As we learned in Robin Lane Fox’s The Unauthorized Version, the creation story is told at least twice and with different details.

5. I saw it last year at the California Academy of Sciences, but I believe it’s going around the country.

6. SNPs are “single nucleotide polymorphisms,” representing one altered base in the genetic code. Sometimes that base change leads to an altered protein; often it does not. Sometimes the new protein works in a slightly different way from the original; often it does not. So SNPs are more sensitive as relationship-defining criteria than are physical characteristics. Because the rate of change in genetic material is fairly constant, you can track animals, diseases, and human populations through time with SNPs.
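For readers who like to see the mechanics, here is a toy sketch in Python of the basic idea: line up two genotypes over the same set of sites and count where they differ. The sequences are invented for illustration; real analyses use far more sites and proper statistics.

```python
# A toy sketch of using SNPs as a relatedness measure: line up two genotypes
# over the same sites and count the positions where the base differs. The
# sequences below are invented for illustration.

def snp_differences(genotype_a: str, genotype_b: str) -> int:
    """Count sites where the two genotypes carry a different base."""
    assert len(genotype_a) == len(genotype_b), "genotypes must cover the same sites"
    return sum(1 for a, b in zip(genotype_a, genotype_b) if a != b)

if __name__ == "__main__":
    individual_1 = "ACGTACGTGA"
    individual_2 = "ACGTACGTGC"   # one site differs: a close relative
    individual_3 = "ACGAACTTGC"   # several sites differ: a more distant lineage
    print("1 vs 2:", snp_differences(individual_1, individual_2))
    print("1 vs 3:", snp_differences(individual_1, individual_3))
```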

7. It was in this sense that I objected to the agonies about labeling Pluto as a planet. If you have a handy definition like that, it may keep you from seeing what’s really going on. In truth, at the time I was thinking, “Planet, shmanet, Pluto’s an interesting object regardless of what you call it.” Now, having looked into the actual IAU redefinition, I think the label “dwarf planet” actually helps us see something interesting about Pluto and other small bodies.

8. What do I mean by “libertarian rights”? It seems to me that “rights” today has developed two meanings. One responds to the progressive’s sense of “things that must be provided by law,” such as public education, child protective and abortion services, and—soon—health care. The other responds to the libertarian’s sense of “things that must not be curtailed by law,” such as freedom of speech and assembly, security of home and person, and—now under challenge—gun ownership. The line between them isn’t always clear. The right of two gay people to form a binding spiritual union with the same official standing as the union of two straight people seems, to me, to fall on the libertarian side. And the ability to choose abortion seems like a libertarian matter—although not the expectation that it will be provided at public expense.

9. I’d like to believe that we could distill the ideas of the middle-of-the-road, independent voter and come up with the platform for a third party. But what would happen, I think, is we’d get a hundred different possible mixes and matches: “More state support, please, but strengthen the family … Stronger borders, please, but spend more on immigrant education …” and so on.

Sunday, July 24, 2011

The Age of Intermediates

Today we like to think that we live at the pinnacle of human development. Nowhere on this planet—which, as near as we can tell, means nowhere in this star system and probably nowhere within thirty lightyears of here—has technological progress, or progress in a great many other areas, come so far.

Consider technology first. In 250 years, we’ve gone from the first inefficient steam engines to jet engines that whisk hundreds of people through the stratosphere and across continents. In 100 years, we’ve gone from the first crude radio messages to a globe bathed in radio and video transmissions, communication with satellites and probes all over the solar system, and seamless wireless links between people and with a worldwide information infrastructure. In 50 years, we’ve gone from simply identifying the molecular structure of our genes to reading and manipulating the code.

The extent of our achievements goes well beyond physics, electronics, and biology. Through the study of neuroscience, psychology, and game theory, today’s best diplomats and negotiators could run circles around a Richelieu or Medici. With an accumulated appreciation of tactics and the nature of battlefields and engagement, our best generals would handily defeat a Napoleon or Wellington. In the arts, our best writers can create characters and stories whose power to capture the imagination rivals that of a Shakespeare or Chaucer. Our best artists can create images of such visual strength that a Michelangelo or Monet would feel envy.1

We seem to stand on the peak of achievement. It hasn’t always been this way. The Romans, for all their military and engineering achievements, looked back with humility at the intellectual achievements of the Greeks from preceding centuries. And for a thousand years the Europeans who inherited the Greco-Roman tradition knew themselves to be living in a dark age. It wasn’t until the Renaissance that art and philosophy began to copy the classical forms and then exceed them. It wasn’t until the 17th and 18th centuries that thinkers began to replace classical superstition with scientific observation.2

Of course, from the viewpoint of objective truth, we do stand on a summit. The things the average person in the western world takes for granted in everyday life—from the consensual hallucination of the movie theater or video game to the instant contact of the cell phone and instant meals from the programmable microwave—have never existed before. We live in an age that would seem like magic to the most advanced thinkers of 500 years ago.3

But to the science fiction writer—and by extension, to anyone who reads and resonates with the visions of such writers—the glass is still only half full. We live in an age intermediate between what was and what will be or may be possible.

For all the trips we’ve made to low Earth orbit, our isolated visits to the Moon, the probes we’ve sent to sample and observe on Mars and the outer planets, we’re still only at the beginning of humankind’s use of space. The average person only knows about space through the achievements of a select few, the same way the average person of the 15th and 16th centuries only knew of Asia and the New World through the visits of Columbus, Magellan, and their crews. The average person in the West may own a satellite dish or a GPS unit, but his use of space is entirely secondhand.4 We still only dream of colonies on the Moon and Mars and voyages to the stars.

For all that we’ve learned and achieved with DNA, RNA, proteins, and our appreciation of cellular mechanics, we’re still only at the beginning of understanding and manipulating life. We still advance our stocks of cattle and horses, dogs and corn through the inspired guesses of breeders who bring together sire and dam, stamen and pistil. We still use whole animals in our animal husbandry. We can only dream of designing hybridized creatures like the chairdog5 or manipulating cells to create fossil fuels and exotic materials like synthetic spider silk and latex on demand. And molecular biology is still only the doorstep to a thriving nanotechnology.

For all our medical advancement, we still take organs from living or terminally damaged donors to implant in those who are dying for want of a healthy organ. We still supplement lost and immobile limbs with mechanical devices that only simulate original function. We are only at the threshold of growing new organs on synthetic armatures using a patient’s own stem cells. We are only starting to design mechanical arms and legs that interpret the body’s own nerve inputs. We can’t yet stimulate the brain’s optical centers to give sight to the blind, bud limbs to regrow missing arms and legs, or repair damaged nerves to recover function.

For all that we have achieved with civilization’s infrastructure to promote physical and emotional comfort, we still don’t have a comprehensive theory of economics and personal worth. We still let whole populations starve, not because there isn’t food enough in the world to feed them, but because they are on the wrong side of a war or economic system. We let individuals in our cities wander in confusion and fear and eat out of garbage cans because we don’t have an adequate understanding of human responsibilities, rights, and capabilities.6 We cannot yet differentiate freedom from license or supervision from tyranny. We are a long way from achieving the sort of balanced, stable, educated communities of equals that colonize the planets of our science fiction stories.

The water empires of the Egyptians and Mesopotamians stood halfway between the hunter-gatherer societies that came before them and the organized political and military regimes of the Macedonians and the Romans. The last centuries of the second millennium—the period of Europe’s Renaissance and Enlightenment—stood halfway between the philosophy and achievements of the Classical Age and our own modern era. And so today we live in an intermediate age.

We stand halfway between the first gropings of scientific observation and the full use of physics, electronics, and biology in everyday life.7 We are halfway between the first organized thinking about politics, economics, and psychology and a fully functioning society that appropriately accommodates people of all conditions and capabilities. We are halfway between the flight of the first fragile airplanes and a life among the stars.

The glass is both half full and half empty. We have a long and exciting road still to travel.


1. Of course, there’s also a lot of truly dreadful art and bad books afloat in the culture today, along with a lot of boneheaded generals and terminally clumsy diplomats. But the past had its tedious writers, uninspired painters, and second-rate professionals, too. To quote Theodore Sturgeon: “Ninety percent of science fiction is crap—but then ninety percent of everything is crap.”

2. Superstition may be defined as the dominance of imagination, hope, and fear in our interpretation of the world around us. “The world is whatever you think it is.” Science is the dominance of observation and logical reduction. “What do you actually see? What can you prove? That is what the world is.” Aristotle and Plato observed, but they did not always reduce their observations to what they could prove logically. For example, Aristotle explained gravity as the tendency of things to move toward their “natural place.” Hardly a robust definition. It took a Newton to recognize gravity as a force with predictable and calculable effects.

3. Or to quote Arthur C. Clarke once again: “Any sufficiently advanced technology is indistinguishable from magic.”

4. And—damn it!—I still don’t have the flying car or rocket backpack that the Sunday supplements promised me.

5. Found in the later Dune books of Frank Herbert.

6. Some will say that we have already found the perfect system for taking care of people—and then they invoke Socialism, Communism, or some other manifestation of the Leviathan state. I believe the various forms of collectivism practiced in various societies in the 20th century (Nazi Germany, Soviet Russia, Khmer Rouge Cambodia, etc.) adequately proved that regimentation, central planning, and state-mandated distribution of goods only destroy human ambition and creativity. But while free-market economics and shareholder capitalism have created great communal wealth and well-being in the West, anyone can still point to disturbing cases of exploitation and neglect. No, our economic and political thinking has a long way to go.

7. Is there a natural limit to our understanding through science? For all our theories of physics, we still do not understand the commonest forces like gravity, electricity, and magnetism. We can write equations that include their observed effects, like Newton, but we still don’t understand what force means at either the relativistic or quantum level. We don’t understand the structure of space or the nature of time. We are beginning to suspect they harbor forces like dark matter and dark energy, but we still have trouble even quantifying them. Perhaps our knowledge of the universe will always be limited by the fact that we are beings finite in space and linear in time, but that does not mean we won’t learn a lot more in the coming decades and centuries.

Sunday, July 17, 2011

Manifest Destiny in Space

With the last flight of the Space Shuttle program, everyone is giving the fleet a cheerful and tearful sendoff and wondering what comes next. For the moment, we’ll rely on the Russians to service the International Space Station and watch as the Chinese expand their presence in and above Earth orbit. No doubt the U.S. will continue to send unmanned missions to the inner and outer solar system. But what is the next big step for the U.S. space program?

With the current debate over our $14 trillion national debt and the spending cuts and tax increases needed to address it, clearly—at least to me—space exploration will take a back seat to mandated spending on entitlements and necessary spending on our military and infrastructure for the foreseeable future. The taxpayer-funded ride may be over.

I’m not exactly sad about that. Reflecting on NASA and its achievements, I harbor a dark thought: What if the U.S. computer industry had been dominated and directed by a similar large government organization over this same period? I think the IBM 360 mainframe, Fortran, and COBOL would have gathered the same loyal adherents and dedicated funding as the Atlas and Saturn rocket systems. They would have persisted as our main computing tools until a major change in policy brought out the VAX or PDP-11 minicomputer, which would be reaching its end of service in 2011. The average person would only interface with a computer in large organizations, such as banks, major companies, or government departments, and then only through a text-based terminal. Graphic user interfaces and personal access to the internet would be dreams of the future. Our cars would still be mechanically carbureted, and our cell phones would be the size of lunch boxes.

It was entrepreneurs like Hewlett and Packard, Wozniak and Jobs, and a thousand others that brought us the chip-based microcomputer. But they didn’t do it in response to a government mandate (“a computer in your palm by the end of the decade”) and didn’t ask for billions in funding to begin development. They took processors that were already being designed for machine actuators and put them in a box with a screen and keyboard. It was nerd candy: an unhandy, complicated appliance of limited usefulness that required the buyer to think in new ways and usually learn a new language.1 But thousands of people like me bought those first machines, the Apple II, Commodore 64, and the TRS-80. It wasn’t because we needed them so much as we were fascinated by the idea of owning a computer, a machine that did whatever you told it. And we sensed that it would someday be important and make a real change in our lives.2

The space program has proven that you can get to orbit and beyond on a large, expensive hydrogen-oxygen rocket, or an airplane-shaped hybrid riding on an external fuel tank that itself is boosted by solid-fuel rockets. Unlike the Saturn, which was fire-once-and-drop-in-the-ocean, the Shuttle was supposed to be reusable. But it wasn’t really reusable like a car or an airplane. To land the Shuttle and take it back into space took months of rebuilding, refurbishing the thermal protection tiles, re-assembling the orbiter with its fuel tank and boosters, and then prepping and training for a particular mission.3

If humankind is going to continue in space, we need a better reason than government mandate and a better approach than taxpayer funding. We need entrepreneurs like Burt Rutan and Richard Branson to find easy and sustainable ways to get us to orbit and then offer them at a reasonable price that will support our active presence and use of near space. But they won’t do it unless there’s some kind of demand. Putting communications, weather, and spy satellites in orbit is a meager market, easily served by NASA and the European Space Agency.

I know some people would pay a high ticket price—perhaps a year’s salary—for just one ride into space; the same way I once paid a lot more for an Apple II and its components than I ever would have paid for the most expensive typewriter. I suspect that bridging the gap between this early “enthusiast’s vote” and regular use of space travel will require a period of trial demonstration and public feedback, the same as it did for microcomputers. Space travel will have to demonstrate compelling features—like going from Los Angeles to London in two hours on a parabolic sub-orbital flight, or manufacturing pure crystals and high-strength alloys in zero-gee orbital factories—before the average person will pay for it out of pocket. In the same way, chip-based computers had to offer interactive games, information access, instant communications, and personal productivity before the average person would bother to own a computer.

Of course, the computer industry was helped along by Moore’s law: the truism that the capacity and capability of computer chips doubles about every two years. It has not yet been shown that any similar law will govern the chemistry of jet and rocket engines. This is why I can hold a powerful computer in my palm today, but the family car still doesn’t fly, and I don’t get to soar in the sky with a rocket-pack. But development of low-cost, personal alternatives to the billion-dollar Space Shuttle will take the effort of an entire industry of researchers and entrepreneurs and go through many cycles of trial and error. We’re only starting this process.
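To see how relentless that doubling is, here is a back-of-the-envelope sketch in Python. The starting figure is roughly the scale of an early-1970s microprocessor, and the projection is illustrative rather than a claim about any particular chip.

```python
# A back-of-the-envelope sketch of Moore's law as a compounding rule.
# The starting transistor count and the time spans are illustrative,
# not measurements of any real chip family.

def moores_law(start_count: float, years: float, doubling_period: float = 2.0) -> float:
    """Project a count forward, doubling every `doubling_period` years."""
    return start_count * 2 ** (years / doubling_period)

if __name__ == "__main__":
    start = 2_300  # roughly the scale of an early-1970s microprocessor
    for years in (10, 20, 30, 40):
        print(f"After {years} years: about {moores_law(start, years):,.0f} transistors")
```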

Still … I believe it’s our destiny to go into space, and not solely through huge government programs. There are obvious reasons to get off the planet: to control the local environment and protect ourselves from asteroid strikes and other hazards; to gain new real estate4 and resources; to have some forward base for meeting extraterrestrials, in case they turn out unfriendly. There is also the imperative of our genes.

Humanity is, by and large, a migratory species—not that we follow a north-south pattern like birds; we’re simply aggressive and restless. The majority of us walked out of Africa 50,000 years ago. Waves of immigrants washed into North and South America from Eurasia 12,000 years ago. Waves of invaders washed across Northern Europe from the Steppes during classical times, from the Dorians that flooded down into Greece to the Celts, Goths, and Vandals that plagued the ancient Romans. Waves of Europeans spread out across the Atlantic and around Africa to rediscover the lands and cultures of Asia and establish a new order in the Americas during the last millennium.5 Travel, exploration, the search for something new and better in the next valley are in our blood.6

Maybe not next year, maybe not in twenty years, but we will eventually stop paying government-funded visits to the edge of space and decide to make a permanent home there for humans. It’s what we do.


1. You didn’t exactly have to learn BASIC or some structured programming language to use most of the early microcomputers, but it certainly helped. But even to buy the machine in the first place you had to wade through features and capabilities measured in unfamiliar units like bytes and bits, ROM and RAM. To hook up a printer you had to learn the difference between serial and Centronics interfaces and learn to set a baud rate and a handshaking protocol. The first computers weren’t at all appliances on the level of a toaster or refrigerator.

2. Being a writer, I was already thinking of using a computer to enter and manipulate text. At that time industry was just starting to use word processors—text terminals hooked up to mainframes or massive, expensive machines harboring minicomputers inside bench-like consoles—and I suspected the little Apple II might be used in the same way. I was right.

3. Cost and environmental impact are also concerns. The Shuttle’s liquid hydrogen-oxygen main engines burn cleanly, emitting only steam. But stockpiling and handling these gases in liquid state are a major undertaking. And the solid boosters are fueled with ammonium perchlorate and aluminum powder, which don’t burn as cleanly and are difficult and dangerous to manufacture into motor cores.

4. Most of that real estate will be in tunnels and under domes. None of the local planets offers the possibility of ever walking on grass under an open sky. To understand why, see my blog The Myth of Terraforming.

5. Yes, and some of us also stayed in Africa. For every Marco Polo who went off to find China, there was a settled population of Chinese waiting to be found. But which of them captures our imagination? We read with interest the story of the Joads who packed up and went to California, not their neighbors who stayed and starved under a bridge.

6. It can easily be argued that these successive waves of exploration and invasion usually brought bad things for the indigenous peoples, whether early hominids outside Africa or native Americans in the 19th century. Some even compare humanity to a virus on this planet and fear that we should ever get off it and contaminate space. But we are what we are, however imperfect. For those humans who would wish to see us all removed from the ecological equation, I can only say it’s not a survival trait to vote with those who want to see you dead. I make it a practice never to vote that way myself.

Sunday, July 10, 2011

God’s Echo Chamber

This past week we’ve seen the jury acquit Casey Anthony, the Florida woman accused of murdering her two-year-old daughter because, apparently, the little girl was inconvenient to her lifestyle. The case had become a media sensation in the way that monstrous mothers, sociopathic serial killers, and other unnatural creatures often do. They strike a nerve with the public and arouse moral feelings that the more commonly explicable murders and rampages don’t quite touch.

Such unnatural crimes and the media attention they draw are not new. The London papers delighted in telling of Jack the Ripper’s latest murder, and U.S. newspapers and later television followed the cases of the Lindbergh baby and Nicole Brown Simpson with the same energy that the online news media and cable channels have reported on Casey Anthony. What’s new in this instance, however, is the presence of the internet and social media—tweets, blogs, Facebook postings, and comments on blogs and postings—through which the public has an opportunity to respond and share its outrage.1

Imagine for a moment that you possessed the all-hearing ear of God in the age before our current electronic communion. A people unable to tweet and post their outrage, their hopes and fears, their desires and discontents, would turn a private voice to God, sometimes prompted by the sermons of their priests and pastors. Aside from the background patter of grant me grace … please cure Mother’s cancer … help me get that promotion … make her love me … get me into medical school, which is the carrier wave of particular pleadings, the ear of God must routinely hear community-wide surges of anger, fear, and protest: burn the witches … death to the Corsican tyrant … protect us from the Hun … justice for little Caylee …

The united voices that once only God could hear—except for occasional grumbles and murmurs traded across the back fence or in line at the supermarket—are now ringing across the public webpages of the internet and in the comment spaces of social media. Suddenly public opinion is a real, instantaneous, and measurable force.

Once the editorial offices of local newspapers could only weigh the bags of mail that arrived for and against a proposition or public position, and then publish a scant two or three letters that seemed most fervent or articulate. Now every letter, every opinion, every murmur and howl is available for examination somewhere on line. Once people sent chain letters by paper mail asking for blessings and dollars to be sent to the top-listed originators. Now they circulate heart-felt messages by email and ask the recipients to forward the text to all their friends.

This is a new thing, and it raises some questions about where our society is headed.

One question involves the reputed wisdom of crowds. Science fiction author John Brunner, in his semi-prophetic 1975 novel The Shockwave Rider, showed his main character, among other things, running a Delphi Poll. This is an artifact of Brunner’s vision of future electronic media, in which a person or organization might offer a general proposition on line and allow the public at large to comment and vote on it. The majority opinion would supposedly approximate the truth. Or, as Brunner put it, “while nobody knows what’s going on around here, everybody does …”2

For a while, the venerable Popular Science magazine was running similar back-page polls on popular questions like when we would go back to the Moon and when fusion power would become feasible. This was a sort of Delphi poll testing public feelings about the future.3

Nonfiction author John Naisbitt published Megatrends in 1982 (he and Patricia Aburdene later collaborated on Megatrends 2000). The book discussed trends in society based on the number of column inches of newspaper and magazine stories assigned to particular topics. It made for entertaining reading. And it did capture a snapshot of what was on the minds of newspaper and magazine editors—and presumably, through their collective instincts, of concern to the reading public.

The question is, how reliable is all this? If we could combine and interpret all the tweets and postings, reading them like the hanging chads of a Florida ballot, would the aggregate tell us anything useful? It would certainly tell us when we wanted to return to the Moon, or how strongly we felt about mothers who supposedly murder their daughters. But absent the mechanics of appropriations and taxation, engineering effort and public contracting, would it pinpoint the date of an actual Moon launch? Hardly. And if you were accused of a heinous crime, would you accept the average ruling of a million bloggers and tweeters over the deliberations of twelve identified, interviewed, and selected citizens? Personally, I’d prefer trial by combat.

A second question is whether all this outpouring of feeling is good or bad for society. Certainly it’s therapeutic. Everyone now gets his or her say. And the tweets, postings, and blogs are totally uncensored. No government or party organization controls them.4 The enthusiasms and prejudices of newspaper editors and journalists and television executives are not leading them. The parties and the popular media may feed and stir some of this sentiment with a stick, as with the coverage of the Anthony case. But individual human feeling outruns the wisdom of party leaders and media moguls. Indeed, the consensus political positions drawn up by the two national parties are becoming fragmented as websites and blog collections focus on every position across the spectrum.

For every opinion you might have, you can find a focus of a thousand or a million other people who agree with you. And for the rest … you can tune it out, just not go there. Once your town or city newspaper brought together a variety of opinions, and a few of the leading papers in New York and Washington purported to speak for the country. Once the three major networks sampled the news of the world. Now you can log onto one or another site to get the news you like, or click to a cable channel that agrees exactly with your views.

Under Gutenberg economics, when it was a serious investment to put out a daily newspaper or run a TV news department or print and distribute a paper book, just a few voices would actually be heard. They would represent public opinion only because a plurality, if not a majority, of the public supported them by purchasing the paper or book or tuning into the station and supporting its advertisers. Now, while it still costs something to run a cable TV channel, that cost is less than organizing a nationwide affiliation of broadcasters. And creating a website or an ebook costs only your time and attention. With the internet, the act of publication is virtually free.

I’m not the first to notice this, of course. As little as ten years ago you still heard about the “global village.”5 Electronic media and popular news and television were supposed to bring us together and create a single forum for conversation. Certainly the outcry over the Casey Anthony verdict has had this effect. But the potential for isolation and parochialism is also there. While many voices condemn the murder of inconvenient children, and some even look for vigilante action to correct the jury’s “mistake,”6 there may be quiet corners of the internet, reachable only through the right search words, that offer advice and instruction on the guilt-free elimination of unwanted toddlers. For every opinion and taste, there will be a magnetic pole to draw and align it.

In the last twenty years we’ve entered a new age. It offers exciting possibilities for human creativity and freedom. I certainly wouldn’t give up the technology that makes it possible. But we’re now also able to hear in detail all the cries and curses and squeals and pleadings and promises that once were reserved for God’s ears alone. I wonder if we’re ready for it.


1. I haven’t made any kind of survey, but the fragments I see floating through my little window on the internet suggest the tweeting public rejects the jury’s verdict on Casey Anthony.

2. The principle is taken from the old carnival and charity scheme of collecting money for letting people guess the number of jellybeans in a jar. Presumably, if you average all the guesses—from the village idiot who hazards “Two?” to the wide-eyed child who says “A billion!” and all the people in between who squint at the jar and guess 1,000 or 1,300 or 1,200 or 1,150—you get a number that’s correct to the last bean. But this only works, supposedly, if you get enough people to guess.
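For the curious, here is a little Python simulation of the principle, using made-up guesses: an unbiased crowd’s average tightens up as the crowd grows, which is the whole trick.

```python
# A toy simulation of the jellybean-averaging idea: guesses are random scatter
# around the true count (invented numbers, not a real poll), and the average
# of an unbiased crowd gets closer to the truth as the crowd gets bigger.

import random

random.seed(1)
TRUE_COUNT = 1_150

def crowd_average(n_guessers: int) -> float:
    # Each guesser is off by a random amount, but not systematically high or low.
    guesses = [random.gauss(TRUE_COUNT, 400) for _ in range(n_guessers)]
    return sum(guesses) / len(guesses)

for n in (5, 50, 500, 5_000):
    print(f"{n:>5} guessers -> average guess {crowd_average(n):,.0f}")
```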

3. A Popular Science poll in 2000 also showed that 45 percent of respondents believe Earth has been visited by intelligent aliens. Opinion is not provable fact.

4. Although the Chinese are trying, at least within their own borders.

5. The term appears to have originated with Marshall McLuhan and his views of the media. Of course, McLuhan still lived in the Gutenberg age.

6. Apparently, right after the verdict people were tweeting for the TV-fictional vigilante Dexter to visit the mother.

Sunday, July 3, 2011

Abracadabra

Magic has traditionally used the spoken word, or incantation,1 to precipitate its power. In the Harry Potter world of J. K. Rowling, the spells are just one or two words—usually with a Latin flavor—spoken with particular emphasis by persons of a wizarding nature, and the spell is usually supported by a wand or other prop. In the memorable Black Easter of James Blish, all magic is performed by conjuring and contracting with demons, but the conjuration must still be spoken, aided by the use of particular configurations, signs, and instruments, and adhering to strict rules and formats.

We are now living in a magical age indeed.2 We have achieved, on an everyday level and accessible to the common man, a degree of magic to which priests, conjurors, and charlatans going back to the ancient Egyptians could only pretend. With a few words, whether written or spoken, we can control vast machines, both physical and metaphysical, to produce vast riches, carry us bodily to far places, spread our voices and bring us tidings, and otherwise do our bidding.

The incantation may be as long and as complicated as writing a hundred thousand lines of C++ or JavaScript code. It may be as simple as placing a fingertip on an icon on a touch screen—which then invokes those thousand or million lines of complicated script.

My fellow Baen Books author Rick Cook caught the flavor of this in his Wizardry series. A systems-level programmer from our world is transported to a world where magic is ascendant. He quickly becomes a champion wizard by regimenting all the complex magical spells under programming rules and automating them with a Unix daemon. The fun ensues as he befuddles the archaic magic users who are stuck in their old ways.

If you don’t believe that computers and their code run our modern world and provide us with the power of truly advanced magic, consider the following.

Every robot you’ve ever seen or known about, from the tiny Roomba that sweeps your floor to the Cave Crawler that explores old mines and performs underground rescues, is controlled by a computer. They have sensors for detecting their environment and chips with hard-coded rules for moving and stopping. They and the machines to follow are the golems of our time.
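Those hard-coded rules amount to a sense-and-act loop. Here is a toy version in Python, my own illustration rather than the firmware of any real robot.

```python
# A toy sense-and-act loop of the kind described above: read sensors, apply
# hard-coded rules in priority order, command the motors. Purely illustrative,
# not the control code of any actual robot.

import random

def read_cliff() -> bool:
    """Stand-in for a cliff (stair-edge) sensor; here it fires at random."""
    return random.random() < 0.05

def read_bumper() -> bool:
    """Stand-in for a bump sensor."""
    return random.random() < 0.2

def control_step() -> str:
    if read_cliff():
        return "stop and back away"   # never drive off the stairs
    if read_bumper():
        return "back up and turn"     # obstacle ahead
    return "drive forward"            # default behavior

if __name__ == "__main__":
    random.seed(7)
    for tick in range(10):
        print(f"tick {tick}: {control_step()}")
```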

In modern factories, the flow of goods—from the incoming materials deposited at the loading dock, through all the steps of processing and manufacture, then into inventory and out to the loading dock again—is tracked by a computer using barcodes and scanners. No one is running around with a clipboard anymore, asking where the boxes of vacuum tubes have gone. The individual processing lines and their robots are certainly controlled by computer. Where refrigeration (called the “cold chain” in the industry) or other special environmental factors such as humidity are required, storage conditions are monitored and adjusted by computer. In the most automated factories, the physical movement of goods and their processing with machines and robots is orchestrated by a computer system that runs the motors on conveyor belts, measures the entry and exit of materials and goods in surge bins, and adjusts the processing speed of each section to keep the flow smooth. The software controlling the computer may be hard-engraved on a chip or called into memory from a disk, but somewhere in its history are lines of code written by a programming wizard.

If the factory is automated, so is the store where you buy the goods it makes. If you go into a Home Depot or Wal-Mart, you won’t find anyone with a clipboard running around taking inventory of goods on the shelf. All modern packaging bears a universal product code (UPC) that uniquely identifies the item. Pallets are scanned at the loading dock and entered into the computer’s inventory. Items you buy are scanned at checkout and removed from inventory. When the inventory gets low, computer software notes the fact, consults purchasing trends both in store and nationwide, and reorders goods at the prevailing sales level. Every transaction goes through accounting, along with your payment and the invoice from the supplier. Accounting is a computer, too, as are payroll and purchasing.
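The reorder logic is simple enough to sketch. What follows is a toy Python version with invented product codes and numbers, not any retailer’s actual system.

```python
# A toy version of the reorder logic described above: the checkout scanner
# deducts each sale, and when stock falls below a reorder point the system
# orders more at the prevailing sales rate. UPC and quantities are invented.

inventory = {"012345678905": 40}          # UPC -> units on hand (hypothetical item)
recent_daily_sales = {"012345678905": 12} # units sold per day, recent average
REORDER_DAYS = 2   # reorder when less than two days of stock remain
ORDER_DAYS = 7     # order a week's worth at the prevailing rate

def scan_sale(upc: str, quantity: int = 1) -> None:
    """Checkout scanner: remove sold units from inventory."""
    inventory[upc] -= quantity

def check_reorder(upc: str) -> int:
    """Return how many units to reorder (0 if stock is still adequate)."""
    daily = recent_daily_sales[upc]
    if inventory[upc] < daily * REORDER_DAYS:
        return daily * ORDER_DAYS
    return 0

if __name__ == "__main__":
    for _ in range(20):
        scan_sale("012345678905")
    print("On hand:", inventory["012345678905"])
    print("Reorder quantity:", check_reorder("012345678905"))
```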

If you step onto the local subway, light rail, or train system, you’re putting your trip under computer control. Every block of track, switch, train movement, and station announcement is either controlled or monitored by a computer. No one is running around throwing switches by hand. In the most automated systems, like BART in the Bay Area, a central computer directly senses the presence of the train on the tracks, controls its speed, stops it at the station, and announces the train’s imminent arrival with real-time estimates. And every action is a line of code—a written incantation—in the computer.
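At its core that logic is block occupancy: a train may only advance into a stretch of track that no other train occupies. Here is a toy sketch in Python, my illustration rather than any transit agency’s real software.

```python
# A toy sketch of block-based train control: the computer tracks which block of
# track each train occupies and only clears a train into the next block if it
# is empty. An illustration, not any transit agency's real software.

from typing import Dict, Optional

# block index -> train occupying it (None means the block is clear)
blocks: Dict[int, Optional[str]] = {0: "train_A", 1: None, 2: "train_B", 3: None}

def advance(train: str, current_block: int) -> str:
    next_block = current_block + 1
    if next_block not in blocks:
        return f"{train}: end of line, stop at station"
    if blocks[next_block] is None:
        blocks[next_block] = train        # claim the next block
        blocks[current_block] = None      # release the one behind
        return f"{train}: proceed into block {next_block}"
    return f"{train}: hold, block {next_block} is occupied"

if __name__ == "__main__":
    print(advance("train_A", 0))   # block 1 is clear, so proceed
    print(advance("train_A", 1))   # block 2 holds train_B, so hold
```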

Computers are essential to the design of every new aircraft. Humans may suggest new wing shapes and engine capacities, but computers finalize the details. And no one is producing engineering drawings and schematics with a pencil and a T-square. The final shape is dreamed in a computer. In the most advanced planes, the pilot is no longer heaving on controls that move the flight surfaces with pulleys and cables. They move by hydraulics—oil running through pistons, pumps, and pipes—and computers operate it all based on inputs from strain gauges attached to the pilot’s familiar control yoke and rudder pedals. It’s called “flying by wire.” Behind the wire is a programming incantation. The movement of airplanes through the sky is tracked by computers attached to radar stations. Human air traffic controllers may give voice commands to human pilots, but those controllers know about the situation in the air only by staring at computer screens.
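In caricature, fly-by-wire is a mapping from the pilot’s input to an actuator command, with limits the computer enforces. The sketch below is a toy Python version with invented numbers.

```python
# A toy fly-by-wire sketch: the yoke produces an electrical input and the
# flight computer turns it into an actuator command, clamped to a limit the
# software enforces. The scale and limit are invented for illustration.

MAX_ELEVATOR_DEG = 15.0   # hypothetical maximum deflection allowed in software

def elevator_command(yoke_input: float) -> float:
    """Map a yoke input in the range -1.0..1.0 to an elevator angle in degrees."""
    commanded = yoke_input * MAX_ELEVATOR_DEG
    # The computer, not a cable run, decides the final deflection.
    return max(-MAX_ELEVATOR_DEG, min(MAX_ELEVATOR_DEG, commanded))

if __name__ == "__main__":
    for pull in (0.2, 0.8, 1.5):   # 1.5 = the pilot hauling back past the stop
        print(f"yoke {pull:+.1f} -> elevator {elevator_command(pull):+.1f} deg")
```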

The car you drive used to respond to mechanical inputs. When you pushed on the gas pedal, a lever opened a throttle plate in the carburetor. When you pushed on the brake, it put pressure on hydraulics through a master cylinder. Now electronic systems—computers backed up by lines of code—operate the engine and monitor the braking system. If you drive into the mountains, the engine management system adjusts the fuel/air mix for reduced oxygen. If you brake aggressively on a slippery surface, the computer in the ABS system will override your foot pressure and ease the brakes to prevent the wheels from locking up and skidding.
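An anti-lock system is, at heart, a slip check that can override the driver’s foot. Here is a toy Python sketch with invented thresholds, an illustration of the idea rather than real ABS code.

```python
# A toy anti-lock braking sketch: compare wheel speed to vehicle speed, and if
# the wheel is slipping too much, ease the pressure no matter how hard the
# driver is pressing. All thresholds and numbers are invented.

SLIP_THRESHOLD = 0.2   # hypothetical: ease off above 20 percent slip

def brake_pressure(driver_pressure: float, vehicle_speed: float, wheel_speed: float) -> float:
    """Return the pressure actually applied at the wheel, on a 0.0 to 1.0 scale."""
    if vehicle_speed <= 0:
        return driver_pressure
    slip = (vehicle_speed - wheel_speed) / vehicle_speed
    if slip > SLIP_THRESHOLD:
        return driver_pressure * 0.5   # override the driver's foot so the wheel can spin back up
    return driver_pressure

if __name__ == "__main__":
    # Driver stands on the brake (1.0) while the wheel is nearly locked.
    print(brake_pressure(driver_pressure=1.0, vehicle_speed=20.0, wheel_speed=5.0))   # eased to 0.5
    print(brake_pressure(driver_pressure=1.0, vehicle_speed=20.0, wheel_speed=19.0))  # full 1.0
```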

There was also a time when, if your battery was dead, you could start the car by rolling it downhill in second gear and popping the clutch. The distributor would spark mechanically, the carburetor would feed a bit of fuel by gravity, and the engine would catch and run. Now, of course, if your battery is dead, the engine management computer is asleep and no amount of rolling friction will wake it up. Time was, also, when a thief could hot-wire a car by rubbing together two wires leading to the ignition switch. Now that switch and the key that operates it are only a convenience. The real protection and authorization to start the car is a radio-frequency identification (RFID) chip buried in the key head. It communicates with a reader that talks directly to the computer. No signal, no start—no matter how many wires you rub together.
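The “no signal, no start” rule can be sketched in a few lines. This toy Python version uses an invented transponder ID and a bare lookup; real immobilizers use cryptographic challenge and response, but the principle is the same.

```python
# A toy sketch of the "no signal, no start" rule: the engine computer enables
# the starter only if the key's transponder returns an ID on its list. The ID
# is invented, and real immobilizers use challenge-response, not a bare lookup.

from typing import Optional

AUTHORIZED_KEY_IDS = {"3F9A-77C2"}   # hypothetical transponder paired to this car

def try_start(transponder_id: Optional[str]) -> str:
    if transponder_id in AUTHORIZED_KEY_IDS:
        return "starter enabled"
    return "no start"                # hot-wiring never reaches this check

if __name__ == "__main__":
    print(try_start("3F9A-77C2"))    # the real key
    print(try_start(None))           # rubbed wires, no transponder in range
```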

If you think the bank holds your money somewhere in coins or bills or gold bars, think again. Most banks on the street have some ready stash of bills and coins. If you ask for it, some of them may be denominated and handed out as “yours.” Similarly, the U.S. Treasury has some gold bars at Fort Knox—but don’t try asking for them. Nowhere is the physical amount of money in the system equal to even a small fraction of what the economy and your bank think of as “money.” Your account is an agreement between you and the banking company, and between the bank and anyone to whom you owe, or who owes you, money. The bank enters certain zeros and ones into your account on deposit, and removes them on demand, representing a sum you all agree represents “your money.” But it’s actually just promises, tracked and pledged by a computer. The transactions are just the bank’s computer talking to other computers somewhere else.
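In other words, your account is a list of entries and a running sum. Here is a toy Python sketch with invented amounts.

```python
# A toy sketch of an account as nothing but ledger entries in a computer:
# deposits add entries, withdrawals add negative entries, and "your money"
# is just the running sum. Descriptions and amounts are invented.

from typing import List, Tuple

ledger: List[Tuple[str, float]] = []   # (description, amount) entries

def balance() -> float:
    return sum(amount for _, amount in ledger)

def deposit(description: str, amount: float) -> None:
    ledger.append((description, amount))

def withdraw(description: str, amount: float) -> None:
    if amount > balance():
        raise ValueError("insufficient funds")
    ledger.append((description, -amount))

if __name__ == "__main__":
    deposit("paycheck", 2_000.00)
    withdraw("rent", 1_200.00)
    print(f"Balance: ${balance():,.2f}")   # no coins, no gold bars, just entries
```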

And, finally, any of our human transactions taking place between people outside the same room (and sometimes even among people sitting next to each other) are facilitated by computer. The phone system doesn’t use a “switchboard” anymore; it’s all done with computers. And your voice isn’t the amplitude modulation of a carrier signal, but instead it’s digitized, packetized, and sent in pieces down the wire. Your Tweets, your Facebook page, the information streaming over your computer or iPad screen—all are computer generated and computer enabled. Even the page you’re reading now is backed by lines of code and stored on a server. (To see some of the HTML programming, go to your browser’s command menu, pull down on the tab that says “View,” and look for a command using words like “Source” or “Coding.”)
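Digitize-and-packetize can itself be sketched in a few lines of Python. The sample rate below is typical of telephone audio, but the “voice” here is just a test tone and the packet size is arbitrary.

```python
# A toy sketch of digitizing and packetizing a voice signal: sample a waveform,
# then split the samples into fixed-size packets with sequence numbers so the
# far end can reassemble them. The "voice" is a test tone; sizes are arbitrary.

import math

SAMPLE_RATE = 8_000    # samples per second, typical of telephone audio
PACKET_SAMPLES = 160   # 20 milliseconds of audio per packet at that rate

def digitize(duration_s: float):
    """Produce integer samples for a simple 440 Hz test tone."""
    n = int(SAMPLE_RATE * duration_s)
    return [int(1000 * math.sin(2 * math.pi * 440 * t / SAMPLE_RATE)) for t in range(n)]

def packetize(samples):
    """Split samples into (sequence_number, chunk) packets."""
    return [(seq, samples[i:i + PACKET_SAMPLES])
            for seq, i in enumerate(range(0, len(samples), PACKET_SAMPLES))]

if __name__ == "__main__":
    packets = packetize(digitize(0.1))   # a tenth of a second of "voice"
    print(f"{len(packets)} packets, {len(packets[0][1])} samples each")
```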

Modern civilization, at least among the developed countries, is an interlocking series of “abracadabra” incantations, executed by software daemons, using the engines of chips, disks, and circuits.

If you’ve ever done any programming,3 you know how finicky and precise the language can be. The old English-class bugaboos of grammar, syntax, and punctuation apply with unbelievable force—but using new rules, depending on the programming language. Drop a comma, or put the semicolon inside the quote mark instead of outside, and you won’t merely be thought unlettered by the people around you. You will see the machines stop dead, or give false answers, or light up your screen with strange colors.
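Here is a concrete taste of that finickiness, in Python: drop one comma between two string literals and the language quietly glues them into a single string, so the machine gives a false answer rather than an error message.

```python
# An illustration of how precise the grammar is: in Python, dropping one comma
# between string literals silently glues them together, so the machine gives a
# false answer instead of flagging the typo.

with_comma = ["alpha", "beta", "gamma"]
missing_comma = ["alpha" "beta", "gamma"]   # one comma dropped

print(len(with_comma))      # 3, as intended
print(len(missing_comma))   # 2 -- "alphabeta" has quietly become one item
```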

Bugs and glitches in even the simplest programs used to be common. Now, of course, much of the code that runs our machines is modular, written and checked by computer programs themselves. This is called computer-assisted software engineering. Humans only intervene at the highest levels of programming—that is, deciding what the “abracadabra” should actually do. Glitches arise only because there are so many millions and millions of lines of code that undreamed-of combinations and conflicts are bound to arise. And in any system approaching the scale of grains of sand on the beach, random errors will creep in because of cosmic rays, mechanical errors, and tiny voltage fluctuations. And so the bug fixes for the next version accumulate. But none of it is actually, ahem, a human error.

We live in a world run by sufficiently advanced magic that humans are not really in control anymore except at the very top level. All of this has happened in the last forty years, with the advent of the microchip.4 Today, we only invoke the daemons and then stand back in wonder.


1. From the Latin incantare, meaning “to enchant.” But consider also that this word includes the Latin root cantare, meaning “to sing.”

2. Of course, as Arthur C. Clarke wrote, “Any sufficiently advanced technology is indistinguishable from magic.”

3. My own dealings with computers go back to 1979 and a spiffy little Apple II that I bought because it was, well, a computer and I never had a chance to own one before. Using it was a combination of running programs you bought on tape or disk and programming your own in BASIC. I learned a bit of BASIC and even started to learn the machine’s structured language, Pascal. I had dreams of writing code and creating games as a way to get rich. Then I saw my first Pong game that invoked gravity and I knew that, far from getting in on the ground floor, I was way out of my league and that only people with solid wizarding credentials would make a go of this. But I still like using computers.

4. My father was a mechanical engineer who started working between the world wars at Bell Labs and finished his professional career at Sylvania. He used to say that two facts of modern life in the twentieth century had gone generally unnoticed by the general public. One was the revolution in materials—with the chemistry of polymers and alloys replacing basic materials like rubber and steel in everyday goods. The other was the increasing number of small motors in our lives. When he was a boy, the average person might have one pump out at the well to draw water, an engine under the hood to power the car, and maybe a couple of household motors in electric fans and the phonograph. In his lifetime we suddenly had motors all over the car to run the wiper blades and roll up each of the windows; motors all over the kitchen to run mixers and dishwashers; motors in the bathroom to run shavers and even toothbrushes. Toward the end of his life he would add small computers. If your shaver—which a hundred years ago was a piece of really sharp metal with a folding handle—has a window to tell you the state of the battery charge and when to clean and change the cutter head, you’re served by a computer as well as a motor.