Sunday, July 29, 2012

The Language of Technology

I’m in the middle of reading John McWhorter’s The Power of Babel, which studies language families and language mixing. It’s full of fascinating ideas, the most controversial of which seems to be that what laymen think of as a “language,” like English or French or German, is really only a collection of dialects that flourish in their own separate localities spread over a wide area. Thus Yorkshire English versus Cornish English, or Swabian versus Swiss versus Viennese German. Dialects grow steadily apart, and the concept of a “national” language is mostly a philological construct. What we think of as national languages—French, Spanish, Italian—are really local dialects of Latin that grew apart over time.

Some of this was known to me before reading McWhorter’s book. It’s common knowledge in the age of modern telecommunications that BBC Radio created “standard English” out of the dialect spoken in southeastern England, centered on London. In the same way, the rise in U.S. broadcasting of announcers from the Midwest, like Johnny Carson and David Letterman, helped create a kind of national American accent. Likewise, the new medium of the printing press helped Martin Luther make his particular dialect the standard “high German” through his translation of the New Testament. And the language of Parisians became, through central government influence and Académie française enforcement, the definitive form of French.

One area of linguistic mixing and borrowing which McWhorter hasn’t addressed (at least as far as I’ve progressed with the book) but has always fascinated me is how technology is changing language. Like the spread of foreign words into a new area after conquest—think of the spread and mixing of Spanish words and grammar into Central and South American native dialects in the 16th century and later—new technology words and the concepts behind them change not only the way we speak but also how we think.

Imagine an English speaker from Tudor England. He lives about 450 years removed from the present day and about 200 years removed from the start of the Industrial Revolution, which really began the changes that we see all around us today. Our Tudor specimen knows about machinery, but only in limited and what we would consider primitive forms. A grist mill powered by water or wind, with its great wooden gears to convert the rotation of the water wheel or the sails in one plane into the turning of the millstone in another, would be the most complex machine he might usually see. For the rest, his life uses much simpler machines: the wedge of the plow to break the soil; the turn of a wheel to carry the load in his cart; the screw that forces the platen and paper against the type bed of a printing press; the hinged lever that works the bellows of a forge or a pipe organ. In fact, the most complicated machine he might ever observe—if he could look inside the cabinet to study the mechanism—would be the system of stops and keys that direct the air flow in the organ.

Now imagine going back in time and trying to explain to him the workings of an internal combustion engine. The basic principle of a piston moving inside a closed cylinder and pushing on a connecting rod to turn a crank would be foreign to the Tudor mind. He could understand it, surely, but he would probably need drawings, a demonstration model, and a new vocabulary to grasp what’s going on. And that’s long before we get into details, like whether that piston is pushed by expanding steam in Mr. James Watt’s engine of 1765, or spark-ignited exploding gasoline in Mr. Edward Butler’s model of 1884, or compression-ignited oil in Mr. Rudolf Diesel’s of 1893.
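
(For readers who like to see the geometry behind that piston-and-crank picture, here is a minimal sketch in Python. The relationship it computes, piston position as a function of crank angle, is standard slider-crank geometry; the bore and stroke figures are invented purely for illustration.)

```python
import math

def piston_position(crank_angle_deg, crank_radius, rod_length):
    """Distance of the piston pin from the crankshaft center for a simple
    slider-crank mechanism (crank, connecting rod, piston in a cylinder)."""
    theta = math.radians(crank_angle_deg)
    # The piston slides along the cylinder axis while the connecting rod
    # tilts to follow the crank pin swinging around the crankshaft.
    return (crank_radius * math.cos(theta)
            + math.sqrt(rod_length**2 - (crank_radius * math.sin(theta))**2))

# Invented figures for illustration: a 40 mm crank throw (80 mm stroke)
# and a 140 mm connecting rod.
for angle in (0, 90, 180, 270):
    pos = piston_position(angle, crank_radius=0.040, rod_length=0.140)
    print(f"crank at {angle:3d} degrees -> piston pin {pos * 1000:6.1f} mm from crank center")
```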

Although most people in the modern world who drive automobiles and trucks every day may be hazy on the technical details, they still can understand the basic concepts and are familiar with the language necessary to discuss them. One cannot really talk about the internal combustion engine without using now-familiar terms like “piston,” “carburetor,” “crankshaft,” and “spark plug.” But take any of these words in this context back to Tudor England, and you would have a difficult time making yourself understood. “The carburetor, well … you see, that’s a little box attached to the intake manifold that mixes the air and the fuel vapor in correct proportions for ignition. It’s connected by a lever or cable attached to the throttle.” Disbelieving expressions. Sideways glances. Presumptions of lunacy.

In Shakespeare’s England, “throttle” was a verb meaning to choke someone, and so you might make some headway with the idea of opening up an air passage. “Piston” and “carburetor” were totally unknown words that would not see their first usage for about 150 and 300 years, respectively, although with time and patience and a sketch pad you might introduce the concepts to the Tudor mind. But “spark plug,” embodying as it does the use of electrical energy in a circuit, would require opening up a whole new discussion in the unknown realm of physics. You would have to start with lightning and work forward from there.

But we’ve been dealing with steam engines for about 200 years now, and with internal combustion and electrical circuits for more than 100 years—to the point that we take them for granted. Now consider the true magic of our age: computers. These machines, which really only arose in the past 60 years or so, have become ubiquitous and in their own way more powerful than the steam engine.

They, too, use their own language. Often the words are acronyms for complex thoughts compiled from word chains: CPU for “central processing unit,” RAM for “random access memory,” ROM for “read only memory,” USB for “universal serial bus,” KB for “kilobytes,” MB for “megabytes” (which are a totally different measure from the MHz of “megahertz”), and so on. Anyone who has ever bought a computer and done comparison shopping among the available brands and models must deal with these terms.1

Now consider taking these terms back to Renaissance England. The Tudor mind has some experience with machines that twirl and spin. But machines that can, well, think and respond; can send messages down a wire or through the air; can capture, store, and recreate ghost voices and ghost images; and can communicate with and control other machines all by themselves, all based on these little pieces of lightning called “electricity” and little fragments of enchantment language called “software” … Within five minutes you would be condemned for witchcraft and communion with demons. An hour later they would be burning you at the stake.

Most of us will never have the chance to go back and visit Tudor England and confuse the locals with our technologically tainted language.2 But one group of people travel there and elsewhere in history all the time: writers and their more freewheeling brethren, filmmakers. Anyone who has attempted to write a historical novel or the script for a costume drama must be continuously on guard against casual references to impossibly modern thinking.

Some slips are easy to avoid. For example, the word “keyboard” had historic usage only in relation to musical instruments: pipe organs, pianos, and the stringed predecessors of the piano, the harpsichord and the virginal. The word’s relationship to writing, through typewriters and computers, is so mechanical and obvious that the modern writer is not going to slip and describe a Shakespeare or Marlowe working away at his keyboard. Similarly, the mouse was just a small gray mammal until Douglas Engelbart’s lab, and later the researchers at Xerox’s Palo Alto Research Center, put a box on tiny wheels to aid eye-hand coordination in manipulating the cursor on a computer screen.

But some thoughts are modern and have no place or usage in the past. My favorite is the song from The Phantom of the Opera, about Christine going “past the point of no return.” We think the concept is obvious: a stage in the journey where you know you cannot turn back. In the Phantom’s usage, she is passing a moral station, a level of knowledge about dark and secret things that nice girls should not know. And this can be explained in a 19th century context—but not in those specific words. No one’s journey had ever reached a “point of no return” in the days of horseback or sail. You could always go ahead.3 The concept, and that precise verbal formulation, only became current when pilots began flying over large bodies of water. Since the amount of fuel available in the plane is both limited and essential to survival, every such journey reaches a precisely calculable point at which the plane lacks enough fuel to turn around and go back to the original airport. You must fly on and either make it or not: the point of no return. The Phantom, who was singing long before the Wright brothers’ first flight, would not have used those precise words: “point of no return.”4
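
(Since so much of this argument hangs on the phrase “precisely calculable,” here is a minimal sketch of that calculation in Python. It uses the textbook simplification of constant groundspeeds outbound and homebound and a fixed fuel endurance; the numbers are invented for illustration.)

```python
def point_of_no_return(endurance_hours, groundspeed_out, groundspeed_home):
    """Time and distance to the point of no return (PNR): the farthest point
    from which the aircraft can still turn around and reach its departure
    field on the fuel remaining."""
    # At the PNR, time flown outbound plus time needed to fly home equals
    # the total endurance: t_out + distance / groundspeed_home = endurance,
    # where distance = groundspeed_out * t_out.
    t_out = endurance_hours * groundspeed_home / (groundspeed_out + groundspeed_home)
    distance = groundspeed_out * t_out
    return t_out, distance

# Invented example: 10 hours of fuel, 150 knots groundspeed out, 120 knots back.
hours, miles = point_of_no_return(10.0, 150.0, 120.0)
print(f"Point of no return: {hours:.1f} hours and {miles:.0f} nautical miles out")
```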

Use of modern words and the concepts behind them needs special care when placing a story in a historical context. The writer of science fiction, however, who travels forward into the unknown future, faces a different and somewhat more complex problem: how to suggest technologies, their concepts and language, that do not yet exist. Some writers prefer to leave things vague and treat far-future technology as a sort of invisible magic. Isaac Asimov created robots that move and think on their own, and he explored their moral relationship to the universe through the Three Laws, but he never probed deeper into their mechanism than reference to a “positronic brain.” On the other hand, Frank Herbert in the Dune series imagined a civilization 10,000 years in the future, and he made it come alive with references to things like Holtzmann effect field generators, electrostatically controlled oil lenses, the paracompass, and ornithopters flying with their “primaries” and “coverlets” and using “jetpods” for takeoff. Herbert wove similar glints of language into all kinds of familiar-but-hidden technologies and training regimens.5

Ever since James Watt created a workable and efficient steam engine some 250 years ago, languages around the world have been grappling with a new wave of conquest: the imperatives and underlying concepts of an ever-expanding technology. For most of the world’s languages, the choice has been to adopt the English usage, which itself adopts and adapts many root words from ancient Greek.6 But all of them must deal with the warping effects of an accelerating and relentless advance. We’re all—linguistically as well as socially and economically—on an express escalator to … who knows where?

Blink, and you’ll never catch up. Live another fifty years, and the world you knew as a child will seem as distant and primitive as that of the Tudors, while the world you live in at that future time will be as unimaginable to a person of today as ours would be to a citizen of 16th century England.

1. However, the need to immerse yourself in these words of power has recently started fading, as computer processing has become so ubiquitous and the power described so huge and standardized that no one bothers to think in terms of its applications and limitations anymore. Consider the smartphone, which has become the most advanced and complex of Univac’s children. Aside from asking about memory capacity—stated in GB for “gigabytes”—which affects how many songs, photos, and videos the phone can store (“Videos … on my telephone?”), no one cares how fast or powerful the central chip is. We just presume it’s wonderful enough to keep up and get the job done. We don’t much care about the “pixel” dimensions of the screen, either, because it will simply be so dense as to look and feel like … reality.

2. Or not yet, anyway. But the physicists of our own era, who are playing with concepts that the average person of today can barely comprehend, may be getting closer to understanding the true nature of space, time, and energy. Who knows what travel possibilities lie ahead of us? Although that opens up another whole can of wormholes, doesn’t it? If people from our future will have the ability to navigate backward into the past, then why haven’t we met them? Either they’re very good at hiding, camouflage, and dissimulation—or they just never mastered the fundamentals of time travel.

3. Odysseus went ahead for ten long years through the strangest of physical and moral adventures. No thought of when he might pass beyond the possibility of returning to his original starting point, the burned-out hulk of Troy.

4. Which, according to the dictionary at hand, was first used in written form in 1941, although pilots had probably been speaking of this particular point since the time of Saint-Exupéry and Lindbergh.

5. Of course, Herbert also avoided a lot of technical grief in a land of talking robots and other semi-magical beings by positing a future revolt against the machines, the Butlerian Jihad, followed by a re-ignition of human mental powers in the Great Schools. And that took his stories into a wholly fascinating new direction.

6. For example, “telescope” from tele (far) and skopein (to see); “microscope” from mikros (small) and skopein; “telephone” from tele and phōnē (voice); “microphone” from mikros and phōnē; and so on for a century or more of industrial advances. And let’s not forget the words representing orders of magnitude: kilo, mega, giga, and tera (representing thousands, millions, billions, and trillions, respectively) as applied to “watts,” “volts,” and “hertz” (all derived from the names of the discoverers or first propounders of certain physical properties) and, of course, “bytes.”

Sunday, July 22, 2012

My Idea of Heaven

Most conceptions of the afterlife include some form of judgment about a person’s time and activities during the earthly life, followed by assignment to someplace suitable to that judgment, and residence there for an eternity.1

This is certainly the basis of afterlife in the Christian and Islamic belief systems. The Christian heaven is associated with spiritual comfort, music, and the contemplation of God—as befits the Christian church’s promotion of a godly life through worship and prayer. The Muslim paradise is a place of cool shade, good food, and beautiful, compliant women—as befits the experience of life in a hot dry climate with a male-dominated culture. In both traditions, heaven’s alternative depends on the outcome of that judgment. For those who never personally affirmed the core religion, the afterlife is a place of nothingness, limbo, or at worst a separation from God—it’s a dumpster waiting to be emptied, eventually. For those who heard the word, accepted it, and then broke the rules or later denied them, there is an eternity of torment, anguish, and despair, plus the separation from God.

Heaven is a place which people can anticipate with serenity, “a better life than this”—especially if they’ve met all the earthly requirements for admission. Its contemplation is a comfort to take the mind forward, in those final minutes at the point of death, into what is really the unknown. It is the unknown because, while we may have articles of faith, sanctioned stories, myths and legends, and spirit-transmitted messages about the experience of being dead, no one has actually come back with any kind of scientific proof. We do have the recollected experiences of people who went up to the point of death and claimed to see beyond—but none of them stayed “dead” for very long, certainly not to the point of dissolution and corporeal decay, and so the suspicion lingers that they weren’t really dead after all.

As to defining heaven—or hell, for that matter—with any kind of map, activity schedule, calendar of events, or detailed history2 … forget it. Heaven is somewhere above, possibly in the clouds, also possibly somewhat higher, full of light and coolness and that persistent music. Hell is somewhere below, possibly underground or further down, full of shadows and steam, screams and persistent noise. What you do in either place, whether glory or despair, you will do there for eternity. Music, houris, religiously sanctioned pleasures or torments, forever and ever.

Picking holes in this concept is just too easy, but sometimes my mind runs away with me.

One assumption about heaven is that you will be reunited with the people you’ve loved during your life on earth. People, especially family members, are important to us, right? I mean, what else is there? Eating ice cream and playing with that big, red Tonka fire truck you lusted after as a kid? Those are ephemeral activities. But associating with loved ones and cherishing them for eternity—that just doesn’t get old, does it?

But which people? Only the ones you actually loved and want to see again? From your perspective? Or all of your family, including the crazy aunts and selfish nephews? Or perhaps, instead, you want to meet famous people from the past. Here’s your chance to come face to face with Julius Caesar and Napoleon Bonaparte. You can ask them important questions and catch up on your history. And, by the way, whom did they come to heaven to see?

I imagine going up to my great-great-great-grandfather, whom I never knew. “Hi there, Granddad!”

“Who are you again, boy?”

“I’m your gr-gr-great-grandson.”

“What’s that supposed to be to me?”

“Well, I represent your future. Although, by now, I guess I’m part of the past, too.”

“Oh, yeah? Well, I represent only a thirty-second part of you, if we can believe all this genetics stuff. So why does that make you my problem? Why don’t you go pester one of those thirty-one other people? One of them might care.”

So much for reuniting with family. Maybe I’ll wander over and visit Caesar. He might like an addition to his audience. Or … where’s my fire truck?

Frankly, I can’t imagine any activity, whether a family reunion, eating ice cream, or playing with that fire truck, that I’d want to do for more than about three days. Then I would become bored and look for something else to do. Perhaps this is just the nature of the human mind, set in the hurly-burly of living, and when the soul passes through death and all the earthly details are stripped away,3 it won’t want the diversions of activity based on time passing or the sequence of cause and effect.4

The human mind is a product of samsara, which is a Sanskrit term used by the Buddhists to mean the cycle of births through reincarnation. But the word at its root means “continuous movement,” “continuous flowing,” or “the running around”—and that we certainly are. Our minds were designed by evolution to move continuously, from one experience to the next, from one antelope hunt to the next, from one berry bush to the next. We never stop learning, experiencing, questioning, demanding, making, building, doing, destroying, loving, hating, growing, and fading. Whether the motion is upward or downward, we never stop becoming something new and different. Place us in a sensory deprivation chamber, and our minds will spin webs of fantasy and vision to keep us occupied.

The tragedy of the human condition—and its triumph as well—is that we know all this from personal experience, and yet we also know from observation that there will come an end point, a stillness, when the body does not move, the eyes do not see, the breath does not flow. We observe this empty state in the old who have “gone beyond,” and we see it in the young who are taken by illness, accident, or predation. And since we identify those others as beings like ourselves, we know deep down that the stillness and silence are coming for us one day.

This is the human mind at work. A dog or a horse can never understand the correlation. The dog or horse might be responsive to contact with a human or another animal, but when that other dies and goes still, the dog or horse only pauses for a bit, wonders what happened to the playmate, and then goes on with whatever it was doing. It doesn’t have any awareness of self, and so no awareness of others like itself. The animal has no sense of the future, and certainly not of a future that might not include itself as an active player.

Confronted with the mystery of being oneself and contemplating a world outside that is not the self, humans have created imaginary dimensions that cannot be measured in the physical environment. We speak of infinity, to represent all the numbers beyond which we can conveniently count. We speak of eternity, to represent all the time beyond what we can personally experience or comfortably imagine. We value omnipotence as a representation of all the energy and presence beyond any that we can name. These are the attributes of God and His realm, which are beyond the human experience.5

Knowing that stillness that will come, we put it off mentally and emotionally. Yes, it’s there ahead of us, but instead of an end, it will be a transition to a more lasting state. And how long will we stay in heaven or hell? What comes beyond and after them? Why, nothing: these states are permanent.

Or we imagine that, after the body stops moving, the mysterious transition places the “real” part of our existence, the soul, into a new body, to begin the cycle over again. And how long does this cycle of birth and rebirth go on? Why, forever: life is eternal.

As near as I can tell, only the Buddhists believe in an end point. The eternal return of rebirth, the endless climb upward, from insentient animal to human to god, and the endless offering of choices that represent the possibility of error and backsliding, the swings of karma—all of this is too oppressive to contemplate. The Buddha’s solution was to stop the karmic pendulum through an act of will, unbind oneself from the cycle of birth and death, and extinguish one’s soul. Like a candle flame, the Buddha at his death did not go forward, to return someplace else. Instead, he went out.

To which my simple and obvious mind replies, isn’t that what we’ve been talking about all along? The body goes still, the eyes stop seeing, the breath ceases, and all that we have been carrying in terms of knowledge, skill, responsibility, love, hate, and all the rest … vanishes. We go out.

Is this a terrifying thought? Perhaps. All that we know and love—as well as whatever we hate and fear and despise—simply disappears. But I find equally terrifying the prospect of holding the same conversations, exploring the same memories, reliving the same triumphs and tragedies, with loved ones for eternity. Or eating ice cream—whole oceans, gaseous nebulae, spiral galaxies of ice cream—until you smother and drown and still can’t die. Or playing with that red fire truck until the wheels fall off, the paint blisters, and the steel crumbles away into rust, and yet the damned thing still moves and clangs and fascinates me. Those thoughts are truly oppressive.

It’s much more comforting to simply imagine going … out.

1. Of course, the Hindu tradition, and others that similarly posit reincarnation, do away with the place of eternal residence and instead recycle the person, or the soul stripped of current personal details, back into the hurly-burly of earthly life, usually with an upgrade or downgrade in status based on that end-of-life judgment. Since these traditions deny a final disposition—any kind of “heaven”—I’ll leave them for another meditation.

2. Except for that embarrassing account in Milton’s Paradise Lost about Lucifer, first among the angels, who became dissatisfied, started a war, split heaven asunder, and either started up or was cast down into his own place, called hell. But that story seems to have been more about the politics of God’s other intelligent creation, the angel folk, rather than the final resting place of mortal human spirits.

3. This opens a whole other meditation. What is a person other than the welter of activities, skills, knowledge, responsibilities, appointments, likes and dislikes, memories and fondnesses that we build up over a lifetime? Take away all those details and what is left? A spark? A beaker full of shimmering, translucent material called “ectoplasm”? Some kind of interchangeable life-force, like the 12-volt battery you put in a car? Take away one earthly detail—say, my preference for the color red—and you diminish me and my experience. Take away all of them, and there’s nothing left.

4. They say that when a person passes through the event horizon of a black hole and falls toward the central singularity, he accelerates toward the speed of light. And according to Einstein, as you approach light speed your time—as measured by an observer out in normal, unhurried space—slows to a stop. Your seconds stretch out to become minutes, hours, days, eons. From that outside vantage you never reach the crushing singularity because, as you make that final acceleration, your clock appears to stop. You are frozen in mid-thought, mid-speech, mid-scream, or mid-scratch and never experience the final obliteration. Perhaps the experience of death is something like this. You just slow dow …
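
(A worked footnote to the footnote, stated as a sketch only: the textbook Schwarzschild time-dilation factor behind this picture relates the time τ kept by a clock held at radius r above a non-rotating black hole of mass M to the time t kept by a distant observer,

\[ \frac{d\tau}{dt} = \sqrt{1 - \frac{r_s}{r}}, \qquad r_s = \frac{2GM}{c^2}. \]

The factor goes to zero as r approaches the Schwarzschild radius r_s, which is why, from the outside vantage, anything falling toward the horizon appears to slow and freeze.)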

5. And then science, by measuring events in the known universe, shows that the universe itself is neither infinite in space nor eternal in time, and the power expended to create it, the Big Bang, while beyond our experience, was not actually any kind of omnipotence. And then, confronted with this dead stop at one extreme of history, the beginning, we start to imagine what might have come before the Big Bang. Infinity, eternity, and omnipotence keep creeping into the most rational discussions.

Sunday, July 15, 2012

The Parasite Problem

Every government functions as a parasite on the economic activity within its jurisdiction.1 That is, it supports its own activities by taxing various aspects of the economy, or by taxing its citizens directly—who in turn must support themselves and pay the tax by participating in the economy.

Like any good parasite, the government also prepares its host to withstand the infection by passing laws and regulations that encourage economic activity and increase the tax base. But eventually the successful parasite encounters a problem: as it prospers and grows, taking a larger and larger bite out of the host, it approaches a point where further growth of the parasite will cripple and sicken the host. If the parasite is perfectly adapted to exist on that host, then it will also limit itself so that it does not jeopardize the health of its only means of survival.

Unless, that is, that parasite is the United States government—or the government of any of the most developed countries in the world today. They all seem to have forgotten the fundamental rules of being a good parasite.

First, they have fed not only on economic activity but also, in any democratic structure, on the votes of a popular majority. The only way to ensure those votes is to confer benefits—tax breaks, supported services, monopolies—on sectors of the population that will supply a majority in support of the government. In the United States today, approximately half of the population is favored with paying no taxes through “exemptions,” and even getting “credits” back from the government, while enjoying a wide range of free services. These are the people considered to be burdened with low social or economic status. The other half pays all the taxes and takes less in services. These are the people considered privileged with high social and economic status. The people of low status have only their votes to give, but alert politicians have already figured out that votes are better than gold. Really cunning politicians have found a way to convince the privileged sector to donate their gold in support of the campaigns needed to persuade the underprivileged to vote for them. This exchange is by definition unsustainable.

Second, these governments have passed laws and regulations that either tend to, or are outright designed to, inhibit economic activity. For example, they promote the interests of labor over management and owners in a productive enterprise. These laws encourage workers to command higher wages; demand greater services in terms of medical care, family benefits, and vacation and retirement compensation; and limit the hours they can be asked to work and tasks they can be asked to perform. When these laws apply to activity in the “private sector”—the arena of the economy—they limit labor’s productivity and so the capacity of that economic activity to grow. When they apply to workers in the “public sector”—the arena of government itself—they expand the government’s budget and its need for greater taxing power.

Modern governments inhibit economic activity in other ways. They pass laws in favor of what are called “the commons”—expansion of public lands, restriction of the uses of commonly held resources on those lands, like mineral extraction and grazing, and protection of the less visible goods of clean air, clean water, biodiversity, and climate stability. By restricting economic activity that might infringe on these commons, the government appears to be doing good. But an ideologically guided government can see too much good in restricting these commons and too little good in allowing the economic activity that makes use of them.

Third, governments overestimate the amount of economic produce they can safely reap in the form of taxes. When they move from “taking no more than the host will miss” to “leaving only enough for the host to survive,” they enter onto dangerous ground. Any miscalculation about what the host needs to survive can lead to crippling, sickness, and death. Today, governments are making ultra-fine calculations about the amount of incentive left to the promoters of economic activity—the entrepreneurs, managers, shareholders, capitalists, and bankers—without whose effort and attention the enterprise ceases to function.2

Today, governments all over the world have overestimated the amounts of economic activity available to be taxed and underestimated the amounts of tax revenue they need to survive. In times of greatest economic growth, the government grows and prospers and promotes more vote-getting programs and services. In times of stagnation, the government borrows money to support itself, absorbing the energy3 that could be used to promote economic growth. This exchange is by definition unsustainable.
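
(To put a toy number on the idea of a parasite outgrowing its host, here is a minimal sketch in Python. It is purely illustrative, not a model this essay proposes; the growth coefficients are invented, and the only point is that a modest take leaves a growing host while a heavy take shrinks it.)

```python
def host_after_years(take_rate, years=30, host_size=100.0):
    """Toy illustration: the host economy grows when the parasite's take is
    modest and shrinks once the take passes a break-even point.
    All coefficients are invented for illustration."""
    for _ in range(years):
        # Invented assumption: baseline 4% growth, reduced by a tenth of a
        # point for every point of output the parasite takes.
        growth_rate = 0.04 - 0.10 * take_rate
        host_size *= (1.0 + growth_rate)
    return host_size

for take in (0.20, 0.40, 0.60):
    print(f"take {take:.0%} of output -> host index after 30 years: {host_after_years(take):6.1f}")
```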

Read the newspapers or skim the blogosphere and you find that almost everyone, on both sides of the aisle, agrees that our government and its obligations have grown significantly. From a historic 18% of gross domestic product, the federal government now takes about 25% of our annual economic activity. Add in state and local governments, and the take is almost 40%. Federal debt that will eventually have to be repaid is about 100% of our annual GDP—the equivalent of a person owing credit card debt equal to annual income. As a parasite, the government at all levels is growing perilously large and endangering the health of the host.

I’m not in favor of having no government, because the government is a legitimate expression of society’s need to provide for things that individuals cannot efficiently do for themselves, like build roads, defend the borders, drain the swamps, and other useful treatment of our natural commons. It is appropriate that some people work in the government rather than in the private economy to plan for and execute these communal services. It is also appropriate that government take a share of individual wealth and economic activity to pay for these services that benefit both the taxpayers and the economy as a whole.

But there is a natural limit to which government can grow. Not a legal or constitutional limit, but a natural limit. When the host labors and struggles to feed the parasite, it is never recorded in the biological sphere that the parasite will suddenly leap free, stand upon its own legs, and become the host, fulfilling the host’s niche in the surrounding ecology. Similarly, a government that is based on taxation of economic activity—rather than the Communist type, which sets out from the beginning to be the economy—cannot suddenly morph into a substitute for that taxed activity. The lines of communication, the signaling systems, the feedback mechanisms are all wrong for such a transposition. The host will instead sicken until either the government backs off its requests and the host recovers, or the government continues to grow and the host eventually dies.

Today, every government in the modern, developed and developing world is testing the borderland between these two conditions. In Europe and America, the host is sickening and may soon go comatose. In Japan, the host is largely populated by zombies on life support. In China, the originally Communist government is letting the capitalist host grow as a wild type of cancer, with no plan for where or when this strange experiment will end.

But the old adage still applies: whatever is unsustainable cannot continue forever and eventually will come to an end.

1. The exception to this would be a Communist state, where the government defines and controls the economy directly—in effect, the government is the economy. But then woe betide the citizens.

2. In the popular imagination—fostered by supporters of ever-larger government—the capitalists (who essentially buy corporate, municipal, and U.S. Treasury bonds) and the shareholders (who buy stock in companies) are just the greedy yet faceless tycoons of Wall Street. Certainly, no one cares if their ox gets gored to within an inch of its life. But some of the biggest players in the stock, bond, and futures markets are the “institutional investors,” who manage the money in the retirement funds of employees in both the private and public sectors. Everyone has a stake in what’s going on.

3. Money is nothing but a fixed form of human energy. See The Economy as an Ecology from November 14, 2011.

Sunday, July 8, 2012

Yin and Yang

I am still trying to understand the political fracturing that is taking place in these United States. The level of rhetoric on both the Right and the Left has become deafening, and the issues now seem to be more a question of ends than means.

When I was growing up and becoming politically aware, the differences between Democrat and Republican presented themselves as preferential and procedural. The Democrats seemed to favor more pluralistic solutions, with people banding together to achieve their aims, and government stepping in to support the poor and weak. The Republicans seemed to favor more individualistic solutions, with people striving to do their best in the marketplace of effort and ideas, and government enforcing the rules to ensure fair play. Each political decision revolved around which approach—pluralistic or individualistic—would provide the best outcome for the case in hand. Both parties adhered to the same essential aims: make the nation strong, help people live better, build for the future. They shared a tacit agreement about the meanings of “strong,” “better,” and “future.”

Perhaps that was all a dream, my own personal naïveté.

Today, the discussion about means and method is over. Democrats seem to want total government control of the economy and collectivization of the population—the Soviet Union or Maoist China but without those ugly, repressive bits. Republicans seem to want no government at all and free-range capitalism—the economics of winner-take-all robber barony amid perpetual cycles of boom and bust. Like two football teams on the line of scrimmage, each party is dedicated to driving the ball across the opponent’s goal line while denying the other team an inch of ground. Collaboration, let alone reconciliation, is out of the question. Victory or death! The ultimate solution! Yee-HAH!

Perhaps this is all a nightmare, my personal penchant for seeking the heart of things.1

The fact that the extreme wings of either party have taken over and now drive the rhetoric with unassailable positions, litmus tests, and apocalyptic visions tells me that there’s an emotional disjunction at the root of their disagreement. It’s not just that “they’re wrong and we’re right.” It’s not that the sides disagree over either means or ends. Instead, they experience a complete difference of vision, a different level of comfort with the universe.

Disappointed Idealists

Democrats, liberals, progressives, Marxists—wherever you vibrate along the spectrum to the left of center—are idealists. They view the world as an imperfect place and see all too clearly how it could be made, not just better, but perfect. Their glass is half-empty and they don’t see why it cannot be filled to capacity.

In their ideal world, everyone should be happy. Everyone should be free to act as they please, equal in stature, and able to live as brothers.2 If people are not free to do as they choose, then some condition imposed from outside must be holding them back, whether it’s a repressive society, bourgeois morality, or harsh parenting and a mean boss. That condition should be identified and removed. If people are not equal, then some condition is dividing them, whether it’s class structure, lack of inherent opportunity, or denial of basic resources. Find and remove. If people are not brothers, then their feelings are estranged by some external influence, whether it’s competitive capitalism, religiously inspired hatreds, or teachings of racism, sexism, and classism. Find and remove.

Human nature, they believe, is perfectible—if only the idealists can find the right mix of emotional, aspirational, linguistic, and legal levers to pull and push. And the negative levers like private property, individualism, elitism, greed, and anger must be set to zero and then broken off. The idealists have seen enough success in this venture through the activities of religious reformers and popular writers that they believe a wholly secular approach, dedicated to teaching and promoting the right impulses, can perfect the human soul.

Idealists are focused on the future—what the state of the world can or should be, rather than what it is at present. They find nothing to like or preserve about the past, because previous conditions and traditions were the source of humanity’s failure and half-emptiness. The existing social order, religious teaching, forms of transaction, personal relationships, and ways of thinking about the world are all clearly flawed because they have not produced the perfection the idealist desires. They need to be rooted out and remade according to a better, more workable, more effective model.

The more deeply one believes in human perfectibility, the more cavalier he or she will be about uprooting society, wiping the slate clean, and beginning again with fresh clay. Since people past the age of reason, the older generation who are set in the ways that have worked for them in the past, are reluctant to change, they need to be either silenced, eliminated, or leapt over. The idealist focuses on the young, the changeable, those whose links to the past can be weakened and easily broken. The idealist looks ahead and dreams of utopia for future generations.

Of course, people are not clay. People who act out of perfect personal freedom usually make a mess of the world around them. No two people are equal in their inheritance of talents, energy, ambition, or even the perception of what “perfection” might be. And people are not all born brothers.3 So the idealist is doomed to suffer disappointment.

But disappointment is not disproof of the underlying impulse, which has great emotional strength, personal fascination, and power of attachment. Disappointment only leads to renewed attempts, with greater vigor, stricter parameters, narrower goals. The idealist who does not examine his or her disappointments and re-evaluate personal goals can, over time, become the zealot, the martyr, the destroyer.

The unexamined ideal, driven to its logical ends, can become the enemy of personal freedom, human growth and expression, and love of humanity.

Hopeful Realists

Republicans, conservatives, traditionalists, fundamentalists—wherever you vibrate along the spectrum to the right of center—are realists. They view the world as an imperfect place that could be, and has been, a whole lot worse. Their glass is half-full because they know that, long ago, and sometimes not so long ago, it was really empty.

They know that human nature can be fairly raw, driven by impulses to seek advantage for oneself, one’s family, one’s tribe or nation. They know that people are neither good nor bad by nature but have been bred and trained to survive. Humans, like all products of evolution,4 are strivers and strugglers against a planet which cares not whether we live or die. The solar system with its collection of loose rocks did in the dinosaurs, and it has no greater regard for the bipedal, talking apes that currently sit on top of the heap.5 In most cases, the realists believe, life is all too often “solitary, poor, nasty, brutish, and short.”6

Human nature, as they perceive it, is about as perfected as it can be right now. They see the world as slowly, naturally—sometimes despite the efforts of messiahs, reformers, and intellectuals—getting better. Where advanced societies of the ancient world once freely practiced slavery and human sacrifice—and justified them as appropriate—our now more enlightened, educated view relegates these practices to backwardness and barbarism. Where people once believed in the anger of gods, our more scientific view posits causes and effects. Where human nature once went untamed and rapacious, society now promulgates laws and teachings that promote equity and balance.

The realist looks back a hundred or a thousand years and knows it was better to be a classical Greek or Roman than an ancient Egyptian or Akkadian, and it’s now better to be a modern Englishman or Frenchman than either of those ancient peoples. The realist looks at the sweep of history, of social order, of technology and sees that positive changes tend to be preserved and embellished while destructive changes tend to die out. The advancement is not perfect, and there are periods of loss and savagery,7 but things are getting steadily better overall.

The realist is backward-looking to the extent that he or she does not want to lose these improvements in social organization, moral order, knowledge, and technology that humankind has built into our current legacy. Proposals to forcibly modify human nature and “remold the clay”—rather than waiting for gradual improvement—risk losing the gains we’ve received from the past. The realist knows all about the law of unintended consequences. He or she believes that, while individual humans may be the drivers of art, science, literature, and invention, no one person is smarter than the aggregate of minds in the society he or she inhabits. The realist knows that societies will try new ideas when they seem right but will also discard those that don’t work. He or she is content to accrete new layers slowly, like a conch building its shell, adopting what does work only after it’s been tried and tested.

Conservatives and traditionalists are not pleased by the remnants of poverty, illness, indifference, ignorance, desperation, and wickedness they can see in the world around them. They are not pleased that the glass is still only half full. But they are hopeful that the upward trend will continue. Despite the occasional slides into war and savagery, they expect that new understanding, new teachings, advances in science, and the upward trend in technology since the Industrial Revolution will make tomorrow better than today. They are hopeful that the human enterprise will eventually overcome all obstacles—perhaps even death and the destruction of worlds.

Of course, those remnants of poverty, illness, desperation, and so on mean that some people are suffering now. Their lives are short, and they cannot wait for human history to glide incrementally to that better place where everyone will be treated equally well. The conservative’s accepting attitude can lead to complacence, to disregard, and to callousness—especially if one is already on the upside of that gradual improvement.

Unexamined preservation and celebration of things past can lead to stagnation. If one is surfing the wave of gradual improvement, one accepts the troughs when the wave subsides and “creative destruction” leaves more losers than winners. To accept that every boom comes with its bust leads to an expectation of recurring dark ages and long slides back into barbarism. Unexamined acceptance of such failures can be the enemy of personal freedom, human growth and expression, and love of humanity.

Which view is correct—the idealist’s or the realist’s? Both and neither. The choice is more than intellectual and emotional—it’s almost glandular. Yet these two aspects are the yin and yang of our human nature. They reflect our simultaneous acceptance and rejection of the natural world and of our place in it. The ultimate triumph of either view, or the abandonment of its opposite, would diminish us as humans.

1. I can be a little stupid sometimes—“a bear of little brain”—and I like to put things in simple terms. This is reflected in my tendency toward lumperdom (pushing things together and finding their similarities) instead of splitterdom (separating things out on a plate into ever smaller, more refined piles and counting up their differences).

2. Echoing the first Jacobins: Liberté, égalité, fraternité!

3. And, of course, not all brothers are loving, caring, and sharing. The fact that siblings are by nature born at different times in the lives of their parents and families, and that their parents may acquire skills or lose patience along the way, tends to create a natural pecking order. And some children can peck pretty hard!

4. Yes, many Republicans believe in evolution. If you can leap over the literal interpretation of the Bible and that whole Six Days thing, you can see that (A) tetrapods all share a common bone structure, all animals share a common molecular structure, and all life shares a common DNA coding system; (B) eons of geologic time and tectonic shift have created changing environmental conditions for which the flora and fauna that were representative of biblical times in the Levant might not have been well adapted; and (C) an intelligent, an observant, a clever God who created a changeable planet might easily have created a mechanism for generational change, so that the creatures inhabiting that planet would always prosper and enjoy their surroundings. From a certain viewpoint, DNA-driven evolution is simply another aspect of His wondrous creation.

5. This formulation, of course, leaves out the whole God question, where He loves us and cares for us and wants the best for us—except when He sends a Mongolian horde to burn down cities at the edge of Christendom, or the Black Death to burn out half the population of Europe.

6. With thanks to Thomas Hobbes, even though he posited the necessity of the all-encompassing Leviathan state.

7. Which pretty much defines the horrors of the 20th century.

Sunday, July 1, 2012

The Politics of Cuteness

Passing under the foot bridge across Interstate 80 in Berkeley the other day,1 I saw a hand-lettered sign: “ECOPALYPSE NOW.” It’s a clever juxtaposition of letters, in homage to an anti-war film from the 1970s. But I’m not sure I know what the person posting it intended to say. Are we to understand that the world is so ecologically damaged that no amount of effort can save us now? Or are we supposed to be rooting for an ecological catastrophe to hurry up our own elimination and so remove a pestilence from the Earth? Or a little bit of both? Or is the poster just being clever because he or she could think of changing the letters around, and let the subtext fall where it may? “We all know the world’s screwed … so enjoy!”2

As I thought about this and tried to interpret the message, I became more and more insulted and irritated. Political discourse in this country appears to have moved on from the political pamphlet, the essay, and the magazine article. We’re even leaving the postcard and the 140-character Twitter tweet behind at light speed. The bumper sticker says it all these days, and soon—as with Ecopalypse Now—the clever bit of wordplay.3

The philosophical proposition is presumed to be proven, and all we need is a reminder.

I find myself most disturbed by this uni-dimensional thinking in terms of the environment and the course of human civilization. People speak casually now of humans as some kind of exponentially spreading virus and of our ability to destroy the planet. Catastrophe is a given. “Human” equals bad. “Civilization” equals bad. “City”—and especially “suburb,” yecch!—equals bad. “Noble savage picking berries and cracking marrow bones after the lions have dined” equals good.

The people who seriously think we’re destroying the Earth need to get out of their apartments in Berkeley. Travel up the Pacific coast and you’ll see a lot of rolling forested hills that are pretty much doing their own thing. Could they be more untouched—totally without roads or fire trails, no selective logging or brush management, no small towns appearing here and there, no dams on any rivers, no sign of human intervention? Sure. But they wouldn’t appear much different, would function very similarly, in fact would be much the same—except in the minds of environmental fundamentalists, for whom any trace of activity later than the Pleistocene is a deadly blot.

Are we losing species and biological diversity? Of course. It happens all the time, with or without the works of humankind. Rivers silt up and change course, creating mud sinks and oxbow lakes: one kind of fish dies and another kind of fish, or a species of frog, takes its place. There is nothing magical about a species, except its fitness for a particular environment. The world was changing long before the first hominid walked out of Africa.

Northern Europe once was covered by dense forests. Humans eventually cut them down for buildings and for fuel, creating the open farmlands of Roman and medieval Europe.4 But before the arrival of the first proto-humans, successive ice ages had stripped the land down to its topsoil.

If you want to see a real travesty, look at the photos Mathew Brady took during the Civil War. Not the carnage of armies, but the background scenery—or lack of it. Battlefields in Maryland and Virginia show barren hills, the result of a dozen decades or more of people cutting local forests first for shelter and then for fuel. These same places are now suburbs and rolling hillsides overgrown with trees, since people have started burning coal and natural gas in power plants far away rather than going out each day with an ax. The result is more songbirds, which thoughtful suburbanites attract and support with offerings of seed and suet.

In the meantime, modern human cities have become a paradise for, of all things, hawks. The cities harbor flocks of pigeons who gorge on handouts and lunchtime leftovers. They offer plentiful ledges on high buildings where, protected from predators, the hawks can raise their fledglings and stoop on the pigeons. Life is making a comeback in the urban canyons.

Think of your own carbon footprint. Carbon dioxide is a trace gas in the atmosphere. By itself, it has negligible effect on retaining the Sun’s reflected heat—the greenhouse effect.5 But it’s a potent fertilizer, promoting plant growth and so creating more ecological niches and biodiversity. This planet was once a much warmer, more fecund place that has since become colder and drier with the various ice ages.6 More carbon dioxide might reverse that age-old global catastrophe.

People who agonize over a few degrees of heat, a few feet of rising sea level, and a gradual northward migration of arable cropland tend to forget that as little as 18,000 years ago humankind lived through devastating winters that had barred us from northern Europe and central North America by generating vast, mile-thick sheets of ice. During those periods of glaciation, the sea level dropped so that large swathes of the continental shelf were exposed, including a “land bridge” where the Bering Strait now lies.7 The surface of the Earth is constantly changing, with or without the help of humans. The best we can do is use our brains and our collective technology to try to adapt and prosper.

People who see environmental catastrophe also see the sweep of human civilization as essentially negative. Instead of steady, successive waves of enlightenment, leading us out of ignorance and tribal narrowness, through accretions of variously acquired teachings and shared skills with increasingly powerful technologies, and with deepening expectations of personal and societal sophistication, they see only mechanization and brutalization. They forget that the world heritage that gave us Hitler and Stalin, the machine gun, biological warfare, and the atomic bomb, also gives us Gandhi and the Dalai Lama, electric lights and telephones, antibiotics and genetics, computers and a global civilization that makes life richer and better.

The dominant environmental meme of the bumper-sticker mindset reduces the vast sweep of our evolving planet to a human-scaled morality play. In this conception, the smokestack emissions of the past 100 years have upset a finely tuned balance in the atmosphere, and the buried and discarded garbage of the past 50 years has poisoned the land and the sea. At the same time, this meme raises up a vision of global collapse and a healing Earth—for example, the History Channel’s “Life After People”—showing that 1,000 years after humans disappear their works and effects will have largely disappeared with them. Somehow, I don’t think you can have it both ways: either the planet is fragile and we can destroy it, or the planet is robust and can shake us off like a mild head cold.

If all of this were just the fantasies of a few people who lack the perspective of the Earth’s long history and knowledge about the nature of human civilization and technology, it wouldn’t much matter. But this meme is now shaping and driving policies that will have real effects on our lives, livelihoods, and future ability to adapt. By considering the Earth to be a fragile, unforgiving environment and humans as a thoughtless, devouring horde, this kind of thinking leads to calls for abrupt and divisive change. Human choice and our collective wisdom, expressed in social norms and exercised through democratic means, are actively discounted. The theories and models of a relatively small group, representing academics, scientists, and politicians, are given undue voice and power. Apocalyptic, catastrophic thinking trumps the evidence of gradualism and adaptation that we can plainly see all around us.

And then some wisenheimer thinks up “ecopalypse now,” as if 5,000 years of civilization and 500 million years of evolution can be so easily summed up, reduced to a clever catchphrase, and resolved in favor of the destruction of humanity … Gaaahhh!8

1. This bridge is a large suspension structure spanning eight lanes and carrying bicycle and foot traffic between the city and the marina area. It has become a familiar place to impede highway traffic by posting signs and waving arms in protest against almost everything about our civilization. At the city end of the bridge is a sculpture in bronze of academics holding up aluminum frames in the shape of book pages to examine the sky, and this celebrates the University of California, Berkeley. At the other end, a group of people fly aluminum frames in the shape of kites, and this celebrates a popular activity at Cesar Chavez Park next to the marina. I am told—but have not yet found with my own eyes—that one of the figures is a defecating dog. It’s all about social commentary.

2. Only after I began thinking about this kind of bumper-sticker politics—and had this blog about half-written—did I bother to Google “Ecopalypse Now” and came across this anti-AGW gem from Russ Vaughn in American Thinker.

3. I’m reminded here of the various word games making the rounds in email and the internet, where you create a new and supposedly ironic meaning by dropping or changing a letter and re-spacing a word. For example, “juxtapose” becomes “just a pose.”

4. The land became so denuded that the word “plantation” has special meaning in Europe. Elsewhere, it might mean a large farm where a single crop is grown. But in certain parts of Europe it refers to a stand of trees—which grow together only because humans have planted them.

5. All the models of future climate that show change due to CO2 concentrations are actually predicting large changes in water vapor—a potent greenhouse gas—that will have been “forced” by modest changes in CO2. If you can imagine a relationship and model it with an equation, does that make it real? (See my blogs “Fun with Numbers” from September 19 and 26, 2010, in Science and Religion.)

6. According to a hypothesis examined in the NOVA segment “Cracking the Ice Age,” airing in 1997, the upthrust of the Himalayas after the Indian subcontinent crashed into Asia 45 million years ago exposed a tremendous amount of new rock to atmospheric weathering. The process supposedly bound a large fraction of the atmosphere’s CO2 in calcium carbonate that ran off in the Ganges and Indus rivers, was absorbed into the shells of marine animals, and was deposited in sediments on the floor of the Indian Ocean. That may have led to a cooling of the Earth’s once temperate climate and the rise of glaciation.

7. Borings through the Greenland ice cap have brought up samples from the sub-glacial soil that include vegetable matter from northern arboreal forests. DNA testing of these samples reveals that these buried forests do not date from the last interglaciation, some 100,000 years ago, but from the period before, approximately 200,000 years ago and perhaps even earlier. This evidence suggests that Greenland’s glaciers are older and more persistent than once thought, lasting through more than one ice age.

8. At this point the narrative is concluded in the hand of Assistant Undersecretary Toadpipe, because his Abysmal Sublimity Undersecretary Screwtape has inadvertently transformed himself into a large centipede.