Sunday, June 24, 2012

Outside Looking In

I find a disturbing trend in our current public discourse:1 areas of human behavior that once were addressed through the principles of morality and the study of ethics have now—slowly, quietly, almost unprotestingly—become matters of merely medical, scientific, legal, or economic interest.

Take all the issues related to human reproduction: attraction, love, sex, fertility, and abortion. Once these were matters of a mostly moral dimension. Is the attraction appropriate for these two people—considering issues of consanguinity, previous marital commitments, and religious compatibility? Is the decision to enter into a sexual relationship appropriate—considering issues of the participants’ ages and level of maturity, their previous emotional involvement, and their ability to care for any potential offspring? Is the choice of bearing a child or terminating the pregnancy appropriate—considering the health and interests of the unborn child and the parents? Such questions were once some of the deepest riddles of human relationships.2 But these days—and perhaps starting with Alfred Kinsey’s research more than sixty years ago—sex and love have become matters of merely psychological interest. Fertility and abortion are only medical and legal issues. Questions of right and wrong seem to be locked out of the bedroom, the clinic, and the debate.

Take the issue of child rearing: expected behavior, education, guidance, and discipline. If ever there was a situation designed for ethical and moral consideration, how to socialize and civilize a wild human so that he or she can take part in the family and in the greater social setting of humanity itself would surely qualify. Children become the kind of adults they are raised among; their upbringing is the basis for how they are expected to behave and for the kind of choices they will make. But today these issues revolve around political considerations, often apparently centered on the child’s legal and civil rights, as if he or she were already a miniature adult with full personal consciousness and independence. Child rearing has become the province of the day-care center and the schoolhouse, and a parent who interferes too vigorously in teaching, training, and disciplining an offspring may get a visit from Child Protective Services.

Take the issue of right and wrong action in and of itself. This is the basis of morality and ethics. How do individuals relate to each other? What is appropriate action in the current setting? What does it mean to be a responsible person? Yet today these questions are subverted by a miasma of confusion—raised like a smoke screen by the proponents of moral and cultural relativism. Such doubt about whether right and wrong have any universal application recently surfaced in a Facebook discussion, which quickly devolved into a cost-benefit analysis. Morality itself had become a subset of economics!

This saddens me greatly. In the transfer of basic human relations from issues of personal decision making to the externalities of science, medicine, and law, I think we lose an important human element.

People exercise their greatest freedom in the choices they make. To be an actor on the stage of human relationships and in the sphere of human endeavor is the greatest differentiator between human beings and animals. Animals do not have the reflective capacity to know right from wrong. They can make only limited choices, based upon their physical ability and available opportunities, rather than upon any moral insight. Animals are creatures of instinct, not reason.

To treat the issue of sex in purely physical terms without moral relevance, to treat child raising as a matter of purely civil rights, to treat moral choices as matters of pure stimulus and response—is to debase our view of human beings. To reduce the human capacity for reasoned action to a psychological or legal or economic basis is to view humans from the outside. They become automata, political or economic units, members of an undifferentiated mass reacting in a theoretical domain.

Now, I am not promoting an anti-science agenda here. I believe human psychology, physiology, medicine, and biology are worthwhile studies with extremely useful results. I also applaud the study and exploration of politics, law, and economics. It was the upward trend in all of these areas, starting during the Renaissance and flowering with the Enlightenment—as well as the basic idea of “knowing” in and of itself—that has given us the comforts and sophistications we enjoy in the 21st century.

And I am not promoting religion here. Morality and ethics can be studied and applied purely in terms of human relationships and transactional equities. They do not rely on any reference to mysticism or the supernatural, on a divine and all-powerful creator and his/her/its commandments, or on any hypothetical judgment and personal disposition to be experienced at the point of transition from this life to someplace else.3

My concern is that by removing the moral dimension of these issues, we remove the human dimension. Sex becomes just something that humans do along with the other animals. Socializing children becomes akin to weaning and raising puppies and kittens. Choice of action becomes an analysis of physical opportunities and perceived consequences. Questions of personal responsibility and personal honor disappear.

A well-raised child absorbs a sense of moral self from his or her parents and family members. Yes, school and peer group also have an influence. But the lessons of the family begin to be ingrained before the child has left the backyard for the schoolyard. An attentive parent—as my mother certainly was—uses pride and shame to shape the child’s reactions. “I’m proud of you.” “I’m disappointed in you.” “We always do things this way.” “We never do that kind of thing.” “You know better than that.” By example and through the pressure of love and parental approval the child learns to know the world.

The child also builds a strong sense of identity as a member of the first circle, the family. This is the place of tradition, the touchstone of identity. A child with a strong moral sense built on parental love and guidance is the one best able to navigate the many choices of action and behavior that will come later in life.

When a child knows who he or she is, and what is expected of him or her by the approvers in that first circle, then that child develops a strong sense of personal honor. “We always do this … We never do that …” gradually becomes “I always … I never …” A child with a strong sense of personal honor is more reliable—and more predictable—when confronted with moral choices than a child who reacts only situationally, seeking the best balance of costs and benefits, or—worse—who only obeys when the minions of an outside law are watching.

This is not some kind of universal morality. Some families may teach “always” and “never” in terms that other families might object to or reject outright. Those are matters of cultural and social norms. But a family which partakes of its surrounding society and culture, and which raises its children with a sense of honor related to the norms of that greater life, does the best service to the society around it.4 Children so raised will know and do the right thing without having to think about it. They will be able to resist temptations to avarice and cowardice more reliably than those raised in a climate without moral relevance, guided only by medical, legal, or economic principles.

Each human, at the core, is an individual acting from a unique moral sense. To view and value human beings from the outside looking in, as mere units of unknown capability and responses in a moral pinball machine, is to dehumanize them.

1. By “public discourse,” I mean in the news media, the blogosphere, and the opinions offered through personal updates and comments in social media. This exchange of ideas, both official and unofficial, has created a new kind of national dialogue and national awareness. As I’ve said before, we’re in a whole new world.

2. Not to mention the plot motivations for some of our best plays and stories.

3. However, I do believe that at the moment of approaching death, if we are allowed time for reflection and a “summing up,” it matters whether our last thoughts be those of pride and satisfaction or remorse and regret. A moral being must expect some kind of final review and comment on the quality of the life he or she has led.

4. And raising children without the experience of this first circle, without the family traditions, will lead to the fracturing of that society. I believe this is one of the problems we are facing in America today.

Sunday, June 17, 2012

Warriors vs. Soldiers

Video games—at least of the shoot-’em-up variety1—teach young people some excellent martial skills: eye-hand coordination, quickness in telling friend from foe, and some theoretical familiarity with the capabilities and ranges of light and heavy weapons. That will make them excellent warriors, but those skills do not and cannot make them soldiers.

The warrior’s way is how the old Norse berserkers fought: pick up an ax, go a little crazy, start chopping, don’t stop until physical damage or unconsciousness ends your spree. It’s how the Germans fought along the Rhine in the first century. Big on valor and individual initiative, hazy on coordination and teamwork, almost impossible to withdraw once the hacking begins. Victory or death.

In ancient times, most battles were fought along the lines of bringing a horde of skilled fighters together at a particular place and time—which was pretty good coordination for nations of shepherds—and letting them fight it out in individual bouts along a more or less narrow front. This worked until the Greek hoplites came up with something better: forming a line of overlapping shields, advancing pace by pace, protecting yourself and the man to your left from the enemy’s haphazard thrusts, coordinating your own spear thrusts to create mechanical advantage. With this system of coordinated fighting, shields got bigger and spears got longer. The Greek phalanx required months or years of training to manipulate, took time to assemble on the field, and was fiendishly difficult to move around corners. But once it got going, the barbarian warriors who faced it could not stand. The Greek phalanx conquered the Peloponnesus, then Macedonia, and then Asia to the Indus.

The Romans did the phalanx one better by breaking it into more maneuverable pieces. These smaller groups of maniples and cohorts possessed different skill levels, armor, and tasks. Swift velites dashed in to confound, trip up, and pin the enemy. Then they stepped back in orderly fashion to let the sturdy and disciplined hastati and principes move in and engage the front lines. When they had worn down both the enemy and themselves, they retreated so that the hardy and heavily armed triarii could advance and mop up the field. Each group was launched at an adversary in a particular order, did its work, and retreated in good order so that the next units could advance and complete the task. Roman soldiers were not expected to die in the service of their country. They were expected to be patient and brave, obey commands, do their jobs, serve their twenty years, and retire honorably on the lands their fighting had won. Even the disciplined soldiers of the phalanx who faced the Roman “mowing machine” could not stand against it. With superb organization and coordination, the Romans conquered the Greeks and the known world.

That sort of coordinated fighting—steady soldiers doing their jobs, advancing shoulder-to-shoulder in step, firing on command, holding the line—survived until the age of the high-explosive shell and the machine gun. Medieval knights, British and French regulars, Union and Confederate brigades, they all perfected the mass charge, moving in measured cadence to overwhelm a less well organized enemy.

But even the machine gun and the explosive shell did not bring back the warrior class. The modern battlefield is an outgrowth of the terrible debacle that was World War I, where men tried to make mass charges against devastating weapons and were slowly driven underground, into trenches and bunkers. They still tried to charge out of the trenches—“over the top”—and died for their effort. The power of the machine gun and high explosives was a shock to everybody: if the enemy could see you and shoot at you, you died.

In World War II, new tactics developed for the “invisible battlefield.” Rather than forming long lines, marching in step, and firing on command, the successful platoon and company disappeared behind cover. Lead elements took a position, set up their automatic weapons and mortars, and performed “overwatch” on the enemy—that is, they remained hidden and fired to encourage the enemy to stay down behind cover or in their foxholes. Follow-up elements advanced in the lull while the enemy was kissing the dirt, then these elements would find cover for themselves, set up overwatch, and clear the field for rear elements to move up. Anyone who has seen a police SWAT team advance down the street or take a building understands the principle: unit one provides covering fire; unit two advances, takes cover, provides covering fire; unit one advances … moving inexorably forward. The bullets are flying only when everyone has their head down.

In the latest wars in Iraq and Afghanistan, the field of battle is more often a village or town than an open field somewhere. The “invisible battlefield” remains the surest tactic for a soldier to do his job, stay alive, and go home at the end of a tour.

This sort of fighting requires planning and coordination, cooperation and teamwork, communication about group objectives, and a shared sense of pacing. It’s a different kind of teamwork from the fighting line. It requires more initiative and awareness from the individual soldier, who must identify and take potential points of concealment, spot enemy locations and points of fire, then offer intelligent covering fire. It’s more teamwork, not less, and there is no place in it for the individual heroics of the barbarian warrior.

Add to this basic coordination of movement and application of force the new technologies available to the modern soldier: rapid-fire rifles, light machine guns, grenade launchers, laser sighting technologies, surveillance drones, instant person-to-person and hierarchical, command-level communications, coordination with aerial and armored units. The flexibility of the individual unit and its coordination with other units spans the battlefield and sometimes the whole theater of operations.

In this environment, the individual soldier’s training with his—and now likely to include her—gear and its underlying technology, his sense of tactics and teamwork, his understanding of the mission and its underlying strategy make that soldier a more valuable fighting unit.

Consider just firepower alone. The average soldier of the Napoleonic era—early 1800s—carried a smooth-bore, muzzle-loading musket. The soldier manually assembled the charge of powder, patch, and ball, driving it home with a ramrod. It fired either with a flint and striker, which had to be primed separately, or with a percussion cap containing fulminate of mercury, which had to be fitted to a nipple at the trigger end of the barrel. A skilled soldier in practice on the firing range might get off four or five rounds per minute with accuracy at perhaps 100 to 150 yards. Under the stress of battle, that would drop to two or three rounds per minute and accuracy out to about 80 yards.

Fifty years later, with the American Civil War, rifles were becoming more common. Rifles have spiraling “lands and grooves” inside the barrel that spin the bullet and increase its long-range accuracy. But pushing the bullet down the length of the barrel from the muzzle end against the pressure of that spiral and its closer tolerances increased the reload time. The Civil War was also beginning to see the introduction of breech-loading rifles and even some repeating rifles with manufactured cartridges that put the primer, powder charge, and projectile all into a single brass casing.

In either era, artillerymen and grenadiers—men trained to throw bombs, which were essentially iron casings filled with black powder and ignited by a lit fuse—were specialists who did not march in line with the musketeers and riflemen.

Eighty years later, in World War II, the M1 Garand was the soldier’s weapon: a clip-fed, gas-operated, semi-automatic rifle in which the exploding gases from firing one cartridge operate a mechanism to eject its brass shell and bring the next cartridge into the firing chamber. It could fire eight rounds as fast as you could pull the trigger and had an effective range of about 400 yards. The soldier was also equipped with fragmentation grenades that ignited by pulling a pin and releasing a flipper handle, with a range of about as far as you can throw a rock.

Twenty years after that, the U.S. soldier was equipped with the M16, which could be fired fully automatic with a cyclic rate of about 700 rounds per minute and a range of 600 yards. Of course, the original magazine held only twenty cartridges, so the soldier selecting full automatic ran himself out of ammunition in a couple of seconds. Later models were limited to three-round bursts.
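
To see why, run the arithmetic: a minimal sketch in Python, using the 700-rounds-per-minute cyclic rate quoted above and the original twenty-round magazine.

```python
# A minimal sketch of the cyclic-rate arithmetic. The 700 rounds per
# minute figure is the one quoted above; the 20-round capacity is the
# original M16 box magazine.
CYCLIC_RPM = 700        # rounds per minute on full automatic
MAGAZINE_ROUNDS = 20    # rounds in the original magazine

rounds_per_second = CYCLIC_RPM / 60
seconds_to_empty = MAGAZINE_ROUNDS / rounds_per_second
print(f"Magazine empty in {seconds_to_empty:.1f} seconds")  # about 1.7 s
```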

Modern weaponry includes lightweight machine guns and small grenade launchers that give a rifleman the capability of a short-range mortar, and—coming soon to a theater of war near you—“smart” bullets that can be sighted on a target and programmed to explode a certain distance from the muzzle according to the number of revolutions the bullet makes in flight. You no longer have to be able to see the enemy to hit him and cause severe damage.
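
The revolution-counting trick is simple arithmetic: rifling spins the bullet once per fixed length of travel, so counting turns is counting distance. Here is a hedged sketch; the one-turn-in-seven-inches twist rate is a common military rifling figure, assumed here for illustration rather than taken from the specification of any particular smart round.

```python
# A hedged sketch of a revolution-counting fuze. The 1-in-7-inch twist
# rate is an illustrative assumption, not any weapon's actual spec.
TWIST_INCHES = 7.0        # rifling: one bullet revolution per 7 inches
INCHES_PER_YARD = 36

def revolutions_at_range(yards: float) -> float:
    """Revolutions the bullet completes over a given distance downrange."""
    return yards * INCHES_PER_YARD / TWIST_INCHES

# Program the fuze for this count and the round bursts at 300 yards,
# say, just past the wall an enemy is crouching behind.
print(f"{revolutions_at_range(300):.0f} revolutions at 300 yards")
```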

A single modern soldier has the firepower of a World War II squad and—with the right assortment of weapons and a little elbow room—a platoon of Civil War soldiers and perhaps a company of Napoleon’s Imperial Guard.
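
Counting rifle fire alone, and leaving aside the grenade launchers and light machine guns that do much of a modern squad's work, the ratios are easy to sketch. The practical rates below are my own rough assumptions for sustained aimed fire, drawn loosely from the figures above.

```python
# Rough rifle-fire ratios across eras. The practical rates are
# illustrative assumptions for sustained aimed fire, not doctrine.
practical_rpm = {
    "Napoleonic musketeer (under fire)": 2.5,
    "Civil War rifleman (muzzle-loader)": 3.0,
    "WWII rifleman (M1 Garand)": 24.0,
    "modern rifleman (burst fire)": 90.0,
}

modern = practical_rpm["modern rifleman (burst fire)"]
for label, rpm in practical_rpm.items():
    print(f"one modern rifleman ~ {modern / rpm:.0f} x {label}")
```

Thirty-odd muzzle-loaders to one modern rifle is already platoon territory; add the launchers and machine guns, and the squad and company comparisons stop sounding like hyperbole.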

What does this kind of capability and coordination imply? First, the term “cannon fodder” becomes a total misnomer. Soldiers are no longer expendable, replaceable units who are expected to walk into certain death. The idea of drafting young men for mass charges as part of large armies is absolutely antique. The last thing you want in the modern army is a petulant, resentful conscript whom you have to drill into performing rote actions without thinking and on command. Instead, you want tech-savvy, team-oriented specialists. You really don’t want them to “die for their country” (although they’re prepared to) but to make the other fellow die for his.2 In that respect, almost all soldiers are modeling their training and tactics on the elite Delta Force and SEAL teams. Those are the soldiers of the future.

Second, with teamwork and coordination at a premium, the old notions of unit formation go out the window, too. In Vietnam, individuals were cycled into the country for one-year tours and, when their time was up, cycled back out. Any fighting unit was a mix of new-in-country greenhorns and hardened, short-time veterans. The average soldier of the former variety didn’t even start to make friends until he had survived about six weeks of combat and gotten all the bad luck and carelessness out of his system. That’s lousy for teamwork. So now units train together, deploy together, and end their tours together wherever possible.

Third, when the modern soldier returns to civilian life, his or her skills are more in line with the modern requirements of the workplace: awareness of mission, loyalty to team, flexible response to obstacles, willingness to take personal initiative, seriousness of purpose. Willingness to kill and be killed might still be part of the mix, but not the most important part.

Increasing capability has made soldiering a matter of professionalism. The warrior mythos may be part of soldiers’ dreams, but it has little to do with their reality.

1. Perhaps the kind of online gaming where an individual fights alongside others in a consensual universe will teach some of the skills whose absence I lament here in the gaming soldier. I’m not a denizen of these new gaming emporia, so I don’t know. Certainly, if offline individuals start forming online teams, the benefits will develop with continued success.

2. With respectful thanks to General George S. Patton.

Sunday, June 10, 2012

On Going Negative

Over the years, various book editors, business communicators, and writing group members have noted and commented on my tendency to define situations and objects with a negative slant: “It wasn’t your typical Saturday … This wasn’t the treatment James was accustomed to … It’s not your father’s Oldsmobile.” They find this sort of negativity objectionable, defeatist, pessimistic. They want me to be positive, upbeat. State what the thing is, rather than what it isn’t. And I try—I do try.

But for me, to speak and write only in positive terms is working rhetorically with only half a palette. White without any black or grays remains undefined. Light without the shadows it creates would be indiscernible. To say what a thing is tells only half the story; the fuller description also includes what it is not, chooses not to be, or will never be. This usage identifies the negative as a form of context.

And it’s not that I am actually defeatist or negative in my outlook … (There it is again!) I generally have a very positive outlook, both politically and economically. I believe that the world and the human situation are generally improving, despite some current and perhaps enduring setbacks. Places that we used to think of as economically backward when I was a child, places with overcrowding, mass starvation, and hopelessness, places like India and China, are actually doing much better and taking care of some, if not all, of their people. Technology is providing the human race with better crops, more energy, more accessible communication, more opportunities for individuals to learn and know and express themselves. The world is a richer place than when I was growing up.1 We will still see local and temporary failures, of course, like the current money bind in Europe and the United States, but the long-term trend is upward. So I am not a pessimist.

There it is again. And in a strange kind of double negative. If “pessimist” is something negative, then I am not something negative. Double negatives are like the mathematical trick of multiplying two negative numbers and creating a positive. Why not just say “I am an optimist”?

Well, because sometimes that goes too far. To call someone an optimist is sometimes to imply a failure of judgment. The word “cockeyed” fits too easily in front of it.2 Rose-colored glasses. Worn by Pollyanna, the Glad girl. Candide wandering in this best of all possible worlds. Ain’t me, babe.

I know that two paragraphs back I talked about the world improving. But it’s a sliding scale, two steps forward, one step back. Hate and malice, envy and greed, superstition and fear are all still with us. The wealth of information available on the internet has also spawned the biggest, fastest rumor mill the world has ever seen. We keep blowing economic bubbles and then wondering what has happened when they burst.3 I’m not an optimist, nor a pessimist. I am—to put it positively—a neutral realist who gathers facts and checks off boxes.

My real faith, the source of my optimism, is that western civilization has survived the fall of Rome, the Mongol invasion, the Black Death, and the Spanish Inquisition. Eastern cultures survived the eruption of Krakatoa and the Japanese tsunami. Not everyone survives on these occasions, but life goes on. And we still live on a little green planet with the amount and hue of sunlight for which our eyes were adapted, atmospheric pressure for which our lungs were adapted, gravity for which our feet were adapted, and day-to-day problems for which our brains and hands were adapted. Things will get better—but sometimes slowly.

And note that I’m still talking negatively—invasions, death, and eruptions—rather than dwelling on the positives like the invention of the printing press, the glories of Renaissance art, the human flowering of the Enlightenment. Why is that? Because, at heart, I am also a contrarian.

Ornery me. When the crowd is cheering, I tend to go quiet. When everyone is rushing for the exits, I sit down. When everyone else is buying a new house or refinancing the old one to dip into that “equity line of credit,”4 I hang back. The road that’s well traveled is likely to be crowded, and I instinctively avoid crowds.

I miss a lot of opportunities that way. I also get strange looks from some of my friends. But I haven’t fallen for too many hoaxes, or gone broke from buying a tulip bulb. When everyone is cheering on house prices that exceed the traditional “three times your annual earnings,” I start asking what’s wrong with this picture. That sort of dispassionate squint can be useful for a survivor, or a novelist.

For me, that bit of negativity is not Eeyore’s depressed and perpetual groan. Instead, it’s an element of doubt. Is the world only what we can see and touch? Is the situation only what we think it is? In a political and economic environment full of promoters and marketers, partisans and pushers, each wanting to channel the crowd through this door or that, a tendency to doubt, to look deeper, to seek hidden motives and agendas, is again useful, both for a novelist and a survivor.

And finally, I believe in the basic mystery of life. The Buddhists tell us that the world is illusion. What we can see and hear, touch and smell, is only a shadow in the mind, interpreted from the evidence of our six senses.5 Further, our theories about what we can see and touch, whether based on our imaginations or our mathematics, arise from someplace even deeper in the mind, not from any awareness that’s closer to the external world.

Our brains are mechanisms of layered complexity. Out of the complications of our personal reactions to our perceived experiences of cause and effect, we build up a world view and a personality that exists within it. Layer on layer until we come to believe that the “I” which inhabits this chemical pot and electric box actually exists.

But what’s “out there,” reflected in the photons that are impacting our retinas and the electromagnetic pressures that are pushing against our fingertips, is still a mystery. It’s a mystery we dance with every day of our lives.

So the ultimate message of my doubting, negative side is really a perpetual reminder: “The world is not what you think.”

1. Consider information and entertainment. When I was in school, if you wanted to find things out, you had to go to the library, maneuver through the Dewey Decimal System, locate the book or magazine on the shelves, take it out, use the table of contents or index, and then read the information from the printed page—which might not have been updated in the past ten or twenty years. If you wanted to hear music, there were records and the radio, rather than having to locate a performance and buy tickets at the concert hall, as my great-grandparents would have had to do. But if you wanted to see a movie, you still had to show up in a certain place, on time, and buy a ticket. Oh, and you could see it on television—three years later, at whatever time the network happened to show it, in black-and-white, and cut up with commercials. Today, all of that information, all that music, all those movies and even the television shows themselves are available through a computer and, more and more, on a telephone that talks without wires. Just amazing!

2. Thank you, Rodgers and Hammerstein.

3. Do you want to see a silver lining in all this? Well, what is a bubble? From the Dutch tulip mania, to the Tech Bubble of the late 1990s, to the Housing Bubble of 2008, a common pattern emerges. When you think about it, people with too much money—or access to too much credit, which only means that someone else has the money to loan—are chasing too few, and so overly valued, items in a particular class. Flower bulbs, internet startup companies, desirable houses—people with ready money are now always chasing something.
     When you consider that the basis of all human economics for 100,000 years, until we settled down in river valleys and started growing things, was want and hunger, that’s a pretty positive development. Hunter-gatherers might gorge themselves when the game was plentiful and the berries ripe, but for the rest of the time life consisted of walking from one bush to the next looking for something to eat.
     Consider that even in the river valleys, the good times were when the granaries were full, the bad times when the harvest failed and they went empty. Today, our downturns, our recessions and even our depressions, are not due to crop failure or declining production. Instead, we go into economically negative territory when demand fails, when the warehouses fill with unsold goods, and when productive factories must go on short work weeks. That’s 5,000 years of economics turned on its head. With the productive capacity that ever-rising levels of technology bestow, we’ve entered a whole new world, a new kind of economics.

4. Which my mother would have called what it is: a second mortgage. For my folks, those words were “the worst thing you ever heard.” And yes, I’ve taken on debt in my life. I’ve bought a few too many toys on my credit cards and then had to delay other gratifications while I paid down the balance. But I never put my living situation at risk. That is a very bright, sharp line.

5. Okay, what’s the sixth sense? We all know the traditional five: sight, hearing, touch, taste, and smell—with the last two, while interpreted through different organs, actually partaking of the same chemistries. But within our ear, and having nothing to do with detecting sounds, is a mechanism for directly perceiving gravity. Our sense of balance, which is separate from and sometimes in conflict with the evidence of our eyes, is the sixth human sense.

Sunday, June 3, 2012

Schrödinger’s World

Sometimes I get strange ideas, notions that flit through my mind and occasionally stick. One idea that flits across now and then, mostly when I go out for a motorcycle ride and then—contrary to the knowing predictions of my family and friends—come back alive, is that maybe there are an endless number of probabilistic choices occurring all around us and forcing the endless creation of new universes. Maybe we all live in Schrödinger’s World.

Erwin Schrödinger proposed his famous thought experiment in 1935. Put a cat in a box with a vial of cyanide and a triggering mechanism based on radioactive decay of a single atom—a truly random event. Close the box and wait a while. During your wait, the cat may have died, or not. The point is, you cannot know whether the cat is alive or dead inside the box until you open the lid.1 From your point of view then, the cat is neither alive nor dead—it is both alive and dead—until you open the box. The cat exists in a “superposition” of two states, an unresolved wave function, a state of indeterminacy.2
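
You can make the indeterminacy concrete by mimicking the box with a random-number generator. A minimal sketch, assuming a hypothetical one-hour half-life for the triggering atom and a one-hour wait; nothing is resolved until the function that stands in for opening the box is called.

```python
import random

HALF_LIFE_HOURS = 1.0   # assumed half-life of the triggering atom
WAIT_HOURS = 1.0        # assumed time the box stays closed

def open_box(rng: random.Random) -> str:
    """Observe the cat: did the atom decay while the box was closed?"""
    p_decay = 1.0 - 0.5 ** (WAIT_HOURS / HALF_LIFE_HOURS)
    return "dead" if rng.random() < p_decay else "alive"

rng = random.Random()
trials = 100_000
dead = sum(open_box(rng) == "dead" for _ in range(trials))
print(f"Cat found dead in {dead / trials:.1%} of observations")
# With a wait equal to one half-life, expect roughly 50 percent.
```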

Schrödinger was making a point about affairs at the subatomic level, where the act of observation tends to interfere with the experiment. There are certain things in science—particularly in quantum mechanics—that you simply cannot know. If you want to know where an electron is at any particular instant, you have to impact it with some other particle, like a photon, in order to “see” it. Bang! There’s your electron. But the impact itself sends the electron off in a new direction. So you can know where the electron is, or where it’s going, but not both at the same time.3
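
What that paragraph paraphrases is Heisenberg’s uncertainty relation, which in its standard textbook form puts a hard floor under the product of the two uncertainties: pin down the position (small Δx) and the momentum (where it is going) must blur (large Δp).

```latex
% Heisenberg's uncertainty relation (standard textbook form):
% the uncertainty in position times the uncertainty in momentum
% can never fall below hbar/2. Shrink one and the other must grow.
\Delta x \, \Delta p \;\ge\; \frac{\hbar}{2}
```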

This kind of observational blindness occurs only at the subatomic level. On the everyday level of darting cats and speeding cars, the object to be observed has enough mass of its own not to be deflected by its impacts—either casual or intentional—with photons.4 But, of course, Schrödinger has the last laugh, because there are myriad events in the universe about which you don’t know the outcome—sometimes you don’t even know of their possibility—until you look. The existence of water under the sands of Mars, the existence of life under the ice on Europa, the fate of stars at the galactic center … all are mysteries subject to endless wonder, speculation, and disputation until someone goes there and actually observes.5

What does all this have to do with motorcycle riding? Whenever I go out on the road upon two wheels, I become subject to unknown probabilities. Of course, all life is subject to unpredictable events like blood clots, stray shots, and falling flowerpots. But on a motorcycle the odds of unexpected entanglements increase from perhaps one in 10,000—the random events of everyday life—to something like one in 1,000 or worse. Even when practicing constant vigilance, wearing protective gear, maintaining my margins,6 and observing my doctrine,7 I am still at risk for the unexpected: a sudden, unobserved pothole and loss of control; something falling off a truck; the driver who impulsively decides he likes my lane better. When I ride out, I know I just might not come back. Today might be the day I go under all eighteen wheels—or at least nine of them—of a semi. When I return to my parking stall at the end of a ride and put down the kickstand, I know that once more I’ve beaten the unpredictable odds and am back at the old one-in-10,000 until the next time I ride.
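
Taken at face value, the gap between those two odds compounds quickly over many outings. A back-of-the-envelope sketch, with the per-outing probabilities from above and the count of 500 outings as an illustrative assumption:

```python
# Probability of at least one bad event in n independent outings,
# using the per-outing odds quoted above (illustrative, not actuarial).
def at_least_one(p_per_outing: float, n_outings: int) -> float:
    return 1.0 - (1.0 - p_per_outing) ** n_outings

for p, label in [(1 / 10_000, "everyday life"), (1 / 1_000, "riding")]:
    print(f"{label}: {at_least_one(p, 500):.1%} chance over 500 outings")
# everyday life: about 4.9%; riding: about 39.4%
```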

But then it occurs to me: what if I didn’t avoid the unpredictable? What if that garbage truck I passed8 today really did drop a flowerpot on my head? The event is random and the odds, like the atomic decay in Schrödinger’s box, are incalculable. This gives rise to a disturbing thought: suppose that, at the instant the truck’s load, unseen by and unknown to me, shifted and went fifty-fifty on the prospect of ejecting a flowerpot or other lethal object, I entered a state of superposition. For an instant of time—which is all the time probability needs—I existed in two states at once: one in which the flowerpot remained on the load bed and I zoomed safely past, and one in which the flowerpot left the load and caught me full in the face, causing me to backflip off the bike or forcing me to swerve violently, putting me down on the pavement and under the wheels of that semi in the next lane.

If the cat and I can enter a state of indeterminacy, both alive and dead, then we can exit that state in either condition. Maybe we exit it both ways: in one rendition of the universe, the cat comes out of the box screaming and scratching, and I pass the garbage truck with a thankful, satisfied grin; in another, parallel, equally possible rendition of the universe, the scientist picks up the limp, dead cat, and I exit the realm of the living, the place, according to the Buddhists, of impermanence and illusion. Two worlds proceed from the same point. In one I later park the bike, take off my helmet, and exult in having experienced a great ride. In the other, I leave behind a grieving family whispering, “Well, we knew it would happen one day,” along with a pile of unfinished manuscripts, some minor debts, an unrealized IRA, and unclaimed Social Security benefits.

For me and mine, as for the cat, the consequences are very real. For the rest of society, the seven billion other people on the planet, and the whoevers evolving on planets light years distant, the matter remains unknown, the effects inconsequential. A few books that might have entertained some readers won’t get written. The Social Security fund is extended into the black for a few more nanoseconds. Otherwise, no appreciable change in this universe.

But still, the live-Thomas universe might experience some far-reaching effects. In the universe where I go on to write more books, there exists the possibility of changing a future reader’s thoughts, and he or she might then go on to change the future course of that universe. The dead-Thomas universe lacks this potential. They really do become two entirely different universes, not just a minor variation of life in the same place.

So the point in time at which the flowerpot does or does not eject from the truck is the budding point for two of the many possible universes that sit side by side in whatever vast, echoing null-space contains the multiverse.

Multiply my experience, today, with one effect, the flowerpot, by a million such problematic points in seven billion lives. Throw in the imponderables of earthquakes, tornadoes, and volcanic eruptions. And you have a multiverse that is constantly calving off new universes, new paths that were taken or not, new results written on the pages of personal diaries and acted out on the stage of history. It was a meditation on this sort of endless budding multiverse that led me to the metaphysics of time in The Children of Possibility.9

How far does reality extend? Does the universe change if you flip a coin when no one is watching and you don’t bother to look at the result? Does the order of cards matter when you’re playing solitaire? Or when you’re betting only penny-ante stakes in blackjack? Do some of us inhabit these tiny, mutual worlds that exclude the rest of humankind and the fate of all the stars in the sky? Or do they all end up in the ocean of the multiverse? If some of these events lack the power to change history, then what is history when no one is watching?

As I said, sometimes I get strange ideas, notions that flit through my mind and occasionally stick.

1. Well, okay, it’s a thought experiment. Everyone knows what happens if you try to put a live cat into a box and close the lid: yowling, meowing, hissing, scratching. This goes on until you open the lid—unless the vial cracks first, in which case you hear a sudden urk! and then nothing. In any event, you have a pretty fair idea of a cat’s state of mind at all times.

2. Note: No animals were harmed during the making of this thought experiment.

3. Recent discoveries in quantum mechanics suggest that subatomic particles may actually exist in multiple states or locations simultaneously, subject to interference and entanglement. This leads me to think that either our understanding of the nature of the universe, or our structure of rational thought, may be incomplete.

4. The social sciences also suffer from a form of observer’s paradox. If people know they are being watched or questioned, they tend to behave and respond differently than they would “in the wild.” Sometimes simply the form of a question—some hidden bias in the wording—deflects the respondent’s thoughts and influences the answer. This is why sociology and opinion polling are considered to be art forms.

5. Put another way, Mars right now both does and does not harbor water. Europa both does and does not harbor life. Our premises are in a state of superposition until resolved by observation.

6. Every motorcyclist knows about margins: maintain a two-second distance from the car ahead (two seconds is the time, at any speed, from when a roadside object like a fence post passes the car’s rear bumper to when it reaches your front wheel); don’t let anyone tailgate you; watch the three-foot spaces beyond your knees. If your margins are in place, then someone has to cross that gap to get to you.
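
For the curious, the two-second gap converts to distance with simple arithmetic; the speeds below are assumed examples.

```python
# Two-second following distance in feet at a few assumed speeds.
def two_second_gap_feet(mph: float) -> float:
    feet_per_second = mph * 5280 / 3600   # mph to feet per second
    return 2 * feet_per_second

for mph in (30, 45, 65):
    print(f"{mph} mph -> about {two_second_gap_feet(mph):.0f} feet")
# 65 mph works out to roughly 190 feet of cushion.
```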

7. What is “doctrine”? For me, it’s the body of rules and observations that guide the “reaction” step of my SIPRE process (see-interpret-predict-react-execute; cf. SIPRE as a Way of Life from March 13, 2011). The basic doctrine on other drivers is in three parts: “1. No one’s looking out for you. 2. Even if he’s looking right at you, he doesn’t see you. 3. Even if he sees you, he doesn’t care.” On interaction with bouncing balls: “If a ball bounces into the street, steer ahead of it. Behind it will be coming either a small child or a dog.” Every motorcyclist builds his or her own body of doctrine, which is merely thinking about things before they happen.

8. More doctrine: “Do not ride behind or alongside garbage trucks, frozen-fish trucks, or cattle trucks. They leak. And in the case of garbage trucks, things tend to fall off.”

9. “Time is not a river …”