Sunday, August 25, 2019

Leadership

Conductor hands

If bringing people together is an art form—and I believe that helping people work efficiently together certainly is—then the highest level of that art is leadership.

A “leader” may be defined, in the broadest possible sense, as one who accomplishes work or effort through the active participation of others. This definition would immediately rule out, then, the sort of supervisor, manager, or even chief executive who is so smart, experienced, or goal driven that he or she shoulders aside subordinates who are trying to do a task and completes it him- or herself, so that it gets “done right.” Such a person is not a leader but a star performer surrounded by frustrated assistants.

This is not to say that the leader cannot take a hand in the effort. After all, when invaders are pouring in over the walls, everyone picks up a rifle and shoots. And yes, sometimes, the leader has to lay the first brick, make the first cut with the axe, or otherwise physically participate. This might be a ceremonial start to the project. Or it might be the leader demonstrating how he or she wants the task to be performed. And it may serve the psychological purpose of showing that the leader is not above getting his or her hands dirty. But for the most part, in day-to-day operations, the leader’s role is hands-off.

So how does a leader perform work “through the active participation of others”?

At first, it would seem that the leader simply gives orders. And this might work in situations and institutions where the command structure is clearly defined, enforced by rules and sanctions, and embedded by long-standing practice. Think of the Roman Catholic Church or the U.S. Army. Bishops have absolute authority over parish priests, and archbishops and cardinals have authority over bishops. Sergeants have absolute authority over privates, and lieutenants have authority over sergeants. And so on up and down the rigidly prescribed chain of command.

Except … a weak or vacillating bishop may have no effective control over the priests under him. A green or unskilled lieutenant may have no influence with an experienced, battle-hardened sergeant. Every organization, even the most hierarchical, has ways for underlings to evade or obstruct the orders they don’t like or can’t follow. In some cases, and with collusion, this might be a “white mutiny,” in which subordinates follow stupid orders to the letter until the whole effort—along with the command structure—just falls apart. And in the Vietnam War, there were supposedly incidents of “fragging,” in which soldiers rolled live grenades into the tents of incompetent officers who they feared would get them killed in battle.

While simply giving orders can work in unusual or extreme situations, where other forms of leadership might take too long or be misunderstood, there is a further caution. In one of my favorite books, Dune by Frank Herbert, Duke Leto Atreides advises his son: “Give as few orders as possible. Once you’ve given orders on a subject, you must always give orders on that subject.” This is because people—especially those in a functioning command structure—can be fearfully literal-minded. They note and observe where a leader or a person up the line from them places his or her interest, attention, and focus. Written or verbal orders are specific and purposeful, and they will tend to be obeyed until rescinded or countermanded. And if the situation surrounding those orders changes in the slightest, the person receiving them will expect a new order amending the previous order. Otherwise, and fearful of doing something wrong or displeasing, they will follow the original order or wait, frozen, until the countermand arrives.

The trap with direct orders is always that they undermine personal initiative. “The chief said to do this and, so help me God, that’s what I’m doing.” Sometimes, perhaps with front-line soldiers in a shooting war, that kind of unthinking dedication and persistence is valuable. But few of us are engaged in a shooting war.

American business culture over the last seventy or eighty years has undergone a sea change. Coming out of World War II and into the boom times in which most of the returning soldiers found civilian jobs, the military-oriented, top-down, command-and-control structure was dominant. It worked to beat the Germans and the Japanese, so why not apply it to the assembly line and the sales floor? And that worked, too—so long as the boss was all-seeing and always right; the jobs were correctly structured and easily performed; and unusual situations were rarely encountered. It also worked because America was in its prime, being the last industrial country left standing, after Germany and Japan had been bombed flat, and England, France, and the rest of the world were exhausted by war. It was a good time for us Baby Boomers to grow up, because American business could make few lasting or damaging mistakes.

But soon, under pressure from rising global markets, advancing computer controls and information technology, and competition in a shrinking internal market, the rigid, top-down model started to falter. Employees who could only follow orders were not as nimble, flexible, or effective as those who had the initiative to identify problems and either fix them on the spot or recommend solutions to supervisors who tended to trust—rather than question—their judgment. Getting things right the first time on the assembly line avoids a lot of lost time in quality control, remediation, and recalls. Solving the customer’s problems on the sales floor avoids a lot of exchanges and restocking—not to mention loss of the customer’s good will.

The need for personal initiative has followed this cultural change back into the military. The modern army unit is no longer made up of people who shoulder a rifle, march in straight lines, and charge on command. The modern naval ship no longer needs teams of people who only pull on a rope when they are told. The modern battlefield is interconnected, data-heavy, and invisible to the naked eye. It is the place of “combined arms,” where soldiers on the ground coordinate intimately with air support, indirect artillery, and electronic intelligence sources. The modern soldier must be a team player, versed in complicated weapons systems—some of which he or she never personally handles—and react to a changing environment both flexibly and nimbly. The modern sailor is a technician operating one of the most complicated and versatile weapons systems ever made.

So how does a leader in either the modern business model or the modern military accomplish work “through the active participation of others”? Primarily, through articulating, setting, and enforcing values. Values are better guides to individual initiative than any set of standing or direct orders. Values last longer in the mind and memory. And when they are correctly stated and willingly adopted, they are more flexible in the face of emergent and changing situations.

The first value a leader must articulate, demonstrate, and encourage is dedication to and respect for training. The modern soldier or employee needs to know every aspect of his or her own job, and it helps the person take initiative if he or she also understands, and can on occasion fill in for, the jobs of the people on either side, with whom he or she deals directly, and the jobs of those above, so that the employee understands the reason for and purpose of their orders and directions. Further, the employee should be educated in broader issues like the company’s or the unit’s mission, the needs of customers, the nature of the competition, and the factors leading to success. Some companies try to do this in blanket form with some sort of universal, homily-laden “mission statement” printed and laminated in posters that hang on the wall. But that’s the easy way out. If the mission is going to become a value embedded in the employee’s mind and heart, then it needs to be taught and refreshed at every opportunity.

The complementary value to training is trust: for the leader to trust that the employee has absorbed this knowledge and to encourage and reward the employee when he or she shows initiative. Without trust and encouragement, the employee will suspect that all of the training he or she has received was some sort of management gimmick. That Human Resources is manufacturing another “policy du jour,” a temporary thing like all the others, and what the company really wants is unquestioning obedience. That attitude taps into the vein of cynicism that lies not far beneath the skin of any alert and educated human being. And that sort of cynicism is poison to personal initiative.

Ultimately then, the leader establishes performance standards. This is the hands-on part of the job. The leader voices these directly: “This is how we do things.” “Here is what I expect.” And the leader rewards and publicly praises good performance: “That was well done.” “This is what success looks like.” Setting standards is not a do-once-and-done process. It cannot be achieved with a memo or a motivational poster. It is an active, repetitive, daily flow of discovery, example setting, and encouragement. The leader looks for occasions to teach and train to standards. Only with repetition and reinforcement do the values in all their varied applications become part of the team’s understanding and sense of purpose.

Modern leadership is more about persuasion than power. It treats people as willing participants with minds of their own, rather than as slaves, servants, or “meat robots” to be programmed. It is a personal relationship between supervisor and employee, between officer and soldier or sailor. It is built on mutual trust and respect, rather than on fear and obedience. And it takes conscious effort and an appreciation for the science of psychology and the art of communication to establish this relationship.

But once the bond is made, it is stronger than any other. People will literally die for it.

Sunday, August 18, 2019

Grace

Graceful hands

If living is an art form—and I believe that living well certainly is—then the highest level of that art is what I call “grace.”

In religious terms, grace means having the favor of, making oneself pleasing to, or benefiting from the generosity of God. These are acceptable meanings, but not the subject of this discussion. Religious grace is a state that comes from, or is predicated on, or defined by the outside influence of the Almighty. Similarly, in archaic terms, a sovereign might grant a pensioner or a loyal servant grace along with some form of bounty. But again, I’m not talking about the grace that comes from outside.

A person of any religious or political stripe has grace, and can be considered gracious, if he or she adopts a particular set of values and their corresponding behaviors. So let me attempt to list them.

First of all, a gracious person is interested in and sympathetic toward others. He or she lives outside of him- or herself. Such a person, when approached, offers a smile; when asked for an opinion, offers encouragement and support. He or she cares about the impression made upon other people. More important, he or she wants other people to feel secure, comforted, and at ease. The gracious person creates an atmosphere of calm and contentment.

This is not to say that a person with grace has no inner life, or sacrifices everything for the people in his or her immediate vicinity, or never says “no” to a request. But any necessary “no” is always framed in terms of regret rather than righteousness. The protection that a gracious person puts in place around his or her inner life and personal substance is understated if not actually invisible. This is not deceit but an advanced form of minding one’s own business.

Second, and corollary to the first, the gracious person is generous. The old-fashioned word is “magnanimous.”1 When confronted with a direct appeal, he or she generally gives what is asked and a bit more. Or, if direct giving is not possible due to the state of his or her purse and other resources, the gracious person goes out of his or her way to offer other forms of help and aid. Once again, that giving or that aid may have limits, but they are generally invisible to the person receiving the help. At the very least, the gracious person makes time to listen to the request and, if help is not forthcoming, refrains from letting the supplicant feel bad about the asking. This again is a form of sympathetic treatment.

Third, the gracious person respects the freedom and personal agency of others. This is a delicate point, because it is so often overlooked in today’s communications. Offering respect in this case is a subtle negative: refraining from making assumptions about another person’s situation, motives, capabilities, or prospects. Just because someone is in a bad way and asking for help does not mean that he or she is unwilling, incapable, or lazy. The other person’s life is not a subject for debate and speculation. The other person’s story is his or hers alone to tell—or not. The gracious person respects personal boundaries—both in speaking and in imagination—just as he or she maintains his or her own boundaries and secrets. The gracious person trusts that others—barring direct evidence to the contrary—have abilities and prospects similar to his or her own. After all, the basis of respect is granting that others have an equal footing in this business of life.

Fourth, and corollary to all of the above, the gracious person is confident of and secure in his or her own situation, motives, capabilities, and prospects. To be confident means that a person has examined his or her own life and knows him- or herself at some depth.2 To be secure in life is to have control of one’s person and resources and to maintain a calm appreciation of what the future may hold. A person who is insecure or afraid cannot be—or, more generally, believes he or she cannot afford to be—generous, sympathetic, or respectful of others. A person who does not trust him- or herself is unlikely to extend a similar trust to other people.

Fifth, a person with grace generally leads a sober and thoughtful life. He or she knows that alcohol and drugs, loose talk, unrestrained behavior, reckless impulses, crude humor, and even excessive jocularity—all tending toward a loss of personal control and restraint—can be damaging both to oneself and to those nearby. Wit and a well-told joke are acceptable. Imprudence, impudence, and a generally sarcastic attitude are not.

Sixth, and finally, the gracious person maintains his or her balance. We think of a person who is physically graceful as moving in a controlled, purposeful, balanced manner, without sudden lunges, hesitations, missteps, or reversals. In the same way, the gracious person’s life proceeds in balance, with a sense of purpose and direction, and with forethought. It avoids both impulsive lunges and fearful retreats. The gracious, magnanimous person is superior to events, plots a life course focused on the long view, and remains steadfast, loyal, and calm in the face of both opportunity and adversity. He or she thinks before acting, and acts with precision and purpose.

All of these traits exist irrespective of any political views. A generous, thoughtful, sympathetic, and controlled person can exist in any political sphere—and even more so in times of disruption and confusion, which are merely opportunities to exercise these talents. Similarly, the life situation that allows a person to demonstrate grace is not dependent on wealth, education, intelligence, health, or other outward resources—although it is easier to maintain a gracious demeanor if one is not scrabbling for bread crusts, ignorant of the world, dim in perception, or wracked with pain. Still, the true gift is to rise above these shortcomings.

I have been blessed in my life to know a number of gracious people. My mother was one, my late wife was another, and I cherish their memory. They both lived with a calmness of person and generosity of spirit that made the people around them feel both confident and secure. And that is something that can only come from the inside.

1. This is a direct combination of two Latin words, magnus, meaning “big,” and animus, meaning “spirit” or “soul.” Such a person is big-spirited.

2. It was written above the door to the oracle at Delphi: gnōthi seauton, “know thyself.” To understand the will of the gods, you must first understand your own situation, motives, capabilities, and prospects.

Sunday, August 11, 2019

Who Gets to Say?

Founding Fathers

“Declaration of Independence” by John Trumbull, 1818

All political questions, every one of them, come down to a basic proposition: Who gets to say? Who in society will decide the big questions and, by extension, the smaller ones for the rest of us? Who will make the rules? Who will rule on what is allowed and what forbidden? Whose vision of culture and society will shape the world in which the rest of us must live?

For a hundred thousand years or so of our hunter-gatherer existence, going back even before the H. sapiens genome settled down in its current configuration, the question of who-gets-to-say was probably handled as close to democratically as one could wish. After all, each group or band was about the size of an extended family or tribe, probably no more than fifty individuals. In such a group, you might think that the oldest, wisest, strongest, or bravest would be the natural leader and decider. But I believe such a small group would probably have its own ideas.

The person whom the tribe picked as “wisest,” “strongest,” or “bravest” might be the result of a lot of individual choice and input. It would really come down to whoever the collected individuals trusted or respected or hoped would take them in a direction that found food, water, shelter, and safety. And that might not be the person with the smartest mouth, the biggest muscles, or an unflinching attitude when staring down a wild boar or tiger. It ultimately would be the person whom most members of the group would listen to, respond to, and obey.

It’s also possible, with a group that small, to have a situation of shared leadership. One person might know the most about and be the most skilled at hunting. Another might have special diplomatic skills when dealing with other tribes. And a third might be the best to consult about campsite rules and interpersonal relationships. In this, the hunter-gatherer tribe might be like any extended family, with different natural leaders at the focus of different family problems.

But when the tribe settles down and becomes a village, a culture, or a nation, the daily push and pull of personal familiarity, family loyalty, and individual input no longer works to select a leader to speak for and direct the whole district or city-state. Each street and neighborhood might have its local wise man or woman, its go-to person for conflict resolution, and its natural spokesperson. But as soon as that local neighborhood bands together with others to form an association that is larger, better able to plan community rules and infrastructure, and able to benefit from extended trade with other cities and states—and so becomes richer and more powerful as a whole—then the need for a unified leadership becomes apparent.

The district chief, thane, or regional king would not necessarily be chosen by popular election, and probably not by the simple trust, respect, and response that work on an individual level. And a selection would be needed at all only when the old king dies without an heir or the community undergoes a popular revolution. Otherwise, for all our long history, humans have been willing to trust the children of the king to take the mantle, crown, or talking stick when the old king passes. This trust in inherited capability was not blind, because it responded to an innate understanding of biology, observed from a thousand generations of animal husbandry and seed selection: strong and capable parents usually beget strong and capable children.

But in that time of dynastic disruption—either death of a childless king or social revolution—then the choice of who-gets-to-say would be based on both self-selection and individual input. First, the candidate must put himself1 forward, and then the people who have risen to positions of respect and authority in their own neighborhoods and districts must decide whether they will trust, follow, and obey him or hold out for somebody else.

If the king or his son proves to be incompetent, then the matter of who-gets-to-say falls to the privy council, the closest advisors, or the “kitchen cabinet.” Power will still be exercised in the name of the king or the idiot son, but the decisions will be made by those who are closest to the throne. This is, of course, an unstable situation, because proximity is not a fixed but a relative measure. And among the counselors and advisors there will always be those who must choose how far they will trust, follow, and obey the loudest or most persuasive speakers.

The purest form of democracy—where no one leader emerged as king, archon, or dictator—seems to have been the government of Athens in the fifth to the fourth centuries BC. There every male citizen of a certain age had a voice in the Assembly and a chance to represent his village, tribe, or administrative district (his deme) for a year’s time in the Council of 500, whose members were drawn from each tribe or district. In the Assembly, citizens could discuss and vote on proposed rules and legislation. The Council prepared the legislation for discussion in the Assembly and could also make rules by itself.

This system worked well, for as long as it did. But it was also subject to takeover by groups of oligarchs2 and individual tyrants. Rule by the direct vote of all citizens in either the Assembly or the Council is a nice idea—unless you were a woman, a slave, or a non-citizen—but it was also unwieldy. Discussion, negotiation, and consensus all take time. Not everyone is happy with every outcome. And it requires great personal and moral discipline to accept an outcome that you don’t like, or perhaps one with which you violently disagree, solely on the basis that your neighbors find it good and acceptable and that you have sworn—or just personally chosen—to uphold the democratic ideal as more important than your feelings on this one matter or string of issues.

Turning the decision power over to a smaller group of smart fellows—or even one particularly persuasive, charming, and capable fellow—who can get things done in a sensible amount of time and with a minimum of fuss, might seem like a good alternative. Especially so when there’s a crisis at hand—and almost any issue can be made to seem like a crisis—or when the bickering among Council and Assembly members has gone on so long that the issues are piling up, the roads and sewers aren’t getting fixed, the wharves are crumbling down at the port, and the rats are everywhere.

Something in the human soul gets tired of taking direct responsibility for every detail of government, especially when you can’t win every debate. It just becomes easier to turn the issues of the day over to a bunch of smart guys, or one smart guy, and let them or him figure things out. And then, if they really mess things up, you can always go out into the street with your neighbors, carry a torch or a pitchfork, and call for reforms. You can at least get the heads of the current smart guys mounted on a pike and put in their place the people who promise to do things better. And that’s a sort of democracy, too—direct action by social revolution.

In the modern political entities of the West—and in the Western-inspired parts of Asia—direct democracy and its upheavals have been replaced by representative democracy, usually in the form of a republic with an agreed-upon charter or constitution. Instead of letting smart fellows just take over the government, citizens vote to put them in power. In most of the world, exemplified by the United Kingdom, the representatives sit in an assembly or parliament, and the leaders in that body also form the administrative government. In the United States, the assembly of representatives is the Congress, with two houses whose members are apportioned differently—either on an equal basis among all states (the Senate) or proportionally by population (the House)—and who serve different terms. The administrative government is elected directly in the person of the Executive—the President, with cabinet members and various department heads nominated by the President but confirmed by the Senate. Congress makes the laws, the President’s administration enforces them, and the federal courts up to the Supreme Court—with justices also nominated by the President but confirmed by the Senate—rule on the laws in practice.

This system has worked well—or at least without outright revolution, and only one civil war—for more than 230 years. Like Athenian democracy, it has its satisfied winners and its unhappy losers. Sometimes the Congress has more direct power as the law-making body. More recently, the actual power of interpreting the details of Congress’s large and clumsily written laws has fallen to the Executive. Like all governments and all agreements among human beings, it’s a plastic situation that changes with the times.

The genius of the American system—or its defect, in some eyes—is that we have generally focused our discussion and debate through two main parties: once the Democrats and the Whigs, now the Democrats and the Republicans. The issues and emphases of these two groups change and sometimes reverse themselves over the generations, but they keep the discussion within bounds, and each party forms an inclusive group for handling similar issues under similar principles. The alternative is the multiplicity of parties—sometimes with single issues, sometimes splintering off of larger parties—that is common in Europe. There government by coalition—shared power without necessarily shared viewpoints—has become more common. And this is an unstable situation, especially when a coalition breaks up and precipitates an unscheduled election through a “vote of no confidence.”

We may be headed in that direction in this country, as the two major parties move their centers of gravity further out to fringe positions left and right, leaving the voters in the moderate middle with fewer and fewer good choices. So far, third parties in this country have been a pit filled with wasted votes, but that may soon change. And then we may have more government by uneasy coalitions.

But whatever comes, and whether the issues of the day reflect real or invented situations and dangers, all political issues are ultimately just wedges to put one person, one party, or a strong coalition of similar intentions in a position to make day-to-day, on-the-ground decisions for the rest of us. Issues, principles, and voiced positions are one thing, but access to and the use of actual decision-making power is the final purpose.

1. In this part of the discussion I am purposely using the male pronoun. Yes, there have been some matriarchies and matrilineal societies—but not many. And yes, in recorded history many Western nations have been led by queens. During the Roman occupation of Britain, the Britons had a warband leader known as Boadicea, and she must have been an extraordinary woman. The English have had queens in more recent history, too: Maud, Elizabeth, Anne, Victoria. But these women—extraordinary or not—generally came to power when the dynastic tradition was strong but there was no better-placed male heir at hand. The free and unrestricted choice of a woman as national leader is a much more modern phenomenon.

2. The roots of this word are the Greek oligos, meaning “few, a little, or too little,” and archon, meaning “ruler,” the title held by each of the nine chief magistrates of the ancient city.

Sunday, August 4, 2019

Alive by Two Seconds

Motorcycle rider

When I first started riding motorcycles, 46 years ago,1 I would experience what I term a “close call”—being cut off by a car, or a car passing too close to my line of travel, or my moving into the path of imminent injury or death—about once a week. Over time and with practice, the frequency lessened to once a month, then once every three to six months, until now it’s about once a year. Understand that these are not actual accidents, where I hit something or dropped the bike,2 but just incidents where collision, injury, and death are real possibilities.

The incident for this year—or what I hope was this year’s only close call—happened two weeks ago, and it was a doozy. And this time it was not from action by other drivers. It was my own stupid fault.

I was riding in Contra Costa County, on Alhambra Avenue, going to my BMW motorcycle shop in Concord early on a Saturday afternoon. I had taken this route only a couple of times before, and this was just the second time I had traveled it in the southbound direction; so it was not entirely familiar ground. I knew I was looking for a left turn onto Taylor Boulevard, which I thought was still one or two intersections ahead.

This stretch of Alhambra Avenue—before it morphs into Pleasant Hill Road—is a divided four-lane thoroughfare with a median strip and protected left-turn lanes with their own left-arrow signals. As I came up to a particular intersection, thinking I was still short of the turn, I looked up and saw the overhead sign for Taylor Boulevard—coming to it sooner than I expected. I was already in the leftmost of the two through lanes and had a green light. The left-turn lane beside me was empty, and I was only vaguely conscious that it had a red arrow light. Without thinking—and that’s the critical point in all this—I made a long and graceful swerve to the left onto Taylor Boulevard. But as I cleared the intersection, I heard an angry horn behind me.

My sin, and it’s a grievous one that has caused me to rethink and think through the incident several times over the last couple of days, is that I mentally mistook the two clear lanes ahead of me, with the green light for my through lane at the intersection, for all the road there was. I totally forgot, for the moment, that this was a divided road and that the other side of that median strip carried oncoming traffic with its own green light; so they had the right of way, too. I assumed the role of the carefree, stupid, reckless motorcyclist and blindly made that wide left turn, cutting across the left-turn lane and into oncoming traffic.

On reflection, if I had been traveling two seconds later, I would have hit, or been hit by, that oncoming car. But with the timing I inadvertently had, the driver was now only honking angrily in my wake. I was traveling about 30 miles an hour when I made my left-hand swerve, and the impact might well have injured me badly or killed me.
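To put a rough number on that margin (my own back-of-the-envelope estimate, counting just my speed and ignoring whatever closing speed the oncoming car added), at a steady 30 miles per hour:

$$30\ \text{mph} = 30 \times \tfrac{5280\ \text{ft}}{3600\ \text{s}} = 44\ \text{ft/s}, \qquad 44\ \text{ft/s} \times 2\ \text{s} = 88\ \text{ft}.$$

Those two seconds were worth roughly 88 feet of roadway, which is all that stood between an angry horn behind me and a collision.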

Stupid. Stupid. Stupid. And not to mention being against the law, because I ran the red light in that left-turn lane. I am a better rider and usually more cautious than this.

In this instance, this moment of inattention and temporary confusion about the number of lanes at my disposal, I was just lucky to pass through an empty space in that oncoming traffic. Except … I don’t believe in luck. The driver or motorcycle rider who relies on luck generally has a short lifespan and makes repeated trips to the hospital before finally checking out. I rely instead on observation, interpretation, maintaining my spatial margins ahead, behind, and to the sides, and on a set of rules for riding that I have built up over forty years of experience and that are now supposedly ingrained into my reaction process.3 Breezing through a left turn on a green light because I’ve suddenly discovered I’m in the intersection that I needed to find is not any part of this.

Besides not believing in luck or in the watchful eye of a benevolent god, I do happen to believe—most of the time, as a kind of mental exercise—in the multiverse. In this theory, the time stream that we all follow day to day is actually constantly branching out with each decision we make or probability we encounter. I cover this in some detail in my novels of time travel, starting with The Children of Possibility. In the story, the time travelers from the eleventh millennium must navigate around what they call a Wahrschein Punkt, a probability node, or probabilistic decision point. It’s a place where an event and its consequences could go either way. Think of Schrödinger’s cat, both alive and dead, until the box is opened and the probability wave resolves itself.

For two seconds on that Saturday afternoon, I passed through a probability node. In one result, at the start of those two seconds, I sailed through the intersection, straightened out on Taylor Boulevard, and heard only the angry honking of an infuriated motorist somewhere behind me. In the other result, at the end of those two seconds, I collided with the car at 30 miles per hour, was mangled on impact, thrown over or into several lanes of oncoming traffic, was mangled some more upon landing, and died either on the spot or in the ambulance soon after on my way to the hospital. All the rest is silence and darkness.

In one universe, I’m sitting here two weeks later, still bothered by my immense stupidity, vowing to myself to be more careful and aware in the future—both on my motorcycle and in my general driving—and writing up the experience for the benefit of other riders.

In the alternate universe, which goes on the same in all other respects, my brother and my family have already been notified, my body has probably been cremated, perhaps my will is being read today, and the process of dismantling and disposing of the rest of my life has already started. The book that I’m working on—curiously, the sequel to the time-travel novel Children—will never be published. My dog will never understand why, for the first time among all my previous promises, I never came back to her from going out “for just a little while.” And my friends will write off my death to another useless and stupid motorcycle accident.

It’s a curious feeling, being both alive and dead in two different realities. The experience is not enough to make me give up riding and sell the bike—a process I have gone through before, at various times in the past, only to return each time to the pleasure and freedom of rocketing along in three dimensions with the wind in my face. But it’s enough to make me stop and think.

And now I have another rule to add to my personal riding doctrine: know where you are at all times and always count the number of lanes before turning.

1. For my history on motorcycles, see The Iron Stable.

2. My first motorcycle accident occurred on my first bike, my first rainy day, and my first encounter with a wet manhole cover. I was riding a Yamaha RD350, a motorcycle far too small for me, so that my tailbone and thus my center of mass sat back over the rear axle. I was crossing an intersection on San Pablo Avenue in El Cerrito at about 20 miles per hour and went down in a shower of sparks, causing major cosmetic damage to my motorcycle and ripping up a pair of pants but fortunately getting no more than a bruise on my backside. The other times I’ve dropped a motorcycle have all been either while stopped or while stopping—when I skidded, fell over, and rolled off the bike—and for those incidents I am now thankful to have antilock brakes.

3. See SIPRE as a Way of Life from March 13, 2011.