Sunday, August 25, 2019


Conductor hands

If bringing people together is an art form—and I believe that helping people work efficiently together certainly is—then the highest level of that art is leadership.

A “leader” may be defined, in the broadest possible sense, as one who accomplishes work or effort through the active participation of others. This definition would immediately rule out, then, the sort of supervisor, manager, or even chief executive who is so smart, experienced, or goal driven that he or she shoulders aside subordinates who are trying to do a task and completes it him- or herself, so that it gets “done right.” Such a person is not a leader but a star performer surrounded by frustrated assistants.

This is not to say that the leader cannot take a hand in the effort. After all, when invaders are pouring in over the walls, everyone picks up a rifle and shoots. And yes, sometimes, the leader has to lay the first brick, make the first cut with the axe, or otherwise physically participate. This might be a ceremonial start to the project. Or it might be the leader demonstrating how he or she wants the task to be performed. And it may serve the psychological purpose of showing that the leader is not above getting his or her hands dirty. But for the most part, in day-to-day operations, the leader’s role is hands-off.

So how does a leader perform work “through the active participation of others”?

At first, it would seem that the leader simply gives orders. And this might work in situations and institutions where the command structure is clearly defined, enforced by rules and sanctions, and embedded by long-standing practice. Think of the Roman Catholic Church or the U.S. Army. Bishops have absolute authority over parish priests, and archbishops and cardinals have authority over bishops. Sergeants have absolute authority over privates, and lieutenants have authority over sergeants. And so on up and down the rigidly prescribed chain of command.

Except … a weak or vacillating bishop may have no effective control over the priests under him. A green or unskilled lieutenant may have no influence with an experienced, battle-hardened sergeant. Every organization, even the most hierarchical, has ways for underlings to evade or obstruct the orders they don’t like or can’t follow. In some cases, and with collusion, this might be a “white mutiny,” in which subordinates follow stupid orders to the letter until the whole effort—along with the command structure—just falls apart. And in the Vietnam War, there were supposedly incidents of “fragging,” in which soldiers rolled live grenades into the tents of incompetent officers who, they feared, would get them killed in battle.

While simply giving orders can work in unusual or extreme situations, where other forms of leadership might take too long or be misunderstood, there is a further caution. In one of my favorite books, Dune by Frank Herbert, Duke Leto Atreides advises his son: “Give as few orders as possible. Once you’ve given orders on a subject, you must always give orders on that subject.” This is because people—especially those in a functioning command structure—can be fearfully literal-minded. They note and observe where a leader or a person up the line from them places his or her interest, attention, and focus. Written or verbal orders are specific and purposeful, and they will tend to be obeyed until rescinded or countermanded. And if the situation surrounding those orders changes in the slightest, the person receiving them will expect a new order amending the previous order. Otherwise, and fearful of doing something wrong or displeasing, they will follow the original order or wait, frozen, until the countermand arrives.

The trap with direct orders is always that they undermine personal initiative. “The chief said to do this and, so help me God, that’s what I’m doing.” Sometimes, perhaps with front-line soldiers in a shooting war, that kind of unthinking dedication and persistence is valuable. But few of us are engaged in a shooting war.

American business culture over the last seventy or eighty years has undergone a sea change. Coming out of World War II and into the boom times in which most of the returning soldiers found civilian jobs, the military-oriented, top-down, command-and-control structure was dominant. It worked to beat the Germans and the Japanese, so why not apply it to the assembly line and the sales floor? And that worked, too—so long as the boss was all-seeing and always right; the jobs were correctly structured and easily performed; and unusual situations were rarely encountered. It also worked because America was in its prime, being the last industrial country left standing, after Germany and Japan had been bombed flat, and England, France, and the rest of the world were exhausted by war. It was a good time for us Baby Boomers to grow up, because American business could make few lasting or damaging mistakes.

But soon, under pressure from rising global markets, advancing computer controls and information technology, and competition in a shrinking internal market, the rigid, top-down model started to falter. Employees who could only follow orders were not as nimble, flexible, or effective as those who had the initiative to identify problems and either fix them on the spot or recommend solutions to supervisors who tended to trust—rather than question—their judgment. Getting things right the first time on the assembly line avoids a lot of lost time in quality control, remediation, and recalls. Solving the customer’s problems on the sales floor avoids a lot of exchanges and restocking—not to mention loss of the customer’s good will.

The need for personal initiative has followed this cultural change back into the military. The modern army unit is no longer made up of people who shoulder a rifle, march in straight lines, and charge on command. The modern naval ship no longer needs teams of people who only pull on a rope when they are told. The modern battlefield is interconnected, data-heavy, and invisible to the naked eye. It is the place of “combined arms,” where soldiers on the ground coordinate intimately with air support, indirect artillery, and electronic intelligence sources. The modern soldier must be a team player, versed in complicated weapons systems—some of which he or she never personally handles—and able to react to a changing environment both flexibly and nimbly. The modern sailor is a technician operating one of the most complicated and versatile weapons systems ever made.

So how does a leader in either the modern business model or the modern military accomplish work “through the active participation of others”? Primarily, through articulating, setting, and enforcing values. Values are better guides to individual initiative than any set of standing or direct orders. Values last longer in the mind and memory. And when they are correctly stated and willingly adopted, they are more flexible in the face of emergent and changing situations.

The first value a leader must articulate, demonstrate, and encourage is dedication to and respect for training. The modern soldier or employee needs to know every aspect of his or her own job, and it helps the person take initiative if he or she also understands and can on occasion fulfill the jobs of the people on either side, with whom the employee deals directly, and above, so that the employee understands the reason for and purpose of those orders and directions. Further, the employee should be educated in broader issues like the company’s or the unit’s mission, the needs of customers, the nature of the competition, and the factors leading to success. Some companies try to do this in blanket form with some sort of universal, homily-laden “mission statement” printed and laminated in posters that hang on the wall. But that’s the easy way out. If the mission is going to become a value embedded in the employee’s mind and heart, then it needs to be taught and refreshed at every opportunity.

The complementary value to training is trust: for the leader to trust that the employee has absorbed this knowledge and to encourage and reward the employee when he or she shows initiative. Without trust and encouragement, the employee will suspect that all of the training he or she has received was some sort of management gimmick. That Human Resources is manufacturing another “policy du jour,” a temporary thing like all the others, and what the company really wants is unquestioning obedience. That attitude taps into the vein of cynicism that lies not far beneath the skin of any alert and educated human being. And that sort of cynicism is poison to personal initiative.

Ultimately then, the leader establishes performance standards. This is the hands-on part of the job. The leader voices these directly: “This is how we do things.” “Here is what I expect.” And the leader rewards and publicly praises good performance: “That was well done.” “This is what success looks like.” Setting standards is not a do-once-and-done process. It cannot be achieved with a memo or a motivational poster. It is an active, repetitive, daily flow of discovery, example setting, and encouragement. The leader looks for occasions to teach and train to standards. Only with repetition and reinforcement do the values in all their varied applications become part of the team’s understanding and sense of purpose.

Modern leadership is more about persuasion than power. It treats people as willing participants with minds of their own, rather than as slaves, servants, or “meat robots” to be programmed. It is a personal relationship between supervisor and employee, between officer and soldier or sailor. It is built on mutual trust and respect, rather than on fear and obedience. And it takes conscious effort and an appreciation for the science of psychology and the art of communication to establish this relationship.

But once the bond is made, it is stronger than any other. People will literally die for it.

Sunday, August 18, 2019


Graceful hands

If living is an art form—and I believe that living well certainly is—then the highest level of that art is what I call “grace.”

In religious terms, grace means having the favor of, making oneself pleasing to, or benefiting from the generosity of God. These are acceptable meanings, but not the subject of this discussion. Religious grace is a state that comes from, or is predicated on, or defined by the outside influence of the Almighty. Similarly, in archaic terms, a sovereign might grant a pensioner or a loyal servant grace along with some form of bounty. But again, I’m not talking about the grace that comes from outside.

A person of any religious or political stripe has grace, and can be considered gracious, if he or she adopts a particular set of values and their corresponding behaviors. So let me attempt to list them.

First of all, a gracious person is interested in and sympathetic toward others. He or she lives outside of him- or herself. Such a person, when approached, offers a smile; when asked for an opinion, offers encouragement and support. He or she cares about the impression made upon other people. More important, he or she wants other people to feel secure, comforted, and at ease. The gracious person creates an atmosphere of calm and contentment.

This is not to say that a person with grace has no inner life, or sacrifices everything for the people in his or her immediate vicinity, or never says “no” to a request. But any necessary “no” is always framed in terms of regret rather than righteousness. The protection that a gracious person puts in place around his or her inner life and personal substance is understated if not actually invisible. This is not deceit but an advanced form of minding one’s own business.

Second, and corollary to the first, the gracious person is generous. The old-fashioned word is “magnanimous.”1 When confronted with a direct appeal, he or she generally gives what is asked and a bit more. Or, if direct giving is not possible due to the state of his or her purse and other resources, the gracious person goes out of his or her way to offer other forms of help and aid. Once again, that giving or that aid may have limits, but they are generally invisible to the person receiving the help. At the very least, the gracious person makes time to listen to the request and, if help is not forthcoming, refrains from letting the supplicant feel bad about the asking. This again is a form of sympathetic treatment.

Third, the gracious person respects the freedom and personal agency of others. This is a delicate point, because it is so often overlooked in today’s communications. Offering respect in this case is a subtle negative: refraining from making assumptions about another person’s situation, motives, capabilities, or prospects. Just because someone is in a bad way and asking for help does not mean that he or she is unwilling, incapable, or lazy. The other person’s life is not a subject for debate and speculation. The other person’s story is his or hers alone to tell—or not. The gracious person respects personal boundaries—both in speaking and in imagination—just as he or she maintains his or her own boundaries and secrets. The gracious person trusts that others—barring direct evidence to the contrary—have abilities and prospects similar to his or her own. After all, the basis of respect is granting that others have an equal footing in this business of life.

Fourth, and corollary to all of the above, the gracious person is confident of and secure in his or her own situation, motives, capabilities, and prospects. To be confident means that a person has examined his or her own life and knows him- or herself at some depth.2 To be secure in life is to have control of one’s person and resources and to maintain a calm appreciation of what the future may hold. A person who is insecure or afraid cannot be—or, more generally, believes he or she cannot afford to be—generous, sympathetic, or respectful of others. A person who does not trust him- or herself is unlikely to extend a similar trust to other people.

Fifth, a person with grace generally leads a sober and thoughtful life. He or she knows that alcohol and drugs, loose talk, unrestrained behavior, reckless impulses, crude humor, and even excessive jocularity—all tending toward a loss of personal control and restraint—have the possibility to be damaging to oneself and to those nearby. Wit and a well-told joke are acceptable. Imprudence, impudence, and a generally sarcastic attitude are not.

Sixth, and finally, the gracious person maintains his or her balance. We think of a person who is physically graceful as moving in a controlled, purposeful, balanced manner, without sudden lunges, hesitations, missteps, or reversals. In the same way, the gracious person’s life proceeds in balance, with a sense of purpose and direction, and with forethought. It avoids both impulsive lunges and fearful retreats. The gracious, magnanimous person is superior to events, plots a life course focused on the long view, and remains steadfast, loyal, and calm in the face of both opportunity and adversity. He or she thinks before acting, and acts with precision and purpose.

All of these traits exist irrespective of any political views. A generous, thoughtful, sympathetic, and controlled person can exist in any political sphere—and even more so in times of disruption and confusion, which are merely opportunities to exercise these talents. Similarly, the life situation that allows a person to demonstrate grace is not dependent on wealth, education, intelligence, health, or other outward resources—although it is easier to maintain a gracious demeanor if one is not scrabbling for bread crusts, ignorant of the world, dim in perception, or wracked with pain. Still, the true gift is to rise above these shortcomings.

I have been blessed in my life to know a number of gracious people. My mother was one, my late wife was another, and I cherish their memory. They both lived with a calmness of person and generosity of spirit that made the people around them feel both confident and secure. And that is something that can only come from the inside.

1. This is a direct combination of two Latin words, magnus, meaning “big,” and animus, meaning “spirit” or “soul.” Such a person is big-spirited.

2. It was inscribed at the temple of Apollo at Delphi: gnōthi seauton, “know thyself.” To understand the will of the gods, you must first understand your own situation, motives, capabilities, and prospects.

Sunday, August 11, 2019

Who Gets to Say?

“Declaration of Independence” by John Trumbull, 1818

All political questions, every one of them, come down to a basic proposition: Who gets to say? Who in society will decide the big questions and, by extension, the smaller ones for the rest of us? Who will make the rules? Who will rule on what is allowed and what forbidden? Whose vision of culture and society will shape the world in which the rest of us must live?

For a hundred thousand years or so of our hunter-gatherer existence, going back even before the H. sapiens genome settled down in its current configuration, the question of who-gets-to-say was probably handled as close to democratically as one could wish. After all, each group or band was about the size of an extended family or tribe, probably no more than fifty individuals. In such a group, you might think that the oldest, wisest, strongest, or bravest would be the natural leader and decider. But I believe such a small group would probably have its own ideas.

The person whom the tribe picked as “wisest,” “strongest,” or “bravest” might be the result of a lot of individual choice and input. It would really come down to whoever the collected individuals trusted or respected or hoped would take them in a direction that found food, water, shelter, and safety. And that might not be the person with the smartest mouth, the biggest muscles, or an unflinching attitude when staring down a wild boar or tiger. It ultimately would be the person whom the most members of the group would listen to, respond to, and obey.

It’s also possible, with a group that small, to have a situation of shared leadership. One person might know the most about and be the most skilled at hunting. Another might have special diplomatic skills when dealing with other tribes. And a third might be the best to consult about campsite rules and interpersonal relationships. In this, the hunter-gatherer tribe might be like any extended family, with different natural leaders at the focus of different family problems.

But when the tribe settles down and becomes a village, a culture, or a nation, the daily push and pull of personal familiarity, family loyalty, and individual input no longer works to select a leader to speak for and direct the whole district or city-state. Each street and neighborhood might have its local wise man or woman, its go-to person for conflict resolution, and its natural spokesperson. But as soon as that local neighborhood bands together with others to form an association that is larger, better able to plan community rules and infrastructure, and better positioned to benefit from extended trade with other cities and states—and so become richer and more powerful as a whole—then the need for a unified leadership becomes apparent.

The district chief, thane, or regional king would not necessarily be chosen by popular election, and probably not by simple trust, respect, and response on an individual level. And that is just for the initial selection, when the old king dies without an heir or the community undergoes a popular revolution. Otherwise, for all our long history, humans have been willing to trust the children of the king to take the mantle, crown, or talking stick when the old king passes. This trust in inherited capability was not blind, because it responded to an innate understanding of biology, observed from a thousand generations of animal husbandry and seed selection: strong and capable parents usually beget strong and capable children.

But in that time of dynastic disruption—either death of a childless king or social revolution—then the choice of who-gets-to-say would be based on both self-selection and individual input. First, the candidate must put himself1 forward, and then the people who have risen to positions of respect and authority in their own neighborhoods and districts must decide whether they will trust, follow, and obey him or hold out for somebody else.

If the king or his son proved to be incompetent, then the matter of who-gets-to-say falls to the privy council, the closest advisors, or the “kitchen cabinet.” Power will still be exercised in the name of the king or the idiot son, but the decisions will be made by those who are closest to the throne. This is, of course, an unstable situation, because proximity is not a fixed but a relative measure. And among the counselors and advisors there will always be those who must choose how far they will trust, follow, and obey the loudest or most persuasive speakers.

The purest form of democracy—where no one leader emerged as king, archon, or dictator—seems to have been the government of Athens in the fifth to the fourth centuries BC. There every male citizen of a certain age had a voice in the Assembly and a chance to represent his village, tribe, or administrative district (his deme) for a year’s time in the Council of 500, whose members were drawn from each tribe or district. In the Assembly, citizens could discuss and vote on proposed rules and legislation. The Council prepared the legislation for discussion in the Assembly and could also make rules by itself.

This system worked well, for as long as it did. But it was also subject to takeover by groups of oligarchs2 and individual tyrants. Rule by the direct vote of all citizens in either the Assembly or the Council is a nice idea—unless you were a woman, a slave, or a non-citizen—but it was also unwieldy. Discussion, negotiation, and consensus all take time. Not everyone is happy with every outcome. And it requires great personal and moral discipline to accept an outcome that you don’t like, or perhaps one with which you violently disagree, solely on the basis that your neighbors find it good and acceptable and that you have sworn—or just personally chosen—to uphold the democratic ideal as more important than your feelings on this one matter or string of issues.

Turning the decision power over to a smaller group of smart fellows—or even one particularly persuasive, charming, and capable fellow—who can get things done in a sensible amount of time and with a minimum of fuss, might seem like a good alternative. Especially so when there’s a crisis at hand—and almost any issue can be made to seem like a crisis—or when the bickering among Council and Assembly members has gone on so long that the issues are piling up, the roads and sewers aren’t getting fixed, the wharves are crumbling down at the port, and the rats are everywhere.

Something in the human soul gets tired of taking direct responsibility for every detail of government, especially when you can’t win every debate. It just becomes easier to turn the issues of the day over to a bunch of smart guys, or one smart guy, and let them or him figure things out. And then, if they really mess things up, you can always go out into the street with your neighbors, carry a torch or a pitchfork, and call for reforms. You can at least get the heads of the current smart guys mounted on a pike and put in their place the people who promise to do things better. And that’s a sort of democracy, too—direct action by social revolution.

In the modern political entities of the West—and in the Western-inspired parts of Asia—direct democracy and its upheavals have been replaced by representative democracy, usually in the form of a republic with an agreed-upon charter or constitution. Instead of letting smart fellows just take over the government, citizens vote to put them in power. In most of the world, exemplified by the United Kingdom, the representatives sit in an assembly or parliament, and the leaders in that body also form the administrative government. In the United States, the assembly of representatives is the Congress, with two houses whose members are apportioned differently—either on an equal basis among all states (the Senate) or proportionally by population (the House)—and with different terms for each. The administrative government is elected directly in the person of the Executive—the President, with cabinet members and various department heads proposed by the President but confirmed by the Senate. Congress makes the laws, the President’s administration enforces them, and the federal courts up to the Supreme Court—with justices also named by the President but confirmed by the Senate—rule on the laws in practice.

This system has worked well—or at least without outright revolution, and only one civil war—for more than 230 years. Like Athenian democracy, it has its satisfied winners and its unhappy losers. Sometimes the Congress has more direct power as the law-making body. More recently, the actual power of interpreting the details of Congress’s large and clumsily written laws has fallen to the Executive. Like all governments and all agreements among human beings, it’s a plastic situation that changes with the times.

The genius of the American system—or its defect, in some eyes—is that we have generally focused our discussion and debate through two main parties: once the Democrats and the Whigs, now the Democrats and the Republicans. The issues and emphases of these two groups change and sometimes reverse themselves over the generations, but they keep the discussion within bounds, and each party forms an inclusive group for handling similar issues under similar principles. The alternative is the multiplicity of parties—sometimes with single issues, sometimes splintering off of larger parties—that is common in Europe. There government by coalition—shared power without necessarily shared viewpoints—has become more common. And this is an unstable situation, especially when a coalition breaks up and precipitates an unscheduled election through a “vote of no confidence.”

We may be headed in that direction in this country, as the two major parties move their centers of gravity further out to fringe positions left and right, leaving the voters in the moderate middle with fewer and fewer good choices. So far, third parties in this country have been a pit filled with wasted votes, but that may soon change. And then we may have more government by uneasy coalitions.

But whatever comes, and whether the issues of the day reflect real or invented situations and dangers, all political issues are ultimately just wedges to put one person, one party, or a strong coalition of similar intentions in a position to make day-to-day, on-the-ground decisions for the rest of us. Issues, principles, and voiced positions are one thing, but access to and the use of actual decision-making power is the final purpose.

1. In this part of the discussion I am purposely using the male pronoun. Yes, there have been some matriarchies and matrilineal societies—but not many. And yes, in recorded history many Western nations have been led by queens. During the Roman occupation, the Britons had a war leader known as Boadicea, and she must have been an extraordinary woman. The English have had queens in their history as well: Maud, Elizabeth, Anne, Victoria. But these women—extraordinary or not—generally came to power when the dynastic tradition was strong but there was no better-placed male heir at hand. The free and unrestricted choice of a woman as national leader is a much more modern phenomenon.

2. The roots of this word are the Greek oligos, meaning “few, a little, or too little,” and archon, meaning “ruler,” an archon being one of the nine chief magistrates of the ancient city.

Sunday, August 4, 2019

Alive by Two Seconds

When I first started riding motorcycles, 46 years ago,1 I would experience what I term a “close call”—being cut off by a car, or a car passing too close to my line of travel, or my moving into the path of imminent injury or death—about once a week. Over time and with practice, the frequency lessened to once a month, then once every three to six months, until now it’s about once a year. Understand that these are not actual accidents, where I hit something or dropped the bike,2 but just incidents where collision, injury, and death are real possibilities.

The incident for this year—or what I hope was this year’s only close call—happened two weeks ago, and it was a doozy. And this time it was not from action by other drivers. It was my own stupid fault.

I was riding in Contra Costa County, on Alhambra Avenue, going to my BMW motorcycle shop in Concord early on a Saturday afternoon. I had taken this route only a couple of times before, and this was just the second time I had traveled it in the southbound direction; so it was not entirely familiar ground. I knew I was looking for a left turn onto Taylor Boulevard, which I thought was still one or two intersections ahead.

This stretch of Alhambra Avenue—before it morphs into Pleasant Hill Road—is a divided four-lane thoroughfare with a median strip and protected left-turn lanes with their own left-arrow signals. As I came up to a particular intersection, thinking I was still short of the turn, I looked up and saw the overhead sign for Taylor Boulevard—coming to it sooner than I expected. I was already in the leftmost of the two through lanes and had a green light. The left-turn lane beside me was empty, and I was only vaguely conscious that it had a red arrow light. Without thinking—and that’s the critical point in all this—I made a long and graceful swerve to the left onto Taylor Boulevard. But as I cleared the intersection, I heard an angry horn behind me.

My sin, and it’s a grievous one, which has caused me to replay and think through the incident several times over the last couple of days, is that I mentally mistook the two clear lanes ahead of me and the green light for my through lane at the intersection for all the road there was. I totally forgot, for the moment, that this was a divided road and that the other side of that median strip had oncoming traffic with their own green light; so they had the right of way, too. I assumed the role of the carefree, stupid, reckless motorcyclist and blindly made that wide left turn, cutting across the left-turn lane and into oncoming traffic.

On reflection, if I had been traveling two seconds later, I would have hit, or been hit by, that oncoming car. But with the timing I inadvertently had, the driver was now only honking angrily in my wake. I was traveling about 30 miles an hour when I made my left-hand swerve, and the impact might well have injured me badly or killed me.

Stupid. Stupid. Stupid. And not to mention being against the law, because I ran the red light in that left-turn lane. I am a better rider and usually more cautious than this.

In this instance, this moment of inattention and temporary confusion about the number of lanes at my disposal, I was just lucky to pass through an empty space in that oncoming traffic. Except … I don’t believe in luck. The driver or motorcycle rider who relies on luck generally has a short lifespan and makes repeated trips to the hospital before finally checking out. I rely instead on observation, interpretation, maintaining my spatial margins ahead, behind, and to the sides, and on a set of rules for riding that I have built up over more than forty years of experience and that are now supposedly ingrained into my reaction process.3 Breezing through a left turn on a green light because I’ve suddenly discovered I’m in the intersection that I needed to find is not any part of this.

Besides not believing in luck or in the watchful eye of a benevolent god, I do happen to believe—most of the time, as a kind of mental exercise—in the multiverse. In this theory, the time stream that we all follow day to day is actually constantly branching out with each decision we make or probability we encounter. I cover this in some detail in my novels of time travel, starting with The Children of Possibility. In the story, the time travelers from the eleventh millennium must navigate around what they call a Wahrschein Punkt, a probability node, or probabilistic decision point. It’s a place where an event and its consequences could go either way. Think of Schrödinger’s cat, both alive and dead, until the box is opened and the probability wave resolves itself.

For two seconds on that Saturday afternoon, I passed through a probability node. In one result, at the start of those two seconds, I sailed through the intersection, straightened out on Taylor Boulevard, and heard only the angry honking of an infuriated motorist somewhere behind me. In the other result, at the end of those two seconds, I collided with the car at 30 miles per hour, was mangled on impact, thrown over or into several lanes of oncoming traffic, was mangled some more upon landing, and died either on the spot or in the ambulance soon after on my way to the hospital. All the rest is silence and darkness.

In one universe, I’m sitting here two weeks later, still bothered by my immense stupidity, vowing to myself to be more careful and aware in the future—both on my motorcycle and in my general driving—and writing up the experience for the benefit of other riders.

In the alternate universe, which goes on the same in all other respects, my brother and my family have already been notified, my body has probably been cremated, perhaps my will is being read today, and the process of dismantling and disposing of the rest of my life has already started. The book that I’m working on—curiously, the sequel to the time-travel novel Children—will never be published. My dog will never understand why, for the first time after all my previous promises, I never came back to her from going out “for just a little while.” And my friends will write off my death as just another useless and stupid motorcycle accident.

It’s a curious feeling, being both alive and dead in two different realities. The experience is not enough to make me give up riding and sell the bike—a process I have gone through before, at various times in the past, only to return each time to the pleasure and freedom of rocketing along in three dimensions with the wind in my face. But it’s enough to make me stop and think.

And now I have another rule to add to my personal riding doctrine: know where you are at all times and always count the number of lanes before turning.

1. For my history on motorcycles, see The Iron Stable.

2. My first motorcycle accident occurred on my first bike, on my first rainy day, at my first encounter with a wet manhole cover. I was riding a motorcycle, a Yamaha RD350, that was far too small for me, so that my tailbone and thus my center of mass were back over the rear axle. I was crossing an intersection on San Pablo Avenue in El Cerrito at about 20 miles per hour and went down in a shower of sparks, causing major cosmetic damage to the motorcycle and ripping up a pair of pants, but fortunately I got no more than a bruise on my backside. The other times I’ve dropped a motorcycle have all been either while stopped or stopping—when I skidded, fell over, and rolled off the bike—and for that reason I am now thankful for antilock brakes.

3. See SIPRE as a Way of Life from March 13, 2011.

Sunday, July 28, 2019

The First Rule of Biology

Welsh dragon

I am not a thing; I am a process. That, applied to all living organisms, should be the first rule of biology. I—and you, your dog, your begonia, the cows and corn we raise for food, and the mold that lives in our drains—we may all appear to be solid objects, have substance, and react in many ways to physical forces (like, for instance, the accelerations of gravity). But all living beings are processes before we are things.

Why is this important? Because it will change the way we think about our own bodies and other creatures.

Take, for example, physical corrections like orthopedic appliances and procedures, or the orthodontic teeth-straightening that most of us endured as teenagers. All of these procedures are a fix for “right now,” for what ails us at this moment. But in the span of seventy, eighty, ninety years of life, the body inevitably changes. The new mechanical knee you had implanted at age sixty may not work as well at eighty because your bones are constantly reacting to physical stresses—strengthening webs of calcium here, hollowing out spaces there—and that new knee simply alters the stress points.

Or consider my own case: I began a process of tooth alignment with those “invisible” braces in my late sixties.1 I had one tooth on the left side of my lower jaw that had loosened because the teeth across the center of my mouth were collapsing inward, and one of them had actually taken a full step back behind the other three. That and a misalignment on my right side between the upper and lower jaws, where some teeth were not actually engaged, told my dentist (and so me) that in a few years I would start losing teeth. It wouldn’t be bad oral hygiene or gum disease—which is so often the case in tooth loss—but simply a lack of the bite pressure that keeps the bony socket firm in supporting each individual tooth. Teeth left hanging with no engagement from their opposite number eventually loosen and fall out. Two and a half years of realignment have fixed this problem—at least for now.

What I learned from the dentist is that the teeth in your mouth are always moving around and adjusting. So the perfectly straight teeth you acquired as a kid with your parents’ orthodontics bill will probably be snaggly and uneven again by the time you’re sixty. This is called living. It’s a process.

Take, for example, the issue of tattoos and how they change over time. Most people think that the artist injects the ink under the skin and it just stays there, like the pigments absorbed by the plaster layer in those medieval frescoes. But the reality is that ink, or any foreign matter injected into the body, is first highly mobile and second considered an invasion by the body’s defenses. When the ink is injected, white blood cells called macrophages rush to the skin’s deeper layers and consume it. That is their purpose: to capture dead matter and dispose of it, allowing themselves to be processed into waste in the liver. But these cells become so engorged with the ink, which is plentiful in their surroundings, that they are trapped in place below the skin’s surface. This is why the tattoo artist advises you to keep the site covered, clean, and moisturized: you are protecting not only the surface punctures but also the deeper “infection” to which these macrophages are responding as they attempt to heal it.

All cells have a lifespan, however, and when one macrophage dies in place, another enters to consume it and the ink it carried. This keeps the tattoo pattern in place, more or less. However, as the artist who did my two tattoos2 advised, over time the design will spread out on a millimeter scale—that’s the jostling of the macrophages as they consume one another and their loads of ink. And the tattoo will also fade—that’s the bits of ink that some of the macrophages get to carry away for disposal, along with the sun’s UV rays breaking down the ink itself. The process is ongoing for as long as you live. Nothing on or in the body is permanent.

And take, for example, the concept of beaming people across inconvenient distances in Star Trek. As I’ve written elsewhere,3 it does not matter how fast you decompose—that is, burn up—the human body to access and address each atom and molecule, nor how fast your computer takes in and correlates all this data, nor how big your “transporter buffer” may be. All that mapping of atoms and molecules, no matter how accurate, is still going to be ineffective.

The human body is not a thing but a process. Not only do these molecules move around inside the liquid medium of the cells; many of those molecules are also in a constant state of reaction: enzymes breaking larger molecules apart, or putting smaller molecules together, or copying strands of DNA for replication, or the hundred other processes that constitute life on a cellular level. In each of the thirty-odd trillion cells of the human body, hundreds of these reactions are taking place all at the same time. If the transport computer mistakes the direction of any number of these reactions, or interferes to the point of changing them, or causes the replication of a DNA strand to change, it will flood the body with poisons and mutations just as surely as a large dose of ionizing radiation. You might transport a brick from place to place, or even a ham sandwich, without much harmful effect. But a living organism would be at great peril.

Our bodies and those of every living organism are constantly changing, growing, reacting, and also—just a bit—dying. This is what it means to be alive.

1. I never had orthodontia as a child. My parents didn’t think it was necessary, and I wasn’t vain about my appearance. Also, I had heard friends talk about the pain involved in the process and the inconvenience of chewing and brushing around the wires, and so I was thankful to be spared all that.

2. On the back of my right wrist I have a large semicolon, about an inch long. This is a commemoration and a warning about depression and suicide. On the back of my left wrist, for balance, I have a Welsh dragon, about an inch and a half square. Unlike the design shown on the flag of Wales, where the dragon is mostly a solid form with incidental lines, mine is more of an open design of loops and curls, taken from the website of the Devolved Government of Wales. The artist said this design would be easier to show as a tattoo, and it would not spread into an ugly red blotch over time. Why the Welsh dragon at all? Because that’s part of my heritage, and my late wife always loved dragons, that being her sign in the Chinese zodiac.

3. See Transporter Beaming from August 21, 2010.

Sunday, July 21, 2019

The Dream of Control

Surveillance cameras

A group of friends and I were having a friendly discussion the other night, and the subject turned to China, its growing economic power, and its growing social control. I said that I was not worried about China as a competitor. I don’t wish the country or the people ill, but they have built an economic miracle on what I believe are the wrong principles, and that will eventually doom them.

First, they have sacrificed everything for economic growth. The view from afar is that they have built infrastructure and facilities in excess of their current needs simply to notch up growth numbers. And in the process, they have made bad investments—those ghost cities sitting on the edge of the desert—and incurred massive debt. You can do this when you have a command-and-control economy that is intermittently beyond the reach of market forces. But those forces eventually come into play.

Second, they have built a manufacturing economy based on cheap pairs of hands. They bring peasants in from the countryside and offer them jobs in special economic zones where their own country’s labor laws are abridged. They use this cheap labor to make goods for foreign sale, and this has worked so far. But the manufacturing of the future is going to be automated: robots doing the tedious, mindless, dangerous jobs that humans have done in the past. Once robots and their control software are worked out, they are easily cloned and copied.1 When robotics, 3D printing, and interlinked supply chains take over in the next couple of decades, Western Civilization will have to figure out what it means to be human and how to keep people usefully occupied. And then having a billion pairs of relatively unskilled hands willing to work cheap just won’t cut it. Having an economy based on the “labor theory of value” will be a disaster.

Third, Xi Jinping’s and the Communist Party of China’s attempts to reassert control over the culture and its politics, after the lightening and leveling that came with the market-economy reforms under Deng Xiaoping, are going to backfire. China’s long history is one of loosening and tightening control under various emperors and dynasties—until the country snaps in a popular rebellion that overthrows the system, calls for a new “mandate” and a “rectification of names” (in Confucian terms, making words correspond to reality), and asserts a new social order. And then the process begins all over again.

When I made this last point, one of our friends raised the issue of the new social controls that China is instituting, with computerized surveillance, public facial recognition systems, and a new “social credit system” that will link a citizen’s access to government-controlled privileges and benefits like travel, job choice, and financial credit back to the state’s approval of his or her every expressed thought and action. Such control, this friend suggested, will make a social revolution impossible and will lock the ruling party’s power in place forever.

I don’t believe it. The idea that you can so completely control human beings—especially masses of them in a densely populated country—is a dream. It cannot work.

Consider Nazi Germany, which had an extremely effective and ruthless secret police, the Geheime Staatspolizei (Secret State Police) or Gestapo, and yet in 1944 a group of politicians and general officers were able to orchestrate a plot and plant a bomb at Hitler’s feet. The fact that the coup failed was an accident. The fact that it got so far is a demonstration of human resourcefulness.

If the postal service is monitored and subject to interdiction, if the internet and email are subject to electronic surveillance, then you resolve to meet face-to-face, work through trusted couriers, and write nothing down. There are ways to avoid the strictest surveillance, although you may have to learn the art of codes and ciphers, and use invisible inks and flash papers.

Even then, you may be hounded by police spies—but you will also have the opportunity to access disaffected and trustworthy spies within the police. The party and the police are both human organizations, which means that every member ultimately keeps his own counsel. Yes, the state’s bureaucracy will have many fanatics whose loyalties are switched on by party rhetoric and permanently locked in place by idealism and careerism. But you will also encounter many underlings who are stuck in their current job with no chance of promotion, who see opportunity in a change of regime. And for every dedicated officer willing to sacrifice for the state you will have one or more who may have been forced to watch and turn in his own brother or sister, parent or child, and now bitterly regrets it. People are not idealistic robots.

Even robots are not robots. Any electronic surveillance system, any system of social monitoring and scoring, eventually delivers its data and summaries to a human decision point. And the person watching the monitors or reading the electronic files is not going to be Heinrich Himmler or Lavrentiy Beria himself. The head of the police organization spends too much time “managing upward” to involve himself with the doings of average citizens. No, he delegates the watching and reporting downward. And his immediate deputies are not going to do the daily dirty work, either. Ultimately, the summarizing and reporting will be in the hands of police lieutenants in each prefecture—and these are the people who will have the potential for either promoting or ignoring a negative report or suspicious contact, for blinking at the television monitor, for corrupting the system because they were passed over for promotion or regret their part in the system.

This is not to say that such a system will not do a massive amount of harm to innocent and well-meaning people. Or that efforts to break it are not dangerous. Or that plots against the state will not be detected. The system of police surveillance and social credit will be marvelously effective at harassing and harrowing the lives of average citizens. But it is not foolproof, and it is not ironclad, because this mechanical system will be run and managed by human beings in the interests of a human-based organization.

And finally, China has for years been training computer hackers and cyber terrorists in order to launch them against the West. Such people are by nature curious, restless, and inventive. A computerized system of public surveillance and social credit becomes more vulnerable the more completely it automates. Some internal Chinese hacker—likely whole groups of them—will surely be at work somewhere getting themselves and their friends erased from the system and building up social credits they can distribute at will in a dark market. You know this is going to happen, again because the system is run by human beings.

The dystopian vision of 1984, where every room is blasted with patriotic speeches and music from unswitchable television screens that also watch every citizen at work and play … is a fiction. Behind that all-seeing eye is not a singular Big Brother but a million Little Brothers, variable human beings. And some of them will keep their own counsel. It’s not a benevolent system—far from it!—but it’s not infallible, either.

1. See Gutenberg and Automation from February 20, 2011.

Sunday, July 7, 2019

A Strange Esthetic

Red file folder

I have an aversion to the obvious. The world that is apparent to the senses at first look, the answer to any question that comes first to mind, the most widely accepted social and political views—all of these fall somewhere between ennui and ick! with me.1

I prefer the second glance, the deeper meaning, the hidden truth. And that’s if I’m feeling philosophical. In other areas—music, for instance—I like the strange chord progressions, the minors over the majors, the first and fourth over the first and third, and the transitions that feel just a little bit “off” and odd. In paintings, I prefer bold but unusual color combinations, hues, and shadings, or perspectives that are slightly skewed. In photography, I like shots taken from an angle or from noticeably above or below eye level.

So a guiding principle in my writing—my esthetic if you will—is to avoid the obvious. It’s easy enough to tell a story from one point of view, the first-person narrative. Or from the omniscient narrator, who observes and reports all the action at once, sampling the story from inside A’s point of view and then, in the next sentence, popping into B’s head to get the reaction to whatever A has done or said. It’s easy enough to set the story in familiar old Grover’s Corners—the setting of Thornton Wilder’s Our Town—or the anonymous American suburbs that Steven Spielberg mined so artfully for his characters in many of his early movies.

To me, that’s bland and boring. And it lacks style. I much prefer a definite setting with a few kinks and quirks and special needs, like the summertime resort of Amity Island in Spielberg’s Jaws, where everyone is dependent on tourist dollars and so has an economic as well as a visceral reason for hating and fearing the shark. I’m not looking for an Everytown as the place to tell a story, but a town with an edge and maybe a secret.

This is one of the reasons that I have settled on telling my novels through the tangled stories of multiple characters and tightly controlling their viewpoints. In this form, every scene is told from the viewpoint—that is, from inside the head, as if written in first person but with third-person pronouns—of a single character. The narration tells, and the reader knows, only what is available through that character’s senses and perceptions, intuition and insights, and knowledge of the story so far. If I want to show the reader the immediate reaction of another character to what the viewpoint character has said or done, that reaction must be discernible from an exclamation, facial expression, or other clue visible to the viewpoint character—and it will depend on the viewpoint character being the sort of person to notice the reactions of other people in the first place.

Limiting the story to the separate viewpoints of a cast of characters forces me as a writer to consider and choose. That narrowed viewpoint is like someone holding a flashlight in a darkened room. (I’ve used this analogy before.) The viewpoint character’s attention, vision, understanding, and reactions can focus on only one thing at a time. This is like stream-of-consciousness writing, but with the character able to reflect on, recall, and question what he or she is perceiving and doing.

And then I let the reader, who is riding along inside the viewpoint character’s head, have his or her own reactions to the world as the character sees it. For example, if the character sees but does not note or distinguish an obviously misplaced object or clue, the reader is tacitly invited to note it for him- or herself and thereby wonder about the perceptions, understanding, and even the intelligence of the viewpoint character.

This kind of limit on the scope of my writing—and these mind games I play with the reader—force me out of the obvious ways of telling a story. The story doesn’t start just anywhere but in a particular place and time, and with a particular viewpoint. And from there, I am using the perceptions of the viewpoint characters to make the setting unique. Not just a china cup but a china cup with a crack in the rim, or fading paint in its design, or the character’s memories of the cup once sitting in Grandma’s china cabinet. The world in this place is not obvious, not simple, not the expected. It’s a different world, filtered through the perceptions—and sometimes the misperceptions and misunderstandings—of a particular person.

By avoiding the obvious, by looking for the strange, the skewed, the particular, I am forced to make the novel’s setting and circumstances come alive in my imagination and in the reader’s mind. I give the world an element of surprise leading—sometimes but not always—to a consideration of what might be new and different this time.

Of course, there is a danger in taking this aversion to the obvious too far. Some combinations of musical notes are not mysterious but simply discordant. Some color combinations are not only surprising but clashing and garish. And some stories so violate the norms of sensibility and end up in such bad places that readers are not enticed and intrigued but simply repelled. So, as always, the dominant force in the storytelling—as in music and art—is the creator’s sense of control.

The author’s imagination—as with the composer’s ear and the painter’s eye—can run all over the place. The artist can reach for the weird simply in order to be weird. The intent can be to create the strange rather than the interesting. And sometimes, if the artist is in a bad mood, to create the repellent and offensive, to trick the reader into stepping into a metaphorical manure pile and then, presumably, to laugh as the reader vainly attempts to wipe his or her shoes.

So the aversion to the obvious requires an element of restraint. In every art form, there are reader/listener/viewer expectations that are shaped and honed by experience and catalogued for the artist in volumes concerning poetics, music theory, or art appreciation. Stories, for example, don’t always require happy endings,2 but they do have to end in a place and manner that explain the actions that have gone before and render a set of consequences that the reader finds intellectually and emotionally satisfying. To push the story in a direction or to a conclusion that avoids the obvious to the point of not making any kind of sense would be a mistake.

But, that said, I do spend a lot of time between the first impulses recorded in my outline and the final set of words on the page looking for images, responses, and story lines that rise above the obvious first take and arrive someplace unique, interesting, and sometimes even surprising.

1. In fact, an element of my somewhat strange and dry humor is to state the obvious with a degree of apparent boldness, as if I were offering a new insight, or else to state it absolutely deadpan. This usually gets me funny looks and explains why some people think I’m really kind of stupid.

2. For this, see my recent blog Classic Comedy from May 19, 2019.

Sunday, June 30, 2019

That Apple Story

Apple with bite

In the Bible, in Genesis 2:16-17, the Lord God warns Adam and Eve against eating the apple from the Tree of the Knowledge of Good and Evil. They do so anyway, because the Serpent—Satan in disguise—tempts Eve with flattery. And after she eats, she entices Adam to join her. This act is the one thing in all of their idyllic existence that God has forbidden them to do. It is the one fruit in the Garden, a place of plenty and pleasure, that is denied to them. And when they eat it, they are banished forever into a hard and desolate world to make their way amid toil and suffering.

For Christians and Jews, this is the “original sin.” It is the moment of mankind’s “fall from grace” and separation from God. This one ancient act removes all of us—even the most innocent, the newborn babies, the harmless imbeciles, and saints who strive to do good and create love and understanding all their lives—from God’s perfect love and understanding. This knowledge of good and evil is also, by the way, the major functional capability that separates us from the animals. This is what makes us human.

The wolf hunts the rabbit and eats it. The wolf is not mean-spirited or bad-tempered. The wolf does not hunt the rabbit because the wolf is evil. The wolf also knows nothing of love and mercy. It cannot refrain from hunting out of compassion for the rabbit’s meek little life. So the wolf also cannot become good in any moral sense. The wolf is merely fulfilling its nature as decreed by evolution or, if you will, as designed by God. The wolf does what it needs to survive in its environmental niche. If the wolf knows anything about good and evil, it is a simple equation: hunt and eat equals good; fail and starve equals bad. The wolf cannot even try to be good or sink to being bad. It can only be effective or ineffective in its ultimate purpose, which is hunting and eating rabbits and other small mammals.

Human beings have reasoning power. We also have refined emotional states. We can, we do, and we must think and feel about what we are doing. This is the cap that the prefrontal lobe—the center of the brain’s analytical, planning, directing, and decision-making capabilities—puts on our motor functions and our biological, endocrinological urges. We cannot help but think about ourselves and our actions—and ask questions about them. The content of those questions is not so much biologically based or inherited as culturally transmitted. For human beings are cultural and social beings, no matter whether we were developed that way by evolution or designed that way by God.

In one case, if your parents and your society laid emphasis on right and wrong, good and bad, you will likely ask if what you are doing or intending to do is moral or immoral in that context. But if your parents and society only laid emphasis on utility and efficiency,1 you would probably only ask if what you were doing or planning was going to work or not, to succeed or fail in your own interest, with less regard for whether you ought to do it or how it might affect others. But you would still ask yourself questions. You would still apply values, although biology and evolution cannot tell you what those values should be.

But later in the Bible story, in Genesis 3:22, God reveals his true purpose in denying humanity the apple. By knowing good and evil, Adam and Eve have become “as one of us”—that is, like God Himself and the higher order of angels. The human pair must be banished from the garden before they also eat from the Tree of Life and become immortal. Immortality plus ethical judgment apparently equals status as a rival godhead.

So the sin was not just disobedience—crossing a line that God had drawn around a certain tree in the garden. There are other sins of disobedience that humans have since repeatedly committed—particularly those carved in stone by the Moving Finger: taking other gods, worshipping idols, abusing the holy name, breaking the sabbath, disrespecting parents (the sins against reverence), as well as murder, adultery, theft, perjury, and envy (the sins against other people and property). None of these sins will damn a person or a race of beings for all time. And none is so grievous that it cannot be forgiven.

But the sin with the apple, coupled with the potential for sinning with the Tree of Life, was the one that could not be forgiven, ever. It amounted to these first two humans—creations of the Lord God—aspiring, whether knowingly or not, to become gods themselves. In doing so, Adam and Eve broke up the order of things. They stepped above their proper place. They were set to become uppity.

This sort of thinking, that human beings were supposed to fit into a divine order and had to know their place, smacks of the medieval faith in what was once called the “Great Chain of Being,” or in Latin, the Scala Naturae, the “Ladder of Nature.” It’s an idea that goes back to the Greeks, with Aristotle and Plato. In this hierarchy, all of creation fits into a stepwise progression from inanimate matter, to plants, to animals, to human beings as a special and favored kind of animal, to angels, to God himself.2 This was the way of the world. This was the fixed order of creation.

Stepping outside this order would tend to challenge and subvert the notion of Creation itself. In a theology that depends on a Creator God, and places that God at the apex of all things inanimate and animate—in the old sense of having an animus, or life-giving spirit—the created being cannot rise above the creator. Cause and effect just don’t work that way.

But in the world in which we actually find ourselves—and putting aside cultural myths and creation stories3—the top of this world’s hierarchical existence is human-scale intelligence. Other animals may be larger, stronger, or faster. Some may even have intelligence approaching our own—like dolphins, whales, and elephants. But none of them has yet invented writing or calculus, harnessed fire, or sent radio waves above the atmosphere. And that’s not for the lack of opposable thumbs—or the fact that dolphins and whales live in the sea, without access to fire and electricity—but the lack of understanding, planning, and imagination.

We human beings did not create ourselves, or not in the original biological form. That form with its massive brain evolved from the primates, who go back to ancestors among the apes, and from there back to the mammals, reptiles, amphibians, fish, etc. We did not do a whole lot of self-inventing for the hundred thousand years or so during which we hunted the slower animals, gathered the most edible berries, and experimented with fire when we found it resulting from a lightning strike. But once we settled down in the richest river valleys, discovered agriculture, and invented writing, our days as simply clever animals were over. We invented ourselves by capturing the thoughts of other people and previous generations and committing them to stone, clay, and papyrus in forms that people who had never met the original thinkers could interpret and understand. And from there, we went on to invent culture, science, and—yes—even religion, hierarchies, and notions of “divine right.”

Aside from our meaty parts, which evolution gave us, we are the animal that invented itself. We worked out notions of right and wrong, good and evil, utility and efficiency, and all the other values which we conjure and apply to the universe around us. We worked them out from observation, shared experience, and transmitted culture. Although still Homo sapiens in physical form, we are no longer our primitive, stone-chipping, berry-picking ancestors, any more than we are still H. neanderthalensis or A. afarensis. We have continued evolving in our minds, our culture, and our understanding.

We are the closest thing to gods that exists on this planet. And that will have to do until the Saucer People show up and can teach us something new.

1. You can imagine a society composed mainly of scientists and engineers—say, the Bene Tleilax of the Dune universe—who would place efficacy and utility as values above kindness and compassion. Their world might be mechanically perfect, but still it would be a hard place to live.

2. I suppose subdivisions are possible in this hierarchy, too. For example, a horse, for its general utility and gentle nature, would be ranked higher than the wild ass or zebra. And the more useful cow would outrank the moose and deer. If the Greeks and medieval Christians had known about protozoa, microbes, and germs, they would probably have classed them somewhere between dirt and plants, down there with the corals and sponges. It’s a game scholars can play endlessly.

3. And yes, I know: in this case, the Theory of Evolution and its great chain of ancestors, from the first self-replicating molecules, to RNA, to DNA, through the Cambrian explosion, to the armored fishes, the lobe-finned fishes, tetrapod amphibians, reptiles, and finally mammals to human beings, is just another creation story, revered with semi-mystical status.

Sunday, June 23, 2019

A Lot Like Madness

Midnight writer

Writing a book is a lot like madness. This applies equally to painting a giant canvas or composing a symphony—any major work that is unseen and unshared until all the effort has been expended and the thing is finished and ready for the world to read, see, or hear.

You labor with issues, detect and solve problems, and have daily bouts of triumph and despair about which your family and friends, the people who read your last book, viewed your last painting, or listened to your most recent symphony, and the general public know nothing.

And you know that much of what you have been living with for the past month, six months, a year, or more will never be known, never seen, by those same friends, readers, and the public. You alone will have any memory of your attempts to fit the working outline, or preliminary sketches, or evolving musical themes to the vision you had when you started, which was the whole reason you began this particular project. You alone will understand the compromises you had to make, the choices you considered and discarded, the opportunities that might have taken the project in another, more exciting direction but would have meant going back to the beginning.

For every finished work on the page, the canvas, or the score, there are half a dozen or more echoes and images in your mind of what you might have done, how it might have gone, the pieces that didn’t fit, and the pieces that would only fit after you changed them irrevocably.

Between the inspired first vision and the finished work there exists a rude, splintered, tentative thing that the writer calls an “outline” and the fresco painter used to call a “cartoon.” I don’t know what the composer calls this intermediate step—perhaps it’s a “theme” or a “melodic line.” It is the organic1 structure upon which the writer hangs incident, description, narrative, and dialogue. It moves from a beginning, which may occur almost anywhere in space and time, toward a definite and specific end that, when reached, the reader feels is both inevitable and satisfying. But the writer, the creator, knows nothing about inevitability and feels nothing of satisfaction. How it will all come together, viewed from the midpoint of creation, is still a mystery.2

While you, the writer/artist/composer, live with—more often wrestle with—this intermediate structure, you forget that none of it will ever be known to the reader, viewer, or listener. If the outline/sketch/theme serves its purpose, the finished work will be complete and stand alone as an organic whole. Alternative plot lines, overpainted details, and unsung melodies will not exist for the person who receives the work from your hand.

This is the source of what I call the “Frankenstein effect.” While the good doctor sees every mismatched part, every dropped stitch, and all the bits of skin he had to stretch to fit, the person who comes upon the finished production sees only the final effect, the monster. For the doctor, the body under construction remains a sad botch of might-have-beens. For the person facing it on the path, the monster is complete and terrifying—perhaps even perfectly so. And we must remember that, in Mary Shelley’s story, the good doctor is insane.

So … back to the question of madness.

In what occupation, other than the arts, does a person live so completely inside his or her own head? The surgeon is surrounded by a staff of other professionals—the anesthetist, surgical assistant, scrub nurses—who watch every cut, anticipate some of the surgeon’s moves, and are constantly aware of the progress of the surgery. The project manager or construction engineer works with a team of subcontractors, laborers, and logistics specialists, and a major part of the manager’s job is communicating the project plan and design so that each of them can understand and execute it. Even the orchestra conductor works through the various musicians and instrument sections, and when the conductor has a particular vision or novel interpretation of the score to execute, he or she must still communicate it to these performers, elicit their cooperation, and sometimes experience their negative feedback.

The writer, painter, or composer—especially when young or just starting out—works alone. If the artist is unsure of him- or herself, it is always possible to begin by copying and thereby studying the masters: those authors, painters, composers whose work first inspired him or her to take up this particular art form. But creating copies and rendering homage are not satisfying for the true artist, or not for long. The artist wants to tell his or her own stories, create a unique picture of the world, conjure up a theme or melody that no one else has ever heard before. A person without this personal vision or driving sense of individuality might then lapse into copying—but without the reverence due to a master—just whatever has become popular.3 There can also be money in simply going through the motions: copywriting, graphic design, elevator music. But it’s not art, not satisfying.

An experienced artist with one or more successes to his or her credit does not necessarily have to suffer the uncertainty of the beginner. After all, there are glowing reviews, public recognition, and actual royalties to confirm his or her talent and bolster the ego. But then, an artist can’t keep cranking out the same work, variations on a theme, over and over again.4 It may be what the public wants, but the job soon begins to look like a rut. And when the established artist attempts something new and different, then he or she is working alone again, and the perennial uncertainty resurfaces. Yes, you know how to write a mystery novel, a police procedural, or decent science fiction, but can you research and write historical fiction or undertake a literary novel? And suddenly you are wrestling again with the question of your own talent.

Madness is a private thing. No one outside your head knows what’s going on in there. And when you try to tell people, you get funny looks or stares of disbelief.5 So maybe your thoughts really are disjointed, delusional, and dissociated from the world of rational men and women. Maybe you’re not at all a writer or any kind of “artist.” As a writer with a desire to tell a new story, a painter trying to capture a new vision, or a musician trying to articulate an unheard theme—and not sure you even have the talent to pull it off, let alone create something that will become publishable, saleable, eventually famous, and perhaps even profitable—you are halfway to the state of the schizophrenic patient who hears voices, battles with strange ideas, and struggles to sort out the affairs of daily living from the madness going on inside his or her head.

The only thing worse than this is not doing it at all. To turn your back on the whole artistic process. To ignore the random ideas that come in the night, or while you’re doing something else in daylight, and not stop to write them down. To let die the whispers from your subconscious that have finally resolved the problem with your next chapter, and remain steadfastly incurious about whether the solution will work or not. To stop being this unique, special, creative, frustrated person, who every day owes a debt to the muse, is responsible for the assignments set out in the outline, and is dedicated to producing the book a year you promised yourself. Then you would become just another person in the world, who gets up in the morning, goes to the paying “day job,” comes home at night to drink or watch television, and simply … exists until it’s time to die.

And that’s worse than being just a little mad.

1. “Organic” is Aristotle’s word, from his Poetics. It implies the various parts of a work growing together and functioning as a unified whole, like the parts of a plant or animal. In terms of the reader’s perception, this organic arrangement is mentally and emotionally satisfying because the interrelation of the parts is mutually supporting. For example, Julia is in love with David, but David’s lack of returned feeling makes Julia sad and angry; vengeance and murder ensue. The work’s every part relates to support the outcome. Or as Anton Chekhov said somewhere, if you’re going to show a gun in Act One, you had better fire it in Act Two or Three.

2. Any community or group of writers always discusses “the O-word,” as if the outline had strange powers and attracted superstitious awe. The alternative to developing an outline before sitting down to write the actual book is sometimes called going by the “seat of your pants,” or “pantsing.” For me, the outline is the once-through of the story at the 10,000-foot level. It ensures that I have a complete story in hand when I start to write.
    Sometimes the outline is complete in some places, fragmentary in others, but still it provides a structure, however tenuous, between start and finish. If I don’t have at least a partial outline, the problem is not that I cannot write. Without the promise of a destination, I can write pages and pages of useless description and idle dialogue between characters in search of meaning and direction. And that’s no good to anybody—not even to me, who cannot hope that a plot will coalesce out of all this chatter, like dew on the morning grass.
    My deal with myself is that the outline is there as an assignment, a starting point, and a bridge to the next time I sit down to write. My bet with myself is that, prompted by the outline—especially if the day’s assignment is not a particularly strong scene or chapter—then my subconscious will click in during the writing process and come up with something better … perhaps more exciting … maybe even superb. And so far, the little demons at work down there have always come through on the bet.

3. When I was casting about for an agent, I heard several variations on the pitch to write what I would call a “coattail” novel, to cash in on the popularity of, and public hunger for, the current bestseller. When Harry Potter was new and fresh, the thought of many envious agents and publishers was to get their untried authors to write about “a boy wizard with glasses” in a school for witches and wizards. Before that, it would have been some kind of epic about magic rings in the middle of an elf war. And later, it would be a naïve young woman in the grip of a billionaire sadist. This is always a trap, because by the time you finish and try to sell your copycat novel, the public taste will have moved on to something else newer and fresher.

4. Think of writing the thirteenth James Bond novel or the seventh Harry Potter novel. Sooner or later—unless your plots follow a greater story arc than can be told in just one novel—it all becomes repetitious and unsatisfying. I think Stephen King expressed this frustration best with his novel Misery.

5. My wife read and generally liked my early novels—up until Crygender, which I wrote on assignment from my publisher, and she found the story disgusting. We never really discussed my works in progress, because I had learned long before that you can’t talk out a story with other people: it just makes the ideas become set in place, go cold, and disappear. But one night she asked me what I was thinking about, and I described the plot point I was wrestling with. She later told me that she had thought she might become my inspiration and have a dandy solution for me, but on the spot she could think of nothing. She left me alone with my thoughts after that.

Sunday, June 16, 2019

A Preference for Weakness

Food allergy

It appears that no one wants to appear to be—and act as if they were—strong anymore. Has our culture so decayed that we actually admire—and aspire to be—weak, defenseless, vulnerable, a victim?

Item one is the sudden outbreak of food allergies among the middle class. Yes, some people have acquired allergies that may have something to do with plastics or hormones in the water. Yes, some food allergies are so severe that the person can go into anaphylactic shock and die. But that is not the phenomenon under discussion here. It seems that suddenly everyone is allergic to, or sensitive to, or just afraid of, the protein in wheat known as gluten. People with certain identified autoimmune disorders, such as celiac disease, do get sick when they ingest gluten. And people whose traditional diets did not include wheat, such as those who come from areas dependent on rice cultivation in Asia, may experience a reaction to gluten. But that is not what I’m talking about.

Far more people with a Northern European background than just a few years ago seem to be claiming gluten sensitivity, almost as if it were fashionable to throw up your hands at bread and pasta. Couple this with our wandering dietary pronouncements—meat’s bad, butter’s bad, cholesterol’s bad, butter is better than margarine, fish are good, fish are bad, carbohydrates are energy, carbohydrates are bad, avocados are bad—and you get a population that is suddenly, exquisitely picky about what it eats, without anyone being able to explain adequately why. That’s a deplorable state for a human digestive system that was tempered and hardened by a hundred thousand years of hunter-gatherer existence and can usually bolt down and profit from almost anything organic except tree bark and dead leaves.

Item two is the sudden outbreak of reactive offense. Yes, some people say and do truly mean, stupid, hurtful things, and they should be either quietly confronted or politely ignored. And yes, sometimes people intend those things to be offensive in order to get a reaction from the public at large. But it seems that suddenly everyone is incapable of taking the polite route and refraining from noticing or reacting to the rudely offensive. Now everyone almost seems to hunger for opportunities for taking offense. To quote from one of my favorite movies, The Duellists: “The duelist demands satisfaction. Honor for him is an appetite.” And so, it would seem, for many people today the satisfaction of reacting with horror, scorn, and outrage at remarks and gestures, whether meant offensively or not, has become an appetite.

In my view—and in the view of my parents, who raised me in their beliefs—to give in to such failings, to demonstrate such vulnerability, is a weakness. They held that succumbing to precocious food sensitivities and minor discomforts was to make yourself vulnerable to the world, and that reacting with offense at every slight and injury was to allow yourself and your emotions to be played upon by potentially hostile forces. They believed in strength and wanted their sons to be strong.

As a child, I suffered from remarkably few allergies but had a bad reaction to mosquito bites. Let one bite me on the wrist, and my lower arm would swell until my hand was fixed in position. If one bit me on the cheek, that side of my face would swell. For a boy growing up among the wetlands surrounding suburban Boston, mosquitos in summer were an inevitability. My mother sympathized with my condition, but she didn’t agonize about it. I never was taken to the emergency room, and no one suggested any medications to counter the allergy. My parents believed I would grow out of the affliction, and I did.

My parents tolerated few childish food dislikes. My brother and I had to eat what was put on our plates, like it or not—and we mostly liked it, because my mother was a good cook. I had one real aversion, to cheese. I suppose that, in later life, I could excuse this as my sensing that cheese was “rotted milk,” but as a child I just hated the taste, smell, and texture of the stuff. It wasn’t until pizza became popular that I would eat even the mildest of provolones and mozzarellas, and then never just as a chunk of the stuff on a cracker or melted into a sauce. My father, being a cheese lover, disdained my aversion and tried to beat it out of me as an example of childish attitude. My mother, being of a kinder heart, would make separate portions without cheese for me when preparing cheese-heavy dishes. But she still considered my aversion a sign of personal weakness.

The Protestant ethic was strong in my family. You were supposed to work on yourself, learn as much as you could, eradicate your failings, take up your responsibilities, be dependable and loyal, work hard at whatever you undertook, and be ready to account for your actions. Claiming extenuating circumstances when you failed at something was just not allowed: a properly prepared mind should have foreseen those circumstances and worked to overcome them. Claiming to be a victim of other people’s actions, whether intentional or not, was unacceptable. We were the people who paid our own way and made our own fate. We helped others along as we could, but we did not succumb to their malice and their schemes, if any. We anticipated traps, spotted them in time, and side-stepped them. We, and not anyone else, were in control of our own lives.

In another of my favorite stories, Dune by Frank Herbert, the female school of political agents and manipulators of bloodlines, the Bene Gesserit, had as an axiom “Support strength.” I do not take this as some kind of class statement, such as favoring the oppressor over the oppressed. Instead, it means finding what is strong in each person and developing it, helping to create people who are capable, self-aware, resilient, brave, and strong. It is an attitude of which my parents would have approved.

The current preference for weakness in our popular culture—expressed in accepting every allergy and food phobia as a sign of personal sensitivity, and accepting every cause for offense as a sign of spiritual purity—is a dead end. It creates people who are incapable, self-serving, brittle, scared, and weak. This is not a people who will take up arms against an invader, or volunteer to land on the Moon or Mars, or do the hundred other daring and dangerous things that previous generations were asked to do and did without a whimper.

But we may not be that nation anymore.

Sunday, June 9, 2019

On Rational Suicide

Line of suicides

The human mind is a mechanism of many moving parts. The human brain has several major components and many subsystems or neural circuits, all built up over millions of years of evolution, based on the rudimentary brains of the early fishes. In the most fortunate of us, the mind can be a fine-tuned instrument, capable of analyzing the mathematics of splitting the atom or taking the vision of a sunset and reimagining it in words or music. For many of us, the brain is a stable platform that enables us to live happily, deal with setbacks and anxieties, and savor moments of pleasure and triumph. But for some of us, the mechanism goes out of whack, the parts don’t cooperate, and the result is a mental life full of misperceptions, missed cues, and chaos.

And then there is the issue of suicide. Most people would like to consider suicide as a form of mental illness, a breakdown in the brain’s systems. For them, it is something that might happen to other people, like clinical depression or schizophrenia, but nothing that might be waiting around the next turn of events to grab them, spin them on their head, and kill them. For these people, suicide is never a rational act, never what a sane or well-balanced person does.

This view is reinforced in our society and in Western Civilization with cultural shaming—that the suicidal person was not strong enough or did not consider his or her own value or responsibility to family, friends, and community—and with religious prohibition—that the act of suicide, self-murder, is contrary to God’s law and a usurpation of God’s will, which alone tells us the direction that our lives will take and the point at which they will end. But these cultural arguments and prohibitions against suicide are not universal. For example, in feudal Japan—at least among the samurai order—ritual suicide was linked to notions of obedience and personal honor. A samurai who failed his lord in any significant way could only atone by taking his own life, and this was considered proper and good.

In my view, which is based on notions of evolutionary biology, suicide is the unfortunate triumph of the prefrontal cortex—the center of thinking, organizing, projecting, and deciding, as well as expecting, hoping, and dreaming—and its supporting circuitry in the hippocampus and amygdala, which coordinate memory and emotion, over the hindbrain—particularly the medulla, center of autonomic functions like breathing, heartbeat and circulation, swallowing, and digestion—and the reticular activating system in the brainstem that coordinates consciousness. Suicide is the temporary—although ultimately final—triumph of human thinking over the brute, animal functions of the body.

Suicide is the collision between the brain mechanism that creates expectations, performs planning, perceives hope, responds to dreams, and otherwise drives the mind and the soul forward into the future, and what the mind perceives as reality: the end of hopes, plans, expectations, or whatever was driving it forward. When that forward-thinking part of the mind gives up and dies—when it contemplates all the damage and despair that continued living might cause, and what going forward without hope or dreams might mean—then the rational mind can overcome the brain mechanisms that keep the body breathing, eating, and living. The decision to harm the body, whether by self-starvation, asphyxiation, overmedication, cutting of blood vessels, or other ways of knowingly causing irreparable and permanent damage, is the final defeat of those systems that keep the organic parts functioning and moving forward blindly in time.

And again, there are cultural traditions and physical mechanisms that work to delay or even nullify this collision between the mind and the body.

On the cultural level, good parenting will teach a child that, first, we don’t always get what we want or deserve in life, and that disappointment and denial are a part of the human condition. Common sense will teach a person eventually that, while dreams and expectations are nice, they are not the basis of reality, that a person must work for what he or she wants, sometimes sacrificing dreams for mere survival, and that plans sometimes go awry. Wise teachers and friends will tell a person that hope does not live in one shape or future but is a free-floating quality, and that a person can find strength in the smallest moments of peace, visions of beauty, or gestures of good will. And many religions, especially the Zen parables, teach that expectations and assumptions are phantasms of the mind and not a reflection of reality.

On the physical level, most of the methods for ending a life—like starvation, asphyxiation, and cutting—are painful and create their own immediate need for the forward-thinking mind to stop, reconsider its decision, and actively reverse course. Other methods—like a drug overdose—take time to create their effect, and then the busy mind focused on the blankness of the future may unmake its decision to choose death and so choose medical help. Only the methods which put in train an immediate and irreversible course of events—like jumping out of a high window or pulling the trigger on a gun aimed at the head—offer no immediate pain and allow for no useful second thoughts.

Why am I going on about suicide like this? First, as an act of individual courage—and in defiance of most social taboos—I long ago, even as a child, decided that no thought would be unthinkable for me. If it can be imagined, it can be explored rationally and soberly, as a fit subject for the human mind. Second, there have been times in my own life—not often and not recently—when I have felt the pressure of lost hope, of a great, gray blankness lying ahead with nothing to counter it, no expectation to fill it, and no reason to avoid it.

And then my dear wife, my partner in forty-one years of marriage, died a year and nine months ago. When you lose someone with whom you have shared the majority of your life, aside from the immediate grief, you also have moments of simple emptiness. Everything the two of you did together—the events, memories, pleasures, and inside jokes you shared—are irretrievably gone. The daily routines you built up in living together are now meaningless. The habits you yourself curtailed and the actions you avoided because you knew she did not like them are now empty gestures. And when so much of your life has been taken away, you can sense that great, gray void in the years ahead. In many respects, you become a ghost yourself.1

There have been times since her death when I thought I would simply “go under.” That it would not matter if my hindbrain stopped the automatic functions of pulling on my lungs, beating my heart, and cuing my desire to eat, and that I would just fade away. Now, this is not a warning to my family and friends. I am not going to follow any of this up with positive or passive action, because I share those same cultural traditions and physical mechanisms designed to prevent suicide. Or not, that is, until my own brain is so far gone with dementia that I become disabled, unable to care for myself, and so mentally isolated that I cannot move forward. And that’s a vow I made long ago.

But what I am trying to say is that I can understand the impulse toward suicide. It is part of the human condition. It comes about when an animal brain grows so advanced, so complicated, and so beholden to its own vision and hope for the future that the denial of that vision and hope leads to irremediable despair with no alternative.

Suicide is not an irrational act. Its very possibility is part of what makes us human and not animal.

1. Some days, as I move around our old apartment, I have the sense that perhaps I have entered an alternate reality: that I was the one who died and now drift through my daily routines as a ghost trapped in this empty place, while she lives on, somewhere else, in a future filled with hopes, plans, and expectations.