Sunday, February 19, 2017

Something Outside Yourself

The other night I was watching a movie about an unhappy woman, actually three unhappy women. It doesn’t matter which movie it was, because it seems that these days all the movies and books are about unhappy people—except for science fiction, where people are usually running or fighting for their lives, and movies based on comic books, where the main characters are superheroes with no personal problems at all who are fighting each other.1

Anyway, at one point in the movie I said to myself, “This woman needs to go write a book.” Not, mind you, a book about her unhappy life, her frustrations, her poor choices, and her existential ennui. God, no! But a book about something other than and outside of herself, like the conflicts and eccentricities of the Tudor dynasty, or the life cycle of the sand flea, or the probability of finding life on Mars. “Go write a book”—that’s my solution to life’s problems and the greater issue of life’s existential meaning.

This is because, since the age of sixteen or maybe even a few years before that, I have felt the psychological pressure to write novels. No, rather, I have accepted the psychological burden of knowing that my purpose in life was to line bookshelves, to tell stories in the 60,000-word range, to invent unreal people in unreal situations and then observe and report on them. Whatever you want to call this activity, it has been a part of my function in life, one that I have totally accepted and left unquestioned and unexamined. My daily business for as long as I can remember has been to eat and sleep, bathe and groom, exercise for health reasons,2 attend to the paying job and all its obligations,3 take care of the family, read for pleasure, do household chores, walk the dog, plus think about the book currently in hand and make time to sit down and push the page marker forward.

My books don’t have a message; I’m not promoting a political or economic agenda. They are not a reflection of my life; I’m not trying to tell my story or explain my point of view. Other than the fact that the plots and characters come out of my own thoughts and experiences, and so must reflect something of the way I think and feel, these are people and situations meant to stand on their own. The books come through me, like light through a prism. I have no sense of actually originating them, and when they are finished, I feel no special glow of recognition for them.

Whether these books are good or bad is really not my concern.4 I work on the story and the text until they are as good as I can make them. When I’m done, the book is as good as it can be. But whether other people will find it good, or interesting, or worth reading … that’s not something I can control. Whether the book will sell and make money is a matter of speculation, not purpose. If an agent or editor told me I would sell x percent more copies or make y percent more money by writing in a different way, or in another genre, or taking the story in a different direction5 … that’s not something I can really do. The stories are coming through me, originating outside my conscious control—if anywhere, from the depths of my subconscious—and I’m not the master of that process.

I did not intend this meditation to be an explanation of my writing process, other than to show it is something outside myself. The act of writing and completing a book exists, for me, as something more important and more meaningful than what I am thinking or how I am feeling at any one time. If I am feeling lazy, bored, or sad at the current moment, I cannot invigorate myself or cheer myself up by turning to the writing—not unless I already have an outline in hand and the story is active in my brain, insisting that it be written. And when the story is “hot” and ready to be told, it doesn’t matter if I’m sad or tired. I must drop what I’m doing, go to the keyboard, and start telling it. All I know is that, at the end of the day, if I have solved a knotty plot problem with a solid patch of outline, or written a scene that worked and advanced the marker—or, when I was at the paying job, had just finished writing a long and technically important article or procedure—then I know a sense of peace. For that day, at least, I have fulfilled my purpose. The internal word tank is empty, and a new idea has been made real on the screen, in the file, eventually to appear on another screen or on paper for someone else to read.

When I tell the unhappy women in that movie they should “go write a book,” what I mean is they should find something bigger, more important, more involving than their own lives, something outside of themselves which demands attention and offers fulfillment. They should find that thing and surrender to it, dedicate their lives to it, leaving it unquestioned and unexamined. I say this not because they are women—oh, no!—but because they are human beings. For some women, taking care of their children and their families is a greater purpose in life. Just as, for some men and women, building a professional career and making money is that greater purpose. But if someone can take up those burdens and still feel angst and ennui, then I would place children or career among those basic life tasks along with exercising, doing household chores, and walking the dog.

For Michelangelo, the greater purpose was the Sistine Chapel ceiling and, also, releasing heroic figures trapped in marble. For Mother Teresa, it was helping the poor of Calcutta. For some people, it’s volunteering in the community, taking up a political cause, acting in amateur theatricals, or making music. For some, it is becoming a good soldier, or a surgeon, or the Wichita lineman. The point is, the undertaking must be active, not passive, and the effort must be all-consuming, not a pastime. It must be the thing that, when you are doing it, you live more brilliantly; when you are unable to do it, or prevented from doing it, you die by inches. For some people, it is enough to ask God to reveal their place in His plan—but for me, that’s like waiting to conceive the plot and characters of the next book, which will take me through the next twelve to eighteen months to complete.

The tragedy of the human condition is that life on Earth has no particular purpose. Your father’s sperm met and fertilized your mother’s egg, and so you are here. At a greater remove, DNA acquired the ability to encode and retain certain protein sequences—or, if it misplaced them, to invent new ones—and so life appeared. For all plants and the vast majority of animals, this is enough. Eat and sleep, groom, fight, and reproduce are all the purpose their lives need; the animal’s occupation and its goal are simply survival. But for human beings—and maybe for dolphins, whales, and elephants, too—with our larger and more complex brains that engender self-awareness and raise questions, mere survival is not enough. Because we conceive of ourselves as unique beings, and not replications of a species type, we seek for ourselves a unique purpose, something of our own to do, greater than survival.

Life itself, mere existence, will not give you this purpose. It is our greatest freedom—and life’s great trap—that each person must choose and decide for him- or herself what purpose, what destiny, this life will fulfill. It does not have to change the world or leave a monument to which others will look up and offer praise. It does not even have to be something that others will understand. And it may not be something that you have bravely and consciously chosen for yourself. My focus on books came, I think, from my grandfather’s love of reading and collecting certain authors. And then, somewhere in there, someone else in the family—perhaps my father—might have praised the works of a particular author. And I do remember reading, at a young age, that Nathaniel Hawthorne once told his mother he did not think he would be much good at the law or medicine, but what if he could give her a nice little shelf of books to read?

In this way, grandparents, parents, families, and teachers shape and point the earliest ideas of young children. They place the notions and start them spinning to become lifelong passions. And those of us who grow up to follow those passions risk everything—love, health, sanity, and even life itself—to fulfill them. And we are thereby fulfilled.

1. All right, the movie I was watching was The Girl on the Train. It’s about three women immersed in their own problems, which they eventually find out are the same problem. Plus, of course, booze is involved.

2. Because I do karate katas as a form of aerobic exercise, the workout also helps maintain my balance, coordination, flexibility, and psychological preparedness against attack.

3. Or this was part of my life until I retired. Now the book writing is my day job. It doesn’t pay much, but that’s not the point.

4. Here I am reminded of the line in C. S. Lewis’s The Screwtape Letters: “A man is not usually called upon to have an opinion of his own talents at all, since he can very well go on improving them to the best of his ability without deciding on his own precise niche in the temple of Fame.” I find that thought comforting.

5. For a while, at Baen Books, I wrote collaborations at the direction of the publisher. That is, I accepted an outline and partial notes—in whatever state of preparation—from a senior author and sat down to write the book; then the senior author would get a chance to read and correct it. I found this process tremendously instructive, because it gave me insight into how other authors think and prepare. These weren’t my books, of course, but like the books I wanted to write, they came from somewhere outside myself, and I could do the work. And no, they didn’t make much more money than the novels that came out of my own head.

Sunday, February 12, 2017

The Practical Kōan

It is a common thought that all religions are based on some acceptance of the supernatural and the mystical as the basis of belief. God or gods, angels or other divine beings, the environs of the afterlife encompassing either eternal ecstasy or unrelenting torment, and the minds of the priests or shamans who are in touch with these things are all supposed to be matters beyond or outside of the real, mundane, everyday aspects of human life. To be religious is thought to be entering another plane which does not touch the world as the average person perceives it.

This may be so for mystery, just-so religions like Christianity, Hinduism, or Islam. But from my own study, I have found Buddhism—that is, the mind practices of Hinayana Buddhism, as originally taught by Gautama, rather than the social worship of Mahayana Buddhism, as practiced in most countries today1—to be eminently practical and not at all supernatural. True, some of the sutras, the dialogues of the Buddha, can be obscure and flowery, but the teaching itself rests on common sense and good psychology.

One of the supposedly mystical aspects of the Zen Buddhist tradition is the kōan—the questions, puzzles, and riddles used in meditation. “What is the sound of one hand clapping?” “What is your original face before your mother and father were born?” These ask you to explore the concepts of duality and oneness: it takes two hands to make a clapping sound; it takes a father and a mother to make a baby.2

The purpose of the kōan—as nearly as I understand it, and one must always qualify one’s understanding of Zen with humility—is to interrupt the normal, everyday chatter of the active mind, which keeps throwing out words and sequences, images, memories, and logical interpretations of everything that is passing through the forefront of our brains. The kōan is meant as a logical impossibility, an irresolvable question, and so an end of logic and a stilling of the active mind. Once the little squirrel inside the wheel stops its endless spinning, the mind can be at peace and open to new ways of seeing and interpreting the world.

Unlike the catechism, scripture, or doctrine passed along in other religions, Zen has no object, no direction, no predictable endpoint, and no core teaching that can be put succinctly into words and formulas. It is not about turning the student into something, but turning him or her out of old habits and everyday perceptions. No system of thought or doctrine can, by itself, create a clear and settled mind. You must do that for yourself, because your mind, your mental habits and thought patterns, are unique. You alone know best about your own mind and its workings.

Some commentaries suggest that the purpose of these kōans about clapping hands and faceless babies is to force the meditating student to confront—and become one with—the nonduality of subject and object. In this sense, Zen might almost be the spiritual precursor to quantum theory, because it turns observation and action on their heads. It teaches—that is, one of its core ideas postulates—that the observer and the subject under observation are not separate. “Look at the flower, and the flower also looks.” Or, in the quantum analogy, when you detect a particle, the act of detection also changes the particle’s momentum and direction. So you can know precisely where the particle is, or precisely where it is going, but not both at once.

Like quantum mechanics, Zen is usually repugnant to the everyday mind. In our normal state, into which we are essentially born and which becomes confirmed by our every social interaction, most of us assume that what goes on inside our heads—my thoughts, my perceptions, my values, my life—is separate and distinct from whatever is going on outside in “the world.” There is the existence of I-me-my and the existence of you-they-them-it, and the two are separate. In the greater view, the perspective that resides somewhere above our skulls and—for some people at least—might be thought of as the god’s-eye view, there is no I-or-you, no I-or-they, but only one system of reciprocating interaction, cause and effect, playing out eternally.

When you become attuned to this level of thinking, then you can attempt to still your mind, close down those actions and reactions, and “disappear from the play” or “go out like a candle”—which was the whole point of nirvana. Not a state of unending bliss, but a final rest from the oppression of being and becoming. To enter nirvana is to achieve stillness, a release from waking each day—or, in the Hindu tradition of reincarnation, from returning after death to another new life—and from climbing the same hill, finding your way anew, struggling to make your life function, reacting to pressures and influences, igniting more pressures and influences, and remaining caught in the middle, like a fly struggling in a web.3

Much of the original Buddhist teaching is filled with poetic language that I find simply impenetrable. But the Zen variant is often clear and practical. The stories are meant to impart a simple understanding.

For example, in “A Cup of Tea,” a university professor visits a Zen master. The master pours him a cup of tea, and when the level reaches the brim, the master keeps pouring. The professor watches and finally says the cup is full; no more will go in. The master replies that the professor’s mind, like the cup, is full of opinions and ideas. He cannot show the professor Zen unless the man first empties his cup.

In “Eating the Blame,” the cook at a Zen monastery is late in preparing the evening meal. He rushes into the garden and begins hurriedly chopping up greens for the monks’ soup. Unfortunately, in the process he collects and chops up a small snake. When the soup is served, the abbot pulls from his bowl the snake’s head. “What is this?” he asks in astonishment, because the monks are supposed to be strict vegetarians. “Oh, thank you, Master!” the cook exclaims and pops the snake head into his mouth.

In “Black-Nosed Buddha,” a young nun has a statue of the Buddha covered in gold leaf. She enters a shrine where there are many Buddha statues, each with incense burning before it. She wants to burn incense, too, but for her own statue and not share it with any other. She makes a funnel to guide the perfumed smoke to her Buddha’s nose. The smoke blackens the nose and makes the golden statue ugly.

These are not specifically riddles but stories about relationships and perspectives—being open to ideas, accepting blame in an intolerable situation, sharing blessings with others—that are the components of a magnanimous, or “great-souled,” existence. They are good psychology rather than any kind of mystical doctrine.4

For me, that is the attraction of Buddhism, in its original variant, over other religions. It does not require belief. Instead, it asks for patience and acceptance of principles that anyone can understand.

1. Hinayana translates as “Lesser Vehicle” and refers to the strict moral and mentally purifying practices taught in the earliest form of the religion, by which a person achieves release from the wheel of rebirth, quiets the endless give and take of personal karma, and so prepares him- or herself to enter the condition of nirvana. And that does sound pretty mystical, doesn’t it? Mahayana translates as “Greater Vehicle” and refers to the later definitions of the religion, where advanced souls, the bodhisattvas, can be prayed to and will intercede for and share their good karma with less advanced souls in order to hurry them along the route to salvation.

2. Clearly, the riddle is not meant to be something that only a clever mind can answer. “One hand clapping” is not the sound of snapping fingers. Nor is it an invitation to indiscriminately approach the experience of reality, so that the “clapping” is something universal or whatever you happen to be hearing at the moment. If the answer were that simple, it could be published in a book of proverbs or jokes with the punchlines spelled out and explained.

3. It is ironic that the Hindu idea of reincarnation—that what happens after you die is not an actual ending, a fearsome death, but a return to life in a new body—has generated a sense of ennui and discontent. It is an oppressive vision of eternal climbing and slipping back, life after life, with no possible escape. Even the thought of sitting on a cloud in Heaven, playing the harp and singing with the angels … forever can be oppressive.

4. For more such stories and perspectives, see Zen Flesh, Zen Bones by Paul Reps and The Buddhist Teaching of Totality: The Philosophy of Hwa Yen Buddhism by Garma C. C. Chang. The latter was one of the first books I edited—and really very lightly—while working at the Pennsylvania State University Press.

Sunday, February 5, 2017

Afflicting the Audience

It was the journalist Finley Peter Dunne—writing in the character of “Mr. Dooley”—who coined the original of the saying that journalism’s job was to “afflict the comfortable and comfort the afflicted.” This has since become one of the rallying cries of the liberal/progressive mindset. I first heard it attached to the art of storytelling from a fan in the audience of a panel I attended in the early 1990s at OryCon, the Oregon Science Fiction Convention held annually in Portland.1

This artistic doctrine seems by now to have thoroughly caught on with a majority of novelists, poets, painters, movie producers, and directors. In practice, the doctrine demands that one of two interpretations or results be associated with any creative work.

The first result is that the only acceptable subjects of a story or expository documentary would seem to be people in the bottom twentieth percentile of society economically and preferably also persons of color, either female, homoerotic, or transgender, who are conspicuously without political power, obvious talents, or personal attainments. The subject—normally I would say “hero” or “heroine,” but that would imply some dominant positive characteristic and a position of responsibility—does not succeed in the story by virtue of personal strength or through the application of any special wit, talent, or insight, because that would suggest an access to powers or characteristics lacking in others inhabiting similar situations. Instead, the subject must endure in his or her negative situation and arrive at whatever success the story allows either through dogged persistence or with the benevolent assistance of some community element or social group. The subject of the story is not supposed to be unique or elevated in any way, as that would suggest he or she was part of an elite rather than a random member of the mass society. Stories told in this vein are intended to comfort the afflicted by never suggesting that their negative situation and current humiliation are a reflection on their lack of education or acquired skills, their diminished initiative, or some poor decision-making. No one is to blame for their failure except society at large or the establishment hierarchy in particular.

The second result is that any story or documentary which does not focus on a member of the afflicted underclass or someone in a minority and oppressed condition is required to be couched in the most ugly, hateful, bizarre, or nonsensical terms. People gifted with strength, wit, talent, wealth, or a history of attainment must also harbor psychological deficits, shameful secrets, or criminal backgrounds. Their success must be based on unethical activity. Such people must be depicted as cold and brutal, or unfeeling and yet unhappy, or obscenely pleased with themselves in their allegiance to a corrupt world order. If they have wealth, it must be unable to yield them either pleasure or security and must also be displayed in the most crudely frivolous fashion. If they have talent, it must have been magically acquired without the application of discipline and hard work. If they have intelligence and cleverness, it must be free of any association with study, education, or imagination. If the subject starts out normal—that is, possessed of some education, modest talents, capable attitudes, and a degree of innate goodness, like most people on the planet—then he or she must be subjected to degradations and humiliations at the whim of wealthy psychopaths that put the subject in the proper—that is, afflicted—state of mind. Stories in this vein are meant to ridicule the comfortable and remind them that their world and their attitudes are hollow, corrupt, mean-spirited, and unworthy of decent human beings. No one is innocent of wrongdoing.2

Do I exaggerate? Oh, a bit. And for effect. But still, it seems that too many books, plays, and movies these days are full of wretched and unhappy people, struggling against conditions they cannot master, living hollow and unfulfilled lives, and ending up badly.

Now, I am not saying I want only stories that are peaceful and serene, full of happiness and cheer. That would be the best of all possible worlds as imagined by a Candide or Pollyanna—and no more real than the depressed and desperate lives depicted in current popular culture.

True and meaningful stories involve conflict and loss. The main characters—even the heroes and heroines—must struggle against opposition that tests their resolve, their mettle, their wits, their hard-won talents, and their humanity. But the characters must start with at least some of these positive graces. The conflicts must be resolvable, rather than engrained in the hostile injustice of an uncaring universe. And the loss must be retrievable, or at least capable of being ameliorated, replaced with something better and more lasting, or accepted with a renewed and refreshed spirit. Finally, the characters must learn and grow, develop emotionally and spiritually, and come out at the end of the story in a new, better, more complete, or more resilient psychological space.

As a novelist, I reject the doctrine of celebrating the afflicted and denigrating the comfortable. Of course, I’m going to favor characters who start with at least some self-awareness, which includes their knowledge—and the reader’s—about their own faults and deficits as well as their talents and strengths. And although the characters may start in a comfortable position, or at least with their boat on an even if unsteady keel, it’s my job as a storyteller to pull away the cushions, rock the boat, and toss the character out on a hard shore. This is how personal character and resolve are tested, talents revealed, and native wit and resourcefulness demonstrated.

My business is to tell an entertaining story that reflects a certain kind of life: the kind that my readers will find interesting, instructive, or enlightening. Since most of my readers are at least moderately well-educated and possess the ambition and energy to set themselves up in a position where they have the money to buy and the time to read books, I am writing for a certain kind of person.3 Add to that my own interest in complex situations; my fascination with technology and machinery; my occasional bafflement at the mechanics of personal relationships, puzzles, and politics; and my drive to follow the logical consequences of actions in the past which promise to affect the future—and you have a very particular kind of writer who hopes to attract a like-minded reader.

In order to “engage”—to use that literary word—and satisfy these readers, I create and set in motion a certain kind of character: skilled, resourceful, self-reliant, wary, resilient, and tough. Whether my lead character is a man or a woman—and I am comfortable impersonating either gender4—the character must be ready to cope with problems, struggle against adversity, weigh resources and take chances, risk death and dismemberment, keep moving forward, and not complain about the cruelties of fate, the tactics of the opposing camp, or the burdens of personal weakness. In short, none of my characters—or my readers, who I hope would want to be like them—has any notion of being either smugly comfortable or sorrowfully afflicted.

I also like to write books from multiple viewpoints. In my stories, A may know something that B must learn or can only guess. And C may be waiting on one side of a door on which D is about to knock—or which he will shortly break through. In these cases, where I and my reader temporarily assume the viewpoints and personae of many different characters, I shy away from having any outright “mumping villains,” psychopaths, or despised wretches driving my stories. Most people, I believe, are trying to follow their beliefs and do their best—although some may be mistaken in those beliefs, and their perceptions may put them in conflict with the people around them. I find it much more satisfying to toss the reader into a situation where every character has a little bit of right and a bit of wrong on their side, rather than paint some as virtuous heroes and others as dastardly villains. I think this is also a truer picture of the way the world works.

Finally, I would rather engage my readers with a positive vision of how things might be than disgust them with a horrific vision of how things supposedly are. I want to catch their minds with honey, not vinegar. But then, sometimes I’m just an old romantic at heart.

1. That same convention yielded another fan in the audience who insisted that the only basis for modern stories was “race war, class war, gender war.” Oh, my! Where do I begin with a vision so narrowly focused on, and so tightly blinkered by, the political sphere?

2. Unless, of course, the main character is a superhero. Superheroes are magical beings operating under their own rule sets.

3. I decided long ago that I’m not trying to write for people who have no educational background or knowledge base, no sense of curiosity or wonder at the universe, and no interest in reading for pleasure. Such people simply are not going to buy or invest their time in books. Duh!

4. When I write from a character’s viewpoint, I try to reflect the inner person, who has goals to reach, friends and loved ones to keep and cherish, personal honor to defend, self-respect and public reputation to win and keep, and a place or niche in the larger society and economy to maintain. These are attributes that go deeper than the affiliations of gender. While I might offer a physical description to place the character in the reader’s mind, I usually don’t bother with fixations on secondary sexual characteristics and bathroom habits.

Sunday, January 29, 2017

The Insurrection of 2017

I’ve written about the changing political winds before.1 And the events of the past couple of months have shown that this is definitely an historically interesting time.

Since the election on November 8, we’ve had sporadic street protests in various cities, sometimes attended by breaking glass and burning cars. The same thing happened on January 20 while the new president was inaugurated several blocks away. And the following day, January 21, we had what many sources are calling the largest demonstration in U.S. history. The multiple events in cities around the country were called for and primarily attended by women, but many sympathetic men were also there. Estimates include 500,000 marchers in Washington, DC; 750,000 in Los Angeles; 250,000 in New York City; 200,000 in Chicago; 145,000 in Boston; 130,000 in Seattle; and comparable numbers in smaller cities. If these estimates are correct, we may have had as many as three million people on the streets of this country protesting the projected policies of the new administration. That approaches one percent of the U.S. population. And sympathy marches of nearly equal size took place elsewhere in the developed world.

Even if these numbers are wishfully overestimated, even if the protests were joined by many people who just went out for a sunny day among friends, that is still a huge effort in communication, coordination, and logistics. The images, the numbers, and the statement of public discontent are sobering. I am reminded that the Tsar of Russia fell after fewer people—but perhaps including more discontented workers and distraught soldiers—appeared on the streets of Moscow and St. Petersburg.

I am also reminded that the last election was close—hardly a mandate for the new president. The number of people appearing on the streets on January 21 roughly equals the edge in the popular vote won by his opponent—although it would be fatuous to say that these people represent, one-for-one, the number of Americans who now feel disenfranchised. The new president won because of the way our governing documents and our representative democracy are structured.2 His election was the result of some brilliant campaigning, sharp calculation of electoral votes, and a raw emotional message. Add to that a relatively weak message from his opponent—other than “I deserve this”—and embarrassing communications that were leaked from her campaign.

As a country, then, we seem to be sitting on a knife edge in 2017.

For half the country, the last eight years of a Democratic administration have been about moving the nation towards the sort of regulatory state and social democracy practiced in Europe, Asia, and much of the rest of the developed world. The promised “fundamental transformation” has been toward a larger role for government in the economy, expressing direct concern for people who cannot or will not compete against their fellow citizens for their share of the “American dream.” The focus has been on providing safeguards for poor people, various disadvantaged and marginalized groups, and the environment itself and protecting them from the uncaring and undisciplined practice of free markets and a capital-funded industrial sector. For this half of the country, generous welfare benefits, redistribution of incomes, and centralized control of the economy are all positive goals. And what has been achieved so far is a start, but it must continue to move forward before everyone is safe.

For the other half of the country, the last eight years have been a rolling back of what they see as the strengths of America. Regulatory control by centralized bureaucracies increases the costs of doing business and throttles the initiative and creative spirit of the entrepreneurs who have built the strongest, most innovative, freest economy in the world. The focus on welfare rather than opportunity creates a downward spiral: as the average person becomes unable to participate—or indifferent to participating—in the economy, fewer goods and services are created and provided, and fewer people can afford to purchase and consume them. The centralized controls engaged by the executive branch represent an unearned place in the productive process, where politicians get to exercise their animus against certain industries or else create skewed incentives to reward their political donors. For this half of the country, greater economic activity, popular skill building and labor flexibility, and the expansion of product lines and choices are all positive goals. And this “natural” state of things has been under attack in the recent Democratic administration.

In international affairs, half the country wants the United States to accept a reduced role and diminished influence, becoming no different from or better than any other country. In this role, we have no need for a large naval fleet, a readily deployed military force, or nuclear capability. The other half sees the United States as the paragon of strength and fairness, inheritor of the fundamentally free and benign Western Tradition, and needing a strong military to project power in its role as “the world’s policeman.”

In cultural matters, half the country wants to erase the old definitions of and relationships between male and female, parents and children, artists and their viewing/reading/listening public, the rich and the poor, the weak and the strong. They reject the “melting pot” of America in favor of a racial and ethnic “mosaic.” They want to redefine and thereby control human nature itself, in order to create a fairer, more just, more equal society. The other half views these attempts at redefinition and revision as both frivolous and dangerous, and considers them a weakening of our national character.

In scientific matters, half the country wants the other half to become more scientifically literate, to base its policies and programs on good science and hard evidence, and to beware of science “denial” and perpetrated “hoaxes.” The other half wants the same thing. It’s just that no one can agree on where truth lies and where biased interpretation and unsubstantiated reporting begin.

For the last dozen years or so, I have heard, both in the alternative news media and in social media—less so in the mainstream media, but now growing there, too—the terms “culture war” and “soft civil war.” These are code words for all of the above tendencies. One half of the country wants progressive, collectivist, and socialistic adjustments made to the regulatory state that has been a long time building here, at least since the Great Depression if not before. The other half wants a return to the more individualistic, unfettered, self-reliant, and self-determined approaches to our perceived problems—and this half also perceives far fewer problems with our society and economy in the first place.

In the last three months, I have begun to hear, in all the different media, the term “second civil war”—without the qualifying adjectives “soft” or “cultural.” In California, I also hear the word “secession” bandied about, and not by kooks.

In my own writing, I am no stranger to the idea of another civil war in America. In my early novel First Citizen, from 1987, I showed a possible breakup paralleling the civil wars of the Roman Republic, with factions based on competing economic and cultural spheres à la Joel Garreau’s The Nine Nations of North America, which was published in 1981. And in my two-volume novel Coming of Age, from 2014, I show the country falling apart along the lines of coastal states vs. central states when the international holders of the U.S. national debt call for an accounting.3

What I’m sensing in the often-violent protests that have periodically broken out since the election, and now in the peaceful but determined march that took place on January 21, is a further hardening of these viewpoints. Half the country wanted the changes that took place in the most recent Democratic administration, and the other half voted just as strongly in the recent election to slow, halt, or abolish those changes. Maybe we will have another eight years of rollback and consolidation of the old economic and cultural conditions. And maybe after that will come a renewed push toward more social and economic change. Maybe we will teeter on the knife edge indefinitely.

Or maybe things will come to a head much sooner. The protest at the Democratic National Convention in Chicago in 1968 drew perhaps 10,000 participants. They were focused on two issues: ending the war in Vietnam and voicing the discontents of the new “counterculture.” The keynote of that protest was “The Whole World is Watching.” Now we have two to three million people protesting in the streets—peacefully on January 21, thank goodness—and the whole world seems to be joining them.

I don’t know how this can be reconciled peaceably. Maybe half the country will just get tired and quietly accept a new status quo—in one direction or the other. Maybe these January 21 protesters did just come out for a sunny day among friends, and maybe other protesters at other times have come out simply for the animal joy of screaming their lungs out, breaking windows, and torching parked cars. Maybe one side or the other will finally admit that they’ve been foolish, wrongheaded, and stubborn all along … but I doubt it.

I begin to entertain the notion that the insurrection has already started, and that it will end in blood.

1. See Something Happening Here from May 8, 2016.

2. Yes, the Electoral College keeps us from having a simple democracy and appears to block the “will of the people.” Every party that wins the popular vote and loses the electoral vote makes this claim. But the time to pass a constitutional amendment changing the rules would be well before the next presidential campaign season starts. Party organizers and campaign managers on both sides tailor their planning and efforts towards the realities of the Electoral College. If the winner were going to be decided on a straight popular vote in 2016, both candidates would have campaigned very differently. It’s not too much to say that neither one would ever have left the hot spots in California and New York. And people in places like New Hampshire and Iowa might as well have stayed home and not voted at all. No one really wants that.

3. And that scenario still haunts me. A debt of $20 trillion will have to be paid back somehow, and I’m betting that at least half the country will balk at scrimping and sacrificing in order to repay it. Even if the government decides to make the debt disappear through a roaring, Weimar-style inflation, half the country may still find its voice to object.

Sunday, January 22, 2017

A Money-Making Enterprise

This is another rant, inspired by a fellow novelist’s observation that good editors in traditional publishing—the sort who can help you take your book apart and put it back together again, let alone catch typos and correct the grammar—seem to be in short supply these days. And that got me thinking about the current state of the arts in popular culture.1

I recently saw the 2016 sequel to the 1996 movie Independence Day, this one subtitled Resurgence. I really liked the first movie, have watched it many times, and still enjoy the visuals, the characterizations, and the snappy dialogue. But, after sitting through the sequel, I was stunned when the credits showed three people involved in the “story” and “screenplay.” The movie had almost no story—or at least no new story. It was, in sum, an uninspired gloss of the first film, with cameos and throwaway lines from the earlier actors reprising their characters, as well as dull portrayals by new young actors playing their supposedly grown-up children. The new alien ships were so much bigger and badder, and their actions so haphazard, ludicrous, and almost unexplained, that it was clear the director, Roland Emmerich, told the CGI department to go have fun and not bother adhering to any script. The entire movie was just a blitz of imagery and walk-on acting without any focus on telling a succinct and involving story.

Why is this relevant to books that don’t get the editorial love they deserve? Because I know that the people responsible for the Independence Day sequel knew they had a bankable property and they didn’t have to care much about engaging the audience’s full attention or respect. They weren’t out to tell an interesting story. They weren’t intending to make any kind of art. They were intent on making ninety minutes of passable scenery and recognizable characters that would draw boobs who had liked the first movie into theaters and then not actively disgust and disappoint them—as they might have been with, say, an hour and a half of a blank screen or a play performed with finger puppets. The filmmakers had nothing new to say, show, or share, but that didn’t matter, because the fame of the first movie was going to sell it for them.

The J. J. Abrams treatment of the recent Star Trek movies works on the same principle. And I think a lot of editors handling the manuscript of a famous and bankable author are working from the same mindset. “It doesn’t have to be good. There’s a built-in audience for this stuff. They’re fools anyway. So this book or movie just has to not be terrible.” In other words, this enterprise isn’t about art or imagination of any kind; it’s about packaging a two-hour film clip or a wad of paper filled with black marks that will be “good enough” for commercial purposes. It’s a money machine, not an artistic endeavor. Get the butts into the theater seats. Get the boobs to pick up the book or DVD and take it to the register.

It may not always look that way, but in my own writing I will often spend a good ten minutes—sometimes much longer—working on and worrying over one verbal image, sentence, or paragraph. I am trying to get the meaning, the tone, and the flow just right. Sometimes these things simply come out of my fingertips and onto the screen as I type. Sometimes I have to sweat for them. But I’m not satisfied with a book and won’t let it go out to my readers until every scene fits—at least according to my sense of the story—and every image and line of dialogue strikes the right gong note—at least to my particular ear.

When I worked at Howell-North Books, which was self-consciously a money-making operation, we still spent time and effort trying to create good books that would satisfy our readership, who were variously interested in railroad histories, steam technology, California history, and Western Americana. We were choosy about selecting our manuscripts. And I was given all the time I needed to edit and polish them, sometimes taking apart the work of non-professional writers and putting it together again to make an easily readable and intelligible story. Mrs. North—the company’s president, who was also our expert at page layout—would spend days over layout sheets with her pica rule and sizing wheel, creating the finished pages with an eye to flow and fit between text and photos. We all read galley proofs twice, went over page proofs line by line, and inspected every cut and mark on the blueline proofs2 to make the books as flawless as possible. We respected the readers who would buy our books and wanted to make each volume meet their expectations, even when we were publishing the second or third or later book by a successful author. We knew that if we produced anything half-hearted, or started cynically playing on a big author’s following, we would lose customers.

These days, I think, the empires of publishers and moviemakers have become much more dollar-driven, and more cynical about the taste and expectations of their buyers. We still have the occasional gem. But most of what gets produced is a slick wrapper around a neglected product. Their motto isn’t “Let them eat cake,” but “Let them eat stale Ding-Dongs.”

But then, crass commercialism has been the order of things among lesser lights in New York and Hollywood over the past century. For every Edgar Rice Burroughs and Louis L’Amour who came up with something new and exciting in popular fiction, there have been thousands of volumes, millions of pages, of “dime novels” and “pulp fiction” that were published with no other purpose than to coax those dimes and dollars out of readers’ pockets. Wads of paper filled with black marks.

For every big-budget movie—or “tent pole” in the current marketspeak—with name stars which might become a classic, there have been thousands of “B movies” set in noir New York or Los Angeles, or in the Old West, or in outer space on Planet Mongo, where actors who would never be stars spoke forgettable—or laughably embarrassing—lines while dressed in cheap costumes in front of papier-mâché sets as the cameras rolled. Millions of feet of celluloid dedicated only to getting butts into theater seats.

Whenever I start to think this way, however, I remember and invoke Sturgeon’s Law: “Ninety percent of science fiction is crap. But then, ninety percent of everything is crap.” And I add Thomas’s corollary: “By the crap shall you know the good.”

1. For further thoughts on the writing process, see the email exchange between myself and a former colleague who also writes novels in Between the Sheets: An Intimate Exchange about Writing, Editing, and Publishing.

2. The blueline is a photo proof of the stripping process, which puts together the bits of negative film representing text, screened images, hairline rules, page numbers, and everything else that will appear on the finished plate for printing. These days, the blueline has been replaced by a PDF of the final layout from a software package like Adobe’s InDesign.

Sunday, January 15, 2017

True Leadership

For a while when I worked in employee communications at the public utility, I edited—which really meant researching and writing—a newsletter for managers and supervisors. The basic theme of the publication, at least in my mind, was the art of leadership. I believed then and still believe now that leadership is one of the highest of human callings. Its basic function is to perform work through the good will and participation of other people to achieve goals that could not otherwise be attained.

This definition immediately rules out the person in a position of authority who views his or her subordinates as merely helpers, hangers-on, or dependents. Such a person usually believes he or she has all the skills and knowledge necessary to achieve the goals, just not the time or energy to do so. During critical phases of the effort or at crunch times, such a person swats aside the subordinates’ hands and initiative and takes on the task him- or herself. This is not leadership; this is solo mastery.

Teamwork is a major part of leadership. But in its common usage these days, the word “teamwork” focuses on the responsibilities and attitudes of the team members. Teamwork is considered to be a communal quality, arising from the actions of participants who subordinate their own interests, ideas, and energies for the good of the group. Teamwork is usually characterized as a kind of sacrifice, where highly competent people stop working for themselves so that others may prosper equally. Thus conceived, teamwork is supposed to be an antidote to competition among members of the group. For example, in a sales department, competition would have each sales rep trying to contact the most customers and ring up the most orders, so that he or she could win the most commissions. In this environment, stealing customers and failing to transfer calls would be a winning strategy. The commonest form of teamwork, on the other hand, would have the sales reps sharing their leads, passing off calls to each other, and going out of their way to satisfy each customer, even if someone else on the team got the credit and the commission.

Teamwork may or may not be a better spur to good effort than competition, depending on how the teams are structured, how incentives are distributed, and how the group’s values are stated and enforced. Still, the usual notions of teamwork are that it somehow arises on its own, out of the good will and creativity of the group members. But that structure and those incentives and values do not simply float around in the air, waiting to be applied. Someone must take a hand in creating, proposing, and enacting them. That person is usually the unidentified and unrecognized figure in any team’s story: its de facto leader.

The leader may be someone in a position of authority over the team. Or it may be an individual on the team who senses the existing group dynamic; sees opportunities for improvement; voices a new structure, relationships, and values; and then advocates for them with the rest of the group. In this non-authoritarian position, the leader can do the necessary structuring and value creation, but he or she still cannot revise the incentive program—at least not in a business setting—without recourse to and buy-in from upper management.

The leader who is also in a position of authority might simply order the new structure and announce the new values—but he or she would be a fool to do so. Perhaps forty or fifty years ago, the industrial and commercial culture of this country favored top-down, command-and-control leadership. This was probably a hangover from the previous forty years in the 20th century, which endured two world wars separated by, first, a decade of wild economic success and, then, a decade of economic collapse, precipitating a more robust and authoritarian form of leadership.

This top-down leadership style could work in an organization which, like the U.S. military, had a mostly captive workforce.1 The expectation in business and industry through the late 1940s, ’50s, and ’60s was that an employee joined the company or the union for life, looked to the organization to provide not only work and pay but also health benefits, scheduled vacations, regular advancement, moving allowances, and a pension upon retirement. In return, the employee performed whatever job he or she was told to do, did not moonlight or freelance, relocated to another part of the country or overseas when asked to, and offered the organization his or her unfailing emotional support and allegiance.2

But along about the 1970s—and certainly in full swing by the ’80s—a new style of employee was created, mostly from the pages of bestsellers by strategy gurus and management consultants. The new employee was not supposed to simply take orders but to anticipate them, foresee opportunities and directions that would benefit the company, and pursue them with the blessings of management. The new word was “entrepreneurial,” and in that guise the average employee in the average position within the company was expected to exercise the eagerness and foresight of an Andrew Carnegie, a Hewlett or a Packard, a Wozniak or a Jobs. But, where the true entrepreneurs of industry were usually following a hunch or a dream, operating on a shoestring of finance, and working without guidance on a venture that would all too likely fail, the corporate entrepreneur was still working within a defined product or service area, on an annual budget, and with plentiful if not mandatory guidance on a venture that had better not fail.

This situation was, of course, unstable. So along about the ’90s—and growing through the aughts and teens3—a newer style of employment was created, characterized by the paradigm “Me, Inc.” This employee was usually not actually hired by the company but worked as a contractor or temporary staff supplied by an agency. This employee had no expectations of the company which actually needed the work to be done—not continuing employment, not advancement, benefits, or retirement. And those people who were still formally employed by the company were understood to be working “at will”—which meant they could be laid off or fired immediately and without cause. These formal hires also received from their employer a “defined contribution” to a personally managed retirement account, rather than the “defined benefit” of guaranteed retirement at a certain age with a certain residual income.

Leadership in the era of Me, Inc. is a different proposition from that in the top-down era. In this new work environment, the leader becomes less of an authority figure and more like the individual team member who sees opportunities, proposes solutions, and enlists the participation of others in trying them out and making them work. This kind of leader does not give orders except in unusual situations or from dire necessity.4 Instead, he or she points out necessities and opportunities in the organization’s current situation or the economic environment. And rather than propose solutions directly—as if he or she possessed all the answers—the leader invites others on the team to come up with the ideas. The delicate step, then, is for the leader to guide the discussion of options and force the group into realistic appraisals, so that appealing but harebrained notions don’t capture the group’s imagination and let people run away into foolish or reckless actions. The leader stays fixed on the hard and indisputable realities of the situation, rather than making appeals to authority—which always, in the end, come down to “because I said so.”5

Letting others devise and implement solutions is a form of delegation. The good leader delegates where appropriate—meaning once the subordinate has been prepared with the organization’s and the leader’s values (“What’s important around here”) and standards (“How we do things” and “What’s acceptable around here”). Setting values and standards is probably the biggest part of the leader’s job. A “natural” leader, if there is such a thing, has both a feeling for group sentiment and group dynamics and the capability to appeal to—and direct the group toward—a higher vision. That vision might be one involving morality, fairness, efficiency, personal honor, or some other good. The vision is almost always positive (“Things work better if we do it this way”) rather than negative (“You’ll get in trouble if you do it that way”).

With a positive vision, the leader aligns him- or herself with the belief that most reasonable people want to do the right thing, and most employees want to create a satisfactory product or service experience. Every job and every market sector or political function has its own canon, whether written or unwritten, of acceptable practices and work product. People who have chosen a career or a position on their own—rather than being dragooned into or enslaved by the organization—already have notions about what is the right and proper way to act and to do the job. The leader works within those canons and notions, rather than against them, and builds on or shapes them to fit the particular task at hand.

Leadership, like much else in this life, is an art form. It is a blending of personal force with perceptive deference to the ideas and opinions of others. It enlists the motives, creative potential, and dreams of the team members. And it works best when the leader is positive, relaxed, and confident—even when he or she might not actually feel that way. True leadership is the highest expression of personal strength and capability.

1. But at the highest levels of military command, the good leader is not always a top-down order giver with his or her immediate staff. Soldiers on the battle line are expected to follow orders implicitly and without question, but the headquarters personnel who originate those orders and the colonels and majors—or, at sea, the captains and commanders—who must execute them should always be given the freedom to offer suggestions and then to exercise initiative in acting upon them. A good general or admiral invites comment and criticism, within bounds, to elicit trust and participation.

2. Many employees also met their romantic interests, significant others, and future spouses in this environment. They would even, at the company’s prompting but without irony, consider themselves to be part of “the XYZ Corporation family.” Work represented a cultural as well as an economic proposition.

3. And the trend was further exacerbated by the employment conditions spelled out in the Patient Protection and Affordable Care Act of 2010, which put economic pressure on employers either to provide more comprehensive medical benefits or to limit the scale of their employment.

4. To quote from Frank Herbert’s Dune: “Give as few orders as possible. Once you’ve given orders on a subject, you must always give orders on that subject.”

5. No one liked hearing that line of reasoning when Mother or Father used it with them as a child. No adult really likes to hear it now.

Sunday, January 8, 2017

Between Perception and Reaction

We have a small dog, a terrier-mix rescue named Sally, who has separation anxieties. If we leave the apartment for even a few minutes, she will be up on her hind legs, waggling her whole body, and smiling1—not to mention pawing and licking—when we return. If we leave for a couple of hours, the greeting process is longer and more energetic.

Since this is California and it never gets really cold—not by East Coast standards—and because my feet often get hot, I usually wear sandals2 without socks when we go out. After years of wear, my sandals are a bit loose and tend to slap against my heels as I walk down the hallway to our apartment door. But even before I’m halfway there, I can hear Sally dancing and whining on the other side of the door.

All this got me thinking. She hears the sound of the sandals slapping. She knows from experience that this sound heralds the joyous experience of her “big guy” returning home and ending her loneliness. So … familiar aural stimulus equals predictable emotional response. At some level, a human being might have a similar reaction. You hear the jingle of keys in the hallway, you know your wife is home.

But a human being—at least during the first or second time of receiving this stimulus—would interpose words between perception and reaction. The human brain would automatically ask, “What’s that sound?” The mind would then sort through comparisons in memory and come up with not only a mental image of jingling keys but also a word, “Keys.” And from that follows the thought, in words or perhaps just in images and sense memory, “My wife.” We humans are such verbal creatures—made so by an environment that showers us with spoken and written words; with captioned images in our books, magazines, and even our advertising;3 with vital information spelled out on warning signs and labels;4 and with demands that we respond aloud or in writing to specific questions—that supplementing our thoughts with words is second nature to anyone over the age of six.5

I know Sally understands some spoken words. At the appropriate time in the evening I might say casually to my wife, “Do you want me to take the dog?”—meaning but not bothering to add, “out for a walk?” Sally will immediately lift her head and begin dancing. She knows “take” and “dog” are associated with the worship-words “out” and “walk.” Our previous dogs could even understand what we meant when we spelled “T-A-K-E,” and I’m sure Sally will graduate to interpreting spelled-out words one day soon.

But spoken words and spellings are still just learned stimuli in the dog’s brain, like the sound of flopping sandals and jingling keys. Or rather, I’m almost sure of that. The dog may associate them with memories of the humans coming home or taking it outside, and these memories may be connected with visual imagery and, probably, scent cues for the imminent and enjoyable experience of sniffing the bushes. But I don’t think that the dog, when it wants to go and relieve itself, supplies the word “out” or “walk” from its own recalled memory, as a human would. When a human feels a full bladder, he or she will often think and even say, “Gotta find a bathroom”—even if no one is nearby to receive this timely information.

Supplying words as an intermediary step between stimulus and reaction enriches and modifies the human experience. For a dog, it may be enough to hear [jingle] and think [returning-human-happy-happy]. For a human, the mental insertion of the word “keys” can lead to other thoughts. A husband may remember that his wife had left her keys on the counter that morning, and so someone jingling keys in the hallway must be the occupant of the apartment across the way returning home, not the wife—or it could be a stranger trying the lock on the door. When confronted with visual, aural, or tactile cues for which the brain has no learned referent, the dog will either ignore the stimulus or become confused. The human will sample and compare past cues and fit names as well as images to them. The process will draw on knowledge acquired from past training, through reading as well as from direct experience, to identify the cue and decide whether it is a cause for reassurance or a threat.

This verbal dimension of human thought allows us to categorize and compress information. The word “key” encompasses many meanings: the toothed metal probe used for aligning the tumblers in a lock; the coded list of references on a map; the text used as a starting point for solving a cipher; the charm or plaque used to identify a fraternity or sorority; as well as visual images of my household keys, my car key, my wife’s keys, the huge iron keys used in medieval locks, and the diamond-studded charms sold at Tiffany & Company. With all these meanings associated with one word, the human brain becomes a field of rich connections. We are not limited to simple, singular mental connections like [familiar-jingle] equals [return-happy].

These word associations give power to particularly human activities like storytelling and poetry. A word captures a number of visual—or aural, tactile, and other sense—images that cascade through the mind of the listener. The storyteller uses these images to put listeners or readers inside the scene and make them part of the action. And the wonder of it—from my point of view as a novelist—is that the associations I make with a particular word can be trusted—most of the time, for most of the population—to arise in the minds of those who read my stories. Of course, there are differences. The word “clown” for most people has happy, funny, or outlandish associations, calling to mind red bulb noses, orange string wigs, squirting boutonnieres, and long, floppy red shoes. But for people with a morbid fear of clowns, the word gives rise to images of creepy things with leers and teeth.

I try to imagine a human being, a true Homo sapiens in mind and body, but who lived in a time—which would be the majority of our line’s history on Earth, sixty or seventy thousand years or more—before the invention of writing and our hyper-literate civilization. Words, their meanings, and the grammar and syntax of language would then have been a private thing within the tribe or even isolated within the extended family: rock, path, pot, stick, and a dozen inflections for words describing weather, game, edible roots and berries, and the ways to hunt and gather them. The tenses to describe action in the past or future would have been simple, with little need for the pluperfect or the subjunctive. I try to imagine a hunter-gatherer expressing “By this time tomorrow, if it doesn’t happen to rain, I will have tracked and shot the deer I saw yesterday.” The Greek and Sanskrit aorist, denoting simple action without reference to completeness, incompleteness, duration, repetition, or any particular position in time past or present—“I hunt. I fish. I pick berries.”—would reign supreme.

And yet, within a few hundred years after learning to cut cuneiform wedges into wet clay, or scratch angular letters on potsherds, the Sumerians were inventing and reciting the epic struggles of Gilgamesh, and the Greeks were telling a convoluted story of old wounds and grudges as the gods and mortals vied for supremacy at Troy. And today we read translations of these stories into modern English and marvel at the power and beauty of each word’s imagery and its associations.

In the human mind, the word itself has become the stimulus to a reaction. We do not need visual, aural, or other sense cues and perceptions from the outside world to spark an intellectual or emotional reaction. We draw the images, ideas, and emotions from inside our own heads, reacting to nothing more than black squiggles arranged on a white page or screen. We all live inside our heads. Our brains and their pathways have no direct contact with the outside world except through chemical nutrients, drugs, and poisons. So we each make up the world inside our minds from sensations fed from our eyes and ears, the taste and smell receptors in our mouths and noses, and sensors all over our skin. For the human of ten or twenty thousand years ago, that world entered the mind directly from all these senses. For a modern, literate human, the world can also enter from a single source: the eye and its trick of interpreting those squiggles inside the visual cortex.

And for me, that trick is a continuing source of wonder and mystery.

1. I never noticed this with our other dogs, but Sally smiles by lifting her upper lip over her front teeth. I always thought this was a dog’s warning, a prelude to growling and snapping. But from the way her eyes squint and her body gyrates, she is clearly happy. I think this is something she learned from watching humans smile.

2. The Keen sandals have good toe protection, unlike Birkenstocks or flip-flops. Because of the way the sides wrap up and connect over the instep, a wargaming friend who is deep into Roman history calls them “caligae,” or boots, the Latin name for the legionary’s hobnailed sandals. And like the caliga, Keens even have sturdy, gripping soles with deep lugs.

3. I learned in the book-publishing business that, while a picture may be worth a thousand words, modern readers often have trouble understanding or giving full value to a picture without a caption to read alongside it. In a book about mountain climbing, for example, if you reproduce a photo of a beautiful, snow-covered peak, the reader will look around for a caption that tells the name of the mountain, elevation at the summit, and whether or when the author has scaled it. Even a picture of a beautiful woman holding a perfume bottle with the maker’s name clearly shown on the label will give that name again in bold type under the image.

4. In California, we have warning signs in English and Spanish. And just in case the viewer speaks only Cantonese or Vietnamese, they will include a stick figure demonstrating the danger. (The polyglot Europeans long ago did away with words on their traffic and warning signs in favor of imagery and figures, but in California as in the rest of America we persist with words.) My favorite stick figure, in a warning about overhead high-voltage lines, shows a person sticking a length of irrigation pipe up into the wires and dancing like crazy.

5. This poses special problems for people who are either deaf or dyslexic. But although they may not be fully capable of either hearing spoken commands or reading complex information with easy comprehension, they are not relieved of the human association between thoughts and words. By now, in the modern form of H. sapiens, it’s hardwired into our brains.