Sunday, January 14, 2018

Public Lives and Private Lives

People puppets

A whole slew of public figures—mostly politicians, actors, and journalists, but what other kind of “public” is there anymore?—have recently been brought down by accusations of boorish behavior, inappropriate touching, lewd comments, harassment, and other activities that border on, but usually don’t meet the qualifications for, the Rape of the Sabine Women.

I don’t condone this behavior. I believe that women are not disposable pleasures but fully actualized people who should be treated with respect and courtesy, especially the closer a man gets to them and the more intimate their relationship becomes. If a man wants to interact with a woman at the closest level, it should be as friend, companion, and lover—not as the user of an object of pleasure. A man who deals intimately with a woman, entering into a position of responsibility for her, and then mistreats or abandons her is lacking in commitment and a sense of personal honor. A man who willfully injures a woman either verbally or physically shows himself to be a diminished person.

But that said, something has changed in our world having to do with relations between the sexes. The 1960s and the sexual revolution, driven by the convenience of the pill and other forms of birth control, and clouded by a haze of pot smoke, cheapened and trivialized love and commitment in the interest of physical pleasure. Self-restraint, caution, and deeper feelings of caring and responsibility were thrown to the winds. If it felt good, you did it, and thought about the consequences later.

Now, in the 2010s, the diminishment of sex has come full circle. Using sexual activity for self-gratification (on the man’s part) and for personal enhancement and career survival if not advancement (on the woman’s part) has become commonplace. It is the quid pro quo of the entertainment and journalism industries and in power politics. In fact, the circle has turned so far that we have suddenly entered a newly puritanical era. In the space of the past year, dating to the upsets of the 2016 election, but with significant outbreaks from the years before, a man’s sexual history and his boorish behavior have become worse than criminalized. His touches and gropings, his comments, his unwanted moments of closeness—let alone any calculated rapes—have now become grounds for public humiliation, economic execution, and spiritual exile.

When I was growing up, in the decade and a half before the sexual revolution, sex stayed in the bedroom. It was nobody’s business what went on in a private home and behind closed doors. Yes, there were laws about incest, sodomy, and other public taboos, but the cops never broke down the bedroom door looking for infractions between consenting adults. You had to perform the unlawful act in public—or solicit it from a member of the vice squad—in order to get arrested. And although most people weren’t exactly comfortable with homosexual activity and other relationships that were considered vices at the time, they tolerated the situation so long as the acts remained between consenting adults and stayed behind closed doors, along with the rest of human sexual activity.

The press offered public figures a measure of polite silence and turned a blind eye to their personal proclivities and weaknesses. That is a far cry from the howling chorus we have today. Yes, John Kennedy had affairs, most notably with the actress Marilyn Monroe. And who can say that all of those affairs with a popular president were free of the taint of gross or subtle coercion? Kennedy also suffered debilitating back pain for most of his life and probably was dependent on opioid painkillers. Yes, Dwight Eisenhower took a mistress while he was stationed in Europe as Supreme Allied Commander. Yes, Franklin Roosevelt had a mistress while in the White House. Roosevelt was also crippled with polio and confined to a wheelchair that his aides artfully concealed. All of these failings were known to insiders and to the press. But none of them was made explicitly public, because wiser heads decided it was proper to judge the man on his political skills and his public policies, not his private life.

I think the hue and cry started with Bill Clinton.1 His lecheries with staffers, and even women with whom he only briefly came into contact, became the stuff of public ridicule in the media. His reaction to questions about these lecheries became the stuff of political obsession by his opponents, which culminated in an attempt at his impeachment on charges of perjury and obstruction of justice. Clinton’s private life became a political liability.

Was Bill Clinton a sexual predator? Probably. Are the men who are now being publicly tarred and feathered sexual predators? On a case-by-case basis … probably, many of them. While I think sexual predation is a low form of behavior and personally dishonorable, I also understand how the world works, and how it has always worked. The pattern goes back to the Bronze Age and probably to our earliest hunter-gatherer beginnings—as soon as one man in the tribal band became socially powerful and started being treated as a decision-maker and chief.

In any given social or interpersonal situation, one person will always have the advantage and psychological, if not actual physical, power over another. Usually, the person who wants something puts him- or herself in a subordinate position to, and so becomes vulnerable to, the person who can bestow the gift, benefice, or advancement that is desired. Similarly, the partner in a relationship who loves the most subordinates him- or herself to the person who cares the least. Men in public office who can grant favors, even if it is just proximity to the wheels of government, or those who can make or break careers, like a movie producer, will always have power over those who need favors or want to boost their careers.

In the human world, unfortunately, the women who need a favor, are attracted to power, or have a career to make usually end up putting their sexual persona, or sometimes just their personal submissiveness, at the service of powerful men who can grant those favors and build those careers. This is the disadvantage of being a woman, not just in 20th-century American society, but in all societies going back to hunter-gatherer tribes. Still, it can also be an advantage because, for most adults in our sexually liberated age, contrived intimacy is no longer that big an issue and need have no lasting consequences. A man in similar need of favor or advancement would have to compensate the power broker with a pound of uncut cocaine, a box of Cuban cigars, unlimited personal loyalty, or performance of some legally or morally questionable service.

There is no way to stop this kind of transaction. It is built into human nature, which derives from the social interactions of all primates and all mammals who happen to gather in groups or herds. The big gorilla, the alpha male—or the alpha female—gets what he or she wants. The only way to stop this transactional relationship is to eliminate all positions of power, all distinctions between people, and the basis of all human interactions. Either that, or impose horrendous penalties on all persons engaging in extramarital sexual activity—and enforce existing laws about cocaine and Cuban cigars.

Or, we could return to a culture that holds a man—or a woman—to a standard of personal behavior and expects him or her to follow a code of personal honor. We don’t have to turn a blind eye to the mistresses and the addictions. But we also don’t have to bribe doormen and sift through other people’s diaries looking for infractions, because we can trust that most members of society, even those in positions of political or economic power, are decent, honorable, and living according to such a code.

You can’t enforce decency by civil or ecclesiastical law. But you can make it personally attractive and … perhaps even expected once again.

1. However, the public shaming of Colorado Senator Gary Hart, over an extramarital affair that surfaced during his bid for the 1988 presidential nomination, was a precursor to the outing of Clinton’s follies. Treatment of the Hart affair set a precedent for the next stage of journalistic voyeurism.

Sunday, January 7, 2018

Silent Movie

Scene from Tol’able David

In early December, I saw the movie Tol’able David from 1921 at the San Francisco Silent Film Festival. The screening was presented with bravura accompaniment—think of playing nonstop, with expression, and mindful of what’s happening on the screen, for one hour and thirty-nine minutes—by local ragtime and concert pianist Frederick Hodges. I’m not really a fan of silent movies, but the experience made me think about developments in my own art, which is fiction writing.

In the early silent pictures, you can see the transition in pictorial storytelling from stage plays to modern moviemaking. For one thing, the actors are heavily made up: eye shadow, eyeliner, exaggerated brows, and darkened lips in otherwise pale faces. This recalls stage makeup, where the actor’s facial features are emphasized so as to be distinguishable in the tenth row back and not fade into a white blur from the vantage point of the balcony. A friend explained that the heavy makeup on the silent screen was also required by the film stock of the time. The film’s emulsion was “orthochromatic” and was good at capturing the blue wavelengths of reflected light. Red wavelengths, such as those coming from ruddy faces and the thinner, blood-tinted skin of lips and eyelids, would tend to disappear; so the makeup heightened and emphasized these areas. It is also no coincidence that the mouth and the eyes are the keys to a person’s facial expression.

Today’s movie makeup works harder to create a realistic scene. Yes, a woman’s lips and eyes get special treatment as elements of modern female beauty—just as a fashion-conscious woman spends time smoothing her skin and highlighting her eyes and lips with cosmetics. But more often the makeup department attached to a set is creating realistic beards for beardless actors, applying mud, blood, and bruises for a fight scene, visualizing alien or zombie features, or making sure that a laborer’s beaded sweat stays in place and shows up well on camera.

Just as silent-film makeup emphasized delicate facial features, the actors also exaggerated their facial expressions and body language, relying heavily on pantomime. And again, this was an adaptation from the stage, where a slight lift of the eyebrow or a millimeter dip of the lips in a frown would go unnoticed by the general audience. And so actors of the silent screen squinted when they looked into the distance, struck their foreheads when expressing surprise, put a fist to their chins when deep in thought, and held their bellies when they laughed. These gestures became a visual code or language. An audience sitting at some distance from the stage, or on the other side of the silver screen, understood them and received the right message.

Silent-film actors would also mouth words of intended dialogue to fit the situation, because you can’t carry the whole story with title cards. For example, in Tol’able David the heroine runs down the street in distress, and you can see her throw back her head and scream “Help me!” If you are paying attention to the action, you don’t need a dialogue card to tell you what she’s saying.

In today’s films, the actor is cautioned to minimize facial expression and gestures. I once saw a wonderful clip in which Oliver Reed gives acting advice to a beginner. He tells a young broadcaster to control his inflections, minimize his accents, and hold his eyes still instead of winking and blinking. The point is, when your face is shot in close-up and rendered forty feet wide on the big screen, and when your voice is amplified throughout the theater with modern electronics, you don’t need histrionics to get your point across.1

Another artifact of the theater stage that carried over to early movies was the point of view through the proscenium arch. The audience of a stage play is out of the action, looking through this arch—sometimes called the “fourth wall” of an interior set—which frames the view. Even with theater-in-the-round, an audience member sees the action only from his or her side of the stage, not from within the action. Early movies adopted this more distant point of view, framing most shots to encompass the middle distance and all of the action, then letting the audience member look here and there on the screen to decide where the most important bit of stage business was taking place. Only occasionally would the movie focus on an actor’s face in reaction to what was going on, or pick out a specific object, like the gun under the bureau in Tol’able David, to make a point to the viewer.

Modern movie directors and cinematographers use the camera like a roving eye, focusing on details that are important to the story, fixing one actor’s face and then another’s, or isolating a bit of action to give it special meaning. Only occasionally—unless the director is Stanley Kubrick—will the camera pull back and show the whole scene as if viewed through the proscenium arch.

And finally, the silent movies only dabbled in special effects. In the black-and-white Tol’able David, for example, most of the action takes place during the day, and here the film is untinted and rather stark. For a scene at night outside a building, the film is sepia-toned. And for a dance in the moonlight, the film takes on a blue-green tint. Similarly, when a character is running across a field, the speed of the film in the camera is noticeably reduced, which speeds up the action and makes the character appear to run faster than normal.2
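The arithmetic of that trick is simple enough to sketch in a few lines of Python: shoot at a lower frame rate than the projector will run, and the action compresses. The 16 and 24 frames per second used below are typical silent-era and modern figures, not rates documented for Tol’able David.

```python
# A sketch of the "undercranking" effect described above: film shot at a
# lower frame rate looks faster when projected at a higher one.  The frame
# rates here are typical figures, not ones documented for Tol'able David.

def apparent_speed(camera_fps: float, projector_fps: float) -> float:
    """Return how many times faster the action appears on screen."""
    return projector_fps / camera_fps

if __name__ == "__main__":
    # Shot at 16 frames per second, projected at 24: the runner seems
    # one and a half times faster than life.
    print(apparent_speed(16, 24))   # 1.5
```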

Today’s movies, of course, use special effects to portray many places and things that used to be shown with actual scenery and stage props. Why go to Rome or build a fantastically expensive set when you can shoot on a sound stage with a green screen and let matte painting or computer-generated imagery (CGI) fill in the imaginative blanks? In the most recent movies, the same digital effects have been applied to people, as when Andy Serkis played Gollum and Benedict Cumberbatch played Smaug in the Tolkien-based movies. The Star Wars franchise has even used CGI on unnamed actors to recreate the late Peter Cushing as the Grand Moff Tarkin and a young Carrie Fisher as Princess Leia without recasting the role.

By comparing the silent movie era with today’s productions, you can see how the art form moved from stage to screen and then reestablished, or more often invented, its own conventions as cues to guide the audience’s experience.

Similarly, in studying English at college, I had the opportunity to read and examine early novels and compare them with other forms of storytelling. My coursework traced the art of contemporary fiction from the epic poems of the Greek Homer and the Roman Virgil, from the plays of the Athenian stage, and from the Celtic legends and the Norse sagas to medieval “mystery” plays and the Elizabethan theater, and then on to the first books that were actually published to tell a new—or “novel”—fictional story as opposed to one based on biblical stories, myths, and legends.

The first of those, as any English major will tell you, is Samuel Richardson’s Pamela, or Virtue Rewarded, from 1740. The story is told through a series of letters that Pamela sends home from her job on a wealthy man’s estate. This novel was followed by Henry Fielding’s The History of Tom Jones, a Foundling, from 1749. Both books were keen on exploiting lust and sex, thereby establishing a convention—and a reputation—for the full-length book form of storytelling.

Like the earliest stage plays, the poetry of Homer’s Iliad and Odyssey and Virgil’s Aeneid adopted a global, omniscient viewpoint. The reader is carried along by a voice that portrays all of the action equally, from outside the characters’ heads. The text might say that Achilles is sulking or that Odysseus is scheming, but the action only describes what each man does and is supposed to be feeling while he is doing it.

The novel Pamela, being told in letters written in the character’s own voice, introduces the first-person viewpoint. Here a young girl tells only what she herself saw, felt, and did. To introduce the thoughts and motivations of the other people in the story, the author must write of them as speculations and assumptions—or fragments recalled from earlier conversations—in the heroine’s letters. This is a confining way of telling a story, but it has the positive effect of taking us inside the viewpoint character’s head in the way that a third-person narrator cannot convincingly do.

When I started out writing fiction, the third-person, omniscient, godlike narrator was the voice of most novels. In some cases, the narrative voice would affect an accent and a personality, make jokes and observations about the story’s situations or the human condition, and act like a human being telling a story around a campfire or in your parlor. But then, after years of stripping away the narrator’s affectations—mostly under the influence of the journalistic style and of Ernest Hemingway, who wrote simple, direct, unadorned sentences—popular fiction spoke with the voice of an invisible, uncharacterized narrator. The words came out of the air and into the reader’s head.

One problem with this omniscient approach is that the story loses all effective viewpoint. As if the audience were viewing the described action and the characters’ reactions through a proscenium arch, the narrator’s viewpoint dances into and out of the heads of various characters during the course of a single scene or dialogue exchange. One character sees something and reflects on it; then another character views the first character’s reaction and thinks about it. The invisible narrator offers physical descriptions of all the characters at once, without any distinguishing flavor or the discernment and judgment particular to one point of view. The observations and revelations of the characters flow back and forth with no control. The viewpoint is both omniscient and omnipresent.

This approach is fine for telling a story fast and completely in one movement. The action takes place once, is over and done, and everything is revealed by the end of the scene. And this can be satisfying for the reader. But it is difficult for such a narrator to conceal anything from the reader except by willful misdirection: the narrator must play tricks to keep the reader in suspense. If that narrative voice chooses to conceal one character’s assumptions or understandings for the moment and only reveal them later, there is no good explanation for why the narration didn’t dip into that head, too, at the appropriate time during the action.

Out of dissatisfaction with this omniscient narrator, I have developed a personal style that some call “indirect discourse,” though it is more a form of tight viewpoint control.3 I believe it more accurately reflects the way people think, act, and speak. So, during a scene, my narration is from a single viewpoint and examines the thoughts and feelings of only one character at a time. Other characters may act and speak inside the scene, but then they are viewed as objects, as things outside, by my viewpoint character. It is much like writing in first person—“I did, I said, I thought”—but using the third-person voice.

If I want to show how another character felt or what he or she thought about the action, I must move to another scene told from within that person’s head. For critical action sequences, I break the story down into multiple short scenes passing from one character to another. Or, more rarely, I pause the timeline of the story and go back through the action, telling what was going on from another person’s point of view. Or, if I want to keep the story moving forward, I incorporate in a later scene the viewpoint character’s recalling and reflecting on the earlier action while he or she now attends to something else.

Tight viewpoint control can work through a single main character, who is followed throughout the book as if it were a first-person story. But then every other character’s thoughts, feelings, and motivations must remain outside, as speculations and assumptions by this viewpoint character. And anything that is unknown to the main character—like who is really on the other side of the door, or how his lover is about to betray him—must come as a surprise to both the character and the reader.

It’s more fun—and better storytelling—to reveal stories through various fixed viewpoints from multiple characters. Then I can play one person off against the others. I can also pick one person or group to represent the detached, expectant viewpoint that parallels the reader’s, serving the same function as the Greek chorus on the Athenian stage. The chorus invokes for the audience the justice of the gods, the mores of current society, or simply the common sense of the average person. And I can have one character plotting an action or pursuing a motivation about which a second character has no notion—but I can let the reader see, anticipate, and wonder how the second character will fare when the trap springs.

Modern storytelling—like the creative focus of the camera lens in modern moviemaking—allows the writer to direct the reader’s attention and understanding in ways both subtle and overt without being obviously or inexplicably manipulative. Like a modern screen actor’s ability to convey emotion through a look in the eye, without pantomime and facial histrionics, it’s just a better, more realistic, more satisfying way of proceeding.

1. When I first met her, my wife had a tendency to raise her voice an octave or more when she became stressed or angry. But this made her sound shrill and nervous, and so she became less convincing when she confronted men in a professional or commercial setting. I suggested—from something I had heard elsewhere—that instead, when she became angry, she should drop her voice and speak very calmly. My wife perfected this technique and could sound downright scary in an argument.

2. The last time I saw this speed-up effect in modern movies—outside of those portraying comic-book superheroes—was in the early James Bond movie From Russia With Love, where the fight scenes were accelerated in an attempt to show the supposedly eerily fast moves of judo and karate.

3. Of course, I am not alone in this. Most modern novels are told with some form of indirect discourse.

Sunday, December 17, 2017

Dealing with Death

2001 Hotel Room

For those of you who don’t like personal stuff in blogs,1 you can stop reading right now and come back next year. If you can stand the personal view, then read on.

It’s been more than three months since my wife Irene died in early September. By now, I have gone through all her desk drawers, sorted years of old financial records from her photographs and memorabilia, and hauled the resulting boxes of wastepaper to the shredding service. I have also gone through her armoire and her various closets, separating slacks and tops from sweaters and jackets, bundling underwear and socks, and taking piles of clothing to Goodwill Industries. And, as of this writing, my co-trustee and I are in the final stages of closing her bank and credit card accounts, making the bequests in her will, and transferring her retirement accounts according to her instructions.

All of this is part of putting away a life, as much as taking her ashes to the gravesite and holding a remembrance in her honor among family members and friends.

What struck me is how long all this took.2 We all know that wills take a while to process, even if the deceased had a living trust and clearly designated beneficiaries for each account, and the estate does not go to probate. Still, the companies that offer investment vehicles such as individual retirement accounts and annuities can take weeks to complete the paperwork that transfers the ownership and moves the assets into a trust account. But all that’s just on the legal and financial side.

Part of the delay in wrapping up the rest of her life was my own sense of emotional inertia. I managed to go through the drawers of old bank statements and receipts in the first couple of weeks after Irene died. Partly, this was because it was easier to deal with these paper records, which have an impersonal and antiseptic flavor to them. Partly, because I needed to find documents like the deed to the condo, title to her car, and any surprises like unacknowledged debts and obligations, before the legal part could proceed. Irene managed our household finances as part of the arrangements governing our marriage. We always kept separate bank accounts and credit cards, because Irene didn’t really trust me with money.3 So going through her accounts was a voyage into terra relatively incognita and once jealously defended.

I found it harder to go through her closets and prepare her clothes for donation. Partly, this was a lack of urgency, because those clothes weren’t going anyplace soon and I didn’t need the extra closet space. Partly, it was the personal nature of the exercise. Somehow, sorting her entire wardrobe for disposal brought home, as neither sitting in the lawyer’s office nor closing her bank accounts could, that Irene was really gone and wasn’t coming back. Not at all. Not ever.

As with the paperwork, Irene’s closets were also an unknown area in our lives.4 My wife—and all women, I think—like to keep their wardrobe space private. Even though we dressed and undressed in the same bedroom, she wanted to appear to the world—and that included me—as a completed presentation. Her decisions about the components that went into the look of the moment—this top with those slacks, this sweater or jacket—were supposed to remain a relative mystery. She never asked for my approval of any piece of apparel. And Irene did the laundry anyway; so she could keep her clothing arrangements entirely separate from mine.

It’s not like I did no chores around the house. She managed the laundry as a preferred activity to vacuuming and dusting, because she hated the noise, the smell, and the routine dirt of those two chores. I didn’t mind cleaning the house, but I didn’t care for the dampness, fumes, and critical timing of running loads through the washers and dryers down in the laundry room. So, for forty years, we traded off these tasks. And then, I could cook but not as well as Irene, and she was more adventurous with recipes—my meals being a bachelor’s basic fare of homemade chili, hamburgers and hotdogs, pizza from scratch, and other fast-food staples. Oh, and bread, because my mother insisted her boys know how to bake. After Irene retired, she took over most of the cooking and mostly had fun with it. And because she had the car and reserved rights to the condo’s one assigned parking space, while I rode a motorcycle as my basic transportation, she did most of the shopping, which became part of her keeping the household accounts. And finally, we traded off walking the dog four times a day, because when you live in a twelfth-floor apartment, you can’t just open the back door and let the animal run out into the yard. Of such long-standing agreements is a marriage made.

Now that she’s gone—in fact, from the first week after she died—I inherited all of these chores for myself. And it’s surprising how small a deal they really are. I reverted to my bachelor routines of forty years ago for assessing and doing the laundry and shopping for my groceries. As with the cleaning, I organized these tasks into basic routines, put them on a schedule, and started executing them efficiently.

They say that when you start drinking again after a long period of abstinence, you don’t start over again with fresh tolerances and work your way back up to heavy indulgence. Even after years, the body remembers exactly where you left off. Stay sober for decades,5 and within a week of taking that first drink again, you’ll be downing a bottle of wine or a couple of six-packs a night. So it is with the routines and skills you develop early in bachelor life. Irene had little quirks about performing her chores: how to fold the towels, which items to let air-dry, how to load the dishwasher—even after I had already put my dishes in—and what pans had to be scrubbed by hand. Within two weeks after her death, the linen closet and the dishwasher looked like my bachelor days. Not less neat, and certainly not less clean, but just different. And I’ve also started modestly rearranging the furniture according to my own ideas, rather than the placements we could agree upon—or fail to reconcile—together.

But forty years of living with one person leaves a mark—no, the years wear deep grooves in your psyche. I may be adapting now to the old patterns of taking care of myself. But sometimes I hear a rattle in the hallway. I know it’s the wind, but I think it’s Irene with her keys. The dog hears it, too, and her head comes up and her ears go erect. I hear someone say, “Tom …” in a crowd, and it’s in Irene’s tone of voice. I know she won’t be coming back—I’ve done the necessary sorting and packing away—but she is still there at the edge of my mind.

The condo, which was enough for two people, now feels too big and empty.6 I feel like David Bowman from 2001: A Space Odyssey, alone at the end of his voyage in an absurd French-empire hotel room, listening for someone to come, for something to happen. But I don’t exactly listen, because I know I’m alone. Yes, I have friends—most of them associations that Irene and I made together—and family members in the area. But the days are long. And thank heavens for the dog as a companion to give the apartment some life. Thank Irene, actually, because this particular dog was her choice—it was her turn to pick—at the animal rescue shelter.

But now I’m caught at the end of my life, too old to start over. I’m wondering what, if anything, comes next.

1. Of course, my blogs are all personal, but some are on a higher intellectual and emotional plane than this.

2. In the recent movie Arrival, the heroine asks about one of the aliens who has gone missing after a bomb attack. Her interlocutor replies, “Abbott is death process.” I now understand that phrase, “death process,” all too well.

3. In every relationship, one person is always the Grasshopper from Aesop’s fable, while the other is the Ant. Guess who was the Grasshopper? In the same way, in every relationship, one partner is always Oscar from the Odd Couple, while the other is Felix. Guess who was Oscar?

4. One thing I learned from handling every piece of Irene’s clothing is how lightweight and insubstantial women’s clothes really are. Men’s slacks are made of heavier material and then thickly sewn with substantial waistbands, belt loops, gussets, and lots of pockets. Men’s shirts usually have reinforced collars, shoulder yokes, button plackets, and one or two chest pockets. Women’s slacks and blouses are thin material with hidden seams and virtually no pockets. Even items that would seem to be common to both, like polo shirts, are made of a different weight of yarn or a lighter weave. I guess this is because women’s clothing is supposed to drape and cling, while men’s clothing is meant to be essentially shapeless.

5. I now have thirty-two years of sobriety, and forty-four years without tobacco, both after going cold turkey for the sake of my health. Not that I’m thinking about starting up either vice again.

6. Friends keep asking if I plan to stay there or move. This was the home Irene and I moved into when we first got married and we never left. Partly, that was because we didn’t want to be house-rich and cash-poor at Bay Area real-estate prices. Partly, it was because we never could agree on any other house we saw or any move we might make in any direction. But the condo will always be Irene’s-and-mine and not mine-alone. I think this is what my friends sense.

Sunday, December 10, 2017

Learning as a Form of Evolution

Neuron cells

I’ve been making some existential comparisons lately—Life Like a Sword and Language as a Map—so I thought I would round out the sequence of metaphors by looking at the way we form our knowledge.

The popular conception is that we acquire new knowledge the way a coin or stamp collector makes a new acquisition: pick up the fact, place it in our memory box, recall and retrieve as necessary. Our head then becomes a database of acquired facts, like a phone contact list or a Rolodex of business cards. Take one and slot it in. Modify it with an overlying note if any of the information happens to change. And remove the card—that is, commit ourselves to forgetting the subject heading and its content—when the information is shown to be wrong or is no longer of use.

But is that really how it works, all neat and tidy, like filing little pasteboard squares?

Actually, our brains are representational devices. We interpret our sensory input and apply it to the process of assembling a model or representation of what we know and think about the world outside our heads. We are model makers, map makers, myth makers, and story tellers. What we learn goes into a vast web or network or congeries of impressions, summaries, conclusions, and projections that collectively represent the world as we know it. We are constantly testing and expanding our worldview. We are always asking, consciously or not, “Is this the way the world really works?”

We are constantly—although perhaps unconsciously—looking for comparisons with and similarities to the things we already know. When we get a new fact or form a new impression, we test it against our worldview, the structure of our model of the world. We ask, “How does this fit in?” And if the fact or impression conflicts with what we know, our brain goes through a small crisis, a scramble for immediate understanding. We test the new knowledge against its background: “Where did I get that idea?” “What was its source?” and “Do I trust it?” We also experience a small—or sometimes large—tremor in our worldview: “Why do I actually think that?” “Could I have been wrong?” and “Is this new knowledge a better way of seeing things?”

The habit of referring back to our internal model runs deep. For example, when learning a new language, such as French from the perspective of an English speaker, we leverage the grammar and the words we already know and understand. When we learn a new French word like chien, we don’t immediately associate it with a four-footed pet of a certain size range, disposition, coloring, and similar physical details. Instead, we link it to the English word dog and then concatenate onto chien all the past impressions, learned attributes, and personal feelings we already associate with the concept in English. In the same way, we adapt French grammar and syntax to our known English way of speaking, and then we extend our knowledge with new concepts, like the issue of nouns and objects that we normally think of as inanimate and sexless now acquiring a specific gender. By learning a new language, we expand our general knowledge of both our own language and its place in the way the rest of the world communicates.

In this sense, each piece of new knowledge—both the facts, impressions, and understandings that we acquire by the happenstance of general reading and daily experience, and those we acquire by conscious study, such as a new language, the history of an unfamiliar place and people, or a closed curriculum like mathematics, physics, and chemistry—makes a series of minute changes in the brain’s internal environment. And the effect that these new facts and impressions have on our existing ideas—the current model or myth that is running in our heads—is like an organism’s response to accidental modification of a protein-coding gene: the new knowledge and the resulting change in our worldview either enable us to live more fully, completely, successfully, and confidently in the environment that we actually inhabit, or the changed worldview contributes to our failure to compete and thrive by causing us to interpret wrongly, make mistakes, and suffer feelings of doubt, denial, and depression.

But some facts or interpretations—perhaps most of them—don’t cause an immediate change in our relationship with the outside world. We can carry a bit of false data, a misremembered fact, or an untested impression in our heads for months or years at a time without it affecting our personal relationships, our social standing, or the decisions we make. And then, one day, we will learn something else that will contradict the comfortable model and bring on the crisis. In the same way, some mutations to a gene have neither a helpful nor harmful effect in the current environment. The modified gene and the changed protein it makes gets passed down from generation to generation without challenging the fit of the organism to its environment. But then the environment changes, and the organism is either better able to compete under the new conditions, or the changed environment shows up an inherent weakness, and the organism either thrives or dies. Sometimes the environment doesn’t have to change, but another mutation enhances the effect of that earlier genetic change, and the organism either excels against other members of its species or fails to compete.

As an example of the mutability of our worldview, both as individuals and as a collection of academics building a body of scientific or historical interpretations, consider the advance of human knowledge in the field of genetics.

At first, back in the early 1950s and the world of Watson and Crick, we valued the newly discovered DNA molecule and its messenger RNA strands solely for the proteins they made inside the cell body. Genetic scientists held to what was then called the “central dogma” of molecular biology, that DNA transcribes to RNA, which translates to proteins. Geneticists could point to the start and stop codes associated with the protein-coding genes. By finding and fishing out these codes, they could pull out sequences of DNA, copy them over to RNA, and analyze the resulting coded calls for each of the twenty possible amino acids in the developing protein string. These twenty amino acids are the universal building blocks for all of an organism’s complex proteins—in fact, for all life on Earth.
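For the technically minded, the central dogma described above can be sketched in a few lines of Python. The DNA fragment and the handful of codons in the table are illustrative only; a real gene uses all sixty-four codons and runs to thousands of bases.

```python
# A minimal sketch of the "central dogma" described above: DNA transcribes
# to messenger RNA, which translates to a chain of amino acids.  Only a
# handful of the sixty-four codons appear in this toy table; the DNA
# fragment below is invented for illustration, not a real gene.

CODON_TABLE = {
    "AUG": "Met",                 # start code, also the amino acid methionine
    "GGC": "Gly", "GCU": "Ala", "UGC": "Cys", "UUU": "Phe",
    "UAA": None, "UAG": None, "UGA": None,    # stop codes
}

def transcribe(dna: str) -> str:
    """Copy the DNA coding strand into messenger RNA (T becomes U)."""
    return dna.upper().replace("T", "U")

def translate(mrna: str) -> list:
    """Read the mRNA three bases at a time, from the start code to a stop code."""
    start = mrna.find("AUG")
    if start == -1:
        return []
    protein = []
    for i in range(start, len(mrna) - 2, 3):
        amino = CODON_TABLE.get(mrna[i:i + 3])
        if amino is None:         # stop code, or a codon missing from this toy table
            break
        protein.append(amino)
    return protein

if __name__ == "__main__":
    dna = "TTATGGGCGCTTGCTAAGG"    # invented fragment
    print(translate(transcribe(dna)))   # ['Met', 'Gly', 'Ala', 'Cys']
```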

This central dogma held until about the year 2000, when the Human Genome Project and Celera Genomics published draft sequences of the entire three billion base pairs in twenty-three human chromosomes. Analyzing the code, geneticists then discovered that only about ten percent of this DNA was used for making proteins.1 So what was the other ninety percent doing? Many scientists figured that this genetic material was “junk DNA,” old code left over from our prior evolution, from genes that coded for proteins that our evolutionary ancestors might have needed as fish or reptiles, but with no meaning now and so abandoned to gradually mutate into genetic mush.2

The new facts about the frequency of protein-coding genes forced a reevaluation—a modification of the scientists’ mental model—of the nature of the genome. The scientific community was left with either the “junk” hypothesis or a state of simple wonder until about 2004, when a new bit of knowledge emerged. Botanists working with certain flowers discovered that a short strand of a particular RNA, when introduced into a seed, can change the color of the flower. They hypothesized that the RNA either promoted a gene that had previously been silent or blocked a gene that had previously been expressed. They dubbed this effect “RNA interference,” or RNAi.

Soon, the genetic scientists were studying a class of short RNA strands, about fifty bases or less, that they called “microRNAs,” or miRNA. They began to see that these bits of RNA were used inside the cell nucleus to promote genes in different patterns of expression. And then Eric Davidson at Caltech, by working with sea urchin embryos, mapped out the network of genes in an undifferentiated embryonic cell that produced bits of microRNA to promote other genes to make different miRNAs—all without coding for any proteins. Depending on a cell’s position in the sphere of identical embryonic cells that develops shortly after fertilization, the pathway through this miRNA network changes. Some of these cells, through the proteins they eventually produce, become the internal gut, some the epidermal surface, and some become spines. By comparison with another organism far removed from sea urchins, the Davidson laboratory could trace out a similar network—which means it operates in most animals and plants, and likely in humans today. This miRNA network is the timing and assembly manual by which some embryonic cells in our bodies become liver cells, some brain cells, and some bone cells.

This discovery addressed a question that apparently no one had ever considered. If the entire genome is for producing proteins, then why doesn’t every cell in the human body make all the proteins required by all of the other cells? Why don’t neuron cells pump out liver enzymes and bone cells create and then, presumably, ignore neurotransmitters? Davidson’s work suggested that, while ten percent of the human genome makes proteins, functioning as the parts list of the human body, the other ninety percent is the sequential assembly manual.

But the story didn’t end there. Other geneticists noted that simple chemical compounds called methyl groups (CH3) often became attached to the promoter regions of genes—usually at sites where a cytosine base is followed by a guanine—and inhibited the gene’s expression. They at first considered this an environmental accident, randomly closing off gene function. But they also noted that an enzyme in the nucleus called “methyltransferase” worked to add these methyl groups to newly replicated DNA strands during cell division. If methylation was an accident, why was there a mechanism to preserve it in daughter cells?

From this question, the scientific community began studying methyl groups attached to DNA and learned that this was the cell’s way of ensuring that brain cells didn’t begin producing liver enzymes and bone cells didn’t make neurotransmitters. Once a cell had differentiated to become a certain type of tissue, methylation locked out its other possibilities.3
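If it helps to see the bookkeeping, here is a toy model, and nothing more, of that methylation lock. It assumes the simplest possible rule: a gene is expressed only when its promoter carries no methyl groups. The gene names are made up, and real chromatin regulation is vastly more involved.

```python
# A toy model of the methylation "lock" described above, assuming a gene is
# expressed only when its promoter carries no methyl groups.  The gene names
# are invented; real chromatin regulation is far more involved.

from dataclasses import dataclass, field

@dataclass
class Gene:
    name: str
    promoter_methylated: bool = False   # CH3 groups attached at CpG sites

    def is_expressed(self) -> bool:
        return not self.promoter_methylated

@dataclass
class Cell:
    tissue: str
    genes: list = field(default_factory=list)

    def differentiate(self, keep: set) -> None:
        """Methylate the promoters of every gene this tissue type no longer needs."""
        for gene in self.genes:
            if gene.name not in keep:
                gene.promoter_methylated = True

    def divide(self) -> "Cell":
        """Daughter cells inherit the methylation pattern (the methyltransferase step)."""
        return Cell(self.tissue,
                    [Gene(g.name, g.promoter_methylated) for g in self.genes])

if __name__ == "__main__":
    embryonic = Cell("embryonic",
                     [Gene("liver_enzyme"), Gene("neurotransmitter"), Gene("housekeeping")])
    neuron = embryonic.divide()
    neuron.tissue = "neuron"
    neuron.differentiate(keep={"neurotransmitter", "housekeeping"})
    daughter = neuron.divide()
    print([g.name for g in daughter.genes if g.is_expressed()])
    # ['neurotransmitter', 'housekeeping'] -- the liver gene stays locked out
```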

So the community of molecular biologists had to work gradually, discovery by discovery, to develop and refine their model of human genetics: from the central dogma of protein production being the only purpose of all DNA, to a whole new use of RNA to differentiate cell types, to the inclusion of “accidental” methyl groups to lock in that differentiation.

Every science goes through such an evolution and refinement of knowledge, discarding old ideas, patching in new discoveries, building, tearing down, and rebuilding the model, each time coming closer to what’s really going on in the world. In the same way, every human being learns certain skills and truths, discards old notions, patches in new understandings, building and tearing down his or her worldview, until the person attains something approaching … wisdom.

1. This did not leave them with too few genes to account for all of the body’s proteins, because they also discovered that many genes have alternate splicings. The scientists already knew that some gene sequences had “exons,” or patterns that expressed the code for the protein, interspersed with “introns,” or non-coding intrusions into that pattern. What they learned from the human genome was that the promoter region ahead of the start code could specify different ways to combine those exons to create variations in a family of proteins. Nature is more wonderful than we can imagine.

2. Not everyone agreed with this assessment. The human body spends too much time and energy replicating the entire genome each time a cell divides for us to be carting around this much junk. After all, the phosphate bonds (PO4) that are the backbone of each paired strand of DNA are also the working part of the cell’s energy molecule, adenosine triphosphate. And phosphorus is not so common, either in nature or in the body, that we can afford to hoard it and squander its potential with junk DNA.

3. Methylation also would explain why the early methods of reverting a body cell to a type of embryonic cell, by starving it until the cell almost dies, worked so poorly. This was how the scientists in Scotland cloned Dolly the Sheep, and in order to achieve the one viable Dolly, they had to sacrifice hundreds of attempts at cloned embryos and raise not a few genetic monsters. The starvation method must have essentially stripped out methylation as the cell approached its death, reverting the genome to its undifferentiated state.

Sunday, December 3, 2017

The Trials of Vishnu

Tower of Pisa

In the Hindu pantheon, three major gods form a trinity:1 Brahma the Creator, Shiva the Destroyer, and Vishnu the Preserver. Although I constantly try to emulate Brahma in my writing, creating imaginary worlds, people, and situations as if they were real, I find that in everyday life my patron deity is actually Vishnu.

I tend to hold on to things, sometimes cherishing them for sentimental value and sometimes simply because they might, one day, under the right circumstances, become useful again. For example, I have an old and now somewhat tattered bath mat that my paternal grandmother once crocheted. That’s from sentiment. I have in storage books that I decided years ago I wasn’t going to read right away but that I might need again. And in my closet I think I have the dress shoes I wore to my high-school prom, now fifty years ago. And at the back of my closet … even Vishnu doesn’t want to look there.

A lot of this preservation deals not so much with just keeping things as with keeping them in their proper, pretty, pristine, and like-new state. This is a manifestation of my own particular form of obsessive-compulsive disorder, or OCD. For many people with this disorder, the compulsive task is repeatedly washing their hands, or checking for their car keys, or reconfirming that the stove is indeed turned off. Performed enough times, especially under conditions of stress, the disorder can be crippling to a normal life. My form of OCD deals primarily with the two S’s: surfaces—which includes scratches—and straightness.

If an object in my possession has a shiny surface, I am constantly checking it and cleaning it of dust and smudges: the screens of my iPhone, iPad, and computer monitor; the cases of any of these devices; and the shiny bits of my motorcycles such as gas tank, fenders, windscreen, and dial lenses.

Before I go for a ride, I use a wet paper towel to float off the dust, followed by a microfiber towel to wick off the water without leaving droplet marks or fine scratches. After I ride, I use the still-damp towel to clean off any bug carcasses and dust I’ve picked up along the way. If I find a scratch—however minute, no matter that it only shows up in direct sunlight and from certain angles—I bring out the wax or the plastic polish to address the defect. When I wash the motorcycle, I immediately follow it with a coat of wax or acrylic sealant to preserve the surface. If there’s a deeper scratch, visible under any light conditions, I bring out the polishing compound and work it to oblivion, hoping that the mar doesn’t go through the clear coat and into the paint. And if there’s a stone chip, I go after it with touchup paint, followed by compounding and sealant.

You might think that the solution here would be the new matte finishes that motorcycle manufacturers have introduced over the past several years. But they can’t be touched up for visible scratches. And then I worry about wear, especially the sides of the tank where my knees grip the surface and the fabric of my trousers would leave—horrors!—a shiny spot. There’s just no way to win this game.

This is why my favorite material is glass. It wipes down easily, and usually it will break—and thereby have to be thrown away—before it will scratch. For this reason I like tempered glass for my eyeglasses instead of the new plastic lenses. (I’ve had enough polycarbonate motorcycle windshields to know that, while they might take a bullet, they also scratch fairly easily.) I also favor polished titanium and stainless steel for my watches because of their wear resistance. I’m just picky that way.

If an object has a scratch or wear mark that I can’t polish out, I agonize over it. I see that point of infinitesimal damage more than the whole bright surface or the shape, design, and purpose that brought me to admire and desire to obtain the object in the first place. Is this a crippling affliction? To my daily round of activities—in the way incessant hand washing would be—no. But to my emotional stability—when I have actually considered selling a motorcycle because of a deep and unfillable stone chip in its lustrous black paint—well, yes.

The other aspect of my disorder is the alignment and straightening of things. Part of this was my upbringing as the son of a landscape architect. My mother had an innate sense of design—my father had it, too, but not to her degree—backed up by her training as a meticulous draftsman and landscape gardener. Even though her courses taught her to “avoid straight lines” when laying out a flowerbed, she appreciated things that were square and even. And she wanted everything to have its own place and its own space. So if I, as a youngster, pushed my desk into a corner of my bedroom—so that its edges were touching both walls—she would gently advise me to pull it out, at least an inch from the back wall and six or seven inches from the side wall, so that the desk “owned” its space in the room. Crowding furniture side by side and pushing area rugs up against the baseboard were a violation of her own particular feng shui.

So I practice straightness in my environment. Pictures hang level. Wall clocks have their twelve and six aligned vertically. Rugs are square with the room. When we bought the condominium, we had a hardwood floor installed instead of the usual wall-to-wall carpet. The pattern of the parquetry is a series of small wood oblongs arranged in larger squares. Thank God the installers aligned the sides of these squares with the walls of the room—although the plasterboard itself is none too straight in some places. Otherwise, I would live in a nightmare of constantly trying to square up the floor and walls in my mind, or squint until they almost aligned. But I do keep pushing the area rugs—which are all rectangles, no ovals or circles here—to align with the edges of the floor squares. And I judge the position of a table or chair by counting the wood blocks in the floor pattern at each leg. On my daily path through the apartment, I am constantly straightening a rug with my toe, squaring up the hang of a picture, pushing at a table edge, aligning a place mat, adjusting the spacing between items on a shelf. It’s an endless job.

Some of this compulsion has made me a better writer and editor. I see grammatical looseness as a violation of alignment. I see unfinished thoughts as incongruent with the shape of an argument. The nits of spelling and punctuation are minute scratches—some of which only I can see, and then only in certain lights—that must be polished or repaired. Of course, when it comes to laying out a newsletter page or a book cover, I try to give pictures and other graphic elements their own space and not crowd them. And I have an eye that is calibrated—or used to be—to a printer’s point, or a seventy-second of an inch, about a third of a millimeter.

For example, in choosing the photo of the Tower of Pisa, above, to accompany this article I selected the best image for its lighting and background. But then, in Photoshop, I had to rotate the image two-point-five degrees counterclockwise because, while the tower was leaning most dramatically, the buildings, the light pole in the background, and the implied horizon were also tilted. I notice these things. They bother me.
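For anyone who would rather make the same correction in code than in Photoshop, a minimal sketch with the Python Pillow library might look like the following. The filename is hypothetical, and in Pillow a positive angle rotates the image counterclockwise.

```python
# A minimal sketch of the same straightening step using the Pillow library
# rather than Photoshop.  The filename is hypothetical.
from PIL import Image

img = Image.open("tower_of_pisa.jpg")
# Positive angles rotate counterclockwise; expand=True keeps the corners,
# and fillcolor sets the background revealed by the rotation.
straightened = img.rotate(2.5, expand=True, fillcolor="white")
straightened.save("tower_of_pisa_straightened.jpg")
```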

So, is this an affliction or a source of strength? I don’t know. It is quirks like this, if anything, that define me. But I do know that, when I am in my grave, my ghost will be haunting my last dwelling place, nudging ineffectually at a crooked rug and scrabbling with ectoplasmic fingers to straighten a tilted picture.

1. What is it with religion and triads? First, there’s the familiar Father, Son, and Holy Ghost—which never made much sense to me, because of the “ghost” part. And then, in the ancient Celtic religion, things usually came in threes and their bards were expected to declaim in rhyming triplets. And finally, the Scandinavians had the Norns, three old women, one to spin the thread of life, one to measure it, and one to cut it off. I find that in my writing, which generally comes from the subconscious, I sometimes feel the work is incomplete unless an argument is supported by three examples, a list includes three members, or an object is tagged with three distinct adjectives. I guess I take after the bards in that way.

Sunday, November 26, 2017

The Problem With “No Problem”

French marketplace

I believe that the spoken forms of courtesy are the grease that lets our creaky social machinery, as well as our badly joined—as in a cabinet whose drawers and hinges aren’t quite aligned—personal senses of self-respect and shared obligation, function within tolerable limits. Speaking those little, objectless sentence fragments like “please” and “thank you,” as our mothers taught us, is what keeps us all from screaming, leaping, and tearing at each other’s throats.

In the same way, those almost pointless physical courtesies, like making eye contact with the people you just met, to show that you are both an interesting person and willing to communicate in depth, or offering your empty hand, to show that it doesn’t hide a weapon—or, in some cultures, offering a short bow, to expose your neck as a sign of trust—are all ways that we signal not just our common humanity and a benign spirit but also our willingness to take risks in meeting strangers and new acquaintances halfway.

Almost all such polite phrases in common usage are archaic forms, spoken relics with the sharp edges and the grammatical functions worn off.

“Please,” for example, is a much-shortened form of “may it please you.” That is, with the thought fully spoken: “Do this for me only if it would give you pleasure to accommodate my request.” You see an echo of this in the French form: “s’il vous plaît,” or literally “if it pleases you.” The French verb plaire, which has its root in the Latin placere, means not just to be pleasing but also to give enjoyment or satisfaction. This is a fairly gracious way of thinking and acting. This little verbal elision says, “I know that you have freedom of action and can make choices in how you do things. My request is probably going to be a burden on you in some fashion, and I don’t want to cause you any trouble. So consider complying only if it would give you pleasure or satisfaction.”

In modern usage, at least in American society, “please” has become some sort of code word for an intended enhancement, like adding “really” or “very” to a descriptive adjective. The word “please” has acquired the effect of saying, “I’m serious about this.” So, from someone trapped in a locked room: “Let me out! Please let me out!” This goes back to the childhood escalation of entreaty: please, followed by pretty please, and ending with pretty please with sugar on top.

In the formula “may it please you,” the verb is in the subjunctive mood. This is an element of grammar that almost nobody teaches anymore, which means that students may have a hard time recognizing and using it, although it still sticks in the ear and in the mind, and it represents something we all more or less understand.

We all learn about the three basic tenses in English: past (something happened), present (something is happening), and future (something will happen). But these tenses also have their completed or perfect and their uncompleted, progressive, or imperfect forms: the perfect implies something that happened once and is now over and done, while the imperfect implies something that has happened before and may still be happening and recurring. In addition, we can work the changes of past, present, and future as verb forms from the viewpoint of one of the non-present tenses. You can use English to indicate something that happened in the past from the viewpoint of the past tense itself—such as, “I had gone to the store”—or something that is in the past from the perspective of the future tense—“I will have gone to the store.”

All of these tenses—which you learn in depth when you study the grammar of classical languages like Latin and Greek—are in the indicative mood, which deals with actions that really did, do, or will occur. But English and most other languages also have the subjunctive mood, describing what we hope or expect but are not certain will occur, or an occurrence about which we want to express some doubt. “May it please you,” in the example above, is a wish that honoring the request will be pleasing to you, not a statement that the request will automatically be pleasing. In the same way, when a person says, “God bless you,” it is not a statement of fact, that God actually does bless you, or a command in the imperative mood, “God, bless this person” (which is on the same plane as “Dog, sit!”). Instead, we are saying “[May] God bless you”—which is a hope, a wish, and an offering of benediction that the well-wisher is not in a position to grant him- or herself but that is for the Supreme Being to provide.1

When we say “Thank you,” things are a little more straightforward. Here the elision is simple, just dropping off the subject of a complete sentence: “I thank you.” No subjunctive need apply: you are offering actual thanks, an acknowledgement of a benefit received, and perhaps an obligation to return the favor in the future. This is the point where we graciously entertain a risk: we are saying that we do accept the obligation to one day respond in kind. As adult individuals able to stand on our own two feet, we are shy about receiving gifts and favors. The usual word for something received without payment is “gratis,” meaning “free,” which goes back to the Latin gratia, meaning favor or kindness. To receive a favor or a free gift puts one in a lowered position, because only children, servants, and slaves expect to receive something for nothing. So we express thanks, with the implied obligation to return the favor, which puts us back in a position of social equality with the giver.

The response of the giver used to be, in American and in most English-speaking societies, “You are welcome.” This phrase has its roots in the words of a householder or host greeting visitors: “You are well come,” meaning it is good that you have come; your arrival is appreciated; and by extension, my house and my hospitality are at your disposal. You can see the sense of this in the opening song by the Master of Ceremonies in Cabaret: “Willkommen! Bienvenue! Welcome! Fremder, étranger, stranger. Glücklich zu sehen. Je suis enchanté. Happy to see you. Bleibe, reste, stay.” That is, to tell a person thanking you that they are welcome is to return and thereby nullify the obligation. “No, this was mine to give, and no obligation is created.” This is a pleasantly gracious turn of phrase.

But more and more in American usage I hear—and receive for my spoken thanks—the phrase, “No problem.” The underlying message here is quite different: “Providing this gift or favor, or doing this service for you, did not inconvenience me. Serving you has not been a problem for me.” Perhaps, if we want to be kind, we can interpret the message as: “I recognize that I am here to serve you. That is my purpose—and therefore not troubling to me.” But still, there is a reversal of message and a diminishment of intent here. Not “I grant you freely without obligation, as a host offering hospitality,” but “Your implied demand for service has not inconvenienced me.” Or “Your being in a position to receive a favor—as a small child or servant—might have inconvenienced me, but I want to let you know that it didn’t and I reject the notion that I had any obligation to give you a gift or provide you a service in the first place.”

“Thank you” and “You’re welcome” are a yin-yang pairing, the simultaneous creation and removal of a feeling of obligation, engendered by an originating act of kindness. The reply of “No problem” denies the intention behind that original act. In this way, it feels like a rejection of politeness rather than an expression of it. Perhaps I’m being over-sensitive, but to hear “No problem” when I try to thank someone leaves a faint taste of disdain in my mouth and in my mind.2

1. The clue that a speaker is using the subjunctive is a verb conjugation that would normally sound and feel wrong. For example, the third-person (he, she, it) conjugation of “please” is “pleases,” as in “It pleases me to see you.” But “if it please you” leaves the matter in some doubt, something hoped for rather than stated. Similarly, when you see “helping” verbs like “may,” “would,” or “should,” you are often dealing with the subjunctive.

2. But cultural tastes differ. In two languages and cultures widely separated by geography and affinity, the Spanish use “De nada” and the Russians “Nichevo” to say the same overt thing as the English phrase “You’re welcome.” Both the Spanish and the Russian translate as “It’s nothing”—as does the French “Pas de quoi”—which is pretty close in semantic content to “No problem.” But I still find the English “You’re welcome” more gracious.

Sunday, November 19, 2017

When Wars Will End

Chinese women marching
Terra cotta soldiers

When I was growing up, I remember a tee shirt with the motto: “Be Reasonable, Do It My Way.” As with most such shirts, the intention was to be amusingly ironic. But I find it amazing that many people, especially in the realms of politics and economics today, actually think like this. No, not that they would ever voice this motto, because formulating this thought aloud in words would show them how fatuous this idea of compliant control really is. But still, it lurks there in the back of their minds: “I’m right, you’re wrong, we’ll all get along when you just shut up, listen to me, and obey.”

For the person who unconsciously subscribes to this tee-shirt motto—or who can look at it, read it aloud, and still fail to find the irony—the people around them must be something other than real, live, self-actuating, independent, and strong-minded human beings. Maybe the subscribers believe they are surrounded by a species of fleshy ghosts, or puppets, stick figures, and imaginary characters, like those in a book or play. For the subscribers, other human beings may have faces and voices, but their thoughts, their reactions, and the words that come out of their mouths are somehow unrelated to reality. For the people who believe Their Way is the Only Way, the intentions, opinions, aspirations, and desires of other people are simply unimportant, fictitious, or wrong—on the order of “You can’t really believe that, can you?”

We live in a world of varied opinions. Freedom of thought and action is not just an abstract idea, to be written somewhere on a dusty parchment and forgotten when convenient. The human ability to approach ideas, propositions, and profound beliefs as intellectual objects, to dissect them, to weigh the evidence for and against each side, to reach a conclusion, and then act on it … this is all a function of each human being’s having a unique brain inside a physically separate body. Perhaps all protozoans exhibit common reactions to their environment, based strictly on their genetic code and the interactions of their internal proteins. Perhaps all fish, frogs, and reptiles are predictable in their behaviors, based on the primitive structures of their vertebrate hindbrains. But when you get into the class Mammalia, where the brain starts developing different lobes and functions, learns from its environment, and can overcome its reactions, all bets are off and groupthink becomes a relic of the distant past. Even dogs have different personalities and operate with their own senses and ideas.

Yes, there have been societies with monolithic social and political structures: Nazi Germany, Soviet Russia, Maoist China, and North Korea under the Kim regime. We are all familiar with their rallies and parades, with row after row of soldiers marching in step, usually followed by military vehicles which, lately, are towing intercontinental missiles.1 You can look at the newsreels and the pictures and think that everyone in these societies must think the same: ants with jackboots and armbands. But note also that every one of these societies has an active bureau of secret police and—somewhere out in the country, away from the cities—a growing population of political dissidents in labor camps.

So, the people who march in step and who stand out in the sunshine with their arms raised—that is, the members of these societies who don’t go into the labor camps—do they all really think alike? Are the mottos of the national party and the image and words of the dear leader really engraved on their hearts? While it’s not possible—not yet, thank God—to open a person’s skull and examine his or her brain from the outside to see the content of its innermost thoughts, I believe that even there we would find a diverse mix of ideas, intentions, and reactions. Some people actually believe the party mottos and the leader’s words because they have heard the arguments, weighed them to the best of their abilities, and agree with them. Some believe because the arguments are easy to understand and affirm, and the person—for a variety of reasons—doesn’t think or care much about politics or economics, or the greater questions to which political and economic thought applies. Some believe that by saying the words and mimicking belief in them they can be part of the larger thing that is moving through their lives whether they want it or not. Some smaller fraction of this group believes that if they say the words loudly enough and show other signs of commitment and support they can get ahead in their job or their living situation—that they can climb on the back of the beast, ride it, and perhaps one day even be allowed to take the reins. And others believe that the beast is here, is a fact of life whether they want it or not, and the course of greatest safety for themselves and their family is to pretend to show it their support.

Even in the heads of the jackbooted soldiers marching down the street—aside from immediate concerns about missing a step, scuffing the shine on their boots, or being caught with their uniform in some other kind of disorder—you would find this diversity of beliefs and expectations.

Even in the social dimensions where belief is the basis of a group of people coming together in the first place, you will find that they all have different ideas and opinions. From the simple words of a carpenter and later a rabbi who lived two thousand years ago, how many varieties of Christianity have sprung up, encountered schisms, and split over issues like the nature of the godhead, the nature of sin and right and wrong, the meaning of the words as they are written in the texts, and even the substantial nature of bread and wine? Even a supposedly monolithic religion like Islam, which adheres to the words of a single man exactly as they were written down from his lips, has split into factions over the rights of his successors, the importance and interpretation of certain passages above others, and the proper form of obedience. Even the simple and direct message of the Buddha has generated two versions of the proper way to observe and follow it, plus cultural variations in every land that has adopted Buddhism. Religions like Christianity and Islam will often act or react in a monolithic fashion when expanding into new territories or confronting outside opposition, but all the while their adherents are thinking, interpreting, adopting, and preparing to set up another sect or create a new schism.

People cannot help but have different ideas. In every land, in every generation, the restless human mind examines, interprets, adapts, and sometimes discards the thoughts and beliefs, patterns and traditions that wash across groups of people like the waves in an ocean. In reality, most people cannot hold a single belief or thought in their heads for all of their lives. Most groups larger than an extended family or small tribe cannot remain cohesive for longer than a generation—and some not even until the next election.

Whether a person believes in and cares deeply for a political or economic proposition or tradition; or follows it only because mother, father, or a teacher once voiced it and the person him- or herself really doesn’t care; or follows it because his or her neighbors are suspicious, will whisper among themselves, and will one day turn the person in to the secret police … whatever the case, various societies tend to build up standing waves of belief and tradition that in turn will engender consolidated political and economic actions. Societies in this state of temporary conformity, at the crest of the passing wave, will sometimes try to spread their rigid social, political, or economic order to other groups and countries. And then, in an excess of enthusiasm, war will break out, soldiers will be enlisted for fighting and killing instead of just marching, and human beings will endure the results in terror and misery.

When will the world come together in harmony? When will the various religious sects recognize that their functional similarities outweigh their doctrinal differences and join together in uniformity? When will all of the Earth’s nations agree on a single political or economic principle and a social order that is not tainted by the interpretations and cultural characteristics of one group or another? When will they put off their differences and give up their quarrels? When will the wars end?

Only when we can put off our restless human mind, stop thinking and examining the propositions of everyday life, and become something more like fleshy ghosts or stick figures than active, thinking human beings … That is, never.

1. I find it telling that the Western democracies, which allow wide latitude for political and religious dissent, don’t hold these monolithic rallies and military parades. When was the last time you saw members of the military services marching in lockstep down Pennsylvania Avenue? Not on the Fourth of July: our national parades are a time for high-school marching bands, costumed dance troupes, veterans groups, and balloons. The active-duty military only come together and march in step to honor a fallen president.