Sunday, December 29, 2013

Believer or Seeker?

I took an online survey recently where, in the personal identification section, the pollster asked for my religious affiliation. Usually I enter “atheist,” because I don’t have a personal god I can name or envision or even talk about. But this survey question offered no such category. Along with the usual choices of Christian, Jewish, Muslim, and so on, it listed at the bottom “Seeker.” So I chose that one—and it may have changed my view of the whole religious question.

Some people have made a religion out of their atheism. They don’t believe in any god as described in established religions, but they nonetheless have an organizing principle, a belief system built around it, and usually a rite and a communion. In place of a sentient if invisible and transcendent Presence, they elevate a model of the world they experience and their human response to it based on politics, economics, physics, or some other intellectual pursuit. They center their life on that model and take guidance from its basic premise and its corollaries as faithfully as any Christian or Jew reading the New or Old Testaments or a Muslim reciting the Quran.1

I’m not one of these somewhat militant atheists. I don’t think I’m smarter than the rest of humankind because I’ve “seen through” their various mystery religions. Rather, I feel my body-mind complex lacks some gene or neural network that would let me sense and appreciate the god with whom others so easily commune. I am blind in that dimension.2

So “seeker” seems to fit me. But does that mean I will be satisfied to eventually latch onto one of the major religious doctrines and settle down? I don’t actually expect to “find” what I’m supposed to be seeking. Instead, to me, keeping your mind and your options open is a virtue. While I may entertain many small and varied beliefs (with a lowercase “b”), I reject the idea that intellectual completion requires me to embrace one great belief (with a capital “B”) as my guiding life principle, to which or to whom I submit myself, body and soul, whole and entire, world without end, amen.

In my view, the believer has either created for himself or accepted from others3 a particular model of the world. It satisfies or can be made to fit, through exposition, extension, or explanation, all circumstances. For the religious believer, the presence of a god and his laws, his will, his creation, and his omniscience are the driving forces in the believer’s life and in the universe itself. For the political or economic believer, the principles of democracy, free-market capitalism, or state-sponsored communism are the driving forces of the believer’s personal action and of society itself. God, or Thomas Paine, Adam Smith, or Karl Marx, presents the model, the system, the template for future action and creation.

The seeker, on the other hand, either creates for himself or accepts from others models that are merely provisional. What keeps the seeker from becoming a believer is that, for him, no single model adequately describes all actions and occasions. The seeker can see viable alternatives to the proposed model. He or she can also see places and circumstances where the model does not fit, and the explanations, corollaries, and work-arounds that the model maker suggests to cover them either are inadequate or defy logic and common sense.

In my thinking, the positions of the seeker vs. the believer are comparable to the underpinnings of inductive vs. deductive reasoning.

With deductive reasoning, the major premise serves as a rule that has been established and is not in question. It is up to the reasoner then to apply that rule to the data as they are found. Consider the logical syllogism “All geese are white; this bird is a goose; therefore, this bird is white.”4 The major premise about the coloring of geese is not in question. The minor premise about the inner nature of this bird is not in question. What is still to be decided is how the one affects the other. The proof is based on certain knowledge. The deductive reasoner works top-down, from the accepted general rule to the specific circumstance.5

With inductive reasoning, the reverse is true. The major premise is not accepted as a rule but as a supposition that may be tested and found wanting. The minor premise may be taken as a direct observation. And the conclusion is subject to testing and statistical verification. Consider the logical syllogism “Most all-white birds are doves, geese, or swans; this bird is all white; therefore, this bird is probably a dove, goose, or swan.” That’s not a rock-solid, take-it-to-the-bank conclusion but, coupled with other generalities and analyses, it will take you in the right direction.
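To make the contrast concrete, here is a minimal sketch in Python of the two modes of reasoning described above, reusing the white-bird syllogisms; the particular bird types and the stray all-white gull are assumed purely for illustration, and the closing assertions restate the set-theory caution in note 5.

# Deductive reasoning: the major premise is accepted as a settled rule
# and applied top-down to the specific case.
ALL_GEESE_ARE_WHITE = True  # major premise, not in question

def deduce_color(bird_is_goose):
    """If the rule holds and this bird is a goose, its color follows with certainty."""
    if ALL_GEESE_ARE_WHITE and bird_is_goose:
        return "white"
    return None  # the rule says nothing about birds that are not geese

# Inductive reasoning: the generalization is a working supposition,
# and the conclusion is probable rather than certain, pending testing.
WHITE_BIRD_TYPES = {"dove", "goose", "swan"}  # provisional generalization

def induce_type(observed_color):
    """From an observation, move bottom-up to a probable class of candidates."""
    return WHITE_BIRD_TYPES if observed_color == "white" else set()

# Note 5's caution, in set terms: "geese" being a subset of "white birds"
# does not make "white birds" a subset of "geese."
white_birds = {"goose", "swan", "dove", "albino gull"}  # assumed sample
geese = {"goose"}
assert geese <= white_birds            # geese are white birds: holds
assert not (white_birds <= geese)      # all white birds are geese: the fallacy

print(deduce_color(True))    # "white" -- certain, given the rule
print(induce_type("white"))  # {"dove", "goose", "swan"} -- probable, not proven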

I would maintain that most scientists begin with a set of general observations—for example, about the nature of white birds. From that, they form a hypothesis—do all of these birds actually belong to just a few limited, definable types (doves, geese, swans)? And then they set about testing that hypothesis by trying to find one or more cases where it is wrong. If the scientist finds an all-white sea gull, the premise must be expanded. If he or she finds all sorts of all-white birds—without interference from a singular genetic mutation like albinism—then the premise may have to be abandoned.

Scientists on the verge of discovery are always working inductively, like the scientist trying to establish the nature of white birds. Scientists who have previously established the truth of, or at least the preponderance of evidence in support of, a rule or theory or mathematical or physical law are working deductively, like the scientist trying to establish the known attributes of a particular goose. Scientists may also work from deduction to define the limits and fill in the blank spots of the reality they are examining, in order to prepare for the next inductive leap. But true science—the sort that discovers new knowledge rather than categorizes and classifies what is already known—progresses by proceeding from observation to hypothesis to testing. It cannot begin with a major premise that is accepted as true without questioning how that truth was established.

I like to think that the seeker has the better grasp on the true nature of the world. It is necessary to accept some premises as proven either so solidly as to be true or by a preponderance of the evidence as to be likely. Those accepted facts are like the stones that stick up above the surface of the river: they provide stable places for you to put your feet for taking the next step. But if the river surface is all stones, side to side, like a pavement or a highway, then you have nothing more to learn, nothing to observe, hypothesize, and test. If the river is flowing on all sides with no stones at all, then you are left with an unknown surface that may hide a bed only inches deep or holes that go in over your head.

The world—and by that I mean the complexity of all human relationships and actions, observed and potential, the complexities of life on Earth, and possibly out among the stars, the complexity of the human mind which observes and analyzes it, and the complex nature of physical reality extending to the edges of the cosmos, if such there be—is more unknown than known, more watery surface than stable stones. So the safe bet is to limit the number of revealed truths and philosophical positions you’re willing to state positively and to allow for a maximum of doubt ameliorated by a working theory of probability.

It’s less comforting that way. You don’t get to pound the table and declare your beliefs as much as the true believer. You will live on a planet filled with wonders and question marks and under a sky that harbors unknown dangers and delights. But at the end of the day, you will get fewer rude surprises. It’s also more fun to dig in and enjoy the intellectual game of piecing this world together. Because that’s what the human mind is for—or so I believe.6

1. Truth be told, if I wanted to elevate a doctrine of scientific knowledge to the status of a religion, I might choose molecular biology. Its basis, the DNA molecule, with its associated mechanisms of RNA transcription and then translation of amino acids into proteins, is a majestic thing to contemplate. Coupled with the principle of evolution, DNA explains—at least for me—the creation of life on Earth in all its complexity. It provides a pattern—in principle, if not in exactly the same chemistry—for life elsewhere in the universe, too. But then, the molecule and its mechanisms are so complete and ubiquitous on this planet that I have argued elsewhere that they may not have originated here at all. (See, for example, DNA is Everywhere from September 5, 2010.)

2. This is not to say that I lack moral guidance. I tend to favor Buddhist principles, because they guide thought, speech, and action from rational ideals of complementarity and reciprocity, instead of from warnings of eternal judgment by a transcendent father or mother figure. Buddhism also relies on the individual, rather than dogma, to show what is the right thing to do in any particular case. The Eightfold Path has few specifics about what might be “right” or “wrong” action compared to, say, the Old Testament or the Quran, where injunctions on choice and preparation of various foods, styles of hair and dress, choice of pets, larger lifestyle choices, and other commonplaces abound. (See Gautama and the Godfather from August 5, 2012.)

3. Parent, teacher, religious practitioner—priest, pastor, rabbi, imam—political or economic guru, philosopher, literary lion, scientific thinker, or some other external source.

4. Okay, this is a silly—though still valid—example, because you can tell the bird’s coloring at a glance without reference to its genus and species. But consider the syllogism “All vampires feed at night; this creature is a vampire; therefore, it feeds at night.” Eating habits are not readily observable outside of direct action and its consequences; so it is useful to have a rule suggesting potential actions and a definition of the creature before you—then lock and bar your doors at night.

5. Notice that the syllogism does not work with the attributes of the major and minor premises reversed. You cannot say, “All geese are white; this bird is white; therefore, this bird is a goose.” The fallacy lies in the rules governing set theory. While “geese” may be a subset of the category “white birds,” it does not follow that the category “white birds” is limited solely to the type “goose.”

6. If I did believe in a god, he or she would be subtle, discriminating, and would allow for a world with more than one right answer. Everything I’ve experienced just seems to work that way.

Sunday, December 22, 2013

The Power of Personal Honor

If I have a sense of personal honor, I got it from my parents. From my mother mostly, as I remember it, although my father’s influence was almost as strong. Mother was around more, and it’s her voice I hear in my head.

If I did or said something wrong when I was growing up, she would say, “We don’t do that,” or “We don’t say things like that.” In every case “we” was the family, the closest human association I knew,1 the only club I could join at the time, because I’m talking about a child of four, five, or six years old. The family included not just my parents and my brother, the nuclear family, but stretched to the august heights of my distant grandparents, and to the breadth of my far-flung aunts, uncles, and cousins. When I did or said the wrong thing, I was letting them down, embarrassing them, and separating myself from what they held to be important. The idea that these interpretations of right and wrong might just be my mother’s own rules didn’t occur to me. The invocation of family heritage and watchfulness was the main thing.

My mother’s second line of defense was, “You’re better than that.” This came later, at the age of seven, eight, or nine. When I did or said the wrong thing, I was letting myself down, abusing an image of the proper person that I was supposed to be, the image I was supposed to carry inside me. It was a one-two punch: you’re letting us down, and then you’re letting yourself down.

And I have to say it worked. It socialized the wild demon that lives in every child. It set boundaries and limits. It made me conscious of my actions and speech. It built a conscience inside my head.

My father worked in a different way. He did not so much admonish as demonstrate. He always—as far as I could tell—spoke the truth. He did the right thing. He stepped outside of his comfort zone to preserve and protect what he saw as good and right, as belonging to the natural order.2 And if I asked him why, he would try to explain it all. He showed that he believed in a larger world beyond himself. In that larger world, in the society in which he took part and which he tried to demonstrate for us boys, telling the truth and keeping your promises were rewarded, law and reciprocity were the natural mechanism and the way that society worked, and the world made sense. The idea that this model of the world might just be my father’s personal view didn’t occur to me. The invocation of rightness and its reasonableness was the main thing.

Believing in this rightness built confidence. It helped me see that the world was a place in which right action and right speech worked to my benefit. This was not necessarily a safe or an easy world to inhabit, but one in which a human being could function. It reassured the timid child that there was a purpose to life. It made me conscious of what the world expected of me.

These weren’t acts of calculation or genius on my parents’ part, but simply the way their fathers and mothers had socialized and taught them in the past. And yes, I also experienced not a few angry shouts and spankings along the way. Still, I remember the admonitions and the demonstrations more.

In Freudian terms, these parental interactions constructed the superego which oversees the ego and suppresses the id. In terms of Transactional Analysis, they became the inner “parent” which instructs the “adult” and corrects the “child.”

Perhaps this sense of honor and restraint can grow naturally in a human being, without the influence of parents and their admonitions and demonstrations. Yet I tend to think not. A conscience, a sense of personal honor, the positive positioning of self in the world, a personal code that describes things one will always do, things one will never do—these are different from instinct. Instinct is what we inherit from our long evolutionary development and share with the animals. Conscience and honor are what we learn from our parents and teachers, and reinforce through our associates and nearby social partners, during our relatively short childhoods and then share with other fully developed human beings.

Perhaps a child can learn a conscience and a sense of honor by bumping along in contact with other children and the über-society of distant, uncaring adults. William Golding’s Lord of the Flies, about children organizing their own society on a desert island, and Leon Bing’s Do or Die, about the socialization of gang life in Los Angeles, suggest otherwise. Children’s view of life, unguided by caring adults, is necessarily fragmentary and self-centered. And what one can learn by brushing up against the larger society’s laws and institutions without patient guidance is usually how to avoid exposure and exploit the cracks, rather than how to navigate the channels and make a contribution.

Conscience and personal honor are part of a two-way mechanism. They compel you to say and do the right things, to keep your promises, to live and perhaps die by your personal code. At the other end of this mechanism are the people who have developed for themselves an unspoken sense of your inner constraints and compulsions. These are your high school and college teachers, corporate supervisors, early mentors, army sergeants, teammates and opponents, lovers and spouses, business associates, competitors, and casual acquaintances. They expect you to tell the truth, turn in your homework, show up for appointments, honor your commitments, fulfill your contracts, do your job, and perhaps go up that hill and die for your country. Or not. If you cannot or will not do these things—or perhaps do not even know that the obligation exists—then these people will know it. They will no longer hold you accountable and will avoid being in your company.

Conscience and personal honor are the flip side of human trust. As a coin cannot have only one side,3 so trust cannot exist without an anchor point within the person who is asking for, expecting, or receiving another’s trust. That anchor point is adherence to a code that rises above the animal instinct of “I’ll get mine and to hell with you.”

Curiously, people who have a code of honor are also more likely to extend trust to others. Not blindly and not to everyone or just anyone, of course. A person of honor looks for it in others. It’s something about the focus of the eyes, the quality of the smile, the underlying assumptions of casual talk and humor, the outcomes of observed actions. You sense it. You know it. And you put your faith in it. Or not. Those who do not live by their own code of personal honor cannot pass invisibly and seamlessly through society. They leave a trail of disappointment, if not of outright betrayal and disaster. For all their jokes and smiles, they cannot hide what they are. Honest people—people who are honest with themselves and others—will know them and step aside, turn away, find others in whom to place their trust. Or so I believe.

The personal code, the sense of honor, is at the core of every decision, the pivot point upon which every choice turns. It’s buried so deep in the psyche that the person of honor does not actually have to think “This is wrong” or “That’s something I won’t do” or “I shouldn’t say that.” The impulse itself comes wrapped in a red haze, a warning sense, a visceral repulsion. The sense of honor is buried so deep that the person will suffer deprivation and indignity, go to torture and death, rather than violate it. It becomes what the person is and does.

It’s not that someone is always watching, even if you can’t see him, and will tattle on you to authority or to your friends. It’s not that God is always watching and will dispatch you to Hell the moment you’re so careless as to die. It’s not that mother or father is watching—from the other side of the room, from the kitchen, or from somewhere up in Heaven—and will feel shame at your words or deeds. It might start that way, but the obligation to parents usually fades by the end of childhood. No, it’s that you yourself are watching. And this is the one person you cannot evade, cannot successfully lie to, cannot convincingly cheat, and who will know that you are untrue and found wanting.

The power of personal honor is the underpinning of social interaction. If I lose it, I become something like an animal. If too many around me lose it, we slip back not just beyond the teachings of our enlightened ones or the wisdom of the ancients, but beyond the social cohesion of the hunter-gatherer tribe. We become a monkey troop—and that’s a hard place for any individual to exist.

1. The dogs were another category of association, not so demanding but equally filled with responsibility.

2. As I’ve told elsewhere (see Son of a Mechanical Engineer from March 31, 2013), on a trip to Canada when I was very young, my family was out walking on a promenade along a shoreline protected with armor rock. A beautiful mahogany speedboat, totally unattended, had just broken away from the dock at a marina out on the point and was drifting down onto this rocky shore. My father climbed down over the rocks, waded into the water, and held off the boat until someone could get in touch with the marina to send a launch to retrieve it. He broke a finger wrestling with that errant speedboat. This is a memory I’ve carried to this day, and it taught me something about civic duty.

3. I suppose early markers of trade—scratches and dabs of paint on flat stones used for counting sheep and such—had only one side. But by the time coiners were pressing images of kings and gods onto precious metal, both sides were struck. Like the milling around the edge of a coin, it reduced the places where a person of dubious intent might shave a bit of the metal for himself.

Sunday, December 15, 2013

Comedy or Tragedy?

Soon after my novel First Citizen came out, my agent at the time showed it to a friend who was vice president of development at a major film studio. He liked the book immensely but, because it was so vast and sprawling—after all, it is the autobiography, with commentary, of a modern-day dictator patterned on Julius Caesar—he felt the entire novel could not be made into a movie. Instead, he urged me to take the book apart and find the single story line that would make an acceptable screenplay. Because I had recently read and practiced with the Syd Field books1 for another project, I agreed to have a go at it and write the screenplay on spec.

The vice president of development then proceeded to tell me exactly how it should be structured. The first act must show the hero happily living his everyday life. At the end of that act, something happens to destroy that life, strip the hero, and leave him—and you can read “her” throughout here—metaphorically lost and alone, out in the cold. The second act shows the hero struggling to regain what has been taken. “But everything must work against him,” my mentor said. “Not just the villain, but other people, society, the government—the ground itself must literally rise up to oppose him.” The second act ends when the hero discovers the key to winning his fight. The third act shows the hero using that key to succeed and put things right. And, in this final act, the hero “must meet and beat the villain by going mano a mano, like Holmes and Moriarty at Reichenbach Falls.”

That was the sum of his advice. All I had to do then was find the part of my story that fit the formula, and I would have a movie he felt he could produce.2

You can see this structure quite plainly in thousands of movies, mostly of the “action” variety. Sarah Connor in The Terminator has her everyday life, her roommate, and even her sense of normality stolen by a homicidal maniac who isn’t human and gives no reason for his hostility. No one will believe her, not even the police, as she runs from the monster. Not until she teams up with the equally frightening soldier from the future, Kyle Reese, does she begin to learn the truth and build a strategy to win her struggle. And in the end she is left alone to kill the robot with a power press in an automated factory. … I’ll leave it to the reader to recall other examples and dissect them.

Of course, what my would-be mentor was describing is technically a dramatic comedy. This is not because it’s filled with jokes and laughter, but because the hero is an innocent. Bad things may be done to him, but he has no responsibility for setting them in motion. They are simply circumstances that involve him. But in the end he triumphs through his own wit and skill, overcoming these dire and unjustified circumstances. The film executive was suggesting that all movies these days must be structured as such comedies. Nobody wants a downer. Nobody wants a lesson. They want to walk out of the theater feeling good. The hero has to win—and win big—against natural oppression. So tragedies are ruled out as a medium of popular culture.

This is a very limited view of human relationships and exchanges, of human nature, of the human condition, and of what constitutes “feeling good.”

Tragedies are not just sad movies where the hero suffers and dies. That would be simply pathos, where we pity the main character in his misfortune. Or bathos, where the suffering and loss—and the bloodletting—are so overdone that pity becomes sick or maudlin, mere sentimentality. Instead, tragedy must invoke something deeper and more involving in the human spirit. Tragedy shows what is noble in human beings by dispensing with the frivolity of everyday life and cutting to the bone. In tragedy, the main character confronts his own nature, the consequences of his actions, and the bedrock level of the human condition.3

The hero of a comedy only has to be smart and quick, and by the end he may have learned nothing except a one-time trick for beating the present odds. The hero of a tragedy has been to hell, seen the faces of fire and death, learned what is really going on in the deepest, most hidden parts of his life and condition, and come back wiser for the experience—or died in a state of spiritual release and redemption.

The hero of a comedy can remain passive in the story until the villain—or simple bad luck—catches up with him at the end of the first act. The hero of a tragedy usually initiates the action, sometimes long before the story even starts. The hero of a tragedy might be unsuspecting, but he is never an innocent.4 In the best stories, the action that precipitates his problems comes from the hero himself and reflects some essential part of his character. The hero cannot resolve the problem without coming to understand the true nature of long-cherished assumptions and/or relationships as well as past actions and/or reactions. At the end, the hero’s life cannot simply be restored to its former success and smoothness, because some faults cannot be fixed nor some complications disentangled. And the lessons learned from them cannot be unlearned. But even though the hero is financially, physically, or mentally broken, even though he dies, he ends in a better state for knowing the truth rather than a happy fantasy.

In the play Oedipus Rex, the king of Thebes is disgraced and deposed. The end comes not simply because he murdered his father and slept with his mother—although when he learns that he has in fact committed those sins, he gouges out his own eyes. Those sins were all along the subject of prophecies made about Oedipus both as a baby, which caused his biological father Laius to have him exposed on a mountainside, and as a young man, which caused him to flee his adoptive parents. His “tragic flaw” is not even his hot temper, which caused him to kill a quarrelsome old man—whom he did not know was his own father Laius—at a crossroads. His flaw was persistence and a stubborn, perhaps even self-righteous, pursuit of a different truth.

His new kingdom, which he inherited when he married the dowager queen Jocasta—whose husband had recently and mysteriously died—is suffering under a curse, as the play opens, because the killer has not been brought to justice. Oedipus determines to pursue the story, find the killer, and lift the curse. When the prophet Tiresias, through whom the gods are working, warns him to leave well enough alone, Oedipus refuses and tracks the mystery to its conclusion and to his own destruction.

In another play in the series, Antigone, the daughter and half-sister of Oedipus comes to grief for much the same reason: stubborn persistence in doing what she thinks is right. Her brother Polynices has been killed in battle while revolting against Thebes. Her uncle Creon has declared that his body will not be given funeral rites and must lie exposed, which means his soul can’t rest in the underworld. Antigone buries him anyway, and Creon punishes her sedition by having her walled up, a dreadful punishment that causes her to hang herself. In grief, Creon’s son Haemon, who was her fiancé, kills himself. Creon’s wife Eurydice, who can’t stand the pressure, also kills herself.

In Hamlet, the prince of Denmark is undone by his own indecision. Having learned that his uncle had killed his father—from his father’s ghost no less—and then taken his mother as queen, Hamlet vows revenge. But taking action is hard for him. He tests his uncle by putting on a play that reenacts the murder, and that’s proof enough of the guilt. Perhaps deranged by grief, perhaps to throw off suspicion, Hamlet becomes or pretends to be insane. During the course of his madness he rejects his fiancée Ophelia, then kills her father by mistake, and she drowns herself in grief. Everyone dies at the end of this play: Ophelia’s brother Laertes, Hamlet’s uncle and mother, and Hamlet himself. It’s supposed to be Shakespeare’s greatest play, but for the life of me I can’t think why.5

In Richard the Third, the newly crowned king of England is defeated and killed in battle. His flaw is his despicable character. To get to the throne, he has betrayed or murdered everyone who stood in his way and a few others besides. When the final battle comes, no one will stand up for him.

The point of these stories is not that everything turns out badly. But in each case the ending is intellectually and emotionally satisfying. It either teaches or confirms something about human nature and the human condition that the audience either needs to learn or already knows. That the pursuit of justice is a risky business, and that bold actions can have irreversible consequences. That honoring the demands of conscience and the gods against the commands of civil authority will also have dire consequences. That indecision and delay can be worse than taking no action at all. That succumbing to the temptations of the easy, obvious path—and so becoming a thoroughgoing, self-serving bastard6—will end badly.

Whether comedy or tragedy, the audience and the reader must feel that the story has been resolved and that the resolution is justified. Comedy does this by having everyone win through the struggle and go home happy. Tragedy does it by having the main character’s loss and suffering explained, so that he or she arrives at a state of balance, justice, and purification, while the audience achieves catharsis—the purification of its temporarily clouded emotions.

Not all successful modern movies are dramatic comedies, however. And not all of the filmed tragedies are simply retreads of Shakespeare and other classic playwrights who worked that side of the aisle.

In the John Sayles movie Lone Star, the young sheriff, who has virtually inherited his father’s position, pursues an old murder and finds out more about his father, his family, and his personal relationships than he knew before. No one is happy at the end of the story, but they have achieved peace and the audience is satisfied.

The Chicago Black Sox scandal, retold in the movie Eight Men Out, another Sayles production, shows the consequences of the players holding a grudge against the team’s owner, trying to get even with this skinflint, and accepting bribes to thwart his ambitions and throw the World Series. None of the guilty players—who are supposed to be playing for love of the game—comes out of the experience whole. Summary justice is done and, while the careers of sympathetic people are wrecked, the audience understands and is satisfied.

Curiously, even the lightest of action flicks can have a tragic element. In the three recent James Bond movies with Daniel Craig in the title role, the villain is happily killed and disaster averted each time. But that’s almost incidental, because Bond—through his own blindness and impulsiveness—loses someone he has come to care about. James Bond, the solid soldier of the Ian Fleming novels, the devil-may-care playboy of a half-century of earlier movies, has become a tragic figure. He is learning the bitter truth about himself and his job. He accepts it coldly but not casually.

Not every story has to end happily. But then, not every one has to end in the bloodbath of a Hamlet or Antigone, either. What audiences and readers look for is a sense of rightness and completion, the feeling that justice has been done, that the character is made whole, and the world is set back to spinning west to east. Do that for the reader every time, and it won’t much matter what kind of structure underpins your story.

1. See his Screenplay: The Foundations of Screenwriting and The Screenwriter’s Workbook.

2. I did find what I thought was a story line, centered around the turning point of the war in Mexico, and wrote a complete screenplay based on the main character’s launching of the second American civil war and his rise to power. But it wasn’t what the studio executive was looking for, and we never talked again. Working on spec—that is, for free—is often like that. Now I take the attitude of the masked executioner in the movie The Four Musketeers: “And two more pistoles for rowing the boat. I’m a headsman, not a sailor.”

3. And if you think tragedies won’t appeal to modern young audiences, remember that Shakespeare wrote a great many of them which played well with the groundlings, the lowest-paying audiences of the period, which were composed mostly of young apprentices and day laborers.

4. Granny Corbin, the main character in First Citizen, certainly isn’t innocent. He is at least as much the doer as the done-to.

5. But then, I don’t have much sympathy with indecisive, hesitant characters. To quote Tuco’s law, “If you’re going to shoot, shoot. Don’t talk.”

6. Darth Vader, please take note.

Sunday, December 8, 2013

The Thomassian Jihad

Recently my daughter-in-law1 posted on Facebook, with horror and disbelief, a link to “5 Reasons to Date a Girl With an Eating Disorder.” She hoped it was meant as humor but feared it was not. I clicked on the site and read the author’s reasons: bulimia improves a woman’s overall looks, makes her less costly to date, makes her fragile and vulnerable, implies she comes from a wealthy family with access to ready money, and makes her better in bed because of that same fragility and vulnerability.

My reaction, posted in a Facebook comment, was: “Seriously, each of these reasons treats a woman as if she were some sort of accessory to the male lifestyle. You might hear such reasoning applied to a new car: looks good, holds its value, low maintenance, good gas mileage, great performance. …” The author of the web posting wasn’t just treating a woman as if she were an object. You can do that simply by looking at a pretty woman in high-styled makeup and a gorgeous dress in the same way that you would look at a beautiful vase. This author was going beyond the casual look of appreciation to categorizing a certain type of human being and his relationship with her in a utilitarian, mechanistic way.

Something in all this struck a sore spot with me, a natural aversion that I have carried since childhood. It comes out whenever I encounter modes of thinking that treat human beings—those wonderfully unique, growing, aspiring, yearning, achieving focuses of potential, who are born out of complexity, half animal, half angel—as simple, utilitarian modules, as undifferentiated nodes in a system, as meat machines.

As stated elsewhere, I’m a great fan of Frank Herbert’s Dune books as an act of imagination and vision.2 However, his far-future, galaxy-spanning society was shaped by a single event in its distant past. After the rise of thinking machines and automation nearly wiped out humanity by coddling the human spirit almost to death, a war took place—the Butlerian Jihad—with the clarion call “Thou shalt not create a machine in the likeness of a human mind.” In the Dune worlds, the surviving human beings have learned to take on the functions of higher-order reasoning and projection that computers had once performed. Human beings were forced to become machines in service to others—Mentats—but with the particularly human characteristics of free will and choice.

Interesting as that backstory was, I personally have no problem with our current technology and its drive toward robotics and cybernetics. I believe advanced machines will facilitate, rather than hobble, the intellectual and emotional power that is innate in human beings. Advanced machines and the science in which they are embedded will enable humans to see the universe in all its complexity, study our bodies and our minds more deeply, achieve greater social goals, solve persistent problems, and free individuals from the daily grind of mechanical work and poverty. I really can’t imagine human beings becoming so swaddled and smothered by over-helpful machines that we give up our human intelligence and the will to go on living.3 But then, we aren’t quite there yet.

Instead, I see us as still living with the legacy of social structures under which human beings have coexisted for millennia with only the simplest of machines: the wheel, the lever, the wedge. In such societies, if there was a wheel, then someone had to be told to turn it; a lever, and someone had to pull on it; a wedge, and someone’s muscles had to push it to split wood or plow a field. For 4,800 years of recorded human history, up until about the 18th century and the invention of the steam engine, human beings—along with draft animals and natural forces like falling water—provided the only motive power in any society. And in many industries still today, from picking peaches to assembling iPhones and iPads, human beings are treated as meat robots whose backs, brains, and hands are chained to the simple machines they operate.

We still have the tendency, even in developed societies, to treat other people as convenient machines. The habit is strongest in the author of the “5 Reasons” website, who sees women as accessories and—to quote from a recent James Bond movie—“disposable pleasures.” But that habit also lurks behind the assumptions of every personnel professional and union organizer who sees factory workers as interchangeable parts. It echoes in the language of every sociologist who studies society as if it were a vast machine, with people functioning as the cogs within it.

And so, with the blindness of the fanatic, the rigidity of the fundamentalist, and the carelessness for consequences of the social reformer, I want to launch my own jihad, the Thomassian Jihad. And my clarion call will be “Thou shalt not reduce a human being to the likeness of a machine.”

Human beings are not machines any more than beautiful women are fragile vases, or flowers and butterflies exist as the basis for pretty pictures. The reality is more complex, mysterious, engaging, and personal. The reality must be granted its power and purpose, its unseen web of relationships, its ability to grow and change. A human being is not a thing but a spark of unknown potential in play with all three phases of time—past, present, and future. A human being belongs only to him- or herself.

The Thomassian Jihad attacks all forms of human ownership, from race and wage slavery to sexual trafficking. It attacks all social and economic theorists, from Plato to Marx, who would reduce human beings to simple, malleable productive or consumptive units and treat the mass of human beings as social clay from which to mold “more perfect” societies. It rejects the modern idea of marketing, which reduces the universe of human desires to simple, predictable “needs” and “demands.” It rejects all methods of productive coercion, from the assembly line to productivity enhancements and time-and-motion studies, which reduce human will and creativity to simple, predictable actions. It rejects all scientific reductionism—even for the purposes of study—from Skinnerian behaviorism to neurological assessments, which treat the complexity of human interactions as simple, stimulus-response mechanisms. It attacks all principles of hierarchical dominance, from traditional views of marriage to modern states and corporations, where one person may take the humanity of another for granted.

Humans are unique and remarkable beings, arising out of complex and irreproducible neural networks, exercising autonomy and free will, able to recall and learn from the past, anticipate and build for the future. We are the first animals—with the possible exceptions of elephants, whales, dolphins, and some of the higher primates—who are able to escape the ever-present now of time.

The Thomassian Jihad transcends all politics of either the right or the left. It declares that human beings are not to be treated as parts of a whole, cogs in a machine, counters in a political game, or units in a larger functioning entity such as a society, a nation, a factory, a church, or a brothel. Humans must be treated individually and their unique qualities, abilities, intentions, and desires taken into consideration in any relationship or exchange.

The Thomassian Jihad envisions a time when any repetitive, unthinking, operationally directed task can be done better, faster, and cheaper by a machine. By using actual machines to do this work, we will free human beings to do what we do best: create, imagine, dream, strive, seek.

The Thomassian Jihad envisions a time when human labor of muscles and minds will be directed toward and invested only in objects of love. It promotes the creativity and imagination of a painter or sculptor to make a never-before-seen image, a composer to make a never-before-heard melody, a writer to make a story within his or her own unique vision of the world, a chef to make a meal to delight the human senses, a cabinetmaker to build a unique piece of furniture that satisfies the human touch and serves a human purpose, or an engineer to investigate a human need or problem and create an actual machine to serve it. Human effort must be prized as a special type of work, above what can be made by rote or by a machine.

The Thomassian Jihad is the struggle for the individual human spirit. Come and heed my call.

1. Truth in advertising: she is actually the wife of my wife’s second cousin. But we helped raise him when he was orphaned as a boy and think of him as our son. What’s thicker than blood? Family!

2. See The Dune Ethos from October 30, 2011.

3. We may all be deeply involved with our iPhones, iPads, and computers and entranced by the connections they enable. But we are simply connecting with people located in another place and time, rather than standing before us. None of us has lost the ability to eat, talk, or move. Human beings used to go into a similar trance when engrossed in a book. Civilization survived.

Sunday, December 1, 2013

Showing vs. Telling

I really want to like Jane Austen. Really. But as charming and amusing as her stories are, I can’t get over her mode of telling them. And that’s the problem—too often she tells them.1 This was all the fashion in the early 19th century, when the novel as a vehicle for storytelling was still in its infancy. But the modern sensibility is shaped by later practices and influences from other art forms, as I will show.

In creative writing classes, they make a big deal about “showing” vs. “telling.” This is sometimes hard for beginning writers to understand. In both cases, the medium is still just words, the verb tense is still in the past, and the pronouns are still he, she, and they—unless the novel is told in the first person, in which case it’s I and we. None of the bare mechanics of writing actually changes, and yet the feeling is very different.

In short form, the difference between telling and showing is the difference between writing “Roger was fat,” and writing “Roger’s chin rolled over the edge of his collar and the buttons tugged the cloth of his shirt sideways across his belly.” Showing is less economical, requires more words, and takes longer. It slows down the story. But it also paints vivid pictures and pulls the reader deeper into the reality of what is happening.

For the literal-minded, the above passage is still just telling, but with more descriptive detail. We don’t say “fat”; we show what fat looks like. And, if you’re willing to grant the reader a smidgen of sympathy and willingness to play along, writing “fat” can be as good as any number of descriptive jelly rolls to get the point across and keep the story moving forward. So showing must be something deeper, more structured, less obvious—right?

My own personal approach to showing is to embed the narrative in the point of view of a single character, at least for the unit of time, place, or action required to present what I think of as a single “scene.”2 This disciplines my writing. I have to work within a framework of what that character can see and hear, know and remember. I adapt the story to that character’s perceptions, assumptions, likes and dislikes, and level of understanding—that is, I place the reader’s awareness inside the character’s head. I try to use the language and experience of the character as much as possible, so that a lawyer will be thinking in terms of legal relationships and positioning, while an engineer will think in terms of mechanical relationships and possibilities.3

This discipline obviously limits the scope of my storytelling. So, to get a rounded view of what’s actually going on, I use different characters who know different parts of the story. Their views may be in conflict, and their intentions and assumptions usually are. I trust the reader to weigh the evidence and decide who—if anyone—is closest to the truth.4

Because I limit the narrative to a single person’s perceptions, feelings, and memories, my writing tends to be more showing than telling, even though the mechanics of the language are the same. Dry facts and past findings may be presented in a scene’s first paragraph or two, to set up the reader and character for what is to come. But even there, I try to give these bits of background emotional weight and actively perceived implications.

The opposite of this point-of-view writing is the “omniscient narrator,” who sees all, knows all, tells all, hovers over the story, and stands outside time—and the story’s emotional implications—rather like a god. The omniscient narrator has the viewpoint of the audience in an old-fashioned stage play. There everything is shown and known, open to inspection, and perceived by everyone on stage during the scene. Even private intentions and realizations are delivered through the technique of an actor putting a hand against his face and speaking an “aside.”

Writing as a medium for storytelling only started to achieve universal appeal with the advent of Gutenberg and movable type in the 15th century. As soon as printed books became widely available, people acquired the impulse toward universal literacy and private reading for pleasure. But before that, stories were told by bards and troubadours after dinner, or presented by players on a stage. This was the medium for 2,500 years or more, going back to Greek and Roman playwrights, and before them to the mystery plays of shamans and priests. Readers of early novels, like those of Jane Austen, were given the omniscient viewpoint of an audience sitting beyond the stage’s “fourth wall,” outside the proscenium arch.

Writing in the 20th century was strongly influenced by the competing medium of film, just as earlier reading for pleasure was influenced by the theater. I believe the modern approaches to narration—and to showing vs. telling—have followed the growth in sophistication of the movies.

Early movies used a stationary camera that approximated the field of view and viewpoint of an audience sitting outside the arch. The action took place as on a stage, at a distance, with an implied omniscience. Soon, however, directors were engaging in occasional closeups on one character or another to show particular nuances of intense emotion and dramatic reaction. In the old silent movies, sudden bursts of realization or reflection that might have been carried in a theatrical aside—as well as dialogue content too complicated to be communicated by facial expression and gesture alone—would be handled with a bit of text written on a card and inserted into the action. But the viewpoint was still outside the action, with the audience.

That sort of moviemaking feels dated to modern audiences. Now the camera moves around, pointing its focus like the attention of a character in the action. The microphone is directional, picking up and emphasizing some sounds, muting others. The point of view does not necessarily track just one character, and it certainly does not remain fixed inside his or her eye sockets, as does my fictional narration, because actors are paid to be seen, not become invisible spectators. But the function of the roving camera is to direct the audience’s attention, and if one character notices the bulge of a gun in another’s jacket, the camera will zoom in, isolate the perception, and set up that part of the story.

It has become a commonplace of moviemaking—good writers take note—that if the camera shows any setting as more than just background scenery or any object as more than just an everyday prop, then that scenery or prop will have acquired meaning, gained importance to the story, and will play a part later on. The roving camera has become a method of showing rather than telling. It replaces the old-fashioned way of communicating, where one character says to another, “Say, that’s a nice gun you’ve got there,” or “I didn’t know the old mill was on this road.”

Modern storytelling adopts this cinematic focus, and point-of-view writing is highly adaptable to such a focus. The omniscient narrator does not write “Cynthia had a gun and she knew how to use it.” Instead, Roger sees the gun in Cynthia’s hand and—by the firmness or wobble of her grip, if not from his personal knowledge of her skill level as a gun enthusiast and hunter—gauges how likely she is to fire it. That’s showing rather than telling.

This is not at all to say that writing a modern novel is like writing a screenplay. Studying screenplays as an art form can be useful for any writer as one more tool in the kit—and in case his or her novel gets picked up in Hollywood. But the form itself is very different from writing for a reader’s pleasure. Screenplays are technical documents, instructions to the director, actors, and backstage departments. Screenplays have a strict format, with indents that distinguish dialogue from action and that scale the duration of both, so that one page of script roughly equates to one minute of screen time. Screenplays are barren of description. Dialogue is unadorned, with just enough words to convey useful information with a hint of character. Exact phrasing and delivery are left to the actor and director. Action is terse, with just enough detail to guide the actor’s movements and changes of focus—as well as providing the location manager, property master, and wardrobe mistress with any necessary details. “Roger was fat” is not for the benefit of an imaginative reader but a cue to the casting department.5

Still, the cinematic approach to storytelling informs and advances the art of writing stories and novels. Good writing, I believe, has become more visual, aural, and sensual. Explanations of relationships have become demonstrations of the character’s embedded love or hate, trust or wariness. Action has become graphic and sequential, focusing on step-by-step development rather than after-the-fact summations. The reader experiences the story in real-time action rather than just hearing it described in generalities such as a bard might use.6

The art of writing is much more than just telling a story. Instead, you must direct the reader’s awareness, attention, expectations, and emotions. You must reveal the nuances of your characters and the specifics of the world they inhabit. You must make the reader live in that world and care about what happens to those people. And to do that, you must show what’s going on rather than just talk about it.

1. If you don’t believe me, go back and read the first chapter of Sense and Sensibility. It’s a cross between a genealogy and a legal document to show the entanglement of an estate in order to leave a mother and three daughters dependent and destitute. A playwright would at least have communicated this history, in more compact form, through a breakfast-room conversation somewhere in the first act. A modern novelist would have shown the moment of discovery, when the will is read—which I believe is somewhat like the opening of the Emma Thompson screenplay in the 1995 movie version. The disembodied narration of the actual novel is unfashionably antique.

2. See Writing for Point of View from April 22, 2012.

3. In a recent conversation, one of my friends who had just finished reading The Professor’s Mistress chided me for using “propeller” to refer to the motive power of the steamboat Galatea when everyone knows the nautical term is “screw.” Yes, of course, except that scene was written from the viewpoint of William Henry. He is a classics scholar and a landsman who falls in love with the boat for her cutter-like bow, her mahogany woodwork, and her link to a more gracious past. Before this, the closest he ever came to yachting was a scholarly article in the debate about ancient warships—trireme vs. quinquireme. So he would use the more formal term “propeller.”

4. To see this played out, take a look at my novel First Citizen, where two first-person narrators tell two slightly different versions of the same story.

5. For more on the art of writing screenplays, see Syd Field’s excellent manuals Screenplay: The Foundations of Screenwriting and The Screenwriter’s Workbook.

6. Think of how opening screenfuls of background narration and “voiceovers” have gone out of fashion in modern movies. The most recent productions don’t even use place names superimposed on a wide-angle shot to identify the location, but instead the director chooses that shot so that even the least informed viewer can identify Washington, D.C., Paris, or London. And if the place is Anytown, U.S.A., it may not even be identified but rather shown for what it is by familiar features and actions.
       Similarly, the film opens and proceeds without a theatrical dramatis personae. Movies in the 1930s and ’40s would give the credits up front, and this served the dual purpose of identifying the major characters for the audience as well as acknowledging the actors. Nowadays, unless a character is important enough to be mentioned subsequently in dialogue, he or she may never be identified by name at all but only by face, character type, and a role which the audience must intuitively grasp.