Sunday, April 30, 2017

On Sincerity

When I was at the university in the late 1960s, the campus revolution was just getting under way. Although it was mainly fueled by anti–Vietnam War protests, the student demands spread in all directions, calling for a virtual redefinition of the university structure and of society itself. In coursework, the new demand was for “relevance”—meaning politicized teaching of the correct sort—and started the movement to dump the collected works of William Shakespeare for the collected works of Eldridge Cleaver. In addition to “relevance,” the other spiritual demand of the time was for “sincerity.” This would be in preference, I suppose, to blatant hypocrisy.

One of my former philosophy professors, when questioned in a student-led colloquium, stated: “Sincerity is a trivial virtue.” I knew immediately what he meant: many other human virtues are far more important. I would prefer a person who keeps promises, pays debts, abides by contracts, performs acts of kindness and public service, takes care of family members and friends, treats other people with respect, smiles politely, and otherwise behaves in concrete ways. Whether the person “really means it” or is “faking it” is far less important to me.

This preference is, of course, from my own point of view. If a person says “Thank you” when I give them something or perform for them some small service, it makes me feel good. Whether or not the person actually means it, or feels truly grateful—that is, is sincere about this minor politeness—does not matter to me. Certainly, if the person is grimacing, making faces, rolling their eyes, or using a sarcastic tone to imply that no thanks are actually intended, then I know that the words are not meant sincerely. But my hurt comes not from their lack of sincerity but from the implied mockery, as if my small action were really beneath their notice, or not kind and helpful at all, and thus deserving of their scorn. Otherwise, if a person says “Thank you,” even if it’s a murmur and there is no eye contact or other sign of heartfelt emotion, I can accept this as an empty politeness from an obviously well-trained, civilized individual.

Politeness is the verbal grease that keeps us descendants of howler monkeys from screaming in rage and trying to kill each other.

From the speaker’s point of view, the sincerity—or lack of it—in the exchange is a small measure of the state of that person’s soul. The saint or the deeply feeling person who says “Thank you” with sincere gratitude, virtually blessing my small gift or act of service with reciprocated good wishes, is expressing their own feeling of being at peace with the universe and gladness upon recognizing and being recognized by a fellow human. The hurried person who murmurs “Thanks” out of pure reflex, the ingrained habit of good breeding, and is unconscious of any felt gratitude, is at least applying that verbal grease which keeps us all functioning. And the snarky person who sneers and rolls their eyes, to load that “Thanks” with double meaning, is spreading their own bile and cynicism, fouling the gears of civil discourse. That little bit of intentional meanness hurts them, corroding their soul, much more than any momentary confusion and pain it causes me.

In human interactions, the measure of sincerity is much like the Turing test for artificial intelligence.1 If you cannot tell whether the person is being sincere or not, it doesn’t matter who’s typing on the other side of the wall. You accept the person’s statements or intentions at face value and move on.

A society that valued sincerity as a primary virtue would be far different from our own.

Yes, I know the intent of those early student demands. By rooting out hypocrisy—the evil of paying lip service to popular principles but then regarding oneself as free to act in accordance with private intentions—the promoter of sincerity hopes to bring those hidden intentions to the surface. When you demand that people act sincerely, you expose falsehood and can then hope to enforce proper action. People who can say one thing and do another would be revealed as perpetrating a hoax on the society around them.

But the exercise will backfire. A society of people who are forced by cardinal values to always say and do what they mean and what they are feeling at the moment will be a harsh and abrasive society. “Gee, Grandpa, that’s a measly five dollars you put in my birthday card.” “No, lady, you don’t get any ‘thank you,’ because it’s your job to pour my coffee.” “You men all think I can’t open a door for myself, you bastards!”

And we have before us the example of the most strongly politicized societies—which are usually the goal of those who would most earnestly promote the virtue of sincerity—as hotbeds of rampant insincerity. There people will loudly proclaim the party line, sing the party songs, and march in lockstep with the party cadence, while secretly loathing the party and all its purposes. And the more the party demands of them proper feelings of allegiance and respect, the greater becomes their popular hypocrisy—but always well hidden, driven underground. People are just ornery that way.

No, people still own the real estate inside their heads. They need the space, the personal freedom, of being able to think one way and act another. They need to smile when they are tearful, to force a polite response when they want to scream at you, to turn away with a murmured courtesy rather than engage and share their deepest thoughts. Humans have always been a bi-level species. We have always used meaningless courtesies to smooth over differences between individuals that would otherwise have us howling all the time. Similarly, we use formalized, ritual diplomacy to moderate relations between nations that would otherwise have us always on the brink of war. Hypocrisy and insincerity let us pick and choose our battles. They allow us to live.

So yes, while we would like to think that our friends and family are always sincere in their expressions of love, gratitude, and contentment, that the barista at Starbucks is doing us a favor that deserves our thanks, and that corporate executives mean in their hearts every word they put in a press release—it is not always so. And our indulging the small hypocrisy of not really noticing—that, too, is part of the social grease that makes life tolerable.

1. Computer pioneer Alan Turing in 1950 proposed that, to test a computer for artificial intelligence, we station a person on one side of a wall and have them communicate with a respondent on the other side through typewritten messages. The first person does not know whether the other is a real human being or a very fast and well-programmed computer. If the person on this side cannot tell after about five minutes whether the respondent is human or not, then the responding machine counts as artificially intelligent. This test has since been superseded by others that measure specific outputs and performance in more dimensions than simple chat, because mindless, confabulating language processors, like ELIZA in the mid-1960s, could fool users into mistaking them for human but were hardly intelligent.

Sunday, April 23, 2017

On Teaching Writing

I can’t teach another person how to write. I don’t think anyone can. This is not to disparage those who teach writing courses and run workshops and retreats for beginning writers. Some of the basic skills are necessary and teachable. New writers also believe that the art includes structures they need to learn and professional secrets that can be taught. And I am not ungrateful to the excellent English composition teachers I had in high school and college who did teach me a thing or three. Finally, every new writer needs someone in a position to know something about writing who will read their work and give them both positive and negative feedback, because that builds confidence. But the art itself, the essence of writing—that can’t be taught, because it grows from inside a person.

Every new writer needs to know the basics. Becoming a writer is impossible without knowing the language in which you will write. For English, and for most other Indo-European languages, that means understanding its grammar: the parts of speech; verb tenses; slippery concepts like mood and, in English, the subjunctive; sentence structure and diagramming; the rules and the malleability of syntax; words themselves, through vocabulary and spelling drills; the difference between a word’s denotations and its connotations; and on and on. It also helps immensely to study other languages that have contributed to your native tongue—as I did with French, Latin, and Greek—as well as one or more parallel languages—as I did with Russian and Portuguese—in order to recognize cognates and word borrowings, to puzzle out the meaning of new words, and to gain a taste for their flavor.

The art of writing also has some broader structures that the novice can learn by rote. In the nonfiction world, the writer can study the organization of an argument: going from specific to general, or general to specific, and the logical fallacies that invalidate an argument in the eyes of any educated reader. A journalist learns the inverted pyramid structure, where the most important facts of the news story—the Five W’s of who, what, where, when, and why or how—occupy the lead, while other details and analysis necessarily follow. An essayist learns to find a hook in everyday experience—such as a common question or problem—that will draw the reader into the thread of the argument. A technical writer learns to break down processes into discrete steps and to address each one, with all its variables, separately and usually in chronological order.

In the fiction world, there are fewer formal structures to observe. Short stories are usually more compressed in time and space, and involve fewer characters, than novels.1 Most stories of any length are structured around some kind of loss, struggle, journey, or other contrivance that serves to keep the action moving forward. Most of them arrive at some kind of climax, where all the characters, plot lines, and problems come together and are addressed or resolved. And most also have a denouement, which tidies up any last unresolved plots and suggests how the characters will spend the rest of their lives. A playwright learns how to frame a story in separate acts, and how to build the action toward a break—or an apparent resolution of at least part of it—at the end of each one. A playwright also learns to convey character and action through dialogue, since excursive histories and graphic action are not possible on a closed stage. A screenwriter must also learn a more formal page layout, which sets separate margins for action and dialogue, so that one page of script roughly equals one minute of screen time, and which puts certain words associated with sound or visual cues in all caps, so that they will be noticed and given proper treatment in production. To be successful, the screenwriter also must obey current conventions about act structure and timing.

But aside from these generalities, the art of writing is something that either takes you into its confidence—or it doesn’t. Your mindset, daylight dreams, life experiences, and past reading either prepare you to tell stories, paint with words, and sing cantos to your readers—or they don’t.

Two years ago, I decided to make a formal study of music because, while I have always loved listening to music both popular and classical, my knowledge of it remained rudimentary. I wanted to be able to make music as well.2 I also wanted to keep my brain active and alive as I enter my later years, and learning new skills and tackling new challenges seemed like a good idea. I began taking lessons on the keyboard—a generic term covering pianos, organs, and synthesizers—and bought myself a Hammond drawbar organ with two keyboards, presets, vibrato and chorus, the Leslie function—don’t ask—and a set of pedals.3 I chose to learn the keyboard because it would teach me about chords and voicings in a way that a single-note instrument like the trombone could not, and because it holds a fixed pitch in a way that a stringed instrument—which needs to be constantly tuned—does not.

My teacher, who is a lifelong musician himself, had me learn scales and taught me the Circle of Fifths, which unlocked for me the structure of music. I already had some sense of how notes are represented on the staff, what the timing intervals are, and other parts of musical notation. I now learned how chords are structured, with all their variations, as well as chord progressions, which appeal to the ear and are the basis for most popular songs. This was like learning grammar and sentence structure as a prelude to writing.
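
For the curious, the arithmetic underneath that structure is simple enough to sketch in a few lines of Python. This is a toy illustration only, using the conventional note names and semitone counts, and not anything out of an actual lesson:

    # Toy sketch: the Circle of Fifths is just repeated steps of seven
    # semitones around the twelve-tone octave, wrapping via modulo 12.
    NOTES = ["C", "C#", "D", "D#", "E", "F",
             "F#", "G", "G#", "A", "A#", "B"]

    def circle_of_fifths(start="C"):
        """Return all twelve keys in circle-of-fifths order."""
        i = NOTES.index(start)
        return [NOTES[(i + 7 * k) % 12] for k in range(12)]

    def major_triad(root):
        """A major triad stacks a major third (4 semitones) and a
        perfect fifth (7 semitones) on the root."""
        i = NOTES.index(root)
        return [NOTES[i], NOTES[(i + 4) % 12], NOTES[(i + 7) % 12]]

    print(circle_of_fifths())  # C G D A E B F# C# G# D# A# F
    print(major_triad("G"))    # ['G', 'B', 'D']

The same seven-semitone step reaches every one of the twelve keys before returning home, which is why the diagram closes on itself as a circle.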

My teacher has also taught me to differentiate harmony from melody, to break down a new piece of music into its chords and their roots, to play those chords alone at first, and only then to work on the melody. This prepares me both to play the song and to accompany a singer or other band members. I also learned to blend the two functions of harmony and melody through voice leading. I learned to keep time in the bass and to do bass walks—although my timing is still faulty. I am now learning blues scales and their progressions. This is all like learning the various structures of nonfiction writing formats, or the differences between a short story and a play.

But … after two years, I am still at the stage of analyzing and interpreting, of working the thing out intellectually rather than emotionally. I am working my left hand to form chords or walk the bass, my right hand to play melody or voice lead, but the two are not yet coming together. I approach each new song as an exercise in deconstruction. A song is an intellectual challenge, not an act of personal expression. I can make music, but I don’t yet make it sing.

This is the essence of art. In music, you can learn notes, scales, chords, and progressions—but something inside you must open up and sing. In writing, you can learn grammar, vocabulary, and rudimentary structures—but something inside you must catch fire with story. A teacher cannot tell you how to light that fire. Oh, he or she can light matches and flip them at your head. A teacher can spot the small sparks that sometimes appear in your writing, point them out to you, praise them, and try to nurture them. But if the student’s mind is—figuratively—damp wood, then nothing will catch.

Armed with the basics of language and structure, any writer still must eventually teach him- or herself how to make a story come alive. In part, we do this by reading widely and observing what other writers have done and how they do it. For this, I love to study how a tale is formed in a short story or novel and then remade into a movie. Between one form and the other, the essence of storytelling emerges, stripped down and rebuilt, like a butterfly from a caterpillar’s chrysalis. As writers, we can also ask questions of the stories and novels we read: How did the author do that? What was he or she trying to achieve? Why does this feel right (or wrong)? We also absorb, as if by osmosis, what constitutes good taste in storytelling and what lands with a dull thud. And, of course, we learn by doing: by trying out new approaches, by seeing what works for us in our own personal style, by learning how to create and move characters, by alighting on the forms and structures that work, by discarding the techniques and tropes that seem awkward and excessive.4

Writers learn by writing and adapting stories to their own personal taste and style, just as musicians learn by playing and adapting songs to the rhythms and harmonies that personally move them.

Ultimately, anyone with a sense for language and logic can learn to write an adequate newspaper article or a technical manual. These are about facts and need only an awareness of the reader and what he or she wants—or needs—to know in order for the writing to work. But stories involve something more, the dimension of emotions, of aspirations, desires, fears, and disgusts. The storyteller must look inside his or her own self and experiences to find the expression of ideas and emotions that will touch a similar chord in another human mind. Stories are as much about being human as they are a collection of words, imagined conversations, described actions, and resolved plot lines.

This is why I believe machine intelligences may one day write adequate newspaper articles and technical manuals, but they will never excel at writing fiction. Not, that is, until machines become so complex and involuted themselves that their programs resemble the human mind. Humans live in a shadow world: part of our daily life revolves around the facts and consequences we glean from the external world, and part lies in the interpretations we place upon them. And these interpretations, our attractions, aversions, and distractions—the push and pull of hot and cold feelings, toward and away from various thoughts and objects—are shaped by all of our mind’s unexpressed desires, hidden agendas, disguised hatreds, and other emotional influences that lie buried in the subconscious and come out only in random thoughts and in our dreams.

If the writer of fiction does not touch this subconscious level,5 then the story will remain a mechanical exercise, a study of forms. It may involve extremely well-crafted characters carved from clues written on bits of paper drawn out of a hat. It may involve an intricate plot filled with actions traced on an Etch-a-Sketch. But it won’t make the reader glow with recognition, or identify with the situation, or even much care. That kind of story comes from within, and it’s nothing that one human being can teach another how to create.

1. Some fiction classes might also go into technical details that—for me—are esoteric to the point of disappearing into the aether, such as the differences among the novelette, the novella, and the novel. Aside from their limits on length, I don’t know how these forms differ, other than being longer and more complex than a short story. How long should any piece of fiction be? Long enough to tell the story and satisfy the reader. Any other consideration is the business of publishers, such as how much paper and ink they will need to buy.

2. When I was in fourth grade, I started playing the trombone, along with dozens of other students who were being introduced to the school band with their own instruments. I could read music to the extent of linking notes on the staff with positions on the slide. I understood the timing of notes and tempo. I grasped that flats were a half-tone down, while sharps were a half-tone up. But I never understood the structure of our Western, twelve-tone music which makes the white and black keys of a piano a physical necessity. I never associated all those sharps and flats written on the staff at the start of a piece of music with anything other than the composer’s prefacing remarks; so I did not understand how they changed the playing of notes that appeared halfway down the page. The idea that music had different scales and keys, and that they were all part of a greater structure, was never fully explained to me. Naturally, I was terrible at playing the trombone.
       Nevertheless, my hunger to make music persisted through the years. I tried at various times to teach myself the violin, the guitar, the Chapman Stick®, and the French horn—all without success. Then, two years ago, I finally got serious and began studying music as a thing in itself, and I started taking lessons on the keyboard.

3. When I was growing up, my Dad bought a Hammond organ with rotary tone wheels and learned to play it. I never actually played the thing, but I did goof around on it, fiddled with the drawbars, and listened as the various pipe lengths and their voices came together to make a sound. So this instrument was more familiar to me than a piano and less bizarre than a synthesizer.

4. Early in my writing career, I heard an interview with Marilyn Durham about her novel The Man Who Loved Cat Dancing, where she mentioned the problem of getting a character to walk through a door. This is harder than it sounds, because any writer grapples—as I was grappling at the time—with how much to show and tell in every action. Do you describe the door? Do you show the character twisting the doorknob? Do you use the sweep of the opening door to describe the room inside? My epiphany then and my practice ever since is that, unless the door is important and hides something tantalizing, ignore it. Start the action inside the room. Doors are for playwrights who have to get their characters onto a stage.

5. See Working With the Subconscious from September 30, 2012.

Sunday, April 16, 2017

Objective, Subjective, and Narrative

Is the world external to ourselves, internal in our minds, or shared with others through the process of narrative? To answer that, we first need to decide what we mean by “the world” or “reality.”

Certainly the brain as an organ of thought can have no direct experience of the world outside the bony skull. It can have no actual sensation of any kind. Even “headaches” are a matter of blood pressure, muscle spasms, and pain in the nerves of the head and neck, not in the cerebral lobes. The only events that the brain can detect directly are cerebral accidents like strokes and concussions, and then most of those events are mediated by consciousness and its loss, or by changes in brain function and awareness, rather than by direct detection and assessment of the damage. Our minds are circuit ghosts that run through a network of neurons, rather than the neurons themselves.

Everything that we think of as reality and the world around us is conveyed to the brain by nerve stimulation. This sometimes occurs directly, as with nerve endings embedded in the skin and other tissues, yielding the senses of touch, taste, and smell. Mostly—and for the senses our minds are most aware of and involved with—the nerves connect with organs that receive and interpret signals from the external environment: the eyes to detect light, the eardrums to detect pulses in the surrounding air, the inner ear to detect the pulls of gravity and acceleration, and so on. Each of these organs, as well as the sensors in the skin, nose, and tongue, reports its findings to the brain all the time.

Here is where the subjective/objective problem starts. The brain can process all these inputs, but the mind—the active awareness, the collective effect of those “circuit ghosts”—cannot pay attention to all of this simultaneous and often conflicting information. The brain of a developing organism automatically sets up hierarchies of stimulation awareness that govern the mind’s picking and choosing among the continuous flow of signals. For example, we react to the sound of our own name even among a jumble of overlapping restaurant chatter or a stream of messages through a public-address system. The loudest noises and the brightest lights and colors get priority in this hierarchy and pass directly to our awareness, because they are usually associated with danger. The brain’s monitoring of the edges of the retina—our peripheral vision—gives priority to movement there over signals that indicate outline or color, because the movement of things around us is also usually associated with danger and more important than their shape or color.

So, even though our brains may receive accurate reporting of signals related to sight, sound, and touch, those signals don’t always reach our awareness for interpretation and action. In this way, the objective data become partially subjective. Of course, much of that external reality is not subject to either mediation or reinterpretation by the brain. Fall out of a window, and you will find that gravity takes over and you hit the pavement whether you think about it or not. In the same way, explosions, collisions, accidents, and intentional violence all have a reality that is not brain-mediated and subject to our awareness.

Human beings have also developed mechanical instruments designed to isolate and mediate signals from the physical world before they reach our senses for interpretation. For example, a spectrometer can pick out the actual wavelength of a beam of light. It can measure and tell a human observer that what appears as blue light has a wavelength of 440 nanometers, green light 530 nm, and red light 700 nm. Even if you have been raised by aliens or by color-blind humans, so that you traditionally call red light “green” and vice versa, the spectrometer can give you an accurate reading of the light as it is defined by the majority of other human beings. In the same way, an electronic tuner can identify the pitch of a note, say Concert A at 440 Hertz, even if you are personally tone-deaf.1
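
The distinction between the objective reading and the conventional label can be made concrete in a few lines of Python. The band edges below are rough conventions, assumed here only for illustration, and they vary from source to source:

    # Toy sketch: the instrument reports a number; the color name is a
    # human convention layered on top. Band edges are approximate.
    BANDS = [(380, "violet"), (435, "blue"), (500, "green"),
             (565, "yellow"), (590, "orange"), (625, "red"), (750, None)]

    def color_name(wavelength_nm):
        """Map a measured wavelength in nanometers to a conventional name."""
        for (low, name), (high, _) in zip(BANDS, BANDS[1:]):
            if low <= wavelength_nm < high:
                return name
        return "outside the visible range"

    for nm in (440, 530, 700):
        print(nm, "nm ->", color_name(nm))  # blue, green, red

Change the names in the table and the readings stay the same; only the labels move, which is the whole point about objective measurement versus subjective naming.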

So, even though the brain mediates, selects, and interprets the reality coming in from the sense organs, a measurable and appreciable—that is, objective—reality does exist outside of human awareness. Still, awareness plays a big role in interpreting and providing context for that external reality. Is red light more pleasant to the brain and to the awareness “ghost” that rides inside it than green or blue light? That is a matter of subjective interpretation. Does a red light in a traffic signal mean “stop,” while green means “go,” and yellow means “do you feel lucky”? That, too, is a matter of interpretation based on learned—or, in the case of yellow, badly learned or misremembered—rules governing human activity.

The truth, as I see it, is that reality exists objectively but is interpreted subjectively. And when we go about putting reality into words—as I am doing right now, or when I try to relate an experience or tell a story in one of my novels—then the process becomes even more subjective. The idea, the story, the sense data that my brain recalls and tries to fix in words, which other humans can then read, understand, and so vicariously experience, become subject to the limits of my vocabulary, the meanings which I have learned to attach to certain words, the connotations with which those words resonate in my own private thoughts, and the choices I make among all these variables, on the fly and in the heat of creation. While a piece of writing—a novel, a poem, or a news report—may become a crystalline distillation of reality for an enthusiastic reader, it is at the same time more and less than an absolutely faithful recording of the originally perceived reality.

In the same way, when a producer, director, cast of actors, and crew of set dressers, camera operators, and computer-graphic artists set about to recreate a beloved bestselling novel or even an original screenplay, they will be creating a new interpretation based on the particular skills, vision, and choices of each member of the production company. It may be, in both the author’s and the audience’s view, a crystalline distillation of the story, but it will still be different from the act of reading the novel or screenplay, or living the reality as it might have occurred in real life. While a novel or screenplay may have had no existence “in real life,” the same acts of selection and interpretation—of words in a book, or of skills and vision in a movie production—apply to the creation of a biography or a biopic about a person who actually lived and had a historically measurable effect on reality. The result in the telling will be different from the reality of living.

So we, as aware human beings, dance back and forth across the line separating objective and subjective all the time. We live in our minds, but our minds also participate in a reality that affects other human beings.

And now a third consideration has come into our lives. Based on the theories of postmodern literature and our evolving social sciences, we must deal with “the narrative.” This is more than the plot and story line of a single novel or movie. This is a shared construct of political and social—if not indeed physical—reality, put together from the writings of influential theorists, reporters, academics, and others who get paid for expressing their opinions. This sociopolitical narrative becomes the basis for new novels and movies, for the selection and interpretation of news stories, for more editorials, blogs, and opinion pieces, and for the crop of memes—those bits of ideological weed that would seem to crystallize one aspect of the larger narrative in one instant of time—that all reverberate in the social, media, and social-media echo chamber of modern society.

For anyone who bathes in the daily waters of politics, economics, opinion, and written and enacted literature—that is, books and movies—the narrative becomes an intermediate reality. Outside the brainbox of the skull, there exist both the objective reality of blue light at 440 nm or Concert A at 440 Hz and the narrative reality in which certain people are automatically good and virtuous, while others are wicked and demonic; certain historical facts are remembered and cherished correctly, while others are dismissed as damned lies; and certain acts or intentions are considered rational and lifesaving, while others are known to be foolish and cruel. And when a human encounters conflict between his or her subjective experience or memory and an externally shared narrative which he or she has accepted, either cognitive dissonance or a personal crisis occurs. When a society encounters conflict between its public narrative and external reality, a political, social, or economic crisis occurs. And sometimes the narrative wins—to the detriment of personal or social existence.

This is not actually a new phenomenon, although we now celebrate this kind of consensus in earlier times as “the narrative” and consider it either the obvious end product of a cohesive society or a mindless form of manufactured groupthink. Every religion since nomadic herders began coming together in their summer pastures has spun its own narrative, its greater vision, in which the lonely circuit ghost inside each brainbox partakes. The Greeks had their narrative of the Trojan War, a fictitious relic of a forgotten age—not unlike the Arthurian narrative of chivalry for the English—which shaped their ideas about what men and women were supposed to feel and do, how the gods were to be honored or mocked, and how the best of human intentions can sometimes go awry. The Roman world was held together largely by its own narrative, played out in the minds of emperors, generals, proconsuls, and tax collectors.

Narrative is strong in some people’s minds. Control of the narrative is a kind of power. And narratives that have the charm of simplicity, the echo of truth, and the ability to enforce popular desires will eventually drive out any narrative that is too complex, difficult to verify, or particular in its voice and vision. Narrative persists in any social setting until a harsh reality intrudes—the political party collapses or becomes irrelevant; the society is invaded or encounters some physical catastrophe like an earthquake or meteor strike; the economy is disrupted by some new technology or goes broke through a lack of production or an excess of debt—and then individuals are left to grope among the pieces to assemble their own, subjective views of reality once more.

Which of these—objective, subjective, or narrative—is the true reality? None of them. All of them. They are the blend of physical, private, and public or social needs, drives, and obligations that guides and motivates every human being. To choose one among them and elevate it to supremacy is to miss the whole point of being human.

1. But some perceptions remain an artifact of the mind. When I was living back East, I would sometimes sense the sky on a summer afternoon, under certain conditions of humidity, as green rather than blue. This was not the bright green of a laser pointer or fresh grass, but the muted green of pea soup. It may have been an actual color, reflected up into the sky from the green plants in the fields, or produced by moisture high in the atmosphere scattering the 530 nm wavelength more strongly than the 440 nm. But it sure looked green to me.

Sunday, April 9, 2017

No Utopia

A germ has infected the minds of bright people, deep thinkers, and intellectuals in almost every Western society. Its inception dates back almost 2,500 years to Plato and his Republic. The infection resurfaced in the 16th century with Sir Thomas More and his Utopia, and once more, with even more virulence, in the 19th century with Karl Marx and his writings in The Communist Manifesto and Das Kapital. The essence of these various outbreaks—the DNA of the germ, so to speak—is that humanity can reach a perfect state, a lasting condition of peace and plenty and universal brotherhood, if everyone would just let the brightest people among us order society according to rational, scientific, humanitarian principles.

This notion stands in stark contrast with the other two principles of social organization. The first goes back to the monkey troop: let the strongest, smartest, and (hopefully) wisest people in the tribe, confederation, or nation fight among themselves until the (hopefully) best person wins; then we will all obey him (or rarely, her) as our alpha male (or female), chief, king, emperor, or führer. This principle saw its heyday in the English Wars of the Roses. The second principle is less ancient and was developed in Greece at about the same time that Plato was writing: let people come together, offer and consider arguments, and choose among themselves the best course of action, with the process overseen by a committee of citizen caretakers. With later refinements to address issues like dispersed populations, protection of the rights of the minority, and delegation of the popular will to elected representatives, this principle has come down to us in various democracies and in our own republic.

The main difference between the utopian ideals of Plato, More, and Marx and the other organizing principles of either kingship or democracy lies in the nature of time. A king or queen is supported, followed, and obeyed for only a brief moment in history—even if the king or queen and his or her followers don’t understand this and think accession to the crown is forever. The monarch’s decrees and orders are binding only for a limited period and subject to change with changing underlying conditions. Some decrees may become the basis of popular tradition and so live on for decades or centuries beyond their origin. But a new king with a different appraisal of the political, economic, defensive, or agrarian situation, or new experience of changed conditions, can issue new decrees that point social energies in a different direction. Similarly, the proposals and laws that democratic societies or their democratically elected representatives enact are always of a temporary nature. Some of the society’s core values and principles, such as those bound up in the U.S. Constitution, are intended to endure for the ages. But even these sacred documents are always subject to interpretation and amendment. The essence of either the kingship principle or the democratic principle is that law and the social organization which supports it are flexible, subject to the imagination and understanding of the people being governed, and able to respond to changing political, economic, and other conditions.

But the bright minds of the theorists—Plato, More, Marx, and all those who sail with and believe in them—look for a social order that transcends the changing nature of human understanding and imagination. They aim at the end of politics, economics, military aggression, agricultural variation, technological invention, and other underlying conditions. They expect to initiate the end of history itself. Their social and political views are flexible right up until the new and improved social order that they have devised takes over and operates so perfectly that nothing further can change. When you reach perfection, there’s nothing more to say or do.

Of course, human nature abhors perfection. We are restless creatures, always thinking, always plotting and planning, always sampling and judging, always looking for greener grass and trying to breed redder apples, always wanting more, always itching with a kernel of dissatisfaction. This itch is what motivates the bright minds looking for utopia in the first place. But those minds never stop to consider that restless humans, given an eternity of bliss, will soon want to reject their perfect state and move on to something even newer and better. That is the lesson of the Eden story.

The more hard-core, dirigiste thinkers among those bright minds have already concluded that human nature will ultimately need to be changed in order to fit into their perfect societies. People will have to become more self-sacrificing, more contented, less quarrelsome, more altruistic, less greedy, more … perfect, in order for their new and enduring social order to function. Stalin famously supported the researches of Trofim Lysenko, the agrobiologist who taught—counter to the principles of Mendelian genetics—that acquired traits, the products of nurture, could be inherited by later generations. Lysenko was working with plants and seeds, but Stalin and his acolytes believed that the same technique could be worked on human beings: get them to change their behavior, rethink their own natures, become the new Homo sovieticus—and human nature would be changed for all time to conform with Marxist-Leninist principles.

People do change, of course. While genetic mutations—especially those affecting brain function—tend to be slow and often self-canceling, especially when they work against social norms, people are always susceptible to new ideas. Religions with their coded values and ethical propositions can sweep across populations much faster than any physical mutation can sweep down the generations. Christianity raises the values of love, reciprocity, and cooperation. Islam raises the values of uniform belief and submission to religious authority. Buddhism raises the values of right thought and action in a hurtful world. And, yes, Marxism-Leninism raises the values of personal selflessness and obedience to political authority.

But these core value propositions are still subject to change. Inspiration and revelation harden into orthodoxy, become challenged by disputation and reformation, and succumb to new and different inspirations and revelations. The one exception to this change process would seem to be the scientific method, which is a form of either anti-revelation or continuing revelation. Arising in the work of 16th and 17th century thinkers like Galileo and Descartes, the method values observation and experimentation as the only support for—or disproof of—conjecture and theory. By its nature, the scientific method is flexible and perpetually adjusts its findings to changes in—or new discoveries about—those underlying conditions.1

We’ve been riding a wave of technological change derived from the scientific method for four centuries now. Observation, hypothesis, experimentation, and analysis have variously given us the steam engine, telephones, radio, television, computers, evolution, and genetics as well as far-reaching advances in our thinking about physics, chemistry, and biology. Human life is qualitatively and quantitatively different from what it was four hundred years ago in those societies that have embraced science as an organizing principle along with the Western tradition of personal liberty and free-market exchange of ideas, goods, and services.

And yes, along with our understanding of biology, chemistry, and physics, our understanding of human psychology and social structures has vastly expanded. We have become the animal that studies itself and thinks about its own future—not just on a personal level, but as a social organism. But we are no closer to finding a “perfect” social structure, because human beings are still descended from irritable, distrusting, independent-minded monkeys rather than docile, cooperative, obedient ants or honeybees. No amount of religious indoctrination, state orthodoxy, or applied Lysenkoism will remake the mass of humanity into H. sovieticus.

Get over it, bright minds. In the next hundred or a thousand years, we may reach the stars, transmute the elements, and be served by mechanical intelligences the equal of our own. But we will still be irritable, distrusting, independent-minded creatures, half-angel, half-ape, and always looking for greener grass and redder fruits. And that flexibility of mind, combined with our stubbornness and independence, is what will keep human beings evolving and moving forward when more perfect creatures and more orderly societies have vanished from this Earth.

1. To quote physicist Richard Feynman: “It doesn’t matter how beautiful your theory is; it doesn’t matter how smart you are. If it doesn’t agree with experiment, it’s wrong.” Unfortunately, these days some of our lesser scientists—and their lay followers—seem to think that scientific propositions can be “settled” for all time and somehow made immutable.

Sunday, April 2, 2017

Obvious and Mysterious

In life, out here in the real world, I like things to be obvious. Paths are clear, yes means yes, love is forever, a promise is always kept, and hatred is to the death. In that way, I guess, I am like a child and easily fooled. But the world is complicated enough, nature preserves her mysteries, opinions differ, scientific and social theories expand outward like ice crystals, and the truth is difficult to seek and harder to find. Some days, it’s a puzzle just to know what’s going on around you.

But in literature, in the stories we tell for fun, I abhor the obvious. Plain people with pure hearts, villains with no shred of decency, roads that lead straight from one boring place to another, stories where cause leads predictably to effect—these are for children, not for me. The world of literature should be as unlike reality as we can make it.

Most of us live in a world bounded by rules and promises. The basic rules are handed down by our parents, if we are lucky, and become embedded in our psyches like guard rails on a twisting mountain road.1 These ground rules involve the basics of behavior like internal honesty, ambition and effort, personal credulity and cynicism, and other character traits. Some rules are part of our learned social structure: how we greet guests of differing social stations, which fork to use and when it’s appropriate to use our fingers, how to write a thank-you note, and other things we do because to forget them would make us social pariahs. And still other rules are handed down by governments: how to drive and where to park, what we have to pay in taxes, and when and where we can cross the street.

Promises we make every day: to show up for work at a certain time, to be home for dinner, to support the children and provide for their future, to pay off the mortgage, to take a sick day only when we are really sick. This is what a human being does. This is responsibility. These are the things we owe to our families, friends, employers, and bankers. The walls made up of rules and promises are invisible but strong, like armored glass—at least if we are a person of good heart and conscience2—and these walls do not break easily, if at all.

In literature, we expect these rules and promises to be more flexible. Not absent, of course—otherwise the characters would have no conscience, no society, no state, and no civilization.3 But as readers we expect the walls to be less confining, more like rubber than glass. Of course, many stories are based on the main character suffering a great and immediate reversal or loss in his or her life—of a loved one, a job, a long-sought prize that is finally within reach, or some other disruption. The story is then about how the character reacts to the new situation, where the person’s ordinary state of being is stripped away. And then the old rules, the guides of conscience and social order, either don’t apply or have much reduced force in the character’s life and thinking.4

In my own writing, I strive to mix the two, the obvious and the mysterious. I am not a writer of horror or the supernatural per se; so I’m not talking about ghost stories or gods and demons—although stories about artificial intelligence and alien life forms come close. But I believe that the world as we perceive it has hidden depths and dimensions that we do not know or even suspect. I like to present a character in a plain and predictable exterior world that—when he or she gets too close, feels enough frustration, thinks too deeply, or presses too hard on the glass—reveals a new perspective and a new set of opportunities.

A writer can add the element of mystery in one of two ways. One is to take the character and the story into a strange and mysterious—or simply different—setting. Most science fiction stories are like this. The character goes into space for the first time, or lands on a strange planet, or encounters a new culture with mysterious practices. In the two-volume novel Coming of Age, my strange planet and new culture are simply the future, which my two main characters visit by receiving life-extension therapies that take them far beyond the traditional lifespan of “three score and ten.”

In the novel I’m working on now, The House at the Crossroads, which is the prequel to my time-travel novel The Children of Possibility, the story delivers a kind of double whammy. First, two of the main characters are from six thousand years in the future, introducing the reader to a domain and culture that are almost like the Europe of our near future, but with some significant differences, because civilizations have fallen and risen in the interim yet never strayed far from the European character.5 Then, these two characters sign up with a time-travel service and are trained to go back and maintain a time portal in a medieval Europe that is strange and different to them—and perhaps not so familiar to the modern reader, either. And in the original book, Children, the mystery was to see the reader’s present day in the early 21st century through the eyes of a time traveler from nine thousand years in the future.

A second and simpler way to create mystery for the reader is to leave something out of the story. This is the writer being not so much neglectful—“Oops, I just forgot to mention”—as intentionally deceitful. For example, the reader may not know, but will soon get enough clues to guess, that the main character is actually guilty of the crime he or she is denying. Or that the character is actually blind, or mentally disabled, or psychotic—or a machine. These are always fun—even when the reader knows up front that the character has these attributes, as in my novels about artificial intelligence ME: A Novel of Self-Discovery and ME, Too: Loose in the Network. And again, the reader sees the familiar world of the present through the eyes of a self-aware software program which must navigate both our digital infrastructure and our complex human relations.

That second method—leaving something out—also works in a novel where multiple characters have their own viewpoints and the story, in the hands of the omniscient author, plays one off against the other. I do this even in my non-science-fiction novels, like The Judge’s Daughter and The Professor’s Mistress. The reader, like the author, knows the motivations and intentions of the characters “on both sides of the door.” The mystery enters when the reader has to figure out how one character will discover the other’s real intentions, or secret, or the trap that has been laid.

But that, too, is a source of mystery in any story. Unlike life, which is regular, endless, and virtually unplotted, we read stories because we sense there is an underlying structure, a plot, a coming climax, resolution, and denouement—but we can’t figure out what it will be. The writer, like the omnipotent god of a world two handbreadths wide, keeps the future a secret and delights in teasing and fooling the reader.

1. When I first came out to California from Pennsylvania in 1970, I was amazed at all the sharp curves with steep drop-offs that had no guard rails to keep drivers from plunging over the cliff. In Pennsylvania, even a modest embankment is buffered with steel. But driving along Route 1 in Marin County, with a vertical slope above you and the Pacific Ocean two hundred feet below, only a little berm of sand prevents an erring driver from slipping over the side. It took me a while to realize that, while Pennsylvania roads get snow-packed and icy, with lots of skids, particularly on curves, the roads in coastal California are free of ice all year round, not even wet for most of that time, and so as safe as a pair of railroad tracks—if the driver exerts a minimal amount of caution. Of course, in lashing rain or dense fog, it would be nice to have those guard rails, but Californians are a tough breed.

2. If you are not a person of good heart or conscience, then you live in a kind of fantasy world compared to those around you. You tend to think nothing touches you, no one will notice your actions, other people have no real feelings or intentions, and the law—loosely enforced and fallible as it is—will never catch up to you. That makes the rest of us fools, in the short term; and you the fool, in the long.

3. Which is the case in most zombie and post-apocalypse stories.

4. I once was told by a production executive in Hollywood that the formula for a modern movie script is: first act, the main character is living a normal life, until something changes; second act, the character tries to get back to equilibrium, but the whole world rises up and fights against him or her, preventing resolution; third act, the character finds a new strategy, a new weapon, or a new ally and wins the struggle; and, finally, the denouement, where the hero or heroine and the villain go mano a mano in a fight to the death. This structure makes a good 120-minute movie, but it’s a little simplistic for a full-length novel.

5. I believe that each geographic region and its embedded population have certain characteristics which change slowly, if at all, over time. This explains why International Marxism produced a mindset and approach to governing in Soviet Russia that was similar to Tsarist autocracy and markedly different from the loosely centralized Imperial bureaucracy of Communist China. And if the United States ever goes full Marxist, the result here will be similarly unrecognizable to either the Russians or the Chinese.