Sunday, May 21, 2017

History a Thousand Years from Now

If historians from the 30th century, the far future, look back on our time, what will they see? What will they know and understand about our period?

First, let’s assume that they have full access to our writings and recordings, but they are not personally subject to our current states of thinking and feeling. That is, future historians might be like a modern audience seeing Romeo and Juliet with the original Shakespearean dialogue. They might hear the words as actors of the late 16th century spoke them, and they might have some sense of their meaning. But a modern audience might not fully understand what social forces drive two powerful families like the Montagues and Capulets to gather their extended kin into rival armies and fight in the streets of medieval Verona. And if they did understand those forces, they still might not feel the hatred in their own breasts. For us moderns, the rage of Tybalt and the wit of Mercutio remain distant curiosities.1

However, even an Elizabethan audience might not have been able to say what exact qualities of mind or habit made the Montagues so loathsome to the Capulets and vice versa. The evidence supporting the feud between these “two households, both alike in dignity” does not exist in the play. Instead, it’s an “ancient grudge”—which is Shakespeare’s version of the MacGuffin that sets his plot in motion.

In similar fashion, what would far-future historians know about the passions that seem to be tearing the United States apart and ravaging many of the other Western democracies as well?

On one side, they might see a radical strain of progressivism or passionate futurism. This is expressed in a number of different popular movements—inspired by the writings of various political philosophers like Marx, Lenin, and Mao—but with the same aim of overturning the adherents’ existing society and its political, economic, and social structures. The radical’s goal is to create a new state, a new basis for economic transactions, a new morality, and a new set of relationships between men and women, between parents and children, and between citizens and the state. This new order is always projected from rational, egalitarian, humanitarian, compassionate, and collectivist principles. It is based on theories about human nature and visions of a future that has never before been experienced on a countrywide scale.2 And the fact that every country which has attempted to enact these theories and visions thereby sank itself into chaos, political repression, and self-inflicted poverty is explained away as a failure, not of the theories and visions themselves, but of the imperfect people making the attempt.

On the other side, the future historians would see a passive strain of conservatism or traditionalism.3 This side of the argument has no popular movements and only weak political associations. This side has no coherent philosophy based on novel thinking but is reflected in writers like Edmund Burke, David Hume, and Adam Smith, who—rather than creating a vision of a new social and economic order—were trying to understand and express how people actually go about their political and economic business in developed societies. The adherents’ goal is not so much to build anything new as to preserve and defend those structures and relationships that have grown up in their own society over the decades and centuries. They don’t mind certain people obtaining and wielding political power and economic advantage, but they haven’t subscribed to the new theories and visions. They don’t mind their society evolving, moving slowly toward different values and accepting different relationships through a kind of unspoken plebiscite. But they resist the notion that a cohesive and doctrinaire group, fired with strong ideas and emotions, should push their society into configurations that are either untested or have been found in the past to be disastrous.

The historians might trace out the sequential steps—the published positions, the party platforms, and the pivotal elections—in this growing social disruption. They might assign causes to the society’s failure to adhere to its former constitutional structures and its market principles, as well as its failure to provide adequate rewards for effort and risk taking. They might read about demands for personal compassion and complete equality among all members of society. They might hear the siren call of an enlightened utopia … but they will not be able to feel its pull. They will remain deaf to the dimension of imagination that drives the collectivist movement. And without an exercise of imagination and empathy, they will not know the depth of revulsion among those others who can accept social evolution but not political revolution.

What the historians will see in our times is a social madness based on masochism and fear. They will see people in the most advanced societies the world has ever known struggling with a loss of existential faith. People on one side insist their lives are impoverished amid an outpouring of goods and services. People on the other insist their lives are threatened simply by words and ideas. People on both sides are convinced the other has no understanding of—or respect for—them and their cause. Modern Montagues and Capulets bite their thumbs and spit on each other’s shadows without a clear remembrance of the ancient grudge that separates them.

Modern Americans look back on the political and social tensions of the 19th century that led to the first Civil War, and they grope for an explanation. “It’s all about slavery.” “It’s all about our way of life.” “It’s all about human rights.” “It’s all about my rights.” We look for simple, easy explanations of a complicated past that ended in four years of bitter war and the loss of more than 600,000 American lives.

If the divisions tearing at our society right now result in a second Civil War, what will future historians say? Granted, that first conflict was regional, based on two societies, North and South, which had grown apart—or had never actually been much alike—with their differing social values and economic systems, although with a single constitutional basis.4 The next conflict will be intellectual and visceral, with enclaves of sentiment and purpose concentrated along the country’s two coasts and among its urban elites, but otherwise with neighbor opposing neighbor across the width of a backyard fence. The next war is going to look more like a street fight or a riot—Montague and Capulet style—than any conflict between settled countries.

Will anyone in the far future understand it better than anyone alive today?

1. For a different sense of this family antagonism, here is Prokofiev’s “Montagues and Capulets” from the Romeo and Juliet ballet. It’s one of my favorite pieces.

2. But see When Socialism Works from October 10, 2010.

3. The word of the moment is “populism.” Supposedly, this reflects an aversion of the average person, and the populace as a whole, to the theories and visions espoused by a radical elite of political, academic, and cultural thinkers and leaders. A decade and more ago, it was the “people,” the populace itself, who were supposed to align with these theories and visions against the repressive forces found in traditional society. See how the language changes?

4. In fact, the Constitution of the Confederate States of America was largely a word-for-word duplicate of the U.S. Constitution, apart from a few significant differences particular to the Confederate cause. Clearly, the foundation and structure of the government were not in serious contention.

Sunday, May 14, 2017

Wolves and Dogs

By now it is generally accepted, although not entirely proven, that dogs evolved from wolves.

The best current theory is that, rather than humans stealing wolf pups and feeding and raising them at their campfires, some subset of wolves domesticated themselves. In this theory, the hunting pack was supposedly attracted to the edible scraps found in the humans’ kitchen middens—waste piles where hunter-gatherer groups tossed their old bones, discarded skins, and other refuse. Because the humans came randomly and often to dispose of these wastes, the wolves could not avoid contact with this large, strange, and unpredictable species. Over time, the wolves which demonstrated the most tolerance of human presence got the best and the freshest scraps. Fearful or hostile wolves kept their distance and got less of the good stuff.

This theory dovetails nicely with the work of Russian biologist Dmitri Belyaev, who bred foxes for tameness. Working sixty years ago, this dissident from Soviet biology began studying foxes in order to disprove Lysenkoism—the Lamarckian theories of Trofim Lysenko, who said traits acquired in life could be passed along to later generations. Stalin loved Lysenko’s ideas, because they proved that the Soviet state could, with sufficient force and enough reeducation camps, create a new “Soviet man,” whose selfless passivity and obedience to the Party would breed true and ensure Communist dominance into the future. Belyaev’s foxes—animals of the same family, Canidae, as dogs but not the same genus—gradually changed their physical appearance as well as their behavior. Through hormonal mutations associated with their tolerance of humans, the foxes over generations developed shorter snouts, rounder heads, and changes in coloring—among the same set of features that differentiates dogs from wolves.

How is this not evidence of Lysenkoism? Because in both cases—the wolves at the kitchen middens, Belyaev with his caged foxes—the changes depended on selective breeding for certain qualities. Both cases depended on various traits—fearfulness and hostility, or their lack, along with the associated neurochemical and hormonal differences—existing in the animal at birth. In the beginning of each transformation, these traits existed as random genetic mutations; in later generations, they were selected and reinforced through breeding. Wolves that could tolerate the human presence ate better and were more successful in mating; wolves that feared or avoided human contact either died out or returned to the forest. Belyaev’s fox pups that could tolerate being handled and liked being played with were allowed to breed at maturity; pups that snarled and snapped like wild animals were discarded from the experiment. Whether the selection was a natural circumstance of the environment around the midden or an intentional choice by a human breeder, the result was the same. A gradient of selection—a test for survival traits—was imposed on the breeding group, and the preferred traits were passed along to succeeding generations.

Every farmer does this, and it’s been going on since human beings first stopped roaming after the wild herds and settled on the land. We find a type of berry we like, plant it separately, control its pollination, and turn it into a brilliant red tomato—or a coffee bean with a particular flavor, or a luscious strawberry, or a conscious hybrid like the loganberry. We find a type of grass whose seeds are palatable and turn it into wheat—or corn, or any other type of grain. We find boars with the tenderest meats or wild horses with the strongest backs and turn them into farm animals. And dog breeders, like my aunt, find poodles with the best combination of form, disposition, and coloring and breed them to create a line of miniature and toy dogs that are exact replicas of their larger cousins. Other breeders find dogs that are attentive to human desires as well as quick and clever with sheep and turn them into herders and heelers. We’ve been doing this for ten thousand years.

In most cases the original specimen remains, for the rest of us, obscure. The original and unattended tomato plant has either died out for lack of habitat or hides in a forest glade somewhere, unrecognizable to passing hikers. The original boar might lurk in the forest and become the target of occasional hunting parties, but for the most part the production of pork for barbecue ribs and savory sausages remains hidden from the average customer’s attention.1 Wild horses still exist in the American southwest, but they are only the feral descendants of domestic horses brought to this continent by Spanish explorers. The original, prototypical horse—Przewalski’s wild horse of the Central Asian steppes—once nearly went extinct and survives today only as a curiosity.

In these cases, the average person has no emotional attachments, either to the farmed pig or the feral boar. But in the case of wolves and dogs we have both attachments and opinions.

Wolves exist in the public imagination as noble creatures. They are bound to the pack, loyal to their mates, fierce in their hunting, sleek in appearance, and bold in their status as predators. Although wolves might be the subject of childish fears born out of fairytales and horror novels, for most people they are the emblem of everything that is implacably wild and free—and true to itself. The wolf has its own nature.2

Dogs exist in our homes as loving companions. They are biddable, fawning, loyal to our family, suspicious of strangers, and gentle with our children. Many people sleep in the same bed with their dogs. The average dog, with its rounded head, floppy ears, and wagging tail, is now more our court jester and emotional pillow than our guardian and defender. Yes, a large dog can be trained to become fierce and unfriendly, but it does so only in response to human bidding. Its nature is to trust and depend. The dog has the character we give it.

For many people, the transformation from wolf to dog is a travesty, if not a tragedy. We—or our table scraps—have created something unnatural, in defiance of nature. We have taken an animal that was once self-sufficient and uncompromising and turned it into a beggar and a clown. But the wolf of our imagination would make a poor playmate for our children, have no interest in defending our homes, and would not sleep in our beds or even doze in our strange and dangerous presence.

For others, the wolf in the wild is a menace to livestock, a danger to house pets and babies, and at the very least an unpredictable presence around ranches and farms. There are still people who will shoot a wolf on sight, even while environmentalists are trying to restock and encourage them in habitats where they once roamed. The wolf is a top predator in an environment that offers ever fewer prey animals and so has become a nuisance.

None of these considerations, of course, is of any concern to either the wolf or the dog. Each animal is performing in its environment and reacting to stimuli exactly as its genes were selected to do. Each is fitted for survival under the circumstances in which it finds itself. And this is perfectly natural.

Wolves and dogs are both still fresh in the human consciousness and imagination. They are a reminder that our species has changed the natural world in ways that we believe are both good and bad. We bend species to our will. We change forest and field into plantation and farmland. We occupy so much of the land and use so much of the rivers flowing across it that, in many areas, “nature” is a thing that must be preserved behind a fence.3

Which is preferable—wolf or dog? That depends, like so much else, on your viewpoint and your purpose. I for one am glad that the distinction exists. I am proud that we have had a hand in engineering a companion who can remind us to be kind to creatures that are different from and less capable than ourselves. And I am pleased that we can still value the wolves of our imagination while petting the dog that stands at our side.

1. The visceral distinction between husbanded animals in production on the farm and their prepared flesh on the customer’s fork is linguistically preserved in English as a relic of the Norman Conquest. The words “pig” and “cow,” from Old English, are retained for livestock, while the words “pork” and “beef,” from Old French, are kept for meats in the kitchen and on the table. This verbal distinction came about when the Anglo-Saxon field hand still worked to raise the food for his Norman overlord.

2. We have friends who once had a pet dog that had been bred from a line intended for pulling sleds in races like the Iditarod. Its genetic mix was part Siberian, part husky, and part wolf—the latter added for endurance in long races. This particular animal was devoted and loyal to its humans, but it did maintain a certain aloofness and dignity which the owners attributed to its wolf nature. For example, the dog would enjoy cuddling and being petted but would not allow you to touch it with your feet. The animal seemed to sense that feet were different from hands and represented an indignity.

3. However, the people who think we’re “destroying the planet” need to get out of their apartments in Berkeley. Great tracts on this continent—and on most of the others—still manage themselves pretty well under natural conditions. Humans, for the most part, live clustered along the coasts and in the river valleys. We are still thin on the ground over much of the Earth.

Sunday, May 7, 2017

On Safety Measures

Human nature seems to have a built-in equilibrium system. We are very good at evaluating situations and systems and then invoking a sort of psychological compensator.

Take, for example, the latest models of full-featured cars. They all come with advanced safety options like lane-departure warnings, backup cameras, adaptive cruise control—which adjusts its speed to that of the car ahead and even stops—and other technology that helps the human driver guide and control the car. These vehicles are halfway between the old-fashioned automobile with just human hand-and-foot controls and a clear glass windshield, and the prototype self-driving cars that can follow the road and make complex decisions without human assistance.1

I predict that, rather than make us all safer, these devices—especially lane departure and collision avoidance—will make the person behind the steering wheel less attentive to driving. Many drivers will now think that it’s okay to text or chat on the phone, even with strict laws against these practices, because the car will warn them if anything really important is happening. That’s the psychological compensator in action.

Of course, people still text and chat without any such assistance. When I was doing the long commute down to the biotech company each day, for a while I rode in the company-sponsored vanpool. Sitting high above traffic, we could look down into the cockpit of the cars moving alongside the van. I would routinely see a woman applying eye makeup using the rearview mirror, a man shaving, or someone reading a book propped against the steering wheel. The trouble was, they were all doing sixty miles an hour in dense traffic. But because the road was fairly straight and all the cars around them seemed to be holding their position, these drivers thought it was safe to keep just part of their attention—the eye not getting the makeup, or an occasional glance at the road every other sentence—on the business of driving. With automatic helpers like side cameras and front-mounted radar, these drivers will spend even more of their time on personal business.

In the same way, I’m pretty sure that people who have robotic vacuum cleaners—those flat disk-things that prowl around the room sweeping up crumbs and pet hair—are less conscious about picking up after themselves. And now that every processed food is designed for the microwave, and people are already less conscientious about preparing their own meals, many are careless about reading and following the instructions on the box or can. Just put it in and “nuke it” for a minute and thirty seconds on full power. See how that turns out and repeat as necessary.

People are not necessarily scofflaws or careless. They just have an evolved sense of how to read a situation and judge their own safety and efficiency. Sometimes this sense is faulty—as we can see from those humorous “fail” video clips on Facebook, showing young men attempting parkour jumps from a second-story roof onto a dumpster, or riding a motorcycle at speed up a plank into the bed of a pickup truck. But for the most part people look at their life situation with some precision, and they weigh their expenditures of attention, energy, and time against what they see.

For example, we live in a high-rise condominium with a three-level garage that has one entry point at the top of the structure—the garage is built into a hillside—and an exit on each of the lower floors. Everyone parking there must go some distance around the garage to get to their stall, and many people with stalls on the bottom floor must make two full circuits. Crosswalks are well marked but sometimes blind, and each walkway is protected with a stop sign for oncoming vehicles. Still, most people travel through the garage at about fifteen to twenty miles an hour. Given the available sight lines, that feels like a safe speed. Some people travel thirty miles an hour or more, and that’s just too fast if others are backing out of or walking away from their stalls. And yet the condo association has rules limiting the garage speed to five miles per hour and has posted this limit on every other pillar in the structure.

Five miles per hour is just a fraction above human walking speed. Given the distances involved, that means most people will take ten to twenty minutes to drive from the entrance to their stall, or from their stall to an exit.2 Most people don’t factor that kind of delay into any of the trips for which they want to use their car: starting on the morning commute, getting back in the evening, or just dashing over to the shopping center. Nobody in our garage drives five miles an hour. They don’t have to, because their internal sensor says that three to four times that speed is still safe. There’s no cop around to ticket them—just occasional newsletter blasts from the homeowners association reminding us of the speed limit. And yet, at those higher speeds, massive carnage does not take place.

In the broader society, we have town councils, state lawmakers,3 and federal legislators passing all kinds of rules and regulations designed to make people safer. For example, the State of California posts sixty-five miles per hour as the freeway speed in most areas, only going up to seventy on certain long-distance freeways out in the countryside, like Interstate 5, where the sight lines stretch for miles. I can tell you from experience that if you’re not doing seventy-five or eighty on the road, you’ll get run over. And I have had Highway Patrol cruisers careen past me—sans lights, sirens, or any other sign of authority in a hurry—as if I was blocking their lane. When all the fish in the stream are breaking the speed limit by ten to fifteen miles an hour, I guess you save your tickets for the ones doing ninety or a hundred and weaving in and out of lanes.

The city might put a stop sign at every corner and a traffic light every other block, and people would still roll through if they could see that nothing’s coming at them for a quarter-mile. More safety measures, especially those that fly in the face of a human being’s internal evaluation of the situation, don’t make us safer. They just make us feel guilty—well, mildly—as we go about our business.

When cars are truly self-driving, like little personal buses, then we won’t even bother to look out the window or close the door when we get in or out.

1. See The Future of Self-Driving Cars from March 12, 2017.

2. I’ve walked through the garage—and I’m a fast walker. It takes time to make a full circuit.

3. Don’t get me started on the Proposition 65 warnings about “chemicals known to the State of California to cause cancer and reproductive harm.” They are posted on virtually every building and enclosed structure because, hey, modern life is full of chemicals. Who among us pauses in thought and then decides to stay outside?

Sunday, April 30, 2017

On Sincerity

When I was at the university in the late 1960s, the campus revolution was just getting under way. Although it was mainly fueled by anti–Vietnam War protests, the student demands spread in all directions, calling for a virtual redefinition of the university structure and of society itself. In coursework, the new demand was for “relevance”—meaning politicized teaching of the correct sort—and started the movement to dump the collected works of William Shakespeare for the collected works of Eldridge Cleaver. In addition to “relevance,” the other spiritual demand of the time was for “sincerity.” This would be in preference, I suppose, to blatant hypocrisy.

One of my former philosophy professors, when questioned in a student-led colloquium, stated: “Sincerity is a trivial virtue.” I knew immediately what he meant: many other human virtues are far more important. I would prefer a person who keeps promises, pays debts, abides by contracts, performs acts of kindness and public service, takes care of family members and friends, treats other people with respect, smiles politely, and otherwise behaves in concrete ways. Whether the person “really means it” or is “faking it” is far less important to me.

This preference is, of course, from my own point of view. If a person says “Thank you” when I give them something or perform for them some small service, it makes me feel good. Whether or not the person actually means it, or feels truly grateful—that is, is sincere about this minor politeness—does not matter to me. Granted, if the person is grimacing, making faces, rolling their eyes, or using a sarcastic tone, to imply that no thanks are actually involved, then I know that the words are not meant sincerely. But my hurt comes not because of their lack of sincerity, but because of the implied mockery, as if my small action was really beneath their notice, or not kind and helpful at all, and thus deserving of their scorn. Otherwise, if a person says “Thank you,” even if it’s a murmur and there is no eye contact or other sign of heartfelt emotion, I can accept this as an empty politeness from an obviously well-trained, civilized individual.

Politeness is the verbal grease that keeps us descendants of howler monkeys from screaming in rage and trying to kill each other.

From the speaker’s point of view, the sincerity—or lack of it—in the exchange is a small measure of the state of that person’s soul. The saint or the deeply feeling person who says “Thank you” with sincere gratitude, virtually blessing my small gift or act of service with reciprocated good wishes, is expressing their own feeling of being at peace with the universe and gladness upon recognizing and being recognized by a fellow human. The hurried person who murmurs “Thanks” out of pure reflex, the ingrained habit of good breeding, and is unconscious of any felt gratitude, is at least practicing that verbal grease which keeps us all functioning. And the snarky person who sneers and rolls their eyes, loading that “Thanks” with double meaning, is spreading their own bile and cynicism, fouling the gears of civil discourse. That little bit of intentional meanness is hurting them, corroding their soul, far more than the momentary confusion and pain it causes me.

In human interactions, the measure of sincerity is much like the Turing test for artificial intelligence.1 If you cannot tell whether the person is being sincere or not, it doesn’t matter who’s typing on the other side of the wall. You accept the person’s statements or intentions at face value and move on.

A society that valued sincerity as a primary virtue would be far different from our own.

Yes, I know the intent of those early student demands. By rooting out hypocrisy—the evil of paying lip service to popular principles but then regarding oneself as free to act in accordance with private intentions—the promoter of sincerity hopes to bring those hidden intentions to the surface. When you demand that people act sincerely, you expose falsehood and can then hope to enforce proper action. People who can say one thing and do another would be revealed as perpetrating a hoax on the society around them.

But the purpose of the exercise will backfire. A society of people who are forced by cardinal values to always say and do what they mean and what they are feeling at the moment will be a harsh and abrasive society. “Gee, Grandpa, that’s a measly five dollars you put in my birthday card.” “No, lady, you don’t get any ‘thank you,’ because it’s your job to pour my coffee.” “You men all think I can’t open a door for myself, you bastards!”

And we have before us the example of the most strongly politicized societies—which are usually the goal of those who would most earnestly promote the virtue of sincerity—as hotbeds of rampant insincerity. There people will loudly proclaim the party line, sing the party songs, and march in lockstep with the party cadence, while secretly loathing the party and all its purposes. And the more the party demands of them proper feelings of allegiance and respect, the greater becomes their popular hypocrisy—but always well hidden, driven underground. People are just ornery that way.

No, people still own the real estate inside their heads. They need the space, the personal freedom, of being able to think one way and act another. They need to smile when they are tearful, to force a polite response when they want to scream at you, to turn away with a murmured courtesy rather than engage and share their deepest thoughts. Humans have always been a bi-level species. We have always used meaningless courtesies to smooth over differences between individuals that would otherwise have us howling all the time. Similarly, we use formalized, ritual diplomacy to moderate relations between nations that would otherwise have us always on the brink of war. Hypocrisy and insincerity let us pick and choose our battles. They allow us to live.

So yes, while we would like to think that our friends and family are always sincere in their expressions of love, gratitude, and contentment, that the barista at Starbucks is doing us a favor that deserves our thanks, and that corporate executives mean in their hearts every word they put in a press release—it is not always so. And our indulging the small hypocrisy of not really noticing—that, too, is part of the social grease that makes life tolerable.

1. Computer pioneer Alan Turing in 1950 proposed that, to test a computer for artificial intelligence, we station a person on one side of a wall and have them communicate with a respondent on the other side through typewritten messages. The first person does not know if the other is a real human being or a very fast and well-programmed computer. If the person on this side cannot tell after about five minutes whether the respondent is human or not, then the responding machine is artificially intelligent. This test has since been superseded by others that measure specific outputs and performance in more dimensions than simple chat, because mindless, confabulating language processors, like ELIZA in the mid-1960s, easily passed the Turing test but were hardly intelligent.

Sunday, April 23, 2017

On Teaching Writing

I can’t teach another person how to write. I don’t think anyone can. This is not to disparage those who teach writing courses and run workshops and retreats for beginning writers. Some of the basic skills are necessary and teachable. New writers also believe that the art includes structures they need to learn and professional secrets that can be taught. And I am not ungrateful to the excellent English composition teachers I had in high school and college who did teach me a thing or three. Finally, every new writer needs someone in a position to know something about writing who will read their work and give them both positive and negative feedback, because that builds confidence. But the art itself, the essence of writing—that can’t be taught, because it grows from inside a person.

Every new writer needs to know the basics. Becoming a writer is impossible without knowing the language in which you will write. For English, and for most other Indo-European languages, that means understanding its grammar, the parts of speech, verb tenses, slippery concepts like mood and, in English, the subjunctive, as well as sentence structure and diagramming, the rules and the malleability of syntax, a focus on words with vocabulary and spelling drills, the difference between a word’s denotations and its connotations, and on and on. It also helps immensely to study other languages that have contributed to your native tongue—as I did with French, Latin, and Greek—as well as one or more parallel languages—as I did with Russian and Portuguese—in order to recognize cognates and word borrowings, and to puzzle out the meaning of new words and gain a taste for their flavor.

The art of writing also has some broader structures that the novice can learn by rote. In the nonfiction world, the writer can study the organization of an argument, moving from the specific to the general or from the general to the specific, as well as the logical fallacies that invalidate an argument in the eyes of any educated reader. A journalist learns the inverted pyramid structure, where the most important facts of the news story—the Five W’s of who, what, where, when, and why or how—occupy the lead, while other details and analysis necessarily follow. An essayist learns to find a hook in everyday experience—such as a common question or problem—that will draw the reader into the thread of the argument. A technical writer learns to break down processes into discrete steps and to address each one, with all its variables, separately and usually in chronological order.

In the fiction world, there are fewer formal structures to observe. Short stories are usually more compressed in time and space, and involve fewer characters, than novels.1 Most stories of any length are structured around some kind of loss, struggle, journey, or other contrivance that serves to keep the action moving forward. Most of them arrive at some kind of climax, where all the characters, plot lines, and problems come together and are addressed or resolved. And most also have a denouement, which tidies up any last unresolved plots and suggests how the characters will spend the rest of their lives. A playwright learns how to frame a story into separate acts, and how to build the action toward a break at the end of each one, or toward an apparent resolution of at least part of it. A playwright also learns to convey character and action through dialogue, since excursive histories and graphic action are not possible on a closed stage. A screenwriter must also learn a more formal writing structure, which involves setting separate margins for action and dialogue, so that one page of script roughly equals one minute of screen time, and putting certain words associated with sound or visual cues in all caps, so that they will be noticed and given proper treatment in production. To be successful, the screenwriter also must obey current conventions about act structure and timing.

But aside from these generalities, the art of writing is something that either takes you into its confidence—or it doesn’t. Your mindset, daylight dreams, life experiences, and past reading either prepare you to tell stories, paint with words, and sing cantos to your readers—or they don’t.

Two years ago, I decided to make a formal study of music because, while I have always loved listening to music both popular and classical, my knowledge of it remained rudimentary. I wanted to be able to make music as well.2 I also wanted to keep my brain active and alive as I enter my later years, and learning new skills and tackling new challenges seemed like a good idea. I began taking lessons on the keyboard—generic for pianos, organs, and synthesizers—and bought myself a Hammond drawbar organ with two keyboards, presets, vibrato and chorus, the Leslie function—don’t ask—and a set of pedals.3 I chose to learn the keyboard because it would teach me about chords and voicings in the way a single-note instrument like the trombone could not, and it had a fixed sound in the way a stringed instrument—which needs to be constantly tuned—does not.

My teacher, who is a lifelong musician himself, had me learn scales and taught me the Circle of Fifths, which unlocked for me the structure of music. I already had some sense of how notes are represented on the staff, what the timing intervals are, and other parts of musical notation. I now learned how chords are structured, with all their variations, as well as chord progressions, which appeal to the ear and are the basis for most popular songs. This was like learning grammar and sentence structure as a prelude to writing.
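For the curious, the Circle of Fifths falls out of simple modular arithmetic: a perfect fifth spans seven semitones, and because 7 shares no common factor with 12, stepping by fifths visits all twelve keys before returning home. Here is a minimal sketch of that idea (the function name and the sharps-only note spellings are my own illustrative choices):

```python
# The twelve notes of the Western chromatic scale, sharps-only spelling.
NOTES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def circle_of_fifths(start="C"):
    """Step through all twelve keys by perfect fifths (7 semitones each)."""
    i = NOTES.index(start)
    return [NOTES[(i + 7 * k) % 12] for k in range(12)]

print(" ".join(circle_of_fifths()))
# C G D A E B F# C# G# D# A# F
```

Read forward, each successive key adds one sharp to its signature; read backward (by fourths), each adds one flat, which is why the circle organizes key signatures as well as chord progressions.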

My teacher has also taught me to differentiate harmony from melody, to break down a new piece of music into its chords and their roots, to play those solo at first, and only then to work on the melody. This prepares me both to play the song and to accompany a singer or other band members. I also learned to blend the two functions of harmony and melody through voice leading. I learned to keep time in the bass and to do bass walks—although my timing is still faulty. I am now learning blues scales and their progressions. This is all like learning the various structures of nonfiction writing formats, or the differences between a short story and a play.

But … after two years, I am still at the stage of analyzing and interpreting, of working the thing out intellectually rather than emotionally. I am working my left hand to form chords or walk the bass, my right hand to play melody or voice lead, but the two are not yet coming together. I approach each new song as an exercise in deconstruction. A song is an intellectual challenge, not an act of personal expression. I can make music, but I don’t yet make it sing.

This is the essence of art. In music, you can learn notes, scales, chords, and progressions—but something inside you must open up and sing. In writing, you can learn grammar, vocabulary, and rudimentary structures—but something inside you must catch fire with story. A teacher cannot tell you how to light that fire. Oh, he or she can light matches and flip them at your head. A teacher can spot the small sparks that sometimes appear in your writing, point them out to you, praise them, and try to nurture them. But if the student’s mind is—figuratively—damp wood, then nothing will catch.

Armed with the basics of language and structure, any writer still must eventually teach him- or herself how to make a story come alive. In part, we do this by reading widely and observing what other writers have done and how they do it. For this, I love to study how a tale is formed in a short story or novel and then remade into a movie. Between one form and the other, the essence of storytelling emerges, stripped down and rebuilt, like a butterfly from a caterpillar’s chrysalis. As writers, we can also ask questions of the stories and novels we read: How did the author do that? What was he or she trying to achieve? Why does this feel right (or wrong)? We also absorb, as if by osmosis, what constitutes good taste in storytelling and what leaves a dull thud. And, of course, we learn by doing, by trying out new approaches, by seeing what works for us in our own personal style, how to create and move characters, by alighting on the forms and structures that work, by discarding the techniques and tropes that seem awkward and excessive.4

Writers learn by writing and adapting stories to their own personal taste and style, just as musicians learn by playing and adapting songs to the rhythms and harmonies that personally move them.

Ultimately, anyone with a sense for language and logic can learn to write an adequate newspaper article or a technical manual. These are about facts and only need an awareness of the reader and what he or she wants—or needs—to know in order for the writing to work. But stories involve something more, the dimension of emotions, of aspirations, desires, fears, and disgusts. Storytelling must look inside the writer’s own self and his or her own experiences to find the expression of ideas and emotions that will touch a similar chord in another human mind. Stories are as much about being human as they are a collection of words, imagined conversations, described actions, and resolved plot lines.

This is why I believe machine intelligences may one day write adequate newspaper articles and technical manuals, but they will never excel at writing fiction. Not, that is, until machines become so complex and involuted themselves that their programs resemble the human mind. Humans live in a shadow world: part of our daily life revolves around the facts and consequences we glean from the external world, and part lies in the interpretations we place upon them. And these interpretations, our attractions, aversions, and distractions—the push and pull of hot and cold feelings, towards and away from various thoughts and objects—are shaped by all of our mind’s unexpressed desires, hidden agendas, disguised hatreds, and other emotional influences which lie buried in the subconscious and only come out in random thoughts and in our dreams.

If the writer of fiction does not touch this subconscious level,5 then the story will remain a mechanical exercise, a study of forms. It may involve extremely well-crafted characters carved from clues written on bits of paper drawn out of a hat. It may involve an intricate plot filled with actions traced on an Etch-a-Sketch. But it won’t make the reader glow with recognition, or identify with the situation, or even much care. That kind of story comes from within, and it’s nothing that one human being can teach another how to create.

1. Some fiction classes might also go into technical details that—for me—are esoteric to the point of disappearing into the aether, such as the differences between the novelette and novella and the novel. Aside from placing limits on length, I don’t know how these forms differ, other than being longer and more complex than a short story. How long should any piece of fiction be? Long enough to tell the story and satisfy the reader. Any other consideration is the business of publishers, such as how much paper and ink they will need to buy.

2. When I was in fourth grade, I started playing the trombone, along with dozens of other students who were being introduced to the school band with their own instruments. I could read music to the extent of linking notes on the staff with positions on the slide. I understood the timing of notes and tempo. I grasped that flats were a half-tone down, while sharps were a half-tone up. But I never understood the structure of our Western, twelve-tone music which makes the white and black keys of a piano a physical necessity. I never associated all those sharps and flats written on the staff at the start of a piece of music with anything other than the composer’s prefacing remarks; so I did not understand how they changed the playing of notes that appeared halfway down the page. The idea that music had different scales and keys, and that they were all part of a greater structure, was never fully explained to me. Naturally, I was terrible at playing the trombone.
       Nevertheless, my hunger to make music persisted through the years. I tried at various times to teach myself the violin, the guitar, the Chapman Stick®, and the French horn—all without success. Then, two years ago, I finally got serious and began studying music as a thing in itself, and I started taking lessons on the keyboard.

3. When I was growing up, my Dad bought a Hammond organ with rotary tone wheels and learned to play it. I never actually played the thing, but I did goof around on it, fiddled with the drawbars, and listened as the various pipe lengths and their voices came together to make a sound. So this instrument was more familiar to me than a piano and less bizarre than a synthesizer.

4. Early in my writing career, I heard an interview with Marilyn Durham about her novel The Man Who Loved Cat Dancing, where she mentioned the problem of getting a character to walk through a door. This is harder than it sounds, because any writer grapples—as I was grappling at the time—with how much to show and tell in every action. Do you describe the door? Do you show the character twisting the doorknob? Do you use the sweep of the opening door to describe the room inside? My epiphany then and my practice ever since is that, unless the door is important and hides something tantalizing, ignore it. Start the action inside the room. Doors are for playwrights who have to get their characters onto a stage.

5. See Working With the Subconscious from September 30, 2012.

Sunday, April 16, 2017

Objective, Subjective, and Narrative

Is the world external to ourselves, internal in our minds, or shared with others through the process of narrative? To answer that, we first need to decide what we mean by “the world” or “reality.”

Certainly the brain as an organ of thought can have no direct experience of the world outside the bony skull. It can have no actual sensation of any kind. Even “headaches” are a matter of blood pressure, muscle spasms, and pain in the nerves of the head and neck, not in the cerebral lobes. The only events that the brain can detect directly are cerebral accidents like strokes and concussions, and then most of those events are mediated by consciousness and its loss, or by changes in brain function and awareness, rather than by direct detection and assessment of the damage. Our minds are circuit ghosts that run through a network of neurons, rather than the neurons themselves.

Everything that we think of as reality and the world around us is conveyed to the brain by nerve stimulation. This sometimes occurs directly, as with nerve endings embedded in the skin and other tissues, yielding the senses of touch, taste, and smell. Mostly—and for the senses our minds are most aware of and involved with—the nerves connect with organs that receive signals from and interpret the external environment: the eyes to detect light, eardrums to detect pulses in the surrounding air, the inner ear to detect the pulls of gravity and acceleration, and so on. Each of these organs, as well as the sensors in the skin, nose, and tongue, reports its findings to the brain all the time.

Here is where the subjective/objective problem starts. The brain can process all these inputs, but the mind—the active awareness, the collective effect of those “circuit ghosts”—cannot pay attention to all of this simultaneous and often conflicting information. The brain of a developing organism automatically sets up hierarchies of stimulation awareness that govern the mind’s picking and choosing among the continuous flow of signals. For example, we react to the sound of our own name even among a jumble of overlapping restaurant chatter or a stream of messages through a public-address system. The loudest noises and the brightest lights and colors get priority in this hierarchy and pass directly to our awareness, because they are usually associated with danger. The brain’s monitoring of the edges of the retina—our peripheral vision—gives priority to movement there over signals that indicate outline or color, because the movement of things around us is also usually associated with danger and more important than their shape or color.

So, even though our brains may receive accurate reporting of signals related to sight, sound, and touch, they don’t always reach our awareness for interpretation and action. In this way, the objective data become partially subjective. Of course, much of that external reality is not subject to either mediation or reinterpretation by the brain. Fall out of a window, and you will find that gravity takes over and you hit the pavement whether you think about it or not. In the same way, explosions, collisions, accidents, and intentional violence all have a reality that is not brain-mediated and subject to our awareness.

Human beings have also developed mechanical instruments designed to isolate and mediate signals from the physical world before they reach our senses for interpretation. For example, a spectrometer can pick out the actual wavelength of a beam of light. It can measure and tell a human observer that the photons in what appears to be blue light are oscillating with a wavelength of 440 nanometers, or green light at 530 nm, red light at 700 nm. Even if you have been raised by aliens or by color-blind humans, so that you traditionally call red light “green” and vice versa, the spectrometer can give you an accurate reading of the light as it is defined by the majority of other human beings. In the same way, an electronic tuner can identify the pitch of a note, say Concert A at 440 Hertz, even if you are personally tone-deaf.1
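As an aside, the spectrometer’s nanometers and the tuner’s Hertz are two faces of the same quantity, frequency; for light, frequency follows from wavelength as f = c/λ. A quick sketch of that conversion, using the wavelengths mentioned above (the helper function is my own, for illustration):

```python
# Convert a light wavelength to its frequency: f = c / wavelength.
C = 299_792_458  # speed of light, meters per second

def light_frequency_thz(wavelength_nm):
    """Frequency in terahertz for a wavelength given in nanometers."""
    return C / (wavelength_nm * 1e-9) / 1e12

for nm in (440, 530, 700):  # the blue, green, and red examples above
    print(f"{nm} nm -> {light_frequency_thz(nm):.0f} THz")
# 440 nm works out to about 681 THz; 700 nm to about 428 THz
```

Concert A at 440 Hz, by contrast, is a frequency measured directly; its numerical coincidence with 440 nm blue light is just that, a coincidence.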

So, even though the brain mediates, selects, and interprets the reality coming in from the sense organs, a measurable and appreciable—that is, objective—reality does exist outside of human awareness. Still, awareness plays a big role in interpreting and providing context for that external reality. Is red light more pleasant to the brain and to the awareness “ghost” that rides inside it than green or blue light? That is a matter of subjective interpretation. Does a red light in a traffic signal mean “stop,” while green means “go,” and yellow means “do you feel lucky”? That, too, is a matter of interpretation based on learned—or, in the case of yellow, badly learned or misremembered—rules governing human activity.

The truth, as I see it, is that reality exists objectively but is interpreted subjectively. And when we go about putting reality into words—as I am doing right now, or when I try to relate an experience or tell a story in one of my novels—then the process becomes even more subjective. The idea, the story, the sense data that my brain recalls and tries to fix in words which other humans can then read, understand, and so vicariously experience becomes subject to the limits of my vocabulary, the meanings which I have learned to attach to certain words, the connotations with which those words resonate in my own private thoughts, and the choices I make among all these variables, on the fly, and in the heat of creation. While a piece of writing—a novel, a poem, or a news report—may become a crystalline distillation of reality for an enthusiastic reader, it is at the same time more and less than an absolutely faithful recording of the originally perceived reality.

In the same way, when a producer, director, cast of actors, and crew of set dressers, camera operators, and computer-graphic artists set about to recreate a beloved bestselling novel or even an original screenplay, they will be creating a new interpretation based on the particular skills, vision, and choices of each member of the production company. It may be, in both the author’s and the audience’s view, a crystalline distillation of the story, but it will still be different from the act of reading the novel or screenplay, or living the reality as it might have occurred in real life. While a novel or screenplay may have had no existence “in real life,” the same acts of selection and interpretation—of words in a book, or of skills and vision in a movie production—apply to the creation of a biography or a biopic about a person who actually lived and had a historically measurable effect on reality. The result in the telling will be different from the reality of living.

So we, as aware human beings, dance back and forth across the line separating objective and subjective all the time. We live in our minds, but our minds also participate in a reality that affects other human beings.

And now a third consideration has come into our lives. Based on the theories of postmodern literature and our evolving social sciences, we must deal with “the narrative.” This is more than the plot and story line of a single novel or movie. This is a shared construct of political and social—if not indeed physical—reality, put together from the writings of influential theorists, reporters, academics, and others who get paid for expressing their opinions. This sociopolitical narrative becomes the basis for new novels and movies, for the selection and interpretation of news stories, for more editorials, blogs, and opinion pieces, and for the crop of memes—those bits of ideological weed that would seem to crystallize one aspect of the larger narrative in one instant of time—that all reverberate in the social, media, and social-media echo chamber of modern society.

For anyone who bathes in the daily waters of politics, economics, opinion, and written and enacted literature—that is, books and movies—the narrative becomes an intermediate reality. Outside the brainbox of the skull, there exist both the objective reality of blue light at 440 nm or Concert A at 440 Hz and the narrative reality in which certain people are automatically good and virtuous, while others are wicked and demonic; certain historical facts are remembered and cherished correctly, while others are dismissed as damned lies; and certain acts or intentions are considered rational and lifesaving, while others are known to be foolish and cruel. And when a human encounters conflict between his or her subjective experience or memory and an externally shared narrative which he or she has accepted, either cognitive dissonance or a personal crisis occurs. When a society encounters conflict between its public narrative and external reality, a political, social, or economic crisis occurs. And sometimes the narrative wins—to the detriment of personal or social existence.

This is not actually a new phenomenon, although we now celebrate this kind of consensus in earlier times as “the narrative” and consider it either the obvious end product of a cohesive society or a mindless form of manufactured groupthink. Every religion since nomadic herders began coming together in their summer pastures has spun its own narrative, its greater vision, in which the lonely circuit ghost inside each brainbox partakes. The Greeks had their narrative of the Trojan War, a fictitious relic of a forgotten age—not unlike the Arthurian narrative of chivalry for the English—which shaped their ideas about what men and women were supposed to feel and do, how the gods were to be honored or mocked, and how the best of human intentions can sometimes go awry. The Roman world was held together largely by its own narrative, played out in the minds of emperors, generals, proconsuls, and tax collectors.

Narrative is strong in some people’s minds. Control of the narrative is a kind of power. And narratives that have the charm of simplicity, the echo of truth, and the ability to enforce popular desires will eventually drive out any narrative that is too complex, difficult to verify, or particular in its voice and vision. Narrative persists in any social setting until a harsh reality intrudes—the political party collapses or becomes irrelevant; the society is invaded or encounters some physical catastrophe like an earthquake or meteor strike; the economy is disrupted by some new technology or goes broke through a lack of production or an excess of debt—and then individuals are left to grope among the pieces to assemble their own, subjective views of reality once more.

Which of these—objective, subjective, or narrative—is the true reality? None of them. All of them. They are the blend of physical, private, and public or social needs, drives, and obligations that guides and motivates every human being. To choose one among them and elevate it to supremacy is to miss the whole point of being human.

1. But some perceptions remain an artifact of the mind. When I was living back East, I would sometimes sense the sky on a summer afternoon under certain conditions of humidity as green rather than blue. This was not the bright green of a laser pointer or fresh grass, but the muted green of pea soup. It may have been an actual color, reflecting the green plants in the fields up into the sky, or from moisture high in the atmosphere scattering sunlight so as to disperse the 530 nm wavelength more strongly than the 440 nm. But it sure looked green to me.

Sunday, April 9, 2017

No Utopia

A germ has infected the minds of bright people, deep thinkers, and intellectuals in almost every Western society. Its inception dates back almost 2,500 years to Plato and his Republic. The infection resurfaced in the 16th century with Sir Thomas More and his Utopia, and once more and with even more virulence in the 19th century with Karl Marx and his writings in The Communist Manifesto and Das Kapital. The essence of these various outbreaks—the DNA of the germ, so to speak—is that humanity can reach a perfect state, a lasting condition of peace and plenty and universal brotherhood, if everyone would just let the brightest people among us order society according to rational, scientific, humanitarian principles.

This notion is in stark contrast with the other two principles of social organization. The first of these principles goes back to the monkey troop: let the strongest, smartest, and (hopefully) wisest people in the tribe, confederation, or nation fight among themselves until the (hopefully) best person wins; then we will all obey him (or rarely, her) as our alpha male (or female), chief, king, emperor, or führer. This principle saw its heyday in the English Wars of the Roses. The second principle is less ancient and was developed in Greece at about the time Plato was writing: let people come together, offer and consider arguments, and choose among themselves the best course of action, with the process overseen by a committee of citizen caretakers. With later refinements to address issues like dispersed populations, protection of the rights of the minority, and delegation of the popular will to elected representatives, this principle has come down to us in various democracies and in our own republic.

The main difference between the utopian ideals of Plato, More, and Marx and the other organizing principles of either kingship or democracy lies in the nature of time. A king or queen is supported, followed, and obeyed for only a brief moment in history—even if the king or queen and his or her followers don’t understand this and think accession to the crown is forever. The monarch’s decrees and orders are binding only for a limited period and subject to change with changing underlying conditions. Some decrees may become the basis of popular tradition and so live on for decades or centuries beyond their origin. But a new king with a different appraisal of the political, economic, defensive, or agrarian situation, or new experience of changed conditions, can issue new decrees that point social energies in a different direction. Similarly, the proposals and laws that democratic societies or their democratically elected representatives enact are always of a temporary nature. Some of the society’s core values and principles, such as those bound up in the U.S. Constitution, are intended to endure for the ages. But even these sacred documents are always subject to interpretation and amendment. The essence of either the kingship principle or the democratic principle is that law and the social organization which supports it are flexible, subject to the imagination and understanding of the people being governed, and able to respond to changing political, economic, and other conditions.

But the bright minds of the theorists—Plato, More, Marx, and all those who sail with and believe in them—look for a social order that transcends the changing nature of human understanding and imagination. They aim at the end of politics, economics, military aggression, agricultural variation, technological invention, and other underlying conditions. They expect to initiate the end of history itself. Their social and political views are flexible right up until the new and improved social order that they have devised takes over and operates so perfectly that nothing further can change. When you reach perfection, there’s nothing more to say or do.

Of course, human nature abhors perfection. We are restless creatures, always thinking, always plotting and planning, always sampling and judging, always looking for greener grass and trying to breed redder apples, always wanting more, always itching with a kernel of dissatisfaction. This itch is what motivates the bright minds looking for utopia in the first place. But those minds never stop to consider that restless humans, given an eternity of bliss, will soon want to reject their perfect state and move on to something even newer and better. That is the lesson of the Eden story.

The more hard-core, dirigiste thinkers among those bright minds have already concluded that human nature will ultimately need to be changed in order to fit into their perfect societies. People will have to become more self-sacrificing, more contented, less quarrelsome, more altruistic, less greedy, more … perfect, in order for their new and enduring social order to function. Stalin famously supported the researches of Trofim Lysenko, the agrobiologist who taught—counter to the principles of Mendelian genetics—that acquired traits, the products of nurture, could be inherited by later generations. Lysenko was working with plants and seeds, but Stalin and his acolytes believed that the same technique could be worked on human beings: get them to change their behavior, rethink their own natures, become the new Homo sovieticus—and human nature would be changed for all time to conform with Marxist-Leninist principles.

People do change, of course. While genetic mutations, especially those affecting brain function, tend to be slow and often self-canceling—particularly when they work against social norms—people are always susceptible to new ideas. Religions with their coded values and ethical propositions can sweep across populations much faster than any physical mutation can sweep down the generations. Christianity raises the values of love, reciprocity, and cooperation. Islam raises the values of uniform belief and submission to religious authority. Buddhism raises the values of right thought and action in a hurtful world. And, yes, Marxism-Leninism raises the values of personal selflessness and obedience to political authority.

But these core value propositions are still subject to change. Inspiration and revelation harden into orthodoxy, become challenged by disputation and reformation, and succumb to new and different inspirations and revelations. The one exception to this change process would seem to be the scientific method, which is a form of either anti-revelation or continuing revelation. Arising in the work of 16th and 17th century thinkers like Galileo and Descartes, the method values observation and experimentation as the only support for—or disproof of—conjecture and theory. By its nature, the scientific method is flexible and perpetually adjusts its findings to changes in—or new discoveries about—those underlying conditions.1

We’ve been riding a wave of technological change derived from the scientific method for four centuries now. Observation, hypothesis, experimentation, and analysis have variously given us the steam engine, telephones, radio, television, computers, evolution, and genetics as well as far-reaching advances in our thinking about physics, chemistry, and biology. Human life is qualitatively and quantitatively different from what it was four hundred years ago in those societies that have embraced science as an organizing principle along with the Western tradition of personal liberty and free-market exchange of ideas, goods, and services.

And yes, along with our understanding of biology, chemistry, and physics, our understanding of human psychology and social structures has vastly expanded. We have become the animal that studies itself and thinks about its own future—not just on a personal level, but as a social organism. But we are no closer to finding a “perfect” social structure, because human beings are still descended from irritable, distrusting, independent-minded monkeys rather than docile, cooperative, obedient ants or honeybees. No amount of religious indoctrination, state orthodoxy, or applied Lysenkoism will remake the mass of humanity into H. sovieticus.

Get over it, bright minds. In the next hundred or a thousand years, we may reach the stars, transmute the elements, and be served by mechanical intelligences the equal of our own. But we will still be irritable, distrusting, independent-minded creatures, half-angel, half-ape, and always looking for greener grass and redder fruits. And that flexibility of mind, combined with our stubbornness and independence, is what will keep human beings evolving and moving forward when more perfect creatures and more orderly societies have vanished from this Earth.

1. To quote physicist Richard Feynman: “It doesn’t matter how beautiful your theory is; it doesn’t matter how smart you are. If it doesn’t agree with experiment, it’s wrong.” Unfortunately, these days some of our lesser scientists—and their lay followers—seem to think that scientific propositions can be “settled” for all time and somehow made immutable.