Sunday, May 27, 2012

Writing the Great American Novel

Anyone who has announced, either shyly or boldly, to family and friends that he or she either plans to write, or has started to write, a novel hears some form of this comment: “Going to write the Great American Novel, are you?” It may be said with a smile. It may be meant kindly. But the underlying message always contains just a touch of condescension: “What? Little you? Going off to capture the Great White Whale with your itty-bitty hook and line?”

The Great American Novel is, of course, a myth and a put-down in itself. The idea goes back at least to the 19th century, when British writers held the keys to English letters and America was a distant place full of brash Yankees, drunken cowboys, painted savages, and hopeful fools. No one could capture that much absurdity between two covers. There is no Great American Novel, except in the imaginations of those for whom America is a strange and somehow unreal place that can, they believe, be safely dismissed.

But capturing the spirit of a country or an age is not really the first job of a writer. The only tool any writer has is a lamp against the darkness. It might be a floodlight or a pencil-beam flashlight. But it’s how the writer finds and illuminates what he or she knows, or supposes to be true, or wishes were so. Any novel is a mixture of one part observed reality, one part imagined reality or fantasy, and one part a reflection of the novelist’s own personality, biases, hopes, and fears.

Over the years many great writers have written what, for their time and place, could be called a Great American Novel—often a couple of them, standing side by side in their oeuvres. Nathaniel Hawthorne portrayed the Puritan settlements of New England in the 17th century. Henry James captured a certain style of East Coast upper-class life in the late 19th century, just as Mark Twain described the Midwest middle- to lower-class life of about the same time. Thomas Wolfe caught the New South in the early 20th century; John Cheever portrayed the East Coast again at mid-century; and the other Tom Wolfe defined the fragmentation of American culture late in the 20th century. Anyone could name a handful of others who accurately depicted their times and places—Ernest Hemingway and the fragmentation of a world between two great wars, for example—but this handful contains my favorites and to some extent my influences.

All of these writers capture an aspect of America and her roots, but none of them does so definitively. They can’t. Henry James’s Isabel Archer from The Portrait of a Lady does not even exist in the same universe as Mark Twain’s Huckleberry Finn and his eponymous Adventures. And yet both are true aspects of America’s many-storied past.

To suggest, even jokingly, that a young writer should try to capture some universal essence from the American experience does that young spark a disservice. Universality is not the goal of a writer. With time and patience—and many books in the rear-view mirror—a universal viewpoint may come in some fractured form, as it did for James and Twain and my handful of others. But the new writer does best to focus on a single time and place, a type of character, a particular point of view—and try to do it justice.

Another suggestion that young or beginning writers hear is: “Write what you know.” That works just fine if you’re an Ernest Hemingway, who traveled the world, survived participation in World War I and the Spanish Civil War, was on intimate terms with exotic sports like big-game hunting and bullfighting, and was a connoisseur of great whiskies. For most of us, however, what we know is the backyard—ground that Emily Dickinson covered pretty thoroughly; our home town—ditto Thomas Wolfe and Thornton Wilder; our years in high school and college; and one or two not very interesting jobs.1

The admonition to “write what you know” is an invitation to delay the actual writing process and go off to get yourself nearly killed by poking into war zones, attempting all sorts of dodgy transactions and trades, and engaging in romantic affairs. Then you, too, can write just like a Hemingway. Except that—as no one stops to think—that job was already taken. Ernest Hemingway did it brilliantly, and the current crop of foreign correspondents working for the cable news channels are already planning to mine the world’s existing troubles for their own books.

The better suggestion should be: “Write what fascinates you.” Every writer has an abiding interest.2 If you turn your lamp on that, you will have a built-in edge on other writers and a leg up on acquiring the knowledge and level of detail that will make your writing authoritative—not to mention pleasing to like-minded readers. Your knowledge will expand easily, almost effortlessly, as your cycle of work grows. And if you are very lucky and very skilled, you will be able to entice new readers who know little or nothing about your subject—and then you can expand their awareness and maybe even spark their own interest.

The reality is that readers don’t look for some kind of grand “summing up” of a nation or an age when they buy a book. They look for a good read, a new experience, and insight into some subject that has always kind-of attracted them. A few new facts, the answers to some nagging questions, and a bright and hopeful attitude are always welcome. But, at heart, they want to be entertained.

It’s called a novel for a reason. The book is something new—at least to the reader who has just picked it up. What readers want is not the familiar ground of “my backyard,” or “life on the Mississippi,” or “life among the ruins”—or rather, not only that. They want you to take them someplace they can recognize, yes—that part of your writing which is based on accessible reality. But they also want you to take them someplace new and strange—the part which comes from your (and their) wishing things were so.

Simply take them to that place, and you will indeed write a great novel.

1. Actually, I had some pretty interesting jobs. The early years as a book editor were a good grounding in my craft and how it fits into the mechanics of publishing. But then I took my writing and editing “out into the field,” as it were: working for an engineering and construction company, an electricity and gas public utility, an oil refinery, a pharmaceutical company, and a maker of biotech instruments and reagents. In addition to learning a lot about business, I learned how most of the technical side of the 20th century and modern civilization was put together. That’s useful knowledge to mine for book settings in my later career as a novelist.

2. Mine has always been technology. That comes from being the son of a mechanical engineer and grandson of a civil engineer. (My mother was a landscape architect, however, and her father a lawyer and judge; so I do have a softer, more natural, if no less analytical side.) My first toy was a pressboard box my father made for me that interlinked patterns of three or four colored lights and switches. I don’t remember what it did exactly, other than flash the lights when you flipped the switches, but that was enough for a three-year-old. Although I never had a head for math and so could not follow in my father’s profession, I remained fascinated by things that move and do stuff. And, true to that box, I’m fascinated to this day by computers and smart phones, anything with a keyboard that takes inputs and a display that gives outputs. As a writer, my day job has always gravitated toward technical subjects like steam engines, engineering and construction, energy generation and distribution, and finally biotechnology and biology. Of course, my early reading included a lot of science fiction and when I started writing, it was the “hard” kind of science fiction—no fantasy or magic, please.

Sunday, May 20, 2012

Awake at 3 a.m.

I have always been an intermittent sleeper, a condition that goes back as far as I can remember, certainly back to late childhood. Regardless of when I turn off the light at night or when I have to be up in the morning, I awaken three or four times a night. Usually, I just glance at the bedside clock, roll over, and go back to sleep. Sometimes I will get out of bed, drink a glass of water, walk through the apartment—although no sound or warning has actually awakened me—pause to look at the Moon, then get back into bed. Sometimes, less often, I will sit up and think for a bit before drifting off to sleep.1

For years I have noticed that the sleep-wake cycle seems to be in roughly 90-minute intervals. If I turn out the light at 11 p.m., I awaken at 12:30 a.m., then about 2:00 … 3:30 … 5:00. Sometimes I’ll miss a waking phase. Sometimes I’ll sleep through for two hours and then awaken. But the overall pattern is discernible.2
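
To make the arithmetic of that pattern concrete, here is a toy Python sketch—my own illustration, not anything drawn from a sleep study—that projects the likely waking times from a given lights-out time, assuming the rough 90-minute cycle described above:

```python
from datetime import datetime, timedelta

def likely_wakings(lights_out, cycle_minutes=90, cycles=4):
    """Project approximate waking times at the end of each sleep cycle.

    lights_out is a 24-hour "HH:MM" string; the returned list holds one
    "HH:MM" string per cycle, rolling past midnight as needed.
    """
    t = datetime.strptime(lights_out, "%H:%M")
    times = []
    for _ in range(cycles):
        t += timedelta(minutes=cycle_minutes)
        times.append(t.strftime("%H:%M"))
    return times

# Lights out at 11 p.m. reproduces the sequence in the text.
print(likely_wakings("23:00"))  # ['00:30', '02:00', '03:30', '05:00']
```

Of course, real cycles drift and lengthen through the night, so this is only the idealized pattern, not a prediction.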

I did not realize the significance of the 90-minute cycle until last year, when I heard Allison Harvey, a U.C. Berkeley professor of psychology and sleep expert, speak at our NAMI East Bay chapter on the benefits of good sleep.3 She explained that human beings cycle through various sleep stages, going down to deeper and deeper levels, and then gradually rising to a short period of dreaming sleep—characterized by rapid eye movement, or REM—every 90 minutes or so. Most people continue sleeping and go back to deeper levels after the REM sleep. The REM periods get longer and longer during the night. Some people tend to awaken in the sixth hour after the third REM period and then fall back asleep for another two hours. I seem to be the exception who awakens more frequently.

Two notions about these short periods of wakefulness have become commonplace with me over the years.

First, if I’m wrestling with a problem—a troubling plot point in a book, say, or a logical or factual error in a recent passage that my subconscious suspects but I can’t quite identify, or a word choice that isn’t quite right—then an idea, answer, or correction will generally pop out when I awaken in the middle of the night. That’s why I keep a notepad and a pencil on the nightstand. If I try to remember the thought by simply holding it in my mind, it will surely be gone by morning. But just by jotting down a word or two—usually in darkness, by touch, using a blind scrawl that even I have trouble deciphering in daylight—I can bring back the sequence of thoughts and move ahead with them.

Second, if I happen to think about my life situation when I awaken in the middle of the night—about some course of action or direction I’ve chosen, some personal exchange I’ve recently had, or even just a summing up of where I stand and who I am—that evaluation will be uniformly negative. My thoughts at 3 a.m. are cold and harsh, judgmental, difficult … sometimes crushing. I used to think that this was the “real me” or “reality.” That what we find when we awaken at 3 a.m. is our actual situation unadorned by the pleasant lies we tell ourselves to keep moving forward during the day. The mask and its smile are stripped away, and 3 a.m. is the hour of the naked soul. These are not thoughts that I tend to jot down on the notepad, but if I remember them in the morning—because they tend to be harsh jolts—I then wonder what had slipped, and where reality actually lies.

Now I think I know.

While sleep itself is a great restorer of bodily function through muscle relaxation, endocrine rebalancing, and natural healing processes, what it does for the mind is more controversial. REM sleep involves a kind of inadvertent hallucination: the mind tells itself stories, sometimes very real seeming, sometimes knowingly fanciful, and usually related only tangentially to anything that might have happened during the preceding day. During REM sleep the body experiences a kind of muscular paralysis, so that the imagined visions, sensations, and actions pursued in dreams don’t result in thrashing limbs and personal injury.4

It’s well understood that people deprived of sleep—whether kept from all sleep for a couple of days, or getting insufficient sleep during their daily cycle—become fatigued, error-prone, clumsy, irritable, and even physically sick. Complete sleep deprivation can cause hallucinations, mania, and finally death. But what part the dreams of REM sleep in particular play in healing the brain and mind—as compared with the other, deeper sleep stages—is unknown. Interestingly, studies in which sleepers were awakened whenever the familiar eye movements and EEG patterns of dreaming began suggest that people can be deprived of REM sleep for quite long periods without obvious harm. It’s the lack of sleep itself that is damaging.

But a night’s sleep is not a single, uniform experience. As noted above, it comes in a regular and repeated pattern: successively deeper sleep stages, followed by shallower stages, then a period of REM sleep, then off to the deep end again. This 90-minute cycle repeats four or five times a night. So we can conclude that whatever sleep does, it achieves in small doses that include both deep, dreamless sleep and light, dreaming sleep.

Scientific theories about sleep and its effects on the brain and mind abound: that sleep lets the nerve cells rebuild their stocks of the energy molecule, adenosine triphosphate (ATP); that sleep supports the selective formation of long-term memories from among the past day’s short-term memory events; that during sleep, interleukin-1 levels rise and immune system functions are promoted; that at a minimum, sleep allows the mind to pause, let go of current concerns and anxieties, work off its emotional buildup, and prepare for a new day. That is, in general, during sleep the brain rebuilds, digests, repairs, and prepares.

What does all this suggest for someone who has suddenly come awake at 3 a.m.?

For the solution to intellectual problems like plot fixes, logical errors, and word choices, the brain is riffling through its index cards of the past day’s events and tossing off random thoughts. If I’ve prompted myself with a thought before bedtime about solving a certain problem, then something related is likely to get bundled into all that activity. If I’m awake at the right time, the solution will pop into my mind, and I can capture it if I write it down. If I simply try to remember it for tomorrow, it will get tossed with the other short-term trash.

For the solution to emotional problems like who I am and where I’m going, the brain is randomly processing and discarding the past day’s emotions. Most of these will be hurtful, doubtful, and damaging thoughts, because those are the things that nag and worry us. Happy thoughts and hopeful expectations we tend to accept calmly as part of our emotional birthright. It would be an error, then, to think of the nighttime shuffling of our emotional baggage as a more truthful form of reality than the mix of hope, expectation, and conscious emotional balance that reflects our healthy daytime mind.

In either case, the wavelike nature of the sleep cycle—with each 90-minute segment undertaking to wash only some part, but not all, of the past day’s dirty dishes—suggests that experiencing just one time point, bang! awake at 3 a.m., will sample only a fraction of the entire process. The problem solution that the brain tosses out is not necessarily the final word on the relevant subject. It’s only a kernel, a fractal of thought and understanding, which then needs to be reexamined and expanded in the light of day with a fully awake mind and a refreshed perspective. The personal evaluation that the brain tosses out is just a fragment of feeling from a personality that’s undergoing repair, a personality in transition between the stages of emotional dissolution and reassembly. That evaluation also needs to be reexamined after the mind has re-integrated itself.

Mid-sleep awakenings can yield a provocatively fruitful source of ideas and inspiration—but they will still need development and evaluation with a clear head. The same awakenings can yield a deceptively convincing judgment on personal worth and direction—which is better left to an unconfused mind.

1. The only time this pattern breaks is when I am sick. Then I might sleep through the night without waking. But when I’m healthy, I tend to sleep in small batches.

2. This pattern might relate to something else I’ve noted about myself: I have a very accurate internal clock. If I need to be awake at a certain time, I will set the alarm out of habit, but I tend to come bolt-awake one minute before the alarm goes off. During the day, I generally know within a few minutes’ accuracy what time it is without looking at my watch. And I am very good at estimating the time requirements for various planned activities and assignments, for travel and connections, and for contingencies. I have a pretty good clock in my head. This seems to accompany my “bump of direction”: I can usually tell which way points north, where the sun is at any given time under cloud cover, and where I’ve been so I can retrace my route. I blame the Earth’s magnetic field for all this.

3. NAMI is the National Alliance on Mental Illness, and I volunteer at our local chapter to write up the speaker meetings for their bimonthly newsletter. See Sleep Better … Feel Better from March 23, 2011, in my NAMI Speaker Notes. The second page of the Harvey article includes a chart of the REM cycles described above.

4. Given that sleep is a particularly mammalian phenomenon—originally experienced during the day, when our mouselike, nocturnal ancestors had to lie low while the predator dinosaurs were awake and active—it makes sense that natural survival mechanisms would keep our bodies subdued despite whatever might be passing through our minds.

Sunday, May 13, 2012

On Outlining a Novel

Writing fiction is easy for me—provided I know what I’m supposed to write. With even a general idea of what the next scene should be and how the action and dialogue are supposed to advance the plot, I’m ready to go. But without this idea, I stare at the computer screen—which once was a sheet of paper rolled around a typewriter platen—and I literally know nothing. My mind is an echoing blank. Gray space between two lobes. Empty. Waiting. At once anxious to begin and bored with having nowhere to go. Not even waiting for something specific to happen, like a train or bus to arrive. Just … waiting.

Now, in order to start writing, I don’t need all the action choreographed in my head: the character does this and his opponent does that, cut and thrust, gambit and response, like the next twenty moves in a preplanned chess game. I don’t need all the dialogue programmed for me: the character says this and his opponent replies, point and counterpoint, remark and laughing return, like note cards laid out in order on a table. In fact, it’s better that I don’t have the scene too precisely defined, because then there would be nothing to discover from within my imagination. But I do need to know there’s a scene needed at this point in the story and how it fits into the overall sequence or plot of the novel.

Oh, and I do need what I call a “downbeat.” That’s the starting point, the first line for me as the writer and the first thought for the point-of-view character—a sight or sound or other sense image, a discovered fact, a perceived disjunction or discrepancy—that gets him or her moving toward the necessary action and dialogue. Without that downbeat, my mind and the character I’m following are like a piston stuck at “top dead center,” pushing at the crank pin without any sense of direction. But once I think of—or sometimes just randomly select1—a downbeat image or thought, my little two-stroke word generator starts up, putt-putt-putt, and I’m off into the dark corners of my mind where all the possible images, actions, and bits of dialogue are waiting to be discovered.

But all of this presumes that there are, somewhere in my mind, characters already chosen and defined at some level and a structure—a plot, an action sequence, a shadow of a life—into which the current scene will fit like a brick into a wall or a domino in a trail. For me, the difference between having a book to write and a book idea that sits idly in a folder somewhere is this sense of structure. Let’s call it an outline.

Some people think outlining is easy. As I’ve said elsewhere, when I was young and just starting out, people would tell me confidently, “There are really only seven plots in all of literature.”2 Writing should be easy, then, because all you have to do is pick one of the plots and execute it with some new characters in a new setting, like Ian Fleming finding a new villain and a different locale for James Bond’s next save-the-status-quo adventure. When I probed these confident people for specifics, however, they would mumble something about “boy meets girl … boy loses girl” and peter out, unable to name more. Today, they might mention “the hero’s journey” after mythologist Joseph Campbell, or talk about “action defines character” and the “three act structure” after screenwriter Syd Field. But these are just general story arcs, like telling the writer there have to be mountains in the landscape, some conflicts and challenges laid out for the main character. Stories, however, live through specifics: Which mountains? What conflicts? What challenges?

So where do stories come from? I can’t speak for other writers; they may indeed have discovered those seven plots or perfected the hero’s journey so that they can smoothly and professionally turn out yards of story per day. But I can no more sit down and “think up” a plot than I could sit down and conjure a sunrise or a rainbow. And I can no more steal another writer’s plot and fit my own characters and setting into it than I could steal Fred Astaire’s shoes and use them to become a dancer. No, for me, the story has to arise organically, piece by piece, from within my own mind.3

So … what are the steps from nothing to something? The way I go about outlining and writing a book is roughly this:

1. Get an Idea

All of my books start with a kernel of an idea. “What would happen if a tiny black hole fell into the Earth and started absorbing the core?” (The Doomsday Effect) “What would it take to recreate the life of Julius Caesar in 20th century America?” (First Citizen) “How would an otherwise honorable man react if one mistake from his distant past suddenly showed up in town?” (The Judge’s Daughter) Right now, I’m working on two new book outlines. One started with the question, “What would life be like for the first generation of people who lived far beyond ‘three score and ten’ through cellular regeneration and cloned organs?” The other started out, “What would happen if a man got sick of his job, bought a boat, took off on a cruise, and never came back?”

These ideas are not particularly new or unique, but something in them caught my fancy, suggested a deeper story line, got me thinking. Because they are not unique, I don’t have any concern about sharing the new book ideas publicly like this. You or any writer out there could start with such a question; the book you end up with will look nothing like mine. Fred Astaire’s shoes, remember?

None of these story ideas is complete and ready to write, either. Each is a kernel, a seed, and not the whole tree. The next step is to revisit that seed at regular intervals and keep asking, Well … what would it be like? And then see if any new buds or branches sprout.

So each of these story kernels starts as a question in a Word document. It goes into either a folder on my hard drive labeled “Book Incubator” or, if it seems particularly promising, into its own folder. Just for curiosity’s sake, I also put a date on the file so I can backtrack and see where it all started.4
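
Just to illustrate, that dating habit is easy to automate with a few lines of Python. This is a hypothetical sketch—the function name, folder name, and filename format are all my illustrative assumptions, not a description of my actual setup:

```python
from datetime import date
from pathlib import Path

def file_story_kernel(question, incubator="Book Incubator", when=None):
    """Save a story-kernel question as a dated text file in an incubator folder."""
    when = when or date.today()
    folder = Path(incubator)
    folder.mkdir(exist_ok=True)  # create the incubator folder if it's missing
    # The ISO date in the filename makes it easy to backtrack later.
    path = folder / f"{when.isoformat()} kernel.txt"
    path.write_text(question + "\n", encoding="utf-8")
    return path

p = file_story_kernel("What would happen if a man bought a boat and never came back?",
                      when=date(2012, 5, 13))
print(p.name)  # 2012-05-13 kernel.txt
```

The point is only the habit itself: one question, one dated file, one place to find it again.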

2. Incubate and Fertilize with New Material

My mind works slowly and mostly at the subconscious level. Characters, plot elements, background details, bits of realization and dialogue have to occur to me, bubbling up from those subterranean catacombs. As I come across interesting articles, book references, or internet sites that bear on the story, I add them to the folder.

The idea that became The Doomsday Effect started in the late 1970s. It went in the direction of the black hole eventually destroying the Earth and forcing humans to hurriedly undertake interstellar flight in generation ships. When that story line turned out to be a dead end, I had to find a way to solve the problem—or adjust the black-hole-sinks-into-Earth premise in such a way that it became solvable. The result was the novel you have.

The idea that grew into First Citizen actually started about the same period, the late ’70s, with the thought of writing a scholarly book on Julius Caesar, about whom there wasn’t at that time a popular biography. The idea morphed into a modern speculative novel when I realized that (1) I have no real taste for scholarship, being a storyteller, futurist, and inventor at heart, and (2) the ancient Romans practiced a complex system of religious and political beliefs full of obscure notions, minor deities, and complex rituals tied to the cursus honorum, or course of public offices, all of which I found tedious to contemplate and difficult to work into a novel. And none of them represented the real essence of the Caesar character anyway. My reading of Joel Garreau’s The Nine Nations of North America at the same time suggested the sort of national disintegration and civil war that had made Caesar’s own life come alive. And so the book idea morphed into a present-day novel about a young lawyer caught up in the civil strife of a politically broken United States.

The story about the man with the boat dates back to my youth, when our family owned small cabin cruisers and spent summer vacations touring the canal systems of eastern Canada. That notion remained untried and amorphous until I thought about putting it together with the son, William Henry, from The Judge’s Daughter and making a sequel with him as a dissatisfied professor of humanities during the political upheavals of the 1960s.

The story about cellular regeneration is much more recent, beginning with some lectures I heard while working at my latest corporate job, writing articles at the biotech company. Of course, just having people live longer is interesting but does not make a story. So the next step was to give it a set of characters from a failed novel I had been trying to write about mining icebergs for the Saudis, drawing on my experiences at the engineering and construction company.

My story ideas tend to gather details, like layers of moss and sediment, for a year or two. Then they meet a disjunction—“That won’t work!”—and suddenly change direction, like the black hole story. Or they languish inert until I realize that they belong with something else in the incubator, like the boat cruise joining the professor’s story, or the engineering family living longer through cellular regeneration.

3. Parcel Action into Chapters and Scenes

When a story idea and its components reach critical mass—and that, again, is a matter of subconscious rather than conscious decision—then it’s time to begin making a book. The material in that “critical mass” includes bits of action and dialogue, written as a way of trying out ideas and attitudes; notions about the overall plot direction and where it will end up; and characters selected with their names, aims and intentions, and positions in the possible action.

Now is the time to begin putting it all together. To use another landscape analogy, so far the plot has only been seen from orbit. If I were looking down on the continental United States, say, I would know only that the story wants to start in San Francisco and end up somewhere in the Northeast—maybe New York, maybe Boston—but not in Mexico or Florida.

Going from the orbital view down to 30,000 feet is to begin putting character actions with those bits of activity necessary to move the story across the landscape. In the case of the boat story, when does the professor buy the boat? What comes before that, forcing the decision to buy it? What does he have to do after purchase? When does the cruise begin? Does his wife Jane come along? What does Jane think about the boat? Or, in the cellular regeneration story, what does the character who gets a newly grown heart want to do most? Who opposes or helps him in this action? How do I match a story told over several medical episodes with a life action that will occur over several decades?

Once I have answers to these sorts of questions, it’s still not possible for me to sit down and start writing. If I try, I’ll still be at the level of “bits of action and dialogue.” So the next step is to get out the state maps and road atlas and start walking the story across the ground. This means defining, still fairly broadly, the actions, encounters, twists of fate and decision, random surprises, and natural consequences that collectively make up the story. Here I move at the level of individual chapters and scenes.5

Not every scene is fully defined—as noted above, sometimes it’s better to leave room for creative inspiration—but each step in the action is known and put in order. This is where logic takes over from subconscious direction. I have to ask questions like, “If Jane has a mental breakdown, how does she get out of the hospital? Does she wait for release? Discharge herself? Break out?” And, “If Jane breaks out, how does she evade security on a locked ward? How does she avoid recapture?” Logical answers to these questions lead to next steps, which lead to more questions and answers. I have to walk the ground.

Sometimes—often regularly—the logical next step and its consequences force me to throw the whole book, or at least the current part of it, back up into orbit. I have to put bedrock ideas on hold, switch off gravity, and begin shoving huge blocks of story and time around into new configurations. Events become plastic. Characters change their natures and positions in the plot. This happened recently in the new book about cellular regeneration. I had been stubbornly pushing at a story line involving the two main characters in some kind of concerted action with a beginning, middle, and end. But that single action had to be couched in a story arc that logically covered a century or more through changing social, political, legal, and technological conditions. When I realized the disjunction, the whole book became skyborn.

When a book goes into the air like that, I have to move quickly to put everything in some sort of sensible order before gravity switches back on and the plot goes into free fall. But, somehow, the story, the book outline, my concrete thinking about the book, always comes together. Still, outlining a whole book or even just the next part of it can be a nerve-wracking and troubling experience.

Ultimately, I emerge with a patchwork of sorts—some scenes firmly detailed and others gooily glossed over—that I think is strong enough to walk on. Then I go off for a day or a week, take deep breaths, equilibrate my head, and come back to start finding the downbeats and turning scene notes into production copy.

And so the book is under way!

1. The amazing thing is that, usually, an arbitrarily chosen image works just as well as an inspired one. Of course, the arbitrary choice comes from the same place as the inspiration—down there in the subconscious, where the various parts of my brain are busy manufacturing the story. In some ways, however, a writer must treat freedom of choice like a surgeon. When faced with a new patient’s torso—skin and fat layers covering organs that need attention, organs known only through remembered anatomy classes, a general experience of the human body, and, in this particular patient, the shadowy images on an x-ray—the surgeon might agonize over finding exactly the right place to cut, the absolutely best angle for the blade, the most economical length of the incision. Choices are infinite and knowledge limited. Delay too long while seeking this perfection, and the surgeon soon wants to put down the knife and walk out of the operating room. But the experienced surgeon knows that, within certain broad latitudes, it doesn’t matter where you cut. You make the incision—and then you make it work for the surgery you’re there to perform. And the writer has the advantage that, if the scene is botched and goes off track, it’s easier to start over than it is for the surgeon to seek a new patient.

2. See Some Thoughts on the Writing Craft.

3. As noted above, I’m a strong believer in the subconscious. Day-to-day awareness—the “I” that chooses pancakes rather than oatmeal for breakfast, or decides to take the car in for service today, or remembers to stop at the store for milk—is only the surface level of the mind. We evolved in complexity. Just as a million simultaneous chemical reactions define the activity of each cell, and a million cellular interactions define the life of the body, so a million neural interactions define the life of the mind in the context of an ongoing shower of sensory data and memory recalls. Systems operating within systems sort the inputs, coordinate the responses, store and retrieve the facts and perceptions. We have not one brain but at least three: medulla oblongata, cerebellum, and cerebrum, the latter divided into two hemispheres with separate functional areas, plus a number of separate and distinct structures that coordinate various types of awareness. It would be a miracle if all of this resulted in only one stream of thought and consciousness.

4. True story! I had an idea on file in the late 1980s, “What if scientists were to extract dinosaur DNA from insects trapped in amber and recreate living dinosaurs?” Of course, that’s not a story yet, and when I next visited the file I added, “Suppose they made a mistake in the coding?” I had to stop thinking about that one in 1990 when Michael Crichton brought out Jurassic Park. Even though my story, if developed, would have been very different—not necessarily about a failed flea circus/theme park on a jungle island—my premise was just too close to Crichton’s and would have become a me-too in the reading public’s view.

5. Since I tend to write books with multiple viewpoints (see Writing for Point of View from April 22, 2012), and I have a rule against following one viewpoint scene with another from the same character, this is also where I begin parceling out the action and understanding to one character or another.

Sunday, May 6, 2012

Economics of the Hive

Believers in a viable human collective—one run on a national scale1—say that true communism has never been tried. They suggest that the efforts of Lenin, Stalin, Trotsky, Castro, Mao, Ho Chi Minh, Pol Pot, and lately Hugo Chavez were somehow flawed. Truly, these efforts were flawed—by coercion, confiscation, mass deportation, and imprisonment. But the reason for these suppressive actions, which their imposers always regret, is that most humans will not willingly sacrifice their identities, family responsibilities, self-expression, hopes, and dreams to a national ideal of selfless sacrifice.2

But that’s not to say true communism has never been tried. Ants and bees practice perfect fidelity to the mantra “From each according to his abilities, to each according to his needs.” The only trick is, you have to give up everything that makes you human to participate.

T. H. White understood this in the Antland sequence, his late addition to The Once and Future King. There the boy Arthur visits a dreary place where the inhabitants have such limited vocabulary and capacity for feeling that the only adjectives available are “done” and “not-done,” referring ultimately to the workload for which each ant is responsible. The inhabitants are blasted by broadcast commands and exhortations at regular intervals. The motto of the invisible Ant-state, emblazoned above the entrance to the nest, is “Everything not forbidden is compulsory.” T. H. White understood all too clearly the repression needed to create obedience.

George Orwell worked the same ground—but with humans this time—in his 1984. There the people with the best food, best jobs, and most self-aware existence are the Outer Party members of Ingsoc. The proles live in drudgery and miserable ignorance. The invisible Inner Party members must live in unimaginable luxury but also with unimaginable responsibilities.3 The average person of the Outer Party, Winston Smith, lives with simple boredom, personal conformity, loss of privacy, and the mind-static of endless exhortations to state-sponsored enthusiasms, fidelities, and hatreds that have grown stale—no, repulsive—through repetition.

I grew up with those books—plus, of course, about a thousand more—as part of a mid-20th century liberal education. What has stuck with me is an overall sense that each human is unique, responding to the world in his or her own fashion. Yes, we can define general patterns in the human psyche. The urge of one mother to love her children is very like, and draws from the same genetic and social heritage as, any other mother’s love. But the mothers themselves are different, the children are different, and the particular events and choices of a child’s life and its raising are different. So, then, the particular needs and actions of each exchange will be different.

What separates ants and bees, on the one hand, and human beings, on the other, is the sense of self, the individual mind and its awareness of difference from others, of having a unique place in the world.

In fact, we humans are so different from each other that we must actually learn about our similarities as part of our intellectual and emotional development. We are all born as helpless babies, but we are surrounded by a mother and father, aunts and uncles, and grandparents who are genetically programmed to nurture and protect us.4 We quickly discover the power of our voice and expressive responses: a cry will bring food, a smile or a giggle will bring cooing support. For the first two years of life, we are the center of a world dominated by large, dim faces and invisible hands bringing us sustenance, attention, play objects, and pleasurable responses. We start out as helpless tyrants.

In the second year—the “terrible twos”—we have to start learning that we are not the only creature who matters in what is fast becoming a strange and complicated world. This process, known as “socialization,” tries to teach the tyrant infant that other human beings have their own web of interests and centers of power, that they don’t always have time for us, that they resist being treated like objects or slaves. This is a complex message because, while it tells us that out there are other beings, not dissimilar from ourselves, whose needs and wants are important, we are also encouraged to become independent, take action for ourselves, stand on our own two feet, literally, and participate in the social structure of the family.

Then we discover the peer group—usually, in America, through kindergarten, playgrounds, and arranged play dates. Peers are not family, not former caregivers who once had our well-being as their sole focus of attention. Peers are wild creatures who owe us no natural allegiance. We must meet each peer as an individual and, usually simultaneously, as part of a group. We must make deals, resolve conflicts, learn to accept and give support. And, through the influence of supervising adults, we learn to “share.”

Humans, unlike the vacant-brained ants and bees, must learn our place in the world through experience. We must learn empathy: the understanding of what others around us may be thinking, feeling, planning, wanting, needing. This happens so early that most of us don’t remember it or remark on it as a new way of thinking. To some extent, empathy—the opposite of the blank stare of autism—may be a skill wired into our brains in the same way that nerves connect the neocortex motor strip with the fingers of our right and left hands. To that extent, the autistic child may be said to have a brain impairment. But to a larger extent—or so I believe—empathy is something we learn.

Empathy is not necessarily or entirely a result of learned virtue. We don’t become aware of what other people might be thinking and feeling out of some kind of universal love and respect that we adopt at our mother’s or teacher’s urging. There may be some of that; certainly, children who were raised in an atmosphere of “grab anything you can get, Johnny” will tend to be less empathetic as adults. But I believe we mostly learn about others’ intentions and feelings as a defense mechanism. If the minds of those around you remain mysterious black boxes that randomly spew out actions and sentiments, then you will have a harder time figuring out who is likely to be a friend to be trusted or an enemy to be avoided. Later, as your involvement with groups and society in general develops in complexity, you will find it harder still to make deals, acquire support and cooperation, make your voice heard and your thoughts count. Without empathy, we are prisoners in a hostile and inexplicable world.

Humans don’t start out as members of a hive, automatically knowing cooperation and obedience because it is hard-wired into our brains. We start out as individuals at the center of a supportive family structure, perfect tyrants who must learn, gradually and painfully, to fit ourselves into a larger social structure that exchanges actions, thoughts, and feelings in order to support many individuals who might otherwise be in deadly conflict.

Human interaction is vastly more complex than that of bees or ants. Consider that the hive or nest represents a very simple form of economics: (1) find food outside, either pollinating flowers for bees or vegetable forage for ants; (2) bring it back inside; (3) process it into usable and storable chemical energy, either honey for the bees or some kind of leaf mold or aphid feed for ants; (4) defend the nest or hive from predators and changes in the environment; (5) tend the queen and her drones in order to raise the next generation. These activities require some coordination and even communication, but the communication cues are relatively simple, based on chemical transfer and patterned movements. The queen in either case is a breeding machine, not an order giver. The members of the hive or nest cooperate and react instinctively, through patterns that generations of adaptation have bred into their DNA and their tiny, hard-wired brains.

Ants and bees simply cannot want individual differences. A worker bee cannot imagine taking up the role of a drone or a queen, not just because its body is not suited to the task, but because it has no life experience of being different. It cannot imagine having a different kind of life, let alone having preferences about the life it does lead, choosing to experience different flavors of food or types of activity, or possessing anything—not even its own body. The human ability to imagine is a complex process based on our recognition of self, our understanding of how that self fits into a certain situation in society and into the stream of events, and our ability to mentally “stand outside” that society and stream and contemplate—based on all the differences we can see and know all around us—how things might be different.5 Ants and bees lack the brain complexity to do this.

A true communist society—the natural condition of future humanity as proposed by Marx, in which the coercive state has “withered away,” leaving only dutiful humans taking only what they need and giving their all in terms of time and effort—might resemble the bee hive or the ant nest. But humans are not ants or bees; their affairs are much more complex, their needs and wants include much more than just the simple economics of “find food,” “process food,” “raise young” in the nest or hive. Communism as it was practiced in Russia, China, Cuba, and elsewhere lacked the simple chemical and behavioral communication system of the hive and tried to compensate for that lack, in the human context, through bureaucratic powers of information gathering, planning, deciding, and ordering. Along the way, a state agency has to take over from each individual human and family the functions of deciding what they need, seeking it, and acquiring it. Individual and family responsibility are replaced with top-down orders and nation-spanning supply systems in a command-and-control economy.

If you really want the coercive state to wither away, you need to rely on the sort of organic, distributed action that supports the hive. You need to rely on each individual and family knowing what it needs, seeking it in the collective sphere of interaction called the marketplace, and bidding for it in terms of the time and effort needed to acquire it.6

Ironically, the economics of the hive, in human terms, is actually the free market in which individuals participate in their own interest, guided by their empathetic knowledge of what other humans might want and for which they would be willing to trade.

1. See When Socialism Works from October 10, 2010.

2. All right, the Germans came close in the 1920s and ’30s with National Socialism. In their defense, if there is one, they were reacting to the crushing debts and guilts imposed on them by the victors of the Great War while holding in living memory the recent unification of Germany’s various duchies and city-states into a powerful nation-state. The post-war collapse, devastation, and inflation, run through the incompetence of the Weimar Republic, would tempt any population with heraldic visions.

3. And, of course, Aldous Huxley worked this dreary world in frenetic reverse—full of technical marvels, casual sex, and parroted responses—in Brave New World. But still, the emphasis was on docile acceptance of one’s place in a stagnant society that was fixed by genetics and subliminal teaching as well as by its enforced social structure: “I’m glad I’m a Beta. The Alphas have to work too hard. And the Gammas are so awfully dull.”

4. That is, for the lucky ones. The fate of children born into the cold and left in baskets on doorsteps is another matter.

5. In fact, the word “existence” comes from the Latin roots ex, “out of,” and sistere, “to stand” (a form related to sto, stare). To exist is to stand apart from something. Our entire consciousness, our awareness of ourselves as separate beings, is not based on our being absorbed into ourselves but instead on being able to divorce our minds from our body/mind complex and see ourselves from the outside—in context, as it were—as something unique and active in the greater world. Bees and ants cannot do this. Even dogs don’t do this. Dolphins and whales, chimpanzees and the other great apes are apparently able to make the leap—at least part-way—although they may lack the language to express and enlarge upon the experience.

6. In this context, money is merely a portable form of human energy—earned through time and effort, spent as an outward expression of that time and effort.