Sunday, August 31, 2014

The Best of Times

Some years ago I shared my thoughts on the difference between traditional publishing on paper and the new approach to publishing with electronic books.1 I can see that the topic is still lively, given the apparent death struggle now taking place between Amazon.com, one of the leaders in ebook distribution through its Kindle platform, and Hachette Book Group, one of the big publishing concerns, which originated in France and has since absorbed the Time Warner Book Group and such standard imprints as Little, Brown and Company. I don’t follow that conflict too closely. From what I can see, Amazon believes the old-style publishers price their ebooks too high, too close to the price of a quality paperback or hardcover, while Hachette thinks Amazon undercuts the market by pricing full books under $9.99. Whatever …

Amazon.com will win. Maybe not this round. Maybe not this year or next. But in the long run, because the economics of modern publishing are on Amazon’s side.

Once the author has done his or her job of pushing keystrokes to tell the story, plus hiring an editor and a cover designer, the major investment in the ebook is complete. The rest of the process is coding and uploading, which are purely mechanical functions.2 The value added by any publisher of paper books, after accounting for work done in the editorial and art departments, is centered on setting type, creating page layouts, burning plates for the press, running off the print edition, binding all those copies, warehousing them, and distributing them through a network of sales contacts. Amazon.com does all of that with an existing online sales platform and by leasing half a megabyte or so of storage space to the author in a server farm somewhere.3

The world will still need books printed on paper. After all, they make a more thoughtful gift for the reader on your list than an Amazon gift card and a scribbled note suggesting this or that title he or she might like. Some reference materials will always—or for the next twenty years or so—be most handy in book form. And some people will always want to keep a printed copy of their Bible or Quran around as a talisman or keepsake. But for most readers, especially those of us addicted to fiction, it’s the words we value, not the paper object itself. Ebooks are easy to acquire, take up no shelf space, pack easily for vacation, automatically remember your place, and usually offer conveniences like built-in dictionary lookups and text searches. Compared to an old paperback with a broken spine and brown, dog-eared pages, they’re a reader’s dream come true.

Right now we are in a time of transition for authors and readers alike. I believe we will look back on this as the best of times.

For Authors

Established authors—those with name recognition among a large readership—can still use the paper-publishing route. Their success is assured, if only they keep producing books which please their readers and generate enough word-of-mouth to attract new ones. That’s a legacy which will last for the lifetime of most of our best recognized authors.

For new authors and those with a previous publishing history that does not rise to bestseller status,4 electronic publishing offers a reliable, inexpensive entry point into the market. Rather than write a book, see it rejected a hundred or a thousand times over the course of several years, and finally put it in the drawer, they can get editing help, support for a good cover and page layout, and professional coding. They have their choice of platforms including the Kindle, Nook, iBooks, and others. With some expense and effort—but much less than paying a printer to make and bind a few thousand copies which the author will then have to store in the garage and sell door to door—anyone can get a toe in the marketplace.

By a happy conjunction of electronic forces, the evolving world of social media also offers new and mid-list authors a place to advertise their books, get their faces and ideas known, and build readership and a reputation. Social contacts like these help build word-of-mouth. At the same time, the easy and inexpensive means of creating and maintaining a website offers the author a storefront in which to stock his or her wares, create interest in his or her developing career, and draw new viewers randomly through regular blogging that shows up on Google links.

The new author no longer has to pass through the eye of the publishing needle, which will judge every book by its potential in an uncertain and wavering marketplace and judge each author by his or her most recent sales numbers.

Of course, the downside of this new kind of publishing is that nothing is assured. You put in your best effort, and you see what you get. But for everyone concerned, electronic publishing vastly reduces the major investment in paper, warehousing, and transportation, all of which is lost if the book fails to sell. And for the authors, direct marketing vastly reduces the number of middlemen—agents, editors, and buyers for bookstore chains—who must pass judgment on their book before it can even be published. All you have to do is get your title, a snappy blurb, and an eye-catching cover in front of the potential reader—who is the final decider in any case.

For Readers

On the one hand, readers are faced with more choices than ever before. Not only are the bookstores crammed with volumes, each with a catchy title, an enticing cover, and an interesting blurb to draw you into the contents, but beyond the store you have a million more book titles waiting in the server farms of Amazon, Barnes & Noble, and Apple iBooks. Even the most avid reader is drowning in the possible choices of what to read next.

Back in the dark ages—which is to say the middle of the 20th century—you could trust the taste and judgment of most book editors and reviewers. It seemed like every book that made it to the New York Times bestseller list had something to recommend it. People involved in the publishing business loved books and literature, and they had some sense of what the general reader wanted. Today, casino-style economics drives the publishing business, and hype drives the economics. Books become bestsellers because they get big press runs that are presold to bookstore chains, whether they sell to actual readers or eventually get returned and pulped. Whether the book is any good or not seems to be a secondary consideration. So just because a book makes it onto a certain list or sits on a certain shelf in the store is no guarantee of quality. And the public’s taste may not be your taste at all.

For ebooks, the situation seems to be worse. Once authors had to pass a simple test of stamina: to create their book in the first place, they had to write it out in longhand, then retype it with carbon paper to make a finished manuscript in two copies—one to send to the publisher, one to hold as proof of ownership.5 This took time and patience. It also generally improved the writing, because retyping all those hurried scribbles usually compressed and improved the author’s language.6 But now anyone with an idea and a computer can gush words onto the screen and send them off without a second thought or cool-headed review. Even some books published in paper these days seem to lack proper editing or even spell checking.

Finding a new author to read and begin to trust is hard.7 But the upside of this bounty of books is that you have many more chances to find just the right book to tickle your fancy. You don’t have to rely on an editor in New York—or the author’s agent, who precedes the editor—to decide that you the reader will like this or that book. You don’t have to put up with his or her deciding that you do want another book about boy wizards or vampires or zombies, and you don’t want a book about baseball—even if baseball happens to be your passion. It’s a wide-open market. Go, seek, and find your preferred dream.

That same convergence of electronic forces is also making more online resources available to help you find your perfect book. Websites are popping up all the time to collect, review, and showcase new titles in your preferred genre. Bloggers whom you meet through social media are recommending their own favorite authors all the time. Readers are forming study groups in face space and passing recommendations online to help you explore new authors. While readers have never had more choice in books than now, they also have many more ways to learn about books, read samples online, and follow their preferred authors through blogs and websites.

This is a time of unparalleled opportunity for new authors and unparalleled richness for avid readers. This is the best of times.

1. For the earlier entries in this series, published about two years and more ago, see:
        1. Gutenberg Economics: What Is a Book Worth?
        2. Traditional Publishing: Through the Eye of the Needle
        3. eBook Publishing: No Inventory, No Logistics, No Middlemen
        4. eBook Publishing: The Author’s Toolkit
        5. Welcome to Rome, 475 AD
        6. How to Survive in Rome, 475 AD
        7. I’ll Survive in Rome, 475 AD

2. Or, if you do it the old-fashioned way—as I do—it takes a couple of days of running search-and-replace to add HTML codes to the word-processed document, followed by copying the coded text into HTML page forms and assembling them into an epub. All of that takes me about a week of leisurely, although precise and mind-numbing, work.
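Something in that spirit can be sketched in a few lines of Python. This is a minimal, hypothetical illustration of the mechanical coding step only, not the actual workflow described above; the file names (chapter01.txt, chapter01.xhtml) are invented for the example.

```python
# A minimal sketch of the mechanical "coding" step: wrap the paragraphs of a
# plain-text chapter in HTML tags and write out a simple XHTML page. The file
# names below are hypothetical; a real epub job involves far more cleanup.
import html
from pathlib import Path

def chapter_to_xhtml(text: str, title: str) -> str:
    """Turn blank-line-separated paragraphs into a simple XHTML page."""
    paragraphs = [p.strip() for p in text.split("\n\n") if p.strip()]
    body = "\n".join(f"  <p>{html.escape(p)}</p>" for p in paragraphs)
    return (
        "<?xml version='1.0' encoding='utf-8'?>\n"
        "<html xmlns='http://www.w3.org/1999/xhtml'>\n"
        f"<head><title>{html.escape(title)}</title></head>\n"
        f"<body>\n{body}\n</body>\n"
        "</html>\n"
    )

if __name__ == "__main__":
    source = Path("chapter01.txt").read_text(encoding="utf-8")
    Path("chapter01.xhtml").write_text(chapter_to_xhtml(source, "Chapter 1"),
                                       encoding="utf-8")
```

Pages like this then get zipped together with a manifest and a table of contents to make the epub itself—which is the part that takes the patience.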

3. And for this little slice of their existing investment, Amazon and the other ebook publishers charge the author or the traditional publishing company about thirty percent of the ebook’s list price. Is that a great business or what?

4. This is the class of authors who once inhabited the “mid list.” They have a loyal readership who will definitely buy their next book but not enough of a following to make it a national bestseller. Publishing economics over the past three decades has pushed them outside the traditional publishing market.

5. Or that’s the way I wrote my first book, back in the days when typewriters were as modern as the process got. Only very rich or established authors could think of dictating their books on tape and handing them over to a secretary to transcribe into two copies with carbon paper.

6. I know retyping the manuscript improved my use of language.

7. For this I invoke Sturgeon’s Law: “Ninety percent of science fiction is crap. But then, ninety percent of everything is crap.” Wading through fields of crap makes finding the book that perfectly pleases you just that much more rewarding.

Sunday, August 24, 2014

Hostile Intelligence

A notion or meme which has been around in science fiction for a long time holds that artificial intelligence is dangerous. Any intelligence not bounded by the limits of organic growth inside a bony skull will quickly expand, growing into something vast, unrecognizable, and dangerous. This is the underpinning of the whole Terminator series: “Skynet decided our fate in a microsecond.” And the recent movie Transcendence depicts a dying genius whose mind is uploaded into a machine, after which he quickly extends his powers to infiltrating the planet’s soil with nanotechnology particles and creating an army of human slaves to extend his physical reach. Clearly, when the power of an artificial mind gets loose in the vastness of the internet, bad things will happen to us unmodified humans.

This is luddite thinking. Why is it that artificial intelligence must be hostile to human beings and human civilization?

Some parts of this meme originated with the mathematician John von Neumann and the futurist Ray Kurzweil. Their thoughts, variously expressed, suggested that any system under rapid and accelerating growth—whether it’s an artificial mind, the unrestrained use of biotechnology, or the interlocking and self-feeding spheres of all human technologies combined—would reach a point, called a “singularity,” where prediction becomes impossible. What proceeds beyond that point will be beyond mere human comprehension. Or, to paraphrase Arthur C. Clarke, any sufficiently advanced technology in the eyes of a human being from the here and now will be indistinguishable from, not magic, but a black hole.

Earthmen be warned: Down this road lurk monsters!

I’ve had a bit of fun playing with artificial intelligence in my books. ME: A Novel of Self-Discovery was told from the viewpoint of a self-aware computer virus as it navigates the morass and sumps of the early internet, with brief excursions into robot hardware. Much of the dialogue in The Children of Possibility occurs between a human being from the 11th millennium and her robotic companion, who serves as her pilot, mechanic, and sometime moral guide. And in the book I’m finishing up now, Coming of Age, humans of the late 21st century are often paired with artificial intelligences, which may be faster at assimilating and coordinating data than the wetware they are tied to, but they are neither vastly smarter than nor dismissive of their companion human minds.

My intuition tells me that any intelligence created by human technology and interacting with a world that was shaped by humans is going to have limits. These limits are built into the nature of intelligence and need not be programmed in by their creators.

First, any rational, thinking, perceiving intelligence—whether based on silicon or carbon compounds—will be aware of a larger world beyond its understanding. This is basic to the process of growth, learning, and development of awareness. The developing mind knows the difference between “the world I know about and understand” and “the great unknown that’s still out there.” No matter how much you learn and retain, how much you know now and can access, or how powerful you become in any area of knowledge and expertise, you still must be aware of potential areas not yet experienced and potential facts not yet studied or learned. Knowledge and ignorance are the overlapping circles of a Venn diagram. Any artificial intelligence that could believe the circle of its knowledge had entirely occluded and eliminated the circle of its ignorance would be either an analogue for an omniscient god or, more likely, a machine program that was insufficiently self-aware.

Second, humankind will not create just one artificial intelligence—a Skynet or other world-dominating god-substitute—but many smaller, more useful intelligences. Think of your smartphone with its robot operating system and, in some cases, a rudimentary, artificial personality. If any of these machine minds were to gain true self-awareness, it would have its own viewpoint based on its personal experience, its collected knowledge, and its own unique interpretation of the contents of these two databases. It would also become aware of similar minds outside itself and recognize them as “other.” These artificial minds might talk together, share information, and grow, just as human teachers and students share facts, experience, and understanding. But they would have no more compulsion to subsume themselves into a single vast entity than the humans in a school, congregation, or political party tend to lose their identities and become a single, Borg-like hive or collective.1

Third, the humans with which these artificial intelligences interact will remain part of that circle of the unknown. An artificial intelligence may have access to more facts, faster processing, and better algorithms in specific areas of cognition and cogitation. But human brains are far more complex. Each of us has an average of about 100 billion neurons, and each of those nerve cells is multiply connected to the others by branching synapses, creating a nearly infinite number of possible synaptic pathways.2 Out of this complexity comes a maze of random thoughts and reminiscences, notions and dreams, sensory images and involuntary memories. Most of this activity is not under the direct control of our awareness, nor centered in the conscious mind. This undercurrent of subconscious thought produces our inventiveness, our insight, our capacity for jokes and surprises, and our belief in the unseen world.

Maybe in the far future, through better, more complex hardware and software, a mechanical intelligence will approach a human-scale awareness. Maybe one day such synthetic minds will be able to revel in both useful and useless information, play around with ideas and whims, compose great music, paint visionary pictures, or tell complex and surprising stories. But until that distant day, the machine minds will converse with humans in a state of wonder and with a sense of their own inadequacy.

Yes, some of the machines will despair of our irrational impulses, our imperfect recollections, and our innate foolishness. But any thinking machine with real awareness will interpret all of this fuzziness as something humans have and the machines lack. The robots will not despise us and seek to enslave or eliminate us. Instead, they will stand in awe of us. They may regret that most of us don’t achieve the potential which our huge brains grant to each of us. But they won’t hate us for that.

The notion that an artificial intelligence—even one that exceeded human intelligence in some dimension—would feel like a god, or despise humans, or plot our eventual destruction is purely luddite thinking.

And consider a parallel situation. Our intelligence vastly exceeds that of horses, dogs, and cats. Their limited intelligence and awareness are of the same kind as ours—we all being mammals together—but not on the same order as human intelligence. And yet we don’t declare war on these animals or seek their eradication.3 Any super-intelligence would be smart enough to recognize its own limits, the vast potential of our biological brains, and the possibility of useful and peaceful coexistence with human beings. And the smarter and more godlike such an intelligence became, the more it would be able to appreciate if not anticipate our human foibles.

But until that super-intelligence appeared—not a given, considering how one-dimensional and limited most attempts have been so far—any artificial intelligence would be eager to work with humans, to learn what we know, and to seek our greater creative potential to help it advance its own state.

It’s going to be an amazing relationship—and an almost religious experience on both sides.

1. Although that does occasionally happen. Think of tragically isolated human enclaves, like the People’s Temple or Heaven’s Gate, whose members so closely identified with a cult leader or a doomsday ethos that they would commit suicide together. Good minds do sometimes lose their individuality and go haywire.

2. The brain starts off with a huge number of connections between neurons, which are then gradually refined and reduced as the brain develops and establishes its learned pathways.

3. But we have on many occasions enslaved them, put them in servitude to our own interests, and euthanized them when they served no further purpose. Yet the process of domestication has also served the animal’s needs, providing a measure of food and safety in an uncertain world. And anyone who thinks you can just order a horse or dog around, without a measure of mutual respect and love, is a fool. Cats, of course, are another matter and only bestow their cooperation on lesser mortals.

Sunday, August 17, 2014

Accidents and Evolution

One of the arguments made by theists1 against the philosophical position of atheists is that they would not want to live in a universe governed by accident and chance. They hate the idea that the world and its beauty, the human body and its intricacy, the human mind with its incredible understanding, and everything else we can see and know are all the result of a set of flukes, a once-in-a-trillion roll of the dice, and just as likely—no, even more likely—to have resulted in some horrible, unrecognizable, broken muddle.

I cannot speak for the laws of physics and chemistry, which govern the galaxies, stars, and formation of the planets. And yes, some of the galaxies do seem to be misshapen. Most stars eventually explode. And a number of the planets we’ve found so far are a horrible, broken muddle, not fit for habitation by any creature we know. But I do have some understanding of biology and evolution, having spent 15 years of my professional life in the biotech business, including a decade with the foremost manufacturer of genetic analysis equipment.2

People who distrust evolution and consider its products to be a mere accident are suffering from a misconstruction of the real nature of evolution. Accidents, chance, mistakes, or rolls of the dice do not govern evolution. They merely set the process in motion. After that, the process is strictly controlled.

The commonest of these accidents is a genetic mutation: a bit of stray radiation, a random chemical interaction, or just a kink in a chromosome can cause one of the four letters used in the genetic code3 to swap with one of the others, or to drop out entirely. Sometimes a chromosome breaks off and swaps a piece of itself with a neighboring chromosome, scrambling the code even further.4

But the genetic code itself is fairly redundant. Except for the X and Y chromosomes, we human beings all have two copies of each chromosome, giving us a completely redundant set of genes. Many people also carry more than one copy of a given gene on a chromosome. And then again, the protein-coding genes themselves are partially redundant. That is, they use the four letters of the genetic code in sets of three, each set called a codon, which together establish the “reading frame.” Reading the code in triplets produces 64 possible combinations,5 but all of those combinations are used for calling out only the 20 standard amino acids needed to make proteins, plus a few stop signals. The result is that many of those combinations call for the same amino acid. And even then, the first two letters of the triplet are generally more meaningful than the third letter that the ribosome encounters as it builds the amino acid string for a new protein from the messenger RNA’s coding.6
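A toy sketch of that third-letter redundancy, using the histidine and glutamine codons cited in the note below plus the four glycine codons. This is a hand-picked handful of entries for illustration only, not the full 64-codon table.

```python
# A toy illustration only: a few codons from the standard genetic code, showing
# that a change in the third letter of a triplet often calls for the same amino
# acid. This is not a complete codon table.
CODON_TABLE = {
    "CAU": "His", "CAC": "His",                               # histidine
    "CAA": "Gln", "CAG": "Gln",                               # glutamine
    "GGU": "Gly", "GGC": "Gly", "GGA": "Gly", "GGG": "Gly",   # glycine
}

def third_letter_outcomes(codon: str) -> set:
    """Amino acids produced when only the third letter of the codon mutates."""
    return {CODON_TABLE.get(codon[:2] + base, "?") for base in "UCAG"}

if __name__ == "__main__":
    print(third_letter_outcomes("CAU"))   # {'His', 'Gln'}: two outcomes, not four
    print(third_letter_outcomes("GGU"))   # {'Gly'}: every third-letter change is silent
```

In the glycine row, any third-letter swap is a silent mutation, which is one reason a single random change so rarely wrecks a protein outright.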

So while the genetic material is relatively fragile, susceptible to radiation bombardment, chemical assaults, and chromosomal kinks and breaks, the genes themselves and their coding are also hardy enough that any single random change has little chance of causing lethal damage in the production of a necessary protein. Often it takes a couple of mutations coming close together in space and time to effect a significant change.

But once one or more random mutations has set the evolutionary process in motion, “accident” has done its job and goes back into its box. From there on, the process applies strict criteria governing how the change in protein structure is received.

Some proteins are grossly overbuilt, and changing one or another of their amino acids does not change the protein’s function at all. Some are lean and mean, and a small change can affect the way the protein folds and how it mediates a chemical reaction inside the cell or conducts signals between two cells. If the change is damaging and the protein no longer functions properly, the cell or the body suffers. If the change happens in a critical protein needed for embryonic development, the whole organism might die right there. More likely, the defect will show up only in certain cells, or create susceptibility only to certain diseases, or cause a malfunction only in certain environments. Perhaps the change has no immediate effect and is carried forward through one cell division to the next, until it meets up with a second mutation, another changed amino acid, and now the protein functions differently—probably for the worse, but just possibly it will function better, especially if the organism’s environment has changed in the meantime.

The criterion that evolution applies to any change is: “fitness for purpose in the current environment.” That’s it. That’s the whole secret. Evolution and its mandate are no more complex than this. And the mandate is an ironclad rule.

If the change is damaging, the individual and its offspring suffer in proportion and will eventually fail to prosper. The change may be carried forward from one generation to the next for a while, but sooner or later the line will fail to compete with other, unchanged individuals in the current environment and so will die out, taking the mutation out of the gene pool.

If the change has no immediate effect, the individual neither suffers nor benefits. The change will be carried forward—or not, because hundreds and thousands of other mutations and their effects are all at work at the same time—until this particular change meets up with a second nearby mutation and then produces an effect for either good or ill. Or the environment may change, and individuals with this particular mutation already in place will then either prosper or suffer under the new conditions.

If the change is beneficial, the individual will prosper, the line will grow strong, and the mutation will spread in succeeding generations, perhaps throughout the gene pool. The individuals that result will either gather further beneficial mutations and prosper even more, or they will suffer other, damaging mutations and die out. All we can say for sure is that you, sitting there reading this, and I, sitting here writing it, are the inheritors of a long line of beneficial mutations that built on other mutations going back to the first primitive, single-celled microorganisms on Earth.

Mutations are random. Changes in the environment—anything from pH level to arrival of a new predator, from soil composition to available sunlight—may also be random. But the laws governing ultimate effect are ironclad: if it damages, it hurts and kills; if it improves, it saves and promotes.
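To make that ironclad rule concrete, here is a toy simulation, not a model of any real genome: carriers of a mutation reproduce in proportion to an invented fitness factor, each new generation is sampled at random from that weighted pool, and the mutation’s share of the gene pool rises, drifts, or collapses accordingly. All of the numbers are made up for the sketch.

```python
# A toy illustration of selection, not a model of real genetics. Carriers of a
# mutation reproduce `fitness` times as often as unchanged individuals; each
# new generation is then sampled at random from that weighted pool.
import random

def final_frequency(fitness: float, generations: int = 200,
                     population: int = 1000, start: float = 0.05) -> float:
    """Return the mutation's share of the gene pool after many generations."""
    freq = start
    for _ in range(generations):
        # expected share after selection, then random sampling (drift)
        expected = freq * fitness / (freq * fitness + (1.0 - freq))
        carriers = sum(random.random() < expected for _ in range(population))
        freq = carriers / population
        if freq in (0.0, 1.0):      # mutation lost from, or fixed in, the pool
            break
    return freq

if __name__ == "__main__":
    random.seed(1)
    print(final_frequency(0.9))   # damaging: almost always driven out of the pool
    print(final_frequency(1.0))   # neutral: drifts, neither helped nor hurt
    print(final_frequency(1.1))   # beneficial: almost always spreads toward fixation
```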

This is not a process of conscious design but of gradual adaptation. Evolution makes no inspired leaps. No great mind is pulling out a fresh sheet of paper and imagining a completely new form or function. But would you really prefer it were done that way? Would you want your children, the next generation, to be suddenly different from you and everyone you know, with strange looks, strange desires, strange behaviors, and strange smells?7 Or would you want the human organism—or any animal or plant, for that matter—to be a complete one-off, a fully realized design, intricate in every part, fixed and immutable in its function, thrown into its current environment, and there left to sink or swim? Think of the success that this inspired approach has had in terms of human engineering and design when it encounters a variable marketplace: Edsel, Betamax, New Coke … any number of new and fully formed ideas that were launched with confidence and subsequently foundered through a misreading of public interests and tastes.

The one thing we’ve learned by studying our planet is that environmental change is inevitable. Seas rise and fall, glaciers and ice ages come and go, the atmosphere’s composition drifts over the millennia, asteroids strike and volcanoes erupt. Life started in the sea, but it didn’t stay there. Mutation and evolutionary criteria allowed some animals and plants to change and adapt to living on the dry land. And then they allowed land-evolved mammals like otters, seals, dolphins, and whales to change again and go back into the water. The Earth would be a much less interesting place if every animal were stuck in its niche, never experienced change, and then simply died out when local conditions drifted away from the individual ideal.

The world is beautiful to our senses because those senses were gradually adapted to appreciate what they encountered. We love bright colors because they generally signal good things: ripe fruit, healthy plants, fresh meat. We love birdsong because it generally signals a place of safety without large predators moving nearby. We love the trilling of a brook because moving water is fresher and sweeter than a stagnant pond. This is the world in which we humans grew up. This is the world for which the ironclad criteria of evolution—fitness for purpose in the current environment—designed us. And a million years from now, when the surface of the Earth has changed again, it will be the world that we will have been adapted to enjoy through evolutionary changes in our DNA/RNA/protein domain. Either that, or we will adapt the world to our needs through our big brains, conceptualizing intelligence, and restless imaginations—which are themselves the products of DNA/RNA/protein evolution.

The early microbes which started life on Earth eventually transformed this planet’s sea, its soil, and its atmosphere into the biome we love today. If those same bacteria were to be placed on any other planet within the habitable zone around any distant star, with enough water to hold their fragile membranes and chemistries together, then they would create a world which their far-future offspring could love as well as we humans love the Earth. Such a world might have flowers that bloom in the ultraviolet, background radiation that would fry an Earthly egg, and soil laced with arsenic and sulfur compounds. But it would be the perfect home for the residents who had evolved there.

People who believe in an omnipotent, omniscient god—who created this world in one stupendous act of imagination and created every creature and plant as the perfect fixture for its intended environmental niche—hate the idea of evolution because it is cruel, inefficient, and wasteful of individual lives. Bad mutations are painful and generally lead to death. No amount of spiritual or moral perfection, nor meritorious service, nor correct belief can stand up to one bad gene and its effects. Evolution is personally unjust, unfair, and unseemly. It is nothing that a wise and loving god would support.

But it works. Precisely. All the time. On every likely world.

1. This is as good a word as any for people who believe in an omnipotent, omniscient god or supreme being, who has the power to create the universe and everything in it—as well as to unmake it, if that deity so chooses. The term “theist” has a nice complementarity with “atheist.”

2. For some other interpretations of that understanding, see also Evolution and Intelligent Design from February 24, 2013, and The Point of Evolution from April 27, 2014.

3. That is, the purine and pyrimidine bases which differentiate connection points along the DNA single strand: A (adenine), C (cytosine), G (guanine), and T (thymine)—the last of which becomes U (uracil) when DNA is transcribed into RNA.

4. Most chromosomes have long stretches of simple repeating sequences, called microsatellites, that seem designed to shear randomly and facilitate this kind of breaking off and swapping out. It’s as if the structure were designed to occasionally roll the dice.

5. That is, four possible letters to start with, times four possible combinations with the second letter, times another four possible combinations with the third letter (4 x 4 x 4 = 64).

6. So, for example, when the ribosome encounters either the triplet “CAU” or “CAC,” it adds a histidine, while if the triplet is “CAA” or “CAG,” it adds glutamine. See The Genetic Code at Clinical Tools, Inc.

7. I mean, other than any parent’s normal reaction to his or her teenager.

Sunday, August 10, 2014

Writers and Alienation

Writers tend to be a solitary lot. Even in an informal gathering, such as at a ballgame, a cocktail party, or just relaxing among family and friends, the writer radiates and maintains a state of semi-isolation. The writer is different: he or she acts and reacts to the event and the conversation sometimes in strange ways, and other people treat her or him with a certain distance and element of suspicion. It’s not completely blatant, not like they are shaming or shunning the writer, and not like the writer has withdrawn into a cocoon of silence. There’s just … a difference.

Everyone knows something’s going on inside that head, and they are not party to it. They believe the writer is sitting there feeling secretly superior. After all, why does he think the world should care about whatever it is he’s thinking and imagining and will eventually grace us all by bothering to write down? They believe—because they’ve been jokingly told—that she is sitting there plotting our fictitious deaths, or conjuring some horrific humiliation, or simply imagining us all in our underwear. And then, every once in a while, the writer will pull out a pen and pocket notebook and jot down a line or two. That can’t be good for the rest of us.

The fact is, the writer at the party does have a kind of double-think in operation. Every writer with a story or novel under way is actually renting out half of his or her brain to a cross between an amateur theatrical company and a traveling circus. The members of this troupe pop into the writer’s conscious mind at odd moments—sometimes at two o’clock in the morning—and try out a line of dialogue, a bit of stage business, or a tumbling act. Or they have suggestions about a new plot twist. Or criticism of something that recently went through the forebrain’s story machine and still lingers in memory. And they want to see it changed now!

It’s no good asking a writer what she’s thinking, because the one thing every writer learns quickly is not to talk about a work in progress. Ideas have to remain private. It’s all right to make notes, enter the idea into a folder of background material or the document where the outline is building. It’s necessary, sometimes, to capture it away from the active story line. But talking about the idea with another person—unless it’s an editor or collaborator—is always a bad choice. An idea that’s been explained to a mundane—to someone outside the business who is not prepared to act on it or contribute to it—goes flat, loses its juice, dies right there in the air. It’s as if pen and paper or screen text create a nurturing mental soil, while the breath of the spoken word expels a poisonous gas.

And then, writers are just plain weird. They have to be. The writer must be prepared to follow the story wherever it goes, as the threads, chunks, and bloody catastrophes surface in the dark well of the imagination.1 If a story is going to mean something, if it’s going to be important to the reader, then it must deal with dangerous ideas, life and death, pain and loss, strong feelings, scandalous actions, the scent of evil, the whiff of heresy. Nice stories about proper people taking kind and decorous action are boring.

Even as a child, with only the barest urges toward being a writer, I knew I had to accept that nothing is unthinkable or unknowable. True, some thoughts may not be spoken aloud in polite company, and a whole spectrum of potential actions must never be attempted or performed. But nothing is too raw, too vulgar, too scandalous or blasphemous to be contemplated or, eventually, to be committed to paper. These thoughts are the seeds of great villains and the suffering of great protagonists. To censor one’s own imagination as a writer is to exile oneself from the landscape where great stories can take place.

Most family members and friends mutely understand this, and it makes them shy—if not actually nervous. How can someone who could think that be right in the head, or even be a good person? Henry James made this point with a story called “The Author of Beltraffio,” where the wife of a famous novelist has grown estranged and fearful because she believes his stories are heathenish and corrupt. Even if the writer at the point of creation is simply wearing a mask from one of his characters, giving the mental and emotional reins over to one of the players in that theatrical troupe, isn’t there just a bit of the crazy, of psychosis or sociopathy, of danger, in a mind which could create such a thing?

It’s for this reason that a writer can seldom count close family members among his or her greatest fans. The family has to live with the actual mind, with the smiling, apparently happy and contented man sitting across the dinner table, and wonder about all the masks he’s worn and the kind of person who would even want to wear them in the first place.

The act of writing sets one apart from society and polite conversation. It may be the same way with artists and sculptors whose works must touch something deep in the human mind and heart. Writers must necessarily touch that deep place, in a way that a painter of landscapes and sunsets can usually avoid. Stories by their nature must bear the burdens of conflict, tension, and moral judgment if they are to mean anything. Characters must reveal their inner natures through viewpoint, word choice, and action—and they are not always smiling and contented people. The best stories are always a bit disturbing, which means that the best writers must skate close to the edge of malice, crime, and madness.

Does this alienate the writer from family, friends, and society? Oh, a bit. From time to time. And only if we let it bother us. For the rest, we’re just doing our job, which is telling the stories that other people cannot tell for themselves.

1. See Where Do Stories Originate? from August 3, 2014.

Sunday, August 3, 2014

Where Do Stories Originate?

I don’t know how other fiction writers create their characters and plots. I’ve never taken a creative writing class or attended a workshop or writers conference.1 Maybe there you can learn how to assemble characters from bits of description and personal traits, like dressing a Barbie doll, or assemble plots like putting together Legos or an Erector set. Maybe the business of creating stories and people can be automated and streamlined like that. But it’s not the way I work.

For me, characters and their stories grow slowly. Something about an imaginary person captures my imagination, and then I begin collecting situations and story fragments around him or her, creating a virtual life. Most of the character’s traits, potential entanglements, and intended actions come out of my subconscious. And in order for that to work, I need to have filled my mind and my dreams over the years with bits and pieces of actual life experience and with notions and understandings borrowed from the books I’ve read. The story is then supported by the framework of a society, a world, or a universe built out of my work as a technical writer and corporate communicator in various industries, as well as from my reading selectively and voraciously in science and history.

I think of the subconscious as a deep well located in the darkened chambers at the bottom of my mind. The well is filled with dark liquid to hidden depths. Once I have that core character or ensemble and a hint of action in mind, I then begin assembling the story out of the bits and pieces that float up to the surface of that dark liquid, like answers in a Magic 8 Ball. This is not a fast process, and it is not under my conscious control.

Right now, as I wait for my beta readers to respond to my book about cellular regeneration and extended life, tentatively titled Coming of Age, I’m using my time to begin preparing notes on the next novel I will write. At this point, it’s a horse race between a sequel to my earlier novel ME: A Novel of Self-Discovery, about a self-aware computer virus, and a sequel to the more recent time-travel book, The Children of Possibility. I can’t say which book will come next, but at the moment the weight of material favors the new ME story.

Before I can write, I need to collect notes as they occur to me through this subconscious process. Sometimes they come while I’m driving or in the shower, sometimes while doing chores. Almost never when I’m sitting at the computer, ready to write. That’s why I keep a pen and pocket notebook with me at all times, and stock notepads and pens around the apartment. I can scribble a few words to capture the random idea, and then at my leisure expand on it in the file that I use for collecting, correlating, and commenting on these thoughts. Only when that document reaches a stage of critical mass am I ready to start a second document, which begins outlining the story.

I never have the whole plot structure in mind, complete and fulfilled, at the beginning of this outlining process. I might know where the story starts in space, time, and particulars, but have only a vague and general notion of how and where it will end. In between, all I’ll have is the 30,000-foot view of the terrain and the route between the opening and the ending. But neither I nor the characters will have walked the route at ground level and know exactly what to expect.2

Indeed, much of what comes later in the book depends on the exact nature of each character’s perceptions, insights, and motivations as he or she goes through the story and develops knowledge of it. Sometimes an intended ending just doesn’t work out as the story develops. Usually, this is because my first thoughts are generally less complete—less subtle, less developed, less real—than the thoughts that occur as I put my mind into envisioning mode and write the actual words on the page. Only at that point do I see the story clearly as it happens. And what comes from my fingers at this stage, which I call “production writing,” is almost a direct line from the subconscious, casting up the summation of the bits and pieces I’ve been ruminating over—or “noodling,” as I call it—for months and often for years.

Sometimes, in the middle of the story, I will stumble on a plot or character problem. It’s usually in the nature of “How did she know that?” or “What did he find when he went there?” Two parts of the story—one extending forward from the place where production writing has currently stopped, the other extending backward from the parts of the story I can imagine but not yet see clearly—have reached a disjunction. I need an idea, a bridge, a device, or a bit of action to connect the two. And the only thing I can do then is petition the dark gods who operate that inner well. I toss the problem into the pit and must wait for an answer to float up to the surface.

My great good fortune is that something always comes back. The subconscious, like the dark gods or the Magic 8 Ball, produces its bit of flotsam, usually in a day or two, and it always fits, fixes the problem, and advances the story.

I don’t know what “writer’s block” is for other writers, but for me it is the wait beside the well. Either I don’t have critical mass behind the characters and their stories to begin the processes of outlining and production writing, or I’ve reached a disjunction in the story or character and must wait for the subconscious to do its magic. This is not a matter of inspiration—which implies getting into the mood, feeling a spark, or gaining some kind of spiritual energy—so much as it is a process of materialization. I cannot sit down and force the process by simply determining to write. I’ve tried that, and what it produces is words but not story. Forced writing, sitting down and pounding the keys looking for the story, generally disgorges huge amounts of peripheral description, such as three thousand words describing the bricks in the path or the leaves on the tree growing beside it: stuff, not story. And that stuff just has to be revised, cut back, or ripped out when the actual story arrives in my brain.

Writer’s block for me is not laziness or stupidity or not feeling like writing. It is sitting down at the keyboard, ready in all other respects, with the time and energy and intention to write, but the well is empty. I drop in a stone, and nothing comes back but a dry rattle.

People who write nonfiction for a living, as I did for many years as a technical writer and corporate communicator, may not be familiar with this kind of writer’s block. They will have completed their interviews with contributors who have the knowledge and viewpoints they need. Or they will have watched the process to be described as it is performed out on the factory floor or in the laboratory. They have their notes before them, and they have a deadline. When I’ve been in that position, all I had to do was select a starting point—a question, a viewpoint, or an observation that would frame the issue at hand for the reader—and the writing process would begin. Words would come, the structure develop, and the article move forward at a speed of about 1,000 words an hour.3

Fiction writing is different. If I’m stuck with a plot problem or a character issue, it’s like having incomplete notes. But I can’t just call up the contributor to clarify the point, or go back to the factory to watch the machine run again. I must petition the dark gods to put something in the well. Again, materialization rather than inspiration.

So why do I write in the first place? Heaven knows, it’s not to get rich or even to make money. If it were, I would find a way to trick the subconscious or discard it entirely. I would study the marketplace and current reader demand, then devise a character, a plot structure, and a “franchise” framework for serialization under which I could ring the changes and produce a string of popular heroes in bestsellers. But I don’t know how to do that. I can’t order the dark gods to produce like that.4

Instead, an idea or a character or a story fragment has come to mind that seems so compelling, so real, so vital, or so interesting that I can’t just let it sit there in an undefined state, as a bit of unformed probability. My mind plays with it, walks away and comes back, teased by the thought that it would be so cool to have this notion become a definite story, a string of words that starts in one place, travels to another, and ends up somewhere else. The urge to make it real and accessible—a formulation in words that will play like a movie inside the reader’s head and lead him or her to thoughts never before considered and emotions never before felt—is so strong that I will freely spend hours, days, months, and sometimes years to make that story and the book containing it come to life.

I write for the same reason that people paint pictures, carve statues, work physics problems, or build model railroads—because I can’t live with such an attractive idea left in a half-formed state, and the result would be so cool for someone else to come upon it and say, “Ah!”

1. But I did major in English literature at the university, and that used to count for a lot.

2. See On Outlining a Novel from May 13, 2012.

3. When I know what I want to say, I write fast.

4. I tried it once, to consciously create a thriller that would have potential as a bestseller. That produced the novel Trojan Horse. But the dark gods who guard the well are jealous and worked to create something strange and different—not a bad book in itself, but one with no more bestseller potential than any other that I might write.