Sunday, October 27, 2013


We all live in stressful times. That’s a truism. But, after some recent conversations, I have become even more aware of the nature of stress and have come to realize that, at least in the western democracies, we are living inside a huge social experiment—one without controls and with no end date.

Stress takes many forms. At the basic, biological level, stress involves any change in the immediate environment to which a species has become adapted. If the ambient temperature or humidity or other physical factor changes significantly; if the food supply markedly decreases, increases, or alters its nutritive content; if population density increases or decreases, or alters the expected ratio of available mates—the individual will be stressed. Physical stresses can result in lowering the immune system’s ability to fight disease; changing the sleep-wake cycle; changing metabolic rates, with increases in eating disorders and obesity; triggering auto-immune diseases like asthma and diabetes; and even contributing to various cancers.

Psychological stresses can arise as a secondary effect of physical stresses like sleep disorders and disease processes, but they can also occur as a direct result of environmental changes such as crowding, sensory overload, repeated exposure to physical risk, and excessive demands on our attention span and task orientation. Psychological stresses directly affect how we as individuals relate to other people, perform and cope with the tasks of daily living, and deal with marginal demands on our energy level.1

In the overall population, psychological stresses can increase the incidence of extreme individual acts such as violence and suicide as well as create subtler changes in the social dynamic. For example, some scientists are beginning to relate a population’s incidence of homosexuality and other non-reproductive practices—such as heterosexual couples preferring to delay or even forgo childbearing—to that population’s density. The closer we live together, the less we feel the need to bring more children into the mix. These could well be innate biological responses arising from social conditions and their accompanying psychological stresses.2

So what is this vast social experiment in which we’re engaged? I believe it takes two chief forms: urbanization and information density.

Other than Rome at the height of its empire, which is said to have reached a population of one million, cities in the ancient and medieval worlds simply didn’t grow very large.3 Rome managed it through a combination of superb engineering and government planning. They built a web of aqueducts that brought fresh water across the plains from distant hills. They built a web of public lavatories and sewers, ending in the Cloaca Maxima, to take their wastes down to the Tiber. They built multi-story tenement complexes, called “insulae” or “islands,” where unrelated individuals and families lived cheek-by-jowl in separate rooms and apartments. And they organized fleets of grain carriers to bring a seasonal corn ration from Africa and Egypt to feed the population—which then required them to organize a fleet of warships to fight off pirates. Rome grew big because technology permitted and encouraged it.

For the rest of the world, most people lived a rural, agrarian life. Unless you were a Roman legionary, a Persian Immortal, or a member of the Mongol horde, you were born, lived, and died in a village or market town, seldom traveled more than thirty miles from home during your entire life, and had personal acquaintance of perhaps a couple of hundred people.4 And with notable exceptions like Marco Polo and Christopher Columbus, that was the scale and pulse of life for the average human from our hunter-gatherer ancestors right down to about the 1500s. Life was limited, dreary, predictable, quiet—and relatively unstressful. According to various estimates, the population of the entire planet was about 500 million.

Since then, of course, we’ve ballooned our population to about seven billion people—fourteen times as many as we numbered half a millennium ago. Cities with a population of a mere million people today are considered small. We didn’t do this because human birth rates suddenly shot up due to some unforeseen biological change. The number of humans grew for the same reason the City of Rome grew so large—superb engineering and government planning. We’ve learned the knack of providing clean water, relatively efficient waste disposal, dense urban housing, a highly organized economy, and plentiful food and entertainment to people in small geographic areas. And with the rise of technology, we’ve also added chemical and electrical energy to run our productive machinery, light our lives after dark, enable our communications, and distribute our entertainments. Our populations grew in numbers and density not because of any biological cause, but simply because they could.

Still, in a world of seven billion people, a human being with a brain, mind, and attention span evolved for a hunter-gatherer society can usefully know and relate to perhaps a couple of hundred other people. That’s your nuclear family, extended family, self-selected tribe of personal acquaintances and co-workers, and casual contacts with service providers like the mailman, store clerks, and cop on the beat.5

Those are the people you meet in the flesh and know by face and name—the people in your social circle from whom you might catch a cold or the next cycle of influenza. That’s been the social contact radius from the beginning of humanity down to about 1500 AD. But in the fifty years just before that date something happened that would change the game completely: Johannes Gutenberg adapted Chinese block printing with the addition of moveable type and invented the modern printing press.

Suddenly, you were in psychological and informational contact with many more people than just those with whom you could physically meet or exchange a letter—presuming you could write in the first place. Suddenly, that little-known practice of monks and clerics, not even well understood by your betters in the aristocracy—that is, reading and writing—became as essential a human skill as digging a latrine or riding a horse. Where before your life was bounded by symbols in stained glass, occasional pictorial signs above a tavern, and rules laid down by tradition and the spoken words of a king relayed by a herald or the local sheriff, now your life was guided by texts, road signs, proclamations, and legislation. Suddenly your circle of hidden acquaintance—the minds, thoughts, and thinking processes you might come into contact with but not actually recognize as another person—widened dramatically. This was the effect of information density on social organization.

In 500 years, our civilization has widened the exercise of reading from absorbing an occasional new law about a tax or the price of bread, to absorbing monthly, weekly, or even daily chunks of unrelated news and information from far outside your home and village, and then to actually reading for pleasure and immersing yourself in the lives, concerns, and activities of fictional people who never lived in the first place.

In the last 100 years with the advent of radio, and the last 50 years with the advent of television, we’ve multiplied that stream of news and information and fictional acquaintance a hundred- or a thousandfold.

In the last ten years with the expansion of the internet, and now in the last five years with the advent of social networking, we’ve multiplied once again the amount of information and non-personal concern that a person is expected to absorb. Today, the sphere of personal knowledge can encompass virtually every thought, fact, image, and opinion in existence on the planet. We’ve widened our circle of acquaintance to include people we know only by proxy, or simply by face and name, along with whatever else, true or false, they might choose to share through social media.

Before, a person was asked to care about and work on whatever might affect the village’s affairs by way of political decree or economic shift, and accept threats from what could come over the horizon in the way of storm, invasion, drought, and famine. Now, people are made aware of and concerned for wars happening half a planet away and environmental or economic threats a decade or a century into the future.

Before, a person of some prominence had to worry about his or her reputation in the town or the county, among a social circle that was bounded by the human voice and the limited scale of personal acquaintance. Now, a person fears to be censured—but also, in some cases, hopes to become known and praised—at a provincial, national, or even global level. People you don’t know, never heard of, and never will meet in the flesh are now passing judgment on your deportment, your lifestyle, your reputation, your work, your art, and even your continued existence.

Where once you were aware of perhaps three hundred people out of 500 million on the planet, now you can plausibly be concerned with the thoughts and opinions of all seven billion—or at least the subset of those who are electronically connected to the world by a smart phone and a computer.

In our modern, urbanized societies, interconnected by streams of paper and floods of electrons, we live closer to more people and know more people on the periphery of our social contacts by perhaps an order of magnitude. And we are aware of more people beyond those whom we can touch, hear, or see by several orders of magnitude. We are asked to be sensitive and socially aware on a local, national, and global level, and take responsibility for challenges and problems that extend far outside our own lives, those of our family, and even our generation.

Does that add to our psychological stressors? How can it not? The experiment has been running at various intensities for 500 years, 50 years, and 5 years now. And no one can say what new level of complexity or social involvement might be added in the future with some new technological advancement that will extend the experiment by even greater orders of magnitude.

So I would cut the average human being some slack if he or she occasionally snaps in despair, commits random acts of violence, self-medicates, or retreats into the fetal position. The party will go on until the last celebrant standing drops dead on the dance floor.

1. Here I’m using “marginal” in the economic sense of a specified limit beyond which existing rewards, returns, or results can be predicted to change. The “last straw” could be defined as the “marginal” straw.

2. No, this is not a slam at homosexuality, nor an attempt to portray it as a chosen lifestyle. I understand that homosexuality is in most cases an innate condition, present in some form within the psyche at birth. But that is not to say it cannot also be a response to hormonal triggers and social pressures. The tendency may be there in the genes but, as with many other aspects of human life, it may also need psychological and biological help to come into full flower.

3. According to various estimates, the most populous cities from ancient times up to about 1500 AD might have accommodated between 100,000 and 600,000 people. Dense, but nothing like today’s crowding.

4. The other exception to the travel radius was nomadic herders, who might go a few hundred or a thousand miles from season to season. But since they took their family and tribe along with them, their circle of acquaintance was probably even smaller than those with a fixed address.

5. Sure, you work in a multinational corporation with an employee base of 100,000 people—but with how many of them do you actually do business? Perhaps twenty inside your own department. Maybe fifty more outside. Maybe two hundred more—and at a lower level of intensity and frequency—among your paying customers. … Add them to the members of your family, church group, and bowling or softball league, and you’re not dealing with many more people day to day than a serf attached to a feudal farming community.

Sunday, October 20, 2013

Safety Net vs. Hammock

In these times of heated political debate, which seems about to freeze into the magnetic polarity of all-out political war, it becomes harder and harder to determine actual political thinking. Progressives on the leftward pole seem to believe conservatives want a return to the brave new world of the 19th century, where children pick coal off slag heaps or thread spindles in textile mills seven days a week, and canned meat crawls with filth and botulism. Conservatives on the rightward pole seem to believe progressives want to create a cradle-to-grave social state resembling “the real world” of the red pill in The Matrix, where babies are put into crèches, hooked up to machines, and fed the decomposed bodies of their parents through a tube, all the while being entertained with a computer simulation of everyday life.

In other words, your political opponent is a fool, a green meanie, and a tool of Satan all rolled into one.

As a conservative,1 I realize that a certain amount of social programming is essential to a well-run society. I also know that allowing markets to satisfy human desires and letting capital go where it’s needed most is the best way to run an economy. I know this because other methods have been tried over the centuries, from Randite social negligence during the medieval period to Stalinist and Maoist human engineering in the 20th century, and they all crashed and burned. For me, the key question is not socialism yes or no, market capitalism yes or no, but what’s the best mix of rules and incentives to give the greatest number of people the fullest, most meaningful human experience.

I believe in human beings. I know that some are shallow, fickle, and stupid; some are conniving and vicious; and some are saints whose feet hardly touch the ground. But most people have been pretty well brought up, tolerably educated, and made aware of their life choices. They tend to seek peace in their lives, an opportunity to find work that fits their talents, and the freedom to make a home for themselves and their children. They also want to drink beer and root for their favorite team on weekends—or drink wine and attend the symphony—and make a personal contribution to their company and their community the rest of the time.

All of that requires a certain amount of self-awareness, a willingness to work and struggle as well as dream and hope, and a not inconsiderable amount of luck. Some people, with the best intentions and training, still fall on hard times through bad luck, ill-considered choices, and sometimes personal folly. Some get sick and fall behind in the role they have carved out for themselves in society. And some are born with deficits of nature and nurture that stack the deck against them from the beginning.

For those who’ve lost the ability to cope, or never had the opportunity to try, lectures about hard work and self-reliance are not a solution. When you’ve fallen off the high wire, someone on the ground shouting “Keep your balance!” is hardly helpful. So, even as a fiscal and social conservative, I want to see a safety net in place. The market-based, capital-infused economies of the western world are rich enough—that is, they produce enough extra value—that they can take care of their citizens who stumble. This is proper.

But my preference, as a social and fiscal conservative, is for a taut, springy safety net from which those who fall can bounce back and find their feet, rather than a soft, comfortable hammock in which they can lie back and put up their feet. This is my preference, not because I’m a cruel person who wants to see poor people dumped onto the cold, hard ground, but because I believe the nature of human beings is to find what they’re good at, gain confidence by doing it, and make their own way in the world. A person who is challenged and succeeds is happier than one who never knows his full worth. This is built into our natures through a million years of hunter-gatherer wandering across a hard plain where ripe berries don’t always fall off the bushes and rabbits don’t come up and beg to be snared.2

One of my cultural icons, Robert A. Heinlein, once warned, “Don’t handicap your children by making their lives easy.” He understood that life is usually hard and unforgiving. Even the best of us has to watch where we put our feet, count our pennies, save for a rainy day, and keep our powder dry. Mother Nature is unforgiving of fools. By extension, any society that nurtures fools is going to get a lot more of them,3 and that way lies madness, poverty, and a hard time for everyone.

Of course, life sometimes gives a person insurmountable challenges. I’m not in favor of seeing people who are overmatched by circumstances—through bad luck, bad timing, medical necessity, or even willful choices—degraded. That’s the purpose of a safety net, to allow people who get a bad deal or make bad mistakes to recover their poise and move on. And in circumstances where that poise is lost forever—such as chronic illness or incapacity—a measure of support is the obligation of a well-run society.

But to assume that a large fraction or even a majority of citizens cannot survive without support is to exercise the discrimination of soft expectations. People—especially children and young adults—will perform to the level required of them. To require too little as a parent, teacher, first boss, or drill instructor is to doom the person to a lifetime of underperformance, near if not total failure, and eclipsed dreams.

Human beings are designed by evolution to respond to challenges. That is, the higher functions of our cerebral cortex have evolved not just to learn and remember, but also to project and evaluate, to probe and plan. We evolved to figure things out, to foresee the future and imagine our place in it—whether that future is what we’ll be doing next weekend or what kind of career path we’ll follow as an adult. We evolved to develop and accept personal goals, to strive toward a future we can imagine. Of course, some of that future will have been suggested by parents, teachers, and early heroes and role models. But in the end, we make it our own.

To deprive a person of this opportunity for growth and development, to put him in a cocoon of soft expectations and guaranteed results, is to make a chattel of him. He might think he has the world all figured out, that he stuck in his thumb and pulled out a plum. To him, it looks like a fair trade: be good, be docile, be content, and the effortless rewards will keep on coming—cheap living space, cheap and plentiful food, free entertainment, and a modest stipend with which to exercise a limited imagination. But the person in such a cocoon is living a life designed for him by others. And there is usually a catch: obey our suggestions, support our goals and initiatives, vote our way, hate our enemies, and—when called upon—fight and die in our wars. Time to pay the piper.

For many people, this might be enough. But not for me, and not for the people I know and love and respect.

The issue of safety nets, social responsibility, and personal goals and opportunities will only become more acute as we move forward with the technical revolution that is now shaping life in the western democracies. As machines and automation provide more of the daily necessities, and assume tasks that once could be done by relatively unskilled human hands and uneducated human minds, the old jobs, work patterns, and means of survival will certainly change.4 In the future that’s now rushing toward us, we will all be struggling to find meaning in our lives and our niche in society.

The questions and defining options for our society are only going to come harder and faster. And so it would be helpful if we could see beyond the current political positions based on fear and fantasy and actually discuss what we—as a people, a society, a social organism—are going to do.

1. Actually, I come out near the center of the four-quadrant chart sketched at The Political Compass, which is based on a series of questions. Out of ten steps in any direction, I show as 0.62 toward “right” on the economic scale and 2.97 toward “libertarian” on the social scale. This is not because I cannot make up my mind between the poles of right and left economically, or libertarian and authoritarian socially. Instead, I strive for balance, equanimity, a blend of individual and social goods, a middle way.

2. Of course, a thousand generations of hunter-gatherer wandering has also ingrained a primitive sense of social concern into human nature. Families, tribes, and small companies of travelers know how to share and take care of each other. See When Socialism Works from October 10, 2010.

3. Peter H. Diamandis, chairman and CEO of the X PRIZE Foundation, has formulated Peter’s Laws, No. 19 of which—although I’ve heard this in other contexts, too—says, “You get what you incentivize.”

4. See Automation, Work, and Personal Meaning from February 7, 2011.

Sunday, October 13, 2013

Why Own When You Can Rent?

Many conservatives believe that, under the current progressive administration and majority in the Senate, this country is headed for some kind of socialism or even a kind of Marxism. Although these political brands have been tarnished, if not entirely disproven, by the fall of the Soviet Union, the failures of East Germany and North Korea, and the sotto voce turn toward capitalism in the People’s Republic of China, conservatives still believe that dismal, dystopian place is where the progressives want to take us all.

In the conservative mind, the model would be the sudden turn toward socialism that Great Britain made right after winning World War II. Then the government subsumed all hospitals and most physicians under the umbrella of the National Health Service and operated them directly out of ministerial departments. It nationalized major industries, turning private enterprises into British Airways, British Petroleum, British Telecom, British Steel, and other national industries run by the government after the fashion of the British Broadcasting Corporation. Capitalists out! Government ministers in!

Of course, Britain has since de-nationalized many of these industries. In fact, the only people still advocating and practicing this kind of state-owned socialism are third-world banana republics like Venezuela and Bolivia. And while I have no great love for progressives and their philosophy, I won’t make the mistake of calling them stupid.

We may be headed for—and in some cases have arrived at—an entrenched politics of personal dependence on government programs, high taxes on those who actually make a product or offer a service, wealth redistribution from the haves to the have-nots, and government control of all aspects of the economy. But we are not going to see a U.S. Airways,1 U.S. Petroleum, U.S. Telecom, and so on.2 What the progressives have learned from the British and the other nationalizers is, why own when you can rent?

Owning and running a company is hard work. You have to actually provide a product or service. You have to deal with suppliers and distributors. You have to keep customers happy—or at least not in revolt. You have to keep books and, if not show a profit, at least not create a scandalous loss. You have to take responsibility.

Of course, taking control of the corporate cash drawer is attractive. If you’re the lucky government minister appointed to run the national company, you have opportunities to engage in a bit of the “champagne and bad management” that you know those filthy capitalists formerly got to practice. But if you don’t have the cash drawer firmly in your paws—if you have to watch a government colleague operate the business from your seat somewhere beyond the fence line—then you have concerns. How do you know the man in charge will faithfully execute the public trust? Who’s around to watch him any more closely than the former government regulators—who have proven to be pretty fallible of late?3

And then, we have a long history of fighting monopolies in this country, going back to the robber barons of the late 19th century with their cartels, trusts, and interlocking directorates. For a hundred years, the government has been telling the American people that Big Business is bad, unfair, and anticompetitive. And within living memory the Justice Department has gone after AT&T, Microsoft, and others to prove the point. It would be a little hard to sell the public now on the virtues of putting all the economic resources of an industry under one management’s control.

But they don’t have to! The U.S. government has its own “interlocking directorate” of regulatory agencies to manage all the players in an industry. The Securities & Exchange Commission and the Internal Revenue Service oversee their finances. The Commerce Department regulates their inter-business relationships. The Environmental Protection Agency makes sure they live lightly on the land. Federal and state consumer protection agencies look out for their customers. The Department of Labor, Equal Employment Opportunity Commission, and Department of Health and Human Services look out for their employees. And if the business is in a specialized industry like energy production, transportation, or pharmaceuticals, a host of particular agencies and regulatory boards have a hand in its affairs.

Even the Affordable Care Act, which will eventually take control of the health care industry, has no present mechanism for acquiring ownership of hospitals, laboratories, and pharmaceutical companies, picking up the contracts of doctors, medical technicians, and administrators, and managing the actual provision of health care services. Instead, it will manage the business of others by mandating the services they must provide and the price that the government as the ultimate, eventual single payer will offer. Figuring out how to survive and thrive between the two—between mandated product and regulated price—will be the job of the independently capitalized companies.

Why own the company and take on all the trouble of making it work, when you can simply write rules and regulations that mandate how the business will operate, what it will offer, whom it can employ, what resources it can acquire, and how it will use them? In the progressive dream, the money that the federal and state governments pay out to deserving citizens in the form of entitlements comes, not from some elected official’s or non-elected bureaucrat’s own pocket, but from the wealth of other citizens which has been taken in taxes. In the same way, the government can offer society the benefits of its own view of safe products and services, fair employment, and appropriate use of resources by regulating all aspects of industry and the economy. That it does so by playing with other people’s money—the investments of stockholders and the financing provided by banks and bondholders—is of no concern to the bureaucrats.4

Like the renter from hell, who clogs the plumbing, burns the carpet, and punches holes in the walls, then sues for the return of his damage deposit, it’s not his concern, not his property, not his responsibility. Who would want to take responsibility for actually running the economy and making it work when you can simply manage it?

1. At least not as the kind of mega-company that owns all the planes, employs all the pilots and flight attendants, and runs all the scheduled flights into, across, and out of the country. But a smaller U.S. Airways will still exist, and the Justice Department will still block its proposed merger with American Airlines on anticompetitive, antitrust grounds.

2. Even when the federal government bailed out General Motors and Chrysler at the start of the Great Recession and acquired part ownership in them, the intent was never to subsequently snap up Ford and then create some kind of national automobile company. The government remained a relatively silent partner in the two firms and easily permitted them to “buy their freedom”—with the government’s GM stock sold at public offering and its share in Chrysler going to Fiat.

3. Think of how diligently the Securities & Exchange Commission followed Bernie Madoff’s career, or how effective the government was in tracking the financial ball of yarn in the mortgage funding market that came unwound in 2008.

4. Some regulators such as the Environmental Protection Agency are, in theory, required to perform cost-benefit analyses of their proposed regulations. Of course, any such analysis will depend upon the values one places on the intended benefits and the estimates one makes of the expected costs. This is how the delta smelt—essentially a bait fish, although a rare one—came to outweigh the agricultural output of California’s Central Valley, where one percent of the nation’s arable land raises—used to raise—eight percent of its produce with a value in the tens of billions of dollars.

Sunday, October 6, 2013

Rules for Writers

I am Facebook friends these days with a number of writers, and many of them have recently been posting links to blogs that offer rules for writers. Most of these bloggers catalog offenses they feel should never be committed. Along the same lines, back when I was attending science fiction conventions, many panels on authorship offered rules on how not to write. And finally, early in my career I participated in a number of writers’ groups, and one or more of the members would always criticize on the basis of rules that had been broken in the manuscript under discussion.

What sort of rules am I talking about? Not the generalities of writing clearly and naturally, avoiding excessive detail and the straining of artifice, being mindful of the reader’s time and attention span, using correct grammar, capitalization, and punctuation, and other matters generally raised in high-school English classes. The rules that get posted these days are specific, pointed, and implicitly refer to usages that the poster has seen too often in the books and manuscripts he or she reads.

Off the top of my head, and without intending to reveal sources, here are some examples:

• Don’t use adverbs. They’re a tool of lazy writers.

• Don’t use adjectives. Instead, just find a noun that best describes the object.

• Never use “book-said-isms.”1 To indicate who’s speaking, only use the plain verb “said.”

• Don’t tag lines of dialogue with the verb “said” to indicate who’s speaking. In a talk between two people, the reader can easily follow the back and forth. In a conversation among three or more, craft the speech pattern of each person—through modulation, word choice and order, accent, and so on—so that the reader will know who’s talking at any moment.

• Don’t use colorless verbs, but instead find action words. Characters shouldn’t “move” across a room; they should “creep,” “pace,” “glide,” or “stride.”

• Don’t use lists. Instead, pick out the two key elements and let them stand alone for what you mean.2

• Never use exclamation marks. The state of terror or excitement should be obvious from your choice of words.

• Never use ellipses. For any pause, a simple comma is sufficient.

I certainly agree that any of these elements, when used too often, will intrude on the reader’s enjoyment and distract from the reading. A good craftsman knows all of his tools and uses them at the appropriate time.3 If your only tool is a hammer, your prose is going to read like bang, bang, bang!

These injunctions are like telling a composer, “Don’t write a part for the piccolos. Use the flute section instead.” Or “Never use the glockenspiel or the triangle.” It might create some unusual music to force the flute section into the really high notes, or to task the violins or trumpets with making bell-like sounds. But as a general rule, it’s better to find a natural voice for the music using all of the instruments in an orchestra.

Each of the rules above requires the writer to take extra steps, to go out of his or her way, to avoid the hated construction. Adverbs can always be replaced with some longer construction, or sometimes with a more descriptive verb: so “walked slowly” can become “walked at a slow pace” or even “limped”—if you want to give the character an enduring handicap. But the shorthand of adverbs exists for a reason. Similarly, in German there may exist a single word to designate a “little red car,” but in English even the most descriptive term—say, “roadster”—doesn’t quite do it. Specificity captures the reader’s imagination but, to paraphrase Macbeth, ’twere best done quickly.

In dealing with dialogue, it may be useful to create a unique speech pattern or a modest accent for one or two characters that suggests their origins or class. But not all characters need such designations, as most readers will hear the speaking parts with an inner ear that supplies its own familiar accent.4 Trying to give every character a unique speech pattern really only works if you are writing a conversation among a Scotsman, a Welshman, and a Cockney, unless you want to sink into the low vaudeville of Fu Manchu dialects and Irish brogues. And many readers will not be so attuned to speech patterns that they could follow a three-way conversation based solely on word choice and order without some help from “John said” and “Mary said,” or their equivalents in stage business.5

I do have a couple of rules for writers. One is don’t be obvious. It’s a big, wide, unique, and beautiful world out there. Try to capture it with your descriptions and actions. Make it come alive. Another is don’t be tedious. Use your words to focus on people, objects, and actions that move the story along. If a bit of description or dialogue isn’t helping to build character or advance plot, tone it down or knock it out—or twist it so that it eventually does its job.6

And finally, don’t let the writing get in the way of the story. If you are torturing your verbs and phrases to avoid simple usages like “he walked slowly,” you run the risk of wasting the reader’s time. Worse, you can come off sounding like an amateur who’s trying too hard: “Look, Ma! I’m writing!” Also, working with constant reference to the thesaurus is simply a habit we all fall into and eventually learn to break.7 A simple, direct, and more or less colloquial style is the best approach to putting words on paper or on the screen.

But these are rules any writer can learn by reading with a keen ear and editing his or her own work with a critical eye. These are good writing habits rather than specific injunctions against any particular word form or construction.

For the rest, I favor the response of Dr. Emmett Brown in Back to the Future: “Rules? Where we’re going, there are no rules.”

1. These are replacements for the verb “said” that apparently are only used in a low literary context: “Whoa!” John shouted. “Enough,” Peter growled. “Get over here!” Simon barked. “I didn’t mean it,” Mary sniffled. According to the rule, these should all be “said,” “said,” “said,” and “said.” The rule maker would even forbid the occasional use of “asked,” “ordered,” or “whispered,” which actively indicate the speaker’s intention or tone of voice without invoking animal sounds.
       Of course, I would avoid using words that are not actually related to the act of speaking, such as “laughed,” “giggled,” “wept,” or “sighed.” Extraneous noises or necessary histrionics can be shown by adding, for example, “with a laugh” or “with a giggle” to the verb “said,” or by including these actions in a separate sentence accompanying the dialogue.

2. So—and incorporating more than one rule here—instead of writing “Two all-beef patties, special sauce, lettuce, cheese, pickles, onions on a sesame seed bun,” you should simply write “Two patties and sauce on a bun.” The reader will get the idea.

3. While I’ve been beguiled by Emily Dickinson’s poetry, it’s usually presented to the world after careful editing. If you’ve ever looked at her original manuscripts, you can see that she knew only three pieces of punctuation: the dash, the exclamation point, and the period, which is saved for the end of the whole poem.

4. In The Doomsday Effect, I had the character of Jason Bathespeake talk with an impediment, reflecting the fact that his speech centers were no longer connected to his vocal cords but to a mechanical synthesizer. In the manuscript, I had carried his odd speech pattern through every line of dialogue right to the end of the book. A wise editor suggested that I use the quirks for a chapter or two, then let the pattern fade out, with only an occasional reprise. This would approximate the effect of knowing or working alongside someone with a speech defect or accent: sooner or later the novelty wears off, you understand the speech as natural to the person, and you no longer hear the differences.

5. Stage business is a playwright’s term. I use it to mean some bit of action that accompanies dialogue to indicate who is speaking. “ ‘I see what you mean.’ John took out his cigarettes and lit one. ‘And how does that affect us?’ ”

6. Sometimes what seems innocuous or even banal at one point in the story will be shown to have significance later on. That’s a key tool of the mystery writer—and all of us, at some level, are presenting a mystery for the reader to discover and interpret.

7. Oh, I use Roget’s in my writing, but usually in the context of knowing that the word I have in mind is not quite the word I want. Then it’s helpful to look at some synonyms. Very occasionally, I might have written myself into a trap where I have to use, for example, the word “door” six times in three sentences. Then, if I can’t gracefully use the pronoun “it” for occasional relief, I might look into Roget’s for a synonym. But most of them are going to be a bad fit. Going “door … door … door … door … portal” is worse than succumbing to that final “door.”