Sunday, March 31, 2013

Son of a Mechanical Engineer

My father was a mechanical engineer. He was born in 1912, at the beginning of the internal combustion engine’s remaking of America but when blacksmithing was still a recognized profession. By the time he went to the university to study engineering, in the early 1930s, the curriculum was a curious mixture of old and new. In one course, he was part of a team with the hands-on assignment of improving the performance of, or “souping up,” a V8 engine. In another class, he had to learn ironwork and as a final exam forged a chain of iron links, which he said taught him more about the ductility of metals than any amount of theory. He didn’t know at the time that a universe of technological wonders still lay ahead of him—or maybe he did. In the course of a long professional life, he worked with radar systems, nuclear fuel technology, and digital computers. I am proud to be the son of this man.

Growing up as a boy in the 1920s, my father was passionate about aircraft and flying. He idolized the barnstormers of the era, collecting pictures and articles about them in a scrapbook. He made drawings of their aircraft and even wrote poetry on the subject.1 His dream at the time was to become a pilot—until he reasoned that, once the novelty wore off, airline pilots would simply be “bus drivers in the sky.” Then he decided to become an engineer instead.

Late in the 1920s, in rebellion against an overbearing father,2 he ran away from home. He hitchhiked from New York’s Hudson Valley out to California and worked a whole summer on a ranch. I never knew how he got back, but I assume it was the same way, by thumb. He was an independent spirit and didn’t care much what other people thought or said. I don’t think he was afraid of anything, ever.3

My father taught me to be curious about how things worked. He fixed my lifelong obsession with machinery, in particular things with blinking lights, screens, and keyboards, by creating my first toy when I was three years old. It was a box whose front was covered with toggle switches and colored lights. When I threw the switches in certain combinations, the lights came on in certain patterns. These patterns didn’t mean anything in particular, but they fascinated this toddler. They also taught me, almost subliminally, that certain things work in certain ways—if you can only figure them out. Wasn’t that an example of pedagogical genius?

After graduating from the university, my father worked at Bell Labs during World War II, developing waveguides for the new detection technology, radar. Right after the war, he and four friends from the lab formed a group they called “McWellworth”—for McCoy, Williams, Elliott, Worley, and Thomas—and built the first portable tape recorder. It was based on recording technology captured from the Germans, who used iron oxide–coated paper tape to record the magnetic impulses from microphones. But where the German machines were the size of a desk, the McWellworth machine was the size of a suitcase—a big suitcase, to be sure, weighing about 25 pounds, and that was just the recorder, because the amplifier, speakers, and other parts of the system came extra—but it was portable nonetheless. The machine used the motors from aircraft hydraulic pumps, vastly geared down, to spin the tape reels. So when you put the machine into reverse or fast-forward, it ran at about 3,000 rpm!

My father was always interested in electronics, and I knew the Radio Shack store in Boston before it grew into a national chain. Ours was the first family in our neighborhood to have a home stereo system. My brother caught the electrical bug, as well as the photography bug. He built many home electronics kits and understood all about electrical circuits—much better than I ever did. My father installed that McWellworth tape recorder in our home stereo, although the recording and playback were still monophonic. That setup became my second technological toy. Listening to tapes taken from radio broadcasts and later from friends’ long-playing records, I learned to love the music of Guy Lombardo and my first Broadway play, My Fair Lady.

During the early 1950s, my father worked at Sylvania on Long Island to develop one of the first plants for processing nuclear fuel. The story in the family was that when the factory was ready to start up, my father ordered the other workers outside and went inside himself to throw the switches that initiated the automated equipment. That way, if there was an accident, he would be the only one hurt. It’s no wonder I came to think of my father as some kind of hero.

Also at Sylvania he worked on a project called “MobiDiC” —for Mobile Digital Computer. This was a contract with the U.S. Army to create a “portable” computer to solve problems in fire control plotting among artillery batteries in the field, at a time when digital computers filled entire basements with vacuum tubes. The new computer consisted of four truck trailers—one each for CPU, input-output, memory units, and power supply—that were backed into a cross configuration to make a single system. I find it ironic that, toward the end of his life, my father ran a small business, making draperies for large hotel and office complexes, and kept the books with a computer that fit neatly on a desktop.

He taught me how to tighten the lug nuts on a car, or any series of screws or bolts, like those holding up a set of door hinges or holding down a table top. First, tighten each pair of nuts on opposite sides—like those at the twelve o’clock and six o’clock positions, then at two and eight, then four and ten—and take each one down to just finger tight. Second, repeat the order of tightening to snug all the nuts up with a wrench. Finally, tighten to specification in that same order. This way, the wheel or bolted piece sits straight and firm.
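
If you like to see a rule written out, that opposite-pair logic can be sketched in a few lines of code. This is my own illustration of the pattern he taught (the crisscross_order routine is hypothetical, not anything my father ever put on paper):

```python
def crisscross_order(n_bolts):
    """Tightening order that always jumps to the bolt roughly opposite
    the last one, so the wheel or bolted piece is drawn down evenly.
    Bolts are numbered 0..n-1 going around the circle."""
    order, seen, current = [], set(), 0
    while len(order) < n_bolts:
        order.append(current)
        seen.add(current)
        # jump about halfway around the circle, nudging forward if that spot is done
        nxt = (current + n_bolts // 2) % n_bolts
        while nxt in seen and len(order) < n_bolts:
            nxt = (nxt + 1) % n_bolts
        current = nxt
    return order

# Six nuts, with 0 at twelve o'clock: the order comes out 12, 6, 2, 8, 4, 10.
print(crisscross_order(6))   # [0, 3, 1, 4, 2, 5]
# Run the same order three times: finger tight, snug with a wrench, torque to spec.
```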

He taught me how to cut a board, a tree limb, or a piece of pipe by making a small nick with the saw blade at the measured place, then improving it with longer and longer strokes, and to saw right through to the end, supporting the sawed-off piece with a brace or with your hand. This way, the sawed end doesn’t sag away and take with it a splinter of uncut material.

He taught me to think things through and to see the consequences of what I was doing. And, also, to see what others were doing and the consequences of their actions as well. This is not only the beginning of understanding your world but also of foreseeing and predicting the future—not to mention being a good groundwork for strategic thinking.

He taught me about efficiency: that the simplest and least involved way to do a thing, consuming the least energy and with the fewest moving parts, was also the most reliable and elegant way to do it. If engineering can be considered an art, this is its underlying principle.

My father was a sketch artist who drew portraits, and at least one nude, accurately if not with great feeling. His chalk drawings of two young boys—one flaxen-haired, one dark, and both drawn years before my brother and I were born—were eerily prophetic of the children he would have. His drawings, however, tended to have a sleepy, almost “empty” feeling, like looking at images of ghosts. He was an engineer first and an artist a distant second.

He tried to teach me mathematics, but I had a head like a rock. My mind works in words and pictures, and the relationships connecting them, while numbers have no particular flavor for me.4 I think he wanted at least one of his sons to follow in his footsteps as an engineer, but he was ultimately disappointed. At the end, though, I think he was proud that I became a writer, because he also liked to read. Still, I think some of his influence did come through in my writing, because I tend to concern myself with story structure and mechanics as much as—perhaps even more than—with the feelings and emotions of the characters.

As I was growing up, he filled our house with good books and magazines, and with music. He subscribed to the sort of publishing house reading programs, and record company listening programs, that were popular in the 1950s and ’60s. Under a subscription, the company would keep sending you so many books or classical records each month for as long as you paid the bills. I used to think the books and records were just for his use, although he didn’t mind if my brother and I became attached to them. I realize now they were meant mostly for us. By simply walking around the house, keeping my eyes open, and occasionally picking up and reading an article or a book, or playing a record, I acquired a big part of my artistic education. It also set a lifelong bent toward self-education and a love of literature and music.

My father hated waste and senseless destruction, and he would go out of his way to preserve a thing of beauty for its own sake. On a boating trip to Canada, my family was out walking on a promenade along a shoreline protected with armor rock. A beautiful mahogany speedboat, totally unattended, had just broken away from the dock at a marina out on the point and was drifting onto this rocky shore. My father climbed down over the rocks, waded into the water, and held off the boat until someone could get in touch with the marina to send a launch to retrieve it. He broke a finger wrestling with that errant speedboat. This also taught me something about civic duty.

He always had a project going. During his life he bought several boats—small cabin cruisers, small and large sailboats, and a speedboat—in part because he loved the water but also because he loved to improve things. Over the years he designed and built a flotation device for the open-cockpit sailboat and various swim ladders for the cabin cruisers, and he always repaired and worked on the engines himself. When he wasn’t working on a boat or tuning up one of the family cars, he had a project going in the house: building a laundry chute from the upper floors, repairing a lamp, recessing a refrigerator into the wall of a crowded kitchen. Late in life he was still tinkering: building model gliders and steamboats, and trying his hand at stained glass work. He even took up sailplaning and started a course of piloting lessons—finally realizing that childhood dream—until his eyesight became problematic through retinal degeneration and he had to be grounded. He taught me you are never too old to try something new.

I never became an engineer, but my father’s influence meant that I could work comfortably with engineers and scientists as a technical writer and editor. I have a hunger for understanding things, and I’m not shy about asking questions, proposing hypotheses, and sticking my fingers where they don’t belong. I am the son of a mechanical engineer and damned proud of it.

1. Not very good poetry. I remember one verse about an airplane crash, which ended: “… take the crankshaft out of the small of my back, and build up the engine again.”

2. My grandfather was a civil engineer. He poured the concrete for the dome at the Massachusetts Institute of Technology and, according to family legend, helped work out the stress tables for prestressed concrete. He was also an inveterate hobbyist who explored beekeeping, pigeon racing, and winemaking—and that’s surely where my father picked up his lifelong involvement in hobbies, because his father always had him work on these projects—usually the hard and dirty labor. But where my grandfather had a heavy hand, my father had a much lighter touch—probably in reaction to his own boyhood.

3. One of our family’s extended boating trips took us up the Hudson River to Lake Champlain in a tiny, 26-foot cabin cruiser built on a Steelcraft launch. This was in 1954, when Hurricane Carol swept up the East Coast. That storm was responsible for knocking down the steeple of Old North Church in Boston, but its effects were felt as far away as the Hudson Valley. We were on the lake that day, in hurricane winds and waves twenty feet high. My father and mother fought the storm all the way down the lake and anchored in the lee of Fort Ticonderoga. I stayed awake through the whole experience—holding the cabin door closed against buffeting winds—and I believe I understood at some level, even at the age of six, how close to death we came that day. Thinking back on it, since then I’ve not been afraid of much, either. For that, I also credit my father’s influence.

4. To this day, I tend to misspeak my numbers. Perhaps that’s because my mind gets confused by the fact that most amounts can be phrased in different words but mean the same thing: $4,600 can be called “four thousand, six hundred dollars” or “forty-six hundred dollars.” If I think about it, I really do understand the amounts involved. But in casual talk I’m just as likely to call $46,000 “four thousand dollars” as “forty thousand.” Just no feeling for numbers at all, and even less for orders of magnitude. But I do admire math, even if I can’t operate it. I was terrible at music lessons and the mathematical relationships between notes and chords, too—although I love to listen to music and respond to a chord change as an emotional experience.

Sunday, March 24, 2013

Information Saturation

I am repeatedly amazed by how different our lives are in the 21st century from the way people lived even 100 years ago. We can count through the advances: the increase in available energy per capita; proliferation of small motors and computer controls in daily living; rise in air conditioning and climate controls in the home and workplace, and in casual places like restaurants and shopping malls; advances in medical technology available to the wider population; shift from physical to mental and then to imaginative labor and skill sets; and so on. But today what captures my thoughts is the amount of information that is freely available to the average person. Since the creation of the internet and its adoption in daily life, information saturation leads all the other advances of modern technology by at least an order of magnitude. Photons, electrons, and digital technology have set our minds free.

Consider the life of the average person in the advanced western societies in 1913. Many people, perhaps even most, lived rurally and worked in agriculture. Although the electric light had come to the cities and towns, it was still struggling to arrive in the countryside. Radio had been invented, but it was still limited mostly to the dots and dashes of Morse code—the great radio shows and voice plays would not come into fashion until later, in the 1920s. Wealthier homes might have a hand-crank Victrola and a collection of Bakelite records, but they were mostly for music.1 Many homes had telephones, but their use was limited to emergencies and important business, and many of those homes were also on a party line, where your business belonged to everyone else who might be listening in, up and down the wire.

For news of the day and hard information, people relied on three sources: the local newspaper, books in the home or borrowed from the public library, and local gossip shared after church, grange meetings, and other communal gatherings. While we all cherish the notion of young Americans following Abe Lincoln to become self-educated by reading from the great books by lamplight, I think the reality was much as it is today. Reading books for information and pleasure was and remains an acquired taste. Many have it and cherish their favorite authors,2 but many others were, and are, too tired or restless after a hard day’s work to immerse themselves in a book and instead go off to play games, to engage in conversation—often lubricated by alcohol—or to bed.

All this would change, and the changes would accelerate, after two world wars forced the industrialization and urbanization of western society and culture. But 1913 was still on the cusp, full of potential for information and other technologies but not really exercising their long legs.

Consider now the life of the average person in the advanced western societies in 2013—and of people who might be called “upper middle class” in formerly “third-world” countries like India, China, and much of the Middle East. Electricity, radio and television broadcasts, and the internet are ubiquitous. Moreover, with the ascendance of digital technology over analog means of information storage, the multiplicity of devices for transmitting and displaying text, voices, music, photographs, and video images collapses into a single receiver that, regardless of its shape or advertised function, is actually a small computer with a display screen, speaker, and wireless link.3

Depending on how much existing music, how many paper-published books and film-recorded movies other people are willing to digitize and make available through the internet—not to mention the new music, books, videos, and other content that homebound creatives are preparing and packaging all the time—that single device in the average person’s pocket and in the palm of his or her hand can retrieve and display literally the entire recorded history of humankind, plus the ongoing beat of new advances, discoveries, insights, sights, and sounds. The storage capacity—which lives somewhere else, in someone else’s computer drives and memory, and then who cares where?—is essentially limitless. The access here and now is limited only by the bandwidth of the device in hand and the service to which it is linked.

All humans live in their minds. We think, remember, mull over, and experience only what reaches us through our eyes, ears, and other senses. For the hunter-gatherer wandering in forest and plains, the influx of information was only what one person could see and experience for him- or herself or learn by talking with other members of the tribe and people met along the trail. Once humans settled in villages along the river bottoms, began farming, and started writing things down, the average person’s information base started to include texts written in the past or at a distance, usually by strangers. This level of communication persisted for 5,000 years, with the occasional intrusion of a traveling bard or king’s herald, until Gutenberg started printing on paper with movable type. Then the onslaught of outside information got under way, first with cheap Bibles and a resulting religious revolution, then with popular authors and bound books and periodicals. The flood of new information included texts both great and small, serious and flimsy, full of certified, peer-reviewed absolute truth and outright, outlandish, absolute balderdash.4

But for all this immersion in information, are we better informed?

Certainly, we’re faster informed. Where once news traveled from town to town at the speed of a horse and from continent to continent at the speed of a sailing ship, now it travels at light speed. And, in this new century, it arrives in the palm of your hand rather than at the newspaper or telegraph office. With such speed and ease, first, more of the news gets through. We can hear the details of daily life, the ins and outs of politics and economics, from Europe and Asia, instead of just the major events like the fall of governments and election of popes. And then, we also hear the original story as it was sent, not some compressed and possibly garbled version.

In my generation we have become accustomed to the you-are-there-ness of television and movies, epitomized by the live broadcast of humankind’s first step on the Moon. We know not only the news but also the look and feel of foreign countries and cultures. And more humans have seen the vistas of Mars in pictures during the past decade than ever saw pictures from Antarctica or Mount Everest during all of the 20th century.

But better informed? Certainly, the average person is more literate, more comfortable dealing with abstract ideas, more conversant with a wider world, than at any time in the human past. But all this access to information has not, in my estimation, given the average person a more solid base of real knowledge, something upon which to build a career or a reasonable life. I think there are a couple of reasons for this.

First, as noted above, while we have more information at our fingertips, we also have more misinformation, deception, and absolute falsehoods. The old-time book and newspaper editors, who understood they were catering to a reading class that could be presumed to have a pretty solid knowledge base, knew what they could get away with and where bending the truth too far would lead to a break in the reader’s confidence in the source. But today, when just about everyone is reading and texting and googling and tweeting, a lie or an intentional half-truth might bounce off a few hard heads—even a majority of hard heads—and still be accepted as gospel truth by a largish fraction of the gullible population. So, as an unpoliced content provider in the Wild West of the internet, hey-hey, go for it!

Second, having the entire database of human knowledge and experience at your fingertips does not inspire many people to dive right in and sample the smorgasbord. Rather, human nature assumes that, since the information is already there and can be accessed anytime, why bother looking it up and learning it now? Why not wait until you need it? Consider that a million people probably lived within easy traveling distance of the New York Public Library for most of the last century, and yet how many of them visited on a daily or weekly basis to soak in its riches? Having answers that outrun your curiosity by a factor of a thousand to one is not necessarily a good thing.

Third, not everyone wants the truth. Few of us in any age are on a crusade to know absolutely, positively the exact nature of what’s going on around here. Scientists and philosophers, perhaps, and now and then an accountant, auditor, or government commissioner. But most of us are more comfortable with a truth that fits our personal world view—which in itself has largely been acquired through casual reading and television viewing, supported by a few slap-dash investigations.

Fourth, we are imaginatively projective beings. Humans know that the world’s knowledge base is too much for anyone to know thoroughly. And whatever can be known with that absolute, positive certainty we might hope to achieve is going to be either too mundane—“water is wet,” “the ground is dirty”—or too intellectually abstruse—“E = mc²,” “ontogeny recapitulates phylogeny”—to be of much practical use. For everyday living, which is what most of us do, a mix of knowledge and folklore, practical observation and the cautionary tales grandma used to tell is all we really need. Most of us know enough not to try and pet the little snake with the buzzy tail, but we don’t need to memorize a catalog of the world’s venomous reptiles, the chemical formula of each kind’s venom, and the precise details of its effect on the nervous system.

Fifth, we are inveterate gossips, and nobody likes a know-it-all. Seriously, with all the world’s history and knowledge of subjects like medicine and astronomy, economics and classical literature, Mesoamerican culture and Hindu sacred texts available on reputable, university-supported, scholar-led databases, why do so many of us spend our time on Facebook and Twitter? Because aside from the knowledge of scholars, we hunger for bits and pieces about the people around us. It’s built into our genes. We are people watchers and people talkers.5

All of this is not to condemn human nature, or to wish that we all had higher interests and more serious purpose, or that the internet, the libraries, the publishers, and the promoters of culture should somehow be required to produce better content with more certification. Humans have gotten along just fine for the past 60,000 years (as H. sapiens) or four million years (as various kinds of developing primates) using the wit and intelligence they were born with.

Does that mean all this new technology, this speed, multiplicity, and richness are being wasted? Oh, no! We are all happier for knowing the wealth is out there, within our grasp but not necessarily requiring us to take hold of it. Each of us is free to enjoy as much of the banquet and the passing carnival as we want. And as a wise person once said, “Enough is as good as a feast.”6

1. Victor recordings were how Enrico Caruso became a beloved tenor in America.

2. And for that, as an author myself, I thank them and express my deepest appreciation.

3. For an overview of this convergence, see my blog post In the Palm of Your Hand from October 21, 2012.

4. The popular joke these days is, “They can’t put it on the internet if it isn’t true, right?”—because we all know the internet is full of crap. (And I invoke Sturgeon’s law again: so is 90% of everything else.) But the same warning applies to published books and periodicals in the centuries leading up to the 20th. Book and newspaper editors might be wizards at regularizing style, grammar, and punctuation, but they are as susceptible as anyone else to bias, rumor, and false beliefs.

5. I learned this long ago in preparing layouts with photos for magazines and newsletters. A person’s eyes are drawn to the lightest, brightest colors and highest contrasts in a photograph—unless faces are available in the image, and then the eye goes right to them. One might even guess that the eye is drawn to lightness and contrast because those are the marks of a face against most backgrounds. Our brains scan the surrounding jungle and the city streets for signs of danger, and in most cases the danger comes from the nearest human or any pair of close-set eyes—jaguar, lion, leopard, or other predator—that may be watching you.

6. I can find several references for this quote. I first heard it from Marthe Keller’s Eastern European character in a spy movie from the 1980s, The Amateur (recommended, by the way). But I think the original may go back to an English proverb once voiced by Mary Poppins. It certainly sounds like her no-nonsense approach to life.

Sunday, March 17, 2013

What Makes a Good Book?

Every author wants to know this, as if it’s a secret. For most of us, though, I think the answer is, “Well, what kind of book do you like to read?” That will work as a guideline—except for two problems. One is the issue of skill. I as an author may not have the ability to write the kinds of books I like as a reader, although the attempt will probably make me a better writer in the long term. Still, the kinds of ideas, characters, notions, and pieces of action that come naturally to my mind as a writer may be significantly different from the story elements that I like as a reader.

The other problem is one of surprise. As readers, we don’t know what’s behind the curtain. We don’t see the work that must take place offstage in order for the play to proceed. We may anticipate plot turns and twists, may hope to see a certain kind of outcome, but we are still following the trail of clues, hints, and structural devices that the author lays out for us. We don’t actually know, until the end, how it will all turn out. That suspense followed by surprise creates a pleasurable experience for the reader.

It doesn’t work for the writer, however. In order to lay that trail of clues, the writer must know where the story is ultimately going. Every ravel and stitch, the exact nature of the denouement, and how each character will be paid off, may not be whole and specific in the author’s mind at the beginning of the writing. But the general idea must exist in the structure, at least in outline.1 If you tried to present a play on the basis of a bare stage with the actors all working by improvisation, you would end up subjecting the audience to a pretty pokey rehearsal and perhaps even a couple of hours of tedium.

So the act of writing is not like the act of reading. It’s the difference between watching a play and putting on a production. For the audience, each line is new and fresh; for the players, the script is memorized by rote, counted off in minutes, and delivered under careful direction and stage management.2 “Fresh” gets lost in practicing the art.

But still, the things we like as readers can guide the practice we do as writers.

Characters with Know-How and Resilience

I like characters—at least main characters—who are smart, self-possessed, clever, and resilient. They should be “centered” in the martial-arts, yoga, Buddhist sense: they know the world they inhabit; they know how they fit into that world; they trust their own senses and skills; they have solid relationships based on affection and trust; they have the answers to life’s basic questions. In real life, people like this can actually be asses: smug, independent, self-satisfied, archly superior, and know-it-alls. So the characters I like have a bit of humility, too. They fit into the world, but they understand that the universe is a much bigger place, full of questions and wonder.

Contrary to what they tell you in creative writing class and in writing groups, I don’t need to identify with a character. Not, that is, if “identify” means “find ways that he is just like me.” If that were the case, I could only like male characters with middle-class backgrounds who are entering middle age carrying the life artifacts of a marriage, kinship, and responsibility. The beauty of imagination is that I can—and most readers can, too—find ways to identify with people of the opposite sex or different orientations, as well as different races, backgrounds, nationalities, belief systems, and operative fantasies and goals. As a science fiction writer, I would even go so far as to say different species and material compositions, too.3

The secret, for the reader, in identifying with characters not like yourself is seeing represented in the character the same kinds of needs and urges you feel. You identify with other beings, of whatever kind, who inhabit the human condition. But what is the “human condition”?

To my mind, it encompasses: being self-aware; being able to remember the past and look into the future while staying fixed in the present; being vulnerable to hurt, injury, and death; being susceptible to hope, to needs and wants, to plans and goals, to caring for others; being pitched into conflict with forces greater than oneself and perhaps greater than the limits of one’s available resources; being willing to try. In this configuration, stones and lizards don’t make good characters, and neither do the immortal gods. But pretty much anyone else is fair game in the hands of the right author.

The secret, for the writer, in creating characters that almost any reader can identify with is finding and revealing that human awareness, those vulnerabilities and susceptibilities, those conflicts, and that willingness. The writer’s imagination lets him or her “put on the face and life” of the character, live that character’s story from the inside out, and proceed with confidence through the story’s demands and actions. This immersion in character is one of the reasons I like to write each scene from a single point of view, taking the reader inside the character’s mind and perceptions—even at the expense of repeating some piece of action later from another character’s point of view—rather than dancing from head to head through the story.4

Plots with Energy and Action

In the books I like, things must happen and they must have meaning as they happen. People must want certain things or fear certain things and then take the steps to achieve or avoid them. They may be confronted by evil-doers, villains, or blind bad luck, but they must fight back and fight for something. This is part of that “being willing to try.”5

Of course, that makes it sound as if all good stories must be either pirate adventures or spy novels. Heroic characters fighting arch villains mano a mano over Reichenbach Falls. But those kinds of stories can quickly lose their thread of meaning—especially in the modern thrillers based on Hitchcock’s MacGuffin, where the central object of dispute, theft, pursuit, or desire is either taken for granted or even never revealed in the first place. And action that is reduced to mindless car chases and fireballs, with hurts and consequences that touch neither the characters involved nor the reader/viewer, cheapens the overall experience. Sometimes the most powerful stories can hinge on a single word or glance of ambiguous or misinterpreted meaning. Sometimes the action may encompass a love that waits a decade or more and is refused in a moment of thoughtlessness. Sometimes the most valorous achievement of the human heart is simply to endure.

Whatever real, human people can value, strive for, wager, and win—or lose—may be the basis of a story full of action. Whenever people interact on the basis of liking or loathing, in cooperation or in conflict, the story may pick up energy that carries the reader forward.

Stories with Forward Momentum

Momentum is the key to storytelling. Editors and agents talk about “keeping the reader turning pages.” In whatever format, the story must set up expectations. Those expectations may be fulfilled, but along the way the reader must have cause to doubt. The writer plays the reader’s mind as an angler plays his fish.

How do you set up expectation? Usually, by stopping before the last step. The writer establishes a scene, a path, a direction for the characters and the reader to follow—and then takes them almost to the end. This is going to be difficult for writers who are afflicted with obsessive/compulsive disorder, who must tell each story through to the end at one sitting. The key to page-turning is for the writing to break off, leave the action almost complete, turn to other story lines and events, leave the final step to be discovered later, in other mouths and other minds, in the context of other discoveries.

Another technique is to focus, like the lens of a movie camera, on a certain object, action, or situation that might be important later. Do this skillfully, with good judgment and timing, and the reader will know that something important is being shown but will not know how to interpret it. That sets up the reader’s mind and imagination to wonder what’s going on, how the object/action/situation will be used or will play out later, and how the story might bend or twist so that it adapts to this newfound focus.6

If this sounds like the writer is playing with the reader’s mind … well, of course! Readers come to a work of fiction for the same reason they sit through a magician’s act. They want to be shown something new and different. They want to see something familiar become something strange and wonderful. They want to guess at the secrets behind the show, to think through what they are seeing and feeling, to be tricked with that element of “non-lethal surprise” which is the basis of all humor, wonder, and good storytelling.

The worst thing a story can be is predictable, plodding, and obvious. The worst plots move from A to B to C with all the lines drawn in and connections made. The worst characters start out familiar and bland, have nothing to want or achieve, and end up in much the same state. If the optimistic Candide suffers no hardship and remains in “the best of all possible worlds,” then the writer has not done his or her job. Characters should be made to struggle and suffer. Why? So that we as readers won’t have to.

1. Some writers avoid outlines. They pull out a blank piece of paper or computer screen, and the only solid idea in their heads might be a character’s name or just a glimmer of personality, a bit of setting, and a starting piece of action … then they shut their eyes and let the story come to them, day by day, chapter by chapter. It might work for some people, but I need a more specific outline—which I may then bend and extend as necessary—before I can start walking the path of imagination. I suspect that writers who try the “eyes closed” method are really looking for that element of surprise they get when reading another author’s hard-won work. What their own readers get might be exciting, but it could also be a dribbling bore and not likely to end up anyplace in particular.

2. Don’t ask how many times I might rework a sentence to get the energy and pacing just right. Some thoughts come out of my head fresh and perfect the first time. Some take a dozen rewrites to become even halfway acceptable and still, in my mind, remain provisional.

3. I’ve written from the viewpoint—that is, slipped inside the mindset—of self-aware lines of computer code in ME: A Novel of Self-Discovery and from the viewpoint of an 11th-millennium time traveler for whom 20th-century humans are a cross between Neanderthals and chimpanzees in The Children of Possibility.

4. See my blog post Writing for Point of View from April 22, 2012.

5. It would seem odd, then, that the main characters in my two fictional family histories, William Henry Wheelock in The Professor’s Mistress and Robert Wheelock before him in The Judge’s Daughter, are both mostly passive characters, for whom the most significant events in their lives come to and pass through them, rather than their going out and forcing the action of the story through their desires and passions. Call these books an experiment in the way most people really live: wanting certain things, taking the steps they know they can, but in the end enduring what actually comes their way. That, too, is part of the human condition.

6. Of course, do this too often, or do it randomly without effect—like a drunk swinging the camera around on the set, or the reader’s eye around in the story—and the technique will fall apart. It will become a trick, like another meaningless car chase or fireball: the effect might be dazzling but it remains unsatisfying.

Sunday, March 10, 2013

Scientific Triumphalism and Militant Atheism

As I’ve noted elsewhere, I am an atheist. The universe I inhabit needs neither a Creator nor a Caretaker. The life I live does not need a supernatural Protector and Favor Granter working on overwatch. Like the Buddhists, I can reason out a robust morality based on sobriety, reciprocity, and fairness without the intervention of a supernatural Law Giver.1 All of this, however, does not make me more intelligent, or less gullible, than the people around me who believe in and pray to an all-powerful Being. I don’t despise them for their beliefs. Rather, I tend to think I’m lacking in some essential gene or protein that lets my brain feel the presence of such a Being and acknowledge His/Her/Its reality.

Back when I was a child and then an adolescent attending church and trying to believe,2 I was painfully aware that I was alone. I lived in a community of many churches. All around me were young believers. We said the Lord’s Prayer in grade school. We still pledged allegiance “under God” in high school. The money in our pockets trusted in God. And there I was, some kind of pariah with a tin ear who could not tune in to the dominant paradigm of my society.

As I was growing up, I remember hearing about Madalyn Murray O’Hair and the lawsuits she filed in support of the First Amendment’s separation of church and state. Her suit before the Supreme Court ended Bible reading in public schools in 1963 and, I suppose, foster-fathered the many lawsuits that now perennially seek to remove nativity scenes and Christmas trees from town squares. But Ms. O’Hair was never a hero of mine—in fact, she always struck me as a gadfly and a fanatic.

And yet, in the span of a generation—my generation—her critique of religion has become the national conscience. Not in the way she intended, I think. Back in the ’60s, we all accepted the school prayer rulings in the spirit that reading from the King James Version was probably offensive to our Jewish and even to our Catholic classmates. And if we had any Muslims or Hindus or Confucians in my mostly white, mostly middle class schools, we would have protected their feelings, too. But we weren’t so sensitive about atheists’ feelings because—look around—did you see any atheists?

These days, however, if you listen to the mainstream media and the chitchat on social media, all the right-thinking people are atheists. Belief in any kind of god is not even a bourgeois affectation anymore. Aside from a pattern of cultural detritus—like saying “god damn” when you hit your thumb with a hammer—no one actually believes in the Angry Old Man with the White Beard Who Lives in the Sky and Watches You! God has become a vaporous spirit who barely inhabits the local church steeple. Think of the hapless mother, Rachel, in the recent spy movie Hanna. When asked to clarify her casual reference to “God,” Rachel says, “Well, not in any monotheistic sense. Buddha, Krishna, the god within. Whatever you believe in.” God has become a spiritual bumper sticker—nondenominational, plastic, sensitive to the needs of others, and shapeless.

Unless, of course, you actually, devoutly, loudly believe. And then you’re over on the other side with the rednecks, the Bible thumpers, and the militant Muslims. They’re the people who won’t see reason and will never, ever give up their mass-hallucinatory delusions. We’re polite to such people because, after all, it’s a free country and we don’t want to be dogmatic about these things. We look down on those who inhabit the Christian tradition, because most of us by now feel we have seen through all that, like a rotted veil, and finally abandoned the mental shackles. We are polite to the Muslims, Hindus, and Catholics, because they are not really us.3 But in the 21st century western tradition, we really feel more comfortable with the Communists and their “opiate of the masses” dismissal, or with Rachel and her “whatever” mishmash, than with anyone from the little church on the corner.

How did things change so fast? Did we really go from being “soft on Communism” in the 1950s to freely accepting its inner social paradigm in the 2000s?

To a large extent, I blame the triumph of the scientific method and its astounding and continually abounding proofs in modern technology. In 500 years, we’ve gone from the unseating of geocentrism and the music of the spheres, to space flights and planetary explorations that are so regular and common they don’t even make the evening news anymore. In 200 years, we’ve gone from big, clumsy, inefficient steam engines powered by coal to a world dominated by reliable, efficient small motors and turbines that whisk us around town, up in the air, across the country, and to faraway continents. In just over 100 years, we’ve gone from hand-blown light bulbs and hand-crank telephones to machines that will do our thinking and remembering for us, as well as handfuls of polymer and glass that can connect us to the world and its entire base of knowledge in an instant.

In a new world that Shakespeare would have described as Ariel and Caliban having come alive—and with the promise of more wonders yet to come—what room is there for a fusty old Hebraic god with ten commandments that were spoken out of a burning bush? A philosophy based on science and technology has shown itself to be more successful and more fruitful than any philosophy based on supernatural fear and loathing.4

And yet, the anger and loathing with which some of these modern atheists regard their recently discarded religious tradition—especially of the once-dominant Christian variety—bears a resemblance to “protesting too much.” To discard a flag you once sailed under, because you’ve simply changed your mind and allegiance, is one thing. To stamp on that flag, burn it, spit on it, and shriek your defiance with a clenched fist—that speaks of a lingering uncertainty. I could calmly abandon the traditional notion of God, because I was never more than sailing under a false flag anyway. But among the people who need to marginalize and despise believers and push them into dark corners, I sense a neophyte’s doubt and a still-unformed perception of what they think is really going on in the universe.

On the other hand, we’re also seeing scientific philosophy adopting the mental trappings of a religion. We are starting to see, in some areas, a creed developing, an orthodoxy to which “all reasonable people” are supposed to subscribe. I’m sure that my recent posts—about the limits of what we currently know, about the trend in finding proofs of various theories in complicated mathematics presented as if they were observations, and about “consensus” in science—all have angered professional scientists.5 It’s a thin line I walk as a non-scientist. I’m not speaking in favor of an anything-goes approach to knowledge, where my nudnik idea is just as valid as your peer-reviewed paper supported by observation and experimentation. But I also know that mathematics is a product of the human mind that represents and maps to the physical world but is still capable of false analogy and miscorrelation. And for whatever you can say about mathematics, repeat ten times over for computer modeling and simulations, where the operator is forced to choose among data sets and weigh competing variables.

Is scientific fundamentalism—that is, the dogmatic belief in the supremacy of certain scientific positions, even though they are supported more by theory, mathematics, and computer modeling than by direct observation and established fact (e.g., the catastrophic predictions about anthropogenic global warming)—preferable to religious fundamentalism? Either one suggests a closed mind and acceptance of “this far, but no farther” in the evolution of ideas. As a convinced evolutionist myself, I understand—I know, if you will—that everything changes. Today’s certainties give way to uncertainty, doubt, confusion, and new conceptions, leading to new certainties that will also, eventually, crumble and fall. If you nail your heart to the mast of any theory, as Ahab nailed his golden doubloon to the Pequod’s mainmast, you will eventually be disillusioned and disappointed.6

I guess, in the end, my atheism is a kind of a mental shield. Most humans, who have the gene or protein that lets them hear the whispers of God, are also susceptible to a fixation on ultimate truth with a capital “T.” They crave absolutes, identities, and certainties. They want a world that is fixed and everlasting. They want to build a home on bedrock, under a sky that never rains down stones, alongside a creek that will never rise, and bordering a field that will never fail and blow away in dust.

I can never have that. I know that time is the enemy of certainty, that youth succumbs to age, and age succumbs to dust. It’s not that I believe in nothing, but I know that nothing is always a possibility and it hovers always near.

1. However, I am also a child of western civilization, which has been shaped by a religious tradition and Judeo-Christian theology. Whether I would be an atheist—or even the same person with the same open, rational viewpoint—if I had grown up in a Muslim or Asian culture is a delicate question.

2. I was a child of mixed marriage: mother a Methodist, father a Presbyterian, and older brother off sitting with the Congregationalists. No one mentioned God much in our household, and we only said grace at the table for Thanksgiving and Christmas. I tried to attend the Presbyterian church while in high school. It was a modern church, which acknowledged predestination without making a big thing of it—and I never heard the angels’ wings. We took communion in Wonder Bread and Welch’s grape juice several times a year, and didn’t make a big thing of that, either. Sunday school was more like a social club. And yes, I’ve been known to swear by invoking the identities of God and Jesus Christ, but my heart’s never really been in it.

3. And besides, some of them will go all jihad on you in public.

4. And yet, look at the content of our popular media. The supernatural still abounds in stories about vampires, werewolves, witches, and zombies—not to mention science-fiction phenomena that might as well be supernatural, like wormholes, time travel, and achieving super-light speeds. While we all know this stuff’s not possible, we all love stories about it. And for young minds without a decent science education, well, who’s to say what might not be possible?

5. The reader might think, based on some of my recent posts about science and mathematics, that I’m anti-science. Not at all. I think science and its handmaiden technology represent some of the best works of the human mind. But in the realms of the very big and the very small—cosmology and subatomic particles, where observation is difficult and hands-on experimentation almost impossible—our best minds are coming hard upon some fascinating conundrums, riddles, and apparent absurdities through their observations. In response, they are pushing theory and mathematical reasoning to the edge of credulity. So I sense that an upheaval is coming, a recalibration of our collective mindset, such as has happened every hundred years or so for the past half a millennium. But then, I’m a science fiction writer, not a scientist. Imagination, speculation, and reasonable doubt are my stock in trade.

6. Even evolution itself, that most durable of scientific faiths, may one day be replaced by a better theory, although it’s held up pretty well since Darwin’s day through the multiple discoveries of cladistics and genetics. I suppose that the paradigm-shifting blow will come in the guise of what we mean by “random” mutations, when we already have evidence from epigenetics that some of the changes due to environmental methylation seem to follow predictable patterns.
       And who knows what the theoretical impact will be if we ever discover the origins of the DNA/RNA/protein interaction? DNA and its complements represent a complex yet highly conserved mechanism, the organizing principle behind all life on this planet. Yet nowhere in the most obscure and isolated populations, nor in the fossil record, do we find any trace of a competing or less developed molecular mechanism. With the earliest bacteria and blue-green algae appearing four billion years ago, and then with the first multi-celled organisms half a billion years ago, the identical interaction of DNA, RNA, and proteins conquered this planet in the same form that we find it today. Which means that either this highly complex, almost arbitrary mechanism was so robust that it quickly beat out all competitors and effectively froze its own evolution for eternity. Or it did not actually originate on Earth but immigrated, or was carried, or even seeded here from somewhere else.

Sunday, March 3, 2013

Understanding the Universe

Recently a Facebook friend shared a graphic1 from the results of the NASA Wilkinson Microwave Anisotropy Probe, comparing the measured content of the universe as we see it today and as of just shortly—380,000 years—after the Big Bang.

According to the graphs, dark matter has always been a component of the universe, as have other components like atoms, photons, and neutrinos. Dark energy was also present in the early universe, but in such a tiny fraction as to go unnoticed. Then, as the universe expanded and cooled, the atoms of normal matter and the units of dark matter, in whatever form it takes, became less dense and so came to represent less of the universe’s total content. The early neutrinos and photons lost energy to the expansion even faster than matter thinned out, so they became just a tiny fraction of today’s universe. Meanwhile, the density of dark energy did not decrease at all—and so it grew in proportion, so that it dominates the universe today.
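
The shifting proportions follow from the standard textbook scaling of an expanding universe; this is my gloss on the graphic, not something spelled out in the WMAP release itself. As the scale factor a grows, each component’s energy density dilutes at its own rate:

```latex
\rho_{\text{matter}} \propto a^{-3}, \qquad
\rho_{\text{photons, neutrinos}} \propto a^{-4}, \qquad
\rho_{\text{dark energy}} \approx \text{constant}
```

Matter simply spreads out with the growing volume; radiation spreads out and is also redshifted, so it fades fastest; and a constant dark-energy density inevitably comes to dominate.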

I cheerfully admit I'm not a degreed scientist, although I am a logical fellow with pretty good reasoning powers. Perhaps because I have not made a life-long study of this subject and immersed myself in the beauty of the mathematics that supports it, I can ask a few questions. Such as …

Since we can’t actually see, touch, taste, feel, or detect dark matter and dark energy, isn’t their part in the contents labeling of the Big Box of Universe® simply a supposition?

As I remember it, the only reason astrophysicists proposed dark matter at all—since it couldn’t be directly observed or detected—was to account for the motion of stars in galaxies. When you look at the galaxies that actually spin, their stars seem to orbit together, as if they were painted on a fixed disk, like a huge phonograph record, rather than chase each other at varying speeds around the center, like chips of wood floating on the surface of a whirlpool. When astronomers added up the mass of the stars, dust, and gas, the total fell far short of the mass needed to create this painted-disk effect. And hence the need for something extra, some kind of matter whose nature we don’t yet understand and cannot see—and hence “dark”—to account for the computed mass requirement.2 Guesses about the nature of dark matter range from massive subatomic particles which don’t interact with the kind of matter we can see—weakly interacting massive particles, or WIMPs—to galaxies stuffed with the kinds of normal matter we might detect but normally can’t see at long range, like brown dwarf stars3 and rogue planets that might either have coalesced out of interstellar dust and gas or been thrown out of existing star systems.
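
To make the painted-disk problem concrete, here is a back-of-the-envelope sketch with made-up round numbers (a hypothetical galaxy holding a hundred billion solar masses of visible matter, not figures from any actual survey). If that visible mass were all there is, orbital speeds beyond the luminous disk should fall off as one over the square root of the radius, whereas measured rotation curves stay roughly flat:

```python
import math

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30   # one solar mass, kg
KPC = 3.086e19     # one kiloparsec, m

def keplerian_speed_kms(r_kpc, enclosed_mass_suns):
    """Circular orbital speed if essentially all of the enclosed mass is
    the visible stars, gas, and dust concentrated toward the center."""
    r = r_kpc * KPC
    m = enclosed_mass_suns * M_SUN
    return math.sqrt(G * m / r) / 1000.0

# Hypothetical galaxy with 1e11 solar masses of visible matter:
for r_kpc in (5, 10, 20, 40):
    v = keplerian_speed_kms(r_kpc, 1.0e11)
    print(f"r = {r_kpc:2d} kpc  predicted ~{v:3.0f} km/s  (measured curves stay roughly flat)")
```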

Dark energy is a theoretical construct that accounts for the fact that the universe has not only been expanding since the time of the Big Bang, but that the expansion seems to be accelerating. The nature of the acceleration is such that instead of the universe eventually re-collapsing under gravity to start another Bang, astronomers now think the far future holds a Big Fadeout. First the galaxies will outrun the light they shine back at us, and so disappear.4 Then the stars in our Milky Way will grow dim and distant. Finally, even the Sun and planets, then the molecules of the Earth, and ultimately the molecules inside our own brains will become so distant from each other that they no longer interact. All activity in the universe eventually will cease. Guesses about the nature of dark energy range from some kind of negative pressure associated with vacuum, to some kind of gravitational repulsion that is currently unmeasured or unaccounted for, to the energy potential of a dynamic field called “quintessence.”5

Clearly, when we look out beyond the atmosphere of Earth, beyond the luminous bodies and simple orbits of our solar system, and try to take in the really big picture of the cosmos, we find complexities that pass our understanding. In our physics, which goes back to Newton and Einstein, the only way to get more mass is to add more stuff—even if it’s stuff you cannot see or detect, except by its effects. The only way to get more acceleration is to add more energy, even if you have to conjure it out of pure nothingness.

Our understanding of the nature of things breaks down in the realm of the very small. In Quantum Mechanics, which relies on mathematics and theory to help the human mind put order to what we can only marginally observe, we have constructed the Standard Model. It speaks of particles in terms of their trajectory, energy, and spin. And, for those bits that we cannot observe or corral directly, the Standard Model treats particles and fields interchangeably.

In the realm of the very big, dealing with scales of time and distance that are similarly outside of human experience, we use mathematics and theory to put order to observations that defy common sense. Cosmology has created an exploding universe that once expanded faster than light, has since slowed down, and now appears to be speeding up again, all to account for photons that lose energy for no otherwise apparent reason and a residual hiss of microwave radiation that appears to be the echo of the moment of creation. More mathematics and theory addresses galaxies that don’t spin as we would expect.

I sometimes think that when we venture outside the realm of the humanly observable, we're like an engineer trying to understand an internal combustion engine by riding on the face of the piston. All around us we see a lot of noise and activity, but we can only understand a fraction of what's going on and then theorize about the rest of it. From where we sit, we know nothing about the engine’s purpose, and we can only spin theories about things like gas tanks, carburetors, spark plugs, ignition coils, crankshafts, and wheels—all of which we've never actually seen and can only infer. All we really know and can observe is that we're inside a big, noisy space full of explosions and hot air.

I sometimes think the problem—both at the quantum and the cosmological levels—is one of viewpoint and perspective. We are trying to apply theories and metaphors from everyday experience to scales at which they may not apply. We believe in homogeneity and isotropy. That is, things should be pretty much the same everywhere and in every direction that we look.6 This is an assumption. To avoid excess theoretical baggage, in the manner of Occam’s razor, it makes sense to postulate that the physical reality we observe on Earth also operates on Mars or in the Andromeda galaxy, and that principles which operate at distances we can encompass with our two hands also operate at the span between two quarks or between two galaxies.

Our science is full of such assumptions. They let us think sanely, but they sometimes lead us to think wrongly. In geology, for example, the doctrine of uniformitarianism held sway for many years: that processes we can observe here and now have always operated at the same speed and intensity. Rivers wear away the riverbank at the same rate every year. Rain and runoff wear away mountains at the same rate over the millennia. That made for a gradually changing world, which is what we usually observe. As a doctrine, uniformitarianism made more sense than catastrophism, which held that violent changes like Noah’s flood can suddenly create new landscapes. Of course, both processes—the quick and the slow—are at work. It took a while for geologists to accept that a great flood on the Snake River at the end of glaciation some 15,000 years ago carved canyons in a matter of days. But any of the old geologists who had seen a volcano explode would have had to admit that catastrophism sometimes happens.7

Perhaps our predilection for homogeneity and isotropy is leading us to wrong conclusions. As I’ve pointed out before,8 we use terms like “gravity,” “space,” and “time” in meaningful sentences; we measure their effects and can use those measurements in equations and calculations; but we don’t necessarily understand the mechanism that supports these effects. Perhaps to those three poorly understood effects, we should add huge masses, great distances, and great spans of time.

The universe is a strange place. I think we will need to adjust our thinking to different scales of time and distance. But then, our understanding of physical space and of other realities—like the nature of life—has always advanced through the successive adoption of different ideas, models, frames of reference, and metaphors.

For example, most of us still live in a Euclidean world, where straight lines are straight forever, rooms have flat walls that meet in 90-degree corners, and clocks run at the same speed all over the world. But if you adopt the Einsteinian model, you know that the baseboards curve, however slightly, with gravity and that your watch slows down by a few billionths of a second whenever you take a transcontinental flight.
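
The flight example can be put to rough numbers with the textbook velocity time-dilation formula; the cruise speed and duration below are assumptions of mine, and I am ignoring the effect of altitude on gravity, which actually nudges an airborne clock the other way.

```python
# A simplified estimate using only velocity time dilation,
# delta_t ~ t * v**2 / (2 * c**2); the gravitational effect of cruising
# altitude (which runs the airborne clock slightly fast) is ignored.

c = 2.998e8      # speed of light, m/s
v = 250.0        # assumed airliner cruise speed, m/s (about 900 km/h)
t = 6 * 3600     # an assumed six-hour transcontinental flight, in seconds

lag = t * v**2 / (2 * c**2)
print(f"Watch lags the ground clock by about {lag * 1e9:.0f} nanoseconds")
```

A few billionths of a second: measurable with atomic clocks, as the Hafele-Keating experiments showed, but invisible on any wristwatch.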

In the same way, people living in pre-Darwinian times saw the animals and plants around them as separate, individually created beings—and certainly a different order of being from self-aware humans. But the new evolutionary model and the evidence of genetics now force us to see that all life on this planet is related at a molecular level and that the forms themselves are fluid because they change in response to the environment.

Think of these new models as “whacks to the head” which jar our understanding and lead to the “ah-ha!” moment of a new vision. But the universe is still an ultimate mystery. I believe we all have a dozen or more cycles of “whack” and “ah-ha!” ahead of us before we figure out what's really going on out there. For now, the Big Bang in cosmology and the Standard Model in subatomic theory are useful metaphors and explainers of what’s going on—although each is now running into some bizarre complications. I believe either or both of these models will one day be overturned before we arrive at a more accurate—perhaps even a final—view of the Real State of Affairs.

1. Giving credit where due, my Facebook friend William Maness shared this chart from Mohan Sanjeevan’s Facebook page Sci-fi, Science and Space Geek, commenting, “The green and dark blue slices? Those are actually ‘we don’t know WTF, but we can quantify what we need to match our model with what we observe.’ ” He then tipped his hat to me for my recent blog post about scientific presumptions in the realm of cosmology, and wrote, “The more I think about it, the more I agree with him that the standard model has got a gaping hole in it.” The substance of this posting comes from my response to Maness.
       Another commenter on that thread, a scientist with 30 years of experience teaching chemistry at the university level, observed that since he didn’t know the actual facts, it was best not to comment. I respect that. But I am not a scientist; I am a science fiction writer. It’s not my business to know but to question, wonder, and imagine. Without the froth of imagination, the soufflé of science just lies there like a pudding.

2. And yes, every galaxy is now believed to hide a supermassive black hole at its center, comprising the mass of hundreds of thousands or millions or billions of suns. And no, that’s still not enough mass to account for the orbital velocities we see.

3. Essentially solar fizzles, bigger than Jupiter but not big enough to ignite their own nuclear fusion or keep it going successfully for very long. If billions of solar masses tied up in a black hole are not enough to account for the missing mass, how many solar cinders lying around does it take to balance the equation?

4. Remember, this expansion is based on the observation that the light from distant objects like galaxies is red-shifted—that is, has a lower than expected energy level—which is thought to be due to the Doppler effect. This effect, in our everyday experience, accounts for the change in pitch of a sound whose source is either approaching or moving away from the hearer. Light waves are presumed to retain their energy over vast spans of time and distance, just as in Newton’s first law of motion an object once set in motion retains its inertia until it meets with a counteracting force like friction. So, if the photon is losing energy, it must be because the distance over which it travels is expanding and not because any other force is sapping its energy.
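
       As a tiny worked example (the observed wavelength below is an invented number, not a real measurement), the redshift z compares the observed with the emitted wavelength, and at low redshift the Doppler reading translates into a recession speed of roughly v = c × z.

```python
# A minimal, low-redshift illustration; the "observed" wavelength is
# an invented number for the sake of the arithmetic.

c = 2.998e5             # speed of light, km/s
lam_emitted = 656.3     # hydrogen-alpha line at rest, nanometers
lam_observed = 672.0    # hypothetical measured wavelength, nanometers

z = (lam_observed - lam_emitted) / lam_emitted
print(f"z = {z:.3f}, implied recession speed ~ {c * z:,.0f} km/s")
```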

5. Quintessence, in its original meaning—from the Medieval Latin quinta essentia, or “fifth essence”—was the purest and most concentrated form of a thing. Medieval philosophy accounted for four elements to make up everything in the realm below the celestial spheres: earth, air, fire, and water. The fifth element or essence supposedly permeated all of nature and was the material composing the celestial bodies. … See how far we have come?

6. Homogeneity is from the Greek homos, meaning “same,” and genos, meaning “kind”—or all of the same kind. Isotropy is from the Greek iso, meaning “equal,” and tropos, meaning “way”—or uniform in all orientations and directions.

7. In the same way, biologists once thought evolution took lots of time, that genetic mutations happened slowly, and their effects only accumulated over hundreds of generations. Then paleontologists Stephen Jay Gould and Niles Eldredge wrote about punctuated equilibrium: genetic mutations are happening all the time, but their effects only blossom into newly favored forms when the environment changes enough to make the old forms unworkable and the new forms advantageous. And then Peter and Rosemary Grant, as described in Jonathan Weiner’s The Beak of the Finch, showed how species of finches in the Galapagos Islands may change over a single generation to take advantage of environmental niches. Old scientific ideas die quickly under new evidence.

8. See the blog entries titled “Three Things We Don’t Know About Physics” from December 30, 2012 and January 6, 2013.