Sunday, January 26, 2020

Living With One Hand

Hand in cast

About a month ago, on Christmas Eve to be exact, I stumbled and fell on the sidewalk, landing on the side of my face and across the back of my left hand. Among other bangs and bruises, which have since healed, I broke three bones in that hand and wrist. Between the splint put on in the emergency room and the cast fashioned by the hand specialist—luckily, there was no need for corrective surgery—I now have limited use of the four fingers on that hand and the barest pinch between thumb and forefinger. Luckily also, I am right-handed. According to the hand doctor, the bones are not shifting and they are knitting well. But still, I may be in the cast for another month or so, and then a removable splint.

I try to treat everything as a learning experience. So what have I learned from this, other than to be more careful and watch my step? Well, for one thing, I have a better understanding of those who, by accident or illness, have lost the use of their fingers or their whole hand. Here are the limitations—temporary limitations, I remind myself—encountered so far.

Typing: I am by training—self-taught from a typewriter manual one winter break during high school—and by nature a two-hand, ten-finger touch typist. No hunt-and-peck for me. So much so that being able to sit down and share my brain through my fingers is now part of my writing process.1 I can still type by spider-walking my right hand across the keyboard, using my left thumb or forefinger to hold down the occasional shift or control key. The result is barely faster than keying into the search function on your DVD player using the remote’s arrow pad. And I still hit the wrong key, or a combination of right and wrong keys, ten percent of the time. So writing is hard.2

Pockets: For fifty-odd years the distribution of daily necessities in my pants pockets has been: knife, comb (when I carried one), and handkerchief in right front; keys, coins, and sundry tokens in left front; wallet in left rear; nothing, or a washcloth for use as an emergency towel, in right rear. With my left hand encumbered by the cast, this time-honored practice goes out the window, and everything, stripped down to a minimum, goes on the right side. I’m still fumbling around with this.

Child-proof caps: These devices of the devil, approved for medications taken by sick and feeble people, are impossible to work one-handed, even with the support of a few incompetent and weak fingers and a thumb on the other hand. Once I finally get a cap off, I leave it sitting lopsided on top of the bottle until I need to take another pill.

Zip-lock bags: They’re ingenious—until you realize they’re designed for two working hands with full grip strength. No combination of two fingers on one side, two on the other, can break the seal. The only way I can get them open is to pin one side against the countertop with my left thumb and grip and pull up the other side with my right hand. The contents only occasionally spill out all over the counter.

Buttons: Another ingenious device that really takes two hands. I can use my left thumb working against my cast to hold the buttonhole steady, if not partway open, then use all the fingers and thumb of my right hand to work the bottom edge of the button through the hole and try to anchor the top edge before the bottom slips out again. I can work the big buttons on a sweater, but I give up on all the tiny buttons on a dress shirt. So I wear mostly polo shirts these days. And sleeve buttons, right or left—forget it!

Jackets: Any garment, really, that does not have an expandable sleeve opening, like a winter coat with a knit cuff, blocks passage and traps the bulky cast. I’m too cold to wear my coat draped over the left shoulder, like a Hussar’s pelisse, and too frugal to cut open the seams of the left sleeve on a good coat for a temporary situation like this.

Knots: Unless I can use my teeth, knots are impossible. Shoelaces are hopeless. So I wear mostly slip-on sandals. And socks are hopeless, too. I can do a sloppy granny knot with something as big and thick as a bathrobe belt, but it doesn’t hold for long.

Zippers: I can work a zipper if it’s anchored at one end, like the fly in men’s trousers. But the open-ended version, like on a jacket, is impossible. I just can’t get the pin on one side inserted into the box on the other, manage to hold them both steady, and still pull up on the slider tab. So my jacket flies loose in the wind.

Manual transmission: Yeah, I’m driving a stick again, now that I’m not commuting 35 miles a day each way through heavy traffic. I traveled by Uber for the first couple of days after the fall, when my whole arm was in a splint. My cast now lets me take a light grip on the steering wheel while shifting; so I can still drive—but carefully. And sharp turns require a big over-and-under maneuver with my right hand; so I must plan ahead to avoid shifting while turning. The clutch lever on the motorcycle is hopeless, however, and I have no strength to hold onto the left grip, much less steer; so the bike is on a battery tender for a couple of months. Anyway, I can’t get my left arm into my leather riding jacket.

Personal grooming: Washing my face one-handed feels weird and incomplete. I can shower using a big rubber-and-vinyl glove that slips on over the cast, but then the cleaning process is mostly feeling around one-handed with the soap or washcloth, and my right side gets much less thorough attention than my left. Shaving and toothbrushing are no problem with modern electronic devices, but putting deodorant on my right armpit using only my right hand is an exercise in gymnastics. Luckily, I’m still flexible enough.

Heavy lifting: The hand specialist warned against this. Any use of my left thumb or fingers other than to steady a load in my right hand is perilous. Anything that tugs or pulls on those fingers—like holding the dog’s leash while I lock or unlock the door—risks moving the bones that are trying to knit, giving rise to an ominous ache. I still use the left hand in a limited fashion, but carefully and with occasional pain.

Living with one hand and a few weak fingers, I find that everything is harder and takes longer, and if I put any pressure on the broken bones or the wrist, it hurts. So life goes on—but at a much slower pace.3

1. I’ve tried the Dragon NaturallySpeaking dictation software through a number of its upgrades. Ninety-five percent accurate is still one goofball error every twenty words, and the voice-correction protocol—required so that the software can learn from its mistakes—is painfully slow and irritating. Also, my mind just does not compose through the spoken word.

2. I once bought an ingenious little device called a FrogPad, a one-handed keyboard, but never learned to use it comfortably. The company now seems to be out of business. Pity—I could use it this time.

3. Given how difficult it is to type, and that the last few blogs were already written and in the can before the accident, this may be the last posting I can make for another month or so. I have to take these things one at a time.

Sunday, January 19, 2020

The Writer’s Job

Midnight writer

I have been pursuing this profession—writer and sometimes editor—for almost sixty years now. I first got the inkling1 when I was about twelve years old and attempted my first novel.

That was a fragment, not even a complete story, about a steam locomotive and passenger cars in the Old West that pull into the station with everyone on board dead. It was a fine setup for a mystery, except that I didn’t understand at the time that first you have to know what happened, then wind it back to what the reader—and presumably the detective, of whom I had none—first learns. So I had a setup with no premise.2 But it was a start. I wrote out what I had on an old Royal portable typewriter that was in the family, created a cardboard-and-crayon cover, and stitched it together with yarn. A rude effort, but I was on my way.

What drew me to writing, when I knew nothing about it, was that a writer—specifically a fiction writer, specifically a novelist—could apparently work for himself from home, rather than for somebody else in an office, and could count his time as his own, rather than keeping to somebody else’s schedule. Well, it was a dream. But it fit my bookish and solitary nature. Besides, it was clean, literary, intellectual work, and you didn’t have to hurt anybody.3

My second novel, written at age sixteen, was a much grander affair: a first draft in fountain pen on white, lined tablets; a second draft typed double-spaced, with margins, in two copies using carbon paper on my grandfather’s upright Underwood, just like the publishers wanted; and running to 472 pages, or about 60,000 words, all told. It was a dreadful space opera about a renegade university professor and rebel leader against an interstellar empire, with a romantic subplot. It had a beginning, middle, and ending—and I knew even as I finished it that the damn thing was unpublishable.4 But the effort was what counted, and it got me fixed on my present course.

My novel career paused when I went to college and studied English literature. I had no ideas for another book—having been emotionally drained by the space opera—and was too busy anyway with my studies and the mountains of reading they required. But that reading gave me perspective on literature and the language. And all along I had thought that, when I graduated, I would immediately write another novel and make my name with my first published book. I had dreamed that I would support myself with fiction writing.

But about three months before graduation I took mental stock and realized I still had no usable ideas, nothing to say. This is not surprising, because few people in their early twenties have much to say to the adult world—which was my preferred venue—and the market for Young Adult literature is limited. So I was suddenly faced with the realization that I needed a “day job” to support my imminent debut into the real world.5 And what I was best qualified for was work as an editor. Through the graces of one of my professors, I got a junior position at the Pennsylvania State University Press. It was eight hours a day on my butt with a blue pencil, correcting spelling and grammar, untangling sentence structure, and marking copy for typesetting, all according to the Chicago Manual. But I loved it. After the university press, I went to a tradebook publisher—where I learned about that railroad tragedy, and much else about the West and my newly adopted state of California—and from there to technical editing of engineering and construction reports and proposals.

My third unpublishable novel came about in my late twenties, while I was working for the engineering company. Based on the time-honored mantra of “write what you know,”6 I tried to write a business novel about the scramble of a second-tier construction company to answer a major customer’s request for proposal for a mega-million-dollar mining development for a Third World client.7 That book progressed as far as a rough first draft, although I never sought a publisher.

In the meantime, I went from engineering and construction, to a public utility providing electricity and gas, to an international pharmaceutical company, to a pioneering maker of genetic analysis instruments, with stop-offs working as a temp in administration at two local oil refineries. In each case, I worked first as a technical writer—learning the secrets of the company’s respective industry—and then moved into internal communications—explaining the company’s business to its employees. And in every case, I was building myself an understanding of and intimacy with the business world and its technological basis, an understanding that I have been mining ever since as background for most of my novels.

So … what is the writer’s job in all of this? It is the same, pretty much, whether the task is editing another writer’s work, creating and editing technical documents, writing newsletter articles and press releases, or writing a full-blown novel, whether historical fiction or far-future adventure.

First—and especially in fiction—take the reader somewhere new, show the reader the unique side of an everyday life situation, or of a product or technology, something that he or she has never considered before. There are two ways to approach a story: you can come at the topic head-on and flat out, or from an oblique angle and on the bounce. Think of the latter as putting “English” on the ball, making it spin. That slightly-to-one-side approach puts the reader’s natural defenses off guard and simultaneously raises his or her curiosity. This also works for a new product description or policy analysis—although not in a tightly prescribed document format like a pharmaceutical batch record.

Second—especially in technical writing, communications, and non-fiction editing—make the obscure explicit and the confusing understandable. It is an article of faith with me that nothing is so complex that it cannot be made intelligible to a bright ten-year-old child. But you have to use simple words, work in everyday analogies, and take some extra steps and make some supporting linkages in your reasoning. And you have to use that bounce thing described above to make the reader care in the first place.

Third—and this applies to all writing and editing—be rigorous in your logic and sequence, and honest in your evidence and conclusions. You are invading the reader’s mind, and this is hallowed ground. You can play with the reader’s perceptions and trick his or her understanding in the same way that a magician’s sleight of hand arouses an audience’s awe and wonder. But you can’t lie to the reader or offend his or her sense of fairness, right and wrong, or proportion. And you can never disrespect the reader. For you are playing in another person’s sandbox and, if you offend, will be asked to go home with a slamming of the book.

Fourth—and this applies to almost all types of writing, except perhaps for instruction manuals—paint pictures and tell stories. The human mind is not exactly immune to bare facts, but we have a hard time understanding them and fixing them in memory without a context. This is why storytelling and visual representation have been so powerful in almost all human cultures. This is why religious groups and political parties create a “narrative” to support their core truths. Your job is to create in the reader’s mind a structure made of words, mental images, and associations that carries your message.

To be a writer is to be, effectively, inside somebody’s head by invitation. Play nice. And have fun.

1. What a writerly word! You can almost smell the printer’s ink and hear the presses hum.

2. Curiously, I was foreshadowing one of the tragedies of early railroading, when the trains of the Central Pacific Railroad had to navigate miles of mountain tunnels and snow sheds in the Sierra Nevada, and the accumulated coal smoke asphyxiated their engine crews. From this was born the generation of oil-fired, cab-forward steam engine designs, which worked that route for years.

3. Except, of course, your characters—which I also didn’t understand at the time. But they are only made of smoke and dreams.

4. Anytime you hear about anyone writing a brilliant first novel, count on it being their second or third completed manuscript. Even Harper Lee had to go through this process.

5. Back in my teens, when I was working on the space opera, I wrote to one of my favorite authors, Ray Bradbury, asking if he would read my novel. He politely declined, which was a blessing. But he then suggested that, to become a fiction writer, I should not go to college and instead get a menial job as a dishwasher and just write, write, write, and submit everything to publishers. That course for me, as a teenager, would have been a disaster. Given my subsequent history of real-world, practical experience, I don’t think it would have worked out any better for me as a college graduate.

6. Which is a trap. The command should be “write what excites you, that you know a little something about, but you want to know much, much more.” If your current life is dull—that future as a dishwasher toward which I had been urged—it shouldn’t limit your scope and imagination. And these days, with all of the online resources available, research is easy, right down to getting street views of any city and most towns around the globe.

7. Well, not every idea is a good one. However, some of that story—and so much more—found its way into Medea’s Daughter forty-odd years later.

Sunday, January 12, 2020

Memory and Imagination

One hand

As I think through the way I go about constructing the elements of a story line for a novel and then determining the next actions, images, details, and anecdotes that will support the plot, I have made an interesting discovery: these things pop into my mind—presumably from the subconscious1—in the same way that random memories come to mind.

Remembering in this way is different from sitting down, at least mentally, and asking myself what I can recall from a past experience: my graduation day, my wedding day, or any memorable event that is supposed to “stick in the mind.” No, this is the sort of sense, image, or snatch of conversation that surprises me when I’m doing something completely different: a flash of the roadway where I was driving on a trip three years ago, or the image of a house I once visited, or a fragment of what someone once said to me. These things come “unbidden” and when they are least expected.

The difference with the ideas, images, and dialog fragments that come to me for my work in progress is, first, that these imagined things did not happen, have never happened, and usually have no relationship to—are not an echo of—anything that has ever happened to me. Second, and unlike the act of forced recall, I find myself unable to sit down, mentally, and ask myself what should happen after the starting point of a story, or what image or sound or sensory perception would best illustrate the next action scene, or what the characters would say in the situation. I can pose to myself the question of what should happen next, but then I have to go off and take a walk or bike ride, or get my morning shower (warm water hitting my right shoulder and the side of my neck seems to be especially conducive to creative thoughts), and otherwise let my subconscious mull the problem. And then later, out of nowhere, like an unbidden memory, I know what the next action, image, or dialog bit should be.

So it’s apparent that both memory and imagination—at least in my case, and in dealing with a developing story—come from the same place, the subconscious. But the one is a reflection of something that actually happened, while the other is a bit of imagery or dialog with no context other than the mental structure of the novel I’m writing at the time.2

I have noted for many years, going back to my time as a teenager, something I call “the Return.” This is an instance of my hearing a piece of music and then later—two to four days, usually—having it pop into my mind, again unbidden. I will find myself singing or humming the melody or, more often, just “hearing” it in my head. I have also noted that the speed of the Return is linked to my mental and physical health. When I am in good spirits and feeling well, the music will come back in two days—but seldom less. When I am feeling poorly, the return might take four days or not happen at all.

For comparison, the return of a random memory—a place image or a conversation—that occurs while I’m doing something else might take two to a dozen years, while the subconscious production of a scene, action, image, or dialog bit that I have posed to myself for consideration generally takes less than two days. So the two processes—memory and imagination—while alike, are not identical.

Interestingly, none of this has anything to do with dreaming, which apparently comes from another part of the brain. While I have vivid, busy, and sometimes frustrating dreams—but seldom what you would call nightmares—few of them have to do with specific memories, although some are linked to places in my memory, specifically two of the houses we lived in when I was a child. No dream has yet produced anything related to one of my books—not a usable plot point, image, or bit of dialog. However, I occasionally wake up, after dreaming whatever dreams I had, and suddenly get an idea that might help the current book along. Only once in all my experience did I dream of a character in one of my books, and that was my first novel, written as a teenager, and the dream occurred about two years after I finished writing it.

All of this might mean something to a psychologist, or maybe a neurologist, but to me this is just the way my mind works. And the most important conclusion—at least for me—is that I cannot force the construction of a plot or of any particular part of a story. I have no secret list of plot outlines,3 no mechanism for character generation, and no other out-of-the-box novel-writing aids. The story comes from the dark—as in hidden or unilluminated, rather than foreboding or evil—places of my mind.

The story, characters, inciting incidents, and resulting actions have to come from someplace else—outside me, inside but hidden … the gods … the stars—in order for me to believe in them as real people doing real things that have meaning. Yes, I know that they are made up—but I have to forget that part for the story to live inside me. And once I have the outline—that composite of structure and belief—as well as a notion of the associated actions, sense images, and dialog, I can let the magic happen.

Magic? That’s when I sit down at the keyboard, put my mind on its invisible track, forget myself, and let the story flow through me.

1. See Working With the Subconscious from September 30, 2012.

2. Curiously, works that I wrote in the past, stopped writing, and put an end to, do not generate new imagery or dialog. I might suddenly wake up and recall a word, image, or dialog line that I once put on the page and now know to be an error, or lacking, or incompletely linked to something else in the novel—and usually this is still related to the work in progress. But I don’t generate new ideas and directions for the old books. My internal mental process seems to know when what’s done is done.

3. I’ve heard it said that there are only seven plots in all of storytelling. If so, I’d like to know what they are. Whenever I hear this, the examples given are so general (“Well, there’s boy-meets-girl …” or “the hero’s journey …”) as to be practically useless. Once you’ve picked that type of story, then the real work of generating setting, character, incident, and action begins.

Sunday, January 5, 2020

Twenty-Twenty

Crystal ball

The first time I had occasion to write the calendar date “2020” on anything was a few weeks ago, in preparing the draft of the January-February edition of our local NAMI chapter’s newsletter. And as I keyed in those digits, a chill went through me. It was as if I knew, in that sleepless but dormant part of my brain that reads the tea leaves and maps them onto the stars, that this year would be significant.

Oh, I know, we’re going to have a watershed election. And given the political emotions that are running in this country, the campaigning, the election day, and the aftermath are going to be traumatic for one group or another. But that was not the reason for my chill. This was more a moment of existential dread.

Twenty-twenty marks the start of something, but I don’t know what. Well, yes, in one way, we are all past the dystopian vision of the year 2019 as given in the movie Blade Runner, and we still don’t have Aztec-style corporate pyramids and towers shooting flames into the murky night sky of Los Angeles. (And my car still doesn’t fly, dammit!) So that’s a plus for the new year. But that’s not what concerned me.

I can remember from thirty-odd years ago hearing a speaker from The Economist describe how the character of each decade doesn’t change or become consistent and recognizable on the year ending in zero, but on the fourth year into the new decade. So the decade of the 1950s didn’t really end and the 1960s start until 1964. By then we’d been through the Cuban revolution and our ill-fated invasion, the first moves in the war in Vietnam, and the assassination of a beloved president. And then we had the Sixties.

In similar fashion, the Sixties didn’t end in 1970 but in 1974, with the culmination of that war and the impending impeachment and resignation of the president who came in promising to end it. And the economic malaise—recession and inflation, “stagflation”—that started with the oil embargo and gas crisis in that year marked the Seventies and didn’t really end until about 1984. (Another year whose mention will give you the willies.) And then the computer revolution and economic growth of the Eighties, which started with wider acceptance of small desktop computers and personal software, fueling the “tech stock” market boom, changed the economic structure of the country and continued through the early Nineties.

You can define your own parallels to this theory, in cultural norms, music, clothing styles, and sexual mores, but I think the pattern holds true. Decades don’t change on the zero year but about four years later.

I think something similar happens with centuries. But there the change comes not in the fourth decade but in the third, the years counted in twenties.

The 18th century, which was marked—at least in the Western European sphere—by the wars between England and France, culminating in revolution in the American colonies and then in France, the rise of Napoleon, and the struggle for all of Europe, extending to the shores of the Nile and the heart of Moscow, did not really end until the middle of the eighteen-teens. Similarly, the breakaway of the United States from England and the finding of this new country’s own character did not really conclude until the War of 1812. After that decade, the 19th century came into its own.

And then the Victorian, imperial, colonial über-culture that—at least in Western Europe—took the superiority of one race and one extended culture for granted did not end until the shattering madness of World War I. And what came out of that war was the new century, with a perhaps more enlightened view about races and cultures—at least for some people—but also a clutch of hardened counter-ideologies and the technological means to pursue them to horrific ends. After that first global war, the 20th century came into its own.

And finally, the 20th century has been with us ever since. The fall of the Berlin Wall in 1989, with the putative end of the Cold War, was a highway marker, but the effects of that lingering aggression and its string of little colonial brush wars and invasions (Korea, Vietnam, Grenada, Kuwait, Iraq, Afghanistan) continued along the same lines, although those lines were becoming blurred with the rise of Arab and Muslim nationalism and the flexing of Chinese muscles. And over them all have loomed the technological changes in warfare that started in World War I, with the machine gun and chemical agents, and continued in World War II, with the jet engine, the rocket, and the atomic bomb. The century saw war become less about armies marching against each other on a common battlefield and, because of “the bomb,” more about political and ideological maneuvering, guerrilla techniques, and terrorist tactics.

You can define your own parallels to this theory with the way that political and cultural norms, music, clothing styles, and even sexual mores changed in the 1920s. The theory still holds: centuries don’t change on the zero-zero year but about twenty years later.

So, although we have been living in the 21st century for the past two decades, it still feels like an extension of the 20th century. We have the same international tensions, the same small wars in far-off countries, the same political differences at home. And, aside from the power of my handheld phone and the number of small computers controlling everything from my car’s ignition to my coffeepot, this century doesn’t feel all that different from the one in which I was born and raised. But still, I sense big things coming, and hence the existential dread.

What are these “things”? My internal leaf-reader and star-mapper doesn’t know. But my sense as a science-fiction writer can offer a few guideposts.

First, the technical revolution that brought us the small computer as a workhorse in everyday life will bring us artificial intelligence in the same role, but maybe more aggressive. And unlike the one big IBM 360 that sat in the basement of the computer science lab at Penn State in the 1960s and ran most of the university’s administrative functions on campus, or the one all-pervasive “Skynet” that destroys the world in the Terminator movies, artificial intelligence will exist as distributed entities, functioning everywhere. As soon as we work out the neural-net (i.e., capable of learning) architecture and marry it to the right programming, these intelligences will proliferate through our technology and our society like chips and software.1

But don’t think of “intelligence” as being a human-type soul or a little man in a software hat or even a Siri or Alexa. You won’t be holding a verbal discussion with your coffeemaker about whether you feel like dark or medium roast this morning. Instead, you will find that your world is going to have an eerie sense of precognition: answers and opportunities are going to come your way almost seamlessly, based on your past behavior and choices. Your phone is going to become your life coach, trainer, technical expert, and conscience. This trend is only going to expand and multiply—and that’s just on the personal level.

On the macro level, business operations and relationships, lawyers and judges, doctors and hospitals, and any sphere you can think of where knowledge and foresight are valuable, will change remarkably. You won’t go into a contract negotiation, a court appearance, or a diagnostic session without the expert system at your elbow as a kind of silicon consigliere. We’ve already seen the Jeopardy-playing IBM Watson prove itself as a master of languages, puzzles and games, and historical, cultural, scientific, and technical references. The company is now selling Watson Analytics to help manage business operations. This trend is only going to expand and multiply.

Second, the biological revolution that brought genetics into the study of medicine—and the sequencing of the human genome was completed on the cusp of this 21st century—will see a complete makeover of the practice. In this century, we will come to know and understand the function of every gene in the human body, which means every protein, which means almost every chemical created in or affecting the human body. That will change our understanding not only of disease but of health itself. Sometime in this century—all of the ethical handwringing aside—we are going to be modifying human genes in vivo as well as in egg, sperm, and embryo. From that will come children with superior capabilities, designer hair and eye color (just for a start), and stronger immune systems, among other things. The definition of “human” will be rewritten. Adults will benefit, too, by short-circuiting disease and regaining strength in old age. This trend is already under way with gene therapies, and it will explode in practice and popularity.

Moreover, the nature of our material world will change. Already, scientists are examining the genetic capabilities of other organisms—for example, Craig Venter’s people sifting the seawater on voyages across the oceans, looking for plankton with unique gene sequences—and adapting them to common bacteria and algae. You want a pond scum that lies in the sun and releases lipids that you can then skim up and refine like oil? That’s in the works. You want to harvest beans that have valuable nutritional proteins, bleed red, and taste like meat? That’s in development, too. You want synthetic spider silk and plastics as strong as—or stronger than—steel? You want carbon graphene sheets and Bucky-ball liquids that have the strength of diamonds, the electrical capacitance of metals, and the lightness and flexibility of silk? Just wait a few years.

As they say in the song, “You ain’t seen nothing yet.”

1. See Gutenberg and Automation from February 20, 2011.