Sunday, January 26, 2020

Living With One Hand

Hand in cast

About a month ago, on Christmas Eve to be exact, I stumbled and fell on the sidewalk, landing on the side of my face and across the back of my left hand. Among other bangs and bruises, which have since healed, I broke three bones in that hand and wrist. Between the splint put on in the emergency room and the cast fashioned by the hand specialist—luckily, there was no need for corrective surgery—I now have limited use of those four fingers and the barest pinch between thumb and forefinger. Luckily also, I am right-handed. According to the hand doctor, the bones are not shifting and they are knitting well. But still, I may be another month or so in the cast and then a removable splint.

I try to treat everything as a learning experience. So what have I learned from this, other than to be more careful and watch my step? Well, for one thing, I have a greater understanding of those who, by accident or illness, have lost the use of their fingers or their whole hand. Here are the limitations—temporary limitations, I remind myself—so far encountered.

Typing: I am by training—self-taught from a typewriter manual one winter break during high school—and by nature a two-hand, ten-finger touch typist. No hunt-and-peck for me. So much so that being able to sit down and share my brain through my fingers is now part of my writing process.1 I can still type by spider-walking my right hand across the keyboard, using my left thumb or forefinger to hold down the occasional shift or control key. The result is barely faster than keying into the search function on your DVD player using the remote’s arrow pad. And I still hit the wrong key, or a combination of right and wrong keys, ten percent of the time. So writing is hard.2

Pockets: For fifty-odd years the distribution of daily necessities in my pants pockets has been: knife, comb (when I carried one), and handkerchief in right front; keys, coins, and sundry tokens in left front; wallet in left rear; nothing, or a washcloth for use as an emergency towel, in right rear. With my left hand encumbered by the cast, this time-honored practice goes out the window, and everything, stripped down to minimum, goes on the right side. I’m still fumbling around with this.

Child-proof caps: These devices of the devil, approved for medications taken by sick and feeble people, are impossible to work one-handed, even with the support of a few incompetent and weak fingers and a thumb on the other hand. Once I finally get a cap off, I leave it sitting lopsided on top of the bottle until I need to take another pill.

Zip-lock bags: They’re ingenious—until you realize they’re designed for two working hands with full grip strength. No combination of two fingers on one side, two on the other, can break the seal. The only way I can get them open is to pin one side against the countertop with my left thumb and grip and pull up the other side with my right hand. The contents only occasionally spill out all over the counter.

Buttons: Another ingenious device that really takes two hands. I can use my left thumb working against my cast to hold the buttonhole steady, if not partway open, then use all the fingers and thumb of my right hand to work the bottom edge of the button through the hole and try to anchor the top edge before the bottom slips out again. I can work the big buttons on a sweater, but I give up on all the tiny buttons on a dress shirt. So I wear mostly polo shirts these days. And sleeve buttons, right or left—forget it!

Jackets: Any garment, really, that does not have an expandable sleeve opening, like a winter coat with a knit cuff, blocks passage and traps the bulky cast. I’m too cold to wear my coat draped over the left shoulder, like a Hussar’s pelisse, and too frugal to cut open the seams of the left sleeve on a good coat for a temporary situation like this.

Knots: Unless I can use my teeth, knots are impossible. Shoelaces are hopeless. So I wear mostly slip-on sandals. And socks are hopeless, too. I can do a sloppy granny knot with something as big and thick as a bathrobe belt, but it doesn’t hold for long.

Zippers: I can work a zipper if it’s anchored at one end, like the fly in men’s trousers. But the open-ended version, like on a jacket, is impossible. I just can’t get the pin on one side inserted into the box on the other, manage to hold them both steady, and still pull up on the slider tab. So my jacket flies loose in the wind.

Manual transmission: Yeah, I’m driving a stick again, now that I’m not commuting 35 miles a day each way through heavy traffic. I traveled by Uber for the first couple of days after the fall, when my whole arm was in a splint. My cast now lets me take a light grip on the steering wheel while shifting; so I can still drive—but carefully. And sharp turns require a big over-and-under maneuver with my right hand; so I must plan ahead to avoid shifting while turning. The clutch lever on the motorcycle is hopeless, however, and I have no strength to hold onto the left grip, much less steer; so the bike is on a battery tender for a couple of months. Anyway, I can’t get my left arm into my leather riding jacket.

Personal grooming: Washing my face one-handed feels weird and incomplete. I can shower using a big rubber-and-vinyl glove that slips on over the cast, but then the cleaning process is mostly feeling around one-handed with the soap or washcloth, and my right side gets much less thorough attention than my left. Shaving and toothbrushing are no problem with modern electronic devices, but putting deodorant on my right armpit using only my right hand is an exercise in gymnastics. Luckily, I’m still flexible enough.

Heavy lifting: The hand specialist warned against this. Any use of my left thumb or fingers other than to steady a load in my right hand is perilous. Anything that tugs or pulls on those fingers—like holding the dog’s leash while I lock or unlock the door—risks moving the bones that are trying to knit, giving rise to an ominous ache. I still use the left hand in a limited fashion, but carefully and with occasional pain.

Living with one hand and a few weak fingers, everything is harder and takes longer, and anything that puts pressure on the broken bones or the wrist hurts. So life goes on—but at a much slower pace.3

1. I’ve tried the Dragon NaturallySpeaking dictation software through a number of its upgrades. Ninety-five percent accurate is still one goofball error every twenty words, and the voice-correction protocol—required so that the software can learn from its mistakes—is painfully slow and irritating. Also, my mind just does not compose through the spoken word.

2. I once bought an ingenious little device called a Frogpad, a one-handed keyboard, but never learned to use it comfortably. The company now seems to be out of business. Pity—I could use it this time.

3. Given how difficult it is to type, and that the last few blogs were already written and in the can before the accident, this may be the last posting I can make for another month or so. I have to take these things one at a time.

Sunday, January 19, 2020

The Writer’s Job

Midnight writer

I have been pursuing this profession—writer and sometimes editor—for almost sixty years now. I first got the inkling1 when I was about twelve years old and attempted my first novel.

That was a fragment, not even a complete story, about a steam locomotive and passenger cars in the Old West that pull into the station with everyone on board dead. It was a fine setup for a mystery, except that I didn’t understand at the time that first you have to know what happened, then you wind it back to what the reader—and presumably the detective, of which I had none—first learns. So I had a setup with no premise.2 But it was a start. I wrote out what I had on an old Royal portable typewriter that was in the family, created a cardboard-and-crayon cover, and stitched it together with yarn. It was a rude start, but I was on my way.

What drew me to writing, when I knew nothing about it, was that a writer—specifically a fiction writer, specifically a novelist—could apparently work for himself from home, rather than for somebody else in an office, and could count his time as his own, rather than keeping to somebody else’s schedule. Well, it was a dream. But it fit my bookish and solitary nature. Besides, it was clean, literary, intellectual work, and you didn’t have to hurt anybody.3

My second novel, written at age sixteen, was a much grander affair: with a first draft in fountain pen on white, lined tablets; second draft typed double spaced, with margins, two copies using carbon paper on my grandfather’s upright Underwood, just like the publishers wanted; and running 472 pages, or about 60,000 words, all told. It was a dreadful space opera about a renegade university professor and rebel leader against an interstellar empire, with a romantic subplot. It had a beginning, middle, and ending—and I knew even as I finished it that the damn thing was unpublishable.4 But the effort was what counted, and it got me fixed on my present course.

My novel career paused when I went to college and studied English literature. I had no ideas for another book—having been emotionally drained by the space opera—and was too busy anyway with my studies and the mountains of reading they required. But that reading gave me perspective on literature and the language. And all along I had thought that, when I graduated, I would immediately write another novel and make my name with my first published book. I had dreamed that I would support myself with fiction writing.

But about three months before graduation I took mental stock and realized I still had no usable ideas, nothing to say. This is not surprising, because few people in their early twenties have much to say to the adult world—which was my preferred venue—and the market for Young Adult literature is limited. So I was suddenly faced with the realization that I needed a “day job” to support my imminent debut into the real world.5 And what I was best qualified for was work as an editor. Through the graces of one of my professors, I got a junior position at the Pennsylvania State University Press. It was eight hours a day on my butt with a blue pencil, correcting spelling and grammar, untangling sentence structure, and marking copy for typesetting, all according to the Chicago Manual. But I loved it. After the university press, I went to a tradebook publisher—where I learned about that railroad tragedy, and much else about the West and my newly adopted state of California—and from there to technical editing of engineering and construction reports and proposals.

My third unpublishable novel came about in my late twenties, while I was working for the engineering company. Based on the time-honored mantra of “write what you know,”6 I tried to write a business novel about a second-tier construction company scrambling to answer a major client’s request for proposal for a mega-million-dollar mining development in the Third World.7 That book progressed as far as a rough first draft, although I never sought a publisher.

In the meantime, I went from engineering and construction, to a public utility providing electricity and gas, to an international pharmaceutical company, to a pioneering maker of genetic analysis instruments, with stop-offs working as a temp in administration at two local oil refineries. In each case, I worked first as a technical writer—learning the secrets of the company’s respective industry—and then moved into internal communications—explaining the company’s business to its employees. And in every case, I was building myself an understanding of and intimacy with the business world and its technological basis, an understanding that I have been mining ever since as background for most of my novels.

So … what is the writer’s job in all of this? It is pretty much the same, whether the task is editing another writer’s work, creating and editing technical documents, writing newsletter articles and press releases, or writing a full-blown novel, whether historical fiction or far-future adventure.

First—and especially in fiction—take the reader somewhere new, show the reader the unique side of an everyday life situation, or of a product or technology, something that he or she has never considered before. There are two ways to approach a story: you can come at the topic head-on and flat out, or from an oblique angle and on the bounce. Think of the latter as putting “English” on the ball, making it spin. That slightly-to-one-side approach catches the reader’s natural defenses off guard and simultaneously raises his or her curiosity. This also works for a new product description or policy analysis—although not in a tightly prescribed document format like a pharmaceutical batch record.

Second—especially in technical writing, communications, and non-fiction editing—make the obscure explicit and the confusing understandable. It is an article of faith with me that nothing is so complex that it cannot be made intelligible to a bright ten-year-old child. But you have to use simple words, work in everyday analogies, and take some extra steps and make some supporting linkages in your reasoning. And you have to use that bounce thing described above to make the reader care in the first place.

Third—and this applies to all writing and editing—be rigorous in your logic and sequence, and honest in your evidence and conclusions. You are invading the reader’s mind, and this is hallowed ground. You can play with the reader’s perceptions and trick his or her understanding in the same way that a magician’s sleight of hand arouses an audience’s awe and wonder. But you can’t lie to the reader or offend his or her senses of fairness, right and wrong, or proportion. And you can never disrespect the reader. For you are playing in another person’s sandbox and, if you offend, will be asked to go home with a slamming of the book.

Fourth—and this applies to almost all types of writing, except perhaps for instruction manuals—paint pictures and tell stories. The human mind is not exactly immune to bare facts, but we have a hard time understanding them and fixing them in memory without a context. This is why storytelling and visual representation have been so powerful in almost all human cultures. This is why religious groups and political parties create a “narrative” to support their core truths. Your job is to create in the reader’s mind a structure made of words, mental images, and associations that carries your message.

To be a writer is to be, effectively, inside somebody’s head by invitation. Play nice. And have fun.

1. What a writerly word! You can almost smell the printer’s ink and hear the presses hum.

2. Curiously, I was foreshadowing one of the tragedies of early railroading, when the trains of the Central Pacific Railroad had to navigate miles of mountain tunnels and snow sheds in the Sierra Nevada, and the accumulated coal smoke asphyxiated their engine crews. From this was born the generation of oil-fired, cab-forward steam engine designs, which worked that route for years.

3. Except, of course, your characters—which I also didn’t understand at the time. But they are only made of smoke and dreams.

4. Anytime you hear about anyone writing a brilliant first novel, count on it being their second or third completed manuscript. Even Harper Lee had to go through this process.

5. Back in my teens, when I was working on the space opera, I wrote to one of my favorite authors, Ray Bradbury, asking if he would read my novel. He politely declined, which was a blessing. But he then suggested that, to become a fiction writer, I should not go to college and instead get a menial job as a dishwasher and just write, write, write, and submit everything to publishers. That course for me, as a teenager, would have been a disaster. Given my subsequent history of real-world, practical experience, I don’t think it would have worked out any better for me as a college graduate.

6. Which is a trap. The command should be “write what excites you, that you know a little something about, but you want to know much, much more.” If your current life is dull—that future as a dishwasher toward which I had been urged—it shouldn’t limit your scope and imagination. And these days, with all of the online resources available, research is easy, right down to getting street views of any city and most towns around the globe.

7. Well, not every idea is a good one. However, some of that story—and so much more—found its way into Medea’s Daughter forty-odd years later.

Sunday, January 12, 2020

Memory and Imagination

One hand

As I think through the way I go about constructing the elements of a story line for a novel and then determining the next actions, images, details, and anecdotes that will support the plot, I have made an interesting discovery: these things pop into my mind—presumably from the subconscious1—in the same way that random memories come to mind.

Remembering in this way is different from sitting down, at least mentally, and asking myself what I can recall from a past experience: my graduation day, my wedding day, or any memorable event that is supposed to “stick in the mind.” No, this is the sort of sense, image, or snatch of conversation that surprises me when I’m doing something completely different: a flash of the roadway where I was driving on a trip three years ago, or the image of a house I once visited, or a fragment of what someone once said to me. These things come “unbidden” and when they are least expected.

The difference with the ideas, images, and dialog fragments that come to me for my work in progress is, first, that these imagined things did not happen, have never happened, and usually have no relationship to—are not an echo of—anything that has ever happened to me. Second, and unlike the act of forced recall, I find myself unable to sit down, mentally, and ask myself what should happen after the starting point of a story, or what image or sound or sensory perception would best illustrate the next action scene, or what the characters would say in the situation. I can pose to myself the question of what should happen next, but then I have to go off and take a walk or bike ride, or get my morning shower (warm water hitting my right shoulder and the side of my neck seems to be especially conducive to creative thoughts), and otherwise let my subconscious mull the problem. And then later, out of nowhere, like an unbidden memory, I know what the next action, image, or dialog bit should be.

So it’s apparent that both memory and imagination—at least in my case, and in dealing with a developing story—come from the same place, the subconscious. But the one is a reflection of something that actually happened, while the other is a bit of imagery or dialog with no context other than the mental structure of the novel I’m writing at the time.2

I have noted for many years, going back to my time as a teenager, something I call “the Return.” This is an instance of my hearing a piece of music and then later—two to four days, usually—having it pop into my mind, again unbidden. I will find myself singing or humming the melody or, more often, just “hearing” it in my head. I have also noted that the speed of the Return is linked to my mental and physical health. When I am in good spirits and feeling well, the music will come back in two days—but seldom less. When I am feeling poorly, the return might take four days or not happen at all.

For comparison, the return of a random memory—a place image or a conversation—that occurs while I’m doing something else might take two to a dozen years, while the subconscious production of a scene, action, image, or dialog bit that I have posed to myself for consideration generally takes less than two days. So the two processes—memory and imagination—while alike, are not identical.

Interestingly, none of this has anything to do with dreaming, which apparently comes from another part of the brain. While I have vivid, busy, and sometimes frustrating dreams—but seldom what you would call nightmares—few of them have to do with specific memories, although some are linked to places in my memory, specifically two of the houses we lived in when I was a child. No dream has yet produced anything related to one of my books—not a usable plot point, image, or bit of dialog. However, I occasionally wake up, after dreaming whatever dreams I had, and suddenly get an idea that might help the current book along. Only once in all my experience did I dream of a character in one of my books, and that was my first novel, written as a teenager, and the dream occurred about two years after I finished writing it.

All of this might mean something to a psychologist, or maybe a neurologist, but to me this is just the way my mind works. And the most important conclusion—at least for me—is that I cannot force the construction of a plot or of any particular part of a story. I have no secret list of plot outlines,3 no mechanism for character generation, nor other out-of-the-box novel-writing aids. The story comes from the dark—as in hidden or unilluminated, rather than foreboding or evil—places of my mind.

The story, characters, inciting incidents, and resulting actions have to come from someplace else—outside me, inside but hidden … the gods … the stars—in order for me to believe in them as real people doing real things that have meaning. Yes, I know that they are made up—but I have to forget that part for the story to live inside me. And once I have the outline—that composite of structure and belief—as well as a notion of the associated actions, sense images, and dialog, I can let the magic happen.

Magic? That’s when I sit down at the keyboard, put my mind on its invisible track, forget myself, and let the story flow through me.

1. See Working With the Subconscious from September 30, 2012.

2. Curiously, works that I wrote in the past, stopped writing, and put an end to, do not generate new imagery or dialog. I might suddenly wake up and recall a word, image, or dialog line that I once put on the page and now know to be an error, or lacking, or incompletely linked to something else in the novel—and usually this is still related to the work in progress. But I don’t generate new ideas and directions for the old books. My internal mental process seems to know when what’s done is done.

3. I’ve heard it said that there are only seven plots in all of storytelling. If so, I’d like to know what they are. Whenever I hear this, the examples given are so general (“Well, there’s boy-meets-girl …” or “the hero’s journey …”) as to be practically useless. Once you’ve picked that type of story, then the real work of generating setting, character, incident, and action begins.

Sunday, January 5, 2020

Twenty-Twenty

Crystal ball

The first time I had occasion to write the calendar date “2020” on anything was a few weeks ago, in preparing the draft of the January-February edition of our local NAMI chapter’s newsletter. And as I keyed in those digits, a chill went through me. It was as if I knew, in that sleepless but dormant part of my brain that reads the tea leaves and maps them onto the stars, that this year would be significant.

Oh, I know, we’re going to have a watershed election. And given the political emotions that are running in this country, the campaigning, the election day, and the aftermath are going to be traumatic for one group or another. But that was not the reason for my chill. This was more a moment of existential dread.

Twenty-twenty marks the start of something, but I don’t know what. Well, yes, in one way, we are all past the dystopian vision of the year 2019 as given in the movie Blade Runner, and we still don’t have Aztec-style corporate pyramids and towers shooting flames into the murky night sky of Los Angeles. (And my car still doesn’t fly, dammit!) So that’s a plus for the new year. But that’s not what concerned me.

I can remember from thirty-odd years ago hearing a speaker from The Economist describe how the character of each decade doesn’t change or become consistent and recognizable on the year ending in zero, but on the fourth year into the new decade. So the decade of the 1950s didn’t really end and the 1960s start until 1964. By then we’d been through the Cuban revolution and our ill-fated invasion, the first moves in the war in Vietnam, and the assassination of a beloved president. And then we had the Sixties.

In similar fashion, the Sixties didn’t end in 1970 but in 1974, with the culmination of that war and the impending impeachment and resignation of the president who came in promising to end it. And the economic malaise—recession and inflation, “stagflation”—that started with the oil embargo and gas crisis in that year marked the Seventies and didn’t really end until about 1984. (Another year whose mention will give you the willies.) And then the computer revolution and economic growth of the Eighties, which started with wider acceptance of small, desktop computers and personal software, fueling the “tech stock” market boom, changed the economic structure of the country and continued through the early Nineties.

You can define your own parallels to this theory, in cultural norms, music, clothing styles, and sexual mores, but I think the pattern holds true. Decades don’t change on the zero year but about four years later.

I think something similar happens with centuries. But there the change comes not in the fourth decade but in the third, the years counted in twenties.

The 18th century, which was marked—at least in the Western European sphere—by the wars between England and France, culminating in revolution in the American colonies and then in France, the rise of Napoleon, and the struggle for all of Europe, extending to the shores of the Nile and the heart of Moscow, did not really end until the middle of the eighteen-teens. Similarly, the breakaway of the United States from England and the finding of this new country’s own character did not really end until the War of 1812. After that decade, the 19th century came into its own.

And then the Victorian, imperial, colonial über-culture that—at least in Western Europe—took the superiority of one race and one extended culture for granted did not end until the shattering madness of World War I. And what came out of that war was the new century, with a perhaps more enlightened view of races and cultures—at least for some people—but also a clutch of hardened counter-ideologies and the technological means to pursue them to horrific ends. After that first global war, the 20th century came into its own.

And finally, the 20th century has been with us ever since. The fall of the Soviet Union and putative end of the Cold War in 1989 was a highway marker, but the effects of that lingering aggression and its bunch of little colonial brush wars and invasions (Korea, Vietnam, Grenada, Kuwait, Iraq, Afghanistan) continued along the same lines, although those lines were becoming blurred with the rise of Arab and Muslim nationalism and the flexing of Chinese muscles. And over them all have loomed the technological changes in warfare that started in World War I, with the machine gun and chemical agents, and continued in World War II, with the jet engine, the rocket, and the atomic bomb. The century saw war become less about armies marching against each other on a common battlefield and, because of “the bomb,” more about political and ideological maneuvering, guerrilla techniques, and terrorist tactics.

You can define your own parallels to this theory with the way that political and cultural norms, music, clothing styles, and even sexual mores changed in the 1920s. The theory still holds: centuries don’t change on the zero-zero year but about twenty years later.

So, although we have been living in the 21st century for the past two decades, it still feels like an extension of the 20th century. We have the same international tensions, the same small wars in far-off countries, the same political differences at home. And, aside from the power of my handheld phone and the number of small computers controlling everything from my car’s ignition to my coffeepot, this century doesn’t feel all that different from the one in which I was born and raised. But still, I sense big things coming, and hence the existential dread.

What are these “things”? My internal leaf-reader and star-mapper doesn’t know. But my sense as a science-fiction writer can offer a few guideposts.

First, the technical revolution that brought us the small computer as a workhorse in everyday life will bring us artificial intelligence in the same role, but maybe more aggressive. And unlike the one big IBM 360 that sat in the basement of the computer science lab at Penn State in the 1960s and ran most of the university’s administrative functions on campus, or the one all-pervasive “Skynet” that destroys the world in the Terminator movies, artificial intelligence will exist as distributed entities, functioning everywhere. As soon as we work out the neural-net (i.e., capable of learning) architecture and marry it to the right programming, these intelligences will proliferate through our technology and our society like chips and software.1
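
As a concrete gloss on “capable of learning,” here is a minimal sketch, in Python, of the oldest neural-net idea: a single perceptron that works out its own weights from examples rather than being handed a rule. This is my illustration, not anything from the post; the function names and training data are invented for the example.

```python
# A toy perceptron: the simplest "neural net capable of learning."
# All names and data here are invented for illustration; this is a sketch,
# not the architecture the post is predicting.

import random

def train_perceptron(examples, epochs=50, lr=0.1):
    """Learn weights for a single artificial neuron from (inputs, target) pairs."""
    random.seed(0)                      # reproducible toy run
    n_inputs = len(examples[0][0])
    weights = [random.uniform(-0.5, 0.5) for _ in range(n_inputs)]
    bias = 0.0
    for _ in range(epochs):
        for inputs, target in examples:
            # Prediction: weighted sum of the inputs pushed through a step function.
            activation = sum(w * x for w, x in zip(weights, inputs)) + bias
            output = 1 if activation > 0 else 0
            # Learning: nudge the weights in the direction that reduces the error.
            error = target - output
            weights = [w + lr * error * x for w, x in zip(weights, inputs)]
            bias += lr * error
    return weights, bias

if __name__ == "__main__":
    # Teach the neuron the logical AND function from examples alone.
    and_examples = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
    weights, bias = train_perceptron(and_examples)
    for inputs, target in and_examples:
        activation = sum(w * x for w, x in zip(weights, inputs)) + bias
        print(inputs, "->", 1 if activation > 0 else 0, "(expected", target, ")")
```

Everything the trained neuron “knows” ends up in a handful of numbers it adjusted for itself, which is the sense in which such architectures learn rather than follow explicitly programmed rules.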

But don’t think of “intelligence” as being a human-type soul or a little man in a software hat or even a Siri or Alexa. You won’t be holding a verbal discussion with your coffeemaker about whether you feel like dark or medium roast this morning. Instead, you will find that your world is going to have an eerie sense of precognition: answers and opportunities are going to come your way almost seamlessly, based on your past behavior and choices. Your phone is going to become your life coach, trainer, technical expert, and conscience. This trend is only going to expand and multiply—and that’s just on the personal level.

On the macro level, business operations and relationships, lawyers and judges, doctors and hospitals, and any sphere you can think of where knowledge and foresight are valuable, will change remarkably. You won’t go into a contract negotiation, a court appearance, or a diagnostic session without the expert system at your elbow as a kind of silicon consigliere. We’ve already seen the Jeopardy-playing IBM Watson prove itself as a master of languages, puzzles and games, and historical, cultural, scientific, and technical references. The company is now selling Watson Analytics to help manage business operations. This trend is only going to expand and multiply.

Second, the biological revolution that brought genetics into the study of medicine—and the sequencing of the human genome was completed on the cusp of this 21st century—will see a complete makeover of the practice. In this century, we will come to know and understand the function of every gene in the human body, which means every protein, which means almost every chemical created in or affecting the human body. That will change our understanding not only of disease but of health itself. Sometime in this century—all of the ethical handwringing aside—we are going to be modifying human genes in vivo as well as in egg, sperm, and embryo. From that will come children with superior capabilities, designer hair and eye color (just for a start), and stronger immune systems, among other things. The definition of “human” will be rewritten. Adults will benefit, too, by short-circuiting disease and regaining strength in old age. This trend is already under way with gene therapies, and it will explode in practice and popularity.

Moreover, the nature of our material world will change. Already, scientists are examining the genetic capabilities of other organisms—for example, Craig Venter’s people sifting the seawater on voyages across the oceans, looking for plankton with unique gene sequences—and adapting them to common bacteria and algae. You want a pond scum that lies in the sun and releases lipids that you can then skim up and refine like oil? That’s in the works. You want to harvest beans that have valuable nutritional proteins, bleed red, and taste like meat? That’s in development, too. You want synthetic spider silk and plastics as strong as—or stronger than—steel? You want carbon graphene sheets and Bucky-ball liquids that have the strength of diamonds, the electrical capacitance of metals, and the lightness and flexibility of silk? Just wait a few years.

As they say in the song, “You ain’t seen nothing yet.”

1. See Gutenberg and Automation from February 20, 2011.

Sunday, December 29, 2019

Writing Ourselves into a Box

Time warp

Maybe … contrary to our theories, the universe is not uniform throughout, or not at all scales. Maybe gravity is not a constant at all scales but has different values at different scales.

We know that the force of gravity appears to be nonexistent at the subatomic scale, apparently playing no part in quantum mechanics. We can measure the force of gravity and use gravity equations at the planetary, solar, and interstellar scales, where it governs the attraction between bodies with which we are familiar. But at the galactic scale, the gravity of the matter we can see is apparently not strong enough to produce the effects we observe. Stars in a spiral galaxy spin as if they were stuck to a platter instead of floating freely, which suggests each galaxy contains more matter than we can account for by the stars that shine and the planets, brown dwarfs, and black holes that we presume must accompany them.
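
To put that “stuck to a platter” observation in textbook terms (my gloss, not part of the original post), the orbital speed of a star at radius r is set by the mass enclosed within its orbit:

```latex
% Circular-orbit speed from the mass enclosed within radius r:
%   v(r) = sqrt( G M(<r) / r )
% If nearly all of a galaxy's mass sat in its bright central bulge, M(<r)
% would level off and v(r) would fall as 1/sqrt(r) -- a Keplerian decline.
% Measured rotation curves instead stay roughly flat far from the center,
% which forces M(<r) to keep growing with r:
v(r) = \sqrt{\frac{G\,M(<r)}{r}}, \qquad
v(r) \approx \text{constant} \;\Longrightarrow\; M(<r) \approx \frac{v^{2}}{G}\,r .
```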

Instead of considering variable gravity, we hypothesize that something else must be at work: a substance called “dark matter,” which does not interact with normal matter or electromagnetism at all, yet has a measurable gravitational effect—apparently only at very large scales. The effect on galaxies is so apparent that we credit this “dark matter” with comprising 85% of the matter in the universe and about 25% of its total energy density. All this for something we cannot see, touch, feel, or detect—only “observe its effects” at very large scales.

Perhaps we are wrong in our assumptions about the gravitational constant being fixed at all scales larger than that of quantum mechanics.

This kind of “wrong” thinking—if it is wrong, and perhaps “dark matter” actually does exist—has happened before. During the 19th century, geologists debated between “catastrophism” (the theory that geological changes in the Earth’s crust were created by sudden and violent events) and “gradualism” (the theory that profound change is the cumulative product of slow but continuous processes). Gradualism would be the slow accumulation of sediments to form rocks like sandstone and shale. Catastrophism would be sudden explosions like volcanos or events like the formation of the Snake River Canyon from the collapse of the earth- or ice-dam holding back Lake Bonneville about 15,000 years ago. What has since been resolved is that either theory on its own is too constricting, too reductive, and that both theories and systems played their part in the formation of the Earth’s surface.

This is probably not the case with gravity. A variable gravitational constant vs. the existence of dark matter is probably an either/or rather than a both/and situation. One kind of physics that explains the observations does not necessarily make room for a second kind to explain them. However, it wouldn’t surprise me if each galaxy contained more normal matter that we can’t see by either the ignition or the reflection of starlight—that is, black holes, brown dwarfs, and rogue planets—than we have habitually counted. And still there may be something going on with the effects of gravity at the very small and the extremely large scales that is out of whack with the gravitational constant we see at the planetary and stellar scale.

Time and distance are variables that we have only been seriously considering over the past hundred years or so—ever since Edwin Hubble exploded the idea that the local stars in the Milky Way galaxy were all that comprised the universe and instead introduced the idea that the fuzzy patches (“nebulae,” or clouds) on photographic plates were actually distant galaxies that are not much different from the Milky Way, and that they all were receding from us. In one conceptual leap, the universe, the cosmos, all of creation, became really, really—add a few exponents here—big.

But the universe did not become correspondingly old. We still had various measurements and computations putting the age of the oldest stars and the universe itself at approximately thirteen billion years. That’s about four generations of stars from the creation of matter itself to the Sun with its disk of gas and dust that contains elements which must have formed in the fusion processes and catastrophic collapse of older stars. By winding back the clock on the expansion, the scientific consensus came up with a universe that started in the explosion of, or expansion from, a single, tiny, incredibly hot, incredibly dense—add more exponents here, with negative signs—point. That singular point contained all the matter we can see in the hundreds of billions of galaxies within our view.

Because that view comprises a distance that the farthest galaxies could not have reached in thirteen billion years, even when receding from a common point at the speed of light, a scientific consensus originating with Alan Guth has formed around an “inflationary” universe. At some time, the story goes, at or near the very earliest stages of the expansion, the universe blew up or inflated far faster than lightspeed. Perhaps from a hot, dense thing the size of a proton to a cloud of unformed matter probably the size of a solar system, the whole shebang expanded in a blink—whereas a photon from the Sun’s surface now takes about five and a half hours to reach Pluto.
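
For what it is worth, that five-and-a-half-hour figure checks out with back-of-the-envelope arithmetic (my check, taking Pluto’s average distance as roughly 39.5 astronomical units and light’s travel time as about 499 seconds per astronomical unit):

```latex
% Light travel time from the Sun to Pluto, at Pluto's mean distance:
t \approx 39.5\ \text{AU} \times 499\ \tfrac{\text{s}}{\text{AU}}
  \approx 1.97\times 10^{4}\ \text{s} \approx 5.5\ \text{hours}.
```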

While I don’t exactly challenge the Big Bang and the inflationary universe, I think we are reaching for an easily understandable creation story here. We want to think that, because we know the universe is now expanding, and because our view of physics is that basic principles are the same in every place and at every scale, and that processes continue for all time unless some other observable force or process interferes with them, then the universe must have been expanding from the beginning: one long process, from the hot, dense subatomic pencil point that we can imagine to the vast spray of galaxies that we see—extending even farther in all directions than we can see—today. And with such thinking, we may be writing our imagination and our physics into a box.

The truth may lie somewhere in between. And I don’t think we’ve got our minds around it yet.

Sunday, December 22, 2019

Market Forces

Cooking the Meat,
from the Bayeux Tapestry

I believe the views of the economy, as expressed in the politics of the left and the right, suffer from a fundamental philosophical difference—and it may not be what you think.

Basic economics is descriptive, not prescriptive. It tries to observe and describe what people actually do, not what the economist thinks they do or what they should do. In this, economics is a science like biology or physics: it is trying to describe the real world. In this sense, it is neutral about outcomes and concerned only with accurate observation and reporting.

Socialist, communist, or Marxist economics may start with description—after all, Karl Marx’s book was Das Kapital, which started out describing a system that was developing in a newly industrialized 19th-century society. But then he, along with all these “-ism” and “-ist” studies, launched into theory and prescriptions about how people should act in order to reach a desirable state, a utopia of fairness, equality, and happiness. In this sense, Marxist and socialist economics try to direct and, in some cases, manipulate outcomes, because their intention is aspirational rather than observational.

The reason for this difference is that market forces—the natural rules governing how people in large, anonymous groups exchange products, services, and units of value—are part of the human condition. Whether you like the human condition or wish to change it will guide your choice of economic theory.

What do I mean by “natural rules”? Here is a simple thought experiment.

Put two strangers on a desert island and let one come ashore with candy bars in his pockets, which can immediately be consumed for nutritional needs, while the other comes with gold bars in his pockets, which can have far greater value once the holder is rescued and leaves the island. And right there—barring a quick, premeditated homicide—the two survivors have established a market. How much gold will one give up for a candy bar? How much candy will the other give up for gold? And bearing on that question are sensations of hunger, expectations of rescue and its timing, and even the life expectancy of either or both parties before that putative rescue happens.

Where one person possesses something of value—either a tangible good or a useful skill—and the other person must have it, or may need it, or might have no interest in it but still possesses the means to acquire it, there will exist a market. And from that market will grow in the human mind all sorts of calculations: of cost and benefit, of current and future needs and wants, of available resources, and sometimes of pure, irrational desire. Along with these calculations will come a whole range of future projections. These mental gymnastics and their playing out in real-life action can be traced, plotted, and described. They are not a theory, although theories about motive and calculation can certainly be derived from human actions. Instead, they are facts to be analyzed and formulated into rules or laws—not the legislative laws of human societies but descriptive and projective laws, like those of physics and chemistry.
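
Those calculations can even be mocked up in a few lines of code. The sketch below is my own toy rendering of the desert-island bargain, not anything from the post; the reservation-price formula, driven by hunger and the expected wait for rescue, is invented purely for illustration.

```python
# A toy formalization of the desert-island bargain. The reservation-price
# formula below is invented purely for illustration: it is not economics
# from the post, just a way to show that the "calculations" can be written down.

def reservation_price(hunger, days_until_rescue):
    """Ounces of gold one candy bar is worth to this castaway right now.
    Hunger runs from 0 (sated) to 1 (starving); a longer expected wait
    for rescue makes food dearer relative to gold."""
    return hunger * (1 + 0.5 * days_until_rescue)

def negotiate(candy_holder, gold_holder):
    """Return an agreed price per candy bar in gold, or None if no deal."""
    ask = reservation_price(**candy_holder)   # the least the seller will accept
    bid = reservation_price(**gold_holder)    # the most the buyer will pay
    if bid < ask:
        return None                           # no overlap in valuations, no trade
    return (ask + bid) / 2                    # split the difference

if __name__ == "__main__":
    seller = {"hunger": 0.2, "days_until_rescue": 3}   # came ashore with the candy
    buyer  = {"hunger": 0.9, "days_until_rescue": 3}   # came ashore with the gold
    print("Price per candy bar (ounces of gold):", negotiate(seller, buyer))
```

Run it and the hungrier gold-holder pays a premium for candy; change either castaway’s hunger or expected rescue date and the price moves with it, which is the point: the “market” is nothing more than the meeting of those two private calculations.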

Adam Smith, in The Wealth of Nations published in 1776, described how each individual in a society and economy works for his own benefit, maximizing his satisfactions, advantages, and security. But through the collective action of all these individuals—that is, the “invisible hand,” as Smith described it—they achieve positive economic and social ends—that is, the national wealth of which he wrote—without the conscious intention of doing so. Smith was defining a phenomenon, a hidden force, that was already at work. People did not need to read his book and follow his precepts in order to achieve that “wealth of nations.” It just happened and he observed and described it.

Karl Marx, on the other hand—writing The Communist Manifesto in 1848 and Das Kapital twenty years later, almost a century after Adam Smith—started with a description of what was becoming the capitalist system in an industrialized setting, noted the instances in which a machine culture failed to lift everybody out of poverty, and thought to come up with something grander. Unfortunately, although he is regarded as a philosopher, Marx was no real visionary, and his “something” was the communal life of the medieval village.

The economics of the village did not depend on the exchange of money and the investment of capital in factories, but instead existed through the simple exchange of goods and services among individuals known to one another. The farmer raises wheat; the miller grinds it for a share of the crop; the baker makes it into bread for another share; and the cobbler and blacksmith get their daily bread in exchange for making shoes according to the needs of everyone else’s children and horses. That communal system—the isolated feudal castle without the authority of the feudal lord—works well enough on a local scale, where everyone knows and trusts everyone else, as in the Israeli kibbutz or those idealistic communes of the American transcendentalist movement. The feudal system even had an element of future planning and provision: if the miller’s children don’t happen to need shoes this year, the cobbler will still get his bread, because everyone knows that children will need shoes soon enough.

Marx’s error, in my opinion, was in ignoring human nature and believing that what had worked on a local level could be expanded to the national—and eventually the global—economic level. His prescription was not simply that local villages and towns should go back to their feudal roots in isolation, but instead that whole nations could exist on the trust and charity—“from each according to his ability, to each according to his needs”—of the local peasant. That cobblers and factory workers in Kiev would make shoes, not for personal benefit in terms of money wages or a share in the enterprise, but so that everyone in the country who wants shoes will have them. And everyone else in the economy will selflessly make their products—wheat, flour, and bread, along with tractors, milling machines, and bakery ovens—and provide their services—carpentry, plumbing, medical assistance, and legal advocacy—free of charge and in the expectation that everyone else will provide selflessly for their needs. This is the antithesis of Smith’s individual maximizing his own satisfactions, advantages, and security. It is the individual sacrificing for the greater good.

Karl Marx believed that once his system had gotten under way, the state, the governing force, would no longer be needed and would wither away. And people would just go on forever, sharing joyously. Marx was no student of human nature but instead a visionary who dreamed of his own utopia. Wherever his ideas have been tried, they required the state to grow stronger and force individuals into line, usually with actual punishments and threats of death. The alternative was to try to create, through psychology and breeding, a perfectly selfless man, Homo sovieticus, which is to say the perfect slave.

Adam Smith’s capitalism, on the other hand, required no state intervention to operate. People will try to maximize their satisfactions and advantages naturally, and they will only sacrifice for those they hold near and dear, their families and closest friends and associates. But we—that is, later economists—have discovered since Smith that unbridled economic action does need some agreed-upon rules. For example, how is society to treat and dispose of the wastes of a large industrial factory or mechanized, monocrop farm, undreamed of in Smith’s time? And those situations, along with the multiplication of advantage enabled by developing science and technology, do require a government to monitor them and enforce the rules. We have also learned that contracts between individuals do not always operate smoothly, and some system of courts and judges is needed to arbitrate disputes. But the underlying economic system Smith described does not require a magical change in human nature, nor does it need an overbearing state to keep turning the crank of command-and-control decisions, endlessly directing what to make and how and where to deliver it, in order for the system to function.

If you believe that human beings can best decide where their own happiness, benefit, and advantage lie, for themselves and their families, then you subscribe to the natural economics that market forces will bring about—although with some social adjustments, rules of exchange, and the courts for enforcement. If you believe that human beings are flawed creatures, unable to decide where their own interest and that of the greater society lie, then you subscribe to the aspirational economics that only a strong state, the dictatorship of one group—the proletariat, the elite, the “best people,” or Animal Farm’s pigs—over all the others, and an eventual rewriting of human nature itself, can create.

I believe the difference is that simple.

Sunday, December 15, 2019

Chaos vs. Predictability

Red dice

Religious people would have you believe that, without a world or a universe ordered by an omnipresent, omniscient intelligence, or God, everything in human life would be meaningless, unpredictable chaos. If that is the case, I believe they don’t understand what chaos could truly be.

With or without a guiding intelligence, we live in a relatively predictable world. Yes, bad things sometimes happen in the lives of exemplary people, but this is not chaos. Storms sometimes blow down your house. Lightning strikes many places and people at random. Sinkholes open at your feet all over Florida and in other places built on limestone. But an intelligent human being can guard against some of these things. If you live in an area with hurricane-force winds, you can build your house out of steel and stone instead of lath and plaster. If the sky is threatening thundershowers, you can stay inside and not go out, because lightning almost never strikes from a clear sky. And if you live in sinkhole country, you can choose to move or survey the ground before you build.

A truly chaotic world would have hurricane winds and lightning bolts come out of a calm, clear sky. Gravity on Earth would be variable, so that if you accidentally stepped into a null-gee pocket, you might be thrown off into space by the planet’s rotation. Chemical bonds would change without rhyme or reason, so that a substance you had identified as food one day would become poison the next.

But, with or without an omnipresent God, we do not live in such a world or universe. Most interactions and events are governed by natural laws that can be analyzed and understood. Most hazards can be predicted and either avoided or insured against.

And then there is probability. While we can guard against windstorms and lightning, there is not much we can do about asteroid strikes. Even small bodies near the Earth and crossing our planet’s orbit are difficult to observe and harder to avoid—and when they hit, it’s going to be a hard day for anyone inside the impact radius. But still, we can study the history of previous asteroid strikes, observe the skies in our vicinity, and make a prediction based on probability about whether it will be safe to wake up and go outside tomorrow.

And if it’s not safe, if there is no prediction to be made based on identifiable probability, does that mean the world is chaos and life is pointless? Or that God, if He exists, does not love us anymore? Or that, if He does exist and still loves us, He must somehow be testing our faith and our resilience, in this best of all possible worlds?

Some things are unknowable. From a human perspective, which is bounded by experience, reason, and imagination, there are dangers in the universe that we can’t know about until they happen, and so we can’t evaluate them ahead of time. A comet falling toward the sun from the Oort Cloud or the Kuiper Belt for the first time and just happening to cross Earth’s orbit at the same time we intersect its path—that would be such an unknown. The universe is also filled with unresolved questions that extend beyond our experience of life on Earth and observations from within our solar system: the presence, effects, and unforeseen future conditions of gravitational waves, dark matter, and dark energy come to mind. For example, we know the universe is expanding, but we’ve only been watching this process for about a hundred years—a snapshot in geologic, let alone galactic, time. Might the expansion eventually speed up, slow down, or reverse itself? Might that change create effects that could be experienced and measured here on the planet or within our solar system? Our physics suggests not—but the reasoning and mathematics behind our physics are relatively new, too.

The long-term futures of the world, the solar system, the galaxy, and the universe itself are all vaguely predictable but existentially unknowable. Is this chaos? Does this condition demand the existence of a God who knows, understands, and might deign to tell us one day, perhaps through a prophet or a revelation? In my book, probably not.

The world I was born into, or that my parents and teachers prepared me for, is predictable in the short term: what errands I will run this morning; what I might plan to have for dinner tonight; where I’m going to be on Thursday, or next week, next month, or even next year. But in the long term all bets are off. I could die in a motorcycle crash or be struck by lightning or get hit by a bus anytime between now and then—although I would hope to be able to watch out for and guard against any of these eventualities. But still, that uncertainty is all right. Because I was born with the ability to reason in one hand and the capacity for hope in the other, but otherwise I was and am as naked as a clam without its shell. As are all of us.

Nothing is promised. Everything is ventured. Sometimes something is gained. And as Robert A. Heinlein is supposed to have said: “Certainly the game is rigged. Don’t let that stop you; if you don’t bet you can’t win.”