Sunday, March 29, 2020

The Limits of Accountability


Image of the coronavirus taken with an electron microscope
(Credit: U.S. National Institutes of Health/AP/Shutterstock)

I generally try to keep these blogs (or essays, or meditations, whatever) away from absolute topicality and from following the news of the day in short order. My concern is the longer view, the background view, the why rather than the what of events. But the past few weeks have been extremely disturbing to all of us—emotionally, mentally, physically, and financially.

We have seen a virus whose incubation time, severity of symptoms, transmission rate, and mortality are all still uncertain arise and spread around the world in a matter of months—and perhaps, because of initial attempts at hiding the crisis, within just weeks. That has been one problem: governments, scientists, journalists, and anyone with social media access have lied, exaggerated, imagined, and spun counter-factual accounts (okay, “fake news”) about this virus and its effects. We have gotten comparisons—some real, some bogus, and some irrelevant—with the mortality associated with the Spanish influenza pandemic of 1918, the H1N1 influenza pandemic of 2009, and the caseload and mortality of the yearly seasonal flu—as well as with deaths by gunshot and automobile. We hear that young people may get the disease and remain asymptomatic but still be carriers. We hear that older people and anyone with systemic vulnerabilities will likely get it and die.

In response to all this, various state governments around the world and in the U.S. have locked down their populations. In California, we live under a shelter-in-place order that has emptied the streets, reduced restaurants to take-out service only, closed all entertainments and public gatherings, and supposedly limits travel for non-essential workers to visiting the grocery store and pharmacy. Many other states have followed suit in this country. This has disrupted the local economy for goods and services, crimped the national economy for travel and tourism, and forced every business and organization to reevaluate its most basic assumptions and activities. The result has been people trying to stock up like doomsday survivalists and emptying grocery store shelves—including toilet paper, which seems to be the first priority for everybody.

As a personal experience, my local Trader Joe’s, where I went to do my weekly shopping on Monday, has instituted entry controls, attempting to limit store occupancy to one hundred customers at a time. A clerk at the entrance monitors the queue outside and only lets people enter when a clerk at the exit signals someone has left the store. The queue path along the sidewalk is marked off with chalk at six-foot increments for “social distancing,” and we all advance by two giant steps each time someone up front enters. Inside the store, people are orderly and even pleasant—but at a distance. The number of Trader Joe’s personnel now almost equals the number of customers, and the shelves are reasonably stocked. At the register, a sign limits purchases to just two of any one item to prevent hoarding—although the checker let me get by with my weekly supply of six apples, six yogurts, and four liters of flavored mineral water. It wasn’t a bad experience, but it was sobering: we all seem to be taking these restrictions on our movements very calmly.

Health officials would like to see this personal lockdown extended for two, or eight, or perhaps eighteen months in order to “flatten the curve” of the virus’s exponential spread and keep the infected population from exploding as it apparently has done in China, Italy, and Iran. If these experts are right about the need for extending the restrictions, then local economies will crash. Small businesses, many large businesses, and whole industries like hotels, travel, and entertainment will go bankrupt or disappear. Unemployment will reach Depression-era levels, if not greater. China locked down its entire economy—or so it’s said—and dropped its gross domestic product by thirty percent—or so it’s said.

Because of uncertainty about all of this—fears of massive infection rates and millions of dead, the looming prospect of a cratered economy and worldwide depression—the stock market lost a third of its value in two weeks, ending the longest bull market in a sudden and dizzying bear market. The bond markets also crashed. The price of oil collapsed—although this had help from a price feud between the Saudis and the Russians. Gold prices spiked and then relapsed. There has been no safe place to invest in all of this turmoil.

The point of my bringing up these events in such detail is that we may have reached the limits of human accountability in a world still driven by natural forces. Whether the novel coronavirus—that is, this unknown version of a known type of virus—is the unfortunate meeting of a bat and a pangolin in a Chinese wet market, or the intentional creation of a weapon in a biosafety Level 4 lab, it still spreads by the vulnerabilities of the human immune system, the vagaries of human touch, and the viability of its own protein coat. Airline travel—which is virtually instantaneous these days, compared to horseback and sailing ship—allows the virus to move farther and faster before it touches down in a population and blooms with disease, and there it spreads in ways that are still hard to stop.

Today we all live with awareness of our scientific, medical, and technical capabilities, and so with a consciousness of moral and civilizational superiority, compared to earlier times and less-developed places. Our past success with vaccines in treating viral diseases like polio and measles makes us believe that we should be able to quickly and easily prevent and treat this disease. We become impatient with diagnostic and pharmaceutical companies who can’t produce a rapid test or a vaccine within a matter of weeks.

We are capable of wielding such enormous economic power and organizational resources that we tend to believe we are immune to natural disaster. And so when hurricanes and earthquakes strike, or a virus comes into the population, we blame the response of the Federal Emergency Management Agency, the Red Cross, the National Guard, and federal, state, and local governments as being inadequate to the task. Someone must be at fault for this.

We look at previous civilizations and historic events like the Spanish Flu, the Black Death, the eruption of Vesuvius, or the storms that swept the Armada’s galleons off course, and believe we are superior. Because we understand the nature of viruses and bacteria and their role in disease, or the nature of plate tectonics and its role in earthquakes and volcanoes, or the weather patterns that create typhoons and hurricanes, we think we should be able to prevent, treat, and immediately recover from their effects. And if we do not, we blame the experts, the government, the organizational structures that have been built to protect us. Someone should be held accountable.

The fact is, we are still relatively helpless. Humans are not the masters of this world, only its dominant tenants. We are still subject to the unpredictable movements of its lithosphere, its atmosphere, and the other inhabitants of its diverse biome, including the tiniest specks of DNA and RNA wrapped in a layer of reactive proteins.

No one gets the blame. Everyone is doing their best. And we all die eventually.

Sunday, March 15, 2020

Harry Potter’s Broom

Nimbus 2000


I enjoy many stories, novels, and movies based on magic and magicians—the kind where magic is a real force, not a stage performance. But I have always resisted writing about magic as if it were real and not, in Arthur C. Clarke’s words, a “sufficiently advanced technology.”1

The problem, as I see it, is that I have too practical and inquiring a mind. Being the son of a mechanical engineer, grandson of a civil engineer, having worked all my life with engineers and scientists, and being good at asking questions and keeping my ears and mind open, I have a feel for the way things work in the real world. Which means I can just about smell a technical problem without having to take measurements.

So … Harry Potter’s broom raises an interesting question. In the Wizarding World, is it the broom that flies, and the person simply steers or wills it to fly in a certain direction at a certain altitude? Or is it the person that flies, and the broom is simply an adjunct, a supplement to his or her powers, perhaps functioning as some kind of talisman?

The reason I ask is one of balance. A person perched on top of a broom has his or her center of mass positioned above the shaft of the broomstick.2 In that condition—as I know from personal experience with the inertial dynamics and all the postures and gestures involved in riding a motorcycle—your balance would be severely compromised. Like a ship whose center of gravity and center of buoyancy become misaligned, the whole rig will tend to turn over.

So why doesn’t a witch or wizard riding a broomstick—either in the Harry Potter world or in the traditional Salem and Halloween sense—with only her or his legs hanging below the shaft, and the rest of the body’s mass above it, turn over? Why don’t we see these people flying upside down and hanging onto the broom for dear life?

The question is pertinent because I don’t think that—to the extent authors who deal in magic and flying broomsticks are actually thinking this matter through—the person is flying and only using the broom as a talisman. We’ve seen comic scenes, particularly during Quidditch games, where a player is knocked off his or her broom and must hang on, two-handed and legs flailing, underneath the floating broom while he or she tries to climb back aboard. Clearly, the broom and not the human is doing the actual lifting and flying.

So why isn’t the rider flying upside down? Does the broom have a preferred side or orientation? Do the laws of physics cease to operate in the vicinity of the broomstick?3 Or does it have something to do with the positioning of the rider’s hands and legs and the strength of their grip on the shaft?

It’s all a mystery, as magic should be. Still, inquiring minds want to know.

1. The whole quote is “Any sufficiently advanced technology is indistinguishable from magic.” And that is the basis of much good science fiction.

2. Don’t be fooled by the wire stirrups in the picture, as if they anchored the rider in any preferred position. Mass is mass and finds its own center of gravity. Just ask any horseback rider who, with or without stirrups, experiences a broken saddle girth.

3. Well, of course!

Sunday, March 1, 2020

A Material World


The Buckminsterfullerene

In the movie Star Trek IV: The Voyage Home, Scott and McCoy try to find a light and strong material with which to build a giant seawater tank in the hold of their stolen Klingon ship. They locate a manufacturer of plexiglass in 20th-century San Francisco and offer him the formula for “transparent aluminum,” a material from the 23rd century. They assuage their consciences about temporal paradoxes by suggesting, “How do we know he didn’t invent it?”

Well, he didn’t. The crystals in many of today’s quality watches, including my upgraded Apple Watch, are made from synthetic sapphire. Since the composition of sapphire is corundum, or crystalline aluminum oxide (Al2O3)—the same material from which, in powder form, metallic aluminum is smelted—along with traces of iron, titanium, chromium, vanadium, or magnesium depending on the gem’s color,1 you could easily say that this crystal, which is durable, lightweight, strong, and scratch-resistant, is indeed “transparent aluminum.”

Synthetic rubies and then sapphires were invented in 1902 by the French chemist Auguste Verneuil. He deposited the requisite chemicals in the requisite combinations on a ceramic base by passing them through a hydrogen-oxygen flame, then increasing the temperature to the point of melting and crystallizing the alumina. So far, we can make watch crystals and synthetic gemstones with this process. Whether it is scalable for fabricating whole spaceships is another question. But the technology is young yet.

If you are a dedicated browser among the pages of Science and Nature, as I am, with forays into Scientific American and Popular Science, you know that the world of materials science is hot right now. And the element carbon is getting a resurgence—but not as a fuel.

Carbon has the happy ability to bond with many different atoms including, sometimes, itself. Its four covalent bonding points allow it to share single, double, and even triple bonds with other carbon atoms, often forming chains and hexagonal rings that are the building blocks of organic chemistry and so the basis of all life on this planet. These rings and chains leave room for adding other atoms and whole other molecules, making carbon the Swiss Army knife of the chemical world.

What modern materials scientists have discovered is that bonding among carbon atoms can be induced in several structural forms. We are all familiar with the three-dimensional, tetrahedrally bonded crystal of a diamond, whose bonds are so strong that they make it one of the hardest materials known. But those atoms can also be knit into fibers, which are then stabilized and supported in an epoxy resin to create a material that is light, strong, and useful in many applications, sometimes replacing steel. The carbon atoms can likewise form two-dimensional, hexagonal lattices laid out in endless sheets, called graphene, which are strong and supple even at one molecule’s thickness.2 Or smaller sections of those sheets can be bent into nano-scale tubules, which are even stronger than the carbon fibers and have interesting chemical uses. And finally, the carbon atoms can be joined into microscopic soccer ball–like molecules, made of twenty hexagons and twelve pentagons with the formula C60 (pictured). This is the buckminsterfullerene—named after the architect Buckminster Fuller, who invented the geodesic dome, a spherical structure of similar configuration.
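As a quick check of that geometry, a few lines of arithmetic (using Euler’s polyhedron formula) confirm that twelve pentagons and twenty hexagons do yield exactly sixty carbon atoms:

```python
# Each carbon atom sits at a vertex where three faces meet,
# and each bond (edge) is shared by exactly two faces.
pentagons, hexagons = 12, 20
faces = pentagons + hexagons
vertices = (5 * pentagons + 6 * hexagons) // 3   # 3 faces per atom
edges = (5 * pentagons + 6 * hexagons) // 2      # 2 faces per bond

print(vertices)                      # 60 -- hence the formula C60
print(vertices - edges + faces)      # 2 -- Euler's formula: V - E + F = 2
```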

Graphene is not only strong but it is also electrically and thermally conductive, useful for dissipating heat. It has a high surface-to-volume ratio, which means it can be used to make batteries and fuel cells more efficient. It holds promise for flexible display screens and solar photovoltaic cells. And as an additive to paint and in other surface preparations, it can increase resistance to wear and corrosion.

Carbon nanotubes, which are also electrically conductive, can be used in radio antennas and as the brushes in electric motors. Being biodegradable, they can be used in tissue engineering for bone, cartilage, and muscle. Because they are easily absorbed into cells, they can carry other molecules such as medicines as well as protein and DNA therapies. Spun into yarn, the tubes would offer superior strength and wear in clothing, sports gear, combat armor, and even in cables for bridges and for space elevators—imaginative projects that have been proposed for hauling people and cargo up to geosynchronous orbit.

Buckyballs have potential uses as a drug delivery system, as lubricants that will resist breaking down under wear and heat, and as catalysts in chemical reactions. As a medicine in itself, the C60 fullerene can be used as an antioxidant, because it reacts with free radicals.

And that is just some of the potential for various pure forms of carbon.

Work on the genetics of plants and animals other than humans will have far-reaching effects, too, in terms of biomimetic materials. For example, spiders produce a raw silk that they spin into a strand with a tensile strength greater than steel’s and more fracture resistance than the aramid fibers used in Kevlar body armor.3 We could farm spiders for this silk, the way we do silkworms for their cocoon fibers, except that spiders in captivity will eat each other. But several companies are now working on creating synthetic spider silk.

Another area ripe for development is natural latex, the basis of all our rubber products. Rubber trees are native to South America, where they naturally grow in splendid isolation because a fungus-based leaf blight destroys any trees that grow too close together. Attempts by Ford to create a rubber plantation in Brazil in the late 1920s failed because of this blight. Virtually all of the world’s natural rubber currently comes from trees grown on plantations in Southeast Asia, where they survive only with the strictest vigilance—cutting and burning whole plantations at the first sign of blight—and government control of imported plants and vegetables.

Natural rubber is essential to modern life. Synthetics based on petroleum chemistry, like styrene-butadiene, are less resilient and elastic. A natural rubber tire can thaw from being frozen in the wheel well of an airliner at 35,000 feet in the time it takes for the plane to descend and land, while a synthetic-based tire will remain frozen and shatter upon impact. So discovering a genetic formula for latex and being able to extrude it in the same way the rubber tree weeps its sap would be a godsend.

One of the unsung stories of our modern life is the nature of our materials. They are not just getting cheaper but also lighter, stronger, and better. And this is only the beginning.

1. Just about every color but red. And a red crystal of virtually the same composition is called a ruby.
    Emeralds are a different material, however, based on beryl, which is composed of beryllium aluminum silicate (Be3Al2Si6O18) in hexagonal crystals with traces of chromium and vanadium.

2. The graphite in pencil “leads” is not chemically lead but a pure form of carbon. Small bits of what we now call graphene are layered into a three-dimensional composite, like the layers in sandstone or shale.

3. Interestingly, spiders that are fed a diet of carbon nanotubes make a silk that is even stronger, incorporating the tubules into its protein microfibers.

Sunday, February 23, 2020

Ancient Computers

Perpetual calendar

About twenty years ago, I used to keep on my desk—partly as ornament, partly paperweight, and somewhat as a useful device when I was writing fiction about the near future—a perpetual calendar like the one pictured here. It was a simple device: You align the month on the inner dial with the intended year on the outer dial, then read off the dates for the days of the week in the window at bottom. It took a minute to set, being mindful of leap years, and gave accurate readings over a span of fifty years.
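The arithmetic such a dial mechanizes can be written out directly. This is Zeller’s congruence, a standard day-of-the-week formula—a sketch of the underlying calculation, not of how any particular dial is geared:

```python
def day_of_week(year, month, day):
    """Zeller's congruence for the Gregorian calendar.
    Returns 0 = Saturday, 1 = Sunday, ..., 6 = Friday."""
    if month < 3:                 # January and February count as
        month += 12               # months 13 and 14 of the prior year
        year -= 1
    k, j = year % 100, year // 100
    return (day + (13 * (month + 1)) // 5 + k + k // 4 + j // 4 + 5 * j) % 7

print(day_of_week(2020, 3, 1))    # 1, i.e. a Sunday
```

The leap-year bookkeeping the dial’s owner must mind by hand is what the `month < 3` shuffle handles here.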

This was a form of computer with no electronics and only one moving part. It was the sort of thing we all used to consult for specific information before we could carry a computer the size of a deck of cards in our pockets—and now wear one the size of a matchbox on our wrists.

I am of two minds about this, because I have always loved small devices with screens and keyboards. That love goes back to the first toy that my father made for me when I was about three years old. It was a box with a series of toggle switches and a line of small lights with red, green, yellow, and blue lenses. When you threw the switches, the lights would come on in different orders. It did nothing useful, except fascinate a small child. But it fixed my mind in a pattern that endures to this day.

Ever since the dawn of the Microprocessor Age, I have been chasing the ultimate handheld computer. It started with the first “personal digital assistants,” or PDAs, usually made in Japan and with crippled keyboards that required you to hold down three not-all-that-obvious keys to get a capital letter or a punctuation mark. Being a book editor and a stickler for form, I laboriously worked to get the right spelling and punctuation in my entries, so using the thing productively took forever. I then adopted, in rapid succession, a Palm Pilot—where you spelled everything out with a stylus or your fingertip—and then a variety of Hewlett-Packard calculators and tiny computers, chasing that holy grail.

My first cell phone had the traditional arrangement of rotary-dial numbers as a limited form of keyboard. That is, the digits 2 to 9 were each accompanied by three letters in sequence from the alphabet.1 You could store people’s names and numbers in the phone’s memory by “typing” them in using the keypad: Press 2 once for A, twice for B, three times for C, and wait a bit for the phone to sort out the right code and show your desired letter. And, of course, there were no lower-case letters or punctuation. It was really easier to keep your phone list separately, in a booklet or on a piece of paper, except that then your friends wouldn’t be on speed dial. But I digress …
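As a sketch of that multi-tap scheme (assuming the standard keypad letter layout, with no punctuation or lower case, just as described), the encoding looks like this:

```python
# Standard phone-keypad letter assignments: 7 and 9 carry four letters each.
KEYPAD = {"2": "ABC", "3": "DEF", "4": "GHI", "5": "JKL",
          "6": "MNO", "7": "PQRS", "8": "TUV", "9": "WXYZ"}

def multitap(name):
    """Encode a name as key presses: A -> '2', B -> '22', C -> '222'."""
    presses = []
    for letter in name.upper():
        for key, letters in KEYPAD.items():
            if letter in letters:
                presses.append(key * (letters.index(letter) + 1))
    return " ".join(presses)   # the space stands in for the pause between letters

print(multitap("Bob"))   # 22 666 22
```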

Before we had computers at our fingertips, we had all sorts of handy ways to work out useful information.

The oldest is probably the Antikythera mechanism, a device of brass with geared wheels, now encrusted with corrosion and coral, discovered in a shipwreck off the Greek island of Antikythera in 1901. It has since been dated to about 200 BC, and x-rays of the gears and a reconstruction of their turning suggest that the mechanism was used to calculate astronomical positions and possibly to predict solar eclipses. The corollary would be the modern mechanical orrery, which dates back to the late medieval period and shows the positions of the sun and planets at any particular point in their continuously revolving orbits.

But mechanical representations of physical conditions are not the only form of ancient computer.

When I was compiling engineering resumes at the construction company, I came across a man whose work responsibilities included compiling “nomographs.” At first, I thought this was a typo and that he must actually be writing monographs—a literary pursuit, but an odd one for an engineer. Further checking revealed that, no, he really did make nomographs, also called nomograms. These are two-dimensional diagrams representing a range of variables associated with a mathematical function, usually shown as number sets along two or three parallel lines. Rather than solve the function mathematically, all an inquiring engineer had to do was draw a straight line at the correct angle across the parallel scales and read off the answer.

As a form of computer, the nomograph is just a little more complex than a table of common logarithms2 or a telephone book—closer to a database than a calculation. And if we’re going to call a computer any device that gives you accurate astronomical readings, then a sailor’s sextant for “shooting the sun” at noon to determine latitude—and before that the astrolabe for calculating the angle between the horizon and the North Star—are also in the running as early “computers.”
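The sextant’s noon sight, for instance, reduces to very simple arithmetic once the instrument has done its work—this is the simplified textbook relation, ignoring the corrections (refraction, dip, the sun’s semidiameter) a real navigator applies, and the numbers below are purely illustrative:

```python
def latitude_from_noon_sight(altitude_deg, declination_deg):
    """Latitude from the sun's altitude at local noon, for an
    observer north of the sun: latitude = 90 - altitude + declination."""
    return 90.0 - altitude_deg + declination_deg

# At an equinox (solar declination ~ 0), a noon sun 52 degrees
# above the horizon puts the observer near 38 degrees north:
print(latitude_from_noon_sight(52.0, 0.0))   # 38.0
```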

But the point of this meditation is not to show how clever ancient peoples were but how much we are losing in the digital age. Orreries and sextants are now mechanical curiosities and decorative artifacts—the one lost to telescopes and observational satellites tied into much more sophisticated computer modeling, the other lost to satellite-based global positioning systems (GPS). Nobody writes nomographs anymore. The phone company doesn’t even publish the Yellow Pages anymore, or not on paper. Everything is online. And I can answer texts from friends by drawing letters with my fingertip—in both upper and lower case, with punctuation—on the crystal face of my Apple Watch, which doesn’t even need a keyboard.

The knowledge of the entire world along with real-time information, like GPS positioning and footstep counting with conversion to calories, is in our pockets and on our wrists. And that’s a wonderful thing.

But when the batteries die—or when future archeologists dig my Apple Watch out of a shipwreck, corroded with salt and perfectly nonfunctional—we will be left with lumps of silicon and dozens of questions. Who will draw the nomographs then?

1. Except for the 7, which picked up P, Q, R, and S, and the 9, which had W, X, Y, and Z. Presumably Q, X, and Z weren’t expected to get much use.

2. And a little less complex than a slide rule.

Sunday, February 16, 2020

Flying Cars


The Taylor Aerocar from the 1950s

So it’s now 2020 and the refrain I hear from all sides—including once on these pages—is, “Where’s my flying car?” We were promised in the tabloids and the Sunday supplements that our cars would fly by now. So where are they?

But let’s think about this a bit. First, what do you mean by “car”? Second, what do you mean by “fly”?

If a car is a vehicle that takes a driver and a number of passengers and their personal luggage on a flying trip of several hundred miles, then we had such a vehicle in my childhood. The Taylor Aerocar (pictured nearby) was available in 1954. Perhaps this vehicle fell short of the “flying car” definition because it didn’t just take off from the street. The owner towed the two wings and tail section with its pusher propeller behind the vehicle on the road and then assembled the flight and control surfaces after arriving at the airport.

For a complete flying vehicle, we’ve long had small airplanes like the Cessna 172 Skyhawk, which can carry four people and their baggage about 736 miles at a top speed of 143 miles per hour. That’s a convenient flying distance and time, but the plane has to start and stop the trip at an airport or prepared landing strip.1 Also, the pilot needs a course of special instruction, must be licensed, and has to file a flight plan before each trip.

Both vehicles can actually fly, but neither can park in your driveway, roll into the street, and take off from there without FAA authorization and clearance.

As to the question of flight, a hovercraft would technically qualify as “flying,” although it seldom gets more than a few inches to a foot off the ground surface. So while you can travel with this vehicle across country and even over smooth water, the dream of taking your flying car up into the air, well above traffic, and over the rooftops cannot be satisfied with a hovercraft.

No, when we think of a “flying car,” we mean the sort of compact, wingless vehicle we all saw in movies like Blade Runner or The Fifth Element. There the cars occasionally might touch down and roll along the ground, but they mostly lift into the air and fly over and between buildings. We want our everyday parkable sedan but, you know … flying.

I have seen several designs and claimed pre-production models of such vehicles. Most use some assortment of ducted fans to generate lift and then, once aloft, forward motion. The ubiquitous aerial drone2 is a model for this sort of propulsion, using computers to control its four to six rotors for stability and direction. Having a computer keep a flying car in level flight would go a long way toward removing one of the barriers to this concept: the requirement that the driver hold the vehicle in level flight and control it through all maneuvers and all conditions of wind and turbulence, the way the pilot of a fixed-wing aircraft must constantly monitor the flight envelope. An extension of this computer control would allow the proposed flying car to maintain altitude separation, avoid collisions, and make protected takeoffs and soft, on-target landings. Indeed, the pilot/driver would only have to pick a destination and route, then sit back and become an interested observer of the passing countryside.

But the block to these cars becoming practical has more to do with energy than aerodynamics. It takes more energy to lift a body and maintain it aloft with a directed airstream like a ducted fan than to propel it forward through the air using an airfoil or wing to provide the passive lift. Even a helicopter provides its lift with an airfoil: those large rotor blades, which are so unwieldy in a parking lot. But small-diameter fans of the sort lifting any flying car we can envision will provide much less lift and so require more power.
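Momentum theory makes the point concrete: the ideal power to hover scales as one over the square root of the total disk area, so shrinking the lifting disks raises the power bill for the same weight. The vehicle mass and disk sizes below are my own illustrative assumptions, not figures from any actual design:

```python
import math

def ideal_hover_power(thrust_n, disk_area_m2, air_density=1.225):
    """Ideal (momentum-theory) hover power: P = sqrt(T^3 / (2 * rho * A)).
    Real rotors and fans need considerably more than this floor."""
    return math.sqrt(thrust_n ** 3 / (2 * air_density * disk_area_m2))

weight = 1000 * 9.81   # a hypothetical 1,000 kg flying car, in newtons

rotor = ideal_hover_power(weight, math.pi * 4.0 ** 2)     # one 4 m-radius rotor
fans = ideal_hover_power(weight, 4 * math.pi * 0.5 ** 2)  # four 0.5 m-radius fans

print(fans / rotor)   # about 4: same lift, four times the power
```

Sixteen times less disk area means four times the ideal power, before any real-world losses.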

Right now, four small but powerful gas engines driving the fans would consume about four times the fuel of an old-style Aerocar. And they would weigh more than the car’s performance parameters would probably allow.3 Electric motors would be far more efficient and infinitely more controllable, as well as quieter, but the battery weight and performance would again be outside the vehicle’s desired parameters. A flying car powered by internal combustion or electricity might have enough fuel capacity or battery charge to take you to the grocery store and back at low altitude, but it wouldn’t be much more than a rich man’s toy, like the very first automobiles: more trouble than they’re worth and bought only for the excitement and display value. Such a car would not be able to take you to the mountains or to Las Vegas and back for the weekend.

This is not to say that flying cars are impossible dreams, or that their development and practical use is more than a century in the future, if they are possible at all. But like so many other technologies we see in the movies, they wait upon developments in basic physics and in energy production, storage, and release that are still years away—even though some of our best academic minds, backyard inventors, and dreamers are working on the problem all the time.

In the meantime, as the link above shows, somebody’s got a working Aerocar for sale. All you need is a trailer.

1. And we won’t get into the questions of initial cost, regular service and maintenance, special storage requirements, and inspection intervals—all of which are much more intensive than for a regular family vehicle that would qualify as a “car.”

2. The lightweight kind with four small propellers and a camera on board, not an unmanned aerial vehicle (UAV) like the MQ-9 Reaper, which observes our enemies remotely and then rains missiles down on their heads.

3. Not to mention quadrupling service costs and time.

Sunday, January 26, 2020

Living With One Hand

Hand in cast

About a month ago, on Christmas Eve to be exact, I stumbled and fell on the sidewalk, landing on the side of my face and across the back of my left hand. Among other bangs and bruises, which have since healed, I broke three bones in that hand and wrist. Between the splint put on in the emergency room and the cast fashioned by the hand specialist—luckily, there was no need for corrective surgery—I now have limited use of those four fingers and the barest pinch between thumb and forefinger. Luckily also, I am right-handed. According to the hand doctor, the bones are not shifting and they are knitting well. But still, I may be another month or so in the cast and then a removable splint.

I try to treat everything as a learning experience. So what have I learned from this, other than to be more careful and watch my step? Well, for one thing, I have more understanding for those who, by accident or illness, have lost the use of their fingers or their whole hand. Here are the limitations—temporary limitations, I remind myself—so far encountered.

Typing: I am by training—self-taught from a typewriter manual one winter break during high school—and by nature a two-hand, ten-finger touch typist. No hunt-and-peck for me. So much so that being able to sit down and share my brain through my fingers is now part of my writing process.1 I can still type by spider-walking my right hand across the keyboard, using my left thumb or forefinger to hold down the occasional shift or control key. The result is barely faster than keying into the search function on your DVD player using the remote’s arrow pad. And I still hit the wrong key, or a combination of right and wrong keys, ten percent of the time. So writing is hard.2

Pockets: For fifty-odd years the distribution of daily necessities in my pants pockets has been: knife, comb (when I carried one), and handkerchief in right front; keys, coins, and sundry tokens in left front; wallet in left rear; nothing, or a washcloth for use as an emergency towel, in right rear. With my left hand encumbered by the cast, this time-honored practice goes out the window, and everything, stripped down to minimum, goes on the right side. I’m still fumbling around with this.

Child-proof caps: These devices of the devil, approved for medications taken by sick and feeble people, are impossible to work one-handed, even with the support of a few incompetent and weak fingers and a thumb on the other hand. Once I finally get a cap off, I leave it sitting lopsided on top of the bottle until I need to take another pill.

Zip-lock bags: They’re ingenious—until you realize they’re designed for two working hands with full grip strength. No combination of two fingers on one side, two on the other, can break the seal. The only way I can get them open is to pin one side against the countertop with my left thumb and grip and pull up the other side with my right hand. The contents only occasionally spill out all over the counter.

Buttons: Another ingenious device that really takes two hands. I can use my left thumb working against my cast to hold the buttonhole steady, if not partway open, then use all the fingers and thumb of my right hand to work the bottom edge of the button through the hole and try to anchor the top edge before the bottom slips out again. I can work the big buttons on a sweater, but I give up on all the tiny buttons on a dress shirt. So I wear mostly polo shirts these days. And sleeve buttons, right or left—forget it!

Jackets: Any garment, really, that does not have an expandable sleeve opening, like a winter coat with a knit cuff, blocks passage and traps the bulky cast. I’m too cold to wear my coat draped over the left shoulder, like a Hussar’s pelisse, and too frugal to cut open the seams of the left sleeve on a good coat for a temporary situation like this.

Knots: Unless I can use my teeth, knots are impossible. Shoelaces are hopeless. So I wear mostly slip-on sandals. And socks are hopeless, too. I can do a sloppy granny knot with something as big and thick as a bathrobe belt, but it doesn’t hold for long.

Zippers: I can work a zipper if it’s anchored at one end, like the fly in men’s trousers. But the open-ended version, like on a jacket, is impossible. I just can’t get the pin on one side inserted into the box on the other, manage to hold them both steady, and still pull up on the slider tab. So my jacket flies loose in the wind.

Manual transmission: Yeah, I’m driving a stick again, now that I’m not commuting 35 miles each way every day through heavy traffic. I traveled by Uber for the first couple of days after the fall, when my whole arm was in a splint. My cast now lets me take a light grip on the steering wheel while shifting; so I can still drive—but carefully. And sharp turns require a big over-and-under maneuver with my right hand; so I must plan ahead to avoid shifting while turning. The clutch lever on the motorcycle is hopeless, however, and I have no strength to hold onto the left grip, much less steer; so the bike is on a battery tender for a couple of months. Anyway, I can’t get my left arm into my leather riding jacket.

Personal grooming: Washing my face one-handed feels weird and incomplete. I can shower using a big rubber-and-vinyl glove that slips on over the cast, but then the cleaning process is mostly feeling around one-handed with the soap or washcloth, and my right side gets much less thorough attention than my left. Shaving and toothbrushing are no problem with modern electronic devices, but putting deodorant on my right armpit using only my right hand is an exercise in gymnastics. Luckily, I’m still flexible enough.

Heavy lifting: The hand specialist warned against this. Any use of my left thumb or fingers other than to steady a load in my right hand is perilous. Anything that tugs or pulls on those fingers—like holding the dog’s leash while I lock or unlock the door—risks moving the bones that are trying to knit, giving rise to an ominous ache. I still use the left hand in a limited fashion, but carefully and with occasional pain.

Living with one hand and a few weak fingers, I find that everything is harder and takes longer, and if I put any pressure on the broken bones or the wrist, it hurts. So life goes on—but at a much slower pace.3

1. I’ve tried the Dragon Naturally Speaking dictation software through a number of its upgrades. Ninety-five percent accurate is still one goofball error every twenty words, and the voice-correction protocol—required so that the software can learn from its mistakes—is painfully slow and irritating. Also, my mind just does not compose through the spoken word.

2. I once bought an ingenious little device called a Frogpad, a one-handed keyboard, but never learned to use it comfortably. The company now seems to be out of business. Pity—I could use it this time.

3. Given how difficult it is to type, and that the last few blogs were already written and in the can before the accident, this may be the last posting I can make for another month or so. I have to take these things one at a time.

Sunday, January 19, 2020

The Writer’s Job

Midnight writer

I have been pursuing this profession—writer and sometimes editor—for almost sixty years now. I first got the inkling1 when I was about twelve years old and attempted my first novel.

That was a fragment, not even a complete story, about a steam locomotive and passenger cars in the Old West that pull into the station with everyone on board dead. It was a fine setup for a mystery, except that I didn’t understand at the time that first you have to know what happened, then you wind it back to what the reader—and presumably the detective, of which I had none—first learns. So I had a setup with no premise.2 But it was a start. I wrote out what I had on an old Royal portable typewriter that was in the family, created a cardboard-and-crayon cover, and stitched it together with yarn. It was a rude start, but I was on my way.

What drew me to writing, when I knew nothing about it, was that a writer—specifically a fiction writer, specifically a novelist—could apparently work for himself from home, rather than for somebody else in an office, and could count his time as his own, rather than keeping to somebody else’s schedule. Well, it was a dream. But it fit my bookish and solitary nature. Besides, it was clean, literary, intellectual work, and you didn’t have to hurt anybody.3

My second novel, written at age sixteen, was a much grander affair: with a first draft in fountain pen on white, lined tablets; second draft typed double spaced, with margins, two copies using carbon paper on my grandfather’s upright Underwood, just like the publishers wanted; and running 472 pages, or about 60,000 words, all told. It was a dreadful space opera about a renegade university professor and rebel leader against an interstellar empire, with a romantic subplot. It had a beginning, middle, and ending—and I knew even as I finished it that the damn thing was unpublishable.4 But the effort was what counted, and it got me fixed on my present course.

My novel career paused when I went to college and studied English literature. I had no ideas for another book—having been emotionally drained by the space opera—and was too busy anyway with my studies and the mountains of reading they required. But that reading gave me perspective on literature and the language. And all along I had thought that, when I graduated, I would immediately write another novel and make my name with my first published book. I had dreamed that I would support myself with fiction writing.

But about three months before graduation I took mental stock and realized I still had no usable ideas, nothing to say. This is not surprising, because few people in their early twenties have much to say to the adult world—which was my preferred venue—and the market for Young Adult literature is limited. So I was suddenly faced with the realization that I needed a “day job” to support my imminent debut into the real world.5 And what I was best qualified for was work as an editor. Through the graces of one of my professors, I got a junior position at the Pennsylvania State University Press. It was eight hours a day on my butt with a blue pencil, correcting spelling and grammar, untangling sentence structure, and marking copy for typesetting, all according to the Chicago Manual. But I loved it. After the university press, I went to a trade-book publisher—where I learned about that railroad tragedy, and much else about the West and my newly adopted state of California—and from there to technical editing of engineering and construction reports and proposals.

My third unpublishable novel came about in my late twenties, while I was working for the engineering company. Based on the time-honored mantra of “write what you know,”6 I tried to write a business novel based on the scramble of a second-tier construction company to answer a request for proposal from a major client for a mega-million-dollar mining development in a Third World country.7 That book progressed as far as a rough first draft, although I never sought a publisher.

In the meantime, I went from engineering and construction, to a public utility providing electricity and gas, to an international pharmaceutical company, to a pioneering maker of genetic analysis instruments, with stop-offs working as a temp in administration at two local oil refineries. In each case, I worked first as a technical writer—learning the secrets of the company’s respective industry—and then moved into internal communications—explaining the company’s business to its employees. And in every case, I was building myself an understanding of and intimacy with the business world and its technological basis, an understanding that I have been mining ever since as background for most of my novels.

So … what is the writer’s job in all of this? It is the same, pretty much, whether the task is editing another writer’s work, creating and editing technical documents, writing newsletter articles and press releases, or writing a full-blown novel, whether historical fiction or far-future adventure.

First—and especially in fiction—take the reader somewhere new, show the reader the unique side of an everyday life situation, or of a product or technology, something that he or she has never considered before. There are two ways to approach a story: you can come at the topic head-on and flat out, or from an oblique angle and on the bounce. Think of the latter as putting “English” on the ball, making it spin. That slightly-to-one-side approach puts the reader’s natural defenses off guard and simultaneously raises his or her curiosity. This also works for a new product description or policy analysis—although not in a tightly prescribed document format like a pharmaceutical batch record.

Second—especially in technical writing, communications, and non-fiction editing—make the obscure explicit and the confusing understandable. It is an article of faith with me that nothing is so complex that it cannot be made intelligible to a bright ten-year-old child. But you have to use simple words, work in everyday analogies, and take some extra steps and make some supporting linkages in your reasoning. And you have to use that bounce thing described above to make the reader care in the first place.

Third—and this applies to all writing and editing—be rigorous in your logic and sequence, and honest in your evidence and conclusions. You are invading the reader’s mind, and this is hallowed ground. You can play with the reader’s perceptions and trick his or her understanding in the same way that a magician’s sleight of hand arouses an audience’s awe and wonder. But you can’t lie to the reader or offend his or her senses of fairness, right and wrong, or proportion. And you can never disrespect the reader. For you are playing in another person’s sandbox and, if you offend, will be asked to go home with a slamming of the book.

Fourth—and this applies to almost all types of writing, except perhaps for instruction manuals—paint pictures and tell stories. The human mind is not exactly immune to bare facts, but we have a hard time understanding them and fixing them in memory without a context. This is why storytelling and visual representation have been so powerful in almost all human cultures. This is why religious groups and political parties create a “narrative” to support their core truths. Your job is to create in the reader’s mind a structure made of words, mental images, and associations that carries your message.

To be a writer is to be, effectively, inside somebody’s head by invitation. Play nice. And have fun.

1. What a writerly word! You can almost smell the printer’s ink and hear the presses hum.

2. Curiously, I was foreshadowing one of the tragedies of early railroading, when the trains of the Central Pacific Railroad had to navigate miles of mountain tunnels and snow sheds in the Sierra Nevada, and the accumulated coal smoke asphyxiated their engine crews. From this was born the generation of oil-fired, cab-forward steam engine designs, which worked that route for years.

3. Except, of course, your characters—which I also didn’t understand at the time. But they are only made of smoke and dreams.

4. Anytime you hear about anyone writing a brilliant first novel, count on it being their second or third completed manuscript. Even Harper Lee had to go through this process.

5. Back in my teens, when I was working on the space opera, I wrote to one of my favorite authors, Ray Bradbury, asking if he would read my novel. He politely declined, which was a blessing. But he then suggested that, to become a fiction writer, I should not go to college and instead get a menial job as a dishwasher and just write, write, write, and submit everything to publishers. That course for me, as a teenager, would have been a disaster. Given my subsequent history of real-world, practical experience, I don’t think it would have worked out any better for me as a college graduate.

6. Which is a trap. The command should be “write what excites you, that you know a little something about, but you want to know much, much more.” If your current life is dull—that future as a dishwasher toward which I had been urged—it shouldn’t limit your scope and imagination. And these days, with all of the online resources available, research is easy, right down to getting street views of any city and most towns around the globe.

7. Well, not every idea is a good one. However, some of that story—and so much more—found its way into Medea’s Daughter forty-odd years later.

Sunday, January 12, 2020

Memory and Imagination

One hand

As I think through the way I go about constructing the elements of a story line for a novel and then determining the next actions, images, details, and anecdotes that will support the plot, I have made an interesting discovery: these things pop into my mind—presumably from the subconscious1—in the same way that random memories come to mind.

This sort of remembering is different from sitting down, at least mentally, and asking myself what I can recall from a past experience: my graduation day, my wedding day, or any memorable event that is supposed to “stick in the mind.” No, this is the sort of sense, image, or snatch of conversation that surprises me when I’m doing something completely different: a flash of the roadway where I was driving on a trip three years ago, or the image of a house I once visited, or a fragment of what someone once said to me. These things come “unbidden” and when they are least expected.

The difference with the ideas, images, and dialog fragments that come to me for my work in progress is, first, these imagined things did not happen, have never happened, and usually have no relationship to—are not an echo of—anything that has ever happened to me. Second, and unlike the act of forced recall, I find myself unable to sit down, mentally, and ask myself about what should happen after the starting point of a story, or what image or sound or sensory perception would best illustrate the next action scene, or what the characters would say in the situation. I can pose to myself the question of what should happen next, but then I have to go off and take a walk or bike ride, or get my morning shower (warm water hitting my right shoulder and the side of my neck seems to be especially conducive to creative thoughts), and otherwise let my subconscious mull the problem. And then later, out of nowhere, like an unbidden memory, I know what the next action, image, or dialog bit should be.

So it’s apparent that both memory and imagination—at least in my case, and in dealing with a developing story—come from the same place, the subconscious. But the one is a reflection of something that actually happened, while the other is a bit of imagery or dialog with no context other than the mental structure of the novel I’m writing at the time.2

I have noted for many years, going back to my time as a teenager, something I call “the Return.” This is an instance of my hearing a piece of music and then later—two to four days, usually—having it pop into my mind, again unbidden. I will find myself singing or humming the melody or, more often, just “hearing” it in my head. I have also noted that the speed of the Return is linked to my mental and physical health. When I am in good spirits and feeling well, the music will come back in two days—but seldom less. When I am feeling poorly, the return might take four days or not happen at all.

For comparison, the return of a random memory—a place image or a conversation—that occurs while I’m doing something else might take two to a dozen years, while the subconscious production of a scene, action, image, or dialog bit that I have posed to myself for consideration generally takes less than two days. So the two processes—memory and imagination—while alike, are not identical.

Interestingly, none of this has anything to do with dreaming, which apparently comes from another part of the brain. While I have vivid, busy, and sometimes frustrating dreams—but seldom what you would call nightmares—few of them have to do with specific memories, although some are linked to places in my memory, specifically two of the houses we lived in when I was a child. No dream has yet produced anything related to one of my books—not a usable plot point, image, or bit of dialog. However, I occasionally wake up, after dreaming whatever dreams I had, and suddenly get an idea that might help the current book along. Only once in all my experience did I dream of a character in one of my books, and that was my first novel, written as a teenager, and the dream occurred about two years after I finished writing it.

All of this might mean something to a psychologist, or maybe a neurologist, but to me this is just the way my mind works. And the most important conclusion—at least for me—is that I cannot force the construction of a plot or of any particular part of a story. I have no secret list of plot outlines,3 no mechanism for character generation, nor other out-of-the-box novel-writing aids. The story comes from the dark—as in hidden or unilluminated, rather than foreboding or evil—places of my mind.

The story, characters, inciting incidents, and resulting actions have to come from someplace else—outside me, inside but hidden … the gods … the stars—in order for me to believe in them as real people doing real things that have meaning. Yes, I know that they are made up—but I have to forget that part for the story to live inside me. And once I have the outline—that composite of structure and belief—as well as a notion of the associated actions, sense images, and dialog, I can let the magic happen.

Magic? That’s when I sit down at the keyboard, put my mind on its invisible track, forget myself, and let the story flow through me.

1. See Working With the Subconscious from September 30, 2012.

2. Curiously, works that I wrote in the past, stopped writing, and put an end to, do not generate new imagery or dialog. I might suddenly wake up and recall a word, image, or dialog line that I once put on the page and now know to be an error, or lacking, or incompletely linked to something else in the novel—and usually this is still related to the work in progress. But I don’t generate new ideas and directions for the old books. My internal mental process seems to know when what’s done is done.

3. I’ve heard it said that there are only seven plots in all of storytelling. If so, I’d like to know what they are. Whenever I hear this, the examples given are so general (“Well, there’s boy-meets-girl …” or “the hero’s journey …”) as to be practically useless. Once you’ve picked that type of story, then the real work of generating setting, character, incident, and action begins.

Sunday, January 5, 2020


Crystal ball

The first time I had occasion to write the calendar date “2020” on anything was a few weeks ago, in preparing the draft of the January-February edition of our local NAMI chapter’s newsletter. And as I keyed in those digits, a chill went through me. It was as if I knew, in that sleepless but dormant part of my brain that reads the tea leaves and maps them onto the stars, that this year would be significant.

Oh, I know, we’re going to have a watershed election. And given the political emotions that are running in this country, the campaigning, the election day, and the aftermath are going to be traumatic for one group or another. But that was not the reason for my chill. This was more a moment of existential dread.

Twenty-twenty marks the start of something, but I don’t know what. Well, yes, in one way, we are all past the dystopian vision of the year 2019 as given in the movie Blade Runner, and we still don’t have Aztec-style corporate pyramids and towers shooting flames into the murky night sky of Los Angeles. (And my car still doesn’t fly, dammit!) So that’s a plus for the new year. But that’s not what concerned me.

I can remember from thirty-odd years ago hearing a speaker from The Economist describe how the character of each decade doesn’t change or become consistent and recognizable on the year ending in zero, but on the fourth year into the new decade. So the decade of the 1950s didn’t really end, and the 1960s begin, until 1964. By then we’d been through the Cuban revolution and our ill-fated invasion, the first moves in the war in Vietnam, and the assassination of a beloved president. And then we had the Sixties.

In similar fashion, the Sixties didn’t end in 1970 but in 1974, with the culmination of that war and the impending impeachment and resignation of the president who came in promising to end it. And the economic malaise—recession and inflation, “stagflation”—that started with the oil embargo and gas crisis in that year marked the Seventies and didn’t really end until about 1984. (Another year whose mention will give you the willies.) And then the computer revolution and economic growth of the Eighties, which started with wider acceptance of small, desktop computers and personal software, fueling the “tech stock” market boom, changed the economic structure of the country and continued through the early Nineties.

You can define your own parallels to this theory, in cultural norms, music, clothing styles, and sexual mores, but I think the pattern holds true. Decades don’t change on the zero year but about four years later.

I think something similar happens with centuries. But there the change is not in the fourth decade but in the third, the years counted in twenties.

The 18th century, which was marked—at least in the Western European sphere—by the wars between England and France, culminating in revolution in the American colonies and then in France, the rise of Napoleon, and the struggle for all of Europe, extending to the shores of the Nile and the heart of Moscow, did not really end until the middle of the eighteen-teens. Similarly, the breakaway of the United States from England and the finding of this new country’s own character did not really end until the War of 1812. After that decade, the 19th century came into its own.

And then, the Victorian, imperial, colonial über-culture that—at least in Western Europe—took the superiority of one race and one extended culture for granted, did not end until the shattering madness of World War I. And what came out of that war was the new century, with a perhaps more enlightened view about races and cultures—at least for some people—but also a clutch of hardened counter-ideologies and the technological means to pursue them to horrific ends. After that first global war, the 20th century came into its own.

And finally, the 20th century has been with us ever since. The fall of the Soviet bloc and putative end of the Cold War in 1989 was a highway marker, but the effects of that lingering aggression and its bunch of little colonial brush wars and invasions (Korea, Vietnam, Grenada, Kuwait, Iraq, Afghanistan) continued along the same lines, although those lines were becoming blurred with the rise of Arab and Muslim nationalism and the flexing of Chinese muscles. And over them all has loomed the technological changes in warfare that started in World War I, with the machine gun and chemical agents, and continued in World War II, with the jet engine, the rocket, and the atomic bomb. The century saw war become less about armies marching against each other on a common battlefield and, because of “the bomb,” more about political and ideological maneuvering, guerrilla techniques, and terrorist tactics.

You can define your own parallels to this theory with the way that political and cultural norms, music, clothing styles, and even sexual mores changed in the 1920s. The theory still holds: centuries don’t change on the zero-zero year but about twenty years later.

So, although we have been living in the 21st century for the past two decades, it still feels like an extension of the 20th century. We have the same international tensions, the same small wars in far-off countries, the same political differences at home. And, aside from the power of my handheld phone and the number of small computers controlling everything from my car’s ignition to my coffeepot, this century doesn’t feel all that different from the one in which I was born and raised. But still, I sense big things coming, and hence the existential dread.

What are these “things”? My internal leaf-reader and star-mapper doesn’t know. But my sense as a science-fiction writer can offer a few guideposts.

First, the technical revolution that brought us the small computer as a workhorse in everyday life will bring us artificial intelligence in the same role, but maybe more aggressive. And unlike the one big IBM 360 that sat in the basement of the computer science lab at Penn State in the 1960s and ran most of the university’s administrative functions on campus, or the one all-pervasive “Skynet” that destroys the world in the Terminator movies, artificial intelligences will be distributed entities, functioning everywhere. As soon as we work out the neural-net (i.e., capable of learning) architecture and marry it to the right programming, these intelligences will proliferate through our technology and our society like chips and software.1

But don’t think of “intelligence” as being a human-type soul or a little man in a software hat or even a Siri or Alexa. You won’t be holding a verbal discussion with your coffeemaker about whether you feel like dark or medium roast this morning. Instead, you will find that your world is going to have an eerie sense of precognition: answers and opportunities are going to come your way almost seamlessly, based on your past behavior and choices. Your phone is going to become your life coach, trainer, technical expert, and conscience. This trend is only going to expand and multiply—and that’s just on the personal level.

On the macro level, business operations and relationships, lawyers and judges, doctors and hospitals, and any sphere you can think of where knowledge and foresight are valuable, will change remarkably. You won’t go into a contract negotiation, a court appearance, or a diagnostic session without the expert system at your elbow as a kind of silicon consigliere. We’ve already seen the Jeopardy-playing IBM Watson prove itself as a master of languages, puzzles and games, and historical, cultural, scientific, and technical references. The company is now selling Watson Analytics to help manage business operations. This trend is only going to expand and multiply.

Second, the biological revolution that brought genetics into the study of medicine—and the sequencing of the human genome was completed on the cusp of this 21st century—will see a complete makeover of the practice. In this century, we will come to know and understand the function of every gene in the human body, which means every protein, which means almost every chemical created in or affecting the human body. That will change our understanding not only of disease but of health itself. Sometime in this century—all of the ethical handwringing aside—we are going to be modifying human genes in vivo as well as in egg, sperm, and embryo. From that will come children with superior capabilities, designer hair and eye color (just for a start), and stronger immune systems, among other things. The definition of “human” will be rewritten. Adults will benefit, too, by short-circuiting disease and regaining strength in old age. This trend is already under way with gene therapies, and it will explode in practice and popularity.

Moreover, the nature of our material world will change. Already, scientists are examining the genetic capabilities of other organisms—for example, Craig Venter’s people sifting the seawater on voyages across the oceans, looking for plankton with unique gene sequences—and adapting them to common bacteria and algae. You want a pond scum that lies in the sun and releases lipids that you can then skim up and refine like oil? That’s in the works. You want to harvest beans that have valuable nutritional proteins, bleed red, and taste like meat? That’s in development, too. You want synthetic spider silk and plastics as strong as—or stronger than—steel? You want carbon graphene sheets and Bucky-ball liquids that have the strength of diamonds, the electrical capacitance of metals, and the lightness and flexibility of silk? Just wait a few years.

As they say in the song, “You ain’t seen nothing yet.”

1. See Gutenberg and Automation from February 20, 2011.