Sunday, July 5, 2020

Without the Option

Dead bird

This is a dark thought. So if you are at all depressed or suicidal—I’m not, just thinking out loud here—please stop reading and come back some other day. If you are strong and happy, read on at your peril.

The Covid-19 situation has us all thinking, marginally if not centrally, about our own mortality—especially if we are in the age group of the Baby Boomers, whom this disease particularly seems to like. Yes, I could be hit by a bus on the street—or run over on my motorcycle by a semi-trailer truck—and be killed today. Yes, I could develop cancer or some other devastating disease, and my life could turn terminal tomorrow. Yes, I could get the regular old Influenza A HxNx and die of its debilitating symptoms sometime this year or next. But we haven’t been soaked in four months of statistics about any of those causes and how many are dying each day, each week, each month. This Covid-19 doesn’t sink into the background noise of daily life but remains at the forefront. So, for those of us old enough to take notice, the thought of impending death seeps into our brains.

As an atheist, I am without the comforting option of any kind of belief in an eternal afterlife.1 I know—or certainly believe—that when I die, my mind and my various brain functions, such as thought and memory, will cease along with my bodily functions.2 I will not ascend to some other sphere as a discorporate spirit or psychic wave or sentient vibration. I—the part of me that thinks and plans and hopes—will quickly disappear into darkness. I will not sit on a cloud and look down on this world, on my surviving friends and family, or on any part of my reputation that might live after me, and feel anything positive or negative about them. I will not care. I will be as dead as roadkill, or a tree fallen in the forest, both of which eventually return to dust and their component atoms, leaving no discernible trace in the world. And in a hundred or thousand years, my life will have just as much meaning as that tree or animal among whatever passes for my distantly related family members or Western civilization itself. “Dust thou art, and unto dust thou shalt return”—body and soul, or so I believe.

You might believe this thought would be terrifying. That being nearer to death now than at any time in the past—when, as a younger person, I could cheerfully forget or ignore my mortal nature—would make me dread and fear those last minutes, make me scramble around in this life, frantically trying to put off death and preserve every hour, every minute of breathable viability. Or that it would make me rush out to experience every possible moment of joy or passion or novelty this life still has to offer. But that is not the way.

I am the same person I was up until February of this year: measured, thoughtful, introspective, and curiously unafraid.3 I am unlikely to become panicky or rushed just because the death that was always near has come a little closer—but then, ask me again five minutes before the final exit.

From this vantage point, however, I find death is not so scary. In fact, it will be something of a relief.

For one thing, I will find freedom from responsibility. It seems my life right now—and for all the years before this—has been an endless and widening cycle of responsibilities. These days, I must gather and protect my financial resources, because I am unlikely to earn any more against the future. I must pay my taxes, my condo dues, my ongoing debts—even though I try to pay the latter down every month and am careful about incurring more. I must care for my family members—in these days of coronavirus more in spirit than by my actual presence. I must walk my dog four times a day, following along her trail of smells and sniffs, because we live on the twelfth floor and I cannot just open the back door to let her out into a protected yard.4 But these are just my largest responsibilities today, and they are shared with almost everyone in my age group.

In my own particular makeup, I have lesser responsibilities that have been with me since childhood. Most are the residue of a lingering obsessive-compulsive disorder; the rest are the result of my upbringing by careful parents. I keep straightening pictures that go askew, as well as area rugs—which must align with the pattern in the parquet flooring—and the corners of my piles of books and magazines. I keep wiping, cleaning, polishing—caring for!—surfaces and finishes. I keep my clothes neat and clean—although I don’t iron them anymore, thank you. I must keep the car neat enough to entertain guests, as well as gassed up, serviced, and ready to roll. I do the same for the motorcycle, plus wipe dust off the shiny surfaces every time I take it out and clean bug splatters every time I bring it back. I worry over every scratch and stone chip in the paint, and chase every blemish with a dab of clearcoat followed by polishing compound. I wash and wax, where applicable, relentlessly.

For another thing, death will release me from the need to be and stay strong. That was the way I was brought up—as were, I suppose, most of the children in my generation. We were taught by parents who had gone through the Great Depression and World War II themselves to be resilient, enduring, patient, and uncomplaining. When work would get hard or complicated, and I would have to stay late or come in over the weekend, that was simply the price of being an adult. The inconvenience of a head cold does not outweigh the daily pattern of obligations; it is nothing to stay home and pamper myself over—certainly nothing to justify depriving the dog of her walks and resolving to clean up any messes she might make indoors. When my back goes into spasms—as it does in the cold and damp weather—and bending over is hard, that’s not enough to make me stop filling her water bowl, or leave a piece of lint on the floor, or let a rug remain askew. The pattern of life, as established, is more important than its minor disruptions.

Putting up with pain and inconvenience, suffering through that which must be endured, walking with back straight and unbowed into the whirlwind—this is the price of fulfilling my own self-image and the precepts that my parents followed and taught my generation.

Death, when it comes, will be a release of self from the web of life. Even if that is without the option of an afterlife, it may come as a blessing.

1. See, for example, My Idea of Heaven from July 22, 2012.

2. However, brain function may persist for some seconds or minutes after the body stops working. The story is told of Anne Boleyn, whose decapitated head, when held up for inspection, looked down on her severed body and moved her mouth as if speaking. We also know from extensive medical experience that brain cells can survive and be revived without irreversible damage for three to six minutes after the blood stops flowing. We do not die all at once. But those mere minutes are not a basis for belief in eternity.

3. See also Fear Itself from June 10, 2018.

4. But hey, it’s good exercise for me, too.

Sunday, June 28, 2020

The Antifanatic

Apple tease

Everyone these days is talking about “Antifa,” the “antifascists,” who enforce their views with bricks and fists. Well, I have a cure for that. It’s the realization that no one is morally or ethically or humanly required on any and all occasions to speak out, speak loud, speak proud. I call it the “Antifanatic.”

The call to action—putting your heart and soul into discussion or pursuit of a particular proposition—is all too often the call for absolutism. Distinctions are pared away. Qualifications are eliminated. Shades of gray are painted over. The pursuer—or defender—is left with a nice, clear, easy to understand, impossible to refute “yes” or “no.” Everyone is either with us or against us. Do or die. Death or glory. … And this is a remarkable renunciation of the human capacity for reason.

That said, you don’t want to enter into hard battle, a shooting war, life or death, fight or flight while you are beset by doubts. When the bullets are flying your way, you don’t want to be—and you don’t want those on your side of the trench line to be—questioning, maybe, your responsibility, just a little bit, and your moral justification for receiving those bullets. When war comes, you put your quibbles aside and take up the gun.

But not every social situation is a war. Not every injustice is a cause for rebellion and revolution. And that seems to be the problem today, in our politics, our economics, our culture, and our social connections. Every issue rises to the level of New York Times 48-point Cheltenham Bold Extra Condensed all-caps headline font, usually reserved for a declaration of war. The volume is turned up to twelve on a scale of ten. And so distinctions and qualifications drown in the noise of drumbeats.

I say this because most of life is gray. Living is a matter of distinctions and qualifications. Choices are seldom “yes” or “no” but more a case of “better” and “worse,” of “maybe” and “it depends.” In very few cases is the benefit or the risk absolute; more often it is a matter of calculation. And that is what the human brain was developed for, after a million years of deciding how to live in the world. In the hunt, there is no perfect knowledge and no obvious choice. Instead, the game may be in sight but at just a bit too long a distance, or with the wind or foliage causing interference. There is seldom a perfect moment for the shot, just a collection of perceived opportunities and risks. And in the gathering of foodstuffs, there is no single, perfect berry or tuber to be chosen and plucked, only a selection of those on either side of optimum size or ripeness. How willing a person might be to eat an overripe apple, perhaps eating around a spot of brown decay or a maggot on one side, depends on how hungry that person is. Circumstances and perspective are everything.

But it’s hard to get a political movement or a religion—and these days they can often be confused—started with an overlay of rational analysis, the weighing of distinctions and qualifications, the necessary consideration of “maybe” and “it depends.” The political leader wants his followers to shout “Yes!” The pastor wants his congregation to give a big “Amen!” That’s how you build a compliant crowd—by paring away individual thoughts and distinctions and cutting away any middle ground.

As a writer, I do not have the luxury of retreating into absolutes, of indulging the human craving for pure black or pure white. The writer’s job is to see multiple viewpoints, or so I believe. My characters are not—should not be, cannot be—creatures of simple tastes and absolute beliefs. My heroes need to have doubts and failings. My villains must have thought and reason.1

Every human has a story. Most humans—except those who were mentally impaired to begin with, or have lost their minds through fever, dementia, or other illness—have reasons for their choices and actions. The story may be tragic and leave the person bruised and selfish. The reasons may be faulty and the choices hasty. But those are all clues to character. Every human being is a ball of string set in the glue of circumstance and opportunity. And it’s the writer’s job with an imagined fictional character—or the psychiatrist’s, with a real patient—to untangle or cut through the string, to see how the glue seeps into the layers of choice and desire, and to find the main strands of personality. This is the excitement of exploring a character as he or she navigates the obstacles of a plot and action.

The fanatic would dismiss all of this discernment and analysis as irrelevant. The fanatic pushes a single thread, a single viewpoint, to the edge of absurdity—and sometimes goes right over into virtual madness. Any position taken as an absolute will lead to absurdity. In a world of “zero tolerance” about personal weapons, a nail file or a butter knife becomes a verboten possession. In a world of absolute support for a woman’s right to choose, strangling a baby at birth becomes a reasonable option. In a world of absolute justice without compassion or mercy, a man can be hanged for stealing a loaf of bread. Fanatics always end up defending positions that any reasonable person would find questionable, unjustified, and sometimes abhorrent.2

Reason is the human birthright. The ability to think and act as an individual, rather than as part of an arm-waving crowd, is every person’s prerogative. This is why human society is so varied and diverse: we each see a different set of distinctions, shades of gray, opportunities and obligations, benefits and risks.

Let’s keep it that way.

1. This is one reason I don’t write fantasy. It’s too easy to paint the villains as mindless evil for evil’s sake. Even a psychopath or a sociopath has a background story and a reason for his or her choices. In The Lord of the Rings, for example, the titular character and driver of the entire story line, Sauron, is never shown as a viewpoint character or even as any kind of living presence. Sauron is pure evil, malice without reason, power for power’s sake, the dark cloud that stretches in the east. We never get inside Sauron’s mind because there is no human mind there, just the darkness. This is not a failure on Tolkien’s part, but Sauron is the one character in the whole saga who operates without a discernible motivating cause and cannot be mistaken for human. Sauron is the fire, the storm, the mindless force of raw nature, not someone with whom we can identify—or bargain, negotiate, and reach agreement.

2. One of the reasons I am not a serious collector of things—stamps, coins, Hummel figurines—is that the initial attraction and the “Hey, neat!” response of the beginner eventually becomes the feverish pursuit of the rarest, least obtainable, and sometimes the ugliest or most flawed member of the defined group. (This is because ugly or flawed things usually don’t get produced in large numbers. Cf. the U.S. “Inverted Jenny” 24-cent airmail postage stamp.) I want to preserve the beginner’s initial fascination and avoid the fanatic’s ultimate heartbreak.

Sunday, June 21, 2020

The Static Viewpoint

Perspective

I sometimes wonder if the average human mind, born and raised into a particular place and time, has actually caught on to the reality that everything in life, the universe, and your own personal patch of ground is changing. All the time. With or without—usually without—your help. We all live in a kinetic universe. So get over it.

The adjective static and its noun form stasis come to us through Latin from Greek roots meaning “to stand.” And since this is not standing up or sitting down, rising or falling, coming or going, or moving from place to place, it implies “to stand still.” Everyone is guilty, depending on the subject and the circumstances, of thinking that the world around them hasn’t changed or won’t change, or, when they do observe change, of hoping that it will eventually reverse and things will come back to being the same.

To start with, most parents are by turns amazed and disappointed when their children grow and develop from cute babies to fretful toddlers, to winsome seven-year-olds, to moody, snarling teenagers. We want them to stay at each biddable stage and not progress. Sometimes they grow up to be adults we recognize and with whom we can become friends. But sometimes they grow up into strangers and even enemies. This is the pattern of life. And wouldn’t we hate it if our child failed to develop, through autism or some intellectual or emotional disability, so that he or she remained a charming imbecile, or a ranting child, in an obviously adult body? That would be the tragedy.

You can see this penchant for static thinking in the heritage of the oldest religions. The world of gods and men is fixed and unchanging. Yes, there are storms and earthquakes, wars and plagues, all of which change the fortunes of individuals and families, but the nature of the world itself remains static. You see this most plainly in the creation myths, where the supreme deity creates the land and the animals and plants that live on it, and they are all one-offs. A horse is the representative of a fixed pattern—essentially the Platonic idea of horse-ness—that was set at its creation and named as “the horse.” Yes, there are obviously related animals like donkeys and zebras, but they, too, were supposedly singular creations. And the horse is designed from scratch with teeth and stomach to eat grass and with neat hooves to run over the gentle turf in valley bottomlands. Just as camels were designed from scratch with floppy feet to walk on shifting sands and with water-retaining flesh in their humps to survive many days between drinks.

Of course, the religious writers and theorists of two and three thousand years ago did not have a true appreciation for the age of the Earth. For them, creation was just a few hundred generations ago, at most, or perhaps a few thousand years in the distant past. A world that cooled out of the solar disk four and a half billion years ago would have been a thing beyond their understanding. Usually, they did not even have words and concepts for that large a number representing that long a span.

They also did not have a conception of tectonic forces, as I have noted elsewhere. If your world is just a few thousand years old, then you don’t ponder too long the fact that you can see hills slump down in continual landslides and mountains erode into silt and wash out to sea, and yet you never wonder why, over time, all the mountains and hills have not become flat. While you can see the agents of erosion and collapse, you never see the forces that built up the mountains and hills in the first place. It’s only when you have an appreciation for the great age of the planet and a working theory of plate tectonics—which wasn’t widely accepted until the 1960s—that you can understand the vast changes that take place over time on any part of the Earth’s surface and the necessity for all of these “created” life forms to have some mechanism for adapting and evolving to fit their current environment.

But antique religious explanations are not the only place that static viewpoints are held and nurtured. Political thinking of many types is bound up in the notion that the world and human consciousness do not or should not change.

Most obviously, there is the “conservative” view. It is a truism that conservatives1 favor tradition, that we are not offended by the vagaries of recent history, by the values we were taught and grew up with, or by the world as it presents itself to us. We are not ready to throw all of that—our parents’ world, or Western Civilization, or political values that pertained before we were born—away on the promise of a brighter, or at least different, and perhaps utopian future.

Conservatives are not always backward looking—or “reactionary” in the words of Marxists and Hegelians, meaning opposed to the just and natural evolution of liberal human thought. Conservatives can view the future with equanimity and a certain amount of hope. But we also know that history is not a just-so story, that it does not always bend in a particular direction, or not always in the direction that certain political parties—followers of those 19th-century Marxists and Hegelians—believe to be natural and inevitable. Conservatives can understand that history, and much of reality, is cyclical: ideas, political movements, economic and political conditions and parties, rise and fall against a background of human nature that has certainly evolved over many generations but is unlikely to change much from one generation to the next.

And then there is the stasis of the political left, the so-called “progressives.” They may believe that human nature is malleable and can be bent in certain directions toward idealistic perfection. But in the economic sphere, they clearly believe that a country’s goods and services, its money supply, and its scale and rate of personal and business transactions are all fixed. They believe the economy is a zero-sum game, where one person wins and prospers only through another person’s loss and misery. If one person grows rich, then he must have made a number of other people poor in proportion. As I have written before,2 the economy is not a pie, where if I take a bigger slice you must then get a smaller one. Instead, the economy is an ecology, where the more life that exists and the more transactions that take place, the more wealth is generated for people to garner and to hold onto or share. The difference between a rainforest and a desert is not simply the amount of rain that falls, but the scale of interactions taking place on the ground and in the forest canopy.

And then, in the natural world, the progressives believe that conditions are—or should be—fixed and unchanging. They revile “climate change,” which also runs under the heading of “anthropogenic global warming,” because without humankind’s burning of fossil fuels and other environmental insults, they believe, the world would not change around them. They view a rise in global temperatures by a few degrees centigrade in a hundred years as an absolute catastrophe, missing the point that global temperatures have been rising steadily since the Maunder Minimum—a blank-faced Sun lacking in magnetic disturbances and sunspots—in the 17th century. They dismiss any historic temperature fluctuations—the Roman Warm Period, the Dark Ages Cold Period, the Medieval Warm Period, and that Little Ice Age starting in the 1600s—as a political fantasy. Similarly, they believe the sea level won’t rise without anthropogenic causes, as if shorelines haven’t been shifting, advancing and retreating, causing the relocation of city centers and populations for centuries, if not millennia.3 They also believe in a fixed amount of resources relative to human population growth and density, hence the predictions of Malthusian famines and “peak oil.”

And finally, progressives appear to believe that, once they have achieved their goals, once human nature has been adjusted and perfected, society is ordered according to rational and egalitarian principles, and the world is led by like-minded professionals, then history will stop. That humankind, having reached the socialist or communist utopia, will cease to change and will simply roll forward, from one five-year plan to the next, in peace and personal safety, forever. For any thinking person, this idea is as narrow and stultifying as the religious conception that when we die we migrate to a place of either perfect bliss or perfect torment and reside there, as a disembodied but functioning and remembering consciousness, forever.4

Disbelief in and fear of change are written into most of the human experience. Change that we do not consciously plan for and initiate engenders the unknown. We believe we can plan for the future, and so a future that is moving ahead without us creates anxiety and dread. This is the product of having an active, thinking, projecting mind—the workings of the brain’s prefrontal cortex—that tries to meet an active, changing environment with an unknowable future.

And that, too, is part of the human condition.

1. Consider the very nature of that word, conservative. It means someone who “conserves” or “preserves.” A conservator is one who enters into another person’s life—when this person’s health or mental condition is failing—and helps them preserve their living situation, their wealth, and their dignity. To conserve something is to protect it and keep it from the ravages of time, the vicissitudes of nature, or careless waste by thoughtless humans.

2. See, for example, It Isn’t a Pie from October 3, 2010.

3. As if 11,000 years ago—well within the experience of modern human beings—the glaciers did not cover North America in ice a mile deep, and the shoreline did not extend down 400 feet to the edges of the continental shelf, all without human intervention.

4. And isn’t “forever” itself a static concept? Nothing is forever.

Sunday, June 14, 2020

Seeding the Stars

Starfield

I have already written about the ubiquitous nature of DNA on this planet,1 how every life form we can find—plant, animal, bacterial, and so on—uses the same basic replication system no matter how distantly isolated or temporally antique that organism may be. DNA is so successful, and so universal, with no trace of any competing system, that it may not have evolved on Earth at all. And that raises the question of how an intelligent species goes about adapting its kind of life to the different kinds of planets likely to be found in nearby solar systems.

In James Blish’s The Seedling Stars from 1957, scientists from Earth visit various exoplanets, sample their environments, and then genetically modify human beings to exist and thrive under those conditions. Perhaps they will have to breathe a different atmosphere, or stand up under heavier gravity, or—in one case—live as microorganisms in a bacteria-rich puddle on an otherwise dry and barren planet. It’s an interesting concept: making humans adapt to the environment on the ground, rather than living under domes of tailored atmosphere or trying to terraform the planet to fit human needs.

But Blish’s story requires a lot of hands-on synthesis. The geneticists and environmentalists on the seeding ships have to do a relatively quick study of the planet, figure out all of its advantages and pitfalls, and then redefine the human structure in exquisite detail—adapting the entire respiratory or alimentary system, adding muscle mass, or—in the case of that puddle—reducing the structure to microscopic size. While the concept makes a good story, I can imagine that if the scientists were wrong by a few degrees of deviation or just a couple of genes, the adapted humans would fail to function or die out in a generation or two.

But then, as we learned in the Biosphere 2 experiment, it is extremely difficult to develop a self-sustaining environment with resident human beings that is not continuously replenished from outside. We can maintain a crew aboard a ballistic missile submarine that operates submerged for months at a time only by having it surface, restock, and refit its systems after every mission. Living that way on the Moon or Mars without a continuously operating and available supply line from Earth may be more of a challenge than we think. Setting up to live under domes on a planet light years from our home world will likely be impossible.

The best way to adapt an organism to an alien planet may be to start small and literally from scratch. Send in a hardy bacterium that has only a few basic requirements for temperature, water, and food. Let it mutate, adapt, evolve, and spread by relying on the remarkable ability of the DNA-RNA-protein system to mutate and thrive according to available conditions. And then see what kind of life form—perhaps as intelligent as human beings, perhaps only as intelligent as dinosaurs and dogs, perhaps no more intelligent than slime molds—develops after a couple of million or a billion years.

Since you wouldn’t be sending your own life form—with its already developed physical characteristics, intelligence level, and environmental requirements—to the new planet, you wouldn’t be staking much on the “humanitarian” value of the colony and its fate. You might not even know that the planet you were sending your probe microbes to was capable of supporting life in the first place. You probably wouldn’t bother if they were going to land on a frozen gas giant or a hell world too close to its star. But close observation and exact measurement are difficult at interstellar distances. You could be wrong by a few degrees of deviation and still be successful in gaining a foothold for life—maybe not life in your own physical form but something sharing your basic gene structure—on a new world.

The odds of complete success—creating a life form that has recognizable intelligence and the ability one day to communicate back to your home world—would likely be small. And the timeframe would likely be long, on the order of millions or billions of years. So you would not place a great deal of trust or effort in the success of any single seeding mission. That being the case, you would not send just one or two missions to the most obvious planets and hope for the best. You would make minimal investment in hundreds or thousands of missions.

You might not be sending your designated seeds of life inside sealed and monitored environments aboard expensive, guided rockets with their own navigation and braking systems, as well as some method for gently sowing your seeds upon arrival. Instead, you might just reduce your hardy bacteria to a spore state that’s able to endure and protect its DNA-RNA-protein system for a thousand or a million years in space, scatter it on a comet or carbonaceous asteroid, and fling it out of your own solar system in any likely direction. Eventually, it will hit some solid body. Possibly, the spores will find a conducive environment and germinate. And maybe one time in a thousand, those seeded bacteria will mutate, evolve, and spread into something like complex life.

Do I think that’s what happened here? Do I think that we—the advanced hominids of planet Earth, product of three billion years of evolution—are the long-lost colonists of some ancient race of intelligent beings who were attempting to seed the stars with microbial DNA? Is there any way to know?

Not really. But it’s as likely an explanation as any other.

1. See Could DNA Evolve? from July 16, 2017. This blog describes the ubiquity on Earth of the DNA-RNA-protein system, not just in its basic pattern or structure but also in its finest details—like the choice among the various purines and pyrimidines for its four bases, the three-codon reading frame, and the limit of just twenty amino acids it uses in protein synthesis. This replication system seems to have evolved here without any significant or lasting competition—or perhaps it didn’t evolve here.

Sunday, June 7, 2020

The Other

Colored pencils

I may be naïve, especially when it comes to political theory. But I believe the root problem we face in this country in a time of divided politics is that we tend too often to see people in cohesive and monolithic groups and then identify them as “the other.”

“The other” is scary. It is by definition unknown, perhaps unknowable, not like us, not approachable, not possessing our own core values, not responding to our most basic hopes and fears … not really human. The other is the bogeyman, the monster, the unnamable thing that hides in plain sight yet out of sight, under the bed or in the closet. The other is the dark. The other is danger.

This fear of core difference is so buried in the human consciousness that it might go back to earlier times, when human beings were not all one species but divided by actual differences in their basic genome. Forty or fifty thousand years ago—that is, two thousand generations or more ago1—Homo sapiens wandered up out of the African Rift Valley and met an earlier form, H. neanderthalensis, in the forests of southern Europe. How the older model differed from the newer in terms of bone structure, braincase size, intellectual capabilities, motor skills, and social structure is still open to question. We know that a certain amount of interbreeding took place, because modern humans of European descent have been shown to carry about four percent of genes identified as Neanderthal, which are not present in humans who remained in Africa or took a different, more easterly route out of the continent.

Back at that meeting in the forest, the Neanderthals really were the other. Maybe they were tougher, or stronger, or clumsier, more or less energetic, or simply less sophisticated in their speaking, grooming, and eating habits. Maybe they just smelled wrong, so that only the most sexually adventurous among the H. sapiens band would try to pin down and mount one of their females—or not resist too strongly when one of their males tried the same thing. But while there may have been some interbreeding, there never was a coalescence. H. sapiens and H. neanderthalensis remained as separate human species and did not disappear into a new and improved—or devolved—human form. We know this because Europeans, even with that admixture of Neanderthal genes, are still thoroughly H. sapiens. Their genetics do not differ from those of human beings that remained in Africa or settled in Asia, other than those unimportant and easily blended genetic variations that all humans carry.2

But that early encounter with a different form of human being, layered on previous encounters up the line with Denisovans, members of H. erectus, and other hominins who were almost but not quite human, may have left a psychic scar. Certainly we don’t feel about other, more distant species—our dogs, cats, and horses—in the same distrustful way. And we fear tigers, bears, snakes, and spiders only because we have found them, or some of their related species, to be innately dangerous. But bears and snakes are not, for us human beings, “the other.” That status is reserved for people who are like us but definitely not us.

To see the other in people is a kind of blindness. It is to overlook the obvious case that we are all individuals with differences, some good traits and intentions, some bad, but mostly bland and indifferent. If you take people one at a time, as individuals, with a certain amount of basic respect for our commonalities and tolerance for our differences, then you don’t fall into the trap of treating whole groups of people as “the other.”

But these days it’s politically advantageous to divide people into groups, to define them by their difference rather than their commonality. And this tendency applies not just to physical, mental, and emotional differences but also—and sometimes more importantly—to differences in class, status, and sometimes even profession.

For many consumers, the manufacturers and retailers of basic necessities have become “the other.” For many who identify as blue collar or working class, the business owners and the financial interests that support them—collectively, capitalists—are “the other.” For many people with a chronic medical condition, doctors, hospitals, pharmaceutical and insurance companies become “the other.” In every case, the other is composed of people whose interests and intentions are markedly different from yours, therefore unknowable, therefore dangerous.

I just viewed the film based on John Le Carré’s The Constant Gardener, about a British diplomat uncovering the scandal of a European pharmaceutical company testing a flawed tuberculosis drug in Africa and hiding the negative results. The presumption is that the drug makers, the clinical trial operators, and the government diplomats coordinating between them are all so driven by profit and so desensitized to human suffering that bringing a failed drug with deadly side effects to market would seem like a good strategy. That is, the drug companies and their technical supporters are the unknowable and dangerous “other.”

For a number of years I worked in the pharmaceutical business as a documentation specialist. I can tell you that these companies at every level, from operators in the labs and production suites to executives in the home office, care very much about the health of their potential patients and the safety and efficacy of their products. This is because these people are fully human and have respect for themselves, their endeavors, and their fellow human beings. Bringing out a bad drug that kills people, however profitable it might be in the short term, is a bad business decision in the long term. Consequences catch up with you. No one wants to be ashamed of the company they work for because it carelessly killed people. The scientists and technicians I worked alongside were not “the other” but instead were respectable human beings who cared about human life, as well as the laws and accepted practices of the society in which they operate.

To suppose that pharmaceutical employees are soulless demons driven solely by the profit motive—or that bankers are heartless demons seeking to foreclose on their borrowers, or that food processors and grocers are mindless demons selling poisons disguised as nourishment—is to miss the fact that these are all people endowed with much the same sensibilities and concerns as yourself. If you believe that humans in society are basically corrupt and conscienceless, then you might also believe that respected local museums, regardless of their dedication to art and history, quietly sell off their most valuable artifacts to private collectors for profit and replace them with fakes fabricated by their curators. Or that doctors, despite their Hippocratic oath, routinely treat patients with unnecessary tests, procedures, and medicines simply to inflate their billings. Or that police officers, regardless of their oath to protect and serve, are sociopathic bullies who use their power to mistreat innocent civilians.

To see only the differences in people and fail to grant them provisional respect and tolerance as human beings is to succumb to the fear and loathing of “the other.” To see people as faceless masses with unknown motives and intentions is to succumb to the myth of “the other.” Casual discrimination, blindly lumping people together as types, remaining deaf to individual traits and capabilities, is easy. It’s lazy. It’s the failure to act as a thinking being.

And it’s tearing us apart.

1. Counting a generation as approximately twenty years. Of course, in hunter-gatherer societies, where life was closer to the bone, child bearing and so the start of the next generational cycle might begin soon after puberty. That would put a “generation” at more like twelve to fifteen years. But we’re dealing in approximations here.

2. Here I’m talking about inherited characteristics like peculiarities in the shape of eyelid, nose, or lips, certain distinctive body types, and certain predispositions to or protections against disease. The most obvious genetic difference, that of skin color, is the most widely variable and pliable characteristic. It has more to do with where a population most recently resided—in the sunlight-rich tropical regions or in the relatively sunlight-poor higher latitudes—than any immutable genetic predisposition. Take a Congolese family to Finland and let them live there for a hundred generations without interbreeding with the locals, and their skins will naturally lighten. Transplant a Finnish family to the Congo for a hundred generations, and their skins will darken. Skin coloration—an increase in melanin distribution—is a protective adaptation against ultraviolet radiation and not a permanent feature of the visible characteristics we tend to associate with “race.”

Sunday, May 31, 2020

Life as a River

River rapids

The river as a metaphor for human life is one of the oldest clichés.1 But sometimes life imitates metaphor, and this time may be one of them.

When you’re traveling on a river that is broad, flat, and smooth, sometimes you hear a distant roaring, the sound of rapids and maybe even a waterfall ahead. You can’t turn around, you can’t stop the flow—either of the river or of time itself. All you can do is move forward in hope and confidence, knowing that your skills and courage might let you survive whatever drop in elevation and resulting rapids and rocks the river ahead throws at you. Or not, and then you will be beyond caring, because you are drowned and dead.

And if you could find a way off the river—pull to shore and abandon the water and its currents, stop time itself—all you would be doing is either placing yourself in limbo or joining another river. And there you might not even hear the falls before you went over them.

When I was growing up, my parents taught a severe form of bravery that was in one sense pure fatalism. Some bad things are just going to happen, so you might as well face them and get it over with. Or, as my mother would say when we had to clean up certain messes, “If that’s the worst thing you ever have to put your hands into, consider yourself lucky.” Life is hard, and there’s no use in cowering, because whatever lies ahead will come to you anyway.

But these were people who, despite having been raised in loving and relatively well-to-do families, graduated from high school into the start of the Great Depression, then graduated from college at its depths, and finally moved forward into World War II. In those days, if you weren’t strong, capable, flexible, and emotionally resilient, you collapsed under the weight of your own fear and despair.

I’ve experienced a similar fatalism—and written about it elsewhere—while riding a motorcycle.2 Sometimes you are faced with a difficult, decreasing-radius curve, or you take a bad line through any curve, or you suddenly discover an obstacle lying in the road around a corner. Or sometimes another driver cuts you off and suddenly truncates your path to safety. There is no way, on a motorcycle, to stop time, to reconsider, to take measurements, and then to lay out, analyze, and choose among all the available options. You just have to deal with what’s coming in real time, relying on all your learned skills and reflexes, hope for the best, and choose to have no regrets.

Twenty-twenty has proven to be a time something like the decade my parents faced. We started off in January with a hugely successful economic environment, low unemployment, and bright prospects. Then a novel virus with unknown but frightening prospects for transmissibility and lethality—and with remarkable differences of public and professional opinion, even among scientific and political experts, about its actual effects, its statistics, and even its origins and possible human engineering—took the country, the world, and the global markets into economic standstill, if not freefall. In this country we were already in the midst of political turmoil, with one party declaring all-out “resistance” to a legally elected president, even challenging that election itself because, as sometimes happens, the popular and electoral votes did not coincide.

In the midst of what is shaping up to be another Great Recession, if not an economic malaise worse than the Great Depression, we are headed into a national election that is sure to be contested. If President Trump is elected with another minority popular vote, or valid claims of voter suppression, or any whiff of foreign collusion and interference, then the Democratic, progressive left will explode. If the prospective Democratic candidate Joe Biden, who appears to labor under some obvious mental handicaps, is elected along with a vice president largely chosen by the party to serve out the four-year term in the event of his incapacity, and with any hint of vote fraud or “ballot harvesting,” then the Republican, conservative right will explode. In any event, the strictures of the pandemic may lead some to call for the election itself to be postponed or delayed indefinitely—and then everyone’s head will explode.

These are difficult times. We are exposed to medical, political, and economic stresses that I have not experienced in my long years as a politically conscious adult. And as I have expressed recently, I don’t know what the future will bring. I never thought, as I was entering the placid delta of my life, with the beckoning, anonymous sea and its promise of dissolution just ahead, that I would hear the roar of rapids in front of me. At this point, all I can do is lighten my load, tighten my straps, firm up my grip on the paddle, and get ready to ride the river.

1. You are born in a spring that seeps from a hillside, far above the plains; spend your youth tumbling over rocks and rills, suffering the pains of early childhood and adolescent development as a physical and social being; enter the broad stream of experience, skills, and achievements as a competent adult; sometimes become trapped in a lake, where the stream no longer carries you forward, so that you must paddle hard to get anywhere; end up in the still, sluggish waters of the delta in late age, with all your forces spent; and finally get flushed out to dissolution in the great and anonymous sea. It’s a metaphor that ultimately tells you nothing new about life.

2. Much of this also applies while driving a car, but inside the steel cage you are a little more distanced from cold, hard reality, and your knees are a bit farther from the hard, unforgiving pavement.

Sunday, May 24, 2020

The Most Stable Government

Double eagle

I hate to say it—and I mean this line of thinking offends me as a “little-D democrat”—but the most stable form of government in human history is hereditary monarchy. Hands down, it wins the race as the longest-running, most often chosen, quickest-to-revert-to form of political organization. It would seem to be the natural way for human beings to govern themselves, the hierarchical imperative.

I do not say it is the best form of government. Or that it’s the fairest, most efficient, or most rational form. Just that it is the most stable—although it’s not exactly that in the short term, either. It’s the form that every society keeps coming back to.

Ancient Rome from its founding had seven kings,1 the last of whom was deposed in favor of a democratically based republican form of government that lasted nearly 500 years. The Republic was a system of meritocratic personal advancement through a course of political, military, and religious offices, culminating in election to a shared executive function, the consulship, that a man might hold only once in ten years. The Romans were deeply allergic to the idea of kingship, so much so that when they had to resort to a single leader holding extraordinary powers during a crisis, they instead used the term dictator. (This was simply Latin for “speaker.”) And yet, after a series of politically powerful men, having run the “course of honors” and already served their terms as consul, fought for ultimate power using their own armies in the Civil Wars of the first century B.C., the Romans adopted a virtual king in the person of the Caesarian imperator, or “field marshal.” (From this we get our term “emperor,” now generally intended to mean a supreme ruler above any number of petty kings and chiefs—of which the Roman Empire had many.)

The Athenian Greeks, the progenitors of our earliest ideas about democracy, veered between elected officials and power-holding “tyrants” for most of what we think of as their ancient Golden Age in the sixth to the fourth centuries B.C. But before they had democracy, they had the basileus, or “king.” And the Spartans never had much of a democracy, retaining kings (two at a time, in fact) who ruled alongside a council of “ephors,” or magistrates. After Athens lost the Peloponnesian War to Sparta, and then the whole country was subsumed into Macedonia under Philip II and his son Alexander, rule by hereditary kingship stayed with the Greeks and what remained of the Alexandrian empire until its eventual takeover by Rome.

At the beginning of the 20th century, most of Europe was nominally ruled by local kings (of the Spanish, Greeks, Danes, Swedes, and English, to name a few), or a Kaiser in Germany, or Tsar in Russia.2 Being an enlightened age, most of these kings’ powers were either overseen by or shared with some form of parliament, or diet in imperial Germany and Japan, or duma in imperial Russia. Some kings, like those in England and Sweden, were more social figureheads than persons of power. Some, like those in Germany and Russia, ruled as virtual autocrats—or tried to. Two world wars swept away the actual power of even the most autocratic sovereigns, but in the case of Russia and Germany the forces that took over quickly devolved into a new form of ruler—the Secretary General of the Communist Party in Russia and the Reich chancellor, or simply Der Führer, in Germany, who were kings in all but name. And if either Stalin or Hitler had left children capable in time of succeeding him, there’s little doubt those titles would have become hereditary.

Of course, most of the rest of the world in antiquity and up to modern times has been ruled by kings under one name or another: Pharaoh in Egypt, Sultan among the Turks, Great King in Persia, Emperor in China, and chiefs among the many native tribes of North America or full kings among the urbanized native cultures of Central and South America. When Europeans conquered and attempted to colonize and “civilize” these lands, they eventually tried to bring in some form of parliamentary democracy or Western bureaucracy. But it seldom took hold, except perhaps in India. And China in the 20th century quickly went from the last imperial dynasty to a republic, and then to government by the Communist Party under Mao Zedong, who was the new “Red Emperor” in all but name.

Falling into line under the leadership of one man—or more rarely a woman—and obeying his or her orders seems to be in our human genes, going back to the hierarchical organization of the monkey troop. In moments of crisis—and there is always a crisis, sometime, somewhere—we rely on the proven or probable skills and knowledge of a military, political, or spiritual leader, or whatever the tribe needs. This is rule of the fittest by common consensus. But once that person has tasted power, it’s difficult not to succumb to the temptation of continuing the crisis to stay in power. And this tendency is exacerbated by the leader’s naturally surrounding himself—and sometimes herself—with a cadre of lieutenants, counselors, or acolytes, to whom he or she owes favors and delegates powers in their own right, and from whom he or she exacts loyalty and support in the face of all challengers.

Sometimes, as in the Native American cultures, a tribe might rally around a war leader in times of military struggle and then a political or diplomatic leader or elder in times of peace and negotiation. The tribe’s leadership would be fluid and flexible. But those arrangements would occur in small groups, an extended clan or village, where almost everyone knew every member of the tribe. In larger groups, or groups that have grown larger by conquest, the person of the king becomes isolated, distant, and cloaked in ceremony and privilege. Then the functions of military, political, and sometimes even spiritual leader become blended in a single person. And because people have an innate respect for genes and heredity, it’s easy for a king to promote his eldest or most capable son as heir to the throne. Even if the king dies while the heir is still a child, that cadre of lieutenants and counselors will close ranks around the throne and defend the child’s rights, or promote a regent to serve in power until the child reaches maturity.

This is all very old stuff, going back to patterns laid down in human prehistory. And it works for most people, because democracy as practiced in its ideal form is hard. People have to take time out of their daily lives to take note of and learn about the major issues confronting the tribe or the nation. They have to exercise the vote and make what they believe or hope to be an intelligent choice. Then they have to take responsibility when their candidate wins the election but ultimately fails in action and creates more crisis. They have to get and stay involved. They have to care. In a busy life with not much free time, people get tired of grappling with national priorities and making decisions—especially when most of the time they have to compromise in their views or hold their tongues when the opposite party wins an election and exercises its own version of power.

A king surrounded by appointed counselors and people of rank, who have superior knowledge and together can make decisions for the good of the country, becomes an acceptable form of government. Their decisions might not be the best, or what the average citizen would choose for him- or herself, but they are usually good enough. The system is stable enough to be allowed to continue. And when the country reaches a crisis, when the decisions are bad, then the king’s royal but non-ruling relatives and chief counselors stage a coup, hold an internal war within the capital, and create a new king whom everyone can trust to sort out the mess and get the country back on a good enough footing. The situation is stable—not permanently so, because there are always the occasional coups and interregnums—but stable enough. It soon becomes time-honored tradition.

The current political situation in America calls into question our long-standing traditions under a democratically elected republican form of government. Our constitutional government is now under attack in favor of rule by technical experts appointed to administrative bureaucracies under the Executive Branch. The majority of the rules we now live by are written by cabinet-level functionaries, rather than by elected legislators. The legislators, instead of framing laws we can all read and understand, instead write loose and sometimes hypothetical “wish lists” or desired “end states,” granting powers to those bureaucracies to then write the actual, detailed rules. When the laws that people are supposed to live by are no longer simple, obvious, and readily available, the republic is in danger.

And then the legislators themselves are no longer citizen candidates serving one or two terms as their civic duty. Instead, they have become lifelong officeholders insulated by their extensive staffs and their stronger connections with one party or the other. Their constituencies are defined by incomprehensible district lines, engineered through “gerrymandering” to yield a predictable party superiority. And when that fails, they achieve superiority by the threat or actual practice of voter suppression and ballot fraud. When the average citizen’s vote is no longer equally counted or is rendered meaningless, democracy is in danger.

The Democratic Party used its system of “super delegates” to quash the nomination of Bernie Sanders at the 2016 convention and to put in place as its chosen candidate Hillary Clinton. That is a failure of democracy, at least on the party level. In a backlash against the unpopular Clinton—and in part against her unfortunate “deplorables” comment—we saw the populist election of outsider and demagogue Donald Trump in 2016, which brought in a charismatic figure who volubly opposes “the Swamp” of bureaucratic politics. However these anti-democratic forces play out, through repeated soft coup attempts or eventual open warfare, it’s going to be bad for a nation of laws, civility, and the traditional practice of peacefully relinquishing power after losing an election. And when the democratic structure supporting a republic collapses, whether through political crisis or civil war, the likely result is the surviving party establishing some kind of leader figure who is king in all but name.

After the Constitutional Convention of 1787 in Philadelphia, someone asked Benjamin Franklin what the group had created. His reply was, “A republic, if you can keep it.” These days, we may be very close to losing it.

1. Or that’s the tradition. It turns out that those legendary seven covered a span of roughly 250 years, from the city’s founding on seven hills in a bend of the Tiber around 753 B.C. to the expulsion of the last king and the creation of the republic about 509 B.C. That’s a remarkable span, and the dating of the various kings is inexact, but each of them would have had to rule on average some thirty-five years apiece, in a primitive village founded around the margins of a great swamp. I’m not saying it’s impossible, but it is unlikely.

2. It’s a commonplace that “Kaiser” and “Tsar” are simply local linguistic forms of the original “Caesar,” showing how deeply the idea of emperorship and the name of Rome’s first incumbent marked European thinking.

Sunday, May 17, 2020

A Fly in Amber

Fly in amber

Right now, I’m stuck. I used to think this was a temporary condition, with my brain caught at top dead center.1 Now I feel like a fly in amber, with my brain trapped in a hopeless yellow fog.

The trouble started on Christmas Eve 2019, when I fell while walking the dog and broke three bones in my left hand and wrist. Aside from the pain, the inconvenience slowed my writing process, where my brain speaks directly to the keyboard and screen through my fingertips. One-handed spider typing interferes with this, and so I gave myself a break from the current book. And in January I got a helluva cold, or flu, or something—maybe an offshoot of the coronavirus, but probably not—which has resurfaced every couple of weeks since then. So I’ve had reason to delay my writing and give myself a longer break, extending into a dispiriting couple of months.

The truth is, to begin with, I’m not sure about the book I was working on at the time of the accident. I had just started outlining and doing initial drafts of the story about a young American army officer who, as punishment for an international incident, gets a reduction in rank and a posting to Mars in the 22nd century, where he will head security at the nearly defunct U.S. embassy. Fortunately for him—or not—he arrives right when the various factions on the planet plan and pull off a revolution or a war for independence, and he has to deal with that in military fashion. Great stuff! Future stuff, with AIs, evolved politics, and a mysterious female! Except … as much as I know about Mars, I don’t know, or can’t imagine, much of anything new that any other writer hasn’t used before. And I’m not really sure I believe in colonizing Mars in the first place.

I mean, it’s a rock. The atmosphere is carbon dioxide with a surface pressure less than one percent that of Earth. Open a window at 100,000 feet, and you’re dealing with roughly the same pressure, except that on Mars what little gas there is would be unbreathable anyway.2 Because Mars has no magnetic field and such a thin atmosphere, solar wind and radiation are deadly on the surface without additional shielding. And if the planet has water, there’s not much of it, or not enough in any one place for human habitation to exploit casually. If you want some new land to colonize, go to Siberia, Patagonia, or Antarctica—they’re all a lot warmer and you can still breathe the atmosphere. It would be easier to build a five-star hotel with an Olympic-sized swimming pool on the South Col of Mount Everest: the atmosphere is better and the logistics are much more manageable. Aside from the glory of the achievement, Mars is a really hard sell. For that matter, the Moon’s logistics and travel times are better than those of Mars, and its atmosphere is just a slightly harder vacuum.3
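
For what it’s worth, here is a quick numerical sketch of that comparison, in Python, using rounded U.S. Standard Atmosphere figures and a typical average for Mars; the altitudes and pressures are round illustrative numbers, not precise values:

    # A rough sanity check of the comparison above. The Earth figures are
    # rounded values from the U.S. Standard Atmosphere; the Mars figure is
    # a typical average surface pressure. All pressures in pascals.
    SEA_LEVEL = 101_325
    EARTH_PRESSURE = {          # altitude (feet) -> approximate pressure (Pa)
        35_000: 23_800,         # airliner cruise
        60_000: 7_200,
        100_000: 1_100,         # roughly 1 percent of sea level
    }
    MARS_SURFACE = 600          # about 0.6 percent of Earth's sea-level value

    for altitude, pressure in EARTH_PRESSURE.items():
        print(f"{altitude:>7,} ft: {pressure:>6,} Pa "
              f"({100 * pressure / SEA_LEVEL:4.1f}% of sea level)")
    print(f"Mars surface: {MARS_SURFACE:,} Pa "
          f"({100 * MARS_SURFACE / SEA_LEVEL:4.2f}%)")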

But in my mind, the soldier’s story was set in space, on Mars, from the beginning. And the more I planned and wrote, the hollower—more facile and silly—the story became. At a certain point, I just didn’t believe or trust in my own imagination. And so the writing process just … stopped.

Last year, when I finished The Divina in the Troupe, which is the sequel to The Children of Possibility and completed that three-book mini-series, I was casting around for what story in my imagined lineup to work on next. The young soldier on Mars was neck-and-neck with a third book in the ME group, which would address not two but three copies of the program and deal with some crisis in the network. But that story is still undeveloped in my mind, and I’m not sure the world really needs another dose of a smart-aleck AI who first endangers and then saves the world.

I have other book ideas, but they are even less developed, just glimpses of an idea without plot or characters. And frankly, with the way sales have been going on my previous books, my sense of urgency—if not my dedication to the writing craft—has begun to wane.

But through all that I was still able to write and post my weekly blog on this website. Except … the coronavirus shutdown has me heartsick over both the growing death rate and the effects on the economy, as I wrote in my last blog. My own life situation hasn’t changed all that much: get up, walk the dog, eat breakfast, clear my emails, check the web, do a bit of writing—or, these days, not—then walk the dog, eat lunch, check the stock market, read or nap, do a bit more writing, walk the dog, eat dinner, binge-watch a few shows or a movie, walk the dog, then go to bed and read until it’s time to roll over and turn out the light. Once or twice a week I shop for groceries and go for a motorcycle ride. Once a month or so, I would visit family—a practice now in abeyance because of the quarantine. Otherwise, I was already pretty much locked in place.

But when the novel in hand stalled, and the whole world went into quarantine and, well … amber, my impulse to write about politics and economics, science and religion, or various art forms just died out. My blogs usually start with some persistent thought that intrudes on my mind, usually related to one of those three topic areas, which I then need to sit down and write out in order to explore my thinking. But the word-generator in my brain that throws up these proto-discussions just … shut down. The closest I’ve come in months was a few nights back, when I woke up at two in the morning to list the various transfers of kingship in England through the Wars of the Roses and the reason why Henry VIII was so eager to get a male heir. And that’s a story anyone can read about without my help or insight.

I’m trying, charitably, to think of myself as being in a fallow period and not indulge the D-word, let alone the B-word.4 After all, since I was laid off at the biotech in 2010, I’ve been writing hard, producing approximately one novel and fifty blogs each year. So perhaps I’m due for a break. And perhaps, after I give my brain a rest, I will come back with a fresh view on Mars, or the ME character, or some other future war for that young soldier, or something even better and more exciting to write about.

Or that’s my hope.

1. “Top dead center” refers to an internal combustion engine that stops with the piston all the way up at the top of the cylinder—or down at the bottom, which also works—so that any pressure on it just pushes against the vertical connecting rod and bearing without forcing the crank to move one way or the other. Modern, multi-cylinder engines almost never get caught this way, because while one cylinder might be at top or bottom, others are at different positions in the cycle and can move the crank.

2. The atmosphere on Mars would pass for a pretty good laboratory vacuum on Earth. The “air” is too thin for any kind of airfoil or rotor to lift any appreciable mass. So traveling across the Martian surface would be by ground vehicle or some kind of short-hop rocket. This would make human travel and physical commerce between different sites difficult, time-consuming, and expensive—about equal to, say, going from Boston to New York or Philadelphia in colonial times by horseback and wagon or stagecoach.

3. However, the thin atmosphere on Mars—only a partial vacuum—might allay the problem of electrostatically charged dust clinging to every surface, which plagued the astronauts on the Moon.

4. The “D” is for depression. I’ve never actually been diagnosed with or treated for it, but my late wife once suggested that I might be suffering from depression. And since her death I’ve certainly had my down periods—but those are situational and not clinical.
       The “B” is for writer’s block—which I don’t actually believe in. Supposedly, when this strikes, a writer has lots to say but is inhibited from saying it for some other reason. I’ve never felt that kind of stoppage. If I sit down at the keyboard and can’t write, it’s because my subconscious knows that my thinking on the subject is not yet complete or fully developed, and whatever I tried to write would be a waste of time and would have to get ripped up and rewritten anyway. And that may be what’s going on here: my subconscious isn’t happy with the books that my forebrain and my strength of will have put on the writing schedule, and so it wants me to do something new.

Sunday, April 12, 2020

The Bonfire of the Coronavirus

Coronavirus

Image of the coronavirus taken with an electron microscope
(Credit: U.S. National Institutes of Health/AP/Shutterstock)

I probably shouldn’t write this. It goes against some of the themes I addressed in the essay two weeks ago. Instead, I should probably write something about some aspect of evolution, astrophysics, or other interesting (to me) areas of science. Or something merely tangential to the underlying conditions of our current politics and economics. Or something about writing novels—which I should also be doing right now. But I am too heartsick at the present state of affairs to do any of that. So here goes.

“Did IQs just drop sharply while I was away?”1 We have a virus of unknown but suspect origin, of unknown but worrisome characteristics concerning its transmissibility, symptoms, detection, and causes of mortality, and of unknown presence in the general population due to lack of universal testing. Some people claim that the “bad cold” with a “persistent cough” that was going around in December and January—I myself had a bout—was this same coronavirus but of unrecognized origin and status. Some people claim that this is a bat virus under study in a Chinese virology lab that got out and into the general population. Some still think it was a crossover from a bat in the nearby “wet market.” And the Chinese government is claiming this is a U.S. bioweapon that we launched against their country and that has now come back to bite us in spades. Meanwhile, some U.S. senators were given a confidential briefing—presumably about this virus—and immediately dumped their stock portfolios, as if the end times were at hand.

Everybody knows. Nobody knows. We live in contaminated times.

Back in March2 I was not unduly alarmed when the six Bay Area counties—closely followed by the governor of California, for the whole state—ordered us all to “shelter in place” so as to “flatten the curve” of viral transmission. That made sense at the time. I gave the process two weeks, maybe three, just until the crisis had passed. I thought we could all do without sit-down meals in restaurants, movies in the theater, and shopping expeditions at the mall for at least that long. We could put up with the schools being closed for a week or two, or even to the end of term. At the time, the president thought things would probably be back to normal by now.

But as I write this (admittedly about a week before the above posted date), and now in the third week of the general shutdown, and with the Bay Area counties extending their sheltering orders for another month, to the beginning of May, the emergency measure is starting to look like a new economic reality. People who worked in those restaurants, movie and sports complexes, shopping malls, and so many other places deemed “non-essential” services are now being laid off. Utilities, once a safe refuge in the stock market, are taking a hit because they expect to experience massive non-payment of their bills in April. Banks are preparing for people to stop paying off their mortgages, and landlords for people to come up short on the month’s rent—all while some cities are already ordering a halt to all foreclosures and evictions. The consumer economy that was doing so well in January and early February just hit a wall and came to a stop.

And now people are generally suggesting that the shutdown and its effects could probably go on—should go on, will go on—for another three months at least. That takes us up to June or July. That will mean massive layoffs in collateral industries—I’m already hearing projections of double-digit unemployment—and jobless claims not experienced since the Great Depression. That will mean total disruption of personal and family finances, of mortgage banking, of the real estate market, and of the country’s service infrastructure. For the past couple of years, I’ve been hearing in the financial news that the strength of the U.S. and even the global economy is the U.S. consumer: we keep buying the world’s goods—the new car, the next iPhone, the latest fad, and then its upgrade—and this keeps the wheels of the world economy turning. Well … if you shoot the U.S. consumer in the head, all of that goes away, doesn’t it? This is an economic collapse that a mere two trillion dollars in emergency funding won’t cover.

If the shutdown goes on for eight to eighteen months—as some health officials are promising, to keep the currently isolated and uninfected population from ever experiencing the effects of this virus until a vaccine can be manufactured, tested, and put into general use—then we risk a political collapse as well as an economic one. Eventually people are going to venture out, and to keep them contained will require curfews, active enforcement by police and safety personnel, and ultimately martial law. Already, what were framed as voluntary shelter-in-place measures are now, in some areas, backed up with misdemeanor charges and fines for non-essential services that are still operating and even for isolated people who go out to exercise. The mayor of Los Angeles is now asking average citizens to anonymously report on neighbors who leave their homes. How soon before we have the National Guard patrolling the streets?

This is madness—or at least portends it. This is throwing our economy, our lifestyle, our culture on the bonfire of primal fear over a set of infection and mortality statistics that is widely variable by country and by state. Those who say that no amount of economic damage—mere money, just “the stock market”3—matters if it saves one life are not counting the cost in ruined lives and displaced families. Future fear is about to become present hurt.

Maybe we don’t deserve the good economy—more jobs, more income, more opportunities—that we’ve enjoyed over the last decade. Maybe it was all a dream, a fantasy, an illusion. But that would be a real shame. Because the downside is going to hurt more than we can imagine.

1. One of my favorite quotes from the eminently quotable movie Aliens.

2. As if the burgeoning health crisis and the stock market crash that followed it are now lost in the mists of February …

3. Where the market represents the valuation of, and ultimately the economic health of, the organizations that provide most of this country with jobs, material wealth, and retirement savings.

Sunday, March 29, 2020

The Limits of Accountability

Coronavirus

Image of the coronavirus taken with an electron microscope
(Credit: U.S. National Institutes of Health/AP/Shutterstock)

I generally try to keep these blogs (or essays, or meditations, whatever) away from absolute topicality and from following the news of the day in short order. My concern is the longer view, the background view, the why rather than the what of events. But the past few weeks have been extremely disturbing to all of us—emotionally, mentally, physically, and financially.

We have seen a virus of unknown character as to its incubation time, severity of symptoms, transmission rate, and mortality arise and spread around the world in a matter of months—and perhaps, because of initial attempts at hiding the crisis, within just weeks. That has been one problem: governments, scientists, journalists, and anyone with social media access have lied, exaggerated, imagined, and spun counter-factual accounts (okay, “fake news”) about this virus and its effects. We have gotten comparisons—some real, some bogus, and some irrelevant—with the mortality associated with the Spanish influenza pandemic of 1918, the H1N1 influenza pandemic of 2009, and the caseload and mortality of the yearly seasonal flu—as well as with deaths by gunshot and automobile. We hear that young people may get the disease and remain asymptomatic but still be carriers. We hear that older people and anyone with systemic vulnerabilities will likely get it and die.

In response to all this, various state governments around the world and in the U.S. have locked down their populations. In California, we live under a shelter-in-place order that has emptied the streets, reduced restaurants to take-out service only, closed all entertainments and public gatherings, and supposedly limits travel for non-essential workers to visiting the grocery store and pharmacy. Many other states have followed suit in this country. This has disrupted the local economy for goods and services, crimped the national economy for travel and tourism, and forced every business and organization to reevaluate its most basic assumptions and activities. The result has been people trying to stock up like doomsday survivalists and emptying grocery store shelves—especially of toilet paper, which seems to be everybody’s first priority.

As a personal experience, my local Trader Joe’s, where I went to do my weekly shopping on Monday, has instituted entry controls, attempting to limit store occupancy to one hundred customers at a time. A clerk at the entrance monitors the queue outside and only lets people enter when a clerk at the exit signals someone has left the store. The queue path along the sidewalk is marked off with chalk at six-foot increments for “social distancing,” and we all advance by two giant steps each time someone up front enters. Inside the store, people are orderly and even pleasant—but at a distance. The number of Trader Joe’s personnel now almost equals the number of customers, and the shelves are reasonably stocked. At the register, a sign limits purchases to just two of any one item to prevent hoarding—although the checker let me get by with my weekly supply of six apples, six yogurts, and four liters of flavored mineral water. It wasn’t a bad experience, but it was sobering: we all seem to be taking these restrictions on our movements very calmly.

Health officials would like to see this personal lockdown extended for two, or eight, or perhaps eighteen months in order to “flatten the curve” of the virus’s exponential spread and keep the infected population from exploding as it apparently has done in China, Italy, and Iran. If these experts are right about the need for extending the restrictions, then local economies will crash. Small businesses, many large businesses, and whole industries like hotels, travel, and entertainment will go bankrupt or disappear. Unemployment will reach Depression-era levels, if not greater. China locked down its entire economy—or so it’s said—and dropped its gross domestic product by thirty percent—or so it’s said.

Because of uncertainty about all of this—fears of massive infection rates and millions of dead, the looming prospect of a cratered economy and worldwide depression—the stock market lost a third of its value in two weeks, ending the longest bull market in a sudden and dizzying bear market. The bond markets also crashed. The price of oil collapsed—although this had help from a price feud between the Saudis and the Russians. Gold prices spiked and then relapsed. There has been no safe place to invest in all of this turmoil.

The point of my bringing up these events in such detail is that we may have reached the limits of human accountability in a world still driven by natural forces. Whether the novel coronavirus—that is, this unknown version of a known type of virus—is the unfortunate meeting of a bat and a pangolin in a Chinese wet market, or the intentional creation of a weapon in a biosafety Level 4 lab, it still spreads by the vulnerabilities of the human immune system, the vagaries of human touch, and the viability of its own protein coat. Airline travel—which is virtually instantaneous these days, compared to horseback and sailing ship—allows the virus to move farther and faster before it touches down in a population and blooms with disease, and there it spreads in ways that are still hard to stop.

Today we all live with awareness of our scientific, medical, and technical capabilities, and so with a consciousness of moral and civilizational superiority, compared to earlier times and less-developed places. Our past success with vaccines against viral diseases like polio and measles makes us believe that we should be able to quickly and easily prevent and treat this disease. We become impatient with diagnostic and pharmaceutical companies that can’t produce a rapid test or a vaccine within a matter of weeks.

We are capable of wielding such enormous economic power and organizational resources that we tend to believe we are immune to natural disaster. And so when hurricanes and earthquakes strike, or a virus comes into the population, we fault the response of the Federal Emergency Management Agency, the Red Cross, the National Guard, and federal, state, and local governments as inadequate to the task. Someone must be at fault for this.

We look at previous civilizations and historic events like the Spanish Flu, the Black Death, the eruption of Vesuvius, or the storms that swept the Armada’s galleons off course, and believe we are superior. Because we understand the nature of viruses and bacteria and their role in disease, or the nature of plate tectonics and its role in earthquakes and volcanoes, or the weather patterns that create typhoons and hurricanes, we think we should be able to prevent, treat, and immediately recover from their effects. And if we do not, we blame the experts, the government, the organizational structures that have been built to protect us. Someone should be held accountable.

The fact is, we are still relatively helpless. Humans are not the masters of this world, only its dominant tenants. We are still subject to the unpredictable movements of its lithosphere, its atmosphere, and the other inhabitants of its diverse biome, including the tiniest specks of DNA and RNA wrapped in a layer of reactive proteins.

No one gets the blame. Everyone is doing their best. And we all die eventually.

Sunday, March 15, 2020

Harry Potter’s Broom

Nimbus 2000

Harry Potter’s broom

I enjoy many stories, novels, and movies based on magic and magicians—the kind where magic is a real force, not a stage performance. But I have always resisted writing about magic as if it were real and not, in Arthur C. Clarke’s words, a “sufficiently advanced technology.”1

The problem, as I see it, is that I have too practical and inquiring a mind. Being the son of a mechanical engineer, grandson of a civil engineer, having worked all my life with engineers and scientists, and being good at asking questions and keeping my ears and mind open, I have a feel for the way things work in the real world. Which means I can just about smell a technical problem without having to take measurements.

So … Harry Potter’s broom raises an interesting question. In the Wizarding World, is it the broom that flies, and the person simply steers or wills it to fly in a certain direction at a certain altitude? Or is it the person that flies, and the broom is simply an adjunct, a supplement to his or her powers, perhaps functioning as some kind of talisman?

The reason I ask is one of balance. A person perched on top of a broom has his or her center of mass positioned above the shaft of the broomstick.2 In that condition—as I know from personal experience with the inertial dynamics and all the postures and gestures involved in riding a motorcycle—your balance would be severely compromised. Like a ship whose center of gravity rises above its metacenter, the whole rig will tend to turn over.
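
To put a rough number on that instability, here is a toy calculation, not anything from the books, treating the rider as a point mass above the shaft; the mass and offset are made-up illustrative values:

    import math

    # Toy version of the balance argument: treat the rider as a point mass
    # whose center of gravity sits a distance D above the broom shaft, which
    # acts as the support axis. Any roll away from vertical produces a torque
    # that grows with the angle, so sitting upright is an unstable equilibrium.
    G = 9.81      # gravitational acceleration, m/s^2
    M = 70.0      # rider mass in kg (an illustrative guess)
    D = 0.4       # height of the center of mass above the shaft in m (ditto)

    def roll_torque(roll_deg: float) -> float:
        """Gravity's overturning torque about the shaft, in newton-meters."""
        return M * G * D * math.sin(math.radians(roll_deg))

    for angle in (0, 1, 5, 15, 45, 90):
        print(f"roll {angle:>2} deg -> torque {roll_torque(angle):6.1f} N*m")
    # The torque is zero only at 0 degrees (balanced on top) and at 180
    # degrees (hanging under the shaft), and only the hanging position is stable.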

So why does a witch or wizard riding a broomstick—either in the Harry Potter world or in the traditional Salem and Halloween sense—with only her or his legs hanging below the shaft, and the rest of the body’s mass above it, not turn over? Why don’t we see these people flying upside down and hanging onto the broom for dear life?

The question is pertinent because I don’t think that—to the extent authors who deal in magic and flying broomsticks are actually thinking this matter through—the person is flying and only using the broom as a talisman. We’ve seen comic scenes, particularly during Quidditch games, where a player is knocked off his or her broom and must hang on, two-handed and legs flailing, underneath the floating broom while he or she tries to climb back aboard. Clearly, the broom and not the human is doing the actual lifting and flying.

So why isn’t the rider flying upside down? Does the broom have a preferred side or orientation? Do the laws of physics cease to operate in the vicinity of the broomstick?3 Or does it have something to do with the positioning of the rider’s hands and legs and the strength of their grip on the shaft?

It’s all a mystery, as magic should be. Still, inquiring minds want to know.

1. The whole quote is “Any sufficiently advanced technology is indistinguishable from magic.” And that is the basis of much good science fiction.

2. Don’t be fooled by the wire stirrups in the picture, as if they anchored the rider in any preferred position. Mass is mass and finds its own center of gravity. Just ask any horseback rider who, with or without stirrups, experiences a broken saddle girth.

3. Well, of course!

Sunday, March 1, 2020

A Material World

Buckminsterfullerene

The Buckminsterfullerene

In the movie Star Trek IV: The Voyage Home, Scott and McCoy try to find a light and strong material with which to build a giant seawater tank in the hold of their stolen Klingon ship. They locate a manufacturer of plexiglass in 20th-century San Francisco and offer him the formula for “transparent aluminum,” a material from the 23rd century. They assuage their consciences about temporal paradoxes by suggesting, “How do we know he didn’t invent it?”

Well, he didn’t. The crystals in many of today’s quality watches, including my upgraded Apple Watch, are made from synthetic sapphire. Since the composition of sapphire is corundum, or crystalline aluminum oxide (Al2O3)—the same material from which, in powder form, metallic aluminum is smelted—along with traces of iron, titanium, chromium, vanadium, or magnesium depending on the gem’s color,1 you could easily say that this crystal, which is durable, lightweight, strong, and scratch-resistant, is indeed “transparent aluminum.”

Synthetic rubies and then sapphires were invented in 1902 by French chemist Auguste Verneuil. He deposited the requisite chemicals in the requisite combinations on a ceramic base by passing them through a hydrogen-oxygen flame, melting and crystallizing the alumina. So far, we can make watch crystals and synthetic gemstones with this process. Whether it is scalable for fabricating whole spaceships is another question. But the technology is young yet.

If you are a dedicated browser among the pages of Science and Nature, as I am, with forays into Scientific American and Popular Science, you know that the world of materials science is hot right now. And the element carbon is getting a resurgence—but not as a fuel.

Carbon has the happy ability to bond with many different atoms including, sometimes, itself. Its four covalent bonding points allow it to share single, double, and even triple bonds with other carbon atoms, often forming chains and hexagonal rings that are the building blocks of organic chemistry and so the basis of all life on this planet. These rings and chains leave room for adding other atoms and whole other molecules, making carbon the backbone of the chemical world’s Swiss Army knife.

What modern materials scientists have discovered is that bonding among carbon atoms can be induced in several structural forms. We are all familiar with the three-dimensional, tetrahedral-shaped crystal of a diamond, whose bonds are so strong that they make it one of the hardest materials known. But those atoms can also be knit into fibers, which are then stabilized and supported in an epoxy resin to create a material that is light, strong, and useful in many applications, sometimes replacing steel. The carbon atoms can also form two-dimensional, hexagonal structures laid out in endless sheets, called graphene, which are strong and supple even at a single atom’s thickness.2 Or smaller sections of those sheets can be bent into nano-scale tubules, which are even stronger than the carbon fibers and have interesting chemical uses. And finally, the carbon atoms can be joined into microscopic soccer ball–like molecules, made of twenty hexagons and twelve pentagons with the formula C60 (pictured). This is the buckminsterfullerene—named after the architect Buckminster Fuller, who popularized the geodesic dome, a spherical structure of similar configuration.
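
As an aside, that count of twenty hexagons and twelve pentagons is not arbitrary. A few lines of arithmetic show that the pattern closes up into exactly sixty carbon atoms and satisfies Euler’s formula for polyhedra:

    # Quick check that 20 hexagons and 12 pentagons really do give a C60
    # molecule. Each edge is shared by two faces, and each vertex (each
    # carbon atom) joins three faces, just as on a stitched soccer ball.
    hexagons, pentagons = 20, 12
    faces = hexagons + pentagons                     # 32
    edges = (6 * hexagons + 5 * pentagons) // 2      # 90
    vertices = (6 * hexagons + 5 * pentagons) // 3   # 60 carbon atoms
    print(vertices, edges, faces)                    # 60 90 32
    print(vertices - edges + faces)                  # 2, Euler's V - E + F = 2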

Graphene is not only strong but it is also electrically and thermally conductive, useful for dissipating heat. It has a high surface-to-volume ratio, which means it can be used to make batteries and fuel cells more efficient. It holds promise for flexible display screens and solar photovoltaic cells. And as an additive to paint and in other surface preparations, it can increase resistance to wear and corrosion.

Carbon nanotubes, which are also electrically conductive, can be used in radio antennas and as the brushes in electric motors. Being biodegradable, they can be used in tissue engineering for bone, cartilage, and muscle. Because they are easily absorbed into cells, they can carry other molecules such as medicines as well as protein and DNA therapies. Spun into yarn, the tubes would offer superior strength and wear in clothing, sports gear, combat armor, and even in cables for bridges and for space elevators—imaginative projects that have been proposed for hauling people and cargo up to geosynchronous orbit.

Buckyballs have potential uses as a drug delivery system, as lubricants that will resist breaking down under wear and heat, and as catalysts in chemical reactions. As a medicine in itself, the C60 fullerene can be used as an antioxidant, because it reacts with free radicals.

And that is just some of the potential for various pure forms of carbon.

Work on the genetics of plants and animals other than humans will have far-reaching effects, too, in terms of biomimetic materials. For example, spiders produce a raw silk that they spin into a strand with a tensile strength greater than steel, weight for weight, and more fracture resistance than the aramid fibers, such as Kevlar, used in body armor.3 We could farm spiders for this silk, the way we do silkworms for their cocoon fibers, except that spiders in captivity will eat each other. But several companies are now working on creating synthetic spider silk.

Another area ripe for development is natural latex, the basis of all our rubber products. Rubber trees are native to South America, where they naturally grow in splendid isolation because a fungus-based leaf blight destroys any trees that grow too close together. Attempts by Ford to create a rubber plantation in Brazil in the late 1920s failed because of this blight. Virtually all of the world’s natural rubber currently comes from trees grown on plantations in Southeast Asia, where they survive only with the strictest vigilance—cutting and burning whole plantations at the first sign of blight—and government control of imported plants and vegetables.

Natural rubber is essential to modern life. Synthetics based on petroleum chemistry, like styrene-butadiene, are less resilient and elastic. A natural rubber tire can thaw from being frozen in the wheel well of an airliner at 35,000 feet in the time it takes for the plane to descend and land, while a synthetic-based tire will remain frozen and shatter upon impact. So discovering a genetic formula for latex and being able to extrude it in the same way the rubber tree weeps its sap would be a godsend.

One of the unsung stories of our modern life is the nature of our materials. They are not just getting cheaper but also lighter, stronger, and better. And this is only the beginning.

1. Just about every color but red. And a red crystal of virtually the same composition is called a ruby.
    Emeralds are a different material, however, based on beryl, which is composed of beryllium aluminum silicate (Be3Al2Si6O18) in hexagonal crystals with traces of chromium and vanadium.

2. The graphite in pencil “leads” is not chemically lead but a pure form of carbon. Small bits of what we now call graphene are layered into a three-dimensional composite, like the layers in sandstone or shale.

3. Interestingly, spiders that are fed a diet of carbon nanotubes make a silk that is even stronger, incorporating the tubules into its protein microfibers.

Sunday, February 23, 2020

Ancient Computers

Perpetual calendar

About twenty years ago, I used to keep on my desk—partly as ornament, partly paperweight, and somewhat as a useful device when I was writing fiction about the near future—a perpetual calendar like the one pictured here. It was a simple device: You align the month on the inner dial with the intended year on the outer dial, then read off the dates for the days of the week in the window at bottom. It took a minute to set, being mindful of leap years, and gave accurate readings over a span of fifty years.

This was a form of computer with no electronics and only one moving part. It was the sort of thing we all once used to find specific information, before we could carry a computer the size of a deck of cards in our pockets—and now wear one the size of a matchbox on our wrists.
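
In fact, the whole fifty-year dial boils down to a single piece of modular arithmetic. Here is a minimal sketch of the same calculation in Python, using Zeller’s congruence for the Gregorian calendar (the function name is my own):

    # The arithmetic a perpetual calendar embodies: Zeller's congruence
    # for the Gregorian calendar.
    def day_of_week(year: int, month: int, day: int) -> str:
        """Return the weekday name for a Gregorian calendar date."""
        if month < 3:                # January and February are treated as
            month += 12              # months 13 and 14 of the previous year
            year -= 1
        k, j = year % 100, year // 100
        h = (day + (13 * (month + 1)) // 5 + k + k // 4 + j // 4 + 5 * j) % 7
        return ["Saturday", "Sunday", "Monday", "Tuesday",
                "Wednesday", "Thursday", "Friday"][h]

    print(day_of_week(2020, 2, 23))   # Sunday, the date at the top of this post

Which is exactly what the little dial on my desk did, fifty years at a time, with no battery.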

I am of two minds about this, because I have always loved small devices with screens and keyboards. That love goes back to the first toy that my father made for me when I was about three years old. It was a box with a series of toggle switches and a line of small lights with red, green, yellow, and blue lenses. When you threw the switches, the lights would come on in different orders. It did nothing useful, except fascinate a small child. But it fixed my mind in a pattern that endures to this day.

Ever since the dawn of the Microprocessor Age, I have been chasing the ultimate handheld computer. It started with the first “personal digital assistants,” or PDAs, usually made in Japan and with crippled keyboards that required you to hold down three not-all-that-obvious keys to get a capital letter or a punctuation mark. Being a book editor and a stickler for form, I laboriously worked to get the right spelling and punctuation in my entries, so using the thing productively took forever. I adopted, in rapid succession, a Palm Pilot—where you spelled everything out with a stylus or your fingertip—and then a variety of Hewlett-Packard calculators and tiny computers, chasing that holy grail.

My first cell phone had the traditional telephone arrangement of numbers and letters as a limited form of keyboard. That is, the digits 2 to 9 were each accompanied by three letters in sequence from the alphabet.1 You could store people’s names and numbers in the phone’s memory by “typing” them in using the keypad: Press 2 once for A, twice for B, three times for C, and wait a bit for the phone to sort out the right code and show your desired letter. And, of course, there were no lower-case letters or punctuation. It was really easier to keep your phone list separately, in a booklet or on a piece of paper, except that then your friends wouldn’t be on speed dial. But I digress …
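
For the curious, that multi-tap scheme takes only a few lines to sketch. The key layout below is the standard one; the little function is just my illustration:

    # Toy sketch of multi-tap text entry on an old phone keypad: press a
    # digit once for its first letter, twice for its second, and so on.
    KEYPAD = {
        "2": "ABC", "3": "DEF", "4": "GHI", "5": "JKL",
        "6": "MNO", "7": "PQRS", "8": "TUV", "9": "WXYZ",
    }

    def taps_for(word: str) -> str:
        """Spell out the key presses needed to enter a word."""
        presses = []
        for letter in word.upper():
            for digit, letters in KEYPAD.items():
                if letter in letters:
                    presses.append(digit * (letters.index(letter) + 1))
                    break
        return " ".join(presses)

    print(taps_for("Tom"))   # 8 666 6  (one press of 8, three of 6, one of 6)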

Before we had computers at our fingertips, we had all sorts of handy ways to work out useful information.

The oldest is probably the Antikythera mechanism, a device of bronze with geared wheels, now encrusted with corrosion and coral, discovered in a shipwreck off the Greek island of Antikythera in 1901. It has since been dated to about 200 BC, and x-rays of the gears and a reconstruction of their turning suggest that the mechanism was used to calculate astronomical positions and possibly to predict solar eclipses. The corollary would be the modern mechanical orrery, which dates back to the early eighteenth century and shows the positions of the sun and planets at any particular point in their continuously revolving orbits.

But mechanical representations of physical conditions are not the only form of ancient computer.

When I was compiling engineering resumes at the construction company, I came across a man whose work responsibilities included drawing “nomographs.” At first, I thought this was a typo and that he must actually be writing monographs—a literary pursuit, but an odd one for an engineer. Further checking revealed that, no, he really did make nomographs, also called nomograms. These are two-dimensional diagrams representing a range of variables associated with a mathematical function, usually shown as number sets along two or three parallel lines. Rather than solve the function mathematically, all an inquiring engineer had to do was lay a straightedge through the known values on two of the scales and read the answer where it crossed the third.
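
The geometry behind the simplest version is surprisingly plain. For a parallel-scale nomogram of the equation a + b = c, the two outer scales are marked off identically, the middle scale sits halfway between them, and the straightedge crosses that middle line at the average of its endpoints; label the middle scale with twice its height and you read off the sum directly. A toy sketch of that geometry, with a layout of my own choosing:

    # Toy model of a parallel-scale nomogram for the equation a + b = c.
    # Three vertical scales stand at x = 0, 1, and 2, with value v drawn at
    # height v on the two outer scales. A straight line from (0, a) to (2, b)
    # crosses the middle scale at height (a + b) / 2, so the middle scale is
    # simply labeled with twice its height.
    def middle_crossing(a: float, b: float) -> float:
        """Height at which the straightedge from (0, a) to (2, b) crosses x = 1."""
        return (a + b) / 2

    def read_sum(a: float, b: float) -> float:
        """The value printed on the middle scale at that height: a + b."""
        return 2 * middle_crossing(a, b)

    print(read_sum(3.5, 4.25))   # 7.75

Mark the scales logarithmically instead, and the same straightedge multiplies rather than adds, which is how the engineering versions handle products and ratios.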

As a form of computer, the nomograph is just a little more complex than a table of common logarithms2 or a telephone book—closer to a database than a calculation. And if we’re going to call a computer any device that gives you accurate astronomical readings, then a sailor’s sextant for “shooting the sun” at noon to determine latitude—and before that the astrolabe for calculating the angle between the horizon and the North Star—are also in the running as early “computers.”

But the point of this meditation is not to show how clever ancient peoples were but how much we are losing in the digital age. Orreries and sextants are now mechanical curiosities and decorative artifacts—the one lost to telescopes and observational satellites tied into much more sophisticated computer modeling, the other lost to satellite-based global positioning systems (GPS). Nobody draws nomographs anymore. The phone company doesn’t even publish the Yellow Pages anymore, or not on paper. Everything is online. And I can answer texts from friends by drawing letters with my fingertip—in both upper and lower case, with punctuation—on the crystal face of my Apple Watch, which doesn’t even need a keyboard.

The knowledge of the entire world along with real-time information, like GPS positioning and footstep counting with conversion to calories, is in our pockets and on our wrists. And that’s a wonderful thing.

But when the batteries die—or when future archeologists dig my Apple Watch out of a shipwreck, corroded with salt and perfectly nonfunctional—we will be left with lumps of silicon and dozens of questions. Who will draw the nomographs then?

1. Except for the 7, which picked up P, Q, R, and S, and the 9, which had W, X, Y, and Z. Presumably Q, X, and Z weren’t expected to get much use.

2. And a little less complex than a slide rule.