Sunday, May 31, 2020

Life as a River

River rapids

The river as a metaphor for human life is one of the oldest clichés.1 But sometimes life imitates metaphor, and this time may be one of them.

When you’re traveling on a river that is broad, flat, and smooth, sometimes you hear a distant roaring, the sound of rapids and maybe even a waterfall ahead. You can’t turn around, you can’t stop the flow—either of the river or of time itself. All you can do is move forward in hope and confidence, knowing that your skills and courage might let you survive whatever drop in elevation and resulting rapids and rocks the river ahead throws at you. Or not, and then you will be beyond caring, because you are drowned and dead.

And if you could find a way off the river—pull to shore and abandon the water and its currents, stop time itself—all you would be doing is either placing yourself in limbo or joining another river. And there you might not even hear the falls before you went over them.

When I was growing up, my parents taught a severe form of bravery that was in one sense pure fatalism. Some bad things are just going to happen, so you might as well face them and get it over with. Or, as my mother would say when we had to clean up certain messes, “If that’s the worst thing you ever have to put your hands into, consider yourself lucky.” Life is hard, and there’s no use in cowering, because whatever lies ahead will come to you anyway.

But these were people who, despite having been raised in loving and relatively well-to-do families, graduated from high school into the start of the Great Depression, then graduated from college at its depths, and finally moved forward into World War II. In those days, if you weren’t strong, capable, flexible, and emotionally resilient, you collapsed under the weight of your own fear and despair.

I’ve experienced a similar fatalism—and written about it elsewhere—while riding a motorcycle.2 Sometimes you are faced with a difficult, decreasing-radius curve, or you take a bad line through any curve, or you discover an obstacle lying in the road around a corner. Or sometimes another driver cuts you off and suddenly truncates your path to safety. There is no way, on a motorcycle, to stop time, to reconsider, to take measurements, and then to lay out, analyze, and choose among all the available options. You just have to deal with what’s coming in real time, rely on all your learned skills and reflexes, hope for the best, and choose to have no regrets.

Twenty-twenty has proven to be a time something like the decade my parents faced. We started off in January with a hugely successful economic environment, low unemployment, and bright prospects. Then a novel virus with unknown but frightening prospects for transmissibility and lethality—and with remarkable differences of public and professional opinion, and of quoted statistics, even among scientific and political experts, regarding its actual effects, extending even to its origins and possible human development—took the country, the world, and the global markets into economic standstill, if not freefall. In this country we were already in the midst of political turmoil, with one party declaring open “resistance” to a legally elected president, even challenging that election itself because, as sometimes happens, the popular and electoral votes did not coincide.

In the midst of what is shaping up to be another Great Recession, if not an economic malaise worse than the Great Depression, we are headed into a national election that is sure to be contested. If President Trump is re-elected with another minority popular vote, or amid valid claims of voter suppression, or any whiff of foreign collusion and interference, then the Democratic, progressive left will explode. If the prospective Democratic candidate Joe Biden, who appears to labor under some obvious mental handicaps, is elected along with a vice president largely chosen by the party to serve out the four-year term in the event of his incapacity, and with any hint of vote fraud or “ballot harvesting,” then the Republican, conservative right will explode. In any event, the strictures of the pandemic may lead some to call for the election itself to be postponed or delayed indefinitely—and then everyone’s head will explode.

These are difficult times. We are exposed to medical, political, and economic stresses that I have not experienced in my long years as a politically conscious adult. And as I have expressed recently, I don’t know what the future will bring. I never thought, as I was entering the placid delta of my life, with the beckoning, anonymous sea and its promise of dissolution just ahead, that I would hear the roar of rapids in front of me. At this point, all I can do is lighten my load, tighten my straps, firm up my grip on the paddle, and get ready to ride the river.

1. You are born in a spring that seeps from a hillside, far above the plains; spend your youth tumbling over rocks and rills, suffering the pains of early childhood and adolescent development as a physical and social being; enter the broad stream of experience, skills, and achievements as a competent adult; sometimes become trapped in a lake, where the stream no longer carries you forward, so that you must paddle hard to get anywhere; end up in the still, sluggish waters of the delta in late age, with all your forces spent; and finally get flushed out to dissolution in the great and anonymous sea. It’s a metaphor that ultimately tells you nothing new about life.

2. Much of this also applies while driving a car, but inside the steel cage you are a little more distanced from cold, hard reality, and your knees are a bit farther from the hard, unforgiving pavement.

Sunday, May 24, 2020

The Most Stable Government

Double eagle

I hate to say it—and I mean this line of thinking offends me as a “little-D democrat”—but the most stable form of government in human history is hereditary monarchy. Hands down, it wins the race as the longest-running, most often chosen, quickest-to-revert-to form of political organization. It would seem to be the natural way for human beings to govern themselves, the hierarchical imperative.

I do not say it is the best form of government. Or that it’s the fairest, most efficient, or most rational form. Just that it is the most stable—though not always even that in the short term. It’s the form that every society keeps coming back to.

Ancient Rome from its founding had seven kings,1 and they were deposed in favor of a democratically based republican form of government that lasted almost 500 years. The Republic was a system of meritocratic personal advancement through a course of political, military, and religious offices, culminating in election to a shared executive function, the consulship, that a man might hold only once in ten years. The Romans were deeply allergic to the idea of kingship, so much so that when they had to resort to a single leader holding extraordinary powers during a crisis, they instead used the term dictator. (This was simply Latin for “one who dictates,” or gives orders.) And yet, after a series of politically powerful men, having run the “course of honors” and already served their terms as consul, fought for ultimate power using their own armies in the Civil Wars of the first century BC, they adopted a virtual king in the person of the Caesarian imperator, or “field marshal.” (From this we get our term “emperor,” now generally intended to mean a supreme ruler above any number of petty kings and chiefs—of which the Roman Empire had many.)

The Athenian Greeks, the progenitors of our earliest ideas about democracy, veered between elected officials and power-holding “tyrants” for most of what we think of as their ancient Golden Age in the sixth to the fourth centuries BC. But before they had democracy, they had the basileus, or “king.” And the Spartans never had much of a democracy, retaining a dual kingship that ruled alongside a council of “ephors,” or magistrates. After Athens lost the Peloponnesian War to Sparta, and then the whole country was subsumed into Macedonia under Philip II and his son Alexander, rule by hereditary kingship stayed with the Greeks and what remained of the Alexandrian empire until its eventual takeover by Rome.

At the beginning of the 20th century, most of Europe was nominally ruled by local kings (of the Spanish, Greeks, Danes, Swedes, and English, to name a few), or a Kaiser in Germany, or Tsar in Russia.2 Being an enlightened age, most of these kings’ powers were either overseen by or shared with some form of parliament, or diet in imperial Germany and Japan, or duma in imperial Russia. Some kings, like those in England and Sweden, were more social figureheads than persons of power. Some, like those in Germany and Russia, ruled as virtual autocrats—or tried to. Two world wars swept away the actual power of even the most autocratic sovereigns, but in the cases of Russia and Germany the forces that took over quickly produced a new kind of ruler—the Secretary General of the Communist Party in Russia and the Reich chancellor, or simply Der Führer, in Germany—who were kings in all but name. And if either Stalin or Hitler had left children capable in time of succeeding him, there’s little doubt those titles would have become hereditary.

Of course, most of the rest of the world in antiquity and up to modern times has been ruled by kings under one name or another: Pharaoh in Egypt, Sultan among the Turks, Great King in Persia, Emperor in China, and chiefs among the many native tribes of North America or full kings among the urbanized native cultures of Central and South America. When Europeans conquered and attempted to colonize and “civilize” these lands, they eventually tried to bring in some form of parliamentary democracy or Western bureaucracy. But it seldom took hold, except perhaps in India. And China in the 20th century quickly went from the last imperial dynasty to a republic, and then to government by the Communist Party under Mao Zedong, who was the new “Red Emperor” in all but name.

Falling into line under the leadership of one man—or more rarely a woman—and obeying his or her orders seems to be in our human genes, going back to the hierarchical organization of the monkey troop. In moments of crisis—and there is always a crisis, sometime, somewhere—we rely on the proven or probable skills and knowledge of a military, political, or spiritual leader, or whatever the tribe needs. This is rule of the fittest by common consensus. But once that person has tasted power, it’s difficult not to succumb to the temptation of continuing the crisis to stay in power. And this tendency is exacerbated by the leader’s naturally surrounding himself—and sometimes herself—with a cadre of lieutenants, counselors, or acolytes, to whom he or she owes favors and delegates powers in their own right, and from whom he or she exacts loyalty and support in the face of all challengers.

Sometimes, as in the Native American cultures, a tribe might rally around a war leader in times of military struggle and then a political or diplomatic leader or elder in times of peace and negotiation. The tribe’s leadership would be fluid and flexible. But those arrangements would occur in small groups, an extended clan or village, where almost everyone knew every member of the tribe. In larger groups, or groups that have grown larger by conquest, the person of the king becomes isolated, distant, and cloaked in ceremony and privilege. Then the functions of military, political, and sometimes even spiritual leader become blended in a single person. And because people have an innate respect for genes and heredity, it’s easy for a king to promote his eldest or most capable son as heir to the throne. Even if the king dies while the heir is still a child, that cadre of lieutenants and counselors will close ranks around the throne and defend the child’s rights, or promote a regent to serve in power until the child reaches maturity.

This is all very old stuff, going back to patterns laid down in human prehistory. And it works for most people, because democracy as practiced in its ideal form is hard. People have to take time out of their daily lives to take note of and learn about the major issues confronting the tribe or the nation. They have to exercise their vote and make what they believe or hope to be an intelligent choice. Then they have to take responsibility when their candidate wins the election but ultimately fails in action and creates more crisis. They have to get and stay involved. They have to care. In a busy life with not much free time, people get tired of grappling with national priorities and making decisions—especially when most of the time they have to compromise in their views or hold their tongues when the opposite party wins an election and exercises its own version of power.

A king surrounded by appointed counselors and people of rank, who have superior knowledge and together can make decisions for the good of the country, becomes an acceptable form of government. Their decisions might not be the best, or what the average citizen would choose for him- or herself, but they are usually good enough. The system is stable enough to be allowed to continue. And when the country reaches a crisis, when the decisions are bad, then the king’s royal but non-ruling relatives and chief counselors stage a coup, hold an internal war within the capital, and create a new king whom everyone can trust to sort out the mess and get the country back on a good enough footing. The situation is stable—not permanently so, because there are always the occasional coups and interregnums—but stable enough. It soon becomes time-honored tradition.

The current political situation in America calls into question our long-standing traditions under a democratically elected republican form of government. Our constitutional government is now under attack in favor of rule by technical experts appointed to administrative bureaucracies under the Executive Branch. The majority of the rules we now live by are written by cabinet-level functionaries, rather than by elected legislators. The legislators, instead of framing laws we can all read and understand, write loose and sometimes hypothetical “wish lists” or desired “end states,” granting powers to those bureaucracies to write the actual, detailed rules. When the laws that people are supposed to live by are no longer simple, obvious, and readily available, the republic is in danger.

And then the legislators themselves are no longer citizen candidates serving one or two terms as their civic duty. Instead, they have become lifelong officeholders insulated by their extensive staffs and their stronger connections with one party or the other. Their constituencies are defined by incomprehensible district lines engineered to yield a predictable party superiority, called “gerrymandering.” And when that fails, they achieve superiority by the threat or actual practice of voter suppression and ballot fraud. When the average citizen’s vote is no longer equally counted or is rendered meaningless, democracy is in danger.

The Democratic Party devised and used a system of “super delegates” to quash the nomination of Bernie Sanders at the 2016 convention and put in place its chosen candidate, Hillary Clinton. That is a failure of democracy, at least on the party level. In a backlash against the unpopular Clinton—and in part against her unfortunate “deplorables” comment—we saw the populist election of the outsider and demagogue Donald Trump in 2016, a charismatic figure who volubly opposes “the Swamp” of bureaucratic politics. However these anti-democratic forces play out, through repeated soft coup attempts or eventual open warfare, it’s going to be bad for a nation of laws, civility, and the traditional practice of peacefully relinquishing power after losing an election. And when the democratic structure supporting a republic collapses, whether through political crisis or civil war, the likely result is the surviving party establishing some kind of leader figure who is king in all but name.

After the Constitutional Convention of 1787 in Philadelphia, someone asked Benjamin Franklin what the group had created. His reply was, “A republic, if you can keep it.” These days, we may be very close to losing it.

1. Or that’s the tradition. It turns out that those legendary seven covered a span of nearly 250 years, from the city’s founding on seven hills in a bend of the Tiber, traditionally 753 BC, to the expulsion of the last king and creation of the republic in 509 BC. That’s a remarkable span, and the dating of the various kings is inexact, but each of them would have had to rule on average about thirty-five years apiece, in a primitive village founded around the margins of a great swamp. I’m not saying it’s impossible, but unlikely.

2. It’s a commonplace that “Kaiser” and “Tsar” are simply local linguistic forms of the original “Caesar,” showing how deeply the idea of emperorship and the name of Rome’s first incumbent marked European thinking.

Sunday, May 17, 2020

A Fly in Amber

Fly in amber

Right now, I’m stuck. I used to think this was a temporary condition, with my brain caught at top dead center.1 Now I feel like a fly in amber, with my brain trapped in a hopeless yellow fog.

The trouble started on Christmas Eve 2019, when I fell while walking the dog and broke three bones in my left hand and wrist. Aside from the pain, the inconvenience slowed my writing process, where my brain speaks directly to the keyboard and screen through my fingertips. One-handed spider-typing interferes with this, and so I gave myself a break from the current book. And in January I got a helluva cold, or flu, or something—maybe an offshoot of the coronavirus, but probably not—which has resurfaced every couple of weeks since then. So I’ve had reason to delay my writing and give myself a longer break, extending into a dispiriting couple of months.

The truth is, to begin with, I’m not sure about the book I was working on at the time of the accident. I had just started outlining and doing initial drafts of the story about a young American army officer who gets punished for an international incident with reduced rank and a posting to Mars in the 22nd century, where he will head security at the nearly defunct U.S. embassy. Fortunately for him—or not—he arrives right when the various factions on the planet plan and pull off a revolution or a war for independence, and he has to deal with that in military fashion. Great stuff! Future stuff, with AIs, evolved politics, and a mysterious female! Except … as much as I know about Mars, I don’t know, or can’t imagine, much of anything new that any other writer hasn’t used before. And I’m not really sure I believe in colonizing Mars in the first place.

I mean, it’s a rock. The atmosphere is carbon dioxide with a surface pressure about one percent that of Earth’s at sea level. Open a window on a jet at 100,000 feet, and you’d face about the same pressure—except that on Mars the thin air is a mix of unbreathable gases.2 Because Mars has no magnetic field and such a thin atmosphere, solar wind and radiation are deadly on the surface without additional shielding. And if the planet has water, there’s not much of it, or not enough in any one place for human habitation to exploit casually. If you want some new land to colonize, go to Siberia, Patagonia, or Antarctica—they’re all a lot warmer, and you can still breathe the atmosphere. It would be easier to build a five-star hotel with an Olympic-sized swimming pool on the South Col of Mount Everest: the atmosphere is better and the logistics are far more manageable. Aside from the glory of the achievement, Mars is a really hard sell. For that matter, the Moon’s logistics and travel times are better than those of Mars, and its atmosphere is just a little bit harder vacuum.3

But in my mind, the soldier’s story was set in space, on Mars, from the beginning. And the more I planned and wrote, the hollower—more facile and silly—the story became. At a certain point, I just didn’t believe or trust in my own imagination. And so the writing process just … stopped.

Last year, when I finished The Divina in the Troupe, which is the sequel to The Children of Possibility and completed that three-book mini-series, I was casting around for what story in my imagined lineup to work on next. The young soldier on Mars was neck-and-neck with a third book in the ME group, which would address not two but three copies of the program and deal with some crisis in the network. But that story is still undeveloped in my mind, and I’m not sure the world really needs another dose of a smart-aleck AI who first endangers and then saves the world.

I have other book ideas, but they are even less developed, just glimpses of an idea without plot or characters. And frankly, with the way sales have been going on my previous books, my sense of urgency—if not my dedication to the writing craft—has begun to wane.

But through all that I was still able to write and post my weekly blog on this website. Except … the coronavirus shutdown has me heartsick over both the growing death rate and the effects on the economy, as I wrote in my last blog. My own life situation hasn’t changed all that much: get up, walk the dog, eat breakfast, clear my emails, check the web, do a bit of writing—or, these days, not—then walk the dog, eat lunch, check the stock market, read or nap, do a bit more writing, walk the dog, eat dinner, binge-watch a few shows or a movie, walk the dog, then go to bed and read until it’s time to roll over and turn out the light. Once or twice a week I shop for groceries and go for a motorcycle ride. Once a month or so, I visit family—now in abeyance because of the quarantine. Otherwise, I was already pretty much locked in place.

But when the novel in hand died out, and the whole world went into quarantine and, well … amber, my impulse to write about politics and economics, science and religion, or various art forms just died out. My blogs usually start with some persistent thought that intrudes on my mind, usually related to one of those three topic areas, that I then need to sit down and write out in order to explore my thinking. But the word-generator in my brain that throws up these proto-discussions just … shut down. The closest I’ve come in months was a few nights back, when I woke up at two in the morning to list the various transfers of kingship in England through the Wars of the Roses and the reasons why Henry VIII was so eager to get a male heir. And that’s a story anyone can read about without my help or insight.

I’m trying, charitably, to think of myself as being in a fallow period and not indulge the D-word, let alone the B-word.4 After all, since I was laid off at the biotech in 2010, I’ve been writing hard, producing approximately one novel and fifty blogs each year. So perhaps I’m due for a break. And perhaps, after I give my brain a rest, I will come back with a fresh view on Mars, or the ME character, or some other future war for that young soldier, or something even better and more exciting to write about.

Or that’s my hope.

1. “Top dead center” refers to an internal combustion engine that stops with the piston all the way up at the top of the cylinder—or down at the bottom, which also works—so that any pressure on it just pushes against the vertical connecting rod and bearing without forcing the crank to move one way or the other. Modern, multi-cylinder engines almost never get caught this way, because while one cylinder might be at top or bottom, others are at different positions in the cycle and can move the crank.

2. The atmosphere on Mars would pass for a pretty good laboratory vacuum on Earth. The “air” is too thin for any kind of airfoil or rotor to lift any appreciable mass. So traveling across the Martian surface would be by ground vehicle or some kind of short-hop rocket. This would make human travel and physical commerce between different sites difficult, time-consuming, and expensive—about equal to, say, going from Boston to New York or Philadelphia in colonial times by horseback and wagon or stagecoach.

3. However, the thin atmosphere on Mars—only a partial vacuum—might allay the problem of electrostatically charged dust, which clung to every surface and plagued the astronauts on the Moon.

4. The “D” is for depression. I’ve never actually been diagnosed with or treated for it, but my late wife once suggested that I might be suffering from depression. And since her death I’ve certainly had my down periods—but those are situational and not clinical.
       The “B” is for writer’s block—which I don’t actually believe in. Supposedly, when this strikes, a writer has lots to say but is inhibited from saying it for some other reason. I’ve never felt that kind of stoppage. If I sit down at the keyboard and can’t write, it’s because my subconscious knows that my thinking on the subject is not yet complete or fully developed, and whatever I tried to write would be a waste of time and would have to get ripped up and rewritten anyway. And that may be what’s going on here: my subconscious isn’t happy with the books that my forebrain and my strength of will have put on the writing schedule, and so it wants me to do something new.

Sunday, April 12, 2020

The Bonfire of the Coronavirus


Image of the coronavirus taken with an electron microscope
(Credit: U.S. National Institutes of Health/AP/Shutterstock)

I probably shouldn’t write this. It goes against some of the themes I addressed in the essay two weeks ago. Instead, I should probably write something about some aspect of evolution, astrophysics, or other interesting (to me) areas of science. Or something merely tangential to the underlying conditions of our current politics and economics. Or something about writing novels—which I should also be doing right now. But I am too heartsick at the present state of affairs to do any of that. So here goes.

“Did IQs just drop sharply while I was away?”1 We have a virus of unknown but suspect origin, of unknown but worrisome characteristics concerning its transmissibility, symptoms, detection, and causes of mortality, and of unknown presence in the general population due to lack of universal testing. Some people claim that the “bad cold” with a “persistent cough” that was going around in December and January—I myself had a bout—was this same coronavirus, unrecognized at the time. Some people claim that this is a bat virus under study in a Chinese virology lab that got out and into the general population. Some still think it was a crossover from a bat in the nearby “wet market.” And the Chinese government is claiming this is a U.S. bioweapon that we launched against their country and that has now come back to bite us in spades. Meanwhile, some U.S. senators were given a confidential briefing—presumably about this virus—and immediately dumped their stock portfolios, as if the end times were at hand.

Everybody knows. Nobody knows. We live in contaminated times.

Back in March2 I was not unduly alarmed when the six Bay Area counties—closely followed by the governor of California, for the whole state—ordered us all to “shelter in place” so as to “flatten the curve” of viral transmission. That made sense at the time. I gave the process two weeks, maybe three, just until the crisis had passed. I thought we could all do without sit-down meals in restaurants, movies in the theater, and shopping expeditions at the mall for at least that long. We could do with the schools being closed for a week or two, or even to the end of term. At the time, the president thought things would probably be back to normal by now.

But as I write this (admittedly about a week before the above posted date), now in the third week of the general shutdown, and with the Bay Area counties extending their sheltering orders for another month, to the beginning of May, the emergency measure is starting to look like a new economic reality. People who worked in those restaurants, movie and sports complexes, shopping malls, and so many other places deemed “non-essential” are now being laid off. Utilities, once a safe refuge in the stock market, are taking a hit because they expect to experience massive non-payment of their bills in April. Banks are preparing for people to stop paying off their mortgages, and landlords for people to come up short on the month’s rent—all while some cities are already ordering a halt to all foreclosures and evictions. The consumer economy that was doing so well in January and early February just hit a wall and came to a stop.

And now people are generally suggesting that the shutdown and its effects could probably go on—should go on, will go on—for another three months at least. That takes us up to June or July. That will mean massive layoffs in collateral industries—I’m already hearing projections of double-digit unemployment—and jobless claims not experienced since the Great Depression. That will mean total disruption of personal and family finances, of mortgage banking, of the real estate market, and of the country’s service infrastructure. For the past couple of years, I’ve been hearing in the financial news that the strength of the U.S. and even the global economy is the U.S. consumer: we keep buying the world’s goods—the new car, the next iPhone, the latest fad, and then its upgrade—and this keeps the wheels of the world economy turning. Well … if you shoot the U.S. consumer in the head, all of that goes away, doesn’t it? This is an economic collapse that a mere two trillion dollars in emergency funding won’t cover.

If the shutdown goes on for eight to eighteen months—as some health officials are promising, to keep the currently isolated and uninfected population from ever experiencing the effects of this virus until a vaccine can be manufactured, tested, and put into general use—then we risk a political collapse as well as an economic one. Eventually people are going to venture out, and to keep them contained will require curfews, active enforcement by police and safety personnel, and ultimately martial law. Already, measures that were framed as voluntary shelter-in-place guidance are now, in some areas, backed up with misdemeanor charges and fines for non-essential businesses that stay open and even for isolated people who go out to exercise. The mayor of Los Angeles is now asking average citizens to anonymously report on neighbors who leave their homes. How soon before we have the National Guard patrolling the streets?

This is madness—or at least portends it. This is throwing our economy, our lifestyle, our culture on the bonfire of primal fear over a set of infection and mortality statistics that varies widely by country and by state. Those who say that no amount of economic damage—mere money, just “the stock market”3—matters if it saves one life are not counting the cost in ruined lives and displaced families. Future fear is about to become present hurt.

Maybe we don’t deserve the good economy—more jobs, more income, more opportunities—that we’ve enjoyed over the last decade. Maybe it was all a dream, a fantasy, an illusion. But that would be a real shame. Because the downside is going to hurt more than we can imagine.

1. One of my favorite quotes from the eminently quotable movie Aliens.

2. As if the burgeoning health crisis and the stock market crash that followed it are now lost in the mists of February …

3. Where the market represents the valuation of, and ultimately the economic health of, the organizations that provide most of this country with jobs, material wealth, and retirement savings.

Sunday, March 29, 2020

The Limits of Accountability


Image of the coronavirus taken with an electron microscope
(Credit: U.S. National Institutes of Health/AP/Shutterstock)

I generally try to keep these blogs (or essays, or meditations, whatever) away from absolute topicality and from following the news of the day in short order. My concern is the longer view, the background view, the why rather than the what of events. But the past few weeks have been extremely disturbing to all of us—emotionally, mentally, physically, and financially.

We have seen a virus of unknown character as to its incubation time, severity of symptoms, transmission rate, and mortality arise and spread around the world in a matter of months—and perhaps, because of initial attempts at hiding the crisis, within just weeks. That has been one problem: governments, scientists, journalists, and anyone with social media access have lied, exaggerated, imagined, and spun counter-factual accounts (okay, “fake news”) about this virus and its effects. We have gotten comparisons—some real, some bogus, and some irrelevant—with the mortality associated with the Spanish influenza pandemic of 1918, the H1N1 influenza pandemic of 2009, and the caseload and mortality of the yearly seasonal flu—as well as with deaths by gunshot and automobile. We hear that young people may get the disease and remain asymptomatic but still be carriers. We hear that older people and anyone with systemic vulnerabilities will likely get it and die.

In response to all this, various state governments around the world and in the U.S. have locked down their populations. In California, we live under a shelter-in-place order that has emptied the streets, reduced restaurants to take-out service only, closed all entertainments and public gatherings, and supposedly limits travel for non-essential workers to visiting the grocery store and pharmacy. Many other states have followed suit in this country. This has disrupted the local economy for goods and services, crimped the national economy for travel and tourism, and forced every business and organization to reevaluate its most basic assumptions and activities. The result has been people trying to stock up like doomsday survivalists and emptying grocery store shelves—including toilet paper, which seems to be the first priority for everybody.

As a personal experience, my local Trader Joe’s, where I went to do my weekly shopping on Monday, has instituted entry controls, attempting to limit store occupancy to one hundred customers at a time. A clerk at the entrance monitors the queue outside and only lets people enter when a clerk at the exit signals someone has left the store. The queue path along the sidewalk is marked off with chalk at six-foot increments for “social distancing,” and we all advance by two giant steps each time someone up front enters. Inside the store, people are orderly and even pleasant—but at a distance. The number of Trader Joe’s personnel now almost equals the number of customers, and the shelves are reasonably stocked. At the register, a sign limits purchases to just two of any one item to prevent hoarding—although the checker let me get by with my weekly supply of six apples, six yogurts, and four liters of flavored mineral water. It wasn’t a bad experience, but it was sobering: we all seem to be taking these restrictions on our movements very calmly.

Health officials would like to see this personal lockdown extended for two, or eight, or perhaps eighteen months in order to “flatten the curve” of the virus’s exponential spread and keep the infected population from exploding as it apparently has done in China, Italy, and Iran. If these experts are right about the need for extending the restrictions, then local economies will crash. Small businesses, many large businesses, and whole industries like hotels, travel, and entertainment will go bankrupt or disappear. Unemployment will reach Depression-era levels, if not greater. China locked down its entire economy—or so it’s said—and dropped its gross domestic product by thirty percent—or so it’s said.
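The arithmetic behind “flattening the curve” is simple enough to sketch. Here is a toy exponential-growth calculation; the starting caseload and doubling times below are illustrative assumptions on my part, not epidemiological data:

```python
# Toy illustration of "flattening the curve": the same starting caseload,
# but distancing measures stretch the doubling time from three days to six.
# All numbers are hypothetical, chosen only to show the shape of the math.
def cases(start_cases, doubling_days, day):
    """Caseload on a given day under clean exponential growth."""
    return start_cases * 2 ** (day / doubling_days)

unchecked = cases(100, 3, 30)   # 100 * 2**10 = 102,400 cases
flattened = cases(100, 6, 30)   # 100 * 2**5  =   3,200 cases
```

The flattened curve still reaches people, just more slowly—which is the whole point: keeping the peak caseload under what hospitals can handle at any one time.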

Because of uncertainty about all of this—fears of massive infection rates and millions of dead, the looming prospect of a cratered economy and worldwide depression—the stock market lost a third of its value in two weeks, ending the longest bull market in a sudden and dizzying bear market. The bond markets also crashed. The price of oil collapsed—although this had help from a price feud between the Saudis and the Russians. Gold prices spiked and then relapsed. There has been no safe place to invest in all of this turmoil.

The point of my bringing up these events in such detail is that we may have reached the limits of human accountability in a world still driven by natural forces. Whether the novel coronavirus—that is, this unknown version of a known type of virus—is the unfortunate meeting of a bat and a pangolin in a Chinese wet market, or the intentional creation of a weapon in a biosafety Level 4 lab, it still spreads by the vulnerabilities of the human immune system, the vagaries of human touch, and the viability of its own protein coat. Airline travel—which is virtually instantaneous these days, compared to horseback and sailing ship—allows the virus to move farther and faster before it touches down in a population and blooms with disease, and there it spreads in ways that are still hard to stop.

Today we all live with awareness of our scientific, medical, and technical capabilities, and so with a consciousness of moral and civilizational superiority, compared to earlier times and less-developed places. Our past success with vaccines in treating viral diseases like polio and measles makes us believe that we should be able to quickly and easily prevent and treat this disease. We become impatient with diagnostic and pharmaceutical companies who can’t produce a rapid test or a vaccine within a matter of weeks.

We are capable of wielding such enormous economic power and organizational resources that we tend to believe we are immune to natural disaster. And so when hurricanes and earthquakes strike, or a virus comes into the population, we fault the response of the Federal Emergency Management Agency, the Red Cross, the National Guard, and federal, state, and local governments as inadequate to the task. Someone must be at fault for this.

We look at previous civilizations and historic events like the Spanish Flu, the Black Death, the eruption of Vesuvius, or the storms that swept the Armada’s galleons off course, and believe we are superior. Because we understand the nature of viruses and bacteria and their role in disease, or the nature of plate tectonics and its role in earthquakes and volcanoes, or the weather patterns that create typhoons and hurricanes, we think we should be able to prevent, treat, and immediately recover from their effects. And if we do not, we blame the experts, the government, the organizational structures that have been built to protect us. Someone should be held accountable.

The fact is, we are still relatively helpless. Humans are not the masters of this world, only its dominant tenants. We are still subject to the unpredictable movements of its lithosphere, its atmosphere, and the other inhabitants of its diverse biome, including the tiniest specks of DNA and RNA wrapped in a layer of reactive proteins.

No one gets the blame. Everyone is doing their best. And we all die eventually.

Sunday, March 15, 2020

Harry Potter’s Broom

Nimbus 2000

Harry Potter’s broom

I enjoy many stories, novels, and movies based on magic and magicians—the kind where magic is a real force, not a stage performance. But I have always resisted writing about magic as if it were real and not, in Arthur C. Clarke’s words, a “sufficiently advanced technology.”1

The problem, as I see it, is that I have too practical and inquiring a mind. Being the son of a mechanical engineer, grandson of a civil engineer, having worked all my life with engineers and scientists, and being good at asking questions and keeping my ears and mind open, I have a feel for the way things work in the real world. Which means I can just about smell a technical problem without having to take measurements.

So … Harry Potter’s broom raises an interesting question. In the Wizarding World, is it the broom that flies, and the person simply steers or wills it to fly in a certain direction at a certain altitude? Or is it the person that flies, and the broom is simply an adjunct, a supplement to his or her powers, perhaps functioning as some kind of talisman?

The reason I ask is one of balance. A person perched on top of a broom has his or her center of mass positioned above the shaft of the broomstick.2 In that condition—as I know from personal experience with the inertial dynamics and all the postures and gestures involved in riding a motorcycle—your balance would be severely compromised. Like a ship whose center of gravity and center of buoyancy become misaligned, the whole rig will tend to turn over.
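To put a number on that instability, here is a toy inverted-pendulum model of the gravitational torque on a tilted rider. The rider mass and the offset of the center of mass from the shaft are assumptions chosen for illustration:

```python
import math

def tilt_torque(offset, theta, mass=70.0, g=9.81):
    """Gravitational torque (N*m) about the broom shaft for a rider whose
    center of mass sits `offset` meters from the shaft axis (positive =
    above the shaft), after the rig has rolled through `theta` radians.

    Positive torque acts in the direction of the tilt (the lean grows);
    negative torque restores the rig toward upright."""
    return mass * g * offset * math.sin(theta)

# Hypothetical rider: center of mass 0.3 m above the shaft, 5-degree lean.
above = tilt_torque(0.3, math.radians(5))    # positive: the lean grows
# The same mass hanging 0.3 m below the shaft instead:
below = tilt_torque(-0.3, math.radians(5))   # negative: the lean corrects
```

With the mass above the shaft, the torque has the same sign as the tilt, so any small lean feeds on itself. Hang the same mass below the shaft and the torque reverses and restores the rig—which is exactly the upside-down attitude the stories never show.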

So why doesn’t a witch or wizard riding a broomstick—either in the Harry Potter world or in the traditional Salem and Halloween sense—with only her or his legs hanging below the shaft, and the rest of the body’s mass above it, turn over? Why don’t we see these people flying upside down and hanging onto the broom for dear life?

The question is pertinent because I don’t think that—to the extent authors who deal in magic and flying broomsticks are actually thinking this matter through—the person is flying and only using the broom as a talisman. We’ve seen comic scenes, particularly during Quidditch games, where a player is knocked off his or her broom and must hang on, two-handed and legs flailing, underneath the floating broom while he or she tries to climb back aboard. Clearly, the broom and not the human is doing the actual lifting and flying.

So why isn’t the rider flying upside down? Does the broom have a preferred side or orientation? Do the laws of physics cease to operate in the vicinity of the broomstick?3 Or does it have something to do with the positioning of the rider’s hands and legs and the strength of their grip on the shaft?

It’s all a mystery, as magic should be. Still, inquiring minds want to know.

1. The whole quote is “Any sufficiently advanced technology is indistinguishable from magic.” And that is the basis of much good science fiction.

2. Don’t be fooled by the wire stirrups in the picture, as if they anchored the rider in any preferred position. Mass is mass and finds its own center of gravity. Just ask any horseback rider who, with or without stirrups, experiences a broken saddle girth.

3. Well, of course!

Sunday, March 1, 2020

A Material World


The Buckminsterfullerene

In the movie Star Trek IV: The Voyage Home, Scott and McCoy try to find a light and strong material with which to build a giant seawater tank in the hold of their stolen Klingon ship. They locate a manufacturer of plexiglass in 20th-century San Francisco and offer him the formula for “transparent aluminum,” a material from the 23rd century. They assuage their consciences about temporal paradoxes by suggesting, “How do we know he didn’t invent it?”

Well, he didn’t. The crystals in many of today’s quality watches of all descriptions, including my upgraded Apple Watch, are made from synthetic sapphire. Since the composition of sapphire is corundum, or crystalline aluminum oxide (Al2O3)—the same material from which, in powder form, metallic aluminum is smelted—along with traces of iron, titanium, chromium, vanadium, or magnesium depending on the gem’s color,1 you could easily say that this crystal, which is durable, lightweight, strong, and scratch-resistant, is indeed “transparent aluminum.”

Synthetic rubies and then sapphires were invented in 1902 by French chemist Auguste Verneuil. He deposited the requisite chemicals in the requisite combinations on a ceramic base by heating and passing them through a hydrogen-oxygen flame, then increasing the temperature to the point of melting and crystallizing the alumina. So far, we can make watch crystals and synthetic gemstones with this process. Whether it is scalable for fabricating whole spaceships is another question. But the technology is young yet.

If you are a dedicated browser among the pages of Science and Nature, as I am, with forays into Scientific American and Popular Science, you know that the world of materials science is hot right now. And the element carbon is enjoying a resurgence—but not as a fuel.

Carbon has the happy ability to bond with many different atoms including, sometimes, itself. Its four covalent bonding points allow it to share single, double, and even triple bonds with other carbon atoms, often forming chains and hexagonal rings that are the building blocks of organic chemistry and so the basis of all life on this planet. These rings and chains leave room for adding other atoms and whole other molecules, making carbon the backbone of the chemical world’s Swiss Army knife.

What modern materials scientists have discovered is that bonding among carbon atoms can be induced in several structural forms. We are all familiar with the three-dimensional, tetrahedrally bonded crystal of diamond, whose bonds are so strong that they make it one of the hardest materials known. But those atoms can also be knit into fibers, which are then stabilized and supported in an epoxy resin to create a material that is light, strong, and useful in many applications, sometimes replacing steel. The carbon atoms can also form two-dimensional, hexagonal structures that can be laid out in endless sheets, called graphene, which are strong and supple even at one molecule’s thickness.2 Or smaller sections of those sheets can be bent into nano-scale tubules, which are even stronger than the carbon fibers and have interesting chemical uses. And finally, the carbon atoms can be joined into microscopic soccer ball–like molecules, made of twenty hexagons and twelve pentagons with the formula C60 (pictured). This is the buckminsterfullerene—named after the architect Buckminster Fuller, who invented a spherical structure of similar configuration.
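That hexagon-and-pentagon count can be sanity-checked with Euler's polyhedron formula, V - E + F = 2, since every edge of the ball is shared by two faces and every carbon atom sits where three faces meet:

```python
# Sanity-checking the C60 "soccer ball" with Euler's polyhedron formula,
# V - E + F = 2. Each edge is shared by two faces, and each vertex
# (carbon atom) is shared by three faces.
hexagons, pentagons = 20, 12
faces = hexagons + pentagons                    # 32 faces
edges = (6 * hexagons + 5 * pentagons) // 2     # 90 carbon-carbon bonds
vertices = (6 * hexagons + 5 * pentagons) // 3  # 60 carbon atoms: C60

assert vertices - edges + faces == 2            # Euler's formula holds
```

The count comes out to exactly sixty vertices, which is why the molecule is C60 and not something else.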

Graphene is not only strong but also electrically and thermally conductive, useful for dissipating heat. It has a high surface-to-volume ratio, which means it can be used to make batteries and fuel cells more efficient. It holds promise for flexible display screens and solar photovoltaic cells. And as an additive to paint and other surface preparations, it can improve wear resistance and protection against corrosion.

Carbon nanotubes, which are also electrically conductive, can be used in radio antennas and as the brushes in electric motors. Being biodegradable, they can be used in tissue engineering for bone, cartilage, and muscle. Because they are easily absorbed into cells, they can carry other molecules such as medicines as well as protein and DNA therapies. Spun into yarn, the tubes would offer superior strength and wear in clothing, sports gear, combat armor, and even in cables for bridges and for space elevators—imaginative projects that have been proposed for hauling people and cargo up to geosynchronous orbit.

Buckyballs have potential uses as a drug delivery system, as lubricants that will resist breaking down under wear and heat, and as catalysts in chemical reactions. As a medicine in itself, the C60 fullerene can be used as an antioxidant, because it reacts with free radicals.

And that is just some of the potential for various pure forms of carbon.

Work on the genetics of plants and animals other than humans will have far-reaching effects, too, in terms of biomimetic materials. For example, spiders produce a raw silk that they spin into a strand which has a tensile strength greater than steel and more fracture-resistance than the aramid fibers used in Kevlar body armor.3 We could farm spiders for this silk, the way we do silkworms for their cocoon fibers, except that spiders in captivity will eat each other. But several companies are now working on creating synthetic spider silk.

Another area ripe for development is natural latex, the basis of all our rubber products. Rubber trees are native to South America, where they naturally grow in splendid isolation because a fungus-based leaf blight destroys any trees that grow too close together. Attempts by Ford to create a rubber plantation in Brazil in the late 1920s failed because of this blight. Virtually all of the world’s natural rubber currently comes from trees grown on plantations in Southeast Asia, where they survive only with the strictest vigilance—cutting and burning whole plantations at the first sign of blight—and government control of imported plants and vegetables.

Natural rubber is essential to modern life. Synthetics based on petroleum chemistry, like styrene-butadiene, are less resilient and elastic. A natural rubber tire can thaw from being frozen in the wheel well of an airliner at 35,000 feet in the time it takes for the plane to descend and land, while a synthetic-based tire will remain frozen and shatter upon impact. So discovering a genetic formula for latex and being able to extrude it in the same way the rubber tree weeps its sap would be a godsend.

One of the unsung stories of our modern life is the nature of our materials. They are not just getting cheaper but also lighter, stronger, and better. And this is only the beginning.

1. Just about every color but red. And a red crystal of virtually the same composition is called a ruby.
    Emeralds are a different material, however, based on beryl, which is composed of beryllium aluminum silicate (Be3Al2Si6O18) in hexagonal crystals with traces of chromium and vanadium.

2. The graphite in pencil “leads” is not chemically lead but a pure form of carbon. Small bits of what we now call graphene are layered into a three-dimensional composite, like the layers in sandstone or shale.

3. Interestingly, spiders that are fed a diet of carbon nanotubes make a silk that is even stronger, incorporating the tubules into its protein microfibers.