Sunday, November 11, 2018

What Is Good?

Vitruvian blood vessels

I have never concealed the fact that I am an atheist—although I sometimes sail under the flag of agnosticism, the state of “not knowing,” in order to avoid bar fights. I do not accuse or belittle people who have had an actual religious experience, heard the voice of God or the rustle of angel wings, and are guided by the principles of their religion. Peace be unto them. But I never had the experience, and I have neither the genetic makeup nor the cerebral or psychological components necessary to perceive that unseen whisper. At the same time, I am not, in G. K. Chesterton’s famous line, “capable of believing in anything.” I have my own principles, after all.

One of those principles is evolution. I have worked at both a manufacturer of biological pharmaceuticals and a developer of genetic analysis equipment. I know enough biology and have read enough about genetics and cladistics to appreciate that all life on Earth is related. The octopus is not an extraterrestrial alien dropped into this planet’s oceans—as some sources have recently claimed—but is cousin to the squid and the cuttlefish in the class Cephalopoda, just as human beings are cousin to the mouse and the lion in the class Mammalia.

Evolution is not just the “survival of the fittest,” as the popular saying goes. The evolution of a biological organism takes tiny changes in the genetic code, potentially effecting tiny changes in form and function, and then either expresses them immediately—at which point selection quickly weeds out any change that is harmful or fatal to the bearer—or holds them quietly in the genome as a recessive or alternate copy of the gene until the features it engenders can come into play. The DNA/RNA/protein coding system has many built-in safeguards that make most random changes in the code neither immediately fatal nor immediately helpful. For example, the code is read in three-base units called codons, each of which calls for one of the twenty amino acids used in assembling a protein molecule; and most amino acids are called for by several synonymous codons, so a change in just one of the “letters”—often the third—will usually still yield the intended protein. The system is robust—so that we can still have viable offspring and recognize them as human—and yet just fragile enough that changes are possible over generations.
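To make that redundancy concrete, here is a minimal Python sketch. The codon assignments are taken from the standard genetic code, but the nine-letter “gene fragment” is purely hypothetical, chosen only to show a third-position mutation that leaves the protein unchanged:

```python
# A slice of the standard genetic code: leucine, proline, and glycine
# are each encoded by four synonymous codons.
CODON_TABLE = {
    "CTT": "Leu", "CTC": "Leu", "CTA": "Leu", "CTG": "Leu",
    "CCT": "Pro", "CCC": "Pro", "CCA": "Pro", "CCG": "Pro",
    "GGT": "Gly", "GGC": "Gly", "GGA": "Gly", "GGG": "Gly",
}

def translate(dna: str) -> list[str]:
    """Read a DNA sequence three bases at a time, returning amino acids."""
    return [CODON_TABLE[dna[i:i + 3]] for i in range(0, len(dna), 3)]

original = "CTTCCTGGT"  # hypothetical fragment: Leu-Pro-Gly
mutated  = "CTCCCTGGT"  # point mutation in the third base of the first codon

print(translate(original))  # ['Leu', 'Pro', 'Gly']
print(translate(mutated))   # ['Leu', 'Pro', 'Gly'] -- the same protein
```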

And those changes and their effects are not necessarily crude, achieving just basic survival or writing off the individual organism with a lethal deletion. The cheetah was not born to limp over the veldt in pursuit of its ambulating prey. Time and the millions of minute alterations to the genetic code governing the cheetah’s musculature, metabolism, and nervous system allow it to lope gracefully and efficiently, outrunning the swiftest antelopes and wildebeests, which are themselves adapted to run just fast enough—most of the time—to elude their predators. Evolution is not just a mechanism of survival but a mechanism of optimization, efficiency, and ultimately of temporary perfection.

I have called DNA the “god molecule,”1 but that is not because I worship it or think it has supernatural powers. The DNA/RNA/protein system is simply the instrument of evolution. It has created all the varied life we see on this planet and—through the impact that life has had in shaping the atmosphere, seeding the oceans with abundant life, and covering the hills with vegetation and grazing animals that change their erosion patterns—it has changed the surface of our world itself. The original Earth, before the first bacteria and blue-green algae evolved to give it an oxygen-rich atmosphere, was as hostile to our kind of life as the surfaces of Venus or Mars are today.

But the principle of evolution applies to more than just organic structure and function. Most of the structure and function of human society and the approaches in any human endeavor, from technology to the arts, have advanced by a form of social evolution: small—but sometimes large—changes introduced into a complex situation, there to be either discarded, adopted, or further adapted. In rare cases, like the mathematical thinking of a Newton or an Einstein, a single person will make a significant change in human society and history. But for the most part, what one person starts another will then adapt and improve on, so that the seminal invention is lost in a continuous flow of minor and incremental developments. The stirrup and the wheeled plow, with their migration during the Middle Ages from Asia into Northern Europe, are two such examples.

In the same way, many human social concepts—love, justice, honesty, reciprocity, personal freedom, and the other exchanges that we consider “good” and weave into the stories we tell—are the products of social evolution. Human families, clans, tribes, city-states, and nations learned over time, by piling one experience and its consequences on another, that certain strategies of exchange either worked or did not. For example, they settled early on the basic understanding that habitual lying is harmful both to the people who must deal with the liar and ultimately to the liar himself. That fair dealing and reciprocal trade are a better system of exchange than theft and plunder. That hereditary servitude is not proper treatment for any thinking human being, and a society that practices slavery may flourish for a time but will eventually collapse. That love is a stronger bond and lasts longer than hate. And on and on. We learned these “home truths” at our mother’s knee and passed them down through the cultural wisdom of our clan and tribe long before some prophet wrote them on tablets of stone or bronze and suggested they were the teachings of the gods.

This does not mean that dishonesty, plunder, slavery, hatred, and other injustices don’t exist in the world. Or that sometimes these strategies of exchange will not work just fine in some situations—especially if there is no one stronger around to keep you from getting away with them. Ask the Romans, or the Mongols, the Nazis, the Soviets, and any of history’s other bent and crooked societies that have made a bad name for themselves. But thinking human beings, left on their own to study and consider the situation, will conclude that these negative strategies do not work for the long haul or for the greatest good of the greatest number of people.

Not only human society as a social construct but also the human nervous system as a response mechanism has evolved in tune with these beneficial strategies. Try taking from a toddler the treat that its mother has given and see if that tiny human brain does not immediately register and react to the unfairness of your action. Hear children on the playground taunting each other—perhaps even with names and descriptions having a superficial gloss of truth—and see if the recipient does not explode with anger at the perceived dishonesty. We all understand how the world works and know when others are practicing falsehoods and injustices upon our person and our sense of self.

It does not take a god from a burning bush with a fiery finger to write out the rules of what is proper and good in any human exchange. We know it from before we were born, because our brains and our society had already supplied the answer, hard-wired and ready to function. In the same way, we see the world in the colors for which our eyes were adapted, breathe the air for which our lungs were optimized, and recognize the adorable cuteness of babies and puppies because it is beneficial to both that our brains release the right endorphins at the sight of them.

Evolution says that we are at home in this world because we are the products of this world. And that is enough of a natural wonder for me.

1. See, for one example among others, The God Molecule from May 28, 2017.

Sunday, November 4, 2018

The Next Civil War

War devastation

It has been suggested for some years now, at least for the past decade, that this country is in the midst of a “cold civil war.” Disagreements of both policy and principle between the progressive left and the conservative right have reached a fever pitch. Factions are marching in the streets and attacking each other with bats and chemical sprays—although the fighting hasn’t reached the stage of firearms yet. Friendships are breaking up, sides have formed, and the lines are drawn on social media. I even know of one Facebook friend who seems ready to divorce her husband over his political views.1

We’ve been here before, back in the late 1960s, when I was at the university and the young people in college and the radical activists were protesting against the Vietnam War and in favor of civil rights and free speech. Back then, we had campus demonstrations, protest gatherings on the Washington Mall, and rioting in the streets—most notably outside the 1968 Democratic Convention in Chicago. The difference between then and now is that in the ’60s the radical view and its hard-line conservative response were both on the fringes of the political spectrum, while the two main parties could still conduct business in a relatively consensual, bipartisan fashion. Today, the two parties function in lockstep with their most radical elements. Discussion and votes in the Congress and decisions on the Supreme Court are divided along party lines with almost no crossover. The White House and the top echelon of the Executive bureaucracy swing back and forth with whichever party captures the Presidency.

On the one side, we have people who want to create and celebrate a “fundamental transformation” of the country’s political, economic, social, and environmental relations according to a perceived “arc of history.” On the other are those who don’t mind moving forward into the future by evolutionary steps but resist being pushed bodily through revolutionary action. Frustrations abound on either side, and with them come name calling, social shunning, brick throwing, and tear gas.

Some people are even speculating—myself among them, and mostly since the upheavals of the 2016 election—that the cold civil war will eventually turn hot. That our political and economic differences, our social and environmental positions, will reach a point where they can no longer be resolved by discussion and bargaining, by yielding on some points and advancing on others, to arrive at a national consensus. That the political crisis will demand a clear-cut winner and loser. That internal peace will only be achieved when one side or the other can no longer stand up for its position because its politicians and their supporters have—each man and woman—been economically subdued, personally incarcerated, or rendered dead. Or when the country has been divided by physical partition and personal and familial migration, as occurred between India and Pakistan in the late 1940s, with each party maintaining its own new national government.

The first American Civil War of the 1860s was a dispute between cohesive regions, North and South, Slave State and Free. But many people think the current differing viewpoints are too intermixed for the country to break and go to war along regional lines and across state boundaries. This view says that the coming hot war will be more like the Spanish Civil War of the 1930s, with neighbor fighting neighbor, party against party, for control of the cities and the countryside.

I can see the reasoning for either approach. In many ways, the opposing sides in this country reflect a divergence between urban progressives and rural conservatives. We keep seeing that map comparing the votes cast in Los Angeles County—which is just the urban core of the big place we think of as “LA”—and those in the seven states of the Upper Northwest, from Idaho to Minnesota. And really, even California is not a homogenous polity, because the feeling in communities of the foothills of the Gold Country and in the Sierra is more conservative than the progressive politics of the big cities in the Central Valley and along the Coast.

But I can also see a breakup between regions. The states along the Pacific Coast, in the Northeast, and across the Upper Midwest are typically progressive, while the middle of the country is typically more conservative—with a few isolated exceptions like Colorado and New Mexico.

The question of how the country will break apart if and when war comes depends, in my mind, on what incident, what spark, finally sets it off. If the decisive point is internal—say, an election that fails to satisfy one party so greatly that it simply revolts—then we might see a piecemeal collapse as in the Spanish Civil War. But if the incident is external and the shock is to the whole country, then we might see a response that takes shape along regional and state lines.

The latter is the picture I painted as a leitmotif to my two-volume novel about life extension through stem-cell organ replacement, Coming of Age. There, the incident was the repudiation of the national debt.

When I was in college, my economics textbook said the national debt was irrelevant because it was just money that we owed to ourselves, financed by Savings Bonds held among the citizenry. No one was going to call in that debt; so the government could just keep financing it by issuing more bonds. As of 2014, however, almost half of our publicly held debt in the form of U.S. Treasuries—and a third of our total debt—was held by other governments and offshore banks. The biggest holders were China, Japan, Ireland, Brazil, and the Caribbean banks.

If these external holders wanted to collapse this country—which, given that our global economy is so interconnected, would be a foolish thing—they could simply sell off huge blocks of the U.S. Treasuries they now hold. The federal government would then have to scramble to make good on the sales, and so would likely impose massive economic restrictions and additional taxes on the American public. In my book, this prompts many of the states in the central part of the country—whose residents don’t feel they are well represented in the federal government’s spending decisions—to repudiate the debt and, along with it, their allegiance to the Union: either secede or go broke by staying in.

Under those conditions, many of the National Guard units would side with their home states. And many U.S. Army, Navy, and Air Force bases located in these states might weigh their allegiance to the national government against the conservative political instincts of their commanders and troops. The split would not be uniform. The choices would not be pretty. And once initial blood was spilled in the breakup, it would not be much more of a step to spill blood in establishing either national dominance or domestic partition.

In my novel, the breakup along these economic lines came in the year 2018. Of course, that year has now come and is mostly gone. But the weight of the national debt and the simmering divisions of our domestic politics still hang over us all.

I don’t look for war or want it. But my novelist’s ear listens to the rhetoric that is now splitting the country along its fracture lines, and I cannot discount the possibility of a shooting war coming to these United States sometime soon.

1. My late wife and I had opposing political views: she an old Berkeley liberal Democrat, me an unreformed Eisenhower-era conservative Republican. But we fell in love and married in an earlier time, some forty years ago, when political differences were treated in the same way as differences of religious doctrine and practice: a private, personal matter that did not touch on the essentials of what made a good person. My wife and I shared the same values about honesty, integrity, kindness, education, and fair dealing—and that was what mattered. For the rest, we joked about making sure we each went to the polls on election day so that we could cancel each other’s vote.

Sunday, October 28, 2018

Radicals and Revolution

Joseph Stalin

I am becoming more and more concerned about the direction of politics in this country.

It used to be—back when I was growing up, and for a decade or two afterward—that Democrats and Republicans could be civil to each other. They might differ on policy issues but they agreed about the principles and process of governing. Their disagreements were more about emphasis and degree than about goals and objectives. We debated a larger or smaller role for the public and private sectors, for amounts of taxation and regulation, for our stance in international relations, for the uses of military force versus diplomacy, and for other aspects of life in this country. But we could all agree that the United States was and would remain a country of individual liberties, rights, and responsibilities, functioning in open markets that supported investor capitalism, and with sovereign powers and responsibilities in the larger world, all under a government structure spelled out in the U.S. Constitution. These things were the bedrock of our society.

Yes, there were fringe groups. In the late 1960s, radicals on the New Left demonstrated not just against a war they didn’t like but actually in favor of our Cold War enemies in the Soviet Union and China, as well as for our “hot war” enemy in North Vietnam. They envisioned an end to capitalism and the introduction of a Communist-style command-and-control economy in this country. They wanted to overthrow the U.S. government in a violent revolution. The Students for a Democratic Society and their adherents marched for it, and the Beatles sang about it. But reasonable people in the mainstream parties ignored them—or called them “crazies”—and went about the business of governing.

Some of those New Left radicals grew up and matured but never abandoned their extremist principles. Saul Alinsky became a “community organizer” and wrote his signature work, Rules for Radicals. Bill Ayers, one of the founders of the Weather Underground, became an education theorist but remained in his heart a domestic terrorist. One of Alinsky’s admirers—she actually wrote her senior thesis at Wellesley College on him—was Hillary Rodham Clinton. A Democratic politician who was early-on linked to Ayers in Chicago was Barack Obama. But Obama’s and Clinton’s associations can be written off as errors in judgment due to their youth.

Now, and for the past dozen years or more, the internal direction of this country has shifted. The radical viewpoint has grown from a splinter at the extreme left of the Democratic Party to a driving force in much of that party’s current policies and rhetoric. In recent years, many Democrats have expressed sympathy with “Democratic Socialism,” which joins political democracy with government ownership of production and distribution. Many Democrats now favor open borders and the surrender of national sovereignty. They want a free-form and non-originalist interpretation of the U.S. Constitution, including revision of the First Amendment, abolition of the Second Amendment, and elimination of the Electoral College. The largest unionized group in this country is now the public sector, which bargains for increased pay and protections from politicians who are not negotiating with their own money. That’s an idea that even so liberal a politician as Franklin Roosevelt thought disastrous and ought to be discouraged.

Since the presidential election of 2016, many Democrats have been in open revolt. Celebrities have called for bombing the White House. And seemingly rational people proclaim the current incumbent to be “Not My President.” Since the confirmation of Brett Kavanaugh to the Supreme Court went against the wishes of the Democrats in the Senate—largely because they had previously invoked the “nuclear option” of allowing a simple majority to end debate on judicial nominations, instead of the 60 votes formerly needed to override a potential filibuster—the party has called the legitimacy of the court itself into question.

People on the left now want the country run as a pure democracy rather than the republic of elected officials that the Constitution established. Without the Electoral College, which was put in place to give small states with scant population some parity with large states, the citizens of Los Angeles County would be able to outvote the citizens of the Upper Northwest from Minnesota to Idaho. Presidential elections are always strategic games of chess, trying to take and hold at least 270 electoral votes. The makeup of the Senate, with two Senators for every state regardless of size, protects the interests of small states that are out of the sight and mind of large-state and big-city politicians. The Electoral College is part of the “checks and balances” put into the Constitution to protect minority interests from the whims and tyrannies of the majority.

What I fear in this new direction is what the left now seems openly to want: revolution. A country’s government is just laws and procedures, words written on pieces of paper or, worse, loosely preserved in bits and bytes somewhere in computer memory. What makes those words work is the belief of citizens and their elected politicians in the principles behind them. We agree to these things being true and necessary. We dismiss them at our peril.

The Constitution has set up a mechanism to change any part of it or add to its effective structure through the amendment process. But that process is hard, requiring any amendment to be proposed by a two-thirds vote in both the House and the Senate—or by a convention called at the request of two-thirds of the state legislatures—and then ratified by three-fourths of the states. The idea is that any change has to be popular enough and gain enough bipartisan support to pass this hurdle. We’ve done it twenty-seven times over the years, addressing issues as important as the voting rights of former slaves, women, and people age eighteen to twenty-one, and issues as ephemeral as the drinking habits of the average citizen.

The radical notion of changing or abolishing elements of the government or the Constitution itself by popular vote would invite an avalanche of such changes. We have seen something of that kind operating in California, where initiatives are placed on the ballot by petition and voted on by the public at large. Those that pass acquire the force of law. Some initiatives have had good effects, some bad. But all are subject to the same verbal tricks and legal manipulations of any advertising campaign: claiming to support one thing while actually undermining or subverting it. Popular politics is a rough game to play.

The edifice of the U.S. government is fairly robust and can stand to have a few bricks knocked loose from time to time. But a popular onslaught in the name of “revolution”—whether meant picturesquely or in deadly earnest—could lead to a collapse. We’ve seen that happen before: France in 1789, Russia in 1917, Cambodia in 1975. The problem is that immediate and necessary changes in government always start with well-meaning people who have goals, a plan, and a vision for the future. But once the structure collapses, anything goes. The process starts with essentially kind-spirited liberals like Jean-Paul Marat or Alexander Kerensky; it ends with closed-minded tyrants willing to spill vast quantities of blood, like Maximilien Robespierre and Joseph Stalin.

My fear is that the people so innocently dreaming of and calling for revolution today are the nation’s Marats and Kerenskys. If they get what they want, these people will go up against a wall within about six months. Waiting in the wings are the Stalins and the Pol Pots. And they will not be so liberal or kind-hearted.

Sunday, October 21, 2018

Mind Games

Subatomic particle

I am just finishing up Adam Becker’s book What Is Real? about the relationship between quantum physics and the real world it is supposed to represent. Becker tells a good story, especially as an introduction to the world of quantum physics, the players over the years, and the intellectual principles involved. His basic premise is that, while the equations that physicists use to predict the outcome of their experiments—and so test the value of those equations as representations of the underlying world of the very small—have consistently proven their worth, the physicists themselves remain in doubt as to whether the world that they are describing actually exists.

Without going into the entire book chapter by chapter, the issue seems to be one of describing a world so small that we cannot detect it without changing it. Atoms and their component protons, neutrons, and electrons—plus all the other subatomic particles in the Standard Model—are not fixed in space like pins on a board. As with everything else, they move, as do galaxies, stars, and planets. However, instead of occupying observable orbits and tracks across the night sky, atoms mostly jitter with thermal energy—the agitation we see writ large in “Brownian motion”—and electrons buzz frantically and randomly around their nuclei like flies in a cathedral.

We can detect the larger celestial bodies—and even masses as small as freight trains and automobiles—with visible light, without the danger of moving or deflecting them much. Bounce a few hundred thousand photons off a teacup, and you will not move it one millimeter. But subatomic particles are so small, and the wavelength of visible light so long by comparison, that the light misses the particle entirely, passing over and under it with no impact. Imagine that the wavelength is a long piece of rope that two girls are spinning in a game of Double Dutch. If a human-sized person enters the game and performs unskillfully, the rope has every chance of hitting—that is, detecting—his or her body. But if a flea jumps through the game area, the chances of that long, curved rope ever touching its body become vanishingly small.
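The mismatch of scales is easy to put in numbers. Here is a minimal Python sketch, using rounded order-of-magnitude values for the sizes involved (my figures, not Becker’s):

```python
# Rounded, order-of-magnitude sizes, in meters
green_light_wavelength = 5.0e-7   # visible light, ~500 nanometers
teacup_diameter        = 1.0e-1   # ~10 centimeters
hydrogen_atom_diameter = 1.0e-10  # ~1 angstrom
proton_diameter        = 1.7e-15  # ~1.7 femtometers

# A wave "sees" an object well only when the object is not much
# smaller than the wavelength itself.
print(f"teacup vs. wavelength:   {teacup_diameter / green_light_wavelength:,.0f}x larger")
print(f"wavelength vs. atom:     {green_light_wavelength / hydrogen_atom_diameter:,.0f}x longer")
print(f"wavelength vs. proton:   {green_light_wavelength / proton_diameter:,.0f}x longer")
```

The rope, in other words, is thousands of times longer than the flea for an atom, and hundreds of millions of times longer for a proton.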

To detect subatomic particles, physicists must use other particles, as if in a game of subatomic billiards, or photons with much shorter wavelengths and thus much higher energies. A high-energy photon impacting a moving electron or proton will change its direction of motion. So the issue in quantum physics is that when you locate the particle you are observing here, it is no longer there but off going somewhere else. In quantum physics terms, no particle has an exact position until it is observed, and then it has some other position or direction of movement in response to the observation. Mathematically, the particle’s supposed position can only be defined by probability—actually, a continuous wave function that defines various probable positions—and this wave “collapses” into a single definite position at the place and time of your observation.
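For readers who like to see the machinery, here is a minimal sketch in Python of that probabilistic picture—the Born rule applied to a toy one-dimensional wave function of my own choosing, not anything taken from Becker’s book:

```python
import numpy as np

# Born rule, toy version: a particle on a 1-D line is described by a
# wave function psi(x); the probability of finding it near x is
# |psi(x)|^2 * dx. "Measurement" draws one definite position from that
# distribution -- the "collapse" of the wave function.

x = np.linspace(-5.0, 5.0, 1001)               # positions along the line
dx = x[1] - x[0]
psi = np.exp(-x**2 / 2.0)                      # a Gaussian wave packet
psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)    # normalize: total probability 1

prob = np.abs(psi)**2 * dx                     # probability for each grid cell
measured = np.random.choice(x, p=prob)         # one observation, one position

print("Before: a spread of probable positions around x = 0")
print(f"After:  a single definite position, x = {measured:+.3f}")
```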

Well and good. This is what we can know—all that we can know for sure—in the world of the very small.

The first issue that Becker’s book takes up is that most of the original proponents of quantum physics, including Niels Bohr and Werner Heisenberg, took this lack of certain knowledge to an extreme. Called the “Copenhagen interpretation,” after Bohr’s institute in Denmark, their view insists that the entire point of quantum physics is the manipulation of the results of observation. The measurements themselves, and the mathematics that makes predictions about future measurements, are the only things that have meaning in the real world. The measurements are not proof that subatomic particles even exist, and the mathematics is not proof that the particles are doing what we think they’re doing. To me, this is like calculating the odds on seeing a particular hand come up in a poker game, or counting the run of cards in a blackjack game, and then insisting that the cards, the games, and the players themselves don’t necessarily exist. It’s just that the math always works.

Other physicists—including Albert Einstein—have been challenging this interpretation for years. Mostly, they pose thought experiments and new mathematical formulas to prove them. But the Copenhagen interpretation persists among quantum physicists.

A second issue in the quantum world is the nature of “entanglement.” Here two particles—two atoms, two electrons, two photons, or other bits of matter that sometimes behave as energy, or matter that oscillates with wave-like energy, or waves that at the instant of detection appear as singular objects—become joined so that what one of them does, the other will do. This joining and the parallel actions persist through random occurrences—such as passing through a polarized screen—and are communicated instantly across distances, which would violate the limit of light-speed travel for any object or piece of information. Here is the sort of “spooky action at a distance” that Einstein derided as a violation of the light-speed limit of special relativity.

A third issue in quantum physics is the nature of Schrödinger’s cat. To illustrate the limitations of measurement, Erwin Schrödinger proposed the thought experiment of putting a cat in a sealed box with an apparatus that releases a poison when triggered by the decay of an atomic isotope. Since the atomic decay is unpredictable, the cat in the box might be alive or already dead. It was Schrödinger’s point that until an observer opens the box, the cat exists in two “superposed” states—both alive and dead at the same time, expressed by a wave function of probability—and that the wave function does not collapse and reveal the cat’s final nature until the box is opened. As a thought experiment, this is a metaphor for measurement and observation. But some physicists insist that the superposition is real. The actual cat is physically both alive and dead until discovered.
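In the same toy-model spirit as before—and emphatically not as physics the book endorses—the cat’s superposition reduces to a two-state Python sketch:

```python
import numpy as np

# Two-state superposition: equal amplitudes for "alive" and "dead."
amplitudes = np.array([1.0, 1.0]) / np.sqrt(2.0)
probabilities = np.abs(amplitudes)**2          # Born rule: 50/50

# Opening the box = one measurement; only then is a definite state chosen.
outcome = np.random.choice(["alive", "dead"], p=probabilities)
print(f"The box is opened: the cat is {outcome}.")
```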

This superposition has led some physicists to describe a splitting of the universe at the point of the box’s opening: one universe proceeds with a physicist holding a live cat; the other with a physicist mourning a dead cat. This is the “many worlds” interpretation. Both universes are equally valid, and both continue forward in time until the next quantum change that forces each universe to split again in some other way.1

Now, I freely confess that I do not have the mathematical skills to understand the equations of quantum physics. And mercifully, Adam Becker’s book does not focus on or discuss the math in detail, just the thought experiments and their supposed meaning. I also confess that I do not understand what condition enables two particles or two waves to become “entangled,” or how they interact at a distance in this state, or what might be required to untangle them. Becker does not explain any of this, either. Further, I confess that I can sometimes be simpleminded, rather literal and obvious about what I see, hear, and know, and oblivious to distinctions and nuances that other people perceive easily.

But, that said, it would seem to me that what we have here is a misinterpretation of a metaphor. The limitations of observation and measurement, as expressed in colliding particles and probabilistically dead cats, are simply reminders that we do not have direct perception of the quantum world in the same way that we can see, hear, touch, and taste, if necessary, a steam locomotive or a billiard ball. That’s a good thing to keep in mind: we don’t have all knowledge about all things. However, to insist that this metaphorical reminder means that quantum physicists are simply doing math, and that their calculations—no matter how enticingly predictive—have no meaning in the real world, that quantum physics is just a mind game … that’s taking things too literally.

I have criticized the use of mathematics to prove the improbable before.2 And I insist again that, if all you’ve got is a series of equations to prove your point, you may just be playing mind games with yourself and your fellow physicists. But the reverse is also true: the real world must exist at the quantum level. If the math works out, if the vision behind it holds together, then it must be describing something that has actual substance and energy. The details may not be exactly as we understand them. The description may be missing some elements, forces, or bits of math that we haven’t worked out yet. But the world must exist in the smallness of subatomic particles as much as it does in the vastness of stars and galaxies.

The math doesn’t exist in a quiet vacuum. The cards, the game, and the players must also exist to give the calculations meaning.

1. I have cheerfully used the many-worlds interpretation in my novel The Children of Possibility, about time travelers from the far future, and in its prequel The House at the Crossroads. But I know I’m having fun and don’t take this stuff too seriously. So much fun, in fact, that I’m now working on the sequel that picks up where Children left off.

2. See Fun with Numbers (I) and (II) from September 19 and 26, 2010.

Sunday, October 14, 2018

Courage in Authority

King Lear’s Fool

We have a young man on the board of directors of our condominium homeowners association who is consistently negative. He routinely predicts disaster in every situation. If someone proposes a solution, he calls for more consultants, more bids, more analysis, more legal review. He always criticizes proposals and decisions by other board members for failing to “do their homework,” for lack of “due diligence,” or for failure of “fiduciary responsibility.” If he offers a solution of his own, it is numbingly complex—if not self-contradictory—and hedged with so many technical and legal caveats that it becomes simply unworkable.

He has been responsible at times for bringing the entire organization into a state of paralysis. And if other board members vote for a motion that seeks to override his objections, he always votes against it or abstains, in order to preserve his right to criticize the decision later. Yet he never considers—or offers to take responsibility for—the negative consequences of action postponed or prevented by his criticisms, or the time and money spent on considering his objections.

If this young man, his attitude, and his effect on the organization were unique to our homeowners association, this might make a good story but would hardly rise above a curious local anecdote. The truth is, we see this kind of negativity too often in our current politics on both a local and a national level—and too often in the corporate and other spheres. Problems are insurmountable. Solutions are insufficient, infeasible, unprincipled, illegal, or unconstitutional. Nothing can be done but, at the same time, the situation cannot be allowed to continue.

The position of the naysayer, the delayer, and the critic is an easy one to assume. It involves no great courage to demand that the organization take more time to consider, seek another opinion, gather more data, investigate all possibilities.1 The organization usually places no blame if we don’t perform an action, approve a decision, praise or support a member, or confirm a vendor. For if the action or decision is not made, or the person or situation is left in a state of uncertainty, there is no discernible result that might later be examined and criticized. It’s a no-lose position for any member of a group to take.

What requires courage is to take action, make a decision, or give your approval and blessing to another person or group. Of course, the action might fail, the decision lead to disaster, and the person in question turn out to be a liar, a thief, or a scoundrel. Those possibilities always exist. The best that anyone can do is make a judgment based on available data, personal experience, imaginative projection, good founding principles, and common sense. After that, the outcome is in the realm of probability or—in an older view—the lap of the gods.

Any position of authority requires such courage. Even when an organization has a second in command, a board of directors or council of advisors, a legal and technical staff, and an on-site actuary, most decisions come down to one person willing to act—or to formulate and spread a vision upon which others can take action. Any deliberative body—a senate, assembly, parliament, or condo board—will, on any one issue, look to the person who will take the lead to find or imagine a solution, provide arguments for it, defend it against its critics and naysayers, and call for action or a vote.

That person must inspire confidence among those who will vote for the solution or be required to act on its implementation. They must believe he or she is a person of integrity, sound judgment, and experience. Moreover, they must believe he or she is acting in the organization’s best interest and not for personal advantage.

But still, the person in authority is taking a risk. If the action or solution fails, the proposer or promoter will be labeled a failure along with it. Even if the proposal had a unanimous vote behind it, the leader who complains, “But we all agreed …” is taking a weak position. The rest of the organization will simply respond, “Yes, but we agreed with you!”

This is why we ask of people in authority that they possess and demonstrate courage along with their other qualities of experience, judgment, integrity, and sobriety. The CEO of a corporation, the captain of a vessel, the pilot of an airplane are all required to take responsibility for their actions. They must make judgments, recommend and follow courses of action—sometimes in an instant and without recourse to advice, consultation, and second opinions—and trust that the people around them—subordinates, employees, crew, vendors, suppliers—will perform appropriately. And if the performance of the people undertaking the action, or the mechanism of the ship or plane itself, were to fail, then the CEO, captain, or pilot stands ready to take the blame. If the person in authority did not have this courage, then the company would never do anything, the ship never leave the dock, and the plane never leave the ground.

It’s a simple lesson: Action takes courage. Delay is not always wise or safe. And the path forward leads upward and requires strength.

1. For the role of the leader in making a decision, see the story of “five heartbeats” in The Immediacy of Life from April 29, 2018.

Sunday, October 7, 2018

The New Conservatism

A. Gerasimov, Lenin on a Tribune

I believe there’s a common feeling among those who follow politics and economics, based mostly on the labels assigned, that “conservatives” want things to stay the way they are, while “progressives” want things to move forward.

Conservatives are supposed to yearn for the political, economic, and social conditions of their youth. In my case that would be rock-n-roll, ducktail haircuts, the postwar boom, Eisenhower political blandness, and stable nuclear families living in suburban housing with good schools. There were some downsides to be sure: duck-and-cover drills, Jim Crow segregation, Formica in loud colors, and Melmac dinnerware. But all in all, for the white middle-class majority, it was a good time to be alive in America. We didn’t see the social and economic problems or, if we did, we minimized them.

Progressives are supposed to look ahead to better times, which means focusing on the things that need to change right now. For most progressives these days that would be income inequality, industrial and automotive pollution, environmental damage and anthropogenic climate change, racial inequality, binary gender inequality, capitalist winners and losers, housing shortages, healthcare governed by insurance companies, and cultural hostility for “the other” leading to rampant hate speech. Sure, there are some good things: advances in renewable energy, administrative regulations on industry and finance, progressive income taxes, union protections, feminism, and the #metoo movement. But these things are not enough—may never be enough—when what is needed is a true social, cultural, and economic revolution to make people equal in both their expectations and outcomes, happier with their lives, and kinder to each other.

But are these labels correct?

I believe many conservatives have a forward-looking approach in many areas, including politics and technology. They believe the social and economic climate is improving all the time, compared to the situation fifty, a hundred, or two hundred years ago. They believe in continued evolution in this regard, but not abrupt revolution. Much of their expectation is based on humankind’s increasing knowledge and technological capability, derived from the application of scientific and humanitarian principles originating in the Enlightenment of the 17th and 18th centuries.

In contrast, many progressives seem to be in the position of tacit conservatives. They don’t trust evolutionary change in social, political, or technological conditions, largely because such change is not predictable or guided by the principles to which they subscribe. In other cases, they actually want to preserve a static world which is safe and predictable until they choose to change it through a directed revolution.

Let me suggest three areas in which this is so.

First, union protections. The history of unionism has been one of fighting changes in technology and working conditions that might affect the number and skill levels of jobs, or require workers with seniority in a craft to learn new skills or enter new positions. The classic example of this tendency was “featherbedding” in the railroads during the 1930s and ’40s, preserving the jobs of firemen who stoked the boilers on steam engines when the railroad companies converted to diesel-electric locomotives. An earlier example was hand weavers who tried to destroy and ban mechanical textile mills because the machines put them out of work. Unions consistently choose older ways of working over new efficiencies if it means that certain jobs and skills will become outmoded. This is a bid for stasis over advancement and is, at least in spirit, non-progressive. What they will make of artificial intelligence and increasing automation in the workplace is totally predictable.

Second, capitalism itself. The basis of market-driven economics and capital investment is “creative destruction.” Every product and service, every company that provides products and services, competes in the marketplace for consumer attention and dollars. Consumer favoritism and brand loyalty only go so far—and not far at all if a product line or service deteriorates in terms of quality, usefulness, price, or some other dimension that customers value. Sometimes, however, frivolous products or variations are introduced and sold; the classic example is Bernie Sanders’s complaint about “twenty-three kinds of deodorant.”1 But by and large, new and useful products are coming all the time: consider the personal computer and the internet revolution.

Capitalism in a free market means giving people what they want, even if it means giving them what they only think they want—or what you can convince them to want, or deceive them into wanting. Capitalism is not predictable and directed, but decidedly uncontrolled. Sixty years ago, when I was a child, everyone confidently predicted that my car would fly by the time I was middle-aged. But no one, looking at the basement full of vacuum tubes or single transistors that was then the state of the art in computing, predicted the development of the integrated circuit, the microchip, and telephones that would eventually replace cameras, stereo systems, movies and television, telegrams, libraries, and retail stores. Creative destruction is a wild and woolly territory—just ask a taxi driver whose radio-dispatched cab is being replaced by a cellularly summoned Uber or Lyft driver.

We’ve seen enough of the command-and-control economies that were spawned from social and economic revolutions in the 20th century to know how they operate. They were all focused on preserving the status quo in terms of products, processes, and services. None of them developed the advances in computing, personal communications, or consumer goods—let alone medical technology and energy infrastructure, to name a few more areas—that we have steadily enjoyed in the capitalist West.2

Third, the environment. Is the climate changing? Oh yes! It was changing before modern industrialization and transportation fueled by coal, oil, and gas began increasing the atmosphere’s carbon dioxide load. We live on a planet with a precession in its orbit, under a variable star, with an active geology based on plate tectonics. We have gone through periodic ice ages, glaciations, warming and cooling periods, and occasional long winters due to volcanic eruptions ever since humans started recording their history—even before, if you count all the cultures with a flood story in their mythology.

Sea level rises and falls, deserts grow and shrink, forests advance and retreat, rivers change their course, all without the influence of human activity. Life has evolved on this planet to adapt to these changes. Every extant individual and species was shaped to take advantage of a particular environmental niche—except humans, of course, who use their big brains and clever hands to build shelters and machines that let us exploit areas where we otherwise could not live. Since those environmental niches—particularly the ones with marginal populations—are changing all the time, some species must either adapt, move, or die out. It matters not how picturesque or precious a species might be: if it lives too close to extinction in terms of diet or tolerance for environmental stress, it will eventually disappear. In the long run, no one can save the panda.

And yet the current crop of environmentalists would try to prevent this change wherever possible. They want a static world in which every river, swamp, and forest remains unchanged, where every butterfly and exotic plant can be preserved. They want to fix the world’s climate at some preferred set point—usually around the time and temperature of their childhood—and maintain it … forever.

Even the politics of the progressives is frozen in place and time. Their view of “the arc of history” is guided by a 19th-century view of social and economic order as prescribed by Marx and Lenin and then communicated by the anti-war radicals and anti-capitalist activists of the 1960s. It is a world view that values world peace at the expense of national sovereignty and the primacy of human-muscle labor at the expense of technological advancement. If they were alive today, Marx would not be a Marxist, and Lenin would be busily adapting and promoting some other social and economic creed.

I believe we are at a time of great confusion over labels and intentions. I also think we are at a time that demands a new teaching, a new world view, a new politics and economics that is neither “conservative” nor “progressive” but adopts a new social and philosophical stance entirely.

I just wish I knew what it was.

1. I’m sure all the ladies out there wouldn’t mind using my brand of deodorant, which has the image of a sailing ship on the package. Or that Bernie wouldn’t mind using the Secret brand—“Strong enough for a man, but made for a woman.” One of the comments about life in Russia in the 20th century was the prevalence of “Soviet scent,” as if one smell would fit all bodies.

2. To be fair, none of them made flying cars, either.

Sunday, September 30, 2018

Retroactive Prime Directive

Alien landing

In the Star Trek universe—in case you don’t follow the series—there is a rule called the Prime Directive. It forbids the Federation’s interstellar explorers from interfering with the civilizations they discover, especially the more primitive societies. Visitors to new civilizations are forbidden from offering advanced technologies or, in some cases, even revealing that they come from beyond the stars. The intention is to preserve the unique nature of these developing civilizations and allow them to achieve whatever their native skills, cultural qualities, and particular history will enable them to become. Many of the various Star Trek series include stories where the Prime Directive is tested and ultimately found to be wise and appropriate.

Of course, in the Progressive future world depicted by the series, the Prime Directive is an antidote to and an apology for Western imperialism. This is the world, or the galaxy, done right the first time. This is the situation in which an advanced civilization—the enlightened, gracious, Western European–based explorers of Star Fleet—boldly goes “to seek out new life and new civilizations” and then carefully and studiously leaves them alone. No educating the natives here. And certainly no enslaving them and making off with their trade goods and raw materials.

It’s a pretty picture. An ideal of self-restraint. But is it real?

In the Progressive doctrine, the New World as discovered by 15th- to 19th-century Europeans embodied many such primitive civilizations. The “Native Americans,” the people who were here first—but only after crossing the Bering Sea land bridge at the end of the last Ice Age—were still living a mostly Stone Age existence. The hunter-gatherers of the North American plains needed something on the order of twenty square miles of open land to feed one family throughout the year, several thousand square miles or more to feed a whole tribe. The city-based civilizations of Central and South America practiced slash-and-burn farming and so could feed more people on less land, but they still were primitives compared to European farmers and their tools, and these populations were more vulnerable to climate cycles.

In either case, the North American tribes and civilizations possessed no horses—until, that is, the Spanish came and a few of their herds went feral in the wilderness. The natives had no iron, certainly no gunpowder, no simple machines, and not even the wheel. Their spears and arrows were tipped with bits of knapped flint, and the “swords” of Central American warriors were clubs edged with flaked obsidian. The Maya had an advanced form of ideographic writing and sophisticated mathematics, as well as pretty good skills with stonework. The Inca of South America had a flair for hydraulic engineering equal to that of the Romans. But still, these were largely Stone Age peoples.

They also weren’t particularly peaceful or gracious themselves. The Aztecs and the Maya both practiced human sacrifice. The tribes of the plains went to war against each other long before the Europeans showed up. Widows and the aged in the tribe who had no one left to support them would be exiled and exposed. Life was hard. People died.

The modern, Progressive view that the Europeans came into the New World, committed genocide against the peaceful natives, enslaved the survivors, and stole their lands and raw materials is a compelling narrative. But absent a Western culture imbued with some kind of 15th-century Prime Directive, it is not a realistic one.

With the exception of small groups—prospective traders like Christopher Columbus, who was only seeking a passage to the markets of Asia; explorers and cartographers like John Cabot and Amerigo Vespucci, who were commissioned by royalty and functioned not unlike the explorers of Star Trek; and Portuguese and Basque fishermen, who landed in what was to become New England in order to process their catch of the Grand Banks codfish—most of the Europeans who came to the New World were people seeking a new life, new land, refuge from persecution, and freedom from the religious restrictions, economic repressions, and monarchical wars of Europe. Some also came as transported convicts, who had no choice but indentured servitude until they could escape into the wilderness. These Europeans did not come to observe, study, and make a map. They came to stay and hoped to prosper.

One can imagine such people—the Pilgrims or the Spanish conquistadors—arriving on the eastern shores of the New World and exercising some form of Renaissance Prime Directive. “Oh my! There are already people living here! And they have formed stable hunter-gatherer—or in some places slash-and-burn—cultures capable of their own eventual development. It is not our place to intrude. We must preserve their heritage on their own land. We will now withdraw and not disturb them.” Maybe the Pilgrims could have found an isolated and uninhabited island somewhere else to establish their spiritual sanctuary. Maybe the conquistadors could go and invade some established neighbor who was both culturally and technologically equivalent, like Morocco, and had the ability to fight back.

That is not, however, the way these things work. And no wonder: Europe had experienced its own invasions from the dawn of prehistory—the Dorians, the Ionians, and the Sea Peoples coming into Greece; the Romans into the rest of the Mediterranean and Western Europe; the Celts, Huns, Goths, Vandals, and Visigoths into Rome; and the Saxons, Danes, and Normans into England. The history of the world has been that of roving bands moving in on and pressuring their neighbors, when they weren’t carrying out explicit wars of conquest like the Mongols and the Muslim Caliphate. The fact that the New World pitted Stone Age people with flint spears against Iron Age invaders with horses, the wheel, and gunpowder is a tragic accident of history, but it was not unforeseeable.

When we first meet an intelligent species out among the stars, let us pray that we are the explorers and that our interstellar drives, dense energy sources, potential weaponry, and coherent organization allow us to be at least culturally and technologically equivalent to whomever we find. Then perhaps we can afford to follow our own Prime Directive. But if we meet that extraterrestrial species as it comes here to Earth, where the advantages in energy, weapons, and sophistication lie with them, then we had better prepare to either make friends fast and learn their technology even faster—or, in the words of Homer, “fall on the ground and bite the dust.”

In my opinion, it has never been a good strategy, in the words of Blanche DuBois, to “depend on the kindness of strangers.” People possessing advanced skills and their own intentions will not wait upon the less developed.

Sunday, September 23, 2018

The Mark of a Gentleman

A gentleman

I recently quipped on Facebook: “While a Christian might be within his rights to refuse to bake a cake for a gay couple, a gentleman never would.”1 To me, this raises an important distinction in our modern world between rights and responsibilities among the choices an individual may make.

Our society and our laws, as embodied in the First Amendment, guarantee the right of free speech. You may say, write, advocate, and publish almost anything you want. There are, of course, legal exceptions that have been raised and confirmed over time. The classic example is that you must not falsely shout “Fire!” in a crowded theater. And you can be held liable under law for defamation of another person, for inciting a riot, or in time of war for committing treason by offering aid and comfort to the nation’s enemies.2

Today, many people on the Left would like to add “hate speech” to the list of prohibited communications. Not unlike the definition of “pornography” or “sedition,” the actionable content of hate speech is vaguely defined. As Justice Potter Stewart wrote in the 1964 Supreme Court case about banning obscenity and pornography, “I shall not today attempt further to define the kinds of material I understand to be embraced within that shorthand description … But I know it when I see it.” Too many people would like to see actionable hate speech defined as any kind of speech they believe would be offensive to groups whom they would like to support. That’s a little too broad for me and, I think, for most reasonable people.

But while any individual or group has the right to say, print, and broadcast anything they want within the narrower definitions of the law, that does not mean they should. The law of the land is necessarily open and nonjudgmental. But people who would use that law as the only guide to their personal behavior make poor acquaintances and bad neighbors. In most of polite society, they would be viewed as a bit of a crank or crackpot.

A well-brought-up individual is—or at least used to be—taught manners by strict and loving parents; kindly aunts, uncles, and grandparents; and attentive teachers. For those who did not have such an upbringing, I would refer you to Miss Manners, which is the nom de politesse of Judith Martin. I have been a secret fan of hers for years, and I would boil down the essence of what she advocates—if she has not already done so herself—as refraining from causing others discomfort.

In this modern world, all too many people are willing to make others feel weak, foolish, and stupid by pointing out some personal failing and invoking some law addressing it in the name of good society, personal etiquette or hygiene, or simply “manners.” It is a game of multi-variable “gotcha!” that any number can play.

How does this apply to the Christian baker and the making of artisanal cakes? In my mind, very simply.

A person’s religion is and should remain a private matter. If I am a practicing Christian—or Jew, Muslim, Buddhist, or Seventh-day Adventist—I am required by my principles to act in certain ways. These might include refraining from taking vengeance by turning the other cheek, or from consuming alcohol or pork products in observance of religious prohibition. More positively, my beliefs may require that I render alms to the poor, defend the weak, pray at certain times of day, and fast at certain times of year. This is my business. If anyone should ask why I do these otherwise outlandish things, I may explain that they are part of my religious observance. Or I might simply say that it is my preference and my own damn business.

The First Amendment allows this. So long as I am not breaking any laws—which would include, say, child endangerment, human sacrifice, or the pursuit of ethnic cleansing—the Constitution permits me to believe and practice as I will. The First Amendment also allows me to preach, proselytize, and advocate for my religion. I can print handbills, advertise on billboards, and show up at your door to explain to you the Four Noble Truths of the Lord Buddha. And you have a right to drop my handbills in the gutter, look past my billboards to the scenery beyond, and slam the door in my face.

But Miss Manners—if she were here and engaging in this discussion—might suggest that I not so actively seek to convert others to my way of thinking. After all, I should grant that they are adult, thinking human beings who have already chosen their beliefs and made their peace with the everlasting. I should respect their choices as free and independent human beings. I might, if asked, give my opinion and advice to people who are themselves in doubt or distress and seeking a new meaning for their lives. That would be the gracious thing to do. But I would be intruding upon their privacy, and failing to respect their agency as human beings, were I to insist that they were in error in their current beliefs and that the only way out of that error would be to adopt the truth I have personally embraced and now endorse.

A Christian baker—not my great aunt who likes to bake for family occasions, but an entrepreneur who has established a public place of business and put out a shingle—may believe that homosexuality violates the precepts and traditions of his or her religion. That is an acceptable private opinion. The baker may even believe it would be inappropriate for two men or two women to marry in what, so the baker believes, would be an irreligious ceremony. That is again a private opinion and belief supported by the First Amendment. But to confront those customers in a public place of business and refuse to serve them because their request is offensive to the baker’s beliefs and represents some personal failing in them would, in my opinion, be ungracious. Those customers are doing what they believe to be right and proper. They are not seeking to give offense by asking for a cake decorated to their liking. And it would hurt their feelings—cause them discomfort—to be refused on the grounds of something that is simply part of their nature.

In my opinion, the baker is well within his rights to refuse service to anyone. If someone wants a cake celebrating a bar mitzvah, a gay union, or the coming of the demon Belial, the baker can refuse and the customer will have no recourse under the law but to take his business elsewhere. Not all bakers are gentlemen and ladies properly brought up to consider the feelings of the people around them.

But a gentleman would ask whether it was his place to criticize the personal and apparently heartfelt choices of his customers. He would then, in my opinion, decide that it was the appropriate practice of his art to create the best cake he could to celebrate their joyous occasion. This is not a matter of rights but rather of responsibility to a higher principle.

There are many things that a citizen may do under the law that a gentleman—or a lady—never would.

1. This is, of course, a paraphrase from the line in Susanna Clarke’s excellent fantasy Jonathan Strange & Mr Norrell. Lord Wellington asks Strange if a man might be killed by magic, and Strange replies that while a magician might kill a man by magic, a gentleman never would.

2. Unfortunately, the distinction about “time of war,” and so the definitions of “treason” and “sedition,” become blurred when we are fighting wars and police actions in two or three areas around the globe at the same time, have emerged from decades of an undeclared Cold War with a number of as-yet unreconciled former enemies, and now exist in an Orwellian state of continuing undeclared war against pretty much anyone the adherents of law and order would like to name.

Sunday, September 16, 2018

Situational Ethics

William Blake’s Ancient of Days

A young friend of the family recently sat through the first day of a freshman ethics class. The teacher’s first question, requesting a show of hands, asked how many of the students believed ethics are a social and cultural construct. All but one hand went up. Then: how many thought ethics are a universal given? My young friend’s hand went up. At that point, the teacher told him that he was wrong, and he later dropped the class.

This appears to be a doctrine of our times, at least in the academic world: that everything is a cultural construct, from morality to sexuality to the principles of science itself. Of course, if everything is a construct, then one might question whether the construct might somehow, somewhere be constructed differently. The old values that you learned “at your mother’s knee,” or in your church or synagogue, or as the bedrock of your native civilization can then be characterized as local, parochial, and false. And new values—values more suited to the questioner’s purpose—might be substituted in their place. But I digress …

My first quarrel with this teacher—whom I never met, except in the abstract of the story—is that this definition of “ethics” is too broad. Yes, some questions of ethics and morality are culturally based, like not pointing the sole of your shoe at a person in some Eastern cultures. Even some acts that we in the West hold to be universally forbidden, like intentional killing, can be culturally and situationally approved. Every war is based on provisionally ignoring that commandment.

Early in my studies about Zen, I learned that the response to certain types of questions should properly be mu, or “no thing.” When a question is too broad, or poses an assumed but unproven dichotomy, or creates a logical fallacy, then the answer cannot be either “yes” or “no.” So the only right answer is “no thing,” meaning “the question does not apply.” And that would be my answer to this ethics teacher’s question.

Yes, certain ethical practices that shade between etiquette and morality—like pointing with your shoe—are purely cultural. But not all of them are minor matters of petty insult. In other Eastern cultures, for example, a father may kill his children if they dishonor the family, and religious persons are called upon to deceive, beset, and even sometimes kill idolaters and nonbelievers who remain steadfast and unrepentant in their error. In other cultures and contexts, however, these practices are simply wrong, wrong, wrong.

But I would argue that there is a universality to certain basic ethical questions. The transmission of the principle may be cultural, as told in religious stories, fables, children’s fairytales—or simply passed on from parent to child—but the principle remains solidly based in the dynamics of human interaction.

For example, I would challenge the ethics teacher to name one society that would condone, approve, or recommend coming up behind a stranger, bashing his head with a rock, and then picking through his pockets for his wallet and other valuables. The victim is not known to be a nonbeliever or idolater, nor to belong to any other class deemed worthy of killing. The act is not motivated by mercy killing or implemented as part of wartime tactics. It is purely intended for personal gain.

Name a society that condones telling lies to someone who has reason to trust you—friend, family member, or other responsible person in your community—again for the purposes of personal gain. These are not the “white lies” of commission or omission on the order of answering the question “Do I look fat in these jeans?” This is lying in order to swindle someone out of land, money, or some valued possession that the liar wants to obtain for him- or herself.

Name a society that recommends or supports the genocide of a people who have previously been accepted and valued in the community, people who were once friends and neighbors but have suddenly become “the other” and outsiders for the political, economic, or religious purposes of some subset of the community.

The list could go on indefinitely. And it’s not that people don’t do these things, or that they sometimes get away with them during the upheavals of war, economic disintegration, or natural disaster. But find me a society or culture that would point to these ethical challenges and say that this is right and proper behavior.

I am not arguing that these actions are wrong because a god or a religious book somewhere said thou shalt not kill, lie, cheat, steal, or murder your enemies once you get the upper hand. Many religious traditions do transmit these and other cultural values and still prohibit such foul deeds. My argument is that these ethical principles are like the adaptations of biological evolution. They are so, not just because your tribe or culture says so, not because your god or your priest invokes them, but because these are the only ways in which human civilization can reliably function.

If a person cannot walk the streets without fear of becoming the victim of imminent and unrestrained murder for profit, then you don’t have a society but a jungle. If you cannot trust your friends, family, and respected members of your community to have your best interests at heart and seek to protect your life and rights to property and security, then you don’t have a family or a friend—or a community. And if your extension of good will and fair dealing to others in your society can sour to the point of murder over matters of race, religion, politics, or other noncritical and immaterial differences, then again you don’t have a society but a state of undeclared war.
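Game theory makes a formal version of this point with the iterated prisoner’s dilemma, in which strategies that deal fairly but punish betrayal outscore pure predation over repeated encounters. Here is a minimal sketch in Python; the payoffs are the textbook defaults, and the framing is my illustration rather than anything from the essay or the ethics class:

```python
# Iterated prisoner's dilemma: a neighborly strategy (tit-for-tat)
# against pure predation (always defect). Standard textbook payoffs;
# this is an illustration of the point above, not a proof of it.

PAYOFF = {  # (my_move, their_move) -> my points
    ("C", "C"): 3, ("C", "D"): 0,
    ("D", "C"): 5, ("D", "D"): 1,
}

def tit_for_tat(their_history):
    # Cooperate first, then mirror the partner's last move.
    return "C" if not their_history else their_history[-1]

def always_defect(their_history):
    return "D"

def play(strategy_a, strategy_b, rounds=100):
    hist_a, hist_b, score_a, score_b = [], [], 0, 0
    for _ in range(rounds):
        move_a, move_b = strategy_a(hist_b), strategy_b(hist_a)
        score_a += PAYOFF[(move_a, move_b)]
        score_b += PAYOFF[(move_b, move_a)]
        hist_a.append(move_a)
        hist_b.append(move_b)
    return score_a, score_b

print(play(tit_for_tat, tit_for_tat))    # (300, 300): two neighbors prosper
print(play(tit_for_tat, always_defect))  # (99, 104): one betrayal, then stagnation
```

The predator wins the first encounter and almost nothing afterward, which is the jungle described above in miniature.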

Every species on Earth represents a hard-fought and -won adaptation to a particular environmental niche. The bodily configuration, reactions, capabilities, energy levels, and metabolism of any one species are not designed by an intelligence or selected according to some ideal pattern. Instead, they developed and became perfected over time because these features worked best in that place. And the fact that we see some of these species as precious and beautiful—think of songbirds and butterflies—is a fact of our own evolution, while the fact that we see others as creepy and scary—think spiders and alligators—is evolutionary, too. We humans are evolved to find both beauty and terror in this world. We are adapted to this environment. If we had adapted to metabolizing sulfur compounds in the dark and boiling water of an undersea volcanic vent, we would find that kind of life beautiful, too.

In the same way, our nature—human nature—has evolved over time. While some of this evolution is adaptive to the physical environment—such as our peripheral vision, allowing us to perceive subtle movements in the bushes beside us, which might be a leopard waiting to pounce—much of our nature evolved in relation to our mental environment. Like many other mammals and some insects, we are social creatures. Our life exists in both the physical world and in the mental world of dealing with others of our kind, predicting their actions and reactions, and keeping ourselves and our loved ones safe.

In this sense, yes, much of our ethical teaching is a social construct. But it is not cultural in the sense of being limited to one cultural interpretation—say, Western Civilization—and therefore useless, irrelevant, or perhaps harmful in other cultures around the world, like carelessness about where you point your shoe.

The core issues of ethics and morality are human issues, which means they bridge cultural differences. They are so universal that they might as well have been pronounced by a god and preserved in a religious book, because the image of that god is always created from some aspect of human nature and our species’ collective wisdom.

Sunday, August 26, 2018

Tracing Evolution Backwards

Jupiter’s moon Europa

This meditation is an extension of a series of Facebook posts around the question of what conditions are necessary for the development of life, which itself is an extension of the Drake equation for estimating the probability of finding other life and civilizations in the universe. The proposer, William Maness of my Facebook acquaintance, posted: “Let’s go the other way. Let’s say that Earth’s condition is astonishingly rare. How rare does it have to be to be the only one in the galaxy. How rare to be the only one in the universe?”

And then he proposed conditions for life on Earth as we know it: strong magnetic field, stable sun, Goldilocks zone (meaning both the right part of the galaxy, in terms of density of nearby stars and their radiation, as well as the solar system’s “habitable zone,” with planetary temperatures that can support liquid water), a large companion body (to create tides, which set a pattern of inundation and exposure for sea life at the edge of the land, among other things), no gamma emitters nearby, debris-cleared orbit (to minimize life-killing asteroid impacts), abundant liquid water, no conditions that kill carbon life in said ocean, an active lithosphere (with plate tectonics to renew the surface, replenish the atmosphere, and relieve geothermal stresses1), an active water cycle, and a transparent atmosphere. “These are just a few that come to mind,” he wrote.
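All of these conditions act like extra factors in the Drake equation mentioned above, which is nothing more than a chain of multiplied estimates. Here is a minimal sketch in Python, where every parameter value is an illustrative assumption, not an accepted figure:

```python
# Drake equation: N = R* x fp x ne x fl x fi x fc x L
# Every value below is an illustrative assumption, not an accepted estimate.

def drake(r_star, f_p, n_e, f_l, f_i, f_c, lifetime):
    """Estimate N, the number of detectable civilizations in the galaxy."""
    return r_star * f_p * n_e * f_l * f_i * f_c * lifetime

n = drake(
    r_star=1.5,    # new stars formed in the galaxy per year
    f_p=0.9,       # fraction of stars with planets
    n_e=0.5,       # habitable planets per star that has planets
    f_l=0.1,       # fraction of habitable planets where life arises
    f_i=0.01,      # fraction of those where intelligence evolves
    f_c=0.1,       # fraction of those that emit detectable signals
    lifetime=1e4,  # years such a civilization keeps signaling
)
print(f"Estimated communicating civilizations: {n:.2f}")
```

Tighten any one factor, as the list above effectively does, and N collapses toward zero; relax them, and the galaxy fills up. That sensitivity is the whole debate.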

My first response was to say that some of these conditions overlap and work to the same purpose. For example, the conditions of having a strong magnetic field and a stable sun are related, as their result is to protect developing and existing life from the solar wind and radiation bursts. Having no nearby gamma emitters is part of that requirement, too. But note that if your definition of life includes—or is restricted to—cockroaches and tardigrades, which seem not to care much about hard radiation, these several requirements may not be absolute.

Having liquid water and an abundance of carbon are nice. But as I’ve noted elsewhere,2 you could construct a parallel DNA chemistry from silicon and arsenic. The silicon atom has the same chemical-bonding valence as carbon, while arsenic has the same valence as phosphorus. So silicon might replace the carbon atoms in the ribose rings and the purines and pyrimidines that are the main features of DNA and RNA molecules. And arsenic might replace the phosphorus atoms in the bonds that connect those ribose rings into a long-chain polymer. The resulting molecules would be heavier, of course, having a higher aggregate atomic weight. And they would be somewhat more fragile, because their traded electrons would occupy a higher electron orbit. But these replicant molecules would still function like carbon-based DNA.
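To make the valence parallel concrete, here is a toy Python sketch of the substitution. The backbone formula is simplified, and the whole exercise is my illustration of the argument, not real chemistry:

```python
# Toy illustration of the valence argument: silicon bonds like carbon
# (four bonds each) and arsenic like phosphorus (five each), so a
# like-for-like swap preserves the bonding pattern. The formula below
# is a simplified stand-in for the DNA backbone's repeating unit.

VALENCE = {"C": 4, "Si": 4, "P": 5, "As": 5, "O": 2, "N": 3, "H": 1}
SUBSTITUTE = {"C": "Si", "P": "As"}  # carbon -> silicon, phosphorus -> arsenic

def swap_backbone(formula):
    """Replace each element with its heavier same-valence analog."""
    swapped = {}
    for element, count in formula.items():
        analog = SUBSTITUTE.get(element, element)
        assert VALENCE[analog] == VALENCE[element]  # same bonding capacity
        swapped[analog] = count
    return swapped

# Rough sugar-phosphate repeating unit (simplified):
unit = {"C": 5, "H": 9, "O": 6, "P": 1}
print(swap_backbone(unit))  # {'Si': 5, 'H': 9, 'O': 6, 'As': 1}
```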

And liquid water does have some unique properties. The water molecule readily dissociates into hydrogen and hydroxide ions. The molecule also has an asymmetrical arrangement, placing the two hydrogen atoms about 104.5 degrees apart on one side of the oxygen atom, giving each molecule a positive and a negative side. This polarity allows other molecules to be either “hydrophilic” and attracted to water or “hydrophobic” and repelled by it. Water as a fluid is also relatively incompressible—you can’t squeeze it in its liquid phase—so the water in a deep lake or ocean doesn’t get thicker and sludgier as you descend, becoming paste-like or semi-solid. Instead, the pressure just increases while the density remains the same. These features create an important condition for life forms like Earth’s sea creatures, which are composed mostly of water themselves, metabolize the oxygen dissolved in water, and range freely from the surface to the deeps.

That angular separation on the water molecule forces it to form a hexagonal crystal when frozen, so that the solid phase is actually less dense than the liquid phase, enabling it to float. If solid water sank to the bottom of a pond or ocean, where temperatures are generally cooler, then a temporary drop in ambient temperature might freeze any body of water solid. And there it would stay frozen for who knows how long—not until next summer but more likely until the next extreme in the climate cycle.
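Both of these properties, the incompressibility and the buoyant ice, submit to a quick back-of-the-envelope check. A short Python sketch, using standard textbook values for density and gravity; the calculations themselves are just my illustration:

```python
# Two quick checks on the properties of water described above.
# Densities and gravity are standard approximate values.

RHO_WATER = 1000.0  # kg/m^3, liquid water (nearly constant with depth)
RHO_ICE = 917.0     # kg/m^3, ice Ih, the hexagonal crystal
G = 9.81            # m/s^2
ATM = 101_325.0     # Pa, one atmosphere

# 1. Incompressibility: density stays fixed, so pressure simply grows
#    linearly with depth, P = rho * g * h above surface pressure.
for depth_m in (10, 100, 1000, 4000):
    pressure_atm = RHO_WATER * G * depth_m / ATM
    print(f"{depth_m:>5} m down: about {pressure_atm:,.0f} atm above the surface")

# 2. Buoyancy: by Archimedes, the submerged fraction of floating ice
#    equals the density ratio, so roughly 92 percent of an iceberg
#    rides below the waterline and the rest floats above it.
print(f"Fraction of floating ice below the surface: {RHO_ICE / RHO_WATER:.0%}")
```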

But other liquids with a low chemical reactivity and low compressibility could support life almost as well as water does—although it would be chemically and physically different from ours and might prefer different ambient conditions.

Other planetary features like a large companion (for tides) and active lithosphere (for plate tectonics and volcanoes) are only required for the kind of life we recognize. I’m betting that, when we find life out there among the stars, it will surprise us. But that wasn’t the premise of the question as originally posed, which acknowledged that it was working backward (i.e., “going the other way”): What kind of conditions will produce us, the life that we know and recognize? And that may be too limiting a definition.

We can take it as a given that the same laws of physics and chemistry exist elsewhere throughout the universe. Go to any other star with a planet, and you’ll find the same atoms from our Periodic Table—although not necessarily in the same abundance and distribution. They will tend to form similar molecules—although perhaps with different underlying chemical reactions having different, temperature-dependent endo- and exothermic requirements—and so the abundance and distribution of life-creating or life-destroying substances will depend on local conditions. The gravity curve will follow the equations we use to measure it here on Earth—although the resulting values will necessarily be different, based on solar and planetary density and distance. The physics of electromagnetism and radiation will apply—although the quality of the light and its effects on biochemistry and biodiversity will be different, based on the output of the local star.
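The point about the gravity curve can be made concrete with the same equation we use here, g = GM/r². A small Python sketch; the hypothetical planet’s radius and density are invented for comparison:

```python
import math

# Surface gravity from the universal law, g = G * M / r^2, with the
# planet's mass M = (4/3) * pi * r^3 * rho. Same equation everywhere;
# only the local radius and density change. Values are illustrative.

G_CONST = 6.674e-11  # m^3 kg^-1 s^-2, universal gravitational constant

def surface_gravity(radius_m, density_kg_m3):
    mass = (4.0 / 3.0) * math.pi * radius_m**3 * density_kg_m3
    return G_CONST * mass / radius_m**2

print(f"Earth-like world:  {surface_gravity(6.371e6, 5514):.2f} m/s^2")
print(f"Half-size, denser: {surface_gravity(3.2e6, 7000):.2f} m/s^2")
```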

The nature of life, however defined, is that it evolves in and adapts to the environment it finds. Otherwise, whatever you find on a new planet is just an artifact or an exception. This presumes, of course, that evolution is present on the planet and is based on some system of replication, similar but not necessarily identical to Earth’s DNA-RNA-protein coding system.3 Once the principle of replication-with-modification becomes established and gives rise to “life,” it will already be adapted to the conditions that it finds and then change itself as they change.
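It is worth noticing how small an algorithm “replication-with-modification” really is. A toy Python sketch, where the target string is an arbitrary stand-in for “the conditions that it finds” and every detail is invented for illustration:

```python
import random

# Bare-bones replication-with-modification: copy with occasional errors,
# keep the copies that best fit the "environment." The target string is
# an arbitrary stand-in for local conditions; this is a toy, not biology.

TARGET = "ADAPTED"
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"

def fitness(genome):
    return sum(a == b for a, b in zip(genome, TARGET))

def replicate(genome, error_rate=0.05):
    return "".join(
        random.choice(ALPHABET) if random.random() < error_rate else base
        for base in genome
    )

population = ["".join(random.choices(ALPHABET, k=len(TARGET))) for _ in range(200)]
for generation in range(300):
    # Each survivor leaves several imperfect copies; the best copies survive.
    offspring = [replicate(g) for g in population for _ in range(3)]
    population = sorted(offspring, key=fitness, reverse=True)[:200]
    if fitness(population[0]) == len(TARGET):
        print(f"Matched the environment in generation {generation}")
        break
else:
    print(f"Best so far: {population[0]}")
```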

This evolution will be able to give rise to organisms that are not like us either physically or chemically. Even on Earth, and working under the DNA-RNA-protein coding system, we can find life that is strange and different. Consider the organisms that our deep-ocean searches have discovered clinging to the sides of undersea volcanic vents: adapted to total darkness and huge surface pressures, tolerating the extreme temperatures of superheated water, and metabolizing sulfur compounds instead of carbohydrates. The life that we recognize from this planet’s surface was able to descend and adapt to that hell. Or rather, our kind of life didn’t adapt itself: any of its great-great-grandchildren who happened to survive because of compounding genetic mutations became able to thrive under those conditions. Remember that the original life on Earth evolved in a carbon dioxide–rich atmosphere. Then plants began metabolizing that carbon in a photosynthetic reaction driven by sunlight and released free oxygen into the atmosphere. Only then did later organisms—“our” kind of life which moves, wiggles, walks, and talks—adapt to breathe and metabolize that oxygen.

As for what conditions might be required to create life, consider the smallest of the Galilean moons, Europa. Jupiter is not in the Sun’s “habitable zone,” with temperatures that generally keep water a liquid. Still, Europa is suspected of having an ocean under its icy shell that is kept warm by tidal flexing in its orbit around the giant planet. The ocean under the ice might contain life, protected not by a thick atmosphere and planetary magnetic field, as on Earth, but by the layers of ice themselves, because water is a good shield against radiation.4 Whatever life develops in this ocean would be different from ours—not based on or even seeing the Sun’s light, with no possibility of moving out onto land and developing the things we humans cherish, like fire, metals, and radio and television. But it would still be life under conditions that do not entirely match those on Earth.

When we get out among the stars, we’re going to have to expand our definition of life exponentially. I suspect that will quickly turn our teaching of biology—and so much else—on its head.

1. If you think geothermal stress isn’t important, consider Earth’s sister planet, Venus. By studying the uniformly limited number and apparent recent age of the impact craters on the surface, astronomers have determined that Venus must lack a system of plate tectonics, with its corresponding subduction of surface layers and creation of volcanic hot spots that release core heat, as on Earth. Instead, the planet appears to go through periodic renewals, where the entire surface melts from within and then resolidifies. That would be bad for any life trying to gain a foothold on the rocks there.

2. See The God Molecule from May 28, 2017.

3. For example, a machine-based organism that was able to sample its environment and rewrite its underlying operating code to thrive under those conditions would be a similar but different analog of our biological kind of life. For that matter, you might consider our molecular form of life as simply a kind of nanotechnology.

4. When I was in college, I had a roommate who worked as a shift operator at the university’s TRIGA reactor. This was one of those “swimming pool” reactors, used for research, training, and experiments with radiation. When he took me on a tour, we stood at the railing and looked directly down at the reactor core, which when operating glowed with the beautiful blue light of Cherenkov radiation. I pointed at the active core and asked my roommate, “Why am I not dead?” He replied that the twenty feet of water between us and the radiation flux with its fast neutrons was better protection than a foot of lead shielding. I then saw bubbles of gas rising from the reactor and bursting on the surface about eight feet away. I asked what it was, and he said it was a radioactive isotope of oxygen. “Why am I not dead?” Because that isotope possesses a half-life of eight seconds and had mostly decayed to regular oxygen by the time it reached the surface, where any residual isotope dissipated into the room’s atmosphere before decaying further.
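The roommate’s arithmetic is easy to verify with the standard decay law: the fraction remaining is one-half raised to the power of elapsed time over half-life. A quick Python check, where the rise times are my own guesses about how long a bubble takes to climb twenty feet:

```python
# Standard radioactive decay: fraction remaining = 0.5 ** (t / half_life).
# The eight-second half-life is from the anecdote; the rise times are
# rough guesses for a bubble climbing twenty feet of pool water.

HALF_LIFE_S = 8.0

def fraction_remaining(elapsed_s):
    return 0.5 ** (elapsed_s / HALF_LIFE_S)

for rise_time_s in (8, 16, 24, 32):
    print(f"After {rise_time_s:>2} s: {fraction_remaining(rise_time_s):6.1%} remains")
```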

Sunday, August 19, 2018

Thinking With Our Skin

Embryo fold

It’s fascinating to think about how an embryo, a single cell comprising the genetic material of a mother’s egg and father’s sperm, develops into a complex, multi-celled organism with each cell having its own place and function.

I’ve written before about the beginning of the development process,1 as described by the late Eric Davidson at Caltech, who was working with sea urchin embryos. After the original zygote created by the union of egg and sperm had divided again and again to form a hollow sphere of cells called a blastula, he and his team sacrificed these embryos at successive fifteen-minute intervals to map out the interaction of genes in the nuclei of these undifferentiated cells. The team discovered that, depending on where in the sphere a cell was situated and on the time elapsed since the blastula formed, one gene would produce a bit of microRNA that moved elsewhere inside the nucleus and promoted another gene to make a different miRNA.

This process of promoting different genes continued—all without coding for any proteins—and directed that particular cell toward becoming a different and unique body part, like a piece of backbone or a section of gut. These various bits of microRNA and their interactions formed an instruction set and timing mechanism for developing the entire animal. Davidson’s team then compared these miRNA patterns with those of other animals that had not shared an ancestor with sea urchins for millions of years and found the mechanism to be “highly conserved”—which means that something similar takes place inside the cells of a human embryo.
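One way to picture the mechanism Davidson’s team mapped is as a timed lookup network: which gene fires next depends on where the cell sits and on what fired before. Here is a deliberately tiny Python sketch; the gene names, regions, and fates are invented for illustration, and the real network is vastly larger:

```python
# Toy model of a developmental cascade: the next regulator to fire
# depends on the cell's position in the blastula and on the regulator
# currently active. All names and outcomes here are invented.

CASCADE = {
    # (active_regulator, region) -> next regulator or final program
    ("geneA", "vegetal"): "geneB",
    ("geneA", "animal"): "geneC",
    ("geneB", "vegetal"): "gut_program",
    ("geneC", "animal"): "skeleton_program",
}

def cell_fate(region, max_steps=10):
    """Follow the cascade from the starting regulator until a fate is fixed."""
    state = "geneA"
    for _ in range(max_steps):
        state = CASCADE.get((state, region), state)
        if state.endswith("_program"):
            break
    return state

print(cell_fate("vegetal"))  # gut_program
print(cell_fate("animal"))   # skeleton_program
```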

In mammals and the vertebrate animals closely related to them by evolution, the blastula sphere develops into the gastrula, a hollow, cup-shaped structure composed of three layers of cells. The inside layer, the endoderm, eventually becomes the gut with associated structures like the small and large intestines and organs like the stomach, pancreas, liver, and lungs. The middle layer, the mesoderm, gives rise to the connective tissues including muscles, bones, bone marrow, and blood and lymphatic vessels, and to associated organs like the heart, kidneys, adrenal glands, and sex organs. The outside layer, the ectoderm, becomes the skin with associated structures like hair and nails, the nasal cavity, sense organs including the lens of the eye and the tongue, parts of the mouth including the teeth, and the anus. The ectoderm also forms the body’s nervous tissue including the spine and brain.

So how does that outer layer of this cup-shaped gastrula become something so interior to our bodies as the brain inside its bony skull and the spinal cord inside its chain of bony vertebrae? The answer is that during embryonic development of this outer layer, the ectoderm acquires what’s called a neural fold. A groove forms in the layer and soon folds over to become a hollow tube, the neural tube, which eventually becomes the spinal cord inside its sheath of protective bones. The anterior or front end of this tube becomes the brain. In primitive life forms, the brain remains just a cluster of nerve cells, a ganglion. In more developed organisms—from fish on up to humans—this ganglion develops a complex structure with the brain stem, which maintains wakefulness and governs autonomic functions like breathing; the cerebellum or hindbrain, governing balance and coordinated movement; the neocortex, governing thinking, speech, and fine motor control; and the limbic system, which is associated with emotions, instincts, moods, and the creation of memories.

It’s no accident that the nervous system arises from the skin, because sensation through the skin is the brain’s first major contact with the outside world. Other structures derived from the ectoderm include the eyes, ears, taste buds, and sense organs in the nasal cavity. It might at first seem that the brain, where the cell bodies of the neurons reside, should form in place and then extend long, thread-like nerve fibers, the axons, down along the spine and out into the skin to get such widespread coverage. But instead they all form in place from the same tissue, starting embedded in the skin.

At the same time that the neural fold is forming the spine and brain, the cup-shaped structure of the ectoderm and endoderm curves around and fuses to form the body cavity. That puts the guts on the inside, the skin on the outside, and the skeleton and muscles somewhere in between. All of these tissues are developing together and sometimes—as with the mesoderm muscles of the heart and the endoderm structure of the lungs—merge to form integrated systems.

But how does the developing organism know which end of the spine will become the “anterior” as well as what and where the posterior might be? What directs one end of the hollow tube that represents our bodies to become the mouth and nasal cavity, with their sense clusters, and the other end to become the anus? One further set of genes is needed to manage all this, the homeobox genes, or “hox” for short.

This is another highly conserved gene set. Fish have it, as do frogs, lizards, dinosaurs, birds, and all the mammals. So do the insects and arachnids. The hox genes are only active during embryonic development and determine the major body parts that we share with all these other animals: the head with its brain or nerve cluster, the major sensory organs, and mouth parts; the thorax with the heart, lungs, and nexus of the blood vessels; and the abdomen with its digestive and reproductive systems. In human beings, the thorax is enclosed inside the ribcage and separated from the abdomen by the diaphragm. In insects like the fruit fly and arachnids like spiders, the thorax and abdomen are separate body structures. The hox genes also define the limbs and where they are attached: four limbs connected to the spine in the tetrapods, which developed out of lobe-finned fish and first walked on land—that’s us, along with frogs, lizards, dinosaurs, birds, and all the mammals. Other less closely related animals like insects and arachnids have multiple legs attached to the thorax and sometimes wings, too.

It’s not just a coincidence that we share the same basic body structure with fish and frogs. It’s written into our genes. I always marveled at the movie Avatar, where on the planet Pandora the humanoid natives, the Na’vi, are four-limbed like the Earthly humans, but every other species in close evolutionary proximity to them has six limbs. Given that the hox gene set is relatively stable, creatures so closely related that they can attain near-telepathic communication by mixing the tail ends of their neurons really ought to have a parallel body structure.

The hox gene set is also the reason that we classify mythological creatures like Pegasus, the flying horse; gryphons, which are half lion–half eagle; and dragons, which have four legs and a pair of wings, as “chimeras,” or impossible animals. The hox gene set simply doesn’t allow for mashups of six-limbed creatures that closely parallel the known tetrapods. It also forbids angels with two arms, two legs, and a pair of wings. All of them are violations of basic body structure.

We still have a lot to learn about fetal development. And certainly the hox gene set deserves more study. But I find it fascinating that the process of going from a single cell to a complex organism passes through a multi-layered sphere that then folds inward and outward like a piece of origami. And it’s a bit chilling to understand that we all think and feel with cells that originate in our skin.

1. See Learning as a Form of Evolution from December 10, 2017.

Sunday, August 12, 2018

Keeping Busy

Storyteller in a Turkish coffee house

We human beings are endlessly concerned with finding our “purpose” in life. It’s a question that faces a child from the first time he or she is asked “What do you want to be when you grow up?” Answering “I just want to be” is not considered sufficient, although it’s the answer that every other life form, every bacterium, plant, and animal on this planet has for the question.

Biologists define life with a number of different characteristics. First is cellular organization—any organism, even a one-celled prokaryote, has an arrangement of pieces and parts, systems and subsystems, that enable it to function. Second is reproduction—it survives for a time and then divides into or buds off daughter cells, or joins with a complementary partner to form a new organism sharing the traits of each. Third is metabolism—it ingests nutrients such as proteins and carbohydrates, or in the case of plants, minerals and sunlight, and excretes waste products. Fourth is homeostasis—it tends to maintain a stable internal environment and seeks to maintain a stable external environment. Fifth is heredity—it can trace an ancestry based on changes through mutation from its parent cell or organism. Sixth is response to stimuli—it senses and reacts to its environment, moving toward light or nutrients or prey, avoiding predators or unfavorable conditions. Seventh is growth and development—the result of that heredity and metabolism is successful accumulation of resources and changes in structure. Eighth is adaptation through evolution—while the individual may not always change in response to its environment, the hereditary line changes through natural mutations that enable some future individuals, but not necessarily all of them, to survive.

These characteristics are not immutable like the laws of physics. Bacteria don’t react to their environment as readily as a gazelle being chased by a leopard. And not every individual successfully reproduces. Some of the characteristics listed above are also combined on other biologists’ lists, such as heredity being folded into evolution. But the principle is the same: life reacts to its environment in a way that, say, a stone weathering on a mountainside does not.

For every other species on Earth, this is enough. My dog does not question her life. She does not attempt to be something other than a part of the situation in which she finds herself. This is a shame, really, because in an earlier age of the world she would have been hunting small mammals, finding and mating with a male dog, digging a den and giving birth to litters of puppies, and only occasionally getting to lie in the sun in contentment. It would have been an active life full of interesting activities with occasional moments of terror. As it is, she is an adjunct to my household and has the primary function of nuzzling my hand when she wants something and having her coat stroked and hearing soothing words when I choose to give her attention—or feeling the tugs of the brush and the terror of the toenail cutter when I groom her. She won’t mate or reproduce because that potential was surgically removed at the shelter where I found her. So her life is reduced to eating the food that I put down for her, exercising her excretory functions only when I take her for a walk, and otherwise lying in the sun or on a cushion under my desk, waiting for something to happen. But it’s a life.

Human beings would go mad in this situation. We cannot be kept as pets—or not most of us, and not the best of us. And therein lies one of the basic problems of our modern world.

For a million years or more, our hominid ancestors lived as hunter-gatherers. Life was a struggle. We lived from one animal kill to the next, from one berry bush to the next. And when the seasons changed and the streams dried up, we suffered. We mated according to our hormones and our opportunities. We carried our feeble young along on the trail by instinct alone, not dreaming of a different or better life for them. We had an existence prescribed for us by circumstance, full of interesting if repetitive activities with occasional moments of terror. No one among this primitive species—or almost no one, surely—looked up into the sky at night and wondered about the Moon and the stars and what they might be or mean. Almost no one asked if there might be any other purpose to life. Everyone was just too busy surviving to ask such stupid questions.1

All of that started to change when human beings settled down in the fertile river valleys, planted crops and tended domesticated animals, invented city life with its artificial hierarchies and its wonder at the Moon and stars and what supernatural beings might lurk behind them. We suddenly had more food—most of the time—than one person could hunt or gather and eat by him- or herself. We had an unfamiliar condition called abundance. And we could indulge the pastimes of people who did not directly produce food, shelter, or clothing and yet still wanted to eat, sleep indoors, and cover their own nakedness. We had room for priests, shamans, storytellers, tax collectors, and other government officials. We began having a civilization and all of its questions.

Things have only gotten better—or worse, depending on your point of view—with the advent of science, technology, and modern methods of agriculture, production, and distribution. Where the labor of one person on the soil might once, in that fertile river valley, have supported two or three more people in the nearby town, now the labor of one or two people plus a cohort of robotic machines and systems supports a hundred more. Working to stay alive and wondering where your next meal is coming from are no longer the primary concerns of most people in the Western and developed countries.

Physical needs have been replaced in our modern society by existential needs. A person who eats, lives in, and wears the products of other people’s labor has to question his or her own existence, no matter how the value of those goods and services in terms of dollars, credits, or other forms of exchange was acquired. More importantly, without the requirement of spending every waking moment concerned with the fulfillment of those physical needs, what is the person going to do just to keep busy? The question “What are you going to be when you grow up?” becomes “What are you doing here in the first place?”

Some people have a specific answer to that question. They are usually the humans lucky enough to be born into a family with a tradition of productivity: the family farm, the family business, or a profession followed by parents and grandparents such as medicine, law, or engineering. These family situations set a child’s mind in a pattern of work, responsibility, and obligation.

Many people transfer the question of personal purpose to a higher authority. They know they are valuable and worth the food they eat, the shelter they inhabit, and the clothes they wear because their deity sets apart all human life as having such value. What they do in their day-to-day occupation or their role as homemaker and caregiver is secondary to this important and holy purpose.2

My own role, which I think came about from my maternal grandfather’s love of books and my own father’s lifelong interest in reading, is first that of perpetual student, then that of interpreter and explainer of life and the world, and finally that of storyteller. The family thought that, with my facility for languages, I would become a lawyer, like that same grandfather, but I lacked the aggressive instinct for courtroom battle. Instead, I became fascinated with stories themselves, with fictions that make more sense of the world than the daily lives we all encounter, with their power to sum up and explain the human condition. I spent my high school and college years learning the literature of my culture as an English major. This was not just the language but its use in the business of transmitting personal and cultural experience. I worked my entire professional life as a communicator. First, I was a book editor and technical editor, helping authors and engineers tell their stories in a coherent and pleasing manner. Then I was a technical writer, a speechwriter, and an internal communicator, telling about and explaining the business—whatever business I found myself in: engineering company, public utility, pharmaceutical company, or maker of genetic analysis equipment—to its operators and its other employees.

And all the while I knew that I was peripheral to that corporation and to society as a whole. The publishing business, in which I was a direct contributor to the end product, is a nice-to-have in a civilized society but not a need-to-have in the way that farmers, carpenters and masons, weavers and tailors, and the truck drivers who move their products to market are necessary to life. As a technical writer and internal communicator, I was not even central to the business function but a convenience to the employees who do the actual work and the managers who want to see it continue. As a novelist, I might directly bring my readers moments of interest and even joy—or at least a release from tedium while waiting for a bus—but I am not central to their lives.

I don’t regret any of this, and performing these peripheral functions has paid me well over the years. I’m one of the living examples that an English major does not necessarily have to teach or ask “Do you want fries with that?” But I also know that my function in society has not been critical to its operation. If I had disappeared years ago, no one would have starved, been made homeless, or gone naked to the elements. And when the end finally does come, I will know that my life has been an elaborate and complicated form of keeping busy.

But that’s more than some people have. And it may be better than chasing rabbits with a sharpened stick or pulling berries off a bush for a living. At least I never had to run from a leopard, either.

1. So you can imagine that the subjunctive mood was not a part of their speech patterns. There’s not a lot of need for expressing potential or counterfactual conditions—shoulds, woulds, coulds, oughts—when you’re chasing a rabbit with a sharpened stick or tasting a new and unfamiliar kind of berry for the first time. You do or you die.

2. Not having a personal god—nor even an abstract idea of any god—I cannot rely on this definition of personal value. Unless the thoughts of my brain are made real by writing them down and preserving them in my function as a student, explainer, interpreter, and ultimately a storyteller, I have no more personal dignity or right to life than a bacterium or a dung beetle.