Sunday, October 13, 2019

God and the U.S. Constitution

The U.S. Constitution

I am an acknowledged atheist. I don’t wear the label proudly or in rebellion. I know that this admission sets me apart from many of my friends and fellow human beings. I am an atheist not because I know a secret that others don’t, but because I lack a gene or a brain circuit or existential antenna that would let me commune with and feel the presence of God.

In my conception, then, even in this absent state, God—or Yahweh, Allah, Brahma, or any other name you use—is still a good idea. That is, for most people, the Existential Being that we revere and worship is a conception of goodness personified, something to strive for and emulate, a guide to right action and good thoughts, an inducement to calm and serenity.1

In my conception, this godhead exists in the human mind, is transmitted through spoken and written words, and in utter reality has no existence outside of human thinking, action, and cultural conventions. In other words, do I believe that an immaterial Being, a palpable Force, a Spirit or Intelligence, Creator of the universe as well as of the human form and mind, has an actual presence out beyond the stars and existing for all time and outside of time? For me, no. For others, possibly yes to probably maybe.

But god, godhead, the idea of a loving, creative, affirming presence seems to be part of the human psyche. I credit this to the fact that humans, with our brains and developed skulls larger in volume than the pelvic passages of our birth mothers, are born prematurely, with soft heads. We therefore require the loving attention of our parents through the first couple of years of our being. It’s not for nothing that we think of the godhead then as either or both a Sky Father and an Earth Mother. We needed these beings, both distant and close, from our first moments of consciousness.2

In the same way, the U.S. Constitution is a human idea—an ideal, if you will—that has no existence outside of the human mind and the actions it engenders. Like the idea of a god, it is formed in words and written down on parchment and printed in booklets, the same way that God is revealed in the Bible, the Quran, or other sacred texts. But the Constitution has no other physical presence or existence outside of the human mind. It has no force that the human mind does not give it.

And yet the Constitution has a powerful influence in American life. We refer our laws and practices back to it. We see it as the ultimate test of rightness for the American culture and virtues. It has stood for some 230 years. It has been amended twenty-seven times—the first ten of those at its very creation and called the “Bill of Rights.”3

Today, we seem to be undergoing either a schism or a reformation with regard to the U.S. Constitution. On the one hand are the originalists, who want to construe its language strictly, according to the words on the page and the body of legal practice surrounding the interpretation of such texts. On the other hand are the proponents of a “living Constitution,” who want to interpret the text in terms of the current culture and mores, and who dismiss the words on the page as reflecting the presumed intentions and attitudes of 18th-century white Europeans who could not have anticipated our technically advanced, multicultural, and supposedly superior view of law and justice. To the latter, the Constitution is a good start but needs work. The disagreement over interpretation seems to center on that same Bill of Rights and not so much on the basic structure of government laid down in the main text.

Although that governmental structure, too, is under subtle attack. Consider the movement away from laws being written in compact, comprehensible, easily analyzed form by the Legislative branch and merely enforced by the Executive branch—toward a Congress that writes long, abstruse bills full of intentions and to-be-desired end states, which are then left to the Administration and its Cabinet departments and alphabet agencies to interpret into regulations and laws. This isn’t so much a direct and reasoned attack on the Constitution and the government’s structure as a decades-long, bipartisan, and mostly lazy approach to sliding responsibilities around from one branch that has to fight for reelection every two or six years onto the other branch whose staff of bureaucrats and regulators is largely unelected and insulated from public criticism.4

And what happens when the popular belief in the rightness of the Constitution or the power of God goes away?

We can see in the sober words commonly attributed to G. K. Chesterton what abandoning the guiding notion of God and the principles of religion has wrought in our current culture: “When men choose not to believe in God, they do not thereafter believe in nothing, they then become capable of believing in anything.” Socialism, Communism, environmentalism (absent of, and even opposed to, human values), “the arc of history,” and every other -ism, doctrine, cult, and clever notion springs forth as a mainstay of human thinking. It’s not just that many of these doctrines are destructive, unstable, or unsustainable in the long term. They are also not as hopeful and sustaining in a person’s everyday life as the belief in a benign and loving presence. Prayer offers the believer, at the very least, a little daily chat with a presumed intelligence that is stronger, wiser, and more forgiving than the believer him- or herself might personally be. Adherence to the tenets of pure social justice, environmental sacrifice, or some other collective doctrine usually entails a bitter denial of personal hopes.

In the same way, if we abandon the strict construction of the U.S. Constitution, its Bill of Rights, and the various modifying Amendments, we are left with the personal opinions of competing politicians—and those guided by some social, political, or economic theory with which the rest of us might not agree. The Constitution and its constituent Amendments are remarkably silent on issues of social, political, and economic theory, other than support for the individual in particular and the people in general against the tyranny of the state or the majority in control of the government. It’s a pretty lenient and forgiving structure; other systems of government are a lot more aggressive and demanding.

But again, in my conception, none of this is dependent on outside, impersonal forces. Both God and the Constitution are human creations, operating wholly within the scope of the human mind, and having effect only through the interactions among those who so believe in the first place.

These are, ultimately, fragile things.

1. Unless, of course, you worship Satan, the Devil, Baron Samedi, or some other dark force—and then your heart is in a different place from mine.

2. I imagine that a race of intelligent sea turtles would have a very different conception of God. They are abandoned by their mothers as a clutch of eggs in the sand, are warmed by an invisible Sun, hatch by the light of the Moon, and scramble across dry land to find the ocean, pursued all the time by hungry gulls and other predators. I imagine their God would not be caring or life-giving at all, but cold and distant like the Moon itself.

3. That is, the rights that citizens have above, beyond, and preceding the Constitution, as inalienable rights that come from some source—God, perhaps?—greater than the state or the federal union and the document that binds that union.

4. This is a basic problem with democracy. Ultimately, the people who have to run for office must seek approval from the voters. You do this in one of two ways: offer favors, projects, and advantages that other politicians can’t or won’t offer; or avoid association with damaging and restrictive laws and their effects that the voters won’t like. Promise the sky but avoid the whirlwind and the lightning.

Sunday, October 6, 2019

The Myth of Powered Armor


Boston Dynamics robot “Atlas”

I have been re-reading William Barton’s When Heaven Fell, about Earth and the rest of the galaxy being overrun by our computer overlords with the assistance of the finest fighters among the universe’s species—humans, it turns out, among them. In the novel, powered armor with semi-magical properties of endurance, impenetrability, stability, and power supply plays a big role in battles, turning frail human bodies into invincible supermen, something like the comic-book hero Iron Man.

Soldiers in the field do need some kind of protection against shrapnel and spent rounds, like helmets and sometimes a “bulletproof” vest. This is all light, unpowered body armor, more a guard against scrapes and cuts, concussion and sucking chest wounds, than invincible Iron Man stuff. But the impact of large rounds and high explosives will defeat the purpose of anything you can feasibly wear on your body. We have known since World War I that anytime a human crew has to move into a shitstorm of firepower, you put them in an armored vehicle: a tank or personnel carrier. And that was a hundred years ago.

Today, we try not to put people into these situations at all. If there is immediate, lethal, unrecoverable danger, we will send a drone or a robot. With current technology, a robot’s responses are limited, so we use them in static situations like investigating unexploded bombs and entering buildings that are about to collapse for surveillance purposes. But as robots and automata become more agile, dexterous, and intelligent—through the marriage of AI to machine bodies—we will use them in more active roles.

We already fly unmanned aerial vehicles, UAVs, by remote control to perform surveillance and ground-attack missions like bombing and strafing where a human pilot’s endurance, attention span, and bladder capacity might not hold out in the cockpit. Instead, the pilot sits on the ground, in a comfortable chair equipped with a videogame console, and flies the vehicle. If he or she needs to take a “bio break” (as we used to call it at my old biotech company), the pilot can pass control to another human or put the machine on autopilot. With satellite communications, the pilot does not have to be in the same country, not even in the same hemisphere, as the active machine itself.

And that’s just today. Again, with the mating of AI and better machines, we are not far from fielding a legion of robot soldiers that can act independently—or under the satellite control of a human expert sitting comfortably in a chair—to conduct our wars. One of the great lies told to American audiences came in one of the Star Wars prequels where the fighting robots were depicted as fragile imbeciles. One blaster shot, or even a sharp jab with a human elbow, and they fell apart. They moved stiffly and stupidly, and they were mowed down by ranks in combat. Maybe war is stupid, but its machines won’t be. The robot warrior of the future will be more like the Terminator and less like an animated broomstick.

The trend away from putting humans in harm’s way started, I think, in the 1970s. We sent people into space, into orbit, and to the Moon in the 1960s and early ’70s as a matter of national pride and endurance—and it was glorious. But we never went back, and not for reasons of economics or public boredom. We have no reason to go back to the Moon in human-rated ships and spacesuits. What we need to know, we can find out with satellites in orbit and robots on the surface, which are far more robust, cover more ground, and don’t suffer from inattention and boredom. Sending humans to do observation, research, and gee-whiz science experiments—dropping a feather and a hammer in the vacuum on the Moon’s surface—is valuable only for the “you are there” experience. Humans looking through camera lenses and probing with sophisticated equipment whose reach exceeds our five senses can cover more ground at less expense.

It’s even less important to send human beings to Mars and the other planets: other than astronomical, atmospheric, and geological studies—plus the off chance of discovering traces of life—we have no pressing economic need to colonize these distant and hostile environments. Not when so much of our home planet is easier to get to, make habitable, and profitably exploit for a much larger proportion of our population.1 It would be easier to level the top of Mount Everest and then build and supply a five-star hotel with Olympic swimming pool there than to put a habitat on Mars. Actually living on Mars, after the novelty and the science experiments had worn out, would be tedious beyond belief.2

When we need to go back to the Moon, or on to Mars or Alpha Centauri, to establish a human base and the preconditions for colonial expansion, we will do it handily. We will build on what we learned in the 1960s Space Program and in other endeavors like deep-sea exploration, cybernetics, and our robotic explorations. Sending and attending to human bodies will then be a subset of existing technologies. But human beings living in domes or underground, visiting the surface only with robots or in spacesuits—think mobile, compact, Earthlike habitats—will not be the point. Eventually, we will have to terraform the planet or moon.

This is not an easy prospect. Terraforming first of all implies a breathable atmosphere. We humans like 78% nitrogen, 21% oxygen, and 0.9% argon or other inert, nonpoisonous trace gases. We like a surface pressure of 14 pounds per square inch, although that’s negotiable within a few pounds either way. And our lungs are pretty delicate, so we don’t like nonnegligible amounts of fine dust or spores or hostile proteinaceous substances. The Earth can hold this atmospheric mix at this pressure because of the planet’s size, volume, and density, and so its mass and surface gravity.

The Moon can’t hold on to much of any atmosphere because its gravity is one-sixth that of Earth: all of the gases—which are in constant motion on a warm body, especially under the bombardment of the solar wind—take one bounce, exceed the Moon’s escape velocity, and fly off into interplanetary space, leaving a vacuum. Mars, with a surface gravity only two-fifths that of Earth, can’t hold on to any molecule much lighter than carbon dioxide. And even with that heavy molecule—which flows down onto the floor and into depressions when released on Earth—the Red Planet supports an atmosphere with a pressure less than 1% that of our home. Think of flying a jet up to about 80,000 feet—more than twice the height of Mount Everest—and opening a window; that’s the atmosphere of Mars.
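
To put rough numbers behind that bounce-and-escape picture, here is a minimal back-of-envelope sketch in Python. The masses, radii, and surface temperatures are round figures I have assumed, not values from the essay, and the rule of thumb in the comments (a gas tends to leak away over geologic time unless escape velocity is several times the molecules’ thermal speed) deliberately ignores solar-wind stripping, so Mars comes out looking more retentive than it has actually proven to be.

```python
# Compare each body's escape velocity with the RMS thermal speed of a few
# common gas molecules. A crude rule of thumb: a gas bleeds away over geologic
# time unless escape velocity is at least several times (roughly six) the
# thermal speed. Solar-wind stripping, which matters for the Moon and Mars,
# is ignored here.
import math

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
K_B = 1.381e-23    # Boltzmann constant, J/K
AMU = 1.661e-27    # atomic mass unit, kg

# (mass in kg, radius in m, rough assumed surface temperature in K)
bodies = {
    "Earth": (5.97e24, 6.371e6, 288),
    "Mars":  (6.42e23, 3.390e6, 210),
    "Moon":  (7.35e22, 1.737e6, 250),
}
gases = {"N2": 28, "O2": 32, "CO2": 44}   # molecular weights in atomic mass units

for name, (mass, radius, temp) in bodies.items():
    v_esc = math.sqrt(2 * G * mass / radius)                # escape velocity, m/s
    print(f"{name}: escape velocity {v_esc / 1000:.1f} km/s")
    for gas, weight in gases.items():
        v_rms = math.sqrt(3 * K_B * temp / (weight * AMU))  # RMS thermal speed, m/s
        print(f"  {gas}: thermal speed {v_rms:.0f} m/s, "
              f"escape/thermal ratio {v_esc / v_rms:.1f}")
```

Run it and the Moon’s ratios hover around five or six for everything, which is why it keeps nothing, while Earth’s sit above twenty.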

Terraforming any planet that does not have Earthlike geodetic specifications3 would require adding significant mass to the planet—something we cannot currently do—or running the atmospheric processors overtime at high pressure to make up for the gases lost to escape velocity. Either way, it’s going to be an expensive proposition, and the result will probably not be stable enough for people to bet their lives on.

No, human beings are not going to become super soldiers in their own frail bodies, or go out among the stars in their shirtsleeves, until we can solve some intractable problems of materials, mass, and energy supply. But when we do solve those problems, the sky will no longer be the limit.

1. And as for mining the asteroids for their mineral riches, when the time comes it will be much better to send intelligent machines than clumsy and inattentive human beings in spacesuits.

2. Oh, we’ll do it. After all, we have a continuous human presence at various national bases on Antarctica, and there is still scientific work to do. But no one moves there to live—to colonize the continent, make a life, and raise the next generation of “Antarcticans.”

3. And then there’s Venus, which is the most Earthlike of this solar system’s planets. But, because of its atmospheric composition and fractionally greater proximity to the Sun, the atmosphere is a high-pressure Hell composed of hot carbon dioxide laced with clouds of sulfuric acid.

Sunday, September 29, 2019

Growing Old

Warrior in snow

Warning! This posting is not going to be “happy thoughts.” So, if you’re at all depressed or suicidal, look away!

As I enter the eighth decade of my life, and after losing my wife of forty years just two years ago, I have come to a certain realization: the future, for me, is not what it used to be. This is not a good thought, but it’s also not something I can put away as idle rumination. It’s staring me in the face.

Ever since childhood, the direction of my life has been forward. I have anticipated, planned for, expected, and tried to divine the next step: next job, next project, next vacation, next holiday, next week, next day, next meal … next stage in my life.1 My head, my thinking, have all been about what’s coming. And yes, you can to some extent foretell the future, because you live in the prefrontal lobe of your brain where the planning and scheming parts reside. You can make plans for tonight, tomorrow, next week, and even next month and year, so long as you allow a marginal calculation for uncertainty, the unexpected, the occasional accident, and even tragedy.

I have lived so much forward, in the future, that memory, for me, is a hazy thing. I know generally what I was doing in 1988. I can remember details of past jobs, projects, and vacation trips, mostly as random flashbacks. But the exact sequencing and duration are a blur. And as for what I had for dinner last week, that’s a guess. The past has no special meaning to me, except as the foundation for what is to come.

But now … With seven-eighths or nine-tenths or whatever amount of my life now behind me, the future is more and more foreclosed. All I’m supposed to have left is memory—and that’s still somewhat hazy, because my natural direction is still forward. But now that forward is into grayness and eventually darkness.

I knew from about my junior high school days that I was never going to fulfill every little boy’s dream of becoming President. I knew from sometime in high school that I was never going to be a serious professional like a lawyer or doctor, with an upward career course and riches assured. I knew from that terrible night when the Selective Service introduced the draft lottery during my senior year in college that I would never be a soldier, unless I wanted to enlist.2

After an adolescence and early twenties with no serious romantic attachments or prospects, I began to suspect that I would live my life as a single man. And then I met Irene and the course of my life took a sudden turn. She introduced me to many things, including traveling for pleasure, going to the theater, Celtic music, cooking and entertaining as an art form, a broader political scope, and a new sense of family. She made me into a different person. But now I am single again, and my prospects for another life companion are dwindling to the point of extinction. That part of my life is just memories—disjointed ones at that.

Now, although my body is relatively healthy and strong, I am beginning to appreciate some of the things I once could do without a thought. Now I need to plan for them and allow time for recovery. For instance, a week or so ago I took a long motorcycle ride into the mountains: 375 miles from the Bay Area up to Lake Tahoe and back in one day. I used to make this trip with friends and think nothing of it the next day. Now I needed a day and a half of idleness, sitting in my chair, reading and dozing—plus a couple of aspirin for the strain in my right wrist—in order to recover my poise and balance. An airplane flight across country is wearing, and the twelve-hour flight to Europe is daunting because of the seats, the limited legroom, the cramping, and the dehydration.

So I am beginning to foreclose the idea of future travels to either Europe or Asia. And I suspect that long motorcycle rides—especially the cherished idea of a road trip of several days on the bike, off to see the Northwest or other parts of this great country—may no longer be possible.

I still have family and friends, and I see them about as often as ever. So far, I have been spared the cycle of deaths that afflicts older people. I believe that’s because we have all been leading healthier lives since we were young: vitamins, nutrition, exercise, the Surgeon General’s report, and a health consciousness that our parents never had. I still do my karate katas, although sometimes less frequently when I get busy or when my back hurts in the morning. I watch what I eat, but not as much as I should. My morning pills still comprise more supplements than actual medicines. But I know that all of this will slowly change as my eighth decade slides into my ninth.

I still have some new books in me, along with the mental energy and focus to write them. The two ME novels need a third book, I think, to go along with the third books that I have written for The Judge’s Daughter and its sequel, and for The Children of Possibility and its prequel.3 And after that I have a half-dozen other story ideas to develop. If I can produce a new book every year or so, these ideas will take me a long way through this decade. So that’s a future to hang my hopes on.

Growing up is a hard transition. You put your childhood dreams and ambitions at risk against the development of your actual skills, intelligence, endurance, energy level, and personal prospects. Going from youth to middle age is hard, too. You adjust to a new set of dreams and ambitions, with a certain amount of bitterness as you gain an understanding of how the world really works and regret for your own lost naïveté. And finally, growing old is hard. If you’re lucky, you may have achieved a stable place in life—financially, emotionally, spiritually, and politically—but now you take your store of knowledge and skills, your preserved wisdom, your surviving sense of wonder, and your failing expectations into the long twilight that precedes night.

But there is no sense in having regrets. And no sense, either, in fearing the inevitable slide and the dark pit that awaits you at the bottom. All you can do is buckle your belt, tighten your laces, check the loadout of your weapons, sharpen your wits …

And move forward as if everything is still the same. Because nothing has really changed.

1. Everything except the next woman, the next romance. Irene sealed all that for me. We were an item.

2. With a lottery number of 347, I took a big sidestep away from my generation’s war: Vietnam.

3. I don’t call them “series,” because that presupposes a dedication to producing a whole string of books. Let’s call them “groupings.”

Sunday, September 22, 2019

War No More

War devastation

War is innate to the human species. It’s part of our hierarchical social structure and goes back to our heritage as primates in the monkey troop. Anytime a group of human beings gets large enough—whether tribe, city-state, country, or nation—to have common interests—whether political, social, economic, or cultural—among its members not related by blood, you will find an instinct to defend those interests. We will defend them against competing socioeconomic or national groups, whether human or not. It’s in our nature.

The apparent exception to this would seem to be the U.S. involvement in the two world wars of the last century. In each case, there was a strong isolationist, America First sentiment that kept us out of the first war for three years and out of the second war for two. Both of those wars were European and, in the case of the second, European and East Asian affairs.1 America had economic ties to both sides and good reason to stay neutral—at least until the war finished up in each area and had the potential to cross one or the other ocean to reach us. But we had strong cultural affinity with the Anglo-French side of the first war (“Lafayette, we are here!”), and we were attacked in Hawaii—our territory, not yet a state—in the second. That attack was a blunder on the part of the Japanese, but it gave us an excuse to follow up on our cultural affinity with the British and Australians in both theaters. And, given the expansionist nature of German and Japanese foreign policies, we suspected they both would sooner or later be coming after us.

So yes, even when the U.S. is fighting on foreign soil without a direct and immediate threat to our irreducible national interests, we still feel the need to go to war.

War has become a terrible thing. It is no longer a matter of trained soldiers killing each other with swords and spears, slings and arrows, and occasionally torching the villages, raping the women, and slaughtering the children of the vanquished side. War has become a mechanized business that, with the invention of saturation bombing and nuclear weapons, threatens to obliterate whole civilian populations and render present military capability obsolete.

And yet, terrible as the nature of war has become, no human society is prepared to give it up. We become inured to terror; we try to put rules on the nature of conflict; we hope that some spark of human decency—or at least self-interest—will help us refrain from global annihilation. But we are not prepared to give up the idea of war altogether.

And there is a reason for that. War—taking up arms, banding together, and preparing to go down fighting—is the last resort of any group. It is what you do when there is no alternative.

After those two world wars, enlightened minds in the West conceived of a grand, globally supported council that would keep this atrocity from ever happening again. The League of Nations after World War I and the United Nations after World War II were both supposed to be places for reconciling national differences short of war. The League died quickly, and the United Nations limps along as a debating society, public “conscience,” occasional source of toothless “peacekeepers,” and promoter of international good works through non-governmental organizations (NGOs) to less developed countries. But no one actually goes to the Security Council for redress of grievances, and no nation would adopt resolutions and legislation from the General Assembly in place of its own laws.2

Under the pressure of the Cold War and potential nuclear annihilation, many science-fiction memes arose about alternatives to war. In one Star Trek episode, for example, the governments of two planets locked in an interminable war agreed not to fight but instead to feed their battle strategies into a common computer. The computer would dispassionately consider all the factors, declare a winner, and then assign civilian casualties on each side. And each side pledged to send that many docile people into mutually monitored disintegration chambers, rather than risk a further outbreak of their devastating war.

In other scenarios, authors have suggested that instead of fighting and dying, nations resolve their differences through games. Pitting our best chess or go player against their best and pledging to stand down and honor the result was seen as a way to avoid war’s devastation. Of course, that system would be open to cheating: one side might, for instance, hire a mercenary chess master to represent it. And any other game with less than perfect knowledge—for example, poker with its capacity for bluffing, or Risk® with its reliance on dice rolling—would have the potential for outright deception and cheating. Not to mention bad sportsmanship.

In ancient times, two armies facing one another might put forward a champion, their best fighter, to take on and beat the other army’s best. But how many times did the side whose champion fell in single combat quietly put down their spears and relinquish the ground? Maybe, sometimes, if it was a Sunday afternoon’s entertainment, or if both armies were out to claim a disputed valley that neither side really wanted but they didn’t care to let the opposing side have. But if national integrity, right of survival, hearth and home were at stake, the side with the fallen champion would gird their loins and prepare for all-out battle. Because the stakes were important.

Alternatives to war like chess games, poker games, and other personal battles are too easy to lose—but also too easy to declare and try to win. And the stakes are not always clear. Sure, in the case of nuclear war, the stakes are your life and your family’s, plus the ground you stand on, all reduced to a sheet of radioactive glass thirty miles across. But if war is easy and painless, based on achieving a checkmate or turning over the right cards, then where does it end? Does Belgium declare war on Germany for the right to rule Europe—on the basis of a game? And if they lose, are the Belgian people prepared to become German vassals? Or march passively into cremation chambers?

Not bloody likely.

The same would go for the rulings of any supposed international court of justice or the decisions of some international executive or legislative body. Would Belgium give up territory, or its national sovereignty, just because Germany brought suit in The Hague? Or because of a ruling in the U.N. Security Council? How has that worked out in Palestine?

Not all wars have to happen. Not many wars should happen. Not if human beings were rational, sober, honest, sympathetic, charitable creatures. Not if human beings took the long-term, Olympian view of gods or angels. But we aren’t and we don’t. If we all were as rational as Vulcans and dispassionate as Buddhist monks, we would no longer be human but something else.

Maybe that “something else” wouldn’t need war. Maybe that something else could look at enemies pouring across the border with weapons of mass destruction or intolerable social, cultural, or economic demands, and resign themselves to giving up gracefully and peacefully. Maybe they would bend and bare their necks, rather than stoop to pick up a sword or a sharp rock and fight back.3 But they wouldn’t be human.

War is the last resort. War is the resolution when policy, diplomacy, and negotiated advantage fail, when political, social, economic, and cultural survival are at stake, when the alternative is worse than dying in a fight. And that is our human heritage.

1. In the case of Europe, World War I (“the Great War”) was a total disaster. The political, economic, and social differences between the Allies (Britain, France, Russia, Italy) and the Central Powers (Germany, Austria-Hungary, Bulgaria, the Ottoman Empire) were mostly minor and potentially reconcilable. A lot of the buildup to war had to do with feelings of national honor between the sovereigns—even though the English King, the German Kaiser, and the Russian Tsar were all related by family—plus some cultural and economic differences. It was a war that didn’t need to happen, except that Europe hadn’t had a good rousing war—other than the Crimean, and that didn’t count—since the days of Napoleon a century before and … it was time.

2. Can you imagine the U.S. allowing a General Assembly that is dominated by a coalition of Muslim states to tell us what to do with our Jewish population?

3. And maybe that would be a good thing: human beings as docile and careless as cattle walking up the chute to the knacker man. They would be a lot less trouble as a group and a lot easier to control. But they wouldn’t be human.

Sunday, September 15, 2019

Where Are They?

Starfield

It’s called the Fermi paradox, after nuclear physicist Enrico Fermi: if the galaxy is host to billions of stars and potentially millions of planets, a huge number of them capable of generating intelligent life, then why don’t we see signs of that life? Why haven’t we been visited? Where are the signals from planets broadcasting their own versions of the evening news, I Love Lucy, and Xenobia’s Got Talent out into the universe? Where are the space aliens?

Personally, I think it’s a naïve question. Considering the time scales involved, and then the distances, I would be very surprised if we ever met intelligent life forms other than ourselves and possibly other species on this planet like whales, dolphins, elephants, and gorillas.

Our solar system—the Sun, the Earth, and other planets—formed about four and a half billion years ago. In a universe with an imputed age1 of nearly fourteen billion years, that’s about a third of the lifetime of the existence of everything we have ever known or can know. Our Sun is then either a third- or fourth-generation star—which is to be expected, since all the metals in the dust cloud that formed our system had to be born in the fusion reactions of earlier stars (for elements lighter than iron) or in the compression forces of supernovae (for the heavier elements). Given the generous amounts of lead, gold, and uranium found in just the Earth’s crust, and the huge proportion of iron and nickel that makes up its core, fourth generation sounds about right.

Earlier generations of stars would be different from ours. The clouds of dust and gas from which they formed would have had a much lower content of metals and of the minerals we value, like the silicon that makes up most of our soils and rocks, and the carbon, calcium, phosphorus, and other light elements that make up most of our bodies. Earlier-generation stars, which comprise the first two-thirds of the time span of the universe, would not be likely to have any kind of life that we would recognize. Even if they had plentiful water on planets within their solar system’s habitable zone, where that water remained liquid most of the time, they would still lack an abundance of the complex chemicals that support our kind of life.2

So we are only likely to find creatures whom we would view as intelligent, organized in complex, multi-cellular forms, and capable of recognizable communication out among the planets in solar systems of our own generation.

Now consider that, while we have evidence of life on this planet going back some three and a half billion of those four and a half billion years of the Earth’s life-span, such life is mostly in the form of bacteria and blue-green algae. These microbial forms were necessary to create the oxygen-rich atmosphere and sustainable ocean waters that allowed higher forms of life to develop. And those higher forms took a long time to develop. The Earth did not see the birth of multi-celled creatures, with nucleated DNA and differentiated cells and organs, until just half a billion—five hundred million—years ago. That was an explosion of life, to be sure, but none of it was what we’d call familiar or friendly.

Our kind of life did not come out of the oceans onto dry land until about three hundred million years ago, give or take—at least for the animals. The plants had taken over the land sometime earlier. But no one looking for extraterrestrial life forms is expecting to find plants.

Life on the land flourished and developed to fill all the available environmental niches: plant eating, flesh eating, walking, burrowing, and flying. Almost all of it was reasonably intelligent, if you consider intelligence as a spectrum and not the special capacity of H. sapiens. Frogs, turtles, lizards, reptiles including the dinosaurs and birds, and the early mammals all had the cerebral capacity to find food for themselves, recognize and move away from danger and toward comfort and safety, and bear and occasionally nurture their young. The survivors among those creatures today—think of dolphins, elephants, dogs, and cats—are able to recognize and respond to human beings and even understand some of our verbal commands.

That does not mean that any of these life forms—going back through the history of our planet—was capable of building a radio or television station able to broadcast the evening news or I Love Lucy out to the stars. Even human beings, who have been around for at least the past hundred thousand years—a mere fraction of the domain of life on this planet—have only had that capacity for the last hundred years or so. And, for all our science and technology, we have only sent a handful of probes out beyond our solar system and only landed human representatives half a dozen times on our nearest celestial neighbor, our own Moon, at a distance of a mere 240,000 miles.

So if it took life in this fourth-generation star system this long to get to the very edge of interstellar space, why would we expect so much more from the life on other, similar planets? For one thing, though, that other life might be a lot older. A few million years of difference in the time frame of stellar formation might put the putative human beings of some not-too-distant planet a million years ahead of us. Imagine what we could achieve in a million years beyond the Apollo Moon landings, the deciphering of the human genome, or the invention of radio astronomy. But then, we might also be a million years ahead of our nearest interstellar neighbors. They might still be in their equivalent of the early Pleistocene, with the closest thing to a human mind still roaming the savannah and hooting at their moon, like our hominid ancestors.

And then there is the question of distance. If the speed of light is an immutable boundary, and the currently available forms of energy are all that humankind will ever invent or discover, then crossing the gulf to our nearest stellar neighbor, the Alpha Centauri group, is going to take a long time. Chemical rockets shooting out reaction mass just can’t achieve the speeds needed to cover such a distance in less than many human generations. Travel by interdimensional means, such as artificial wormholes and warp drives,3 is still a figment from the minds of science fiction authors. There may still be some magical—in the sense of Arthur C. Clarke’s “Any sufficiently advanced technology is indistinguishable from magic”—means of energy release, like Star Trek’s controllable matter-antimatter reactions, and some fantastic reactionless drive, like the gravity polarizer from Larry Niven’s Known Space universe. Maybe one day, in a lot less than a million years, we will discover such wonders. After all, it’s only recently that we’ve stumbled upon materials that work as high-temperature superconductors.
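
To make “many human generations” concrete, here is a minimal sketch of that crossing time at the best speed our reaction rockets have actually delivered, using Voyager 1’s roughly 17 kilometers per second and the standard 4.37 light-year distance to the Alpha Centauri group; both figures are stand-ins of mine, not numbers from the essay.

```python
# Crossing time to the Alpha Centauri group at a chemical-rocket speed,
# using Voyager 1's roughly 17 km/s as a generous benchmark.
LY_IN_KM = 9.461e12                  # one light-year in kilometers
distance_km = 4.37 * LY_IN_KM        # distance to the Alpha Centauri group
speed_km_per_s = 17.0                # roughly Voyager 1's departure speed

years = distance_km / speed_km_per_s / (3600 * 24 * 365.25)
print(f"Crossing time: about {years:,.0f} years")   # on the order of 77,000 years
```

Seventy-some thousand years is a few thousand human generations, which is the point.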

But until we make some significant scientific breakthroughs, the distance between the stars is still a limiting factor. We aren’t getting there—and by “there,” I mean someplace more friendly and habitable than Alpha Centauri—anytime soon. And we can’t expect intelligent aliens at our level of technology or even a bit more advanced to come visiting here.

But what about those radio and television signals? Shouldn’t we have heard the alien versions of the evening news and I Love Lucy by now? After all, we’ve been listening with radio telescopes to every likely star for more than sixty years, and analyzing signals with the SETI (Search for Extraterrestrial Intelligence) program for almost as long. And yet not a peep.

Well, there’s that distance thing again. Our radio and television programs—and presumably theirs—are all sent by broadcast: the radio or television station sends out a signal in all directions. Any such signal is going to be subject to the inverse-square law, which says that signal strength diminishes with the square of the distance. By the time one of our programs—or one from the alien broadcasting network—reached past Alpha Centauri, its signal would be barely a mouse squeak, drowned out by the clanging together of two hydrogen atoms in space. And our signals—and presumably theirs—are not that strong to begin with, usually about fifty thousand watts on Earth, because we limit the power of broadcast stations to prevent them from overpowering other users in the same band of the electromagnetic spectrum. And, finally, many of our broadcasts—certainly in the UHF and VHF television bands—are now converting to cable, or to signals beamed by geosynchronous satellites down toward the Earth. So they would become invisible to interstellar aliens.4
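
For a sense of just how small that mouse squeak gets, here is a quick sketch of the inverse-square falloff, taking the essay’s fifty thousand watts and assuming the station radiates equally in all directions toward a receiver at the standard 4.37 light-year distance of Alpha Centauri (the distance figure is my addition).

```python
# Received flux from a 50,000-watt broadcast at Alpha Centauri, assuming the
# power spreads evenly over a sphere (no antenna gain, no absorption en route).
import math

POWER_W = 50_000                     # the essay's typical broadcast power
LY_IN_M = 9.461e15                   # one light-year in meters
distance_m = 4.37 * LY_IN_M          # distance to Alpha Centauri

flux = POWER_W / (4 * math.pi * distance_m ** 2)   # watts per square meter
print(f"Flux at Alpha Centauri: {flux:.1e} W/m^2") # roughly 2e-30 W/m^2
```

That works out to roughly 2 × 10^-30 watts falling on each square meter of a receiving antenna, far fainter than anything our radio telescopes can pull out of the background.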

To have any chance of communicating with a civilization around another star, you would need to know where they are and then beam your signal—direct it with a parabolic antenna or similar device—to that location. Given the vastness of this galaxy, let alone the whole universe of galaxies, and the distribution of fourth-generation stars likely to have advanced civilizations, the chances of our planet crossing a coherent beam and interpreting it are vanishingly small.

And finally, there are accidents. Many science fiction authors and others responding to the Fermi paradox believe that advanced civilizations necessarily destroy themselves. These are minds bruised by catastrophic thinking: first the vision of a Cold War nuclear holocaust—something I and most of my generation grew up with—and then the succeeding visions of totalitarian takeover, population explosion, and climate collapse, and finally culminating in the presumed death of computerized commerce in the Y2K fright. The popular notion is that organized civilization is a force for stupidity and always poisons and destroys itself. But somehow we humans have managed to survive.

Still, there have been accidents beyond human control. The Earth has supposedly undergone five mass extinctions, whether from asteroid strikes like the Chicxulub meteor that took out the dinosaurs and most of the other animals on the planet, or from natural variations like Earth’s orbital perturbations. In the lifetime of the human species, we have experienced and survived the last of the great glaciations, which covered parts of the planet in ice a mile deep. Humanity is still in no position to ward off a large incoming asteroid, and our civilization might not survive another glaciation, although our species just might.

It doesn’t take the human stupidity of a nuclear war or the mismanagement of a population explosion to set back a developing planet. It happened five times around here. And who knows, without the Chicxulub event, the dinosaurs might eventually have grown smart and started building radios and rockets a couple of million years ago.

But given all these factors, I’m not surprised that we haven’t seen any aliens yet. We’ve only been thinking about them and listening for their signals in the last sixty years or so. And maybe we’re just not that interesting a species, or not yet, to make visiting us worthwhile.

1. Working backward through the expansion rate following the Big Bang—if, in the first place, you actually believe in a “Big Bang” and its consequences.

2. This would discount, of course, energy beings, ghosts from other and older dimensions, and similar denizens of the science fiction canon.

3. See Warp Drive from March 17, 2019.

4. Of course, some of our Earth stations communicating with satellites in geosynchronous orbit are beaming up, away from the planet’s surface and toward those satellites. Those transmissions might be strong enough and focused enough to travel through interstellar space and still be recognizable to alien intelligences—although they are not aimed at any particular star.

Sunday, September 8, 2019

Good Manners vs. Political Correctness

Political correctness

I first remember seeing the phrase “political correctness” sometime back in perhaps the 1980s.1 At first, as I remember things, it was used disparagingly: when someone objected to an off-color remark or a joke about women or minorities, their objection was dismissed as being “politically correct,” along with being “too sensitive.” That is, the term “P.C.” was generally used by the reactionary forces of the right against the progressive forces of the left.

Except … no thinking person of average sensibilities, even at the time, would defend making an off-color remark or joke that separated, isolated, and ridiculed a person or group based on non-relevant and inconsequential attributes like gender, skin color, ethnic background, or religious viewpoint. And if a person had an obvious and consequential mental or physical deformity, then the appropriate response was to overlook it or tacitly compensate for it, rather than making a point of it.

This was not a political position from one side of the aisle or the other; it was simply good manners. A well-brought-up person, a polite person, a civilized person did not make jokes at other people’s expense, did not point and stare when someone with an obvious physical or mental difference entered the room, did not consider all the possible human differences of gender, skin color, ethnic background, or religious viewpoint as subjects for levity, disdain, or even much notice and distinction. A person of good manners tried to make others feel welcome and comfortable, and strove to put them at their ease. Such a person extended good will and provisional respect to everyone within his or her circle—even within his or her vicinity—as a gesture of presumed equality. And that good will and respect remained in effect until and unless the person who was its object showed by word or action that he or she did not merit such presumption.

These were the attitudes of a gentleman or a lady. It was a code of honor. It was expected in polite society. And it worked. Of course, not everyone was brought up that way. A person needed parents and teachers, aunts and uncles, grandparents, and family friends who were kind and thoughtful themselves, who understood that youth is often a time of distinction-making and difference-mocking, and who moved positively and directly to instill the virtues of politeness and social blindness before that distinguishing and mocking became ingrained. Not everyone was fortunate enough to be raised in such a household. But enough of us were that most social encounters could be endured without rancor, screaming, and fisticuffs.

But there was also another dimension to this attitude of politeness. While one did not call out irrelevant distinctions and make fun of them, that also generally extended to relevant distinctions. A person might not personally practice or approve of—and might even detest—dubious or immoral pursuits like adultery, prostitution, promiscuity, incest, sodomy, pedophilia, gambling, gluttony, loan-sharking, and all the other old vices and/or antisocial behaviors that take place behind closed doors and might be known only by inference and rumor. But a well-brought-up person did not take notice of them. A person of good will tried to look past such negative and unseemly distinctions in the interest of social harmony, especially when the abhorred practice did not affect one’s own life and person directly. However, that did not mean a polite person had to approve of, condone, or celebrate such behavior.

Is this a form of hypocrisy? Of course it is. But it’s a trivial bit of social dishonesty because, again, the goal is to put others in the immediate vicinity at their ease, avoid discomfort, and extend a measure of good will—not to point and hoot and chase them out of the room.

Where political correctness has gone in the last forty years or so—and the teeth it has grown—is to the point that this sort of polite disinclination, social blindness, and hypocrisy are no longer allowed. It seems no longer possible to have one attitude in public but another in private. It is no longer a matter of manners to publicly ignore a vice or behavior that a person might disdain in his or her private thoughts. With the naïve intolerance of the very young, the advocates and practitioners of current political correctness appear to have decided that there can be no private thoughts. And since public ridicule of practices and opinions that many of us consider immoderate, vicious, or shameful would be impolite on the face of it, then we must all condone—no, celebrate and rejoice in—those negative distinctions.

Rather than presume commonality and lack of distinction in the people around us, today everyone is supposed to notice, presume, and celebrate distinction, whether relevant or not. This is a complete reversal of past norms of politeness.

Well, complete reversal for a start … Where the impulse toward political correctness has headed in the past few years seems to lead away from celebrating differences and toward denigrating2 the sameness that once was presumed and extended. The values of the once-dominant majority—middle class, heterosexual, Christian, and yes, traditionally European (e.g., from the north side of the Caucasus Mountains)—are now becoming objects of scorn. To assume any kind of norms or common values is considered backward and oppressive. The term “cis”—derived from the Latin, meaning on this side of, as in Cisalpine Gaul—has become a pejorative. “Cisgender,” as in conforming to the idea of traditional male and female attributes, behaviors, and attitudes, is derogatory.

All of this goes beyond good manners and becomes an attack on the structure of values themselves. I try not to become conspiracy-seeking and hysterical, but the current trend has gone too far. Instead of valuing the good will that people of good intentions once—in the Christian, Western Civilized, Enlightened tradition—extended to people who were different from themselves in trivial, non-relevant ways, political correctness was then urging the mainstream of our culture to value—even to prefer and celebrate—those differences, whether trivial or not, and whether they encompassed a moral distinction or not. And now the canons of political correctness appear to be attacking and denying all Christian, Western, Enlightened, and traditional moral values in favor of a value-free, nonjudgmental, and ethically and morally absent viewpoint. At the same time, persons operating from any other cultural, moral, or even biological perspective are invited to deny, blacken, and detest those once liberating values. And this is all in the name of refuting past oppressions, colonialism, mercantilism, consumerism, and any other nit you care to pick with what was once a remarkably successful, free, and accepting cultural tradition.

If this all ended up in some place that was more free, more accepting, and more successful, then I could accede to it. But I don’t see any kind of reciprocity here. Those different cultural, moral, or biological viewpoints are not bound to return the favor to the Christian, Western, Enlightened tradition on the basis of its past sins. And they also don’t generally extend the favor of politeness, acceptance, and social blindness to each other—or not that I can see. In fact, I cannot see that the current trend presumes the existence of any human values being shared and agreed to among the different cultural traditions and biological viewpoints. So the current political situation tends to favor centrifugal, disintegrating forces without the gravity and cultural cohesion that can hold a society together. Forcing things to fly apart and cheering as they depart, without the offer of a better set of values to replace them, is in my mind a recipe for disaster.

Is all of this planned? Is any of it planned? Probably not, in the sense that the Illuminati, the Fifth Column, or the Wobblies have worked it out in their secret, smoke-filled soviets and written it all down in a manifesto to which we will, every one of us, sooner or later, have to swear allegiance. And some of it is probably the natural tendency of human beings to grow bored with a system that generally works, trundles along producing social cohesion, and appears never to need the individual’s conscious preservation for it to keep working. Perpetual sameness and contentment are a breeding ground for contempt and the itch to fiddle.

But some of this cultural process serves a political purpose. And I think none of it is well-intended, liberating, or … polite.

1. It may have been used earlier, but it seems to me it came into common usage with its current meaning about thirty to forty years ago … which for many people is a lifetime ago and by now suggests “lost in the mists of the 20th century.”

2. If that’s even a word now. From the Latin de for a general negative and niger for black, it means to darken, cast aspersion, or deny value. And that use of color attribution itself has now become a racially loaded behavior.

Sunday, September 1, 2019

Survival

Climbing rope

Again, if living well is an art—and I believe it is—then managing your life and taking care of yourself are certainly an important part of that art. Note the active verbs here: “managing” and “taking.” Living is not a passive activity. Those who drift through life or expect others to take care of them—other than in instances of grave disability—cannot expect to have a good life.

When I was growing up, children were told they could be anything they wanted. And they were encouraged to “dream big.” After all, every little boy in America could “grow up to become President.”1 Perhaps today’s parents, teachers, and guidance counselors still tell children that. It would be a shame if they didn’t tell children to dream big dreams. But … there’s a caveat with that.

To get what you want, to become who you want, to live as you want—and not as other people command, direct, or allow—you have to scramble. You first need to dream, of course, but then you need to work, to do, to persevere, and to fight. And sometimes you have to do these things not just to have your dream job or preferred way of life. Sometimes you have to scramble and fight just to exist. But the alternative is death, either the slow and lingering death of the soul for lack of fulfillment, or the fast and hard death of the body for lack of eating and breathing.

I think too many people today, brought up in the richest, freest, most bountiful, most dream-inspired country in the world, believe that having what they want, living how they want, with the dream job, the inevitable success, and a comfortable living situation, with the big house, plentiful toys, and personal indulgence that go along with it—that all of this will be easy. That all a person has to do is show up, put in a maximum eight hours a day, and not screw up—or not too badly—and then success will be assured.

When I was growing up, everyone in the family thought I would follow in my maternal grandfather’s and great-grandfather’s footsteps and become a lawyer and perhaps one day a judge. I did not have the head for numbers that would have enabled me to follow my paternal grandfather and father into engineering. But I did have a certain facility with words and logic. The law seemed to be a natural fit until I also demonstrated a facility for writing and the imagination to tell stories. I began to think of myself as a fiction writer, even a novelist, after the fashion of a John Cheever, Robert Graves, or Herman Wouk. And my parents did not actively discourage this.

Other members of my extended family certainly did. Two friends of my grandfather, who were editors at the advertising trade magazine Printers’ Ink, advised him to warn me that very few people make a living by writing fiction, and this was echoed by my aunt, who also knew these friends and their negative opinion. Their estimate was that probably only ten or twelve people in the whole country—and this was in the late 1950s and early ’60s—lived solely by writing novels.2

On reflection, I can see the sense of this. A novelist, even one with a bestseller or two, cannot survive an entire career—a span of forty years or more, from the end of college at the traditional age of 22 until retirement at age 65—on the royalties from one or two bestselling books, even with movie tie-ins. A productive writer working in novels must produce a book every year or two. (A writer of short stories has to produce even more in terms of words, because the payout per story is lower and the market is actually smaller.) And then, not every book is going to be a bestseller, because the public is fickle and the competition is fierce. The most reliable way to make a living at writing—if you can—is to find a formula and stick to it. Think of Ian Fleming, who found his model early in the James Bond thrillers and pursued it through fourteen popular books, publishing one a year from 1953 to 1966. Or J.K. Rowling, who fashioned a series of books on the sequential school years of young wizard Harry Potter and his friends and pursued them with miraculous success.3

But still, that’s a handful of authors. And most writers would feel trapped writing book after book to a popular formula, as Stephen King suggested with his novel Misery. A creative writer naturally wants to branch out, try new forms, new genres, new characters in new situations. I certainly tried to do this, with books ranging from science fiction to literary fiction, and various novels based on history, computer science, biotechnology, and time travel. Doing the same old thing year after year is a kind of living death. So you try new things and experiment with genres and forms, but even with loyal readers, not every effort is going to be well received. Look at the difficulty Rowling has had with her novels set outside the Potter universe.

I also discovered that learning to write takes time and effort. Even someone with a big imagination and a facility with words needs to learn the craft of storytelling: how to structure and pace the story, where to place emphasis and what to leave to the reader’s imagination, how to inject the elements of character into action and dialogue, and on and on. I wrote my first novel—a dreadful space opera which could never be published—while in high school. That nailed down my ambition to become a published author. I studied English literature in college, which grounded me in the background of storytelling but did not teach me the necessary skills; those I had to pick up on my own by reading and analyzing current fiction.4

But when I graduated from college I discovered an awful truth: most twenty-something people have little to say.5 My knowledge of the actual world was limited to my career as a student in academia. And my knowledge of science fiction and fantasy was dependent on what others had written before me. It would take me at least ten years working in two different industries—first in engineering and construction, then at an electric and gas utility—before I had enough experience of the world and the real people in it to begin framing stories.

In the meantime, I had to scramble. I started in business by using my English degree to work in book publishing, but that’s a hard business with low salaries. I took that experience into technical editing at the engineering company, and from there I got into communications writing: doing newsletters, magazines, and promotional brochures. From engineering I went to the utility company, and then with a few science fiction novels published in paperback—plus a small inheritance from an uncle—I tried to make a living with my fiction writing. I never made more, in total advances and royalties, than one year’s salary from working at my day job. And I was slowly starving to death, because my talent would always be that of a midlist author: a writer with a small following and reliable but not remarkable annual sales on that book-a-year treadmill. And then the midlist died with the collapse of the traditional publishing business in the early 1990s. So I had to scramble again, working as a temp or contractor at a petroleum refinery, a waste disposal company, and finally at a pharmaceutical company. Only then could I translate that experience to direct-hire employment as, first, a document writer, then as internal communicator at the biotech firm. And that was my last regular day job before being forced into retirement.

In short, to pursue my dream of writing fiction, I had to use my writing and editing talents—that facility for words and logic—in a variety of roles at various industries, whose requirements I had to learn on the fly and understand in order to survive. I had to continually reinvent myself. And only now, in retirement with an income based partly on Social Security and partly on conservative investment of my 401(k) savings, can I live the dream of writing fiction full time. I still don’t make much money at it, because I have no talent for self-promotion and marketing, which are required for both independent authors and those who can still make a sale to traditional publishers.

It’s been a hell of a life. I’ve never actually wavered from my dedication to words, logic, and storytelling. And I’ve learned a tremendous amount about the way the world works—and how people survive and prosper in it—at every industry for which I’ve written. It was not exactly my dream on my terms. But it came close.

I hope every child in school today, when told to “dream big,” gets the same chances I had and can make similar choices to survive and prosper. Because the alternative is a slow death or a quick one.

1. It wasn’t the same for girls, of course, back in the fifties and sixties. They could grow up to become good wives and mothers. Or, if career-minded, they could become nurses, secretaries, or librarians—some nice, indoor job where they would be helping and nurturing other people. President, senator, astronaut, truck driver, baseball player, bricklayer—these were not in the cards for young girls. Thankfully, that has changed now.
    Oh, and a boy could not be President if he wasn’t a natural-born citizen and resident in the country for at least 14 years. And then he would have to wait until he was 35 years old. Other than that, it would help to have served a lifetime in politics and been rich, but those weren’t absolute requirements.

2. I also had an encouraging letter from the science fiction master Ray Bradbury, when I tried to send him a short story in high school. He declined to read it, but he did advise that if I wanted to be a writer, I should not go to college but instead get a job as a dishwasher and write, write, write every day, then submit and submit until my writing was accepted. Luckily, I did not follow this advice and did go to college, which was the basis for my being able to work at “day jobs” significantly better than washing dishes.

3. Some of the series and authors we think of as wildly successful—the Nancy Drew books by Carolyn Keene, the Hardy Boys series by Franklin W. Dixon—were actually collaborative works by a group of serial ghostwriters. Both of those young-adult series, and a number of other familiar titles like Tom Swift and the Bobbsey Twins, were originated and packaged by the publishing entrepreneur and writer Edward Stratemeyer. Other long-established and successful authors, like techno-thriller writer Tom Clancy, have extended their range by bringing on serial collaborators who often end up writing the whole book under the direction of the senior author. Early in my career, I wrote four books that way by arrangement with my publisher—and I was lucky enough to get my name on the cover, which many collaborators do not.

4. Some writers—even a few I know personally—have learned the craft by joining writers’ groups, attending formal, author-led classes, and going to workshops and retreats. I was never much of a joiner, and I found that what other authors, mostly amateurs themselves, had to say about writing tended only to limit—rather than expand—my sense of the possible. Maybe the workshop model works for some, but I have to explore my talent on my own, driven solely by my inner ear and my personal vision, and ultimately by what works.

5. Yes, there are brilliant young novelists who produce first bestsellers, but most of these books are about young adults coming of age. I was looking for an adult view of the world, like the fiction I tended to read. And I know from experience that any published “first novel” is generally the author’s third or fourth attempted manuscript. Look at Harper Lee’s first effort in Go Set a Watchman. The craft takes time to learn and many tries to master.

Sunday, August 25, 2019

Leadership

Conductor hands

If bringing people together is an art form—and I believe that helping people work efficiently together certainly is—then the highest level of that art is leadership.

A “leader” may be defined, in the broadest possible sense, as one who accomplishes work or effort through the active participation of others. This definition would immediately rule out, then, the sort of supervisor, manager, or even chief executive who is so smart, experienced, or goal driven that he or she shoulders aside subordinates who are trying to do a task and completes it him- or herself, so that it gets “done right.” Such a person is not a leader but a star performer surrounded by frustrated assistants.

This is not to say that the leader cannot take a hand in the effort. After all, when invaders are pouring in over the walls, everyone picks up a rifle and shoots. And yes, sometimes, the leader has to lay the first brick, make the first cut with the axe, or otherwise physically participate. This might be a ceremonial start to the project. Or it might be the leader demonstrating how he or she wants the task to be performed. And it may serve the psychological purpose of showing that the leader is not above getting his or her hands dirty. But for the most part, in day-to-day operations, the leader’s role is hands-off.

So how does a leader perform work “through the active participation of others”?

At first, it would seem that the leader simply gives orders. And this might work in situations and institutions where the command structure is clearly defined, enforced by rules and sanctions, and embedded by long-standing practice. Think of the Roman Catholic Church or the U.S. Army. Bishops have absolute authority over parish priests, and archbishops and cardinals have authority over bishops. Sergeants have absolute authority over privates, and lieutenants have authority over sergeants. And so on up and down the rigidly prescribed chain of command.

Except … a weak or vacillating bishop may have no effective control over the priests under him. A green or unskilled lieutenant may have no influence with an experienced, battle-hardened sergeant. Every organization, even the most hierarchical, has ways for underlings to evade or obstruct the orders they don’t like or can’t follow. In some cases, and with collusion, this might be a “white mutiny,” in which subordinates follow stupid orders to the letter until the whole effort—along with the command structure—just falls apart. And in the Vietnam War, there were supposedly incidents of “fragging,” in which soldiers rolled live grenades into the tents of incompetent officers whom they feared would get them killed in battle.

While simply giving orders can work in unusual or extreme situations, where other forms of leadership might take too long or be misunderstood, there is a further caution. In one of my favorite books, Dune by Frank Herbert, Duke Leto Atreides advises his son: “Give as few orders as possible. Once you’ve given orders on a subject, you must always give orders on that subject.” This is because people—especially those in a functioning command structure—can be fearfully literal-minded. They note and observe where a leader or a person up the line from them places his or her interest, attention, and focus. Written or verbal orders are specific and purposeful, and they will tend to be obeyed until rescinded or countermanded. And if the situation surrounding those orders changes in the slightest, the person receiving them will expect a new order amending the previous order. Otherwise, and fearful of doing something wrong or displeasing, they will follow the original order or wait, frozen, until the countermand arrives.

The trap with direct orders is always that they undermine personal initiative. “The chief said to do this and, so help me God, that’s what I’m doing.” Sometimes, perhaps with front-line soldiers in a shooting war, that kind of unthinking dedication and persistence is valuable. But few of us are engaged in a shooting war.

American business culture over the last seventy or eighty years has undergone a sea change. Coming out of World War II and into the boom times in which most of the returning soldiers found civilian jobs, the military-oriented, top-down, command-and-control structure was dominant. It worked to beat the Germans and the Japanese, so why not apply it to the assembly line and the sales floor? And that worked, too—so long as the boss was all-seeing and always right; the jobs were correctly structured and easily performed; and unusual situations were rarely encountered. It also worked because America was in its prime, being the last industrial country left standing, after Germany and Japan had been bombed flat, and England, France, and the rest of the world were exhausted by war. It was a good time for us Baby Boomers to grow up, because American business could make few lasting or damaging mistakes.

But soon, under pressure from rising global markets, advancing computer controls and information technology, and competition in a shrinking internal market, the rigid, top-down model started to falter. Employees who could only follow orders were not as nimble, flexible, or effective as those who had the initiative to identify problems and either fix them on the spot or recommend solutions to supervisors who tended to trust—rather than question—their judgment. Getting things right the first time on the assembly line avoids a lot of lost time in quality control, remediation, and recalls. Solving the customer’s problems on the sales floor avoids a lot of exchanges and restocking—not to mention loss of the customer’s good will.

The need for personal initiative has followed this cultural change back into the military. The modern army unit is no longer made up of people who shoulder a rifle, march in straight lines, and charge on command. The modern naval ship no longer needs teams of people who only pull on a rope when they are told. The modern battlefield is interconnected, data-heavy, and invisible to the naked eye. It is the place of “combined arms,” where soldiers on the ground coordinate intimately with air support, indirect artillery, and electronic intelligence sources. The modern soldier must be a team player, versed in complicated weapons systems—some of which he or she never personally handles—and able to react to a changing environment both flexibly and nimbly. The modern sailor is a technician operating one of the most complicated and versatile weapons systems ever made.

So how does a leader in either the modern business model or the modern military accomplish work “through the active participation of others”? Primarily, through articulating, setting, and enforcing values. Values are better guides to individual initiative than any set of standing or direct orders. Values last longer in the mind and memory. And when they are correctly stated and willingly adopted, they are more flexible in the face of emergent and changing situations.

The first value a leader must articulate, demonstrate, and encourage is dedication to and respect for training. The modern soldier or employee needs to know every aspect of his or her own job, and it helps the person take initiative if he or she also understands, and can on occasion fill in for, the jobs of the people on either side, with whom the employee deals directly, and of the people above, so that the employee understands the reason for and purpose of their orders and directions. Further, the employee should be educated in broader issues like the company’s or the unit’s mission, the needs of customers, the nature of the competition, and the factors leading to success. Some companies try to do this in blanket form with some sort of universal, homily-laden “mission statement” printed and laminated in posters that hang on the wall. But that’s the easy way out. If the mission is going to become a value embedded in the employee’s mind and heart, then it needs to be taught and refreshed at every opportunity.

The complementary value to training is trust: for the leader to trust that the employee has absorbed this knowledge and to encourage and reward the employee when he or she shows initiative. Without trust and encouragement, the employee will suspect that all of the training he or she has received was some sort of management gimmick. That Human Resources is manufacturing another “policy du jour,” a temporary thing like all the others, and what the company really wants is unquestioning obedience. That attitude taps into the vein of cynicism that lies not far beneath the skin of any alert and educated human being. And that sort of cynicism is poison to personal initiative.

Ultimately then, the leader establishes performance standards. This is the hands-on part of the job. The leader voices these directly: “This is how we do things.” “Here is what I expect.” And the leader rewards and publicly praises good performance: “That was well done.” “This is what success looks like.” Setting standards is not a do-once-and-done process. It cannot be achieved with a memo or a motivational poster. It is an active, repetitive, daily flow of discovery, example setting, and encouragement. The leader looks for occasions to teach and train to standards. Only with repetition and reinforcement do the values in all their varied applications become part of the team’s understanding and sense of purpose.

Modern leadership is more about persuasion than power. It treats people as willing participants with minds of their own, rather than as slaves, servants, or “meat robots” to be programmed. It is a personal relationship between supervisor and employee, between officer and soldier or sailor. It is built on mutual trust and respect, rather than on fear and obedience. And it takes conscious effort and an appreciation for the science of psychology and the art of communication to establish this relationship.

But once the bond is made, it is stronger than any other. People will literally die for it.

Sunday, August 18, 2019

Grace

Graceful hands

If living is an art form—and I believe that living well certainly is—then the highest level of that art is what I call “grace.”

In religious terms, grace means having the favor of, making oneself pleasing to, or benefiting from the generosity of God. These are acceptable meanings, but not the subject of this discussion. Religious grace is a state that comes from, or is predicated on, or defined by the outside influence of the Almighty. Similarly, in archaic terms, a sovereign might grant a pensioner or a loyal servant grace along with some form of bounty. But again, I’m not talking about the grace that comes from outside.

A person of any religious or political stripe has grace, and can be considered gracious, if he or she adopts a particular set of values and their corresponding behaviors. So let me attempt to list them.

First of all, a gracious person is interested in and sympathetic toward others. He or she lives outside of him- or herself. Such a person, when approached, offers a smile; when asked for an opinion, offers encouragement and support. He or she cares about the impression made upon other people. More important, he or she wants other people to feel secure, comforted, and at ease. The gracious person creates an atmosphere of calm and contentment.

This is not to say that a person with grace has no inner life, or sacrifices everything for the people in his or her immediate vicinity, or never says “no” to a request. But any necessary “no” is always framed in terms of regret rather than righteousness. The protection that a gracious person puts in place around his or her inner life and personal substance is understated if not actually invisible. This is not deceit but an advanced form of minding one’s own business.

Second, and corollary to the first, the gracious person is generous. The old-fashioned word is “magnanimous.”1 When confronted with a direct appeal, he or she generally gives what is asked and a bit more. Or, if direct giving is not possible due to the state of his or her purse and other resources, the gracious person goes out of his or her way to offer other forms of help and aid. Once again, that giving or that aid may have limits, but they are generally invisible to the person receiving the help. At the very least, the gracious person makes time to listen to the request and, if help is not forthcoming, refrains from letting the supplicant feel bad about the asking. This again is a form of sympathetic treatment.

Third, the gracious person respects the freedom and personal agency of others. This is a delicate point, because it is so often overlooked in today’s communications. Offering respect in this case is a subtle negative: refraining from making assumptions about another person’s situation, motives, capabilities, or prospects. Just because someone is in a bad way and asking for help does not mean that he or she is unwilling, incapable, or lazy. The other person’s life is not a subject for debate and speculation. The other person’s story is his or hers alone to tell—or not. The gracious person respects personal boundaries—both in speaking and in imagination—just as he or she maintains his or her own boundaries and secrets. The gracious person trusts that others—barring direct evidence to the contrary—have abilities and prospects similar to his or her own. After all, the basis of respect is granting that others have an equal footing in this business of life.

Fourth, and corollary to all of the above, the gracious person is confident of and secure in his or her own situation, motives, capabilities, and prospects. To be confident means that a person has examined his or her own life and knows him- or herself at some depth.2 To be secure in life is to have control of one’s person and resources and to maintain a calm appreciation of what the future may hold. A person who is insecure or afraid cannot be—or, more generally, believes he or she cannot afford to be—generous, sympathetic, or respectful of others. A person who does not trust him- or herself is unlikely to extend a similar trust to other people.

Fifth, a person with grace generally leads a sober and thoughtful life. He or she knows that alcohol and drugs, loose talk, unrestrained behavior, reckless impulses, crude humor, and even excessive jocularity—all tending toward a loss of personal control and restraint—have the possibility to be damaging to oneself and to those nearby. Wit and a well-told joke are acceptable. Imprudence, impudence, and a generally sarcastic attitude are not.

Sixth, and finally, the gracious person maintains his or her balance. We think of a person who is physically graceful as moving in a controlled, purposeful, balanced manner, without sudden lunges, hesitations, missteps, or reversals. In the same way, the gracious person’s life proceeds in balance, with a sense of purpose and direction, and with forethought. It avoids both impulsive lunges and fearful retreats. The gracious, magnanimous person is superior to events, plots a life course focused on the long view, and remains steadfast, loyal, and calm in the face of both opportunity and adversity. He or she thinks before acting, and acts with precision and purpose.

All of these traits exist irrespective of any political views. A generous, thoughtful, sympathetic, and controlled person can exist in any political sphere—and even more so in times of disruption and confusion, which are merely opportunities to exercise these talents. Similarly, the life situation that allows a person to demonstrate grace is not dependent on wealth, education, intelligence, health, or other outward resources—although it is easier to maintain a gracious demeanor if one is not scrabbling for breadcrusts, ignorant of the world, dim in perception, or wracked with pain. Still, the true gift is to rise above these shortcomings.

I have been blessed in my life to know a number of gracious people. My mother was one, my late wife was another, and I cherish their memory. They both lived with a calmness of person and generosity of spirit that made the people around them feel both confident and secure. And that is something that can only come from the inside.

1. This is a direct combination of two Latin words, magnus, meaning “big,” and animus, meaning “spirit” or “soul.” Such a person is big-spirited.

2. It was written above the door to the oracle at Delphi: gnōthi seauton, “know thyself.” To understand the will of the gods, you must first understand your own situation, motives, capabilities, and prospects.

Sunday, August 11, 2019

Who Gets to Say?

Founding Fathers

“Declaration of Independence” by John Trumbull, 1818

All political questions, every one of them, come down to a basic proposition: Who gets to say? Who in society will decide the big questions and, by extension, the smaller ones for the rest of us? Who will make the rules? Who will rule on what is allowed and what forbidden? Whose vision of culture and society will shape the world in which the rest of us must live?

For a hundred thousand years or so of our hunter-gatherer existence, going back even before the H. sapiens genome settled down in its current configuration, the question of who-gets-to-say was probably handled as close to democratically as one could wish. After all, each group or band was about the size of an extended family or tribe, probably no more than fifty individuals. In such a group, you might think that the oldest, wisest, strongest, or bravest would be the natural leader and decider. But I believe such a small group would probably have its own ideas.

The person whom the tribe picked as “wisest,” “strongest,” or “bravest” might be the result of a lot of individual choice and input. It would really come down to the person the collected individuals trusted or respected or hoped would take them in a direction that found food, water, shelter, and safety. And that might not be the person with the smartest mouth, the biggest muscles, or an unflinching attitude when staring down a wild boar or tiger. It ultimately would be the person whom most members of the group would heed, respond to, and obey.

It’s also possible, with a group that small, to have a situation of shared leadership. One person might know the most about and be the most skilled at hunting. Another might have special diplomatic skills when dealing with other tribes. And a third might be the best to consult about campsite rules and interpersonal relationships. In this, the hunter-gatherer tribe might be like any extended family, with different natural leaders at the focus of different family problems.

But when the tribe settles down and becomes a village, a culture, or a nation, the daily push and pull of personal familiarity, family loyalty, and individual input no longer works to select a leader to speak for and direct the whole district or city-state. Each street and neighborhood might have its local wise man or woman, its go-to person for conflict resolution, and its natural spokesperson. But as soon as that local neighborhood bands together with others to form an association that is larger, better able to plan community rules and infrastructure, and better positioned to benefit from extended trade with other cities and states—and so to become richer and more powerful as a whole—then the need for a unified leadership becomes apparent.

The district chief, thane, or regional king would not necessarily be chosen by popular election, and probably not by simple trust, respect, and response on an individual level. And even that choice would arise only at the initial selection: when the old king dies without an heir or the community undergoes a popular revolution. Otherwise, for all our long history, humans have been willing to trust the children of the king to take the mantle, crown, or talking stick when the old king passes. This trust in inherited capability was not blind, because it responded to an innate understanding of biology, observed from a thousand generations of animal husbandry and seed selection: strong and capable parents usually beget strong and capable children.

But in that time of dynastic disruption—either death of a childless king or social revolution—then the choice of who-gets-to-say would be based on both self-selection and individual input. First, the candidate must put himself1 forward, and then the people who have risen to positions of respect and authority in their own neighborhoods and districts must decide whether they will trust, follow, and obey him or hold out for somebody else.

If the king or his son proves to be incompetent, then the matter of who-gets-to-say falls to the privy council, the closest advisors, or the “kitchen cabinet.” Power will still be exercised in the name of the king or the idiot son, but the decisions will be made by those who are closest to the throne. This is, of course, an unstable situation, because proximity is not a fixed but a relative measure. And among the counselors and advisors there will always be those who must choose how far they will trust, follow, and obey the loudest or most persuasive speakers.

The purest form of democracy—where no one leader emerged as king, archon, or dictator—seems to have been the government of Athens in the fifth to the fourth centuries BC. There every male citizen of a certain age had a voice in the Assembly and a chance to represent his local village or district (his deme) for a year’s time in the Council of 500, whose members were drawn from each of the ten tribes. In the Assembly, citizens could discuss and vote on proposed rules and legislation. The Council prepared the legislation for discussion in the Assembly and could also make rules by itself.

This system worked well, for as long as it did. But it was also subject to takeover by groups of oligarchs2 and individual tyrants. Rule by the direct vote of all citizens in either the Assembly or the Council is a nice idea—unless you were a woman, a slave, or a non-citizen—but it is also unwieldy. Discussion, negotiation, and consensus all take time. Not everyone is happy with every outcome. And it requires great personal and moral discipline to accept an outcome that you don’t like, or perhaps one with which you violently disagree, solely on the basis that your neighbors find it good and acceptable and that you have sworn—or just personally chosen—to uphold the democratic ideal as more important than your feelings on this one matter or string of issues.

Turning the decision power over to a smaller group of smart fellows—or even one particularly persuasive, charming, and capable fellow—who can get things done in a sensible amount of time and with a minimum of fuss, might seem like a good alternative. Especially so when there’s a crisis at hand—and almost any issue can be made to seem like a crisis—or when the bickering among Council and Assembly members has gone on so long that the issues are piling up, the roads and sewers aren’t getting fixed, the wharves are crumbling down at the port, and the rats are everywhere.

Something in the human soul gets tired of taking direct responsibility for every detail of government, especially when you can’t win every debate. It just becomes easier to turn the issues of the day over to a bunch of smart guys, or one smart guy, and let them or him figure things out. And then, if they really mess things up, you can always go out into the street with your neighbors, carry a torch or a pitchfork, and call for reforms. You can at least get the heads of the current smart guys mounted on a pike and put in their place the people who promise to do things better. And that’s a sort of democracy, too—direct action by social revolution.

In the modern political entities of the West—and in the Western-inspired parts of Asia—direct democracy and its upheavals have been replaced by representative democracy, usually in the form of a republic with an agreed-upon charter or constitution. Instead of letting smart fellows just take over the government, citizens vote to put them in power. In most of the world, exemplified by the United Kingdom, the representatives sit in an assembly or parliament, and the leaders in that body also form the administrative government. In the United States, the assembly of representatives is the Congress, with two houses whose members are seated differently—either on an equal basis among all states (the Senate) or proportionally by population (the House)—and with different terms for each. The administrative government is elected separately in the person of the Executive—the President, with cabinet members and various department heads proposed by the President but confirmed by the Senate. Congress makes the laws, the President’s administration enforces them, and the federal courts up to the Supreme Court—with justices also named by the President but confirmed by the Senate—rule on the laws in practice.

This system has worked well—or at least without outright revolution, and only one civil war—for more than 230 years. Like Athenian democracy, it has its satisfied winners and its unhappy losers. Sometimes the Congress has more direct power as the law-making body. More recently, the actual power of interpreting the details of Congress’s large and clumsily written laws has fallen to the Executive. Like all governments and all agreements among human beings, it’s a plastic situation that changes with the times.

The genius of the American system—or its defect, in some eyes—is that we have generally focused our discussion and debate through two main parties: once the Democrats and the Whigs, now the Democrats and the Republicans. The issues and emphases of these two groups change and sometimes reverse themselves over the generations, but they keep the discussion within bounds, and each party forms an inclusive group for handling similar issues under similar principles. The alternative is the multiplicity of parties—sometimes with single issues, sometimes splintering off of larger parties—that is common in Europe. There government by coalition—shared power without necessarily shared viewpoints—has become more common. And this is an unstable situation, especially when a coalition breaks up and precipitates an unscheduled election through a “vote of no confidence.”

We may be headed in that direction in this country, as the two major parties move their centers of gravity further out to fringe positions left and right, leaving the voters in the moderate middle with fewer and fewer good choices. So far, third parties in this country have been a pit filled with wasted votes, but that may soon change. And then we may have more government by uneasy coalitions.

But whatever comes, and whether the issues of the day reflect real or invented situations and dangers, all political issues are ultimately just wedges to put one person, one party, or a strong coalition of similar intentions in a position to make day-to-day, on-the-ground decisions for the rest of us. Issues, principles, and voiced positions are one thing, but access to and the use of actual decision-making power is the final purpose.

1. In this part of the discussion I am purposely using the male pronoun. Yes, there have been some matriarchies and matrilineal societies—but not many. And yes, in recorded history many Western nations have been led by queens. During the Roman occupation of Britain, the Britons had a warrior queen known as Boadicea, and she must have been an extraordinary woman. The English have had queens in their later history, too: Maud, Elizabeth, Anne, Victoria. But these women—extraordinary or not—generally came to power when the dynastic tradition was strong but there was no better-placed male heir at hand. The free and unrestricted choice of a woman as national leader is a much more modern phenomenon.

2. The roots of this word are the Greek oligos, meaning “few, a little, or too little,” and archon, meaning “ruler,” which was also the title of each of the nine chief magistrates of ancient Athens.