Sunday, September 20, 2020

The Truth

Total honesty

I have always believed in the truth: that one should try to understand it, to know and speak it whenever possible, and to accept it, even if the implications and consequences work against one’s own prior assumptions, beliefs, advantages, and one’s personal situation. I would rather know and follow the truth than be happy and whole in the shadow of ignorance or a lie.

It was this basic adherence to the concept of truth that kept me from following my grandfather’s career path—although he was a great believer in truth, too—into the law, which everyone in the family thought would be my future, because I was so verbal as a child. But as I grew older, I realized that a lawyer deals mainly in argument, precedent, and the intricacies of the law as a giant logical puzzle weighing rights and advantages. I knew or suspected that a lawyer must sometimes decline to know or search for the truth—the facts of what actually happened, which he or she is required to bring into court, if known—while working toward an argument or an interpretation of the known facts that will best serve the client’s purpose. Because it puts some gain above the human obligation to know and speak the truth, the law was something I feared and dared not dabble in.

So I studied English literature and became a devotee of storytelling. Fiction, a made-up tale about made-up people, is not necessarily a violation of the truth. It is not exactly telling lies. An author telling a story is like a blacksmith forging an iron blade. The smith hammers away the surface scale, and with it the impurities that cloud the pure metal underneath. And so the author hammers away the alternate interpretations and contradictions of a life situation in order to reveal a pure fact, or sequence of events, or understanding of the human condition that the author recognizes as true.

When I write about “the truth” here, I am not referring to biblical truth, or revealed truth, or a studied construct made of equal parts belief and hope. I am talking about a summation of observations, of experienced cause and effect, of facts that have been seen and where possible tested and annotated, of things we know to apply in the real world. It’s an elusive thing, this truth, but something that I believe can be observed by one person, formulated into a statement or story, communicated to another person, and received by that second person as something that is apparently if not obviously real and congruent with known facts.

It is therefore an article of faith with me as a fiction storyteller and a nonfiction communicator that language can have adequate if not exact meanings, in terms of the denotation and connotation of words. That one person can share an idea, a realization, a piece of truth with another person through verbal means and not be completely misunderstood. Some misunderstanding may take place. Sometimes one person does not have the same meaning—denotation or connotation—for a word that the original speaker does. Sometimes the recipient of the thought has different ideas or beliefs that get in the way of examining the story or statement and perceiving it in the same way that the original formulator meant or intended. Accidents of language and intention do happen. But, on the whole, between people of fair mind and unbiased perception, communication of the truth is possible.

It is also an article of faith with me that truth exists outside of our personal, subjective perceptions. That is, truth is an object waiting to be discovered, analyzed, and discussed. It is not merely a personal belief that is particular to each person and changes with his or her perceptions based on personal needs and desires. Two people can point to the results of a scientific experiment, or an art form or artifact that was carved, painted, written, or created by a third person, or to an event common to their experience, and reach agreement as to its nature, purpose, and quality.1

Of course, I am not a fool. I do not believe that every truth can be discovered and stated. I understand that some things are the product of chance or probability and so can fall out one way or another. I understand the quantum physicist’s dilemma when dealing with very small, intimate systems, that the act of observing a particle in flight—usually by bouncing a photon or some other particle off it—changes the direction of flight immediately. So the physicist can know where a particle was but not where it is now or where it’s going.

And I do understand that humans, their perceptions and interpretations, and the things they hold to be important are constantly changing: that we do not live in the same world of values and feelings that was inhabited by the ancient Greeks and Romans, the Medieval or Renaissance Europeans, the ancient or modern Chinese, or the Australian Aborigines. Humans are exciting and varied creatures, constantly evolving and reacting to the products of their own minds, and this is not a cause for concern. But I hold it as a postulate that, given good will and a common language, they learn from each other, share ideas, and can arrive at an objective truth about any particular situation or experience. However, I also recognize that this level of understanding may require one or more human lifetimes and leave little room for other explorations and understandings. That’s why we have books and can read.

At the same time, I do not believe that human nature changes very much. What people believe and hold to be real may be influenced by great thinkers, prophets, and teachers. Otherwise, the Islamic world would not have taken such a turn away from the Judeo-Christian tradition that was in part its heritage. But people still have basic needs, basic perceptions of fairness and reciprocity, and a basic sense of both the limitations and the possibilities of the human condition. Until we become incorporeal creatures of energy or immortal cyborg constructs, issues of life and death, family and responsibility, need and want, will be for each of us what they were in the time of our hunter-gatherer ancestors.

And yet there are also things we cannot know about each other. I cannot know what is really going on inside your head. Even if I know you well enough to trust your nature and sense your honesty, even if you use words well enough to express your deepest feelings accurately, there are still secrets people keep to themselves and never tell even their nearest and dearest. There are still secrets people hide away from their own conscious minds, secrets that, like the movements of great fish in deep waters, can only be discerned by the effects they have on their lives, their loves, and their mistakes and missed opportunities.

That is a lot of unknowing and things unknowable for someone who believes in the truth. But as I said, to be completely knowledgeable would take a library of great books and several lifetimes to read them. All any of us can do is try to start.

1. When I was studying English literature, back in the mid to late 1960s, we were taught what was then called the New Criticism. This was the belief that the work of a writer or poet—or a painter, sculptor, or musician—stood on its own and could safely be analyzed by any person with sense, feeling, and a knowledge of the language. This displaced the author’s own authority over the object. The author’s claims about “what I meant” or “what I intended” might be interesting but did not define the work. Sometimes an author intends one thing but manages—through accidents of carelessness or vagaries of the subconscious—to achieve something else and sometimes something greater than intended.
    This is opposed to the literary criticism called “Deconstruction,” which has been taught more recently in English departments but is something I never studied. Deconstruction apparently teaches—at least as I understand it—that words, their usage, and their underlying reality are fluid rather than fixed. That they are so dependent on a particular time, place, and culture that trying to understand the author’s intended meaning, from the viewpoint of another time or place, is practically impossible. And therefore it is useless to discuss “great books” and their enduring value. That nothing is happening in any universally objective now, and everything is subject to current reinterpretation. This is, of course, anathema to me. It is a denial of any kind of perceivable truth.

Sunday, September 13, 2020

End of Days, or Not

Red-sky dystopia

This past week has been weird and depressing. A growing number of fires in California cast a pall of smoke into the atmosphere over the northern part of the state, like a high fog but with a gritty perspective in the middle distance and bits of ash floating silently down, so that your car’s hood and fenders are speckled white. There’s a cold, red-orange darkness at noon, like the fume out of Mordor, or like life on a planet under a red-dwarf star. You’ve seen the pictures on friends’ Facebook pages—not quite as apocalyptic as my stock photo here, but still disturbing. It’s like—and I think this is a quote from either J. R. R. Tolkien or J. K. Rowling—you can’t ever feel cheerful again.

On top of that, Monday was the Labor Day holiday. So, for those of us who are retired and only loosely connected to the working world’s rhythms, that day felt like a second Sunday. Then Tuesday was like a Monday, Wednesday like Tuesday, and what the hell is Thursday supposed to be? Glad as we are for a holiday, it throws off the pace of the week and makes everything feel subtly weird.

And then there are the overlying, or underlying, or background burdens of 2020. The pandemic drags on and on, so that we are isolated from family, friends, and coworkers, except through the synthetic closeness of a computer screen. We wear masks in public, so that we are all strangers to each other, even to the people that we know and would normally smile at. We avoid people on the sidewalk and in elevators, maintain a shopping cart’s distance at the grocery store, and feel guilty about touching a piece of fruit in the bin and then putting it back when we find a suspicious bruise. This illness, unlike any other, is not so much a matter of concern about personal safety as a national and social pall that has descended on everyday life.

Because of the closures, our robust, consumer-oriented economy has tanked, and we don’t know when it will come back. The stock market has revived from its swoon in the spring, apparently rising on shreds of pandemic optimism. But anyone who follows the market knows that these weekly swings of 500, 1,000, and 1,500 points on the Dow alone, with comparable lurches in the other indexes, just can’t be healthy. It’s like the entire investor class is cycling from mania to depression, too. Meanwhile, we all know people who have been laid off and are scrambling. We all have a favorite restaurant that is getting by on takeout service or a favorite shop that has closed, apparently for good. We all miss going downtown or to the mall for some “shopping therapy”—not that we need to buy anything, but we look forward to what the richness of this country and its commercial imagination might have to offer us. Buying dish soap, toilet paper, face masks, and other necessities off the Amazon.com website just isn’t as satisfying.

And then there’s the politics. The divisions in this country between Left and Right—emblematic of but, strangely, not identical with the two major parties—have grown so deep and bitter that friendships are ended and family relationships are strained. The political persuasion opposite to your long-held point of view has become the other, the enemy, and death to them! We are slouching, sliding, shoved inexorably into an election that has the two sides talking past each other, not debating any real points of policy but sending feverish messages to their own adherents. And whichever way the national polling falls out—with the complication of counting votes in the Electoral College from the “battleground states”—the result promises to bring more bitterness, more rioting, more political maneuvering, and perhaps even secession and civil war.1 There’s a deep feeling in the nation that this election will solve nothing.

The one ray of hope in all of this is that things change. This is not a variation on the biblical “This too shall pass.” Of course it will pass, but that does not mean things will get back to the pre-fire, pre-pandemic, pre-boom, pre-strife normal. This is not the “new normal,” either. There never was, never is, any kind of “normal.” There is only the current configuration, life as we have it, and what the circumstances will bring. But this is also not the “end of days.”

Every fire eventually burns out. The rains come, the ground soaks up their moisture, and the stubbornest embers are extinguished for another year. We may have other and worse fires—or possibly better drought conditions—next year, but this year’s firestorm will eventually be over. Yes, the ground is burned, homes are lost, and a number of lives and livelihoods are upended. But the ground is also cleared for new growth, and the way is clear for people to start over. As someone who sits on a forty-year pile of accumulated possessions and closets full of just “stuff,” I sometimes think a good fire is easier to handle than a clearing operation where I would have to weigh and consider every piece of bric-à-brac against future need or desire.2 Sometimes you just have to let events dictate what happens in your life.

Every plague eventually fades away. The virus or bacterium mutates into a harmless nuisance, our immune systems adapt to handle it, or medical science comes up with a vaccine, and a devastating disease disappears from our collective consciousness. Yes, we have death and disability in its wake. But death and disability come to us all, if not from Covid-19 then from the annual influenza, or a cancer, or accident, or other natural and unnatural causes. For those who survive, our lives and our attitudes become more resilient, more grounded, more able to take life’s hard blows. That which does not kill me makes me stronger—until it or something worse finally kills me. And this is the way of life on this planet. The essence of the human condition is that we have the self-knowledge, foresight, and insight to understand this, where for every other animal on Earth, life’s stresses are pure misery and death is the ultimate surprise.

Every economic downturn paves the way for growth. At least, that is the cycle in countries that enjoy free-market capitalism. “Creative destruction,” the watchword of economist Joseph Schumpeter, captures the vitality of markets that are able to respond to current conditions and meet the needs and demands of people who are making their own decisions. In my view, this is preferable to one person or group, or a committee of technical experts, trying to guide the economy and in the process preserving industries, companies, and financial arrangements that have outlived their usefulness but provide some kind of national, political, social, or emotional stability that this group values above letting the mass of people make their own decisions.

Every political crisis passes. Issues get resolved, the emotions die down again, and life goes on in uneasy balance. The new stability may not reflect the goals and values that you were prepared to fight for, actually fought for, or maybe even died for. But the resolution is usually a compromise that most people can live with … unless the end of the crisis is a terminal crash, a revolution, a civil war, and a crushing loss that results in a majority—or worse, a virtual minority—beating the other side’s head in and engendering animosities and unhealed wounds that fester for generations and destroy everyone’s equanimity. Sometimes the best we can hope for is an uneasy, unsatisfying compromise that will hold until the next round of inspirations and aspirations takes control of the public psyche.

There never was a normal, just the temporary equilibrium that kept most people happy, a few people bitter, and many people striving to make things better. There never is an “end of days,” because history has no direction and no ultimate or logical stopping place—at least, not until the human race dies out and is replaced by the Kingdom of Mollusks, if we’re lucky, or the Reign of the Terror Lizards, if we’re not.

1. But see my take on that possible conflict in “That Civil War Meme” from August 9, 2020.

2. I recently completed such an operation with a forty-square-foot storage locker that I was renting, and the exercise took three months and was exhausting. You stare at a book you once thought you would read, or a jacket you once wore as a favorite, and have to decide its ultimate fate. Sooner or later, you just have to let go and throw this stuff away.

Sunday, September 6, 2020

Counterclockwise

World turned upside down

The other day I was reading in Astronomy magazine one of my favorite features, “Ask Astro.” There readers pose questions about the universe and astronomy in general, and experts are called in to answer them. This one asked why the Sun orbits our galaxy in a clockwise direction, while the planets orbit the Sun in a counterclockwise direction.1 And that got me thinking about the arbitrary nature of directions and much else in our daily lives.

After all, the conceptions of “clockwise” and “counterclockwise” didn’t come into use until people started telling time with geared mechanisms instead of the angle of the sun, sand running through an hourglass, bells rung in a church tower, or candles burning down past inscribed markings. Clocks with gears have been invented and reinvented using different driving forces—water, pendulums, springs—in ancient Greece in the 3rd century BC, in China and Arabia in the 10th and 11th centuries, respectively, and in Europe in the 14th century. The fact that most round clock faces count time by moving the hands from left to right—clockwise—is based on the usage of early sundials. These instruments track the Sun rising in the east, and therefore casting the shadow from the sundial’s gnomon to the west. Then the Sun moves to the south at midday, casting its shadow to the north. And finally, the Sun sets in the west, casting the shadow to the east. All of this, of course, is predicated upon the person observing the sundial facing north as a preferred direction. This daily rotation from west, or left, to east, or right, was so familiar that early clockmakers simply copied it.

Of course, all of the cultures that used sundials and invented mechanical clocks arose north of the equator and only lately spread these devices to the cultures and European colonies established south of the equator in southern Africa, South America, and Australia. If those areas had been the home of a scientific, technically innovative, colonizing, and marauding culture, and if the peoples of the Eurasian continent had been inveterate stay-at-homes, then things would have been different.

Clockmakers originating in South Africa, Tierra del Fuego, or Australia might have faced in their preferred direction—south, toward the stormy seas and distant ice flows of Antarctica. And then they would have erected their sundials and drawn their clock faces based on the Sun rising at their left hands and casting a shadow to the west, moving to the north behind them and putting the shadow in front of their faces to the south, and finally setting at their right hands in the west and casting a shadow to the east. Their clock hands would have run in the direction we call “counterclockwise,” and the rest of the world would have followed suit. It all depends on your point of view, which is based on accidents of geography, demography, and historic migrations.
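
For readers who like to see the geometry spelled out, here is a little Python sketch, purely as an illustration. It assumes only that a vertical gnomon’s shadow points directly away from the Sun, and that the midday Sun stands to the south in northern temperate latitudes and to the north in southern ones.

    # Toy model: compass bearings in degrees, measured clockwise from north.
    SUN_BEARING = {
        "morning": 90,         # Sun roughly due east at sunrise
        "noon_northern": 180,  # midday Sun in the northern temperate zone
        "noon_southern": 0,    # midday Sun in the southern temperate zone
        "evening": 270,        # Sun roughly due west at sunset
    }

    def shadow_bearing(sun_bearing):
        """A flat-ground shadow falls on the side of the gnomon opposite the Sun."""
        return (sun_bearing + 180) % 360

    northern_sweep = [shadow_bearing(SUN_BEARING[t])
                      for t in ("morning", "noon_northern", "evening")]
    southern_sweep = [shadow_bearing(SUN_BEARING[t])
                      for t in ("morning", "noon_southern", "evening")]

    print("Northern shadow sweep:", northern_sweep)  # [270, 0, 90]: west, north, east
    print("Southern shadow sweep:", southern_sweep)  # [270, 180, 90]: west, south, east

The northern sweep runs west, then north, then east, the motion the first clockmakers copied onto their dials; the southern sweep runs west, then south, then east, the direction we now call counterclockwise.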

What else might have been different based on these historic accidents?

Certainly, our book texts and traffic signs reflect differing points of view. We in the European-based and -influenced world read texts from left to right, pretty much the same as the movement of our clock hands. But this was not universal. If we had kept the alphabets and scripts of the ancient Hebrews and Arabs, writing from right to left and opening our books from what we would consider the back cover, then our literary world would be different and we would shelve our libraries in a different order. But we don’t, because we follow the Latin practice of the Roman culture that dominated the western, and eventually the eastern, end of the Mediterranean Sea and surrounding lands.

The earliest writing forms were different yet again. The Egyptians wrote in both rows and columns, depending on whichever was more convenient, and indicated the direction in which the symbols were to be read by the way that the animal signs—birds, snakes, and so on—faced at the top or side of the text. And anyway, hieroglyphs were for the priestly and aristocratic classes, intended to preserve the thoughts of important people for future generations, and not for just anyone to read and understand. Early cuneiform writing from Mesopotamia ran from top to bottom and right to left, although the direction later changed to left to right. Chinese, Japanese, and other Asian scripts are generally flexible, written left to right in horizontal rows, or top to bottom in columns, with the columns usually read from right to left—although sometimes also left to right.

Ancient Greek was the most practical of all, because texts were written and read from left to right for the first line, right to left for the second, back to left to right for the third, and so on. This was economical because it spared the reader the split-second lag between the eyes finishing one row of letters at the right end and tracking back to the left side of the page to start anew. This form of writing was called “boustrophedon,” literally “as the ox plows.” Like most things Greek, it was eminently sensible—but it never caught on elsewhere.
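
As a rough illustration of the layout (a toy sketch of my own, nothing the Greeks needed), a few lines of Python can wrap a text and reverse every second line. True boustrophedon also mirrored the letter shapes themselves, which a simple string reversal cannot show.

    # Wrap a text to a fixed width, then reverse every second line
    # "as the ox plows," so the eye never jumps back to the left margin.
    import textwrap

    def boustrophedon(text, width=24):
        lines = textwrap.wrap(text, width)
        return "\n".join(line if i % 2 == 0 else line[::-1]
                         for i, line in enumerate(lines))

    print(boustrophedon("the ox plows the field and turns at the end of each furrow"))
    # the ox plows the field
    # fo dne eht ta snrut dna   <- read this line right to left
    # each furrow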

And then, as to the shape of our books themselves, consider that what we think of as a “book” is really an invention of medieval monks with their manuscript codices,2 followed by Gutenberg and his printing press in Europe of the 15th century. Because Gutenberg was printing single broad sheets, folding them into pages, stacking them, and sewing the stacks together in a continuous, linear format, we have the modern book. Gutenberg probably inherited the idea of printing itself from Chinese books of pasted pages that were developed in the Song Dynasty around the 11th century.3

Before that, the Romans, Greeks, Hebrews, and just about everyone else wrote on scrolls. These were rolled up and packed into cubbies on library shelves, identified for the searcher by clay or wax tags attached to the tube ends. I have often thought that the order of the books we read in the Old and New Testaments is rather arbitrary—except for Genesis, of course—and originally was based on whatever scroll you happened to pick up next. Someone must have written out a “cheat sheet” somewhere to direct you to some kind of chronological order after Genesis and throughout the New Testament. But things became easier when the pages were put in neatly linear order in a single sewn book.

A lot of the world we inhabit today—from clock faces, to the way we write, to which side of the road we drive on, to the shape of our keyboards—is pretty much a matter of geography, demography, and perspective. And the solutions we live with are not always the most convenient and sensible.

1. Short answer: The planets formed out of a cloud of dust and gas that started to spin in a particular direction—counterclockwise, when viewed from “above,” or from the Sun’s “north pole”—as it collapsed. But that gas cloud was already moving in another particular direction—clockwise, when viewed from “above” or “north” of the galactic plane. The opposite motions are more or less separate, arbitrary, and related to your point of view.

2. Codices (plural of codex) were handwritten single pages that were grouped together and bound between two wooden boards, as opposed to the rolled scrolls used in earlier times.

3. Printing was presumably invented by the Chinese about four hundred years earlier, when the entire page was carved from a single block of wood. Of course, this was just an advanced form of the rolling seals and stamps that had been in use for thousands of years. Carving a single page made sense when individual Chinese ideograms numbered about 50,000—too many to sort and select as single pieces of type. However, by the Song Dynasty the Chinese printers were doing just that. Gutenberg, with only twenty-six characters to choose from, plus capitals and punctuation marks, had an easier time with movable type.

Sunday, August 30, 2020

Absent the Middle Class

Girl with magic box

I was born just after the Second World War, which means I grew up and became politically aware—or at least what I think of as “aware”—in the Eisenhower, Kennedy, and Johnson administrations. This was a time when the United States was the “last man standing” among the nations that participated in the war, and we came out better than any on either side. We had our infrastructure intact and had built up a huge capacity in raw materials like steel and aluminum as well as manufacturing due to the war effort. We were on top of the world.

That was also the time when the middle class in America was at its strongest. Soldiers returning from the war were getting free education on the GI Bill. Homes were being built in newly defined and rapidly expanding suburbs. Business was booming and, even with the returning soldiers, jobs were plentiful. Most people—there were exceptions, of course, especially in the Jim Crow South—were prospering as never before. Those were the good times.

The middle class is a relatively new thing in human history. It didn’t really develop until political and social structures had changed: urban life became commonplace, rather than the exception; and capitalism, the free market, and international trade became encoded with commonly accepted practices and rules, rather than just things that happened casually at the village level. The middle class was the place where people who were not nobles and landowners, yet too ambitious and too well educated to remain peasants, could find profitable employment and eventually riches by engaging in large-scale trade outside of selling butter and eggs on market day, or manufacturing outside of single-family cottage industry, or taking on the new roles in banking and legal transactions that supported these intermediate activities.

The middle class was for people with hopes and ideas, for those who sought independence from the old social classes, for those who wanted to do better than their fathers and grandfathers, for those who hungered to prove that they were as good as anyone and a damned sight better than most. It was the class of the feisty ones.

From Roman times and into the Middle Ages and then the Renaissance, the landed class, the nobles and the gentry, despised these strivers. Going into trade or handling money professionally was all about getting your hands dirty. And while anyone might admire a legally trained mind in the Roman Senate or a lawyer at court doing the king’s business, the sort of person who argued about the price of injury to a cow or the placement of a fence line was little better than a conniver and a con man in his lordship’s domain.

And of course the peasants, lately serfs, and still working the land that their fathers had farmed and sharing the proceeds with the lord of the manor, all viewed members of the middle class as social upstarts, the big men from town, whose fathers might have been the local blacksmith or miller, and whose grandfathers had been serfs like the rest of us. People who wore britches and waistcoats rather than the peasant’s smock were already getting too big for their britches.

So the middle class has been under suspicion and under fire for a long time. It wasn’t just idle animosity that made Karl Marx and the other socialists of the 19th and 20th centuries despise the middle class with its striving and materialistic values as “bourgeois”—which is just the French word for this class—or worse, “petit bourgeois,” as if they were too small to be significant. And why not? When the politics you’re selling involves state ownership of the means of production, and puts them all in the hands of appointed technocrats, or the revolutionary vanguard, or the modern equivalent of Plato’s philosopher kings, then the people who know how to handle their own or their neighbors’ business practices and money, who will start new enterprises simply because they think they can make a profit from them, and who will obey rules but not wait patiently on instruction from their betters—these people are the bureaucrats’ natural enemies. These are the people who will upset the serenely floating boat of socialistic doctrine and practice. And so these are the people who must be the first to go up against the wall.

And the peasants, the modern blue-collar workers, the ones who are content to do what they are told and lack the ambition or the education to go out and start their own businesses, even as house painters and contractors—they will be quite happy to work in a factory owned by the moneybags class with protections from their union, or work in the factory owned by the state with those same protections in place according to state law, and still have their union—if that’s even needed. The fate of those middlemen, professionals, and entrepreneurs is irrelevant to the new peasant class, at least at the surface of their minds.1

The middle class has always been in, well, the middle, between two classes that would just as happily see it disappear. And the middle class is disappearing these days. Not only is the upper class getting bigger—in terms of its power if not its numerical size—with wealth beyond the dreams of kings and emperors of old, but the lower class is also getting bigger, with more people finding it harder to get the education and the good jobs that would enable them to enter the middle class as professionals, business owners, and independent traders. It is getting harder to own a house rather than rent, buy a new car instead of a used one or a lease, ensure your children a good education, take annual trips on your vacation—if you even get one while working two jobs—and plan for a comfortable retirement.

The middle class is being squeezed. Whether this is a planned process or just the natural course of modern economics,2 it’s happening. It has been going on in every decade of my life since I became politically aware. And I don’t know if it’s because the upper class and the Marxists do well when the majority of the people are more dependent on government and the largesse of big corporations than on their own initiative, or because we’ve lost something of the entrepreneurial spirit that fed bright and hopeful people into the middle class.

But something’s missing. And neither the top nor the bottom seems to notice or care.

1. If the peasant or the blue-collar worker thinks deeply, however, he will wonder where the technologies and inventions of the modern age—electricity, telephones and televisions, personal computers and smartphones, numerous medical advances, and easy credit and banking—all came from, if not from the entrepreneurial spirit of those who have their philosophical roots, if not their family background, in the middle class. But I digress …

2. As Robert A. Heinlein noted: “Throughout history, poverty is the normal condition of man. Advances which permit this norm to be exceeded—here and there, now and then—are the work of an extremely small minority, frequently despised, often condemned, and almost always opposed by all right-thinking people. Whenever this tiny minority is kept from creating, or (as sometimes happens) is driven out of a society, the people then slip back into abject poverty. This is known as ‘bad luck.’ ”

Sunday, August 23, 2020

Thingness and Nothingness

King Lear fool

The other morning I awoke to a realization: that while my mind, my conscious thought and memory, is a “spirit of the air” and therefore immortal, my body, my brain and the heart, hands, and legs that support it, is still a thing, and like all things it has a finite and ending existence in space and time. This is not, of course, a new thought in the history of human inquiry. But it is usually masked on a personal level by the day-to-day assumptions we all make, that our lives are virtually infinite and that death is far, far away.

As a professed atheist with a predilection for scientific empiricism, I am denied the consolation of belief in any kind of afterlife: transmigration of the soul into another body, translation of the mind into a spirit residing in a heaven or hell, or even transition into some kind of universal all-soul or cosmic consciousness. I have to believe that, like everyone in my parents’ generation and even some in mine, or a once-favored pet, a tree in the forest, or a dead fly on the windowsill, we are creatures of time and space, have existence and activation for a brief period, and then go out. The closest my beliefs come to religion is the Buddha’s description of Nirvana, the end of karma and its oppressive, continuing cycle of rebirths: you go out, like a candle flame. Where do you go? Don’t ask, because it is not defined.

In my view, this “going out” is nothing you have to work for or struggle to obtain, like the Buddha’s Nirvana. Instead, it just comes with the package of a person’s being a thing in time and space.

I have argued elsewhere against this “thingness” aspect of being human. My point was that transporter beaming as in The Fly and the Star Trek series cannot work simply because people in general—and I in particular—are not things but processes. Every cell in our bodies, unlike the fixed materials in a brick or a bowl of potato salad, is in constant motion: absorbing nutrients, processing them, and casting off wastes, as well as transcribing DNA and translating RNA to make proteins, and using those proteins as building blocks and catalysts within the cells, along with chemical messages being sent through the bloodstream, and electrochemical messages sent across a network among 86 billion neurons in the human brain. None of this is static; everything is moving. Even with the fastest supercomputer, you can’t map the position of every atom in each cell, deconstruct it, transmit the information across space, and rebuild the body—and not expect to create a muddle of chemical reactions that got reversed, DNA and RNA replications that got interrupted, nerve impulses that got scrambled in transit, and a resulting accumulation of errors and poisons in each cell, the bloodstream, and the nervous system.

But still, much as I am a process, the machinery that is running that process has a fixed place in space and time. My software is firmly anchored to hardware, a squishy, chemically based machine.

The difference between a brick and a human body is that elusive, self-replicating, self-activating electrochemical thing called “life.” A brick is unchanging, except for the abrasions it suffers from use and weathering, and so it is virtually immortal. A living body is constantly changing, renewing cells and tissues as they wear out—at least up to a point—and adapting to its environment both physically as well as mentally and emotionally. But when the physical errors mount up—to say nothing of the mental and emotional errors—and the healing process fails, then the electrochemical spark dies, and the body becomes not unlike a brick—except that it deteriorates a lot faster. This is the mortality of all living things.1

Eventually, the brick will wear away—or you can smash it first—and then you have its component molecules of silica, alumina, lime, iron, and so on. But it will no longer be a brick. You can chemically unmake those molecules into their component atoms, then smash them down to protons, neutrons, and electrons, and finally break the neutrons down into protons, electrons, and antineutrinos.2 Protons have never been observed to decay, and their theoretical half-life is many times longer than the estimated age of the universe, so they may in fact be immortal—but proton decay still could happen. So, yes, you can unmake a brick or a human body down to the point of dissipated energy and subatomic fragments. And then neither of them will be a recognized thing anymore—nor, in the case of the human being, any kind of a process—and that’s as close to nothingness as you can get.

When my process dies away to mere thingness, and my thingness disintegrates to nothingness, then nothing will be left. I may hope to be outlived by my children (if any), by my good works, by the love of the family and friends who knew me—all mortal and thing-based processes themselves—and by my books in the form of disintegrating paper and dissipating electrons. But in reality, any whispers of my existence will all have disappeared long before the last molecules of my body have blown away in a dust of subatomic particles.

That’s a grim thought for a weekday morning. I suppose you would call it a depressing thought. But in the long view—and here I’m talking about human history, culture, and many multiples of a human life—we don’t want things to last forever. Yes, I’d like a “lifetime warranty” on my mattress or my refrigerator, maybe not so much on my car, as I tend to enjoy the process and excitement of buying a newer model every couple of years. But we don’t want people, objects, or even stories and ideas to outlive their usefulness, to become meaningless—or worse, a fool’s punchline—for later and later generations. The imagined life of an Elrond or a Dracula, persisting through the ages of men, is indeed a tragic story.

Ozymandias was a warning to all who would yearn for immortality.

1. And yes, literature is filled with virtually immortal creatures such as elves and vampires. But I remind you that these are creations of the human mind and not actually found in nature. Even the Sequoia sempervirens, which can live for up to two millennia, eventually sickens, falls, and decays—if fire doesn’t get to it first.

2. Free neutrons outside an atomic nucleus tend to decay in about fifteen minutes.

Sunday, August 16, 2020

AI and Personality

Grinning dog
Starleth quadruped

So I’ve been following advances in artificial intelligence (AI) ever since I wrote my first of two novels about a self-aware computer virus. The current computer science isn’t up to actual self-awareness yet—more like at the stage of complex routines that can establish a goal, accumulate new information, hold it in long-term memory, and act on and repeatedly modify that information. This can simulate a living intelligence in terms of the Turing test, but it still might not be self-aware.

My little dog Sally is not self-aware, either. That is, unlike humans, dolphins, and perhaps elephants, she does not perceive herself, does not have consciousness of herself as separate from the reality she inhabits. The test of this is the mirror: when human beings look in a mirror, they recognize themselves, associate the image with what they think of as “me,” and may touch their face or hair to improve that image. If you put a mirror in a dolphin’s pool, then strap a funny hat or put a mark on the dolphin’s head, the animal will go to the mirror to check out its “own” image. But if my Sally approaches a mirror—we have several floor-length mirrors around the apartment complex—she either pays no attention to the small dog coming toward her or is eager or shy about meeting that dog. For her, the image is “other,” not herself.

And yet, Sally has a definite personality. I was reminded of that this morning. We had gone out through the back of the complex for her early walk, where she had dutifully completed her business. There were other dogs out there, she knew, and she is shy about meeting any dog bigger than herself. When we got back inside, at the elevator lobby, I discovered that the up and down buttons on that floor were not working: we would have to go outside again to an external stairway and climb to the second-floor lobby, where I knew the buttons would work. And at the outside door, Sally balked. She would not go back out again. She didn’t have any more business to do; there were big dogs out there; this was not the morning routine; and because … why? It took a lot of coaxing, tugging on her leash—to the point of pulling her collar up around her ears, which she hates—and encouraging noises on my part, until she relented, wagged her tail, and came along willingly.

Clearly, something in her perception of the situation had changed. She either overcame her fear of whatever dogs were outside, or she decided that I—her buddy, walk coordinator, and the fellow holding the other end of the leash—knew best about the decision to go back outside, or she remembered the last time we went out that way because the elevator buttons didn’t work—although she could hardly understand the concepts of “button” and “work.” Whatever, she had a new thought and, after her initial stubbornness, came with me happily.

I’ve been watching the development of agile robots on four legs—dog surrogates—with extensible necks and a jaws-like pincer that can open doors and carry objects. I read recently that some of the Boston Dynamics robots have been assigned to patrol parks and warn any humans they see violating social distancing. That’s pretty sophisticated activity. And the four “paws” of the robot dog are better at moving cross-country than wheels would be. That got me to wondering what it would be like to have an artificially intelligent dog instead of my Sally. It would probably be a demon at defined tasks like patrolling the complex perimeter, herding sheep—if I had any—or fetching my slippers. It would bark less and obey more. And it would never, ever leave a puddle in the bay window. That, and it would never need to be taken for a walk on a leash several times a day in order to do that business, either.

But a robot dog would still be a machine: purpose built, goal oriented, and just a complex of embedded responses. It might be programmed to wag its tail and lick my hand. It might even be taught to use human speech and say endearing things. And it might—if you kind of unfocused your eyes and willingly suspended your disbelief—serve as a therapeutic presence when I was sad or depressed, and be programmed to detect these states and act accordingly. But it would not be “intelligent” in the way a dog is, with its own quirky personality, its own mind about why it will or won’t go through a door, its own reasons for barking its head off at a noise upstairs, its own silly, toothy grin and hind-leg dancing when I come home, and its own need to put its chin on my knee after we climb into bed. It wouldn’t shed—but it wouldn’t be warm, either.

I’m no expert on artificial intelligence, although I can fake it in science fiction. But right now, AI expert systems are perfectly acceptable as automated drivers, button sorters, pattern recognizers, data analysts, and yes, four-legged park rangers. We value them in these roles because they function reliably, give predictable answers and known responses, and they never balk at going through a door because … why? If they did have that endearing quirkiness—the tendency to give inexplicable, self-centered, unexpected responses to unusual situations—and occasionally left a puddle of oil on the floor, we would value them less.

However sophisticated their programming, robot dogs would not be our companions. And much as we liked them, we would not love them.

Sunday, August 9, 2020

That Civil War Meme

War devastation

Almost everyone who is paying attention will agree that the political situation in this country between the progressive Left and the conservative Right is becoming desperate. Families and friendships are being sundered over political differences. The differences represented are existential and encompass radically opposed views of what this country stands for and where it is or should be going. There is no middle ground upon which members from opposite sides of the question can build a workable compromise. The stakes have become all or nothing.

The last time this happened, between the views of the Northern abolitionists and the Southern slaveholders and states’ rights advocates in the 1850s, the only conceivable result was dissolution, secession from the Union, an attempt at a parallel slaveholding government in the South, and ultimately a war to bring the seceding states back into the Union. When the issues are existential and are believed to encompass the survival of one side or the other, when there is no middle ground or possible compromise, then breakup and/or civil war becomes the only answer—terrible as that may be.

Some would say that the “cold civil war” over political and cultural differences—which has been going on in this country for the last dozen or so years and perhaps started as far back as the 1960s—has already grown hot. In the past month, we’ve seen what are supposed to be “peaceful protests” in various cities (Minneapolis, Seattle, Portland) meld into violent riots and attacks on both city and federal properties both there and in other cities (Richmond, Austin, Oakland) in a spreading conflagration. Now the Department of Homeland Security, a recent addition to the federal government based on earlier terrorist activity, is supposedly fielding agents to protect federal buildings and round up the people attacking them. To me, this looks like insurrection. This looks like the earliest stages of an armed conflict.

The supposed “Second Civil War” that is being touted in various novels, blogs, and memes right now—including some of mine—is not going to look like the first Civil War of 1861-65. Here is why.

First, the nature of war and the weapons used to fight it have changed drastically in the last 160 years. The Union and Confederate armies were composed of foot soldiers who marched in relatively tight formations and fired muzzle-loading muskets, supported by muzzle-loading cannon and men on horseback scouting ahead of the marching armies. The fastest means of communication was the telegraph wire, usually strung along railroad rights of way. But armies in the field away from the rail lines had to rely on a man riding a horse and carrying a handwritten message. The armies themselves could only meet on suitable ground, a defined battlefield, and usually tried to outflank an opponent to reach their own objective, or ambush an opponent to keep him from reaching his objective. This was all two-dimensional and—except in punitive expeditions like Sherman’s March to the Sea—paid little attention to strategic operations against civilian objectives.

As we’ve watched the progress of war from marching brigade lines to the immobilized trenches of World War I, through the mobile armies of World War II and Korea, to the Air Cavalry in Vietnam, and finally the village and urbanized insurgencies of Afghanistan and Iraq—all with the background of a nuclear exchange in the offing—we know that a modern war on the continental United States will not be anything like the first Civil War. What would Lincoln and Grant not have done if they had helicopters and F-16s, let alone the threat of atomic weapons? A civil war today would probably not even be about taking and holding territorial objectives, especially if the war was not preceded by states seceding from the Union. It might be more like Vietnam, Afghanistan, and Iraq, all about winning “hearts and minds” and punishing insurrection. It might be neighbor against neighbor, with the frontlines drawn between cities and suburbs, neighborhood against neighborhood, like the Spanish Civil War of the 1930s.

Second, the looming civil war might well not be one of secession and recapture. Between the first Civil War and today, the nature of our governments at both the state and federal level has also changed. During the 1860s, the state governments were relatively strong, and the federal government was relatively weak. The federal government, at least at the start of the war, was small and funded mostly by customs duties, tariffs, excise taxes, and some direct taxes. There was no national income tax—but neither were there immense federal programs and outlays for Social Security, Medicare, and Medicaid; transportation projects associated with the Interstate Highway System, along with control and regulation of rail and air travel; educational standards and directives, backed up by grants and benefits; environmental projects and regulations; financial audits and controls, including the Federal Reserve and its management of the economy and the money supply; and the thousand other things we depend on the federal government to provide today.

Whether the current “Red States” in the central part of the nation secede from a Union dominated by the “Blue States” along the two coasts and the upper Midwest, or vice versa, one group is going to be left with all those federal programs, along with the Federal Reserve and responsibility for all those Treasury bonds and the federal debt. Maybe everyone in the part of the country that secedes will be comfortable with giving up their Social Security and Medicare contributions and future benefits, all that highway and education money, and everything else we’ve come to rely on the federal government to supply. Maybe forgoing their share of the looming federal debt would be compensation enough. But rewriting those funding and social service obligations under a newly conceived and authored Constitution and code of laws would be a gamble for most people. Some—especially those with much to lose under a new interpretation of the tax code—might think twice about giving up the devil they know for the one that has yet to be born.

And then there are the pesky details of what would become international transactions. For one side or the other, the companies and networks we all expect to function smoothly—the internet and its cloud computing resources; the U.S. Postal Service and delivery services like FedEx and UPS; distribution networks like Amazon and eBay; communications services like AT&T and Verizon; farming, food processing, and distribution companies that keep the rest of us supplied with flour and bread, vegetables, chicken, beef, and Hostess Twinkies; the electric power pools and their system exchanges; control and security of interstate air travel and railroads; oil and gas transmission pipelines, to name a few—will all be tossed into a cocked hat and distributed variously between two different countries. Some of these functions will continue smoothly under international agreements. Others will become fragmented, prizes to be pulled apart in the interest of benefiting one party while hurting the other.

Any way you look at it, our country—the whole United States—has become far more interconnected and centrally governed, less regional and local, less independent, than it was 160 years ago. A breakup into Red and Blue, if that is even the correct dividing line anymore, would be far more difficult to pull off, and even more difficult to operate in two halves—especially if the Blue halves were physically separated by a big Red chunk in the middle, with borders, tariffs, and travel restrictions going both ways—than the country that divided in 1861.

All of this is food for careful thought before we embrace the Civil War Meme and start picking sides.

Sunday, August 2, 2020

Beyond Socialism

Puppet master

The progressive far left of the Democratic Party, taking the line from Bernie Sanders and other members of the “Democratic Socialist” persuasion, has put forward a number of proposals designed to appeal to American voters: Medicare for All, Free College for All, Universal Basic Income, and similar direct subsidies from the federal government. To the ears of those of us on the center and right, this sounds like classic socialism. But is any of it really the kind of socialism that Karl Marx, Rosa Luxemburg, Vladimir Lenin, Leon Trotsky, or Mao Zedong would recognize?

Let’s take Medicare for All. Presumably, it would extend Medicare coverage to all Americans, not just those over 65. Presumably, it would continue the payroll withholding that we all pay as workers covered by Social Security and Medicare, currently levied at 2.9% of wage and salary income, split between employer and employee (or wholly paid by the self-employed), with no cap on the wages subject to the tax. Also presumably, it would still cover only 80% of the recipient’s medical and hospital costs, and still allow the recipient to buy supplemental private insurance (“Medigap” or “Medicare Advantage” plans) to cover the remaining 20%, usually with a modest co-pay for doctor and hospital visits and drug costs. So this is basically a government-run insurance program, managed by the Centers for Medicare and Medicaid Services. This is not philosophically different from private insurance, except that it is supported—at least in part—by payroll tax revenues and does not allow for a profit motive. As the largest single payer in the country’s medical system, this extension of Medicare would be able to dictate to doctors and hospitals the levels of service they might provide and the charges they could bill. These would be “negotiated” to the same extent that any dealings with a monopoly supplier or a monopsony buyer—think of the U.S. Department of Defense when buying weapon systems—are a negotiation.
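
To put rough numbers on those percentages, here is a back-of-the-envelope sketch in Python; the salary and hospital bill are hypothetical figures, purely for illustration.

    MEDICARE_RATE = 0.029                  # total payroll tax rate
    salary = 60_000                        # hypothetical annual wage

    employee_share = salary * MEDICARE_RATE / 2    # 870.00, withheld from pay
    employer_share = salary * MEDICARE_RATE / 2    # 870.00, matched by the employer
    self_employed_total = salary * MEDICARE_RATE   # 1,740.00 if you pay both halves

    hospital_bill = 10_000                 # hypothetical approved charges
    medicare_pays = 0.80 * hospital_bill           # 8,000.00
    gap = hospital_bill - medicare_pays            # 2,000.00, the 20% left to the
                                                   # patient or supplemental coverage

    print(employee_share, employer_share, self_employed_total, medicare_pays, gap)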

Contrast this with the healthcare offered under socialist systems like the British National Health Service (NHS). There the government health agency owns and operates the hospitals and other facilities, directly purchases medical equipment and supplies, directly employs doctors, nurses, and other staff, and provides service for free to British citizens—other than through the taxes they pay. Some doctors and services may be privately owned and operated, but they must be supported by patients paying their own way or having private health insurance. The Canadian healthcare system, called Canadian Medicare, is funded by taxes through the provinces and pays the bulk of costs associated with hospitals and doctors—which are then owned by private, nonprofit institutions. Unlike Britons, Canadians in most of the provinces cannot use private insurance to pay for government-covered basic services or obtain private, fee-for-service medical care anywhere within the country.

Or consider Free College for All. Presumably, the government—which now guarantees almost all student loans—would continue to offer them but at almost no interest or without payback terms, making the money virtually free as a taxpayer-funded service. I have not heard of any proposal with a plan for nationalizing the universities and community colleges, putting those institutions under government ownership and control, and directly hiring and paying the professors and other staff.1

Neither of these progressive proposals in their most advanced form would involve the government actually owning the means of production (hospitals and colleges), managing the infrastructure for provision of services (administration and billing), or employing the personnel who offer those services (doctors and professors)—which is the classic definition of socialism. Doctors would still have to rent, staff, and furnish their own offices. Hospitals would have to build and equip facilities and maintain the requisite number of beds based on their estimates of the available market. Colleges would still have to acquire land, build classrooms and football stadiums, and determine their own curriculums based on the needs of their projected student population.

The “socialism” offered by the progressive Left in America is not about ownership of the means of production and infrastructure for services.2 It’s not even about financing these means on the producing side. Instead, it’s about providing the individual buyer with the money to pay for the goods and services he or she takes. Building and paying for the factories to make goods, the commercial associations to provide services, the stores and offices to distribute them, and the logistic systems to mediate between them are all left to the individual, privately held providers. This is even less socialist than our current system of providing universal, K-12 education: the local public school, which is owned and operated by the local school district, which in turn is an agency of the local government.

Socialism is messy. As every country that has tried it—nationalizing all means of production and provision of services—has discovered, it’s hard to get politically connected bureaucrats—usually rising party members—to care deeply about or know and understand intimately the facilities and people they’ve been sent to manage. Suddenly, the government is responsible for making things that actually work, getting them into the public’s hands in a timely fashion, and providing the services that people need to go on living and thriving. That’s all hard and takes dedication—and usually a stake in the game, represented by some kind of return on the managing individual’s time and effort, not just a gold star at party headquarters. Owning the business and making people happy are hard when done at the remove of government and party politics.

Even Elizabeth Warren, one of the Democrats’ most ardent progressives, doesn’t want to nationalize the U.S. economy as the Soviets tried to do (and failed) or the Chinese Communists tried (and eventually slid into a form of crony capitalism). Instead of abolishing U.S. corporations and their funding mechanism of venture and shareholder capitalism, Warren would put them under a charter system. The corporations would still own their productive facilities, make investments, and manage themselves, but they would operate under rules and obligations dictated by the federal government. This system would put the government’s social goals and recognition of other stakeholder needs—like communities, minorities, customers, unions, the environment, and whoever else comes to mind—alongside whatever the shareholders and owners of the company are trying to achieve. She wants to control the corporations and the U.S. economy without actually taking responsibility for making wise investments, creating and supplying useful products, offering good service, or running the business without running it into the ground. However, when you’re not responsible for the productive outcome, you can cheerfully make rules without regard for consequences.

That said, in relation to Medicare for All, Free College for All, and similar proposals, there is still the moral equivalent of socialism when the government is the sole buyer—the monopoly buyer—of medical, educational, and other services. Through the rules it lays down for which products and options the individual buyer may access under the government program, and under what conditions they will be provided, the government is in a position to dictate pricing and supply terms to the independent providers. Some providers, as under the British NHS system, would still exist outside the government-funded products and services, able to charge wealthy clients paying with their own money or with private insurance for non-government–covered services and procedures. However, there may not be enough wealthy people to go around to make offering these products and services a profitable strategy. Other providers, as under the Canadian system, would likely be limited to serving all citizens through the government-funded facilities without a private exception and rationing products and services accordingly. No, he who pays the piper not only calls the tune but, essentially, owns the piper.

Under those conditions, we can only hope the piper survives. And maybe, ultimately, that is the point of these proposals.

1. There are partial exceptions: the California State University (CSU) and the University of California (UC) systems, for example, both of which operate campuses up and down the state, are largely funded by public money from taxes, supplemented by tuition, fees, and other resources.

2. See, for example, Why Own When You Can Rent? from October 13, 2013.

Sunday, July 26, 2020

Created Equal

Declaration of Independence

You know that place in the Declaration of Independence, in the second paragraph, where it says, “We hold these truths to be self-evident, that all men are created equal”? I have always stopped right there.

To me, this is not a self-evident truth but fatuous nonsense. No two persons are or ever were equal. One or the other is always going to be bigger, stronger, faster, smarter, shrewder, more handsome, more sexual, more adept with women, better able to handle his money or his liquor, better at cards, better at life. No two things in this universe, not two stars or planets or galaxies,1 were ever “created equal.” So, for a long time, this statement—at the core of the Declaration—puzzled me.

But the idea of equality encompassing all these characteristics is actually a late 20th- and early 21st-century notion. To understand what Jefferson was writing about, you have to see this statement in terms of 18th-century suppositions2 and pay close attention to the clauses that follow.

The 18th century was a time when people were presumed to be divided at birth between nobles and commoners, lairds and crofters, lords and serfs, “the better sort of people” and “the rest of them.” This was a given: that a man (or woman) was born into a certain station in life, and that for a person to rise out of it—or fall from it—was a remarkable thing. “Rising above your station” was also contrary to God’s preordained intentions, because it was God who put you there in the first place; it was part of His plan for you. Changing one’s position in life violated the “Great Chain of Being”3 and was unheard of in the Europe of the Middle Ages and even into the Renaissance. People and their classes were fixed for eternity. But by the 18th century, in England at least, a prominent person of rather humble birth who had done good service to the crown or the nation as a military officer or politician might be granted a baronetcy, essentially an old-fashioned knighthood with the privilege of hereditary transference, ranking just below the lords of the peerage. Sort of a back door into the club.

But in the American colonies, where people with actual noble rank were only visitors and not cast upon the distant shore to make their way or die, elevated station was not gained through inheritance—or not at first, and not officially—but rather as the product of holding large amounts of land as a scion of the early settlers or making or inheriting vast amounts of money in trade. And such holdings, along with their unofficial rank or status, could just as easily be lost through stupidity, ill fortune, or a bad turn at cards. By 1776, the notion that some people are born better than others, rightfully holding a higher station in life, favored by God, granted rights and privileges to which the common run of humanity was not entitled—well, that Old World view was pretty much blown. Or that was the mindset of the radicals of the time: Jefferson, Washington, Adams, and everyone else looking to start a new country on this continent.

And so we come to the clauses that follow: “that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty and the pursuit of Happiness.” And that’s the nut of it. A free man on American soil was not a peasant to be bound to the land or ridden down on the road by a lord on his high horse. He was free to work where he wanted at the thing that made him happiest.4 And it was hard to hold a man at a job in town or on the landlord’s farm when the whole colony occupied only the easternmost fringe of coastline—penetrating only one or two hundred miles inland—on a continent that spanned three thousand miles. Anyone dissatisfied with his lot could easily pick up an axe and a hoe, walk off into the woods, come to the first empty clearing, and start farming for himself.

So the Americans of the mid-18th century, the time of the Declaration, did not see themselves as anything less than King George, their lordships of Old England, or members of the Parliament that made rules for their American colonies. Americans were not of lesser station or deserving of less consideration or fewer rights because of their humble or obscure heritage. But at the same time, the concepts of life, liberty, and happiness embedded in the Declaration were understood to ensure only a person’s equal treatment under the law, which granted him or her equal rights and privileges. But the Declaration never meant to guarantee equal expectations and outcomes in life, because each person has to make those for him- or herself.

The distinctions between noble and commoner—or indeed between white European and black African—that weighed so heavily on the minds of 18th-century thinkers have now been worn away by two and a half centuries of actual judicial equality. For most Americans, our shared humanity is a simple matter of personal enlightenment in a society that already values equality. We have made a society where all men are truly equal before the law—or at least that is our stated intention, since the Civil Rights Act of 1964. And so the class and judicial distinctions that impelled the language of the Declaration no longer exist for us. This leaves the present-day reader free to imagine other, perhaps deeper meanings for the word “equal” in that context.

The old class system in England and in Europe generally—between the peasant and the lord—was a system to confer and maintain advantage and power. The families that attained leadership positions in the neighborhood or as adherents to the king, or those with large landholdings—usually the same thing, as land was generally gifted by the king as a remnant of feudal times—maintained these rights by passing them down to their sons and occasionally through their daughters. Noble titles usually accrued to such positions of power, and eventually the inherited title came to mean something even after the political power or the land and money were gone.

In America, although a person might be born into low station, that guarantee of liberty and the pursuit of happiness allowed the individual to pursue and maintain large landholdings—remember those three thousand miles of open land stretching to the Pacific Ocean—or earn vast fortunes through enterprise and industry. And the 19th and 20th centuries were a time of huge technical advances in mechanics and energy, which enabled many entrepreneurs to build great wealth, just as the late 20th and early 21st centuries now provide opportunities in cybernetics. By passing that wealth down to sons and daughters, the founders of such fortunes have built dynasties not much different from European nobility.5

But now, with a new reading of that “created equal”—focusing on the desirability of equal expectations and outcomes—the American spirit seems to want no inequality. There should be no systems of power, no wealth to be made for oneself and one’s family by hard work, prudent saving, and shrewd investment. The government, it is presumed, will see to it that everyone prospers equally, shares equally in the goods and services created by society—not by individuals or corporations or inspired inventors or from the fruits of personal labor, but by the faceless forces of society as a whole. That way, no one gets more than anyone else, and we will all be equal in fact as well as under the law.

Of course, that’s poppycock. Marx may have imagined that, after a brief struggle under the dictatorship of the proletariat, the state would eventually wither away, and everyone would settle into joyously producing according to his abilities and contentedly consuming according to his needs—forever. But Marx was a madman. In reality, in a nation with a large and diverse population, farms and factories scattered across great distances, and long-distance logistics, someone has to operate the distribution system and decide whose needs will be met, in what order, and when. That, my dears, is a position of power. And people with power will tend to profit by it and want to pass it along to their sons and daughters. It is in our human nature. That impulse created the new nobility of the Nomenklatura in the old Soviet Union and the ranks and privileges of Party members in every other socialist totalitarian state.

You can no more eliminate the concept of power from human society than you can eliminate sex or hunger, envy or greed. You can dampen the notion of ambition and make it generally distasteful to a large swath of the population. But someone will notice that the reins are flopping around on the horse’s back and take it upon himself to reach for them. That is in our nature, too.

So long as people are born with differences in the characteristics that count in life—some being smarter, some shrewder, some with more energy, or talents, or just with different dreams—equality in expectations and outcomes will remain a phantom, a figment, a madman’s goal, a utopia never to be found under heaven.

1. Except maybe for subatomic particles—like protons, electrons, and neutrons—but even there modern physics is having doubts about the construction of the Standard Model of particle physics.

2. We also have to get over our modern feminist principles and the visceral reaction to that reference to “all men.” In the 18th century, the differences—or lack of them!—between the sexes and their roles in family and public life were not questioned. “All men” was not read as a distinction from or exclusion of women, but instead the phrase encompassed mankind—what we now are taught to call “humankind”—and the species H. sapiens generally.

3. The Great Chain of Being was a medieval Christian construct, intended to order the universe and everything in it from God Himself (the highest level) down through the ranks of angels, the classes of humans, then animals in the order of their natures and station in the food chain, next the plants, and finally the minerals (the lowest). Without a working knowledge of physics, chemistry, biology, and genetics, it was one way to bring structure to chaos and to keep the scholar’s mind busy.

4. Of course, indentured servants and chattel slaves were different, although early in the next century, if not in the 18th, people were beginning to question that assumption.

5. With the difference that these holdings can be lost through stupidity, changes in the marketplace, or a bad turn at cards. If you doubt this, ask yourself what has happened with the heirs to the Carnegie, Ford, or Kaiser fortunes. The pattern in this country is for an industrial empire to be built by an innovative entrepreneur, managed by the second generation or converted to a publicly traded corporation under a board and appointed managers, and if not converted, then generally lost in the third generation—unless the resources are put into a foundation, such as those of Carnegie and Ford, and managed for some public good.

Sunday, July 19, 2020

On Respect

Helping hand

It has long been a key demand of people who describe themselves as being in some kind of lesser social position that they must get “respect.” What they mean by this is not exactly clear to me. Do they want recognition by total strangers of their innermost and sometimes hidden talents and skills? Deference to their inner sense of excellence? Special treatment because of their reduced situation? Special treatment because of past injustices? What?

I am not prepared to give anyone unknown to me any kind of special treatment. That is, I reserve intimacy, generosity, understanding, and the willingness to be discomfited and indisposed for my friends and family alone. To all others I offer payment in cash—and at arm’s length.

But what I am willing to extend to strangers on the street is a limited form of good will. And this involves a small number of unremarkable acts and gestures.

First, I will show the stranger my own form of courtesy, including the performance of small favors. Courtesy is represented by brief eye contact and a polite smile. A small favor would be holding the door open for someone whose hands are full, or holding it after I pass through so that it does not slam in the face of the person behind me. I will also step to one side on a narrow path to allow another person to pass without hindrance. In traffic, I will let the driver in the other lane make his or her turn, or cross the intersection when it is unclear who arrived at their stop sign first, or who holds the privileged position of being “to the right” (a California specialty). And on the freeway, I will not speed up to get ahead of someone coming up the on-ramp and trying to merge—in fact, I will even move one lane to the left, space permitting, to allow them to enter.1

Second, with that eye contact and smile, I am acknowledging the stranger’s shared humanity. My tacit message is that we have certain things in common: vulnerability to gravity, the laws of physics, a certain unspoken regard for rights of way and fair dealing, a lack of violent intent, and a shared helplessness before the existential ennui of the human condition.2 I am prepared to extend this basic humanity to anything that walks on two legs—or with appropriate prosthetics3—and has human form. Trial, testing, and perhaps being found wanting will come later, if our contact extends beyond mere passage on the street. Until you prove otherwise, you are human and I expect you to be self-aware, properly motivated, gracious in return, and reliably housebroken.

Third, if and when our involvement does become more complex, I will deal with you fairly. This might not be your idea of fairness or how you would act in the same situation. I don’t presume to know your standards and feelings, or how you view the world. This will be what I consider fair and even-handed. But, just because you are not one of my intimate friends, that does not mean I will try to short-change or cheat you, take advantage of you, or treat you as prey, a confidence mark, or an enemy. I have no reason to hate you.

Fourth, with that increased involvement will also come my tacit pledge to tell you the truth. Again, this will not necessarily be your truth or anything you might wish to hear.4 I do not presume to know your mind. This will be the version of reality as I understand it, without fear or favor for what might lurk within your consciousness. I will try to present you with an interpretation of reality that we both can find useful. Just because I don’t know you, that does not mean I will try to trick or deceive you. I have no reason to lie to you.

Fifth and finally, if in that extended encounter we should develop differences of opinion or intention, I will extend to you the benefit of the doubt. I know that my understanding of reality and of the current situation might not be your understanding. I will assume that, when a misunderstanding occurs, it is a case of miscommunication—language being such a slippery thing, and intentions not always clear and obvious—rather than the result of intentional misrepresentation or bad conduct on your part.

This is about as much as I can manage with a stranger—and I expect the rest of the world to be reasonably well brought up and extend the same conditional good will back to me. But, for some people, these underlying, tacit acts and gestures may not be enough.

The person who craves, publicly calls out for, and in every situation demands a visible show of a priori respect generally wants one of two things. Either they want to be treated with the same acceptance and understanding that they believe I extend to my intimate friends and family. Or they want to be accorded the credit, acceptance, and admiration that the public generally gives to popular entertainers, politicians, and sports stars: recognition of past accomplishments—or sometimes simple recognition—that the individual demanding such respect has not yet achieved and may not deserve. They want to be put on a pedestal they have not yet invested the effort to climb.

And that falls under the heading of “Secret Desires and Intentions.” As I noted above, I cannot know your mind. And if you make a claim to distinction that is unsupported, I am within my rights to decline to honor it.

The claim of undue or special respect, like so much in our modern discourse, belongs to the dissonance between “your feelings” and “my reality.” That’s a set of transactions that, even among people who are intimately related, can be full of slippery surfaces. Among strangers, it’s a recipe for social disaster.

1. However, courtesy in driving can be overdone. Too much deference, extended too long or in the wrong situation, can get you rear-ended. When directing two tons of steel on four wheels—or a half-ton gross vehicle weight of motorcycle plus rider—at high speeds, you have to take your position, move in a predictable fashion, and uphold your rights.

2. If I am really feeling jovial and acknowledging that ennui, I might give the stranger a wink. But that’s happened maybe twice in this century.

3. However, a person in a motorized wheelchair or exoskeleton with advanced hydraulics, rendering him or her faster, more mobile, or stronger than the average two-legged variety, will get special attention and, from me, a defensive attitude and positioning.

4. However, I am not in the habit of telling people unpleasant things, as a version of truth, ostensibly “for their own good.” That is an uncivil habit that should not be practiced on people in the street, much less on one’s own friends and family members.

Sunday, July 12, 2020

The Power in Losing

Sprint finale

It has been a commonplace of modern education—and I can only hope it was never this common—that teachers and coaches in the lower grades have abandoned competition for some sort of communal achievement. Teams no longer pick their own players. Matches no longer keep score. Everyone gets a participation trophy. No one gets their feelings hurt. Everyone feels good about themselves.

That sort of toothless play might be appropriate in preschool among the toddler set, or perhaps in kindergarten, so that every child gets a baby-taste of sports while they are still too young to understand the game and its rules. But by the time a child is six or so, and capable of understanding—and also in the final stages of forming his or her initial character1—then the training wheels must come off. Then the child must begin learning what it is to win and, more importantly, what it is to lose, and how to do either one gracefully and in good fellowship.

Young people are supposed to test themselves, try hard things, and occasionally fail. If life is easy, if you are never challenged and never given the opportunity to fail, you only develop half of your character. Sure, it hurts not to be picked for the team at tryouts—or to be picked last when the hopefuls line up, knowing that no one really wants you but the coach says everybody has to play, or the team still has an empty place on the roster. Sure, it hurts when your team does its best and still loses. But these are opportunities to build character and decisiveness more than they are blows to self-esteem.

If you are picked last, you then are forced to think about why. Are your skills demonstrably lacking at softball, soccer, or whatever sport you have chosen? Are you unpopular with the other children? This is the time for self-reflection. If you are unpopular, then do you care more about playing than being yourself? Or are your personality, your behavior, your treatment of others in need of improvement? If you are unskilled, then do you care enough about playing to practice on your own and get better? Or is this the time to decide that softball or soccer is not really one of your core interests, and maybe your time would be better spent practicing for football season—or reading a book?

If your team has lost, then you are also forced to think about why. Did you make obvious mistakes? Does your team need more practice? Did the other team have special moves or better signaling and coordination? Is there a way to improve your play so that you can win against them next time? It’s an adage in business that to increase your success rate, you need to increase your failure rate. That means trying more things, trying different things, and learning from unforced errors and non-stupid mistakes. It’s also an adage in the arts—attributed to graphic novel writer Stephen McCranie—that the master has failed more times than the beginner has even tried.

A child who has never been appropriately challenged does not learn the skills of introspection and self-evaluation. He or she does not have the opportunity to become really skilled and developed at something difficult. Worse, he or she does not get the chance to make deep and lasting decisions about his or her core strengths and desires.

Such a child grows into a brittle and indecisive adult. Such an adult lacks the resilience of someone who has experienced both success and failure and knows where his or her strengths and weaknesses lie. Such an adult is not practiced in the art of self-examination and self-knowledge. Such an adult will tend to believe, deep down, that life should be fair and even-handed in order for him or her to succeed.

The world is a brutal and savage place. For good reason, the childhood version in the classroom and at play needs to be controlled and monitored so that children have a place to develop without being emotionally and spiritually crushed. But neither should they be so sheltered from reality that they fail to develop at all. Then the crushing will come later, in the real world that does not care for feelings, only results. And the young adult will not be prepared for it.

Such an adult will also lack the ability to exhibit what I call “grace.” He or she will lack the calmness—attained through self-knowledge and confidence in one’s own skills—to face challenges and work through them, come what may. He or she will lack the perspective to win with charity, knowing that the loss might have gone the other way, and to lose with dignity, knowing that the win was within his or her grasp. This quality makes for balanced, stable, and charitable people—the sort who are a pleasure to work alongside and to make your friend.

Success teaches you very little, except—sometimes—how to be a gracious winner and good sport to the people you’ve bested. In failure lies the power to grow and develop. And that’s golden.

1. The Jesuit founder Ignatius of Loyola is supposed to have said, “Give me the child for the first seven years and I will give you the man.” Whether this is true or not, we learn a lot in those preschool and kindergarten years.

Sunday, July 5, 2020

Without the Option

Dead bird

This is a dark thought. So if you are at all depressed or suicidal—I’m not, just thinking out loud here—please stop reading and come back some other day. If you are strong and happy, read on at your peril.

The Covid-19 situation has us all thinking, marginally if not centrally, about our own mortality—especially if we are in the age group of the Baby Boomers, whom this disease particularly seems to like. Yes, I could be hit by a bus on the street—or run over on my motorcycle by a semi-trailer truck—and be killed today. Yes, I could develop cancer or some other devastating disease and my life turn terminal tomorrow. Yes, I could get the regular old Influenza A HxNx and die of its debilitating symptoms sometime this year or next. But we haven’t been soaked in four months of statistics about any of those causes and how many are dying each day, each week, each month. This Covid-19 doesn’t sink into the background noise of daily life but remains at the forefront. So, for those of us old enough to take notice, the thought of impending death seeps into our brains.

As an atheist, I am without the comforting option of any kind of belief in an eternal afterlife.1 I know—or certainly believe—that when I die, my mind and my various brain functions, such as thought and memory, will cease along with my bodily functions.2 I will not ascend to some other sphere as a discorporate spirit or psychic wave or sentient vibration. I—the part of me that thinks and plans and hopes—will quickly disappear into darkness. I will not sit on a cloud and look down on this world, on my surviving friends and family, or on any part of my reputation that might live after me, and feel anything positive or negative about them. I will not care. I will be as dead as roadkill, or a tree fallen in the forest, both of which eventually return to dust and their component atoms, leaving no discernible trace in the world. And in a hundred or thousand years, my life will have just as much meaning as that tree or animal among whatever passes for my distantly related family members or Western civilization itself. “Dust thou art, and unto dust thou shalt return”—body and soul, or so I believe.

You might believe this thought would be terrifying. That being nearer to death now than at any time in the past—when, as a younger person, I could cheerfully forget or ignore my mortal nature—would make me dread and fear those last minutes, make me scramble around in this life, frantically trying to put off death and preserve every hour, every minute of breathable viability. Or that it would make me rush out to experience every possible moment of joy or passion or novelty this life still has to offer. But that is not the way.

I am the same person I was up until February of this year: measured, thoughtful, introspective, and curiously unafraid.3 I am unlikely to become panicky or rushed just because the death that was always near has come a little closer—but then, ask me again five minutes before the final exit.

From this vantage point, however, I find death is not so scary. In fact, it will be something of a relief.

For one thing, I will find freedom from responsibility. It seems my life right now—and for all the years before this—has been an endless and widening cycle of responsibilities. These days, I must gather and protect my financial resources, because I am unlikely to earn any more against the future. I must pay my taxes, my condo dues, my ongoing debts—even though I try to pay the latter down every month and am careful about incurring more. I must care for my family members—in these days of coronavirus more in spirit than by my actual presence. I must walk my dog four times a day, following along her trail of smells and sniffs, because we live on the twelfth floor and I cannot just open the back door to let her out into a protected yard.4 But these are just my largest responsibilities today, and they are shared with almost everyone in my age group.

In my own particular makeup, I have lesser responsibilities that have been with me since childhood. Most are the residue of a lingering obsessive-compulsive disorder; the rest are the result of my upbringing by careful parents. I keep straightening pictures that go askew, as well as area rugs—which must align with the pattern in the parquet flooring—and the corners of my piles of books and magazines. I keep wiping, cleaning, polishing—caring for!—surfaces and finishes. I keep my clothes neat and clean—although I don’t iron them anymore, thank you. I must keep the car neat enough to entertain guests, as well as gassed up, serviced, and ready to roll. I do the same for the motorcycle, plus wipe dust off the shiny surfaces every time I take it out and clean bug splatters every time I bring it back. I worry over every scratch and stone chip in the paint, and chase every blemish with a dab of clearcoat followed by polishing compound. I wash and wax, where applicable, relentlessly.

For another thing, death will release me from the need to be and stay strong. It was the way I was brought up—as I suppose with most of the children in my generation. We were taught by parents who had gone through the Great Depression and World War II themselves to be resilient, enduring, patient, and uncomplaining. When work would get hard or complicated, and I would have to stay late or come in over the weekend, that was simply the price of being an adult. The inconvenience of a head cold is not stronger than the daily pattern of obligations, nothing to stay home and pamper myself over, and certainly nothing to justify depriving the dog of her walks and simply resolving to clean up any messes she might make indoors. When my back goes into spasms—as it does in the cold and damp weather—and bending over is hard, that’s not enough to make me stop filling her water bowl, or leave a piece of lint on the floor, or let a rug remain askew. The pattern of life, as established, is more important than its minor disruptions.

Putting up with pain and inconvenience, suffering through that which must be endured, walking with back straight and unbowed into the whirlwind—this is the price of fulfilling my own self-image and the precepts that my parents followed and taught my generation.

Death, when it comes, will be a release of self from the web of life. Even if that is without the option of an afterlife, it may come as a blessing.

1. See, for example, My Idea of Heaven from July 22, 2012.

2. However, brain function may persist for some seconds or minutes after the body stops working. The story is told of Anne Boleyn, whose decapitated head, when held up for inspection, looked down on her severed body and moved her mouth as if speaking. We also know from extensive medical experience that brain cells can survive and be revived without irreversible damage for three to six minutes after the blood stops flowing. We do not die all at once. But those mere minutes are not a basis for belief in eternity.

3. See also Fear Itself from June 10, 2018.

4. But hey, it’s good exercise for me, too.

Sunday, June 28, 2020

The Antifanatic

Apple tease

Everyone these days is talking about “Antifa,” the “antifascists,” who enforce their views with bricks and fists. Well, I have a cure for that. It’s the realization that no one is morally or ethically or humanly required on any and all occasions to speak out, speak loud, speak proud. I call it the “Antifanatic.”

The call to action—putting your heart and soul into discussion or pursuit of a particular proposition—is all too often the call for absolutism. Distinctions are pared away. Qualifications are eliminated. Shades of gray are painted over. The pursuer—or defender—is left with a nice, clear, easy to understand, impossible to refute “yes” or “no.” Everyone is either with us or against us. Do or die. Death or glory. … And this is a remarkable renunciation of the human capacity for reason.

That said, you don’t want to enter into hard battle, a shooting war, life or death, fight or flight while you are beset by doubts. When the bullets are flying your way, you don’t want to be—and you don’t want those on your side of the trench line to be—questioning, maybe, your responsibility, just a little bit, and your moral justification for receiving those bullets. When war comes, you put your quibbles aside and take up the gun.

But not every social situation is a war. Not every injustice is a cause for rebellion and revolution. And that seems to be the problem today, in our politics, our economics, our culture, and our social connections. Every issue rises to the level of New York Times 48-point Cheltenham Bold Extra Condensed all-caps headline font, usually reserved for a declaration of war. The volume is turned up to twelve on a scale of ten. And so distinctions and qualifications drown in the noise of drumbeats.

I say this because most of life is gray. Living is a matter of distinctions and qualifications. Choices are seldom “yes” or “no” but more a case of “better” and “worse,” of “maybe” and “it depends.” In very few cases is the benefit or the risk absolute; more often it is a matter of calculation. And that is what the human brain was developed for, after a million years of deciding how to live in the world. In the hunt, there is no perfect knowledge and no obvious choice. Instead, the game may be in sight but just a bit too far off, or with the wind or foliage causing interference. There is seldom a perfect moment for the shot, just a collection of perceived opportunities and risks. And in the gathering of foodstuffs, there is no single, perfect berry or tuber to be chosen and plucked, only a selection of those on either side of optimum size or ripeness. How willing a person might be to eat an overripe apple, and maybe eat around a spot of brown decay or maybe a maggot on one side, depends on how hungry that person is. Circumstances and perspective are everything.

But it’s hard to get a political movement or a religion—and these days they can often be confused—started with an overlay of rational analysis, the weighing of distinctions and qualifications, the necessary consideration of “maybe” and “it depends.” The political leader wants his followers to shout “Yes!” The pastor wants his congregation to give a big “Amen!” That’s how you build a compliant crowd—by paring away individual thoughts and distinctions and cutting away any middle ground.

As a writer, I do not have the luxury of retreating into absolutes, of indulging the human craving for pure black or pure white, of sliding into absolutism. The writer’s job is to see multiple viewpoints, or so I believe. My characters are not—should not be, cannot be—creatures of simple tastes and absolute beliefs. My heroes need to have doubts and failings. My villains must have thought and reason.1

Every human has a story. Most humans—except those who were mentally impaired to begin with, or have lost their minds through fever, dementia, or other illness—have reasons for their choices and actions. The story may be tragic and leave the person bruised and selfish. The reasons may be faulty and the choices hasty. But those are all clues to character. Every human being is a ball of string set in the glue of circumstance and opportunity. And it’s the writer’s job with an imagined fictional character—or the psychiatrist’s, with a real patient—to untangle or cut through the string, to see how the glue seeps into the layers of choice and desire, and to find the main strands of personality. This is the excitement of exploring a character as he or she navigates the obstacles of a plot and action.

The fanatic would dismiss all of this discernment and analysis as irrelevant. The fanatic pushes a single thread, a single viewpoint, to the edge of absurdity—and sometimes goes right over into virtual madness. Any position taken as an absolute will lead to absurdity. In a world of “zero tolerance” about personal weapons, a nail file or a butter knife becomes a verboten possession. In a world of absolute support for a woman’s right to choose, strangling a baby at birth becomes a reasonable option. In a world of absolute justice without compassion or mercy, a man can be hanged for stealing a loaf of bread. Fanatics always end up defending positions that any reasonable person would find questionable, unjustified, and sometimes abhorrent.2

Reason is the human birthright. The ability to think and act as an individual, rather than as part of an arm-waving crowd, is every person’s prerogative. This is why human society is so varied and diverse: we each see a different set of distinctions, shades of gray, opportunities and obligations, benefits and risks.

Let’s keep it that way.

1. This is one reason I don’t write fantasy. It’s too easy to paint the villains as mindless evil for evil’s sake. Even a psychopath or a sociopath has a background story and a reason for his or her choices. In The Lord of the Rings, for example, the titular character and driver of the entire story line, Sauron, is never shown as a viewpoint character or even as any kind of living presence. Sauron is pure evil, malice without reason, power for power’s sake, the dark cloud that stretches in the east. We never get inside Sauron’s mind because there is no human mind there, just the darkness. This is not a failure on Tolkien’s part, but Sauron is the one character in the whole saga who operates without a discernible motivating cause and cannot be mistaken for human. Sauron is the fire, the storm, the mindless force of raw nature, not someone with whom we can identify—or bargain, negotiate, and reach agreement.

2. One of the reasons I am not a serious collector of things—stamps, coins, Hummel figurines—is that the initial attraction and the “Hey, neat!” response of the beginner eventually becomes the feverish pursuit of the rarest, least obtainable, and sometimes the ugliest or most flawed member of the defined group. (This is because ugly or flawed things usually don’t get produced in large numbers. Cf. the U.S. “Inverted Jenny” 24-cent airmail postage stamp.) I want to preserve the beginner’s initial fascination and avoid the fanatic’s ultimate heartbreak.