Sunday, September 20, 2020

The Truth

Total honesty

I have always believed in the truth: that one should try to understand it, to know and speak it whenever possible, and to accept it, even if the implications and consequences work against one’s own prior assumptions, beliefs, advantages, and one’s personal situation. I would rather know and follow the truth than be happy and whole in the shadow of ignorance or a lie.

It was this basic adherence to the concept of truth that kept me from following my grandfather’s career path—although he was a great believer in truth, too—into the law, which everyone in the family thought would be my future, because I was so verbal as a child. But as I grew older, I realized that a lawyer deals mainly in argument, precedent, and the intricacies of the law as a giant logical puzzle weighing rights and advantages. I knew or suspected that a lawyer must sometimes decline to know or search for the truth—the facts of what actually happened, which he or she is required to bring into court, if known—while working toward an argument or an interpretation of the known facts that will best serve the client’s purpose. Because the law can put some gain above the human obligation to know and speak the truth, I knew it was something I feared and dared not dabble in.

So I studied English literature and became a devotee of storytelling. Fiction, a made-up tale about made-up people, is not necessarily a violation of the truth. It is not exactly telling lies. An author telling a story is like a blacksmith forging an iron blade. The smith hammers away the surface scale, and with it the impurities that cloud the pure metal underneath. And so the author hammers away the alternate interpretations and contradictions of a life situation in order to reveal a pure fact, or sequence of events, or understanding of the human condition that the author recognizes as true.

When I write about “the truth” here, I am not referring to biblical truth, or revealed truth, or a studied construct made of equal parts belief and hope. I am talking about a summation of observations, of experienced cause and effect, of facts that have been seen and where possible tested and annotated, of things we know to apply in the real world. It’s an elusive thing, this truth, but something that I believe can be observed by one person, formulated into a statement or story, communicated to another person, and received by that second person as something that is apparently if not obviously real and congruent with known facts.

It is therefore an article of faith with me as a fiction storyteller and a nonfiction communicator that language can have adequate if not exact meanings, in terms of the denotation and connotation of words. That one person can share an idea, a realization, a piece of truth with another person through verbal means and not be completely misunderstood. Some misunderstanding may take place. Sometimes one person does not have the same meaning—denotation or connotation—for a word that the original speaker does. Sometimes the recipient of the thought has different ideas or beliefs that get in the way of examining the story or statement and perceiving it in the same way that the original formulator meant or intended. Accidents of language and intention do happen. But, on the whole, between people of fair mind and unbiased perception, communication of the truth is possible.

It is also an article of faith with me that truth exists outside of our personal, subjective perceptions. That is, truth is an object waiting to be discovered, analyzed, and discussed. It is not merely a personal belief that is particular to each person and changes with his or her perceptions based on personal needs and desires. Two people can point to the results of a scientific experiment, or an art form or artifact that was carved, painted, written, or created by a third person, or to an event common to their experience, and reach agreement as to its nature, purpose, and quality.1

Of course, I am not a fool. I do not believe that every truth can be discovered and stated. I understand that some things are the product of chance or probability and so can fall out one way or another. I understand the quantum physicist’s dilemma when dealing with very small, intimate systems, that the act of observing a particle in flight—usually by bouncing a photon or some other particle off it—changes its direction of flight immediately. So the physicist can know where a particle was, but not, at the same time and with full precision, where it is now and where it’s going.
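
For readers who want the textbook version of that limit, it is Heisenberg’s uncertainty relation, which puts a hard floor under how precisely a particle’s position and momentum can be known at the same time:

\[ \Delta x \,\Delta p \;\ge\; \frac{\hbar}{2} \]

Here Δx is the uncertainty in position, Δp the uncertainty in momentum, and ħ the reduced Planck constant. The more tightly you pin down one, the more the other must spread.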

And I do understand that humans, their perceptions and interpretations, and the things they hold to be important are constantly changing: that we do not live in the same world of values and feelings that was inhabited by the ancient Greeks and Romans, the Medieval or Renaissance Europeans, the ancient or modern Chinese, or the Australian Aborigines. Humans are exciting and varied creatures, constantly evolving and reacting to the products of their own minds, and this is not a cause for concern. But I hold it as a postulate that, given good will and a common language, they can learn from each other, share ideas, and arrive at an objective truth about any particular situation or experience. However, I also understand that reaching such an understanding may require one or more human lifetimes and leave little room for other explorations and understandings. That’s why we have books and can read.

At the same time, I do not believe that human nature changes very much. What people believe and hold to be real may be influenced by great thinkers, prophets, and teachers. Otherwise, the Islamic world would not have taken such a turn away from the Judeo-Christian tradition that was in part its heritage. But people still have basic needs, basic perceptions of fairness and reciprocity, and a basic sense of both the limitations and the possibilities of the human condition. Until we become incorporeal creatures of energy or immortal cyborg constructs, issues of life and death, family and responsibility, need and want, will be for each of us what they were in the time of our hunter-gatherer ancestors.

And yet there are also things we cannot know about each other. I cannot know what is really going on inside your head. Even if I know you well enough to trust your nature and sense your honesty, even if you use words well enough to express your deepest feelings accurately, there are still secrets people keep to themselves and never tell even their nearest and dearest. There are still secrets people hide away from their own conscious mind, secrets that, like the movements of great fish in the deep waters, can only be discerned by the effects they have on their lives, their loves, and their mistakes and missed opportunities.

That is a lot of unknowing and things unknowable for someone who believes in the truth. But as I said, to be completely knowledgeable would take a library of great books and several lifetimes to read them. All any of us can do is try to start.

1. When I was studying English literature, back in the mid to late 1960s, we were taught what was then called the New Criticism. This was the belief that the work of a writer or poet—or a painter, sculptor, or musician—stood on its own and could safely be analyzed by any person with sense, feeling, and a knowledge of the language. This displaced the author’s own authority over the work. The author’s claims about “what I meant” or “what I intended” might be interesting but did not define the work. Sometimes an author intends one thing but manages—through accidents of carelessness or vagaries of the subconscious—to achieve something else and sometimes something greater than intended.
    This is opposed to the literary criticism called “Deconstruction,” which has been taught more recently in English departments but is something I never studied. Deconstruction apparently teaches—at least as I understand it—that words, their usage, and their underlying reality are fluid rather than fixed. That they are so dependent on a particular time, place, and culture that trying to understand the author’s intended meaning, from the viewpoint of another time or place, is practically impossible. And therefore it is useless to discuss “great books” and their enduring value. That nothing is happening in any universally objective now, and everything is subject to current reinterpretation. This is, of course, anathema to me. It is a denial of any kind of perceivable truth.

Sunday, September 13, 2020

End of Days, or Not

Red-sky dystopia

This past week has been weird and depressing. A growing number of fires in California cast a pall of smoke into the atmosphere over the northern part of the state, like a high fog but with a gritty perspective in the middle distance and bits of ash floating silently down, so that your car’s hood and fenders are speckled white. There’s a cold, red-orange darkness at noon, like the fume out of Mordor, or like life on a planet under a red-dwarf star. You’ve seen the pictures on friends’ Facebook pages—not quite as apocalyptic as my stock photo here, but still disturbing. It’s like—and I think this is a quote from either J. R. R. Tolkien or J. K. Rowling—you can’t ever feel cheerful again.

On top of that, Monday was the Labor Day holiday. So, for those of us who are retired and only loosely connected to the working world’s rhythms, that day felt like a second Sunday. Then Tuesday was like a Monday, Wednesday like Tuesday, and what the hell is Thursday supposed to be? Glad as we are for a holiday, it throws off the pace of the week and makes everything feel subtly weird.

And then there are the overlying, or underlying, or background burdens of 2020. The pandemic drags on and on, so that we are isolated from family, friends, and coworkers, except through the synthetic closeness of a computer screen. We wear masks in public, so that we are all strangers to each other, even to the people that we know and would normally smile at. We avoid people on the sidewalk and in elevators, maintain a shopping cart’s distance at the grocery store, and feel guilty about touching a piece of fruit in the bin and then putting it back when we find a suspicious bruise. This illness, unlike any other, is not so much a matter of concern about personal safety as a national and social pall that has descended on everyday life.

Because of the closures, our robust, consumer-oriented economy has tanked, and we don’t know when it will come back. The stock market has revived from its swoon in the spring, apparently rising on shreds of pandemic optimism. But anyone who follows the market knows that these weekly swings of 500, 1,000, and 1,500 points on the Dow alone, with comparable lurches in the other indexes, just can’t be healthy. It’s like the entire investor class is cycling from mania to depression, too. Meanwhile, we all know people who have been laid off and are scrambling. We all have a favorite restaurant that is eking by with takeout service or a favorite shop that has closed, apparently for good. We all miss going downtown or to the mall for some “shopping therapy”—not that we need to buy anything, but we look forward to what the richness of this country and its commercial imagination might have to offer us. Buying dish soap, toilet paper, face masks, and other necessities off the Amazon.com website just isn’t as satisfying.

And then there’s the politics. The divisions in this country between Left and Right—emblematic of but, strangely, not identical with the two major parties—have grown so deep and bitter that friendships are ended and family relationships are strained. The political persuasion opposite to your long-held point of view has become the other, the enemy, and death to them! We are slouching, sliding, shoved inexorably into an election that has the two sides talking past each other, not debating any real points of policy but sending feverish messages to their own adherents. And whichever way the national polling falls out—with the complication of counting votes in the Electoral College from the “battleground states”—the result promises to bring more bitterness, more rioting, more political maneuvering, and perhaps even secession and civil war.1 There’s a deep feeling in the nation that this election will solve nothing.

The one ray of hope in all of this is that things change. This is not a variation on the biblical “This too shall pass.” Of course it will pass, but that does not mean things will get back to the pre-fire, pre-pandemic, pre-boom, pre-strife normal. This is not the “new normal,” either. There never was, never is, any kind of “normal.” There is only the current configuration, life as we have it, and what the circumstances will bring. But this is also not the “end of days.”

Every fire eventually burns out. The rains come, the ground soaks up their moisture, and the stubbornest embers are extinguished for another year. We may have other and worse fires—or possibly better drought conditions—next year, but this year’s firestorm will eventually be over. Yes, the ground is burned, homes are lost, and a number of lives and livelihoods are upended. But the ground is also cleared for new growth, and the way is clear for people to start over. As someone who sits on a forty-year pile of accumulated possessions and closets full of just “stuff,” I sometimes think a good fire is easier to handle than a clearing operation where I would have to weigh and consider every piece of bric-à-brac against future need or desire.2 Sometimes you just have to let events dictate what happens in your life.

Every plague eventually fades away. The virus or bacterium mutates into a harmless nuisance, our immune systems adapt to handle it, or medical science comes up with a vaccine, and a devastating disease disappears from our collective consciousness. Yes, we have death and disability in its wake. But death and disability come to us all, if not from Covid-19 then from the annual influenza, or a cancer, or accident, or other natural and unnatural causes. For those who survive, our lives and our attitudes become more resilient, more grounded, more able to take life’s hard blows. That which does not kill me makes me stronger—until it or something worse finally kills me. And this is the way of life on this planet. The essence of the human condition is that we have the self-knowledge, foresight, and insight to understand this, where for every other animal on Earth, life’s stresses are pure misery and death is the ultimate surprise.

Every economic downturn paves the way for growth. At least, that is the cycle in countries that enjoy free-market capitalism. “Creative destruction,” the watchword of economist Joseph Schumpeter, captures the vitality of markets that are able to respond to current conditions and meet the needs and demands of people who are making their own decisions. In my view, this is preferable to one person or group, or a committee of technical experts, trying to guide the economy and in the process preserving industries, companies, and financial arrangements that have outlived their usefulness but provide some kind of national, political, social, or emotional stability that this group values above letting the mass of people make their own decisions.

Every political crisis passes. Issues get resolved, the emotions die down again, and life goes on in uneasy balance. The new stability may not reflect the goals and values that you were prepared to fight for, actually fought for, or maybe even died for. But the resolution is usually a compromise that most people can live with … unless the end of the crisis is a terminal crash, a revolution, a civil war, and a crushing loss that results in a majority—or worse, a virtual minority—beating the other side’s head in and engendering animosities and unhealed wounds that fester for generations and destroy everyone’s equanimity. Sometimes the best we can hope for is an uneasy, unsatisfying compromise that will hold until the next round of inspirations and aspirations takes control of the public psyche.

There never was a normal, just the temporary equilibrium that kept most people happy, a few people bitter, and many people striving to make things better. There never is an “end of days,” because history has no direction and no ultimate or logical stopping place—at least, not until the human race dies out and is replaced by the Kingdom of Mollusks, if we’re lucky, or the Reign of the Terror Lizards, if we’re not.

1. But see my take on that possible conflict in That Civil War Meme from August 9, 2020.

2. I recently completed such an operation with a forty-square-foot storage locker that I was renting, and the exercise took three months and was exhausting. You stare at a book you once thought you would read, or a jacket you once wore as a favorite, and have to decide its ultimate fate. Sooner or later, you just have to let go and throw this stuff away.

Sunday, September 6, 2020

Counterclockwise

World turned upside down

The other day I was reading in Astronomy magazine one of my favorite features, “Ask Astro.” There readers pose questions about the universe and astronomy in general, and experts are called in to answer them. This one asked why the Sun orbits our galaxy in a clockwise direction, while the planets orbit the Sun in a counterclockwise direction.1 And that got me thinking about the arbitrary nature of directions and much else in our daily lives.

After all, the conceptions of “clockwise” and “counterclockwise” didn’t come into use until people started telling time with geared mechanisms instead of the angle of the sun, sand running through an hourglass, bells rung in a church tower, or candles burning down past inscribed markings. Clocks with gears were invented and reinvented using different driving forces—water, pendulums, springs—in ancient Greece in the 3rd century BC, China and Arabia in the 10th and 11th centuries, respectively, and Europe in the 14th century. The fact that most round clock faces count time by moving the hands from left to right—clockwise—is based on the usage of early sundials. These instruments track the Sun: rising in the east, it casts the shadow from the sundial’s gnomon toward the west. Then the Sun moves to the south at midday, casting its shadow to the north. And finally, the Sun sets in the west, casting the shadow in the east. All of this, of course, is predicated upon the person observing the sundial facing north as a preferred direction. This daily rotation from west, or left, to east, or right, was so familiar that early clockmakers copied this movement.
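
A toy calculation makes the geometry concrete. The little sketch below is not a solar-ephemeris model, and the function name and the constant-rate assumption are mine, purely for illustration. It assumes an idealized equinox day on which the Sun’s compass azimuth sweeps at a steady rate from east at sunrise to west at sunset, passing through south at midday in the Northern Hemisphere and through north in the Southern, with the gnomon’s shadow always pointing directly away from the Sun:

def shadow_azimuths(hemisphere: str, samples: int = 5) -> list[float]:
    """Shadow compass azimuths (degrees clockwise from north) at evenly
    spaced times from sunrise to sunset, under the idealized model above."""
    result = []
    for i in range(samples):
        t = i / (samples - 1)               # 0.0 = sunrise, 1.0 = sunset
        if hemisphere == "north":
            sun = 90 + 180 * t              # east -> south -> west
        else:
            sun = (90 - 180 * t) % 360      # east -> north -> west
        result.append((sun + 180) % 360)    # shadow points away from the Sun
    return result

print("North:", shadow_azimuths("north"))   # 270, 315, 0, 45, 90 degrees: a clockwise sweep
print("South:", shadow_azimuths("south"))   # 270, 225, 180, 135, 90 degrees: counterclockwise

The northern shadow climbs from west through north to east, the sweep the first clockmakers copied; run the southern case and it goes the other way around.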

Of course, all of these cultures that used sundials and invented mechanical clocks were spawned north of the equator and only lately spread their timekeeping to cultures and European colonies established south of the equator in southern Africa, South America, and Australia. If those areas had been the home of a scientific, technically innovative, colonizing, and marauding culture, and if the peoples of the Eurasian continent had been inveterate stay-at-homes, then things would have been different.

Clockmakers originating in South Africa, Tierra del Fuego, or Australia might have faced in their preferred direction—south, toward the stormy seas and distant ice floes of Antarctica. And then they would have erected their sundials and drawn their clock faces based on the Sun rising at their left hands and casting a shadow to the west, moving to the north behind them and putting the shadow in front of their faces to the south, and finally setting at their right hands in the west and casting a shadow to the east. Their clock hands would have run in the direction we call “counterclockwise,” and the rest of the world would have followed suit. It all depends on your point of view, which is based on accidents of geography, demography, and historic migrations.

What else might have been different based on these historic accidents?

Certainly, our book texts and traffic signs reflect differing points of view. We in the European-based and -influenced world read texts from left to right, pretty much the same as the movement of our clock hands. But this was not universal. If we had kept the alphabets and scripts of the ancient Hebrews and Arabs, writing from right to left, and orienting our books from what we would consider the back cover to the front, then our literary world would be different and we would stack our library shelves in a different order. But we don’t, because we follow the Latin practice of the Roman culture that dominated the western, and eventually the eastern, end of the Mediterranean Sea and surrounding lands.

The earliest writing forms were different yet again. The Egyptians wrote in both rows and columns, depending on whichever was more convenient, and indicated the direction in which the symbols were to be read by the way that the animal signs—birds, snakes, and so on—faced at the top or side of the text. And anyway, hieroglyphs were for the priestly and aristocratic classes, intended to preserve the thoughts of important people for their future generations, and not for just anyone to read and understand. Early cuneiform writing from Mesopotamia was written from top to bottom and right to left, although the direction later changed to left to right. Chinese, Japanese, and other Asian scripts are generally flexible, written left to right when in horizontal rows, or top to bottom in columns, with the columns themselves mostly read from right to left—although sometimes also left to right.

Ancient Greek, in its earliest inscriptions, was the most practical of all, because texts were written and read from left to right for the first line, right to left for the second, back to left to right for the third, and so on. This was economical because the reader lost no fraction of a second in brain time between the eyes finishing one row of letters at the right end and tracking back to the left side of the page to start anew. This form of writing was called “boustrophedon,” or literally “as the ox plows.” Like most things Greek, it was eminently sensible—but it never caught on elsewhere.
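
If it helps to see the pattern on the page, here is a small sketch of my own (a toy illustration, not a rendering of any real inscription) that wraps a line of text and reverses the character order of every second line; genuine Greek boustrophedon also mirrored the letterforms themselves, which this ignores:

def boustrophedon(text: str, width: int = 20) -> str:
    """Wrap text to the given width, then reverse every second line,
    'as the ox plows' back and forth across the field."""
    words = text.split()
    lines, current = [], ""
    for word in words:
        if current and len(current) + 1 + len(word) > width:
            lines.append(current)          # start a new line when this one is full
            current = word
        else:
            current = (current + " " + word).strip()
    if current:
        lines.append(current)
    # Odd-numbered lines (the 2nd, 4th, ...) are read right to left.
    return "\n".join(line if i % 2 == 0 else line[::-1]
                     for i, line in enumerate(lines))

print(boustrophedon("like most things greek it was eminently sensible"))
# like most things
# saw ti keerg
# eminently sensible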

And then, as to the shape of our books themselves, consider that what we think of as a “book” is really an invention of medieval monks with their manuscript codices,2 followed by Gutenberg and his printing press in Europe of the 15th century. Because Gutenberg was printing single broad sheets, folding them into pages, stacking them, and sewing the stacks together in a continuous, linear format, we have the modern book. Gutenberg probably inherited the idea of printing itself from Chinese books of pasted pages that were developed in the Song Dynasty around the 11th century.3

Before that, the Romans, Greeks, Hebrews, and just about everyone else wrote on scrolls. These were rolled up and packed into cubbies on library shelves, identified for the searcher by clay or wax tags attached to the tube ends. I have often thought that the order of the books we read in the Old and New Testament is rather arbitrary—except for Genesis, of course—and originally was based on whatever scroll you happened to pick up next. Someone must have written out a “cheat sheet” somewhere to direct you to some kind of chronological order after Genesis and throughout the New Testament. But things became easier when the pages were put in neatly linear order in a single sewn book.

A lot of the world we inhabit today—from clock faces, to the way we write, to which side of the road we drive on, to the shape of our keyboards—is pretty much a matter of geography, demography, and perspective. And the solutions we live with are not always the most convenient and sensible.

1. Short answer: The planets formed out of a cloud of dust and gas that started to spin in a particular direction—counterclockwise, when viewed from “above,” or from the Sun’s “north pole”—as it collapsed. But that gas cloud was already moving in another particular direction—clockwise, when viewed from “above” or “north” of the galactic plane. The opposite motions are more or less separate, arbitrary, and related to your point of view.

2. Codices (plural of codex) were handwritten single pages that were grouped together and bound between two wooden boards, as opposed to the rolled scrolls used in earlier times.

3. Printing was presumably invented by the Chinese about four hundred years earlier, where the entire page was carved from a single block of wood. Of course, this was just an advanced form of the rolling seals and stamps that had been in use for thousands of years. Carving a single page made sense when individual Chinese ideograms numbered about 50,000—too many to sort and select as single pieces of type. However, by the Song Dynasty the Chinese printers were doing just that. Gutenberg, with only twenty-six characters to choose from, plus capitals and punctuation marks, had an easier time with movable type.

Sunday, August 30, 2020

Absent the Middle Class

Girl with magic box

I was born just after the Second World War, which means I grew up and became politically aware—or at least what I think of as “aware”—in the Eisenhower, Kennedy, and Johnson administrations. This was a time when the United States was the “last man standing” among the nations that participated in the war, and we came out better than any on either side. We had our infrastructure intact and had built up a huge capacity in raw materials like steel and aluminum as well as manufacturing due to the war effort. We were on top of the world.

That was also the time when the middle class in America was doing its best. Soldiers returning from the war were getting free education on the GI Bill. Homes were being built in newly defined and rapidly expanding suburbs. Business was booming and, even with the returning soldiers, jobs were plentiful. Most people—there were exceptions, of course, especially in the Jim Crow South—were prospering as never before. It was the good times.

The middle class is a relatively new thing in human history. It didn’t really develop until political and social structures had changed: urban life became commonplace, rather than the exception; and capitalism, the free market, and international trade became encoded with commonly accepted practices and rules, rather than just things that happened casually at the village level. The middle class was the place where people who were not nobles and landowners, yet too ambitious and too well educated to remain peasants, could find profitable employment and eventually riches by engaging in large-scale trade outside of selling butter and eggs on market day, or manufacturing outside of single-family cottage industry, or taking on the new roles in banking and legal transactions that supported these intermediate activities.

The middle class was for people with hopes and ideas, for those who sought independence from the old social classes, for those who wanted to do better than their fathers and grandfathers, for those who hungered to prove that they were as good as anyone and a damned sight better than most. It was the class of the feisty ones.

From Roman times and into the Middle Ages and then the Renaissance, the landed class, the nobles and the gentry, despised these strivers. Going into trade or handling money professionally was all about getting your hands dirty. And while anyone might admire a legally trained mind in the Roman Senate or a lawyer at court doing the king’s business, the sort of person who argued about the price of injury to a cow or the placement of a fence line was little better than a conniver and a con man in his lordship’s domain.

And of course the peasants, lately serfs, still working the land that their fathers had farmed and sharing the proceeds with the lord of the manor, all viewed members of the middle class as social upstarts, the big men from town, whose fathers might have been the local blacksmith or miller, and whose grandfathers had been serfs like the rest of us. People who wore britches and waistcoats rather than the peasant’s smock were already getting too big for themselves.

So the middle class has been under suspicion and under fire for a long time. It wasn’t just idle animosity that made Karl Marx and the other socialists of the 19th and 20th centuries despise the middle class with its striving and materialistic values as “bourgeois”—which is just the French word for this class—or worse, “petit bourgeois,” as if they were too small to be significant. And why not? When the politics you’re selling involves state ownership of the means of production, and puts them all in the hands of appointed technocrats, or the revolutionary vanguard, or the modern equivalent of Plato’s philosopher kings, then the people who know how to handle their own or their neighbors’ business practices and money, who will start new enterprises simply because they think they can make a profit from them, and who will obey rules but not wait patiently on instruction from their betters—these people are the bureaucrats’ natural enemies. These are the people who will upset the serenely floating boat of socialistic doctrine and practice. And so these are the people who must be the first to go up against the wall.

And the peasants, the modern blue-collar workers, the ones who are content to do what they are told and lack the ambition or the education to go out and start their own businesses, even as house painters and contractors—they will be quite happy to work in a factory owned by the moneybags class with protections from their union, or work in the factory owned by the state with those same protections in place according to state law, and still have their union—if that’s even needed. The fate of those middlemen, professionals, and entrepreneurs is irrelevant to the new peasant class, at least at the surface of their minds.1

The middle class has always been in, well, the middle, between two classes that would just as happily see it disappear. And the middle class is disappearing these days. Not only is the upper class getting bigger—in terms of its power if not its numerical size—with wealth beyond the dreams of kings and emperors of old, but the lower class is also getting bigger, with more people finding it harder to get the education and the good jobs that will enable them to enter the middle class as professionals, business owners, and independent traders. It is getting harder to own a house rather than rent, buy a new car instead of a used one or a lease, ensure your children a good education, take annual trips on your vacation—if you even get one while working two jobs—and plan for a comfortable retirement.

The middle class is being squeezed. Whether this is a planned process or just the natural course of modern economics,2 it’s happening. It has been going on in every decade of my life since I became politically aware. And I don’t know if it’s because the upper class and the Marxists do well when the majority of the people are more dependent on government and the largesse of big corporations than on their own initiative, or because we’ve lost something of the entrepreneurial spirit that fed bright and hopeful people into the middle class.

But something’s missing. And neither the top nor the bottom seems to notice or care.

1. If the peasant or the blue-collar worker thinks deeply, however, he will wonder where the technologies and inventions of the modern age—electricity, telephones and televisions, personal computers and smartphones, numerous medical advances, and easy credit and banking—all came from, if not from the entrepreneurial spirit of those who have their philosophical roots, if not their family background, in the middle class. But I digress …

2. As Robert A. Heinlein noted: “Throughout history, poverty is the normal condition of man. Advances which permit this norm to be exceeded—here and there, now and then—are the work of an extremely small minority, frequently despised, often condemned, and almost always opposed by all right-thinking people. Whenever this tiny minority is kept from creating, or (as sometimes happens) is driven out of a society, the people then slip back into abject poverty. This is known as ‘bad luck.’ ”

Sunday, August 23, 2020

Thingness and Nothingness

King Lear fool

The other morning I awoke to a realization: that while my mind, my conscious thought and memory, is a “spirit of the air” and therefore immortal, my body, my brain and the heart, hands, and legs that support it, is still a thing, and like all things it has a finite and ending existence in space and time. This is not, of course, a new thought in the history of human inquiry. But it is usually masked on a personal level by the day-to-day assumptions we all make, that our lives are virtually infinite and that death is far, far away.

As a professed atheist with a predilection for scientific empiricism, I am denied the consolation of belief in any kind of afterlife: transmigration of the soul into another body, translation of the mind into a spirit residing in a heaven or hell, or even transition into some kind of universal all-soul or cosmic consciousness. I have to believe that, like everyone in my parents’ generation and even some in mine, or a once-favored pet, a tree in the forest, or a dead fly on the windowsill, we are creatures of time and space, have existence and activation for a brief period, and then go out. The closest my beliefs come to religion is the Buddha’s description of Nirvana, the end of karma and its oppressive, continuing cycle of rebirths: you go out, like a candle flame. Where do you go? Don’t ask, because it is not defined.

In my view, this “going out” is nothing you have to work for or struggle to obtain, like the Buddha’s Nirvana. Instead, it just comes with the package of a person’s being a thing in time and space.

I have argued elsewhere against this “thingness” aspect of being human. My point was that transporter beaming as in The Fly and the Star Trek series cannot work simply because people in general—and I in particular—are not things but processes. Every cell in our bodies, unlike the fixed materials in a brick or a bowl of potato salad, is in constant motion: absorbing nutrients, processing them, and casting off wastes, as well as transcribing DNA and translating RNA to make proteins, and using those proteins as building blocks and catalysts within the cells, along with chemical messages being sent through the bloodstream, and electrochemical messages sent across a network among 86 billion neurons in the human brain. None of this is static; everything is moving. Even with the fastest supercomputer, you can’t map the position of every atom in each cell, deconstruct it, transmit the information across space, and rebuild the body—and not expect to create a muddle of chemical reactions that got reversed, DNA and RNA replications that got interrupted, nerve impulses that got scrambled in transit, and a resulting accumulation of errors and poisons in each cell, the bloodstream, and the nervous system.

But still, much as I am a process, the machinery that is running that process has a fixed place in space and time. My software is firmly anchored to hardware, a squishy, chemically based machine.

The difference between a brick and a human body is that elusive, self-replicating, self-activating electrochemical thing called “life.” A brick is unchanging, except for the abrasions it suffers from use and weathering, and so it is virtually immortal. A living body is constantly changing, renewing cells and tissues as they wear out—at least up to a point—and adapting to its environment both physically as well as mentally and emotionally. But when the physical errors mount up—to say nothing of the mental and emotional errors—and the healing process fails, then the electrochemical spark dies, and the body becomes not unlike a brick—except that it deteriorates a lot faster. This is the mortality of all living things.1

Eventually, the brick will wear away—or you can smash it first—and then you have its component molecules of silica, alumina, lime, iron, and so on. But it will no longer be a brick. You can chemically unmake those molecules into their component atoms, then smash them down to protons, neutrons, and electrons, and finally break the neutrons down into protons, electrons, and antineutrinos.2 Protons have never been observed to decay, and their theoretical half-life is many times longer than the estimated age of the universe, so they may in fact be immortal—but proton decay still could happen. So, yes, you can unmake a brick or a human body down to the point of dissipated energy and subatomic fragments. And then neither of them will be a recognized thing anymore—nor, in the case of the human being, any kind of a process—and that’s as close to nothingness as you can get.
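
For completeness, the neutron decay mentioned here is ordinary beta decay, which in standard notation reads

\[ n \;\rightarrow\; p + e^{-} + \bar{\nu}_e \]

that is, a free neutron becomes a proton, an electron, and an electron antineutrino, which is where the list above comes from.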

When my process dies away to mere thingness, and my thingness disintegrates to nothingness, then nothing will be left. I may hope to be outlived by my children (if any), by my good works, by the love of the family and friends who knew me—all mortal and thing-based processes themselves—and by my books in the form of disintegrating paper and dissipating electrons. But in reality, any whispers of my existence will all have disappeared long before the last molecules of my body have blown away in a dust of subatomic particles.

That’s a grim thought for a weekday morning. I suppose you would call it a depressing thought. But in the long view—and here I’m talking about human history, culture, and many multiples of a human life—we don’t want things to last forever. Yes, I’d like a “lifetime warranty” on my mattress or my refrigerator, maybe not so much on my car, as I tend to enjoy the process and excitement of buying a newer model every couple of years. But we don’t want people, objects, or even stories and ideas to outlive their usefulness, to become meaningless—or worse, a fool’s punchline—for later and later generations. The imagined life of an Elrond or a Dracula, persisting through the ages of men, is indeed a tragic story.

Ozymandias was a warning to all who would yearn for immortality.

1. And yes, literature is filled with virtually immortal creatures such as elves and vampires. But I remind you that these are creations of the human mind and not actually found in nature. Even the Sequoia sempervirens, which can live for up to two millennia, eventually sickens, falls, and decays—if fire doesn’t get to it first.

2. Free neutrons outside an atomic nucleus tend to decay in about fifteen minutes.

Sunday, August 16, 2020

AI and Personality

Grinning dog
Starleth quadruped

So I’ve been following advances in artificial intelligence (AI) ever since I wrote my first of two novels about a self-aware computer virus. The current computer science isn’t up to actual self-awareness yet—more like at the stage of complex routines that can establish a goal, accumulate new information, hold it in long-term memory, and act on and repeatedly modify that information. This can simulate a living intelligence in terms of the Turing test, but it still might not be self-aware.

My little dog Sally is not self-aware, either. That is, unlike humans, dolphins, and perhaps elephants, she does not perceive herself or have consciousness of herself as separate from the reality she inhabits. The test of this is the mirror: when human beings look in a mirror, they recognize themselves, associate the image with what they think of as “me,” and may touch their face or hair to improve that image. If you put a mirror in a dolphin’s pool, then strap a funny hat or put a mark on the dolphin’s head, the animal will go to the mirror to check out its “own” image. But if my Sally approaches a mirror—we have several floor-length mirrors around the apartment complex—she either pays no attention to the small dog coming toward her or is eager or shy about meeting that dog. For her, the image is “other,” not herself.

And yet, Sally has a definite personality. I was reminded of that this morning. We had gone out through the back of the complex for her early walk, where she had dutifully completed her business. There were other dogs out there, she knew, and she is shy about meeting any dog bigger than herself. When we got back inside, at the elevator lobby, I discovered that the up and down buttons on that floor were not working: we would have to go outside again to an external stairway and climb to the second-floor lobby, where I knew the buttons would work. And at the outside door, Sally balked. She would not go back out again. She didn’t have any more business to do; there were big dogs out there; this was not the morning routine; and because … why? It took a lot of coaxing, tugging on her leash—to the point of pulling her collar up around her ears, which she hates—and encouraging noises on my part, until she relented, wagged her tail, and came along willingly.

Clearly, something in her perception of the situation had changed. She either overcame her fear of whatever dogs were outside, or she decided that I—her buddy, walk coordinator, and the fellow holding the other end of the leash—knew best about the decision to go back outside, or she remembered the last time we went out that way because the elevator buttons didn’t work—although she could hardly understand the concepts of “button” and “work.” Whatever, she had a new thought and, after her initial stubbornness, came with me happily.

I’ve been watching the development of agile robots on four legs—dog surrogates—with extensible necks and a jaws-like pincer that can open doors and carry objects. I read recently that some of the Boston Dynamics robots have been assigned to patrol parks and warn any humans they see violating social distancing. That’s pretty sophisticated activity. And the four “paws” of the robot dog are better at moving cross-country than wheels would be. That got me to wondering what it would be like to have an artificially intelligent dog instead of my Sally. It would probably be a demon at defined tasks like patrolling the complex perimeter, herding sheep—if I had any—or fetching my slippers. It would bark less and obey more. And it would never, ever leave a puddle in the bay window. That, and it would never need to be taken for a walk on a leash several times a day in order to do that business, either.

But a robot dog would still be a machine: purpose built, goal oriented, and just a complex of embedded responses. It might be programmed to wag its tail and lick my hand. It might even be taught to use human speech and say endearing things. And it might—if you kind of unfocused your eyes and willingly suspended your disbelief—serve as a therapeutic presence when I was sad or depressed, and be programmed to detect these states and act accordingly. But it would not be “intelligent” in the way a dog is, with its own quirky personality, its own mind about why it will or won’t go through a door, its own reasons for barking its head off at a noise upstairs, its own silly, toothy grin and hind-leg dancing when I come home, and its own need to put its chin on my knee after we climb into bed. It wouldn’t shed—but it wouldn’t be warm, either.

I’m no expert on artificial intelligence, although I can fake it in science fiction. But right now, AI expert systems are perfectly acceptable as automated drivers, button sorters, pattern recognizers, data analysts, and yes, four-legged park rangers. We value them in these roles because they function reliably, give predictable answers and known responses, and they never balk at going through a door because … why? If they did have that endearing quirkiness—the tendency to give inexplicable, self-centered, unexpected responses to unusual situations—and occasionally left a puddle of oil on the floor, we would value them less.

However sophisticated their programming, robot dogs would not be our companions. And much as we liked them, we would not love them.

Sunday, August 9, 2020

That Civil War Meme

War devastation

Almost everyone who is paying attention will agree that the political situation in this country between the progressive Left and the conservative Right is becoming desperate. Families and friendships are being sundered over political differences. The differences represented are existential and encompass radically opposed views of what this country stands for and where it is or should be going. There is no middle ground upon which members from opposite sides of the question can build a workable compromise. The stakes have become all or nothing.

The last time this happened, between the views of the Northern abolitionists and the Southern slaveholders and states’ rights advocates in the 1850s, the only conceivable result was dissolution, secession from the Union, an attempt at a parallel slaveholding government in the South, and ultimately a war to bring the seceding states back into the Union. When the issues are existential and are believed to encompass the survival of one side or the other, when there is no middle ground or possible compromise, then breakup and/or civil war becomes the only answer—terrible as that may be.

Some would say that the “cold civil war” over political and cultural differences—which has been going on in this country for the last dozen or so years and perhaps started as far back as the 1960s—has already grown hot. In the past month, we’ve seen what are supposed to be “peaceful protests” in various cities (Minneapolis, Seattle, Portland) meld into violent riots and attacks on both city and federal properties both there and in other cities (Richmond, Austin, Oakland) in a spreading conflagration. Now the Department of Homeland Security, a recent addition to the federal government based on earlier terrorist activity, is supposedly fielding agents to protect federal buildings and round up the people attacking them. To me, this looks like insurrection. This looks like the earliest stages of an armed conflict.

The supposed “Second Civil War” that is being shouted in various novels, blogs, and memes right now—including some of mine—is not going to look like the first Civil War of 1861-65. Here is why.

First, the nature of war and the weapons used to fight it have changed drastically in the last 160 years. The Union and Confederate armies were composed of foot soldiers who marched in relatively tight formations and fired muzzle-loading muskets, supported by muzzle-loading cannon and men on horseback scouting ahead of the marching armies. The fastest means of communication was the telegraph wire, usually strung along railroad rights of way. But armies in the field away from the rail lines had to rely on a man riding a horse and carrying a handwritten message. The armies themselves could only meet on suitable ground, a defined battlefield, and usually tried to outflank an opponent to reach their own objective, or ambush an opponent to keep him from reaching his objective. This was all two-dimensional and—except in punitive expeditions like Sherman’s March to the Sea—paid little attention to strategic operations against civilian objectives.

As we’ve watched the progress of war from marching brigade lines to the immobilized trenches of World War I, through the mobile armies of World War II and Korea, to the Air Cavalry in Vietnam, and finally the village and urbanized insurrections of Afghanistan and Iraq—all with the background of a nuclear exchange in the offing—we know that a modern war on the continental United States will not be anything like the first Civil War. What would Lincoln and Grant not have done if they had helicopters and F-16s, let alone the threat of atomic weapons? A civil war today would probably not even be about taking and holding territorial objectives, especially if the war was not preceded by states seceding from the Union. It might be more like Vietnam, Afghanistan, and Iraq, all about winning “hearts and minds” and punishing insurrection. It might be neighbor against neighbor, with the frontlines drawn between cities and suburbs, neighborhood against neighborhood, like the Spanish Civil War of the 1930s.

Second, the looming civil war might well not be one of secession and recapture. Between the first Civil War and today, the nature of our governments at both the state and federal level has also changed. During the 1860s, the state governments were relatively strong, and the federal government was relatively weak. The federal government, at least at the start of the war, was small and funded mostly by customs duties, tariffs, excise taxes, and some direct taxes. There was no national income tax—but neither were there immense federal programs and outlays for Social Security, Medicare, and Medicaid; transportation projects associated with the Interstate Highway System, along with control and regulation of rail and air travel; educational standards and directives, backed up by grants and benefits; environmental projects and regulations; financial audits and controls, including the Federal Reserve and its management of the economy and the money supply; and the thousand other things we depend on the federal government to provide today.

Whether the current “Red States” in the central part of the nation secede from a Union dominated by the “Blue States” along the two coasts and the upper Midwest, or vice versa, one group is going to be left with all those federal programs, along with the Federal Reserve and responsibility for all those Treasury bonds and the federal debt. Maybe everyone in the part of the country that secedes will be comfortable with giving up their Social Security and Medicare contributions and future benefits, all that highway and education money, and everything else we’ve come to rely on the federal government to supply. Maybe forgoing their share of the looming federal debt would be compensation enough. But rewriting those funding and social service obligations under a newly conceived and authored Constitution and code of laws would be a gamble for most people. Some—especially those with much to lose under a new interpretation of the tax code—might think twice about giving up the devil they know for the one that has yet to be born.

And then there are the pesky details of what would become international transactions. For one side or the other, the companies and networks we all expect to function smoothly—the internet and its cloud computing resources; the U.S. Postal Service and delivery services like FedEx and UPS; distribution networks like Amazon and eBay; communications services like AT&T and Verizon; farming, food processing, and distribution companies that keep the rest of us supplied with flour and bread, vegetables, chicken, beef, and Hostess Twinkies; the electric power pools and their system exchanges; control and security of interstate air travel and railroads; oil and gas transmission pipelines, to name a few—will all be tossed into a cocked hat and distributed variously between two different countries. Some of these functions will continue smoothly under international agreements. Others will become fragmented, prizes to be pulled apart in the interest of benefiting one party while hurting the other.

Any way you look at it, our country—the whole United States—has become far more interconnected and centrally governed, less regional and local, less independent, than it was 160 years ago. A breakup into Red and Blue, if that is even the correct dividing line anymore, would be far more difficult to pull off, and even more difficult to operate in two halves—especially if the Blue halves were physically separated by a big Red chunk in the middle, with borders, tariffs, and travel restrictions going both ways—than the country that divided in 1861.

All of this is food for careful thought before we embrace the Civil War Meme and start picking sides.