Sunday, October 18, 2020

Too Many Superheroes

Superhero

It’s no secret that our movies, television, and to some extent our popular fiction are inundated with superheroes.1 The main characters, or the essential focus of the story, are people with some physical or mental enhancement: super strength, x-ray vision, the ability to fly, increased lifespan, or genius-level perception. And I would include here people who are otherwise separated from the human race by exceptional circumstances: vampires, witches, fallen angels, and the victims of medical experimentation.

These movies, television shows—series, I guess you call them now, with extended story arcs—and books are aimed at the young adult, the middling young, and the young at heart. The trouble is that, in my view, they tend to arrest the normal human development from child to functioning adult.

Life’s problems, which all of us must deal with, cannot be solved by punching through walls, seeing through doors, outsmarting your enemies with a genius IQ, or becoming immortal. A functioning adult has to use the skills and knowledge developed through hard work, proper choices, and good use of time in order to gain confidence, capability, and self-esteem. These things cannot be granted by birth on another planet, a medical advance, or a fortuitous afterlife. There are no shortcuts to growing up.

One of my favorite science-fiction series is Frank Herbert’s Dune books, telling the fantastic far-future history of the accomplished Atreides family. The series actually climaxes in the fourth book, God Emperor of Dune. The main character there is Leto II, who is the ultimate superhero: emperor of the known universe, served and protected by fiercely loyal people, commanding a superb fighting force, as well as being virtually immortal, physically invulnerable, able to predict the future, and able to access the living memory of every one of his ancestors and so the entire history and example of all humanity. And yet, in Herbert’s brilliant style, he is brought down by two skilled but not super-powered human beings who resist being his slaves. The book is really the anti-superhero story.

To be an adult is to possess hard-won knowledge, to develop skills that cannot be acquired magically or through a pill or genetic manipulation, to have endured experiences that are both constructive and destructive and enable you to know and understand the difference, and to become adept at foreseeing and dealing with the consequences of your actions. All of this must be learned. It must be acquired by having hopes and dreams, working toward them, and sometimes—maybe often—seeing them dashed. It is acquired through working through your problems, paying attention to what happens and when, remembering those consequences, and formulating rules of living both for yourself and your children, if you have any. This is the process that every child, every young adult, and every post-adolescent goes through. If you are lucky to survive, you keep learning and updating your internal database through adulthood and into middle and old age. Perfecting who you are should never stop until you draw your last breath.

And that is the final lesson. To be an adult includes the sober knowledge and acceptance of the fact that you, personally, in your own self, will one day die.2 This is not a cause for grief, fear, rage, or despair. Humans die, animals and plants die, bacteria and fungi can be destroyed, cell lines come to an end. Even rocks and whole mountains wear away to dust and silt, then break down into their component atoms, and rejoin the cycle of life on this planet. In my view, this is the key understanding of the human condition. We are not immortal. We have no lasting power over death, only good fortune and small victories. We only have the strength of our bodies, the power of our intelligence, and the focus of our wills. That is all we human beings can command.

When you know that you will eventually die, then you know how to value your life, your time, and your effort here on Earth. To be willing to sacrifice your life for something you believe is greater than yourself, you have to know how to value your remaining time. This is a rational decision that our brains were designed to make—if they are not clouded by the veil of hope that we, in our own bodies, just might be immortal. That hope protects us when we are young and stupid and have little experience of death. It is a foolish thing to carry into adulthood and middle age, when we are supposed to know the truth and act accordingly.

Oh, and in addition to what we can command and accomplish as individuals, we can also work together, pooling our achievements and our knowledge over time. We can raise vast cathedrals, each person adding his own carved stone or piece of colored glass. We can build a body of scientific knowledge by researching and writing down our findings in a discipline that we share with others. We can join a company—in the oldest sense of that word, whether an economic enterprise, a body of troops, or a group of travelers—to attempt and achieve more than a single human can do. And if we cannot do any of these things directly, then we can support the efforts of others by mixing mortar for their cathedral, serving as an archivist of their scientific endeavors, or becoming the financier, accountant, or quartermaster to that company in whatever form it takes.

Any of these tasks shared with other humans requires a knowledge of self and your limitations, a willingness to hold your own dreams and desires in check and subvert them to the common will, and to take and give orders for the good of the common effort. And this is another aspect of becoming an adult: to put aside the me-me-me of childhood and adopt the us of a collaborative group.

Superheroes, in fiction and on the screen, leap over these everyday problems and concerns. If they experience disappointment and existential angst at all, it is usually focused inward, on their supposed powers and their failure when they meet a foe who exhibits a greater power. But it’s all a conception of, and played out in the mind of, the graphic artist, the writer, or the film director: the presumed power, the challenges, and the intended result. And, curiously enough, the superhero always manages to win in the end. That is the way of fiction.

Real life involves dashed expectations, failed attempts, physical and mental limits, rejection by loved ones, and sometimes rejection by society itself. It is what a person does with these situations, using only the strength and wits, skills and knowledge, that he or she has acquired through conscientious development, that marks a successful human being. And ultimately the extinction of body and mind comes for us all. If you’re not dealing soberly with these things—and superheroes don’t—then you remain a species of child.

Those developing-adult stories, dealing with growth and change, are really the ones worth telling.

1. In fact, about fifteen years ago, when I was still trying to find an agent for my science-fiction writing, one potential candidate asked, “Who is your superhero?” That was the literary mindset: the main character had to have extraordinary powers for any book that could hope to be optioned for a movie—and back then selling a million copies and making it to the big screen had become the sole purpose of publishing. Maybe it still is, for all I know. But Covid-19 and the closing of the theaters might change all that.

2. I believe I first read this in a Heinlein story—perhaps Stranger in a Strange Land, although I can’t find the reference—that the difference between a child and an adult is the personal acceptance of death. To that, one of the characters in the conversation replies, “Then I know some pretty tall children.”

Sunday, October 11, 2020

Modeling Nature

Mandelbrot fractal

A saying favored by military strategists—although coined by Polish-American scientist and philosopher Alfred Korzybski—holds that “the map is not the territory.”1 This is a reminder that maps are made by human beings, who always interpret what they see. Like the reports of spies and postcards from vacationing tourists, the observer tends to emphasize some things and neglect or ignore others. Human bias is always a consideration.

And with maps there is the special consideration of timing. While the work of a surveyor, depending on major geographic features like mountain peaks and other benchmarks that tend to stand for thousands of years, may be reliable within a human lifespan, mapmakers are taking a snapshot in time. From one year to the next, a road may become blocked, a bridge collapse, a river change course, or a forest burn—all changing the terrain and its application to a forced march or a battle. If you doubt this, try using a decades-old gas station map to plan your next trip.

This understanding should apply doubly these days to the current penchant for computer modeling in climatology, environmental biology, and political polling. Too often, models are accepted as new data and as an accurate representation—and more often a prediction, which is worse—of a real-world situation. Unless the modeler is presenting or verifying actual new data, the model is simply manipulating existing data sources, which may themselves be subject to interpretation and verification.

But that is not the whole problem. Any computer model, unless it becomes fiendishly complex, exists by selecting certain facts and trends over others and by making or highlighting certain assumptions while downplaying or discarding others. Model making, like drawing lines for topological contours, roads, and rivers on a map, is a matter of selection for the sake of simplicity. The only way to model the real world with complete accuracy would be to understand the situation and motion of every component, the direction and strength of every force, and the interaction and result of every encounter. The computer doesn’t exist that can do this on a worldwide scale for anything so complex and variable as weather systems; predator/prey relationships and species variation and mutation; or political preferences among a diverse population of voters and non-voters.

Computer modeling, these days—and especially in relation to climate change and its effects, or concerning political outcomes—is an effort of prediction. The goal is not so much to describe what is going on now but to foretell what will happen in the future, sometimes by a certain date in November, sometimes by the beginning of the next century. Predicting the future is an age-old dream of mankind, especially when you can be the one to know what will happen while those around you have to grope forward blindly in the dark. Think of oracles spoken only for the powerful or the practice of reading tea leaves and Tarot cards for a paying patron.

But complex systems, as history has shown, sometimes revolve around trivial and ephemeral incidents. A single volcanic eruption can change the weather over an entire hemisphere for one or several years. A surprise event in October can change or sour the views of swing voters and so affect the course of an election. The loss of a horseshoe nail can decide the fate of a king, a dynasty, and a country’s history. Small effects can have great consequences, and none of them can be predicted or modeled accurately.
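The sensitivity to small effects described above can be made concrete with a toy model. The logistic map, a standard textbook example of chaos (my illustration here, not anything from the essay), shows how two starting conditions differing by one part in a billion become completely uncorrelated within a few dozen iterations:

```python
# The logistic map x -> r * x * (1 - x) is fully chaotic at r = 4.0:
# on average, any error in the starting value doubles with each step,
# so a one-part-in-a-billion difference swamps the forecast in ~30 steps.

def trajectory(x0, r=4.0, steps=50):
    """Iterate the logistic map `steps` times starting from x0."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = trajectory(0.4)           # the "true" starting condition
b = trajectory(0.4 + 1e-9)    # the same value, mismeasured by a billionth

early_gap = abs(a[1] - b[1])                       # still negligible
worst_gap = max(abs(u - v) for u, v in zip(a, b))  # grows to order one
print(f"gap after 1 step: {early_gap:.1e}; worst over 50 steps: {worst_gap:.2f}")
```

No model of such a system, however detailed, can outrun this arithmetic: past a certain horizon, the output reflects the modeler’s starting assumptions more than the world being modeled.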

When climate scientists first published the results of their models showing an average global temperature rise of about two degrees Celsius by the year 2100, the counterclaims were that they focused on carbon dioxide, a weak greenhouse gas; that the models required this gas to produce a “forcing,” or positive feedback loop, that would put more water vapor—a more potent greenhouse gas—into the atmosphere; and that the models did not consider negative feedback loops that would reduce the amount of carbon dioxide or water vapor over time. The climate scientists, as I remember, replied that their models were proprietary and could not be made public, for fear they would be copied or altered. But this defense also rendered them and their work free from inspection. Also, as I remember, no one has since attempted to measure the increase, if any, in global water vapor—not just measured in cloud cover, but also by the vapor loading or average humidity in the atmosphere as a whole—since the debate started. And you don’t hear much anymore about either the models themselves or the water vapor, just the supposed effects of the predicted warming that is supposed to be happening years ahead of its time.2

Add models that, for whatever reason, cannot be evaluated and verified to the general trend of results from scientific studies that cannot be reproduced according to the methodology and equipment cited in the published paper. Irreproducibility of results is a growing problem in the scientific world, according to the editorials I read in magazines like Science and Nature. If claims cannot be verified by people with the best will and good intentions, that does not make the originally published scientist either a liar or a villain. And there is always a bit of “noise”—static you can’t distinguish or interpret that interferes with the basic signal—in any system as vast and complex as the modern scientific enterprise taking place in academia, public and private laboratories, and industrial research facilities. Still, the issue of irreproducibility is troubling.

And, for me, it is even more troubling that reliance on computer models and projections is now accepted as basic research and scientific verification of a researcher’s hypothesis about what’s going on. At least with Tarot cards, we can examine the symbols and draw our own conclusions.

1. To which Korzybski added, “the word is not the thing”—a warning not to confuse models of reality with reality itself.

2. We also have a measured warming over the past decade or so, with peaks that supposedly exceed all previous records. But then, many of those records have since been adjusted—not only the current statement of past temperatures but also the raw data, rendering the actual record unrecoverable—to reflect changing conditions such as relocations of monitoring stations at airports and the urban “heat island” effects from asphalt parking lots and dark rooftops.
    As a personal anecdote, I remember a trip we made to Phoenix back in October 2012. I was standing in the parking lot of our hotel, next to the outlet for the building’s air-conditioning system. The recorded temperature in the city that day was something over 110 degrees, but the air coming out of that huge vent was a lot hotter, more like the blast from an oven. It occurred to me that a city like Phoenix attempts to lower the temperature of almost every living and commercial space under cover by twenty or thirty degrees, which means that most of the acreage in town is spewing the same extremely hot air into the atmosphere. And I wondered how much that added load must increase the ambient temperature in the city itself.

Sunday, October 4, 2020

Clever Words

Dissected man

Our politics is—and, I guess, has always been—susceptible to clever word combinations, puns, and rhymes that appear to tidily sum up a grievance, intended consequence, or course of action. For most of us, they are mere curiosities. But in my view they are treacherous if taken as a philosophy or a substitute for rational thought.

I’m sure there were chants and slogans that caught on during the American War of Independence, probably something to do with Indians and the tea shipments arriving in Boston Harbor. The slogan that comes readily to mind is from slightly later, the dispute with Canada in the mid-19th century about the Oregon border: “Fifty-four Forty or Fight,” relative to the latitude line that would define the hoped-for demarcation. I suppose it was just fortuitous that the map offered the preponderance of all those F’s and the opportunity for a stirring alliteration. If the border had been along the twentieth or thirtieth parallel, I guess the proponents would have had to come up with something else.

And then there is the modern-day all-purpose chant: “Hey-hey! Ho-ho! Fill in the Blank has got to go!” This one is particularly useful when a group of organizers want to stir up and direct a crowd. It’s got a rhythm that gets your arms and legs moving almost like a dance or a march step.1

To me, one of the worst substitutes for rational thought also comes from the 19th century, although a bit later. It is attributed to the journalist Finley Peter Dunne and his fictitious alter ego Mr. Dooley. In its shortened form it says: “The job of the newspaper is to comfort the afflicted and afflict the comfortable.” This formula, clever in its reversal—almost a chiasmus—of verbs and objects, has been taken up by generations of progressives ever since. For some, it’s an exquisite summation of how they should heal social ills.

But this combination is, of course, nonsense. Clever, but still nonsense. It depends on a false equivalency: that the sufferings of the afflicted—the poor, the weak, the disabled, the denied and discriminated against—are directly attributable to the smug satisfactions of the people not so burdened. It presumes that those who have worked, saved, invested, and planned for the future of both themselves and their families—all of those middle-class virtues—have created conditions of poverty and injustice for those not so fortunate. And this is not so. Those who have taken up the virtues have simply removed themselves from the class of the destitute and the desperate, not caused their condition.

By all means, one should “comfort the afflicted.” Heal their hurts where it is possible. Work to change their current situation and their opportunities where you can.2 But at best, “afflicting the comfortable” serves only to remind them that an underclass exists in their society and that one should spend some portion of one’s day, one’s mind, and one’s charity—if not just one’s taxes—on alleviating the situation. “Afflicting the comfortable” is meant as fighting words, suggesting that by reducing their comforts a society can somehow magically improve the lot of the afflicted. And that magical thinking is just pure Marxism: been tried; didn’t work.

Another set of fighting words, intended to stir up the complacent and draw them into a social battle, is the various formulas meant to fight social apathy: “If you’re not part of the solution, you’re part of the problem,”3 and more recently “Silence is violence.” Again, the unspoken purpose of the chant is a false equivalency: that those who are not actively joining the fight—on the side of, and under the terms of, the sloganeers—are causing the wrong, are in fact wrongdoers themselves.

These clever slogans are meant to give the great mass of people no choice. Join us or die—or worse, gain our everlasting contempt. They raise the issue in contention to the level of an existential crisis, a civilizational catastrophe, or a cause for civil war. However, for some of us, for many of us, perhaps for most of us in the middle of the political spectrum, who are spending our days doing all of that working, saving, investing, and planning for our own futures, in order not to be counted on the public rolls, the issue is not existential or catastrophic and does not merit a civil war. Yes, perhaps, the issue may demand our notice and concern. We might even add the deserving recipients to our list of charities or our list of considerations in the voting booth. But many of us, most of us, know that there’s nothing we can personally do about a lot of these social problems. We are not prepared to climb on the barricades, bare our breasts, and offer “our lives, our fortunes, and our sacred honor”4 to the project.

And no amount of clever words and scornful chants is likely to change that reality.

1. And in terms of serving multiple purposes, there is also: “No justice, no peace!” Simply pick your object of “justice,” and fill in your action for withholding “peace.”

2. But you have to be realistic about this approach. You can work to improve other people’s conditions sometimes, but that should not include a free ride or a lifetime’s residency on the dole. A taut safety net, not a soft and cushy safety hammock. Human beings are designed by a hundred thousand years of heredity to have personal goals and to seek satisfaction and self-worth through attaining them. No one—not children, not the mentally or physically disabled, nor the socially or economically disadvantaged—benefits from having their personal agency removed by a benevolent parent’s or government’s lifting and carrying them through all the vicissitudes of life.

3. Speaking of clever, I have always favored the chemist’s version: “If you’re not part of the solution, you’re part of the precipitate.” In other words, if you don’t join in this fight, you’ll be part of the fallout. Chuckle, smirk.

4. To quote from the last line of the Declaration of Independence, which for the signers did involve an existential crisis and, right quickly, a civil war.

Sunday, September 27, 2020

Monopoly Power

French marketplace

The Emperor Caligula was quoted by Suetonius as saying, “Would that the Roman people had but one neck!” Apparently so that he could hack through it more easily. Everyone wants to have control of their situation, and on the easiest possible terms.

In the business world, this tendency is represented by monopoly, where for the sake of simplicity, economy, efficiency, or some other perceived value there is only one producer or supplier of a particular category of goods or services, and by monopsony, where for the same set of reasons there is only one buyer. Think of the Defense Department and its need for complementary weapon systems, as opposed to individual purchases by each branch of the service, or by each military unit and base. Or the current drift toward single-payer medical coverage and its promise of cost reductions through the government’s negotiating power and volume purchases.

Monopolies have always enjoyed state support. The English crown, up until the 17th century, regularly granted royal favorites the monopoly trade in certain products, such as sweet wines in the Elizabethan period. And the British East India Company was granted exclusive trade rights in lands bordering the Indian Ocean. Americans did not generally favor monopolies until the widespread distribution of electricity in the late 19th and early 20th centuries, when it became inconvenient to have several power companies stringing wires up and down both sides of the street to reach their customers. It then became necessary to grant regulated monopolies to electricity and gas providers to systematize their distribution.

Generally though, big players do better in any market. If a company making anything, from cars to soft drinks, reaches the position of first, second, or third in the marketplace, it will want to crush its competition and take all the customers.1 And the government likes a marketplace dominated by big players: they are easier to deal with, regulate, and tax.2 Certainly, government regulation tends to work against a field of small players, who do not have the legal and regulatory affairs departments or the budgets to lobby government, respond to regulations, and engage in defensive lawsuits.

While our government has officially been “antitrust” since the days of the Robber Barons and the interlocking directorates of various companies controlling, for example, the markets in coal and steel, it has turned a blind eye to amalgamation and unification in the labor market. There, different unions have banded together into effective monopolies on the labor supply for factory workers, service employees, and truck drivers. Again, big players do better in the market. They swing more weight. When individual union members join together in a giant, amalgamated union, they can speak with one voice. They can get more things done to their liking. They can have their way. And it’s actually a form of democracy—at least for the members of the union.

And where unions don’t exist, or have been withering for decades under our huge economic expansion, they soon may make a comeback as government increases its reach into the economy. For example, the current push for single-payer medical plans, or some version of Medicare for All, would make it easier for the nurses’ union to negotiate a favorable pay rate with a single government entity, rather than with a handful of large hospital corporations or thousands of local hospitals and clinics. And a government monopsony on health care would push the rest of the medical profession—doctors’ associations and collections of other health care specialists—into some form of consolidated negotiation or full unionization. It would also further the amalgamation of hospitals into larger corporations and combinations.

But while bigger may be better for the dominant players in the marketplace, the trend towards monopoly and monopsony isn’t necessarily good for the market itself.

First, when one product or system dominates, it tends to limit invention and technological progress. Success tends to make people conservative. Yes, monopoly players worry about some competitor coming along and beating them at their own game, but then their urge is to buy up, buy out, and shut down that competitor, or simply crush it by temporarily lowering prices on their own products. If AT&T (“Ma Bell”) had retained its monopoly on long-distance telephony and its ownership of the various local telephone companies (“Baby Bells”), its own manufacturing arm with Western Electric, and its research facilities with Bell Telephone Laboratories (“Bell Labs”), how soon do you think cellular phones, which are not dependent on wires at all and are instead a radio product, would have become available? The phone company would have crushed any radio product that needed to touch its phone system and landlines—except, possibly, for automotive radiophones, which would have been expensive and limited to very special users.

Second, in a monopoly situation, or under the conditions forced by a monopsony, employment choices are more limited. If you were a telecommunications technician or inventor in the Ma Bell era, you could either work for AT&T or find some other career. And if you disagreed with the company’s directives, choices, and planning, you could either speak out and find your career truncated, or you could keep your head down, rise in the organization, and hope to one day influence those decisions. Jumping ship to join a competitor or starting your own company with a better idea just wasn’t in the cards. The same goes for employees at NASA or your regulated local utility company.

Third, monopolies and monopsonies are almost always bad for the average person, the individual buyer, the customer, the person at the ultimate end of the supply chain. Where one organization has purchasing and pricing power over the market, the little guy accepts what he gets and pays what is asked. Not everyone wanted a Model T in “any color so long as it was black.” Not everyone wants a single choice of deodorant or sneaker. Not everyone wants the government deciding who will get a CT scan and when, because someone far up the food chain made a nationwide decision about how many CT scanners to buy for each county. People might appreciate efficiency, simplicity, economy, or some other overriding value in the abstract. But not everyone prefers white bread over pumpernickel, plain whisky over flavored vodkas, or the deodorant with a sailing ship on the label over any other brand. People like choices, making their own decisions, and deciding how and where to spend their money.

Fourth, and finally, monopolies and monopsonies almost never last. Sooner or later, the entrenched position becomes so cautiously conservative, so calcified, and so behind the times that a clever inventor can find a work-around: a new and disruptive product, a new marketplace, or a new champion. That’s happening all over the place these days, in the automotive world (hybrids, Tesla), in telecommunications (cell phones), in computers (laptops and tablets), in medicine (genetic analysis, personalized medicine), and in space exploration (SpaceX, Blue Origin). Big players become vulnerable unless they can also become nimble—not just crushing the competition but learning to dance with it.

Caligula’s desire for Rome to have just one neck, to make it easier for him to put his foot on and eventually to hack through, was the cry of every tyrant. But for anyone, even for a Roman emperor, life just isn’t that easy.

1. Unless, of course, the competition is good for the top players. Think of Coca-Cola and Pepsi-Cola, both of whom benefited by fostering their brand loyalty over the other competitor. Or the “Big Three” auto makers, who sold more cars by competing with the other guys on styling, horsepower, or some other popular enhancement, thus churning the annual sales of new cars.

2. If you doubt this, remember the senator who complained about the inefficiency of a market that offered Americans a variety of products: “You don't necessarily need a choice of 23 underarm spray deodorants or of 18 different pairs of sneakers.” It’s much easier to manage an economy with fewer choices and a monopoly player making all the decisions for the folks doing the buying.

Sunday, September 20, 2020

The Truth

Total honesty

I have always believed in the truth: that one should try to understand it, to know and speak it whenever possible, and to accept it, even if the implications and consequences work against one’s own prior assumptions, beliefs, advantages, and one’s personal situation. I would rather know and follow the truth than be happy and whole in the shadow of ignorance or a lie.

It was this basic adherence to the concept of truth that kept me from following my grandfather’s career path—although he was a great believer in truth, too—into the law, which everyone in the family thought would be my future, because I was so verbal as a child. But as I grew older, I realized that a lawyer deals mainly in argument, precedent, and the intricacies of the law as a giant logical puzzle weighing rights and advantages. I knew or suspected that a lawyer must sometimes decline to know or search for the truth—the facts of what actually happened, which he or she is required to bring into court, if known—while working toward an argument or an interpretation of the known facts that will best serve the client’s purpose. Because that puts some gain above the human obligation to know and speak the truth, I knew the law was something in which I dared not dabble.

So I studied English literature and became a devotee of storytelling. Fiction, a made-up tale about made-up people, is not necessarily a violation of the truth. It is not exactly telling lies. An author telling a story is like a blacksmith forging an iron blade. The smith hammers away the surface scale, and with it the impurities that cloud the pure metal underneath. And so the author hammers away the alternate interpretations and contradictions of a life situation in order to reveal a pure fact, or sequence of events, or understanding of the human condition that the author recognizes as true.

When I write about “the truth” here, I am not referring to biblical truth, or revealed truth, or a studied construct made of equal parts belief and hope. I am talking about a summation of observations, of experienced cause and effect, of facts that have been seen and where possible tested and annotated, of things we know to apply in the real world. It’s an elusive thing, this truth, but something that I believe can be observed by one person, formulated into a statement or story, communicated to another person, and received by that second person as something that is apparently if not obviously real and congruent with known facts.

It is therefore an article of faith with me as a fiction storyteller and a nonfiction communicator that language can have adequate if not exact meanings, in terms of the denotation and connotation of words. That one person can share an idea, a realization, a piece of truth with another person through verbal means and not be completely misunderstood. Some misunderstanding may take place. Sometimes one person does not have the same meaning—denotation or connotation—for a word that the original speaker does. Sometimes the recipient of the thought has different ideas or beliefs that get in the way of examining the story or statement and perceiving it in the same way that the original formulator meant or intended. Accidents of language and intention do happen. But, on the whole, between people of fair mind and unbiased perception, communication of the truth is possible.

It is also an article of faith with me that truth exists outside of our personal, subjective perceptions. That is, truth is an object waiting to be discovered, analyzed, and discussed. It is not merely a personal belief that is particular to each person and changes with his or her perceptions based on personal needs and desires. Two people can point to the results of a scientific experiment, or an art form or artifact that was carved, painted, written, or created by a third person, or to an event common to their experience, and reach agreement as to its nature, purpose, and quality.1

Of course, I am not a fool. I do not believe that every truth can be discovered and stated. I understand that some things are the product of chance or probability and so can fall out one way or another. I understand the quantum physicist’s dilemma when dealing with very small, intimate systems, that the act of observing a particle in flight—usually by bouncing a photon or some other particle off it—changes the direction of flight immediately. So the physicist can know where a particle was but not where it is now or where it’s going.

And I do understand that humans, their perceptions and interpretations, and the things they hold to be important are constantly changing: that we do not live in the same world of values and feelings that was inhabited by the ancient Greeks and Romans, the Medieval or Renaissance Europeans, the ancient or modern Chinese, or the Australian Aborigines. Humans are exciting and varied creatures, constantly evolving and reacting to the products of their own minds, and this is not a cause for concern. But I hold it as a postulate that, given good will and a common language, they learn from each other, share ideas, and can arrive at an objective truth about any particular situation or experience. However, I also understand that this level of understanding may require one or more human lifetimes and leave little room for other explorations and understandings. That’s why we have books and can read.

At the same time, I do not believe that human nature changes very much. What people believe and hold to be real may be influenced by great thinkers, prophets, and teachers. Otherwise, the Islamic world would not have taken such a turn away from the Judeo-Christian tradition that was in part its heritage. But people still have basic needs, basic perceptions of fairness and reciprocity, and a basic sense of both the limitations and the possibilities of the human condition. Until we become incorporeal creatures of energy or immortal cyborg constructs, issues of life and death, family and responsibility, need and want, will be for each of us what they were in the time of our hunter-gatherer ancestors.

And yet there are also things we cannot know about each other. I cannot know what is really going on inside your head. Even if I know you well enough to trust your nature and sense your honesty, even if you use words well enough to express your deepest feelings accurately, there are still secrets people keep to themselves and never tell even their nearest and dearest. There are still secrets that people hide away from their own conscious mind and that, like the movements of great fish in the deep waters, can only be discerned by the effects these deep secrets have on their lives, their loves, and their mistakes and missed opportunities.

That is a lot of unknowing and things unknowable for someone who believes in the truth. But as I said, to be completely knowledgeable would take a library of great books and several lifetimes to read them. All any of us can do is try to start.

1. When I was studying English literature, back in the mid to late 1960s, we were taught what was then called the New Criticism. This was the belief that the work of a writer or poet—or a painter, sculptor, or musician—stood on its own and could safely be analyzed by any person with sense, feeling, and a knowledge of the language. This displaced the author’s own authority over the work. The author’s claims about “what I meant” or “what I intended” might be interesting but did not define the work. Sometimes an author intends one thing but manages—through accidents of carelessness or vagaries of the subconscious—to achieve something else and sometimes something greater than intended.
    This is opposed to the literary criticism called “Deconstruction,” which has been taught more recently in English departments but is something I never studied. Deconstruction apparently teaches—at least as I understand it—that words, their usage, and their underlying reality are fluid rather than fixed. That they are so dependent on a particular time, place, and culture that trying to understand the author’s intended meaning, from the viewpoint of another time or place, is practically impossible. And therefore it is useless to discuss “great books” and their enduring value. That nothing is happening in any universally objective now, and everything is subject to current reinterpretation. This is, of course, anathema to me. It is a denial of any kind of perceivable truth.

Sunday, September 13, 2020

End of Days, or Not

Red-sky dystopia

This past week has been weird and depressing. A growing number of fires in California cast a pall of smoke into the atmosphere over the northern part of the state, like a high fog but with a gritty haze in the middle distance and bits of ash floating silently down, so that your car’s hood and fenders are speckled white. There’s a cold, red-orange darkness at noon, like the fume out of Mordor, or like life on a planet under a red-dwarf star. You’ve seen the pictures on friends’ Facebook pages—not quite as apocalyptic as my stock photo here, but still disturbing. It’s like—and I think this is a quote from either J. R. R. Tolkien or J. K. Rowling—you can’t ever feel cheerful again.

On top of that, Monday was the Labor Day holiday. So, for those of us who are retired and only loosely connected to the working world’s rhythms, that day felt like a second Sunday. Then Tuesday was like a Monday, Wednesday like Tuesday, and what the hell is Thursday supposed to be? Glad as we are for a holiday, it throws off the pace of the week and makes everything feel subtly weird.

And then there are the overlying, or underlying, or background burdens of 2020. The pandemic drags on and on, so that we are isolated from family, friends, and coworkers, except through the synthetic closeness of a computer screen. We wear masks in public, so that we are all strangers to each other, even to the people that we know and would normally smile at. We avoid people on the sidewalk and in elevators, maintain a shopping cart’s distance at the grocery store, and feel guilty about touching a piece of fruit in the bin and then putting it back when we find a suspicious bruise. This illness, unlike any other, is not so much a matter of concern about personal safety as a national and social pall that has descended on everyday life.

Because of the closures, our robust, consumer-oriented economy has tanked, and we don’t know when it will come back. The stock market has revived from its swoon in the spring, apparently rising on shreds of pandemic optimism. But anyone who follows the market knows that these weekly swings of 500, 1,000, and 1,500 points on the Dow alone, with comparable lurches in the other indexes, just can’t be healthy. It’s like the entire investor class is cycling from mania to depression, too. Meanwhile, we all know people who have been laid off and are scrambling. We all have a favorite restaurant that is eking out a living with takeout service or a favorite shop that has closed, apparently for good. We all miss going downtown or to the mall for some “shopping therapy”—not that we need to buy anything, but we look forward to what the richness of this country and its commercial imagination might have to offer us. Buying dish soap, toilet paper, face masks, and other necessities off the Amazon.com website just isn’t as satisfying.

And then there’s the politics. The divisions in this country between Left and Right—emblematic of but, strangely, not identical with the two major parties—have grown so deep and bitter that friendships are ended and family relationships are strained. The political persuasion opposite to your long-held point of view has become the other, the enemy, and death to them! We are slouching, sliding, being shoved inexorably into an election that has the two sides talking past each other, not debating any real points of policy but sending feverish messages to their own adherents. And whichever way the national polling falls out—with the complication of counting votes in the Electoral College from the “battleground states”—the result promises to bring more bitterness, more rioting, more political maneuvering, and perhaps even secession and civil war.1 There’s a deep feeling in the nation that this election will solve nothing.

The one ray of hope in all of this is that things change. This is not a variation on the biblical “This too shall pass.” Of course it will pass, but that does not mean things will get back to the pre-fire, pre-pandemic, pre-boom, pre-strife normal. This is not the “new normal,” either. There never was, never is, any kind of “normal.” There is only the current configuration, life as we have it, and what the circumstances will bring. But this is also not the “end of days.”

Every fire eventually burns out. The rains come, the ground soaks up their moisture, and the stubbornest embers are extinguished for another year. We may have other and worse fires—or possibly better drought conditions—next year, but this year’s firestorm will eventually be over. Yes, the ground is burned, homes are lost, and a number of lives and livelihoods are upended. But the ground is also cleared for new growth, and the way is clear for people to start over. As someone who sits on a forty-year pile of accumulated possessions and closets full of just “stuff,” I sometimes think a good fire is easier to handle than a clearing operation where I would have to weigh and consider every piece of bric-à-brac against future need or desire.2 Sometimes you just have to let events dictate what happens in your life.

Every plague eventually fades away. The virus or bacterium mutates into a harmless nuisance, our immune systems adapt to handle it, or medical science comes up with a vaccine, and a devastating disease disappears from our collective consciousness. Yes, we have death and disability in its wake. But death and disability come to us all, if not from Covid-19 then from the annual influenza, or a cancer, or accident, or other natural and unnatural causes. For those who survive, our lives and our attitudes become more resilient, more grounded, more able to take life’s hard blows. That which does not kill me makes me stronger—until it or something worse finally kills me. And this is the way of life on this planet. The essence of the human condition is that we have the self-knowledge, foresight, and insight to understand this, where for every other animal on Earth, life’s stresses are pure misery and death is the ultimate surprise.

Every economic downturn paves the way for growth. At least, that is the cycle in countries that enjoy free-market capitalism. “Creative destruction,” the watchword of economist Joseph Schumpeter, captures the vitality of markets that are able to respond to current conditions and meet the needs and demands of people who are making their own decisions. In my view, this is preferable to one person or group, or a committee of technical experts, trying to guide the economy and in the process preserving industries, companies, and financial arrangements that have outlived their usefulness but provide some kind of national, political, social, or emotional stability that this group values above letting the mass of people make their own decisions.

Every political crisis passes. Issues get resolved, the emotions die down again, and life goes on in uneasy balance. The new stability may not reflect the goals and values that you were prepared to fight for, actually fought for, or maybe even died for. But the resolution is usually a compromise that most people can live with … unless the end of the crisis is a terminal crash, a revolution, a civil war, and a crushing loss that results in a majority—or worse, a virtual minority—beating the other side’s head in and engendering animosities and unhealed wounds that fester for generations and destroy everyone’s equanimity. Sometimes the best we can hope for is an uneasy, unsatisfying compromise that will hold until the next round of inspirations and aspirations takes control of the public psyche.

There never was a normal, just the temporary equilibrium that kept most people happy, a few people bitter, and many people striving to make things better. There never is an “end of days,” because history has no direction and no ultimate or logical stopping place—at least, not until the human race dies out and is replaced by the Kingdom of Mollusks, if we’re lucky, or the Reign of the Terror Lizards, if we’re not.

1. But see my take on that possible conflict in That Civil War Meme from August 9, 2020.

2. I recently completed such an operation with a forty-square-foot storage locker that I was renting, and the exercise took three months and was exhausting. You stare at a book you once thought you would read, or a jacket you once wore as a favorite, and have to decide its ultimate fate. Sooner or later, you just have to let go and throw this stuff away.

Sunday, September 6, 2020

Counterclockwise

World turned upside down

The other day I was reading in Astronomy magazine one of my favorite features, “Ask Astro.” There readers pose questions about the universe and astronomy in general, and experts are called in to answer them. This one asked why the Sun orbits our galaxy in a clockwise direction, while the planets orbit the Sun in a counterclockwise direction.1 And that got me thinking about the arbitrary nature of directions and much else in our daily lives.

After all, the conceptions of “clockwise” and “counterclockwise” didn’t come into use until people started telling time with geared mechanisms instead of the angle of the sun, sand running through an hourglass, bells rung in a church tower, or candles burning down past inscribed markings. Clocks with gears were invented and reinvented using different driving forces—water, pendulums, springs—in ancient Greece in the 3rd century BC, in China and Arabia in the 10th and 11th centuries, respectively, and in Europe in the 14th century. The fact that most round clock faces count time by moving the hands from left to right—clockwise—is based on the usage of early sundials. These instruments track the Sun rising in the east, and therefore casting the shadow from the sundial’s gnomon in the west. Then the Sun moves to the south at midday, casting its shadow to the north. And finally, the Sun sets in the west, casting the shadow in the east. All of this, of course, is predicated upon the person observing the sundial facing north as a preferred direction. This daily rotation from west, or left, to east, or right, was so familiar that early clockmakers copied this movement.

Of course, all of the cultures that used sundials and invented mechanical clocks were spawned north of the equator and only lately spread their timekeeping to cultures and European colonies established south of the equator in southern Africa, South America, and Australia. If those areas had been the home of a scientific, technically innovative, colonizing, and marauding culture, and if the peoples of the Eurasian continent had been inveterate stay-at-homes, then things would have been different.

Clockmakers originating in South Africa, Tierra del Fuego, or Australia might have faced in their preferred direction—south, toward the stormy seas and distant ice floes of Antarctica. And then they would have erected their sundials and drawn their clock faces based on the Sun rising at their left hands and casting a shadow to the west, moving to the north behind them and putting the shadow in front of their faces to the south, and finally setting at their right hands in the west and casting a shadow to the east. Their clock hands would have run in the direction we call “counterclockwise,” and the rest of the world would have followed suit. It all depends on your point of view, which is based on accidents of geography, demography, and historic migrations.

What else might have been different based on these historic accidents?

Certainly, our book texts and traffic signs reflect differing points of view. We in the European-based and -influenced world read texts from left to right, pretty much the same as the movement of our clock hands. But this was not universal. If we had kept the alphabets and scripts of the ancient Hebrews and Arabs, writing from right to left, and orienting their books from what we would consider the back cover to the front, then our literary world would be different and we would stack our library shelves in a different order. But we don’t, because we follow the Latin practice of the Roman culture that dominated the western, and eventually the eastern, end of the Mediterranean Sea and surrounding lands.

The earliest writing forms were different yet again. The Egyptians wrote in both rows and columns, depending on whichever was more convenient, and indicated the direction in which the symbols were to be read by the way that the animal signs—birds, snakes, and so on—faced at the top or side of the text. And anyway, hieroglyphs were for the priestly and aristocratic classes, intended to preserve the thoughts of important people for their future generations, and not for just anyone to read and understand. Early cuneiform writing from Mesopotamia ran from top to bottom and right to left, although the direction later shifted to left to right. Chinese, Japanese, and other Asian scripts are generally flexible, written left to right when in horizontal rows, or top to bottom in columns, with those columns then mostly read from right to left—although sometimes also left to right.

Ancient Greek was the most practical of all, because texts were written and read from left to right on the first line, right to left on the second, back to left to right on the third, and so on. This was economical because it eliminated even the fraction-of-a-second lag in brain time between the eyes finishing one row of letters at the right end and then tracking back to the left side of the page to start anew. This form of writing was called “boustrophedon,” literally “as the ox plows.” Like most things Greek, it was eminently sensible—but it never caught on elsewhere.
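That zig-zag reading order is simple enough to sketch in code. Here is a small illustration of my own (not from any historical source) that wraps a string to a fixed width and mirrors every other line; real boustrophedon inscriptions also mirrored the individual letter shapes, which a plain string reversal only approximates:

```python
# Sketch of boustrophedon ("as the ox plows") text layout:
# wrap words to a column width, then reverse every second line
# so the eye never has to track back across the page.
def boustrophedon(text, width):
    """Wrap `text` to `width` columns, mirroring every second line."""
    lines, current = [], ""
    for word in text.split():
        candidate = (current + " " + word).strip()
        if len(candidate) <= width:
            current = candidate
        else:
            lines.append(current)
            current = word
    if current:
        lines.append(current)
    # Even-numbered lines read left to right; odd lines are mirrored.
    return "\n".join(line if i % 2 == 0 else line[::-1]
                     for i, line in enumerate(lines))

print(boustrophedon("the quick brown fox jumps over the lazy dog", 15))
```

Reading the output aloud, you sweep your eyes left to right, drop down, then right to left, just as the ox turns at the end of each furrow.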

And then, as to the shape of our books themselves, consider that what we think of as a “book” is really an invention of late Roman and medieval scribes with their manuscript codices,2 followed by Gutenberg and his printing press in Europe of the 15th century. Because Gutenberg was printing single broad sheets, folding them into pages, stacking them, and sewing the stacks together in a continuous, linear format, we have the modern book. Gutenberg probably inherited the idea of printing itself from Chinese books of pasted pages that were developed in the Song Dynasty around the 11th century.3

Before that, the Romans, Greeks, Hebrews, and just about everyone else wrote on scrolls. These were rolled up and packed into cubbies on library shelves, identified for the searcher by clay or wax tags attached to the tube ends. I have often thought that the order of the books we read in the Old and New Testament is rather arbitrary—except for Genesis, of course—and originally was based on whatever scroll you happened to pick up next. Someone must have written out a “cheat sheet” somewhere to direct you to some kind of chronological order after Genesis and throughout the New Testament. But things became easier when the pages were put in neatly linear order in a single sewn book.

A lot of the world we inhabit today—from clock faces, to the way we write, to which side of the road we drive on, to the shape of our keyboards—is pretty much a matter of geography, demography, and perspective. And the solutions we live with are not always the most convenient and sensible.

1. Short answer: The planets formed out of a cloud of dust and gas that started to spin in a particular direction—counterclockwise, when viewed from “above,” or from the Sun’s “north pole”—as it collapsed. But that gas cloud was already moving in another particular direction—clockwise, when viewed from “above” or “north” of the galactic plane. The opposite motions are more or less separate, arbitrary, and related to your point of view.

2. Codices (plural of codex) were handwritten single pages that were grouped together and bound between two wooden boards, as opposed to the rolled scrolls used in earlier times.

3. Printing was presumably invented by the Chinese about four hundred years earlier, where the entire page was carved from a single block of wood. Of course, this was just an advanced form of the rolling seals and stamps that had been in use for thousands of years. Carving a single page made sense when individual Chinese ideograms numbered about 50,000—too many to sort and select as single pieces of type. However, by the Song Dynasty the Chinese printers were doing just that. Gutenberg, with only twenty-six characters to choose from, plus capitals and punctuation marks, had an easier time with movable type.