Sunday, September 27, 2020

Monopoly Power

French marketplace

The Emperor Caligula was quoted by Suetonius as saying, “Would that the Roman people had but one neck!” Apparently so that he could hack through it more easily. Everyone wants to have control of their situation, and on the easiest possible terms.

In the business world, this tendency is represented by monopoly, where for the sake of simplicity, economy, efficiency, or some other perceived value there is only one producer or supplier of a particular category of goods or services, and by monopsony, where for the same set of reasons there is only one buyer. Think of the Defense Department and its need for complementary weapon systems, as opposed to individual purchases by each branch of the service, or by each military unit and base. Or the current drift toward single-payer medical coverage and its promise of cost reductions through the government’s negotiating power and volume purchases.

Monopolies have always enjoyed state support. The English crown, up until the 17th century, regularly granted royal favorites the monopoly trade in certain products, such as sweet wines in the Elizabethan period. And the British East India Company was granted exclusive trade rights in lands bordering the Indian Ocean. Americans did not generally favor monopolies until the widespread distribution of electricity in the late 19th and early 20th centuries, when it became inconvenient to have several power companies stringing wires up and down both sides of the street to reach their customers. It then became necessary to grant regulated monopolies to electricity and gas providers to systematize their distribution.

Generally though, big players do better in any market. If a company making anything, from cars to soft drinks, reaches the position of first, second, or third in the marketplace, it will want to crush its competition and take all the customers.1 And the government likes a marketplace dominated by big players: they are easier to deal with, regulate, and tax.2 Certainly, government regulation tends to work against a field of small players, who do not have the legal and regulatory affairs departments or the budgets to lobby government, respond to regulations, and engage in defensive lawsuits.

While our government has officially been “antitrust” since the days of the Robber Barons and the interlocking directorates of various companies controlling, for example, the markets in coal and steel, government has turned a blind eye to amalgamation and unification in the labor market. There, different unions have banded together into effective monopolies on the labor supply for factory workers, service employees, and truck drivers. Again, big players do better in the market. They swing more weight. As individual union members join together in a giant, amalgamated union, they can speak with one voice. They can get more things done to their liking. They can have their way. And it’s actually a form of democracy—at least for the members of the union.

And where unions don’t exist, or have been withering for decades under our huge economic expansion, they soon may make a comeback as government increases its reach into the economy. For example, the current push for single-payer medical plans, or some version of Medicare for All, would make it easier for the nurses’ union to negotiate a favorable pay rate with a single government entity, rather than with a handful of large hospital corporations or thousands of local hospitals and clinics. And a government monopsony on health care would push the rest of the medical profession—doctors’ associations and collections of other health care specialists—into some form of consolidated negotiation or full unionization. It would also further the amalgamation of hospitals into larger corporations and combinations.

But while bigger may be better for the dominant players in the marketplace, the trend towards monopoly and monopsony isn’t necessarily good for the market itself.

First, when one product or system dominates, it tends to limit invention and technological progress. Success tends to make people conservative. Yes, monopoly players worry about some competitor coming along and beating them at their own game, but then their urge is to buy up, buy out, and shut down that competitor, or simply crush it by temporarily lowering prices on their own products. If AT&T (“Ma Bell”) had retained its monopoly on long-distance telephony and its ownership of the various local telephone companies (“Baby Bells”), its own manufacturing arm with Western Electric, and its research facilities with Bell Telephone Laboratories (“Bell Labs”), how soon do you think cellular phones, which are not dependent on wires at all and are instead a radio product, would have become available? The phone company would have crushed any radio product that needed to touch its phone system and landlines—except, possibly, for automotive radiophones, which would have been expensive and limited to very special users.

Second, in a monopoly situation, or under the conditions forced by a monopsony, employment choices are more limited. If you were a telecommunications technician or inventor in the Ma Bell era, you could either work for AT&T or find some other career. And if you disagreed with the company’s directives, choices, and planning, you could either speak out and find your career truncated, or you could keep your head down, rise in the organization, and hope to one day influence those decisions. Jumping ship to join a competitor or starting your own company with a better idea just wasn’t in the cards. The same goes for employees at NASA or your regulated local utility company.

Third, monopolies and monopsonies are almost always bad for the average person, the individual buyer, the customer, the person at the ultimate end of the supply chain. Where one organization has purchasing and pricing power over the market, the little guy accepts what he gets and pays what is asked. Not everyone wanted a Model T in “any color so long as it was black.” Not everyone wants a single choice of deodorant or sneaker. Not everyone wants the government deciding who will get a CT scan and when, because someone far up the food chain made a nationwide decision about how many CT scanners to buy for each county. People might appreciate efficiency, simplicity, economy, or some other overriding value in the abstract. But not everyone prefers white bread over pumpernickel, plain whisky over flavored vodkas, or the deodorant with a sailing ship on the label over any other brand. People like choices, making their own decisions, and deciding how and where to spend their money.

Fourth, and finally, monopolies and monopsonies almost never last. Sooner or later, the entrenched position becomes so cautiously conservative, so calcified, and so behind the times that a clever inventor can find a work-around: a new and disruptive product, a new marketplace, or a new champion. That’s happening all over the place these days, in the automotive world (hybrids, Tesla), in telecommunications (cell phones), in computers (laptops and tablets), in medicine (genetic analysis, personalized medicine), and in space exploration (SpaceX, Blue Origin). Big players become vulnerable unless they can also become nimble—not just crushing the competition but learning to dance with it.

Caligula’s desire for Rome to have just one neck, to make it easier for him to put his foot on and eventually to hack through, was the cry of every tyrant. But for anyone, even for a Roman emperor, life just isn’t that easy.

1. Unless, of course, the competition is good for the top players. Think of Coca-Cola and Pepsi-Cola, each of which benefited by fostering brand loyalty against the other. Or the “Big Three” auto makers, who sold more cars by competing with the other guys on styling, horsepower, or some other popular enhancement, thus churning the annual sales of new cars.

2. If you doubt this, remember the senator who complained about the inefficiency of a market that offered Americans a variety of products: “You don't necessarily need a choice of 23 underarm spray deodorants or of 18 different pairs of sneakers.” It’s much easier to manage an economy with fewer choices and a monopoly player making all the decisions for the folks doing the buying.

Sunday, September 20, 2020

The Truth

Total honesty

I have always believed in the truth: that one should try to understand it, to know and speak it whenever possible, and to accept it, even if the implications and consequences work against one’s own prior assumptions, beliefs, advantages, and one’s personal situation. I would rather know and follow the truth than be happy and whole in the shadow of ignorance or a lie.

It was this basic adherence to the concept of truth that kept me from following my grandfather’s career path—although he was a great believer in truth, too—into the law, which everyone in the family thought would be my future, because I was so verbal as a child. But as I grew older, I realized that a lawyer deals mainly in argument, precedent, and the intricacies of the law as a giant logical puzzle weighing rights and advantages. I knew or suspected that a lawyer must sometimes decline to know or search for the truth—the facts of what actually happened, which he or she is required to bring into court, if known—while working toward an argument or an interpretation of the known facts that will best serve the client’s purpose. Because that practice puts some gain above the human obligation to know and speak the truth, I knew the law was something I dared not dabble in.

So I studied English literature and became a devotee of storytelling. Fiction, a made-up tale about made-up people, is not necessarily a violation of the truth. It is not exactly telling lies. An author telling a story is like a blacksmith forging an iron blade. The smith hammers away the surface scale, and with it the impurities that cloud the pure metal underneath. And so the author hammers away the alternate interpretations and contradictions of a life situation in order to reveal a pure fact, or sequence of events, or understanding of the human condition that the author recognizes as true.

When I write about “the truth” here, I am not referring to biblical truth, or revealed truth, or a studied construct made of equal parts belief and hope. I am talking about a summation of observations, of experienced cause and effect, of facts that have been seen and where possible tested and annotated, of things we know to apply in the real world. It’s an elusive thing, this truth, but something that I believe can be observed by one person, formulated into a statement or story, communicated to another person, and received by that second person as something that is apparently if not obviously real and congruent with known facts.

It is therefore an article of faith with me as a fiction storyteller and a nonfiction communicator that language can have adequate if not exact meanings, in terms of the denotation and connotation of words. That one person can share an idea, a realization, a piece of truth with another person through verbal means and not be completely misunderstood. Some misunderstanding may take place. Sometimes one person does not have the same meaning—denotation or connotation—for a word that the original speaker does. Sometimes the recipient of the thought has different ideas or beliefs that get in the way of examining the story or statement and perceiving it in the same way that the original formulator meant or intended. Accidents of language and intention do happen. But, on the whole, between people of fair mind and unbiased perception, communication of the truth is possible.

It is also an article of faith with me that truth exists outside of our personal, subjective perceptions. That is, truth is an object waiting to be discovered, analyzed, and discussed. It is not merely a personal belief that is particular to each person and changes with his or her perceptions based on personal needs and desires. Two people can point to the results of a scientific experiment, or an art form or artifact that was carved, painted, written, or created by a third person, or to an event common to their experience, and reach agreement as to its nature, purpose, and quality.1

Of course, I am not a fool. I do not believe that every truth can be discovered and stated. I understand that some things are the product of chance or probability and so can fall out one way or another. I understand the quantum physicist’s dilemma when dealing with very small, intimate systems, that the act of observing a particle in flight—usually by bouncing a photon or some other particle off it—changes the direction of flight immediately. So the physicist can know where a particle was but not where it is now or where it’s going.

And I do understand that humans, their perceptions and interpretations, and the things they hold to be important are constantly changing: that we do not live in the same world of values and feelings that was inhabited by the ancient Greeks and Romans, the Medieval or Renaissance Europeans, the ancient or modern Chinese, or the Australian Aborigines. Humans are exciting and varied creatures, constantly evolving and reacting to the products of their own minds, and this is not a cause for concern. But I hold it as a postulate that, given good will and a common language, they learn from each other, share ideas, and can arrive at an objective truth about any particular situation or experience. However, I also grant that this level of understanding may require one or more human lifetimes and leave little room for other explorations and understandings. That’s why we have books and can read.

At the same time, I do not believe that human nature changes very much. What people believe and hold to be real may be influenced by great thinkers, prophets, and teachers. Otherwise, the Islamic world would not have taken such a turn away from the Judeo-Christian tradition that was in part its heritage. But people still have basic needs, basic perceptions of fairness and reciprocity, and a basic sense of both the limitations and the possibilities of the human condition. Until we become incorporeal creatures of energy or immortal cyborg constructs, issues of life and death, family and responsibility, need and want, will be for each of us what they were in the time of our hunter-gatherer ancestors.

And yet there are also things we cannot know about each other. I cannot know what is really going on inside your head. Even if I know you well enough to trust your nature and sense your honesty, even if you use words well enough to express your deepest feelings accurately, there are still secrets people keep to themselves and never tell even their nearest and dearest. There are still secrets people hide away from their own conscious minds, secrets that, like the movements of great fish in deep water, can only be discerned by the effects they have on their lives, their loves, and their mistakes and missed opportunities.

That is a lot of unknowing and things unknowable for someone who believes in the truth. But as I said, to be completely knowledgeable would take a library of great books and several lifetimes to read them. All any of us can do is try to start.

1. When I was studying English literature, back in the mid to late 1960s, we were taught what was then called the New Criticism. This was the belief that the work of a writer or poet—or a painter, sculptor, or musician—stood on its own and could safely be analyzed by any person with sense, feeling, and a knowledge of the language. This displaced the author’s own expertise with the object. The author’s claims about “what I meant” or “what I intended” might be interesting but did not define the work. Sometimes an author intends one thing but manages—through accidents of carelessness or vagaries of the subconscious—to achieve something else and sometimes something greater than intended.
    This is opposed to the literary criticism called “Deconstruction,” which has been taught more recently in English departments but is something I never studied. Deconstruction apparently teaches—at least as I understand it—that words, their usage, and their underlying reality are fluid rather than fixed. That they are so dependent on a particular time, place, and culture that trying to understand the author’s intended meaning, from the viewpoint of another time or place, is practically impossible. And therefore it is useless to discuss “great books” and their enduring value. That nothing is happening in any universally objective now, and everything is subject to current reinterpretation. This is, of course, anathema to me. It is a denial of any kind of perceivable truth.

Sunday, September 13, 2020

End of Days, or Not

Red-sky dystopia

This past week has been weird and depressing. A growing number of fires in California cast a pall of smoke into the atmosphere over the northern part of the state, like a high fog but with a gritty perspective in the middle distance and bits of ash floating silently down, so that your car’s hood and fenders are speckled white. There’s a cold, red-orange darkness at noon, like the fume out of Mordor, or like life on a planet under a red-dwarf star. You’ve seen the pictures on friends’ Facebook pages—not quite as apocalyptic as my stock photo here, but still disturbing. It’s like—and I think this is a quote from either J. R. R. Tolkien or J. K. Rowling—you can’t ever feel cheerful again.

On top of that, Monday was the Labor Day holiday. So, for those of us who are retired and only loosely connected to the working world’s rhythms, that day felt like a second Sunday. Then Tuesday was like a Monday, Wednesday like Tuesday, and what the hell is Thursday supposed to be? Glad as we are for a holiday, it throws off the pace of the week and makes everything feel subtly weird.

And then there are the overlying, or underlying, or background burdens of 2020. The pandemic drags on and on, so that we are isolated from family, friends, and coworkers, except through the synthetic closeness of a computer screen. We wear masks in public, so that we are all strangers to each other, even to the people that we know and would normally smile at. We avoid people on the sidewalk and in elevators, maintain a shopping cart’s distance at the grocery store, and feel guilty about touching a piece of fruit in the bin and then putting it back when we find a suspicious bruise. This illness, unlike any other, is not so much a matter of concern about personal safety as a national and social pall that has descended on everyday life.

Because of the closures, our robust, consumer-oriented economy has tanked, and we don’t know when it will come back. The stock market has revived from its swoon in the spring, apparently rising on shreds of pandemic optimism. But anyone who follows the market knows that these weekly swings of 500, 1,000, and 1,500 points on the Dow alone, with comparable lurches in the other indexes, just can’t be healthy. It’s like the entire investor class is cycling from mania to depression, too. Meanwhile, we all know people who have been laid off and are scrambling. We all have a favorite restaurant that is eking by with takeout service or a favorite shop that has closed, apparently for good. We all miss going downtown or to the mall for some “shopping therapy”—not that we need to buy anything, but we look forward to what the richness of this country and its commercial imagination might have to offer us. Buying dish soap, toilet paper, face masks, and other necessities off the Amazon.com website just isn’t as satisfying.

And then there’s the politics. The divisions in this country between Left and Right—emblematic of but, strangely, not identical with the two major parties—have grown so deep and bitter that friendships are ended and family relationships are strained. The political persuasion opposite to your long-held point of view has become the other, the enemy, and death to them! We are slouching, sliding, shoved inexorably into an election that has the two sides talking past each other, not debating any real points of policy but sending feverish messages to their own adherents. And whichever way the national polling falls out—with the complication of counting votes in the Electoral College from the “battleground states”—the result promises to bring more bitterness, more rioting, more political maneuvering, and perhaps even secession and civil war.1 There’s a deep feeling in the nation that this election will solve nothing.

The one ray of hope in all of this is that things change. This is not a variation on the proverbial “This too shall pass.” Of course it will pass, but that does not mean things will get back to the pre-fire, pre-pandemic, pre-boom, pre-strife normal. This is not the “new normal,” either. There never was, never is, any kind of “normal.” There is only the current configuration, life as we have it, and what the circumstances will bring. But this is also not the “end of days.”

Every fire eventually burns out. The rains come, the ground soaks up their moisture, and the stubbornest embers are extinguished for another year. We may have other and worse fires—or possibly better drought conditions—next year, but this year’s firestorm will eventually be over. Yes, the ground is burned, homes are lost, and a number of lives and livelihoods are upended. But the ground is also cleared for new growth, and the way is clear for people to start over. As someone who sits on a forty-year pile of accumulated possessions and closets full of just “stuff,” I sometimes think a good fire is easier to handle than a clearing operation where I would have to weigh and consider every piece of bric-à-brac against future need or desire.2 Sometimes you just have to let events dictate what happens in your life.

Every plague eventually fades away. The virus or bacteria mutates into a harmless nuisance, our immune systems adapt to handle it, or medical science comes up with a vaccine, and a devastating disease disappears from our collective consciousness. Yes, we have death and disability in its wake. But death and disability come to us all, if not from Covid-19 then from the annual influenza, or a cancer, or accident, or other natural and unnatural causes. For those who survive, our lives and our attitudes become more resilient, more grounded, more able to take life’s hard blows. That which does not kill me makes me stronger—until it or something worse finally kills me. And this is the way of life on this planet. The essence of the human condition is that we have the self-knowledge, foresight, and insight to understand this, where for every other animal on Earth, life’s stresses are pure misery and death is the ultimate surprise.

Every economic downturn paves the way for growth. At least, that is the cycle in countries that enjoy free-market capitalism. “Creative destruction,” the watchword of economist Joseph Schumpeter, captures the vitality of markets that are able to respond to current conditions and meet the needs and demands of people who are making their own decisions. In my view, this is preferable to one person or group, or a committee of technical experts, trying to guide the economy and in the process preserving industries, companies, and financial arrangements that have outlived their usefulness but provide some kind of national, political, social, or emotional stability that this group values above letting the mass of people make their own decisions.

Every political crisis passes. Issues get resolved, the emotions die down again, and life goes on in uneasy balance. The new stability may not reflect the goals and values that you were prepared to fight for, actually fought for, or maybe even died for. But the resolution is usually a compromise that most people can live with … unless the end of the crisis is a terminal crash, a revolution, a civil war, and a crushing loss that results in a majority—or worse, a virtual minority—beating the other side’s head in and engendering animosities and unhealed wounds that fester for generations and destroy everyone’s equanimity. Sometimes the best we can hope for is an uneasy, unsatisfying compromise that will hold until the next round of inspirations and aspirations takes control of the public psyche.

There never was a normal, just the temporary equilibrium that kept most people happy, a few people bitter, and many people striving to make things better. There never is an “end of days,” because history has no direction and no ultimate or logical stopping place—at least, not until the human race dies out and is replaced by the Kingdom of Mollusks, if we’re lucky, or the Reign of the Terror Lizards, if we’re not.

1. But see my take on that possible conflict in That Civil War Meme from August 9, 2020.

2. I recently completed such an operation with a forty-square-foot storage locker that I was renting, and the exercise took three months and was exhausting. You stare at a book you once thought you would read, or a jacket you once wore as a favorite, and have to decide its ultimate fate. Sooner or later, you just have to let go and throw this stuff away.

Sunday, September 6, 2020

Counterclockwise

World turned upside down

The other day I was reading in Astronomy magazine one of my favorite features, “Ask Astro.” There readers pose questions about the universe and astronomy in general, and experts are called in to answer them. This one asked why the Sun orbits our galaxy in a clockwise direction, while the planets orbit the Sun in a counterclockwise direction.1 And that got me thinking about the arbitrary nature of directions and much else in our daily lives.

After all, the concepts of “clockwise” and “counterclockwise” didn’t come into use until people started telling time with geared mechanisms instead of the angle of the sun, sand running through an hourglass, bells rung in a church tower, or candles burning down past inscribed markings. Clocks with gears were invented and reinvented using different driving forces—water, pendulums, springs—in ancient Greece in the 3rd century BC, in China and Arabia in the 10th and 11th centuries, respectively, and in Europe in the 14th century. The fact that most round clock faces count time by moving the hands from left to right—clockwise—is based on the usage of early sundials. These instruments track the Sun rising in the east, and therefore casting the shadow from the sundial’s gnomon in the west. Then the Sun moves to the south at midday, casting its shadow to the north. And finally, the Sun sets in the west, casting the shadow in the east. All of this, of course, is predicated upon the person observing the sundial facing north as a preferred direction. This daily sweep of the shadow from west, or left, to east, or right, was so familiar that early clockmakers copied the movement.

Of course, all of these cultures that used sundials and invented mechanical clocks were spawned north of the equator and only later spread them to the cultures and European colonies established south of the equator in southern Africa, South America, and Australia. If those areas had been the home of a scientific, technically innovative, colonizing, and marauding culture, and if the peoples of the Eurasian continent had been inveterate stay-at-homes, then things would have been different.

Clockmakers originating in South Africa, Tierra del Fuego, or Australia might have faced in their preferred direction—south, toward the stormy seas and distant ice floes of Antarctica. And then they would have erected their sundials and drawn their clock faces based on the Sun rising at their left hands and casting a shadow to the west, moving to the north behind them and putting the shadow in front of their faces to the south, and finally setting at their right hands in the west and casting a shadow to the east. Their clock hands would have run in the direction we call “counterclockwise,” and the rest of the world would have followed suit. It all depends on your point of view, which is based on accidents of geography, demography, and historic migrations.
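For the technically minded, here is a minimal back-of-the-envelope sketch in Python of the same geometry. It assumes an equinox day (solar declination of zero) and observers at an illustrative 40 degrees north and south latitude, neither of which comes from the column I was reading. It simply tracks the azimuth of the gnomon’s shadow through the day: the angle climbs for the northern observer (a clockwise sweep) and falls for the southern one (counterclockwise).

import math

def sun_azimuth(latitude_deg, hour_angle_deg):
    """Sun's azimuth in degrees, measured clockwise from north, on an equinox day."""
    lat = math.radians(latitude_deg)
    ha = math.radians(hour_angle_deg)
    # With solar declination = 0: sin(altitude) = cos(latitude) * cos(hour angle)
    alt = math.asin(math.cos(lat) * math.cos(ha))
    # Standard azimuth formula, reduced for declination = 0
    cos_az = -math.sin(alt) * math.sin(lat) / (math.cos(alt) * math.cos(lat))
    az = math.degrees(math.acos(max(-1.0, min(1.0, cos_az))))
    # A negative hour angle (morning) gives an eastern azimuth; mirror it for afternoon
    return az if hour_angle_deg <= 0 else 360.0 - az

def shadow_azimuth(latitude_deg, hour_angle_deg):
    """The gnomon's shadow points directly away from the Sun."""
    return (sun_azimuth(latitude_deg, hour_angle_deg) + 180.0) % 360.0

# Hour angles: -60 degrees is 8 a.m. solar time, 0 is noon, +60 is 4 p.m.
for label, lat in (("40 degrees north", 40.0), ("40 degrees south", -40.0)):
    sweep = [round(shadow_azimuth(lat, h)) for h in (-60, 0, 60)]
    print(label, "shadow azimuth at 8 a.m., noon, 4 p.m.:", sweep)

# Roughly: 40 degrees north -> [290, 0, 70], west to north to east, a clockwise sweep;
#          40 degrees south -> [250, 180, 110], west to south to east, counterclockwise.

The exact numbers don’t matter; the point is only that the same Sun, read from opposite sides of the equator, traces opposite-handed arcs.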

What else might have been different based on these historic accidents?

Certainly, our book texts and traffic signs reflect differing points of view. We in the European-based and -influenced world read texts from left to right, pretty much the same as the movement of our clock hands. But this was not universal. If we had kept the alphabets and scripts of the ancient Hebrews and Arabs, writing from right to left and orienting our books from what we would consider the back cover to the front, then our literary world would be different and we would stack our library shelves in a different order. But we don’t, because we follow the Latin practice of the Roman culture that dominated the western, and eventually the eastern, end of the Mediterranean Sea and surrounding lands.

The earliest writing forms were different yet again. The Egyptians wrote in both rows and columns, depending on whichever was more convenient, and indicated the direction in which the symbols were to be read by the way that the animal signs—birds, snakes, and so on—faced at the top or side of the text. And anyway, hieroglyphs were for the priestly and aristocratic classes, intended to preserve the thoughts of important people for their future generations, and not for just anyone to read and understand. Early cuneiform writing from Mesopotamia ran from top to bottom and right to left, although scribes later shifted to writing from left to right. Chinese, Japanese, and other Asian scripts are generally flexible: written left to right in horizontal rows, or top to bottom in columns, with the columns usually read from right to left—although sometimes also left to right.

Ancient Greek writing was, for a time, the most practical of all, because texts were written and read from left to right on the first line, right to left on the second, back to left to right on the third, and so on. This was economical because the reader lost not even a fraction of a second of brain time between the eyes finishing one row of letters at the right end and tracking back to the left side of the page to start anew. This form of writing was called “boustrophedon,” or literally “as the ox plows.” Like most things Greek, it was eminently sensible—but it never caught on elsewhere.

And then, as to the shape of our books themselves, consider that what we think of as a “book” is really an invention of medieval monks with their manuscript codices,2 followed by Gutenberg and his printing press in 15th-century Europe. Because Gutenberg was printing single broad sheets, folding them into pages, stacking them, and sewing the stacks together in a continuous, linear format, we have the modern book. Gutenberg probably inherited the idea of printing itself from Chinese books of pasted pages that were developed in the Song Dynasty around the 11th century.3

Before that, the Romans, Greeks, Hebrews, and just about everyone else wrote on scrolls. These were rolled up and packed into cubbies on library shelves, identified for the searcher by clay or wax tags attached to the tube ends. I have often thought that the order of the books we read in the Old and New Testament is rather arbitrary—except for Genesis, of course—and originally was based on whatever scroll you happened to pick up next. Someone must have written out a “cheat sheet” somewhere to direct you to some kind of chronological order after Genesis and throughout the New Testament. But things became easier when the pages were put in neatly linear order in a single sewn book.

A lot of the world we inhabit today—from clock faces, to the way we write, to which side of the road we drive on, to the shape of our keyboards—is pretty much a matter of geography, demography, and perspective. And the solutions we live with are not always the most convenient and sensible.

1. Short answer: The planets formed out of a cloud of dust and gas that started to spin in a particular direction—counterclockwise, when viewed from “above,” or from the Sun’s “north pole”—as it collapsed. But that gas cloud was already moving in another particular direction—clockwise, when viewed from “above” or “north” of the galactic plane. The opposite motions are more or less separate, arbitrary, and related to your point of view.

2. Codices (plural of codex) were handwritten single pages that were grouped together and bound between two wooden boards, as opposed to the rolled scrolls used in earlier times.

3. Printing itself was presumably invented by the Chinese about four hundred years before the Song Dynasty, when the entire page was carved from a single block of wood. Of course, this was just an advanced form of the rolling seals and stamps that had been in use for thousands of years. Carving a single page made sense when individual Chinese ideograms numbered about 50,000—too many to sort and select as single pieces of type. However, by the Song Dynasty the Chinese printers were doing just that. Gutenberg, with only twenty-six characters to choose from, plus capitals and punctuation marks, had an easier time with movable type.