Sunday, December 27, 2020

BC and BCE

Christmas star

The Jewish calendar dates from the biblical date of the world’s—or rather, the universe’s—creation, according to the twelfth-century (AD, in the Western reckoning) Jewish philosopher Maimonides. That’s a pretty important event. For the religious observances of Jews everywhere, the year is now 5781 AM (for Anno Mundi).1

The Islamic calendar dates from the Hijra, the year that Mohammed and his followers left Mecca for Medina to establish their first true community, or ummah. For them, it is now 1442 AH (for Anno Hegirae).

The ancient Romans counted their years from the legendary founding of their city, approximately in the year we would call 753 BC. If we still counted the Roman way, it would now be 2773 AUC (for Ab Urbe Condita). Similarly, the Greeks of approximately the same period took their dating from the first Panhellenic Games held at Olympia in what we would call 776 BC, presumably the first time that all of Greece came together as a people. In that system, it would now be 2796—although the Greeks counted loosely in units of four, the span of years between each set of games, going back to the most recent games or forward to the next set. And the events themselves were suppressed in 394 AD, when the Roman Emperor Theodosius imposed the newly established Christian calendar on the Greeks.
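The conversions here are simple addition. A quick sketch (the helper names are my own, and it glosses over the missing year zero, which happens not to disturb the round numbers above):

```python
def auc(ce_year: int) -> int:
    """Ab Urbe Condita: years counted from Rome's legendary founding in 753 BC."""
    return ce_year + 753

def olympiad_era(ce_year: int) -> int:
    """Years counted from the first Panhellenic Games of 776 BC."""
    return ce_year + 776

print(auc(2020))           # 2773, as above
print(olympiad_era(2020))  # 2796
```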

The Chinese calendar was established in the fourteenth century (BC, in Western reckoning) and traced back to the Emperor Huangdi, who is supposed to have worked out the cycle of years in 2637 BC based on astronomical observations of the Sun and Moon. It doesn’t really matter what year it is in China today based on this calendar, other than for religious and festival purposes. The modern Chinese follow the Western calendar.

Western Civilization took the starting date of its calendar from the birth of Christ, a big event for those who had recently converted from their pagan rites to the new religion. And it is now 2020 pretty much around the world, based on the spread of Western trade and business. We used to call this year Anno Domini (from “Year of the Lord”), and any year before the start of the Christian calendar was then “Before Christ.”

Somewhere along the line, probably about fifty years ago, scholars started calling the current years CE (for “Common Era”), because we all use this Western reckoning anyway, and the years from before this start were BCE (for “Before Common Era”). Presumably, scholars of Jewish, Muslim, Chinese, or pagan faiths found it disdainful to identify the date with a Christian reference. So they kept the accounting but changed the words around to be less offensive.

And isn’t that just the ultimate in hypocrisy? Use the Christian calendar but don’t actually acknowledge it. Pfui!

If you want to abandon the Western, Christian tradition, let’s not take the easy path. Here are some alternate datings we might adopt in a totally nonsectarian fashion, basing our counting on something important to today’s crop of irreligious scientists.

We could, for example, celebrate the first use of a printing press with movable type, by Johannes Gutenberg in 1439, which started the revolution in information technology and popular literacy—at least here in the West. Of course, the first thing Gutenberg printed was the Bible, so that’s still kind of a religious holiday.

We might base the calendar on René Descartes and his publication in 1637 of the Discours de la Méthode, the traditional founding of the Scientific Method. His line of thinking cleared out much of the superstition surrounding natural events and led to our technocratic view of the universe. Or we could recognize Isaac Newton, whose Principia Mathematica in 1687 laid the second brick in the foundations of the scientific revolution, and whose other studies led to our better understanding of gravity, light, and optics.

Or we might celebrate Albert Einstein’s theories of Special Relativity (1905) and General Relativity (1916), which turned Newton’s gravity, the simultaneously occurring universe, and our understanding of space and time on their heads.

Taking up any of these developments as the most important events of the modern world would make our age relatively short: just 581 years for the Information Age, or between 333 and 383 years for the Scientific Age, or only 104 to 115 years for the Relativistic Age.
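Those spans are just subtraction from the epoch year. A small tally (the era labels are my own shorthand for the events named above):

```python
# Founding events proposed above, with their epoch years in the Western (CE) count.
EPOCHS_CE = {
    "Information Age (Gutenberg's press)": 1439,
    "Scientific Age (Descartes)": 1637,
    "Scientific Age (Newton)": 1687,
    "Relativistic Age (Special Relativity)": 1905,
    "Relativistic Age (General Relativity)": 1916,
}

def age_of(epoch_ce: int, now_ce: int = 2020) -> int:
    """Years elapsed since the founding event, as counted in the text."""
    return now_ce - epoch_ce

for label, epoch in EPOCHS_CE.items():
    print(f"{label}: {age_of(epoch)} years")
```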

If you prefer hard mechanical achievements to philosophical personalities, you could base the calendar on James Watt’s technical improvements in 1776 on Thomas Newcomen’s original steam engine of 1712, which kicked off the Transportation Age; on the Manhattan Project’s fission bomb of 1945, which started the Atomic Age; or the first working transistors developed by a team at Bell Labs in 1947, which launched the Computer Age—with the help of Alan Turing and the primitive computational devices of the previous decade. Any of those events would give us an even shorter history—most of them putting us still in the Age of Discovery.2

But for now, until the modern era settles down into a quiet, respectable, backward-looking middle age, we’ll just have to muddle along with “Before Christ” and “Anno Domini” and their secular equivalents. And that’s a good thing, because the computers could change their dating systems more easily and a lot faster than any of us humans could learn to handle a new calendar.

1. For now, we will ignore the fact that for some of these reckonings the calendar is based on the lunar cycle of about 29.5 days, and so the year may have more than 12 months, or fewer than the scientifically measured 365.25 days, to which we are all accustomed. And anyway, Christ wasn’t exactly born on December 25.

2. Think of the development of the first truly portable artificial intelligence and its control of existing computer and automation systems. Think of the time when we will develop the first fusion reactor that puts out more energy than it takes to ignite, which will revolutionize our energy production. This age has a long way to go yet.

Sunday, December 20, 2020

Apocalypse Never

Apocalypse meteor storm

All versions of popular politics are to some degree insane. This is because they try to reduce a complex, variegated equation into a simple, understandable formula. And then they try to pass off the formula as some version of reality. Not smart.

I pity the Marxists and their brothers and sisters among the socialists and other collectivists. They believe that if they can reduce economics to a system of distribution governed by dispassionate technical experts in the government, then poverty, unmet needs, and inequality will disappear. Then history—the history of colonial exploitation, aristocratic dynasties, and economic revolutions—will come to an end. By invoking the power of a single-minded state and the obedience of all who dwell within it, they believe they can achieve paradise on Earth and that nothing will ever change after that.

I pity the libertarians and their brothers and sisters among the anarchists1 and rabidly rugged individualists. They believe that if they can end the stultifying maze of laws and regulations raised by an archaic and backward-looking society, then humanity will be freed to become truly dynamic and creative. Then human creativity will be invited to achieve … what exactly? Honest relations among men, between men and women, between parents and children? The opportunity for each man and woman to get—and take—what they want in a paradise where the ripe fruit is always low on the tree and everyone has a set of pruning shears? I’m not sure what the point of anarchy would be.2

The socialists and other collectivists don’t understand that the human condition is compounded of desire, restlessness, and imagination. Oh, and not a little greed and envy. The underlying principle of economics, all economics, is that wants and needs are infinite while resources are finite. From this principle the marketplace naturally arises, where people measure their wants and desires against what they will have to pay in terms of their time, energy, and the stored value of these gifts in currency.

This is true even in fantasy futures like Star Trek’s, where unlimited energy through matter-antimatter conversion and unlimited material goods through energy-to-matter replication can satisfy all human needs. Do you know what it costs to generate a few grams of antimatter? The stuff is more valuable than gold or diamonds. And do you know how much energy it takes to transmute streaming photons or electrons into protons and neutrons and then fuse them into the atoms and molecules of actual matter? Your whole society would go broke in a nanosecond trying to burn enough antimatter to make a replicated dish of ice cream. And none of that will bend the “fabric of spacetime” so that you can go gallivanting through the galaxy with warp drive. But I digress …

Any political program that tells you wants and needs can be satisfied equitably by relinquishing human initiative to a dispassionate government—or to an incorruptible team of robots or artificial intelligences, because we know how inventive humans are and how easily machines can be hacked—is selling you utopia, the place that never was or will be, Heaven on Earth, a dream.

The libertarians and anarchists don’t realize that the human disposition is not kindly—not where matters of life and death and the survival of one’s children are concerned. The underlying principle of society, of all social organizations, is that the individual must give up some measure of freedom to achieve some measure of safety. From this disposition arises the tribe, the village, the nation-state, and the civilization that imposes laws and asks for obedience. And when an individual will not observe the laws and render obedience, it imposes sanctions and penalties.

This is true even in the many post-apocalyptic futures, played out in endless movies and television series, where society has fallen apart due to nuclear war, environmental catastrophe, or other unspecified collapse. Then the average human being—and even those main characters with superior intelligence, coping mechanisms, and fighting skills—is free and happy for about five seconds. But if they are lucky and the writers of these stories are smart, they quickly begin to form tribes and villages with laws and protections, recreating civilization all over again. The clean slate is not a happy state but a dangerous place that no one wants to occupy for long.

The political programs that work are those that recognize human and resource limitations, the need for compromise, and an understanding of the balance between personal freedom and cooperation. There are no end states, not on either side of the spectrum. No one will achieve any lasting condition of total control or total freedom. The Nazis and the Soviets tried the former, and their Nirvana lasted twelve years in one case and seventy in the other, and they caused untold suffering in one case and bitter stagnation with a heap of suffering in the other. No one has yet achieved much in the way of unlimited freedom, although the French Revolution tried and quickly—just about four years, with devastation already brewing—devolved into political infighting and the Reign of Terror, complete with daily executions. People simply are not designed, morally or psychologically, to be transported into paradise all at once while still living.

And the reality is that any society after an excursion into totalitarian control or social anarchy always reverts to some kind of economic and political mean. The average citizen still has to get up in the morning, go to work, take care of the family, and pay taxes. The sun rises and sets. Life goes on as a struggle. And it will be this way until the sun burns out or humanity dies out and leaves the planet to the care of the earthworms and cockroaches.

This country, with its blend of free-market economy and shareholder capitalism balanced against a system of government taxation, regulation, and safety nets, enjoying a huge admixture of technological imagination and creativity, has come the closest to satisfying the most needs of the most people—ever, in the history of the world. Yes, there are pockets of want and misery, although our poor people are rich compared to most of the developing world. This is why other people are willing to walk across deserts and suffocate inside freight containers in order to get here. We can always tinker with the blend of freedom, regulation, and technology, nipping here and tucking there. But we throw away the whole thing and start again on a dream of total control or total freedom at our peril.

And that is the politics of realists, not dreamers.

1. And yes, anarchists often ride along with the Marxists and socialists in the early stages of the political struggle, believing that the fastest way to get to Nirvana is to begin by tearing down the structures that already exist.

2. Although I sometimes vote with the libertarians—being a small-government, more-market, greater-personal-freedom kind of guy—I really do not understand the extreme position or the hunger of the anarchists. (Remember, I’m not a purist or absolutist about anything.) In my view, the anarchists are not moving toward anything good but away from a political and economic situation that they find too complex, too boring, or too troublesome to be allowed to continue. It’s a form of nihilism.

Sunday, December 13, 2020

Life Without Fingernails

Robot head

I tend to keep my fingernails short—not into the quick but close to it. So every month or six weeks, after trimming them, I experience a week or so when I exist like one of those robots with grip-sensitive finger pads like we see on television and in advanced science articles but not much with which to cut or scrape. Articles written about robots deal with joint articulation, wrist movement, and grip pressure. But no one seems to imagine what a humanoid robot without fingernails would go through on a daily basis.1

Take that little spot of dried goo on a mirror, plastic countertop, or other and more scratchable surface. Abrasion with a fingertip or a rubberized grip pad won’t quite wear it away. The keratin in a fingernail—akin to the protein that makes up strands of your hair—has just the right combination of hardness and flexibility, with an ultimate softness and “give,” to wipe away that spot without marring the underlying finish.

Take Ziploc bags. When they really stick, so that just pulling on the outside or inside bag material merely tears along the zipper seam, only a fingernail gently inserted between the two lips can tease them apart without compromising the bag’s sealing properties.

Take the snap tabs on soft drink cans and the peelable edges of the stickers on fruit, the price stickers on items intended as gifts, and virtually anything else that adheres or clamps but is intended to come loose with a soft edge and a little pressure. Trying to get a purchase under that edge with a fingertip, using friction from the ridges of your fingerprint, will almost but not quite work. To really get a grip, you need a fingernail.

And, as I read somewhere, while fingernails originated in animal claws, fighting weapons for everything from house cats to bears, their main evolutionary purpose in humans has been to anchor the skin around the fingertip, enhancing our grip on everything we grasp.

Consider every time you use a fingernail in your daily life, every time it has saved you going into the utility drawer for a screwdriver or a spatula, and you can see where lack of nails would hamper the effectiveness of a humanoid household robot. To overcome this disadvantage, designers would have to give the machine an extensible blade of soft-but-not-too-soft plastic—and replace it every time it wore away in use.

About the only thing a human does that a robot doesn’t is scratch him- or herself, and for that fingernails are perfect, too.

But maybe we shouldn’t speak too widely about the usefulness of fingernails. When the apocalypse comes, we can send messages on the underside of peelable stickers and store life’s essentials in Ziploc bags—and then use them to defeat our robot overlords.

1. Of course, most robots in the real world to come will not have humanlike hands, with fingers, opposable thumbs, and gripping pads. They will not be the bipedal, mannequin-bodied, humanoid surrogates imagined in early science fiction like I, Robot. The mechanisms that we wish to automate will have intelligence embedded in their systems. And the artificially intelligent software that guides and makes decisions for them will be so easily reproducible and installable that no one would bother to pay for a multi-function humanoid robot.
    Would you design a robot with just two cameras for binocular vision, two audio receptors for directional hearing, two hands to grip a steering wheel, and two feet to operate accelerator and brake pedals in order to drive a car? And what then—your robot chauffeur would get out, go into the kitchen, pick up a paring knife, and slice carrots? Of course not. You would automate a self-driving car with multiple cameras and other sensors, including radar, and hook its decision-making capacity directly into the engine and brake management systems. And you would have a separate kitchen ’bot to prepare and serve food. A thousand other chores would be handled at the point of function by specifically designed automated systems. That is the future we’re headed toward.

Sunday, December 6, 2020

The End of the Republic


So here we are. A narrowly contested election brought strong hints of vote fraud on many levels and by various means—but no evidence that either the mainstream media or half of the electorate will acknowledge. An enfeebled candidate for president and an unpopular candidate for vice president have won a huge number of votes in key precincts despite minimal campaigning and public appearances, against a president who aggressively campaigned with rallies attracting thousands. And this is after four years of “resistance,” claims of Russian interference in 2016, an attempted impeachment, and a sudden virus outbreak that has been surrounded by conflicting projections, recommendations, claims, and statistics. And now, failing to get behind the presumed winner and show “unity” is treated as an embarrassment. Something is not right.

And yet … if there was massive and coordinated vote fraud in the battleground states, I can see why most local judges and the media—even nominally conservative reporters and commentators—are reluctant to give it credit and expose it. The consequences for the nation and the democratic process would be just too catastrophic, with civil disruptions and the threat of war such as we haven’t seen in more than a century and a half. Reasonable people would want to draw a curtain on the situation in the same way that the Warren Commission, charged with probing the assassination of John F. Kennedy, sealed its evidence for 75 years: if their probe had exposed evidence of the Soviet Union or Cuba planning and pulling off the murder of a sitting President, the public outcry could have forced Congress into declaring World War III. Better to throw a blanket over the whole thing.

But if there was coordinated election fraud, then I would feel like an old Roman at the end of the Republic, where Caesar had just been declared Dictator for Life on the strength of his personal legions and then assassinated. And the result was not a return to the normal political process but a free-for-all.

The interesting thing about the Roman Imperium is that the term imperator did not originally mean “emperor”—that came later—but simply “field marshal,” reflecting the leader’s military backing.1 And yet, while one man held total control of the state because of a personal military force, all the forms of the republican government were obeyed. The cursus honorum was still in place, and Roman citizens of good background still filled the correct political and religious offices. Each of the Caesars was officially simply consul for the year, elected along with a nonentity whom he named to be his co-consul, but everyone knew where the real power lay. None of the ancient and sacred laws of the Twelve Tables was changed. They simply meant nothing important anymore.

The same thing could happen in the United States. Without the input of the people through a trustworthy voting process, all the forms of the republic could be maintained and still mean nothing. You wouldn’t have to change a word of the Constitution if you simply decided to accept a different meaning for the words.

None of the forms of government put in place by the main body of the U.S. Constitution have changed. We still formally adhere to the separation of powers of the legislature, executive, and judiciary branches and the checks and balances put in place to keep one or the other from taking full control. And yet over the years—and this has happened on numerous occasions—the power of the executive has expanded with administrative offices that are largely unelected and at the operative levels not even appointed, and yet they interpret the laws made by the legislature. And when Congress won’t give a powerful President the results he—so far “he”; we will see in the future—wants, the President whips out a pen and creates an executive order. The judiciary at the state and federal level then interprets the laws, the executive orders, and the Constitution itself according to their own political likes and dislikes. Meanwhile—at least over the last two decades—the legislature has been free to engage in partisan squabbling and gridlock, achieving little of note, while the country’s government keeps evolving and advancing in the direction of administrative and judicial law.

If you were to create a “heat map” of where the actual power and authority in this country exist and compare it to the structure envisioned in the Constitution, I believe you would find massive areas of non-overlap.

The Bill of Rights would be no more sacred. Protections for freedom of speech and religion in the First Amendment would still be guaranteed. But as the past year has shown, your freedom to assemble and worship can be curtailed in the event of a public calamity like the pandemic. Certain words and phrases have long been banned and punished as “hate speech.” And now we know that your freedom of speech may be freely infringed by a privately operated communications system, part of the “social media,” if the operators believe it contradicts the narrative imposed by the ruling majority.

Sure, the Second Amendment guarantees your right to bear arms—but we could interpret those words so that they apply only so long as you are a member of the army, the national guard, a police force, or other “well regulated militia.” There is nothing in there about hunting or defense of home and self.

Nor would it be “cruel and unusual punishment,” according to the Eighth Amendment, if we kept you in permanent solitary confinement or filled you with mind-altering drugs to treat your chronic “social psychosis” and “false consciousness,” brought on by your non-supportive political views. These are medical treatments after all, not punishments.

And so it goes. As Humpty Dumpty said to Alice: “When I use a word, it means just what I choose it to mean—neither more nor less.” And two plus two equals five.

As Benjamin Franklin also supposedly said, in response to an inquiry about the outcome of the Constitutional Convention in 1787: “A republic, if you can keep it.” And yes, like the old Romans of the first century BC, we can keep all the forms, all the laws, all the words written down in ink on parchment or cast in bronze, and still have something different.

Whether we can do anything about this—and whether anyone who benefits from the current situation cares—is another matter. When no one cares, then it doesn’t matter.

1. The early Romans had seven legendary kings, the last one ousted late in the sixth century BC. That started the Republic, with its popularly elected government offices up through the leadership position of the two co-consuls, who together served alternate months for only one year and then could not be re-elected until after another ten years. This was supposedly a guarantee against one man becoming too powerful.
    The experience of being ruled by kings was apparently so awful that the Romans were simply allergic to the title “king.” They probably would have joined the conspirators en masse to tear Caesar apart if he had taken that title for himself. A dictator—the word just means “speaker”—serving for a select period, even for life, was the closest they could allow themselves to come to monarchical authority in a period of crisis. However, when Caesar’s legal heir Octavian first avenged the assassination and then succeeded Caesar as leader of his armies and head of the state, they accepted him as princeps—“leader” or “first citizen,” from which we get the word “prince”—as well as the imperator. And by the time Octavian, who was subsequently styled “Augustus,” was an old man, the Romans accepted his rule as being effectively hereditary. They got themselves a monarch anyway, but by another name.

Sunday, November 29, 2020

The Wichita Lineman


Glen Campbell’s “Wichita Lineman”—a poignant song from my college days—has been running through my head. And after ten years working in the electric and gas utility business, back in the 1980s, I still can’t get the inconsistencies out of my mind.

I am a lineman for the county
And I drive the main road,
Searchin’ in the sun for another overload.

Where to begin? Well, first, “lineman” is not usually a county function. In most service territories, the line crews are employed by the public utility that owns, operates, and maintains the powerlines. So I thought we had a difficulty in the first line. But, on investigation, I found that Wichita is served by the Sedgwick County Electric Cooperative, so in that case the lineman probably does work for the county.

But to continue … In most utilities, the linemen don’t drive around looking for trouble. That job—which is a promotion above the classification of lineman—goes to a “troubleman.” This person works alone, as in the song, and does patrol the distribution and transmission lines. And when the troubleman finds trouble, he—sometimes nowadays, but rarely, a she—calls out a line crew to fix it. Note that while some powerlines, especially distribution lines in neighborhoods, have rights of way along “the main road,” most of the higher-voltage transmission lines cut across country. So the troubleman spends more time on back roads and in the dirt than on easy main roads.

And then, what kind of trouble does this person look for? An “overload” implies a powerline that is carrying too much electricity. That is, the utility is operating it at too high a load for conditions. Electricity passing through a wire meets resistance and sheds some of its energy in the form of heat. On hot and windless days, the heat can start to melt—or at least soften—a one-inch cable braided from aluminum strands, so that it sags between towers or poles and threatens to touch the trees, brush, or even ground along the right of way. That would be an obvious problem. And maybe in Kansas, in 1968 when the song was written, utility operators wouldn’t know when they were overloading their lines and causing fires. These days, powerlines are monitored by remote sensing equipment and the operator varies the load to match conditions.
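The underlying physics is Joule heating: the power a line sheds as heat grows with the square of the current (P = I²R). A toy calculation (the resistance figure is my assumption, only a ballpark for a large braided aluminum conductor):

```python
def heat_watts_per_km(current_amps: float, resistance_ohms_per_km: float) -> float:
    """Power dissipated as heat along one kilometer of conductor: P = I^2 * R."""
    return current_amps ** 2 * resistance_ohms_per_km

R = 0.07  # ohm/km, an assumed ballpark value for a one-inch aluminum cable

print(heat_watts_per_km(400, R))  # normal loading
print(heat_watts_per_km(800, R))  # double the current, four times the heat
```

The squared term is why a modest overload matters so much: doubling the load quadruples the heat the line must shed into still, hot air.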

What most troublemen are looking for is the location of a known fault. Sometimes, in a high wind, two of the three phases on the line—carried in the three separate wires strung from pole to pole—come together, short out, and cause a fault. Then, in a modern powerline, a device called a “recloser” works like a fuse: it pops, interrupting the flow of current, then tries to close again in case the touch was momentary. If the fault persists, it stays open. The troubleman finds the hanging recloser, gets out a long extension pole, and closes it. But more often the trouble is a line brought down by someone driving into a pole, or the wind causing a tree branch to fall into the lines and bring them down. Then the troubleman calls out a line crew to fix it.
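The recloser’s trip-and-retry behavior is easy to sketch as logic. An illustration only: the function name and the three-attempt count are my assumptions, not any utility’s actual settings.

```python
def recloser_trial(fault_cleared_after: int, max_attempts: int = 3) -> str:
    """Illustrative recloser behavior: trip on a fault, attempt to reclose a
    few times, and lock out (stay open) if the fault persists, leaving the
    troubleman to find it and close it manually with an extension pole.

    fault_cleared_after: number of the attempt by which the fault (say, two
    wind-blown phases touching) has cleared itself."""
    for attempt in range(1, max_attempts + 1):
        if attempt >= fault_cleared_after:
            return f"closed after attempt {attempt}"  # momentary fault: power restored
    return "locked out"  # persistent fault: stays open for the troubleman

print(recloser_trial(fault_cleared_after=1))   # momentary contact clears at once
print(recloser_trial(fault_cleared_after=99))  # downed line: recloser locks out
```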

I hear you singin’ in the wire;
I can hear you through the whine.
And the Wichita lineman is still on the line.

I understand that you can sometimes hear the wind in the wires and think it’s your loved one singing. If you hear a whine or a buzz, then you’re not dealing with a neighborhood distribution line. More likely it’s a high-voltage transmission line, and the sound is caused by moisture or dust carrying some of that current outside the wire and along its surface, like a giant Tesla coil. And yes, a lineman or troubleman might hear this too.

I know I need a small vacation
But it don’t look like rain.
And if it snows that stretch down south won’t ever stand the strain.
And I need you more than want you,
And I want you for all time.
And the Wichita lineman is still on the line.

All right, this one’s a definite no. Well, maybe in Kansas, where the rain comes straight down and the wind doesn’t blow. In California, however, rain means winter storms with lots of wind. That means lines coming down and lots of trouble to repair. You don’t want your troublemen and linemen going on vacation then. You want everyone on call and ready to roll at all hours of the day or night. And maybe, in Kansas on the prairie, you can get so much snow that it weighs down the lines or the towers and causes them to collapse. But if a troubleman knew of a weakened section vulnerable to such weather, he would have put in an order to repair or replace it. Maybe the utility company or cooperative has not yet executed that order, but prudent management suggests you do that while the sun is shining and not wait for the snow to take out part of your system.

And then, this lineman is thinking about his lady love and needing her and wanting her. Some senior member of the crew is going to slap him upside his hardhat and tell him to get his mind on the job and get back to work. That’s the reality.

But then, “Wichita Troubleman” would have been a different type of song. And none of it would have rhymed.

Sunday, November 22, 2020

On Abortion

Human embryo

I try not to be too overtly political on these blogs, because I have friends and readers on both sides of the aisle. Also, I am generally in the middle on most issues, polling about three points right of center on a scale of one hundred either way. But this issue has been batting around in my head recently, and writing it down is a way to get it out.

First, let me say that I’m not generally against killing. I mean, we humans do it all the time. I eat meat and feel unashamed. I support war as the last resort of the beset and desperate. War means killing. War means horror. But war is what you do when negotiations break down and surrender is not an option.

Like most Americans, I think, I am not opposed to abortion when it’s done in the first trimester. At that time the embryo is still developing and doesn’t have much of a nervous system, so not a lot of sentience. While I believe life begins at conception, the early-stage fetus is still just the potential for human life. A lot of things can go wrong in a pregnancy and do. And a miscarriage in the first trimester is more a dashing of the parents’ hopes than the destruction of a human person. Still, I don’t like abortion as a birth-control option, because it’s invasive and it seems to teach the woman’s body how to miscarry. But if she is beyond the option of less invasive measures and still wishes not to be pregnant, that will be her choice.

The second and third trimesters become, for me, more problematic. The fetus is developing a nervous system, sensation, and—if we can believe the memes that would have you play Bach to your belly, talk to it, and send happy thoughts in that direction—some self-awareness. Destroying a fetus in these circumstances becomes more like the murder of a human person than the destruction of a clump of cells. I have some feelings about that, and so do many Americans, I think.

Abortion at the moment of birth—what “partial-birth” abortion would seem to advocate—is, in my mind, really the killing of a viable human baby. I understand that the birth may be induced for this purpose, but that does not make it better. I also understand that the people who advocate for this are less concerned with the mental or physical health of the mother than they are with the legal standing of the abortion issue. They are absolutists and legal purists: if abortion in principle is not made an absolute right at all stages of a pregnancy, then it can be challenged and overturned at any stage, including the moment of conception.

I am not an absolutist or a purist about anything. So the appeal of this argument leaves me cold. I believe people should be responsible for their actions: if a woman decides she does not want a child, she should make up her mind in the first three or four months, not wait until the baby is almost born. Bending the law and common sense to accommodate her every whim is not good practice.

Also, abortion at the end of a pregnancy crosses a line that, I think, most moral people are unwilling to cross. If it’s acceptable to kill a baby at the moment of birth, then why not two weeks later? Does the child keep you up at night? Do you have regrets about becoming a parent? Smother it! Does your two-year-old daughter throw tantrums and grate on your nerves? Drown her! Did your sixteen-year-old son borrow the car and dent it? Shoot him! Did your sixty-year-old daughter tell you it’s time you were put in the old folks’ home? Stab or poison her!

Again, I’m not completely opposed to killing—at least not when it’s done with cause. But I do believe people should take responsibility for their actions. And their response to pressure, aggravation, and opposition should be proportional to the incident, where casual murder would be an extreme reaction. While I don’t have an absolute respect for life—again, not absolutist about anything—I do believe being careful and respecting the rights of other sentient beings, especially among your own species, is a moral good. It’s something to strive for, even if we cannot always attain it.

Now, many women are also saying, with reason, “My body, my choice.” This is to say that other people, men in particular, and society in general, have no right to an opinion in the matter of abortion, nor should they be allowed to make rules about it. And, in my view, this is true up to a point—that point being somewhere in the second trimester. Until then, the fetus might be considered just a “clump of cells,” not much different from a tumor, and certainly without a separate identity or viability, perhaps with the potential for humanity but not exactly a human being. But after that point—and we can debate where to draw the line—the fetus becomes a separate entity with sensation, some awareness, and more than just potential. At that point, the woman is hosting, sharing her body with, another living being. And whether you like the biological facts or not, that becomes a societal concern.

To say, “My body, my choice” about the entire process, up to and through the stage of actually giving birth, is like someone saying, “My dog, my property.” After all, they own the dog, bought it, fed it, took care of it in their fashion, and can now decide what to do with it. If a dog owner wants to beat it, starve it, leave it out in the cold chained to a tree, or even abandon it alongside the road, then society should have no say in the matter. The dog is a wholly owned possession that may be disposed of at the owner’s whim. Right?

That’s one view of the legal process about ownership and responsibility, but most of us would disagree. A world where such mind-your-own-business callousness is the societal norm would be a cold and unfeeling place, without pity or concern for the weak and defenseless.

That is not a world I want to inhabit.

Sunday, November 15, 2020

The Unexpected Candidates

Puppet master

Something very strange is going on. Or, put another way with more emphasis, what the hell is going on? Or, as we used to say back in I think it was the 1960s or early ’70s, “Who the hell for President.” Simply stated, the American electorate over the past decade and maybe more has been choosing, or perhaps being offered, the most surprising, least expected, and sometimes least qualified candidates for the highest office in the land.

The Presidency is the most prominent and most powerful popularly elected position in the country. It ranks above Speaker of the House, who is only elected by members of Congress; above Majority Leader in the Senate, who is only elected by members of the majority party in Congress; and Chief Justice of the Supreme Court, who is appointed by the President and confirmed in the Senate. Of all the key players in our national government, the President is the only one we all get together and choose, first in the primary elections or party caucuses in each state and then in the national election.

Yes, the Republican National Committee and the Democratic National Committee have great influence on how the candidates of each major party are chosen. The national committees solicit and direct funding for campaigns and write the rules for party organization and choosing delegates to their national conventions, where input from the primaries is reduced to votes for and against potential candidates. And sometimes the national committees, whose members and influence may not be publicly recognized—that “smoke-filled room” thing—put their fingers on the scale. In both parties, the votes at their national convention include both “pledged” delegates, representing the results of the primary election in their state, and “unpledged” delegates, who presumably can vote their conscience, or the desires of the party structure, or whatever.

Up until 2018, the Democrats had a large number of “superdelegates” in this position, representing members of Congress, governors, and past Presidents. They could vote however they themselves wanted or at the direction of the party. After 2018, the superdelegates were forbidden to vote on the first ballot of the convention, effectively letting the people decide that much, unless the outcome was beyond doubt. In 2015, the Republicans ruled that unpledged delegates had to vote in accordance with the popular vote in their state’s primary election.

And then, there is the matter of whether the state holds an open, semi-open, or closed primary election, reflecting when and how people not registered with a particular party can vote for the candidates of other parties. Only thirteen states and the District of Columbia have closed primaries, where the voter is only offered the choice of candidates within his or her registered party affiliation. Fifteen states have semi-closed primaries, where only independent voters may choose among candidates on any of the affiliated ballots, or may change their registration on election day. Fourteen states have open primaries, where the voter chooses the party ballot on election day. Others, including my own California, have some kind of blanket primary, where the voters choose from a roster of all candidates from all parties.

So how much actual choice any individual voter has in the selection of the final candidates put forward on the November ballot is open to question.

Still … what the hell has been happening? Sometimes, the party’s candidate has been around so long, raised so much money, or tried often enough that the national committee, the primary voters, and the delegates decide that, come what may, “it’s his [or her] time.” This is apparently what happened when the Republicans selected the cold-natured Bob Dole to run in 1996 and Democrats promoted the unlovable and sometimes questionable Hillary Clinton in 2016. In both cases, party loyalists had to grit their teeth and vote the platform. At least, both candidates had solid careers in the Senate, and Clinton had been Secretary of State, a high and influential office in any administration.

But in 2008, the Democrats nominated Barack Obama, a junior senator with limited government experience, with sealed transcripts and a ghost-written autobiography—but selected presumably because he was the only obvious Black candidate, and “it was time”—and the Republicans nominated John McCain, an established senator from Arizona but one who had voted against his party’s interest so often that he felt like an independent.

In 2012, the Republicans nominated Mitt Romney, a businessman, son of the former governor of Michigan, and chief executive of the organizing committee for the 2002 Winter Olympics. He was a nice enough guy, but still not ready for the presidency.1

In 2016, the Democrats finally decided it was Hillary Clinton’s “time,” narrowly excluding the senator from Vermont, Bernie Sanders, whose party affiliation is officially “independent” and who unabashedly claims to admire socialism. And the Republicans passed over a dozen able candidates with political experience, including governors and senators, as well as a nationally prominent businesswoman with executive experience, to choose Donald Trump, a real estate magnate and reality-television star with no background in electoral politics.

And then in 2020, we almost got Sanders as the Democratic nominee, but at the eleventh hour he was passed over in favor of Joe Biden, a long-time senator, vice president under Obama, previous candidate for president—but also a man of obviously frail and perhaps failing mental and physical health. If it was “his time,” that was sometime in the past. Biden was joined on the Democratic ticket by Kamala Harris, the junior senator from California and former state attorney general, who dropped out of the field of presidential candidates before her first primary. These are hardly charismatic personalities.

It used to be that candidates for the highest office in the land would have extensive political experience, usually as a governor running one of the larger states or as an influential and long-serving member of Congress, at least as a senator. But lately we have seen a parade of candidates chosen for some other reason. And not all of them have outstanding service in some other line of work, such as Dwight Eisenhower in the 1952 election after a leadership role in winning World War II.

It is almost as if the parties, or the people themselves, are devaluing the office, saying “Who the hell for President.” And this is at a time when Congress defers more and more of the details in the laws they pass to the judgment of unelected bureaucrats in the Executive branch and lets the legality of those laws be decided in cases before the Supreme Court. You would think that the person who appoints the senior executives in the administration, sets its day-to-day tone, can veto legislation, and nominates the federal judges and Supreme Court justices should be a person of proven capability, probity, and reasonable judgment.

Instead, we seem to get more than our fair share of nonentities and, sometimes, thinly disguised crooks and buffoons. Who chooses these people? What the hell is happening?

1. And it was only in the last year or so that he became the junior senator from Utah, gaining the political experience that he should have had eight years ago.

Sunday, November 1, 2020

Electricity's Dirty Secret

Power lines

For the decade of the 1980s I worked in the Corporate Communications department of the Pacific Gas & Electric Company, PG&E, one of the largest energy companies in the country, with a service territory covering most of Northern California. One of the biggest things I learned from this time—aside from the fact that your local utility is made up of good people who support their community—is that there are many ways to generate electricity and the key to choosing among them is economics rather than technology.

By a quirk of geography and history, PG&E had—and still has, for all I know—one of the most diversified generating systems in the country, although some of that generating capacity has since been spun off to private owners and suppliers. The company inherited a network of dams and flumes in the Sierra Nevada that provided powerful water jets for hydraulic gold mining in the 19th century, and these were easily converted to run turbines in hydroelectric powerhouses up and down the mountains. It had four large thermal power plants—steam boilers driving turbine generators—that drew on the company’s abundant natural gas supplies for fuel. PG&E also operates smaller units that burn the gas to directly drive turbines, similar to a grounded jet engine attached to a generator. It built a major nuclear power plant at Diablo Cove in San Luis Obispo County, and built almost two dozen generating units drawing on The Geysers geothermal steam field in Sonoma and Lake counties. It draws power from the world’s largest photovoltaic power plant, on the Carrizo Plain, also in San Luis Obispo County, and from the Shiloh wind-power farm in the Montezuma Hills along the Sacramento River in Solano County, among others. The company buys electricity from the Western System Power Pool (WSPP) and the California Independent System Operator (CaISO).

With all of this diversity, PG&E’s energy cost is relatively low, depending on factors like snowfall in the Sierras to feed those dams and the state of the aquifer feeding the steam fields. The company does not draw on enough renewable energy—yet—to be much affected by variations in the wind and sundown over the state.

If the state ever fulfills its promise to get rid of all fossil fuels and provide all power from renewables like wind and solar, the remaining nuclear and geothermal assets will not be able to make up the difference from those abandoned gas-fired power plants.1 There is talk of making up the difference from windless days and dark nights with some kind of energy storage: batteries, compressed air in underground chambers, or superconducting materials that let an electrical charge chase round and round in a donut-like torus. None of these technologies has been tried or proven at any scale needed to supply a utility grid. There is also talk of mandating solar powered roofing in all new housing and retrofitting existing roofs, with transformers to convert the electricity to household current and with batteries to supply energy on dark days and at night. Aside from the initial cost and payback time, generally measured in tens of years, these plans are intended—at least in the promoters’ dreams—to put the local utility entirely out of business.2

The dream of “free electricity” without fuel costs or emissions, using wind and solar power, runs into some basic engineering realities involving energy efficiency and capital cost.

In making these technologies work, the engineer has to move from conceptual design—linking up components, energy flows, and costs in back-of-the-envelope calculations and drawings—to detail design—putting the components in place at the right scale, establishing the true cost of each component, and accounting for variables like heat loss and line losses.3

Engineers constantly work with another variable set, too. For them, there is no such thing as perfection, no solution that is best under all conditions. Everything is a tradeoff in the engineer’s world. Instead of “good” and “bad,” the engineer thinks in terms of “better” and “worse.” You can make electricity with a gasoline generator—if the EPA and county authorities will approve it—or with a hand crank, or by rubbing a silk scarf on a glass rod. The question is always—and this is what I learned at PG&E—at what site, with what investments, and using what fuel supply at what cost? How attractive or interesting or politically correct the technology might be is not a factor.

Solar photovoltaics—generating an electric current by using the energy in sunlight to pass an electron through a semiconductor substrate—is about 20% to 22% efficient, even in cells and panels of the highest quality. This means that nearly four-fifths of the solar energy that falls on them is lost to heat or reflection. And how that efficiency is affected by dust or a layer of snow and ice is still undetermined in large-scale applications, although probably not to good effect. Perhaps, in time, research into new materials can boost that efficiency up to maybe 30%, but much further than that doesn’t seem to be in the cards.
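To put that efficiency figure in concrete terms, here is a back-of-the-envelope sketch. The insolation value of roughly 1,000 watts per square meter under full sun at the surface is an assumed round number for illustration, as is the 20% panel efficiency at the low end of the range quoted above:

```python
# Back-of-the-envelope: usable electric power from a solar panel.
# Assumed figures for illustration only:
#   peak insolation at the surface ~ 1,000 W per square meter
#   panel efficiency ~ 20% (low end of the 20-22% range)

insolation_w_per_m2 = 1000.0   # sunlight striking the panel
efficiency = 0.20              # fraction converted to electricity

usable_w_per_m2 = insolation_w_per_m2 * efficiency
lost_fraction = 1.0 - efficiency

print(f"Usable power: {usable_w_per_m2:.0f} W per square meter")
print(f"Fraction lost to heat and reflection: {lost_fraction:.0%}")
```

So a square meter of premium panel in full sun yields on the order of 200 watts, roughly enough to run one desktop computer, with the other 800 watts going to waste heat and reflection.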

Wind turbines can extract at most about 59% of the energy in the moving air—the theoretical Betz limit—with modern machines reaching somewhere in the 40% to 50% range in practice. This is comparable to the energy efficiency of a gas turbine or thermal power plant. But wind farms require the right conditions, a place with strong, steady, and predictable winds. Like a geothermal steam field, such locations are a resource that can’t be established by fiat or political rezoning. And wind turbines, like any machine dealing with strong forces, are subject to mechanical stresses on the blades and shafts. Although their energy resource is free, the capital investment to harvest it is expensive, not easy to maintain—that is, a heavy generator on a tall tower, sometimes sited on a hilltop, is harder to fix than a turbine generator under cover in a power plant—and subject to depreciation and eventual replacement.
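The upper end of that efficiency range is not arbitrary: it is the Betz limit, the theoretical maximum fraction (16/27, about 59.3%) of a wind stream’s kinetic energy that any turbine can extract. A short sketch, using assumed round numbers for air density, rotor size, and wind speed, shows the arithmetic:

```python
import math

# Kinetic power of the air stream through a rotor disk:
#   P = 0.5 * rho * A * v**3
# The Betz limit caps the extractable fraction at 16/27.
# All figures below are illustrative assumptions, not data.

rho = 1.225              # air density at sea level, kg per cubic meter
rotor_diameter = 100.0   # meters, a hypothetical utility-scale turbine
v = 10.0                 # steady wind speed, m/s

area = math.pi * (rotor_diameter / 2) ** 2   # swept rotor area
wind_power_w = 0.5 * rho * area * v ** 3     # power in the wind
betz_limit = 16 / 27                         # ~0.593
max_extractable_w = wind_power_w * betz_limit

print(f"Power in the wind:      {wind_power_w / 1e6:.2f} MW")
print(f"Betz-limited maximum:   {max_extractable_w / 1e6:.2f} MW")
```

Note the cube on wind speed: halving the wind cuts the available power by a factor of eight, which is why siting in strong, steady winds matters so much.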

Either of these fuel-free, renewable resources would require the participating utility to maintain a commensurate amount of “spinning reserve”—an alternate, dispatchable generating resource all fired up and ready to come on line instantly to pick up the load that is dropped when the wind dies or the sun goes behind a cloud. In most cases, this reserve power would have to come from fossil fuels, because the small amounts of electricity available from hydro and geothermal power, and the supply from an operating nuclear plant, would already be spoken for. And some form of “battery backup” on a systemwide basis is not currently technically or economically feasible.

And finally, fusion—the dream of limitless energy by harvesting hydrogen isotopes from sea water and compressing them with laser blasts or electromagnetic fields—is still ten years away. Always ten years away. Yes, we can make deuterium and tritium fuse with either compression technology; we just can’t make them give off more energy than we must put into the reaction. For now, it seems, the only way to fuse hydrogen into helium reliably is to compress it in a steep gravity field, like the inside of a star. Until we find some magical gravity-manipulation technology, utility-scale fusion is going to remain a dream.

All of these renewable technologies—except for fusion—have their place in a diversified system. None of them is ready, yet, to satisfy all of our energy needs. And a modern economy runs on ready availability of energy the way ancient economies ran on resources of clean water and food. Maybe in a few hundred or a thousand years, when we have run out of conveniently obtained fossil fuels, we will develop efficient and low-cost solar4 or fusion power. But for right now, we run on bulk carbon energy.

And no amount of wishing will make it otherwise.

1. Of all the fossil fuels, natural gas is the most efficient in terms of high energy output with low carbon dioxide emissions. This is because the methane molecule (CH4) burns completely, breaking all of its carbon-hydrogen bonds in turning methane into carbon dioxide and water. Other carbon sources like coal and oil either burn incompletely or tend to put soot particles and other contaminants into the atmosphere along with greater amounts of carbon dioxide.

2. Of course, manufacturing plants that need large amounts of electric power to run their operations—more than rooftop solar can supply, like steel mills, auto factories, shipyards, and other heavy industries—can either run their own generating stations or leave the state.

3. Building a solar- or wind-power farm—whose energy resource and efficiencies are generally weaker than a thermal plant’s, and which will generally have to be sited some distance from the end user—must take into account energy lost to resistance and heat on a transmission line. This is usually accounted as 5% to 15%, depending on distance traveled.

4. Probably from orbit, as in my novel Sunflowers, where sunlight has an energy density of 1,300 watts per square meter instead of the 130 W/m2 that strikes the Earth’s surface.

Sunday, October 25, 2020

Are Empires Always Evil?

Roman arm

If you read science fiction, the Empire is always evil, the Emperor is always a villain, and his officers and minions—we’re looking at you, Darth—are always either toadies or supervillains. It was so in the Star Wars movies and the Dune books. Generically, if there is an empire involved in the story, it is a bad place and meant to be fought against by the forces of light, reason, and goodness.

Perhaps this is a cultural spillover from the political view—generally held by Marxists and Soviet-inspired Leftists—that all the troubles of the modern world stem from “imperialism.” And by that they usually mean the empires built by white Europeans in Africa, the Middle East, Asia, and South America. The equation is: “Empire bad, local governance good”—even when local governance is at the tribal level without any political refinement. And that equation holds right up until the empire in question is one managed by Soviets or Chinese Communists, and then the benefits of central control by a foreign power structure are not to be questioned.

The cultural spillover also derives from the depiction of Rome and its ancient Mediterranean empire from the Judeo-Christian viewpoint. That is, from the troubles the Romans faced in the province of Judea, particularly when Rome tried to impose its statist, polytheistic religion on people who only believed in one, true god. This dispute ended with the Siege of Jerusalem in 70 A.D. and the Jewish Diaspora. That jaundiced view of the Roman Empire was also fed to us by the persecution of Christ under Pilate and of Christians in general under the empire—until Constantine legalized their religion three hundred years later.

But was Rome an evil empire? Was life there such a hardship?

First, let’s count the negatives. For starters, most people outside the City of Rome itself were added to the empire through conquest. You started off by trading with Rome at a distance, then getting a road built into your territory, then seeing an army march in along that road, and then you had to fight for the right of self-determination. Sometimes the army came first and the road came second—to make it easier for Rome to send reinforcements and hold you down. Almost nobody welcomed Rome at first. But let’s be fair: when the Romans marched in, what they were fighting was mostly the local king, the ancient families who held positions of power, and the armies they could recruit and command. Whether the war was short—as in a few campaigns by Caesar among the Transalpine Gauls—or long—as in all that unhappiness in Judea ending in the reduction of the capital and a bloodbath—was usually a matter of whether and how involved the average person, the peasant in the fields, became in the struggle. That, and the cohesive nature of the civilization that the Romans were attempting to absorb. Gallic and German tribesmen were culturally similar but independence-minded and locally divided, and by the standards of the day they were primitive. Judea was an advanced civilization with a unified culture, strong central government, and firm beliefs.1

Next, the issue of slavery. Rome had it and didn’t apologize for that. But then, so did most of the lands and kingdoms they conquered. But, unlike the South in the United States, Roman slavery was not race-based. Just because you had a certain heritage and skin of a certain color did not make you a slave, subject to harassment and capture even after you were freed. Roman slaves entered captivity by losing a battle—all those wars of conquest—or by resisting so strongly that the Romans made an example of a whole family or town by selling them into slavery. Or you could become a slave after being found guilty of a crime or through indebtedness—having pledged your person as collateral for a loan. Still, a Roman slave was property and could be abused, sexually exploited, tortured, and even summarily executed—although it generally didn’t profit an owner to damage or destroy his or her property. But also, Roman slaves could earn their freedom, and Rome eventually legislated slave protections such as being able to lodge complaints against their masters and to receive medical care in sickness and old age. And finally, in the ancient world, as in much of the world today, unless you held a piece of property or were trained and engaged in a skill or trade, you always had someone standing over you and making demands on your labor, your time, and ultimately your life. Still, it was better to be a citizen of Rome than anyone’s slave.2

And then, there was tribute. As a Roman province, you were put under the administration of a governor known as a propraetor or proconsul—usually an ex-consul or senior government official out to make his fortune after years of public service. The Roman administration was there mostly to collect tribute—so much to be paid each year in gold or trade goods—or to secure some necessity that the City of Rome needed, such as grain from Egypt, which was the ancient world’s breadbasket. Along with the governor and his administration came the tax collectors, who were not always honest and not always working directly for Rome. It was hard being someone from an old family, landed, wealthy, or otherwise locally important in a newly established Roman province. But, as noted above, life was hard all over—still is in many ways.

And now, some of the good things. First, you were generally cleaner and safer inside the Roman Empire than out of it. The Romans were creative and compulsive engineers, and wherever they went they took with them their construction skills and their preference for clean water and a relaxing bath. They built huge aqueducts not just to serve the City of Rome but throughout the empire to provide clean water and introduce the concept of regular bathing to the general population. And you tended to be safer because the Roman administration frowned upon casual banditry—an occupation reserved to the state—and introduced a proven code of laws suitable to civilized urban living.

Next, your worldview and access to trade expanded. The Romans transmitted knowledge and trade goods from one end of the Mediterranean basin to the other and extending into the hinterlands. If you were part of the empire, you were a citizen of the world. That meant, for a person with ambition, an increase in opportunity and income. And for a citizen, either in the city or the countryside, who might not have owned a piece of property or engaged in a lucrative trade, there was always the army. You signed up for 25 years of service with the legion. After that time, if you survived, you were generally awarded land and a living in the province where you had fought or maintained order—and by then you usually had a local wife and children. Being a Roman soldier was more dangerous than being, say, a farmer out in the hinterlands—except for that casual banditry—but it wasn’t a death sentence, either. The Roman legions fought with a disciplined cohesiveness and regular tactics that tended to minimize wounding and death and favored applying massive and concentrated force against their enemies. It was good to be on the winning side.

And finally, if you were a good ally and willing supporter of Rome, you eventually became a Roman citizen yourself. You had to bathe, speak and read Latin, and obey the law, of course. No hot-headed rebellion—which anyway would be quickly crushed, at least in the times that the Republic and then the Empire were a going concern. Eventually, you could move to Rome itself and become part of the elite. And the consensus seems to be that, in the ancient world, the best time to be alive was Rome in the second century—that is, between 100 and 200 A.D. Not only was the weather mild—the “Roman Warm Period”—but the Mediterranean world was generally at peace. It was a lull between the political chaos of the Hellenistic Age and the rising cold and invading barbarians of the encroaching Dark Age.

There is a reason people submit to the rule of empires and emperors. Whether the Islamic Caliphate, the Mongol Empire, the Ottoman Turks, or the British Empire, the food is usually better, the arts and sciences richer, the trade more expansive, the rule of law generally gentler and less oppressive than the dictates of a local king or brigand, and the average person has a sense of being part of something really grand. Also, under the Romans, you got a hot bath, and under the British, a flush toilet. Not bad for minding your own business and occasionally tugging the forelock.

1. And Egypt was just a mess, having been conquered by Alexander three centuries earlier and then mismanaged by the Ptolemies.

2. The taint of slavery did linger, however, even after a person was set free through the process of manumission. “Freedman” was a separate class in Rome from “citizen,” although freedmen who had previously been owned by Roman citizens could vote and their children became citizens. Still, in the Republic it was rumored that the general and statesman Gaius Marius, one of the “New Men” whose family originated in the allied Italian states and not in the City of Rome itself, had slaves in his ancestry. This was considered a blot on his character.

Sunday, October 18, 2020

Too Many Superheroes


It’s no secret that our movies, television, and to some extent also our popular fiction are inundated with superheroes.1 The main characters, or the essential focus of the story, are people with some physical or mental enhancement: super strength, x-ray vision, the ability to fly, increased lifespan, or genius-level perception. And I would include here people who are otherwise separated from the human race by exceptional circumstances: vampires, witches, fallen angels, and the victims of medical experimentation.

These movies, television shows—series, I guess you call them now, with extended story arcs—and books are aimed at the young adult, the middling young, and the young at heart. The trouble is that, in my view, they tend to arrest the normal human development from child to functioning adult.

Life’s problems, which all of us must deal with, cannot be solved by punching through walls, seeing through doors, outsmarting your enemies with a genius IQ, or becoming immortal. A functioning adult has to use the skills and knowledge developed through hard work, proper choices, and good use of time in order to gain confidence, capability, and self-esteem. These things cannot be granted by birth on another planet, a medical advance, or a fortuitous afterlife. There are no shortcuts to growing up.

One of my favorite science-fiction series is Frank Herbert’s Dune books, telling the fantastic far-future history of the accomplished Atreides family. The series actually climaxes in the fourth book, God Emperor of Dune. The main character there is Leto II, who is the ultimate superhero: emperor of the known universe, served and protected by fiercely loyal people, commanding a superb fighting force, as well as being virtually immortal, physically invulnerable, able to predict the future, and able to access the living memory of every one of his ancestors and so the entire history and example of all humanity. And yet, in Herbert’s brilliant style, he is brought down by two skilled but not super-powered human beings who resist being his slaves. The book is really the anti-superhero story.

To be an adult is to possess hard-won knowledge, to develop skills that cannot be acquired magically or through a pill or genetic manipulation, to have endured experiences that are both constructive and destructive and enable you to know and understand the difference, and to become adept at foreseeing and dealing with the consequences of your actions. All of this must be learned. It must be acquired by having hopes and dreams, working toward them, and sometimes—maybe often—seeing them dashed. It is acquired through working through your problems, paying attention to what happens and when, remembering those consequences, and formulating rules of living both for yourself and your children, if you have any. This is the process that every child, every young adult, and every post-adolescent goes through. If you are lucky enough to survive, you keep learning and updating your internal database through adulthood and into middle and old age. Perfecting who you are should never stop until you draw your last breath.

And that is the final lesson. To be an adult includes the sober knowledge and acceptance of the fact that you, personally, in your own self, will one day die.2 This is not a cause for grief, fear, rage, or despair. Humans die, animals and plants die, bacteria and funguses can be destroyed, cell lines come to an end. Even rocks and whole mountains wear away to dust and silt, then break down into their component atoms, and rejoin the cycle of life on this planet. In my view, this is the key understanding of the human condition. We are not immortal. We have no lasting power over death, only good fortune and small victories. We only have the strength of our bodies, the power of our intelligence, and the focus of our wills. That is all we human beings can command.

When you know that you will eventually die, then you know how to value your life, your time, and your effort here on Earth. To be willing to sacrifice your life for something you believe is greater than yourself, you have to know how to value your remaining time. This is a rational decision that our brains were designed to make—if they are not clouded by the veil of hope that we, in our own bodies, just might be immortal. That hope protects us when we are young and stupid and have little experience of death. It is a foolish thing to carry into adulthood and middle age, when we are supposed to know the truth and act accordingly.

Oh, and in addition to what we can command and accomplish as individuals, we can also work together, pooling our achievements and our knowledge over time. We can raise vast cathedrals, each person adding his own carved stone or piece of colored glass. We can build a body of scientific knowledge by researching and writing down our findings in a discipline that we share with others. We can join a company—in the oldest sense of that word, whether an economic enterprise, a body of troops, or a group of travelers—to attempt and achieve more than a single human can do. And if we cannot do any of these things directly, then we can support the efforts of others by mixing mortar for their cathedral, serving as an archivist of their scientific endeavors, or becoming the financier, accountant, or quartermaster to that company in whatever form it takes.

Any of these tasks shared with other humans requires a knowledge of self and your limitations, a willingness to hold your own dreams and desires in check and subordinate them to the common will, and a readiness to take and give orders for the good of the common effort. And this is another aspect of becoming an adult: to put aside the me-me-me of childhood and adopt the us of a collaborative group.

Superheroes, in fiction and on the screen, leap over these everyday problems and concerns. If they experience disappointment and existential angst at all, it is usually focused inward, on their supposed powers and their failure when they meet a foe who exhibits a greater power. But it’s all a conception of, and played out in the mind of, the graphic artist, the writer, or the film director: the presumed power, the challenges, and the intended result. And, curiously enough, the superhero always manages to win in the end. That is the way of fiction.

Real life involves dashed expectations, failed attempts, physical and mental limits, rejection by loved ones, and sometimes rejection by society itself. It is what a person does with these situations, using only the strength and wits, skills and knowledge, that he or she has acquired through conscientious development, that marks a successful human being. And ultimately the extinction of body and mind comes for us all. If you’re not dealing soberly with these things—and superheroes don’t—then you remain a species of child.

Those developing-adult stories, dealing with growth and change, are really the ones worth telling.

1. In fact, about fifteen years ago, when I was still trying to find an agent for my science-fiction writing, one potential candidate asked, “Who is your superhero?” That was the literary mindset: the main character had to have extraordinary powers for any book that could hope to be optioned for a movie—and back then selling a million copies and making it to the big screen had become the sole purpose of publishing. Maybe it still is, for all I know. But Covid-19 and the closing of the theaters might change all that.

2. I believe I first read this in a Heinlein story—perhaps Stranger in a Strange Land, although I can’t find the reference: that the difference between a child and an adult is the personal acceptance of death. To that, one of the characters in the conversation replies, “Then I know some pretty tall children.”

Sunday, October 11, 2020

Modeling Nature

Mandelbrot fractal

A saying favored by military strategists—although coined by Polish-American scientist and philosopher Alfred Korzybski—holds that “the map is not the territory.”1 This is a reminder that maps are made by human beings, who always interpret what they see. Like the reports of spies and the postcards of vacationing tourists, maps tend to emphasize some things and neglect or ignore others. Human bias is always a consideration.

And with maps there is the special consideration of timing. While the work of a surveyor, depending on major geographic features like mountain peaks and other benchmarks that tend to stand for thousands of years, may be reliable within a human lifespan, mapmakers are taking a snapshot in time. From one year to the next, a road may become blocked, a bridge collapse, a river change course, or a forest burn—all changing the terrain and its application to a forced march or a battle. If you doubt this, try using a decades-old gas station map to plan your next trip.

This understanding should apply doubly these days to the current penchant for computer modeling in climatology, environmental biology, and political polling. Too often, models are accepted as new data and as an accurate representation—and more often a prediction, which is worse—of a real-world situation. Unless the modeler is presenting or verifying actual new data, the model is simply manipulating existing data sources, which may themselves be subject to interpretation and verification.

But that is not the whole problem. Any computer model, unless it becomes fiendishly complex, exists by selecting certain facts and trends over others and by making or highlighting certain assumptions while downplaying or discarding others. Model making, like drawing lines for topographic contours, roads, and rivers on a map, is a matter of selection for the sake of simplicity. The only way to model the real world with complete accuracy would be to understand the situation and motion of every component, the direction and strength of every force, and the interaction and result of every encounter. The computer doesn’t exist that can do this on a worldwide scale for anything so complex and variable as weather systems; predator/prey relationships and species variation and mutation; or political preferences among a diverse population of voters and non-voters.

Computer modeling, these days—and especially in relation to climate change and its effects, or concerning political outcomes—is an effort of prediction. The goal is not so much to describe what is going on now but to foretell what will happen in the future, sometimes by a certain date in November, sometimes by the beginning of the next century. Predicting the future is an age-old dream of mankind, especially when you can be the one to know what will happen while those around you have to grope forward blindly in the dark. Think of oracles spoken only for the powerful or the practice of reading tea leaves and Tarot cards for a paying patron.

But complex systems, as history has shown, sometimes revolve around trivial and ephemeral incidents. A single volcanic eruption can change the weather over an entire hemisphere for one or several years. A surprise event in October can change or sour the views of swing voters and so affect the course of an election. The loss of a horseshoe nail can decide the fate of a king, a dynasty, and a country’s history. Small effects can have great consequences, and none of them can be predicted or modeled accurately.
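The sensitivity described above can be demonstrated with something far simpler than any climate or polling simulation. The sketch below is a toy of my own, using the textbook logistic map rather than anything drawn from an actual forecasting model: two runs that start one part in a million apart—closer than any real-world measurement could distinguish—soon bear no resemblance to each other.

```python
def logistic_trajectory(x0, r=3.9, steps=50):
    """Iterate the logistic map x -> r * x * (1 - x), a textbook
    chaotic system, and return the whole trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

# Two starting conditions differing by one part in a million --
# smaller than any plausible measurement error.
a = logistic_trajectory(0.500000)
b = logistic_trajectory(0.500001)

# Within a few dozen steps the two runs diverge completely.
print(max(abs(x - y) for x, y in zip(a, b)))
```

The initial millionth is amplified at every step until the “model” retains no memory of where it started—which is why a volcanic eruption, an October surprise, or a horseshoe nail can defeat the most careful projection.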

When climate scientists first published the results of their models showing an average global temperature rise of about two degrees Celsius by the year 2100, the counterclaims were that they focused on carbon dioxide, a weak greenhouse gas; that the models required this gas to act as a “forcing” that triggers a positive feedback loop, putting more water vapor—a more potent greenhouse gas—into the atmosphere; and that the models did not consider negative feedback loops that would reduce the amount of carbon dioxide or water vapor over time. The climate scientists, as I remember, replied that their models were proprietary and could not be made public, for fear they would be copied or altered. But this defense also rendered them and their work free from inspection. Also, as I remember, no one has since attempted to measure the increase, if any, in global water vapor—not just in cloud cover, but in the vapor loading or average humidity of the atmosphere as a whole—since the debate started. And you don’t hear much anymore about either the models themselves or the water vapor, just the supposed effects of the predicted warming that is supposed to be happening years ahead of its time.2
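How much the feedback assumption drives the headline number can be shown with simple textbook feedback algebra. This is a toy calculation of my own, not the inner workings of any actual climate model: taking a commonly cited no-feedback response of about 1.1 °C for a doubling of carbon dioxide, the equilibrium warming scales as direct / (1 − f) for a net feedback fraction f, and the chosen f dominates the answer.

```python
def equilibrium_warming(direct_c=1.1, f=0.0):
    """Classic feedback amplification: total response = direct / (1 - f),
    valid only for a net feedback fraction f < 1."""
    if f >= 1.0:
        raise ValueError("f >= 1 implies runaway feedback; formula invalid")
    return direct_c / (1.0 - f)

# The same "direct" physics under different feedback assumptions:
# net-negative, none, and increasingly positive.
for f in (-0.5, 0.0, 0.3, 0.5, 0.65):
    print(f"net feedback {f:+.2f} -> {equilibrium_warming(f=f):.2f} C of warming")
```

Everything from well under one degree to more than three degrees of warming comes out of the same direct forcing; what differs is the assumed sign and strength of the feedback—precisely the part of the models the critics wanted to inspect.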

Add models that, for whatever reason, cannot be evaluated and verified to the general trend of results from scientific studies that cannot be reproduced according to the methodology and equipment cited in the published paper. Irreproducibility of results is a growing problem in the scientific world, according to the editorials I read in magazines like Science and Nature. If claims cannot be verified by people with the best will and good intentions, that does not make the originally published scientist either a liar or a villain. And there is always a bit of “noise”—static you can’t distinguish or interpret that interferes with the basic signal—in any system as vast and complex as the modern scientific enterprise taking place in academia, public and private laboratories, and industrial research facilities. Still, the issue of irreproducibility is troubling.

And, for me, it is even more troubling that reliance on computer models and projections is now accepted as basic research and as scientific verification of a researcher’s hypothesis about what’s going on. At least with Tarot cards, we can examine the symbols and draw our own conclusions.

1. To which Korzybski added, “the word is not the thing”—a warning not to confuse models of reality with reality itself.

2. We also have a measured warming over the past decade or so, with peaks that supposedly exceed all previous records. But then, many of those records have since been adjusted—not only the current statement of past temperatures but also the raw data, rendering the actual record unrecoverable—to reflect changing conditions such as relocations of monitoring stations at airports and the urban “heat island” effects from asphalt parking lots and dark rooftops.
    As a personal anecdote, I remember a trip we made to Phoenix back in October 2012. I was standing in the parking lot of our hotel, next to the outlet for the building’s air-conditioning system. The recorded temperature in the city that day was something over 110 degrees, but the air coming out of that huge vent was a lot hotter, more like the blast from an oven. It occurred to me that a city like Phoenix attempts to lower the temperature of almost every living and commercial space under cover by twenty or thirty degrees, which means that most of the acreage in town is spewing the same extremely hot air into the atmosphere. And I wondered how much that added load must increase the ambient temperature in the city itself.

Sunday, October 4, 2020

Clever Words

Dissected man

Our politics is—and, I guess, has always been—susceptible to clever word combinations, puns, and rhymes that appear to tidily sum up a grievance, intended consequence, or course of action. For most of us, they are mere curiosities. But in my view they are treacherous if taken as a philosophy or a substitute for rational thought.

I’m sure there were chants and slogans that caught on during the American War of Independence, probably something to do with Indians and the tea shipments arriving in Boston Harbor. The slogan that comes readily to mind is from slightly later, the mid-19th-century dispute with Britain about the Oregon border: “Fifty-four Forty or Fight,” after the line of latitude that would mark the hoped-for demarcation. I suppose it was just fortuitous that the map offered all those F’s and the opportunity for a stirring alliteration. If the border had been along the twentieth or thirtieth parallel, I guess the proponents would have had to come up with something else.

And then there is the modern-day all-purpose chant: “Hey-hey! Ho-ho! Fill in the Blank has got to go!” This one is particularly useful when a group of organizers want to stir up and direct a crowd. It’s got a rhythm that gets your arms and legs moving almost like a dance or a march step.1

To me, one of the worst substitutes for rational thought also comes from the 19th century, although a bit later. It is attributed to the journalist Finley Peter Dunne and his fictitious alter ego Mr. Dooley. In its shortened form it says: “The job of the newspaper is to comfort the afflicted and afflict the comfortable.” This formula, clever in its reversal of verbs and objects—a classic chiasmus—has been taken up by generations of progressives ever since. For some, it’s an exquisite summation of how they should heal social ills.

But this combination is, of course, nonsense. Clever, but still nonsense. It depends on a false equivalency: that the sufferings of the afflicted—the poor, the weak, the disabled, the denied and discriminated against—are directly attributable to the smug satisfactions of the people not so burdened. It presumes that those who have worked, saved, invested, and planned for the future of both themselves and their families—all of those middle-class virtues—have created conditions of poverty and injustice for those not so fortunate. And this is not so. Those who have taken up the virtues have simply removed themselves from the class of the destitute and the desperate, not caused their condition.

By all means, one should “comfort the afflicted.” Heal their hurts where it is possible. Work to change their current situation and their opportunities where you can.2 But at best, “afflicting the comfortable” serves only to remind them that an underclass exists in their society and that they should spend some portion of their day, their mind, and their charity—if not just their taxes—on alleviating the situation. At worst, “afflicting the comfortable” is intended as fighting words, suggesting that by reducing their comforts a society can somehow magically improve the lot of the afflicted. And that magical thinking is just pure Marxism: been tried; didn’t work.

Another set of fighting words, intended to stir up the complacent and draw them into a social battle, comprises the various formulas against social apathy: “If you’re not part of the solution, you’re part of the problem,”3 and more recently, “Silence is violence.” Again, the unspoken purpose of the chant is a false equivalency: that those who are not actively joining the fight—on the side of, and under the terms of, the sloganeers—are causing the wrong and are in fact wrongdoers themselves.

These clever slogans are meant to give the great mass of people no choice. Join us or die—or worse, gain our everlasting contempt. They raise the issue in contention to the level of an existential crisis, a civilizational catastrophe, or a cause for civil war. However, many of us, perhaps most of us in the middle of the political spectrum, spend our days doing all of that working, saving, investing, and planning for our own futures in order not to be counted on the public rolls. For us, the issue is not existential or catastrophic and does not merit a civil war. Yes, perhaps, the issue may demand our notice and concern. We might even add the deserving recipients to our list of charities or our list of considerations in the voting booth. But many of us, most of us, know that there’s nothing we can personally do about a lot of these social problems. We are not prepared to climb on the barricades, bare our breasts, and offer “our lives, our fortunes, and our sacred honor”4 to the project.

And no amount of clever words and scornful chants is likely to change that reality.

1. And in terms of serving multiple purposes, there is also: “No justice, no peace!” Simply pick your object of “justice,” and fill in your action for withholding “peace.”

2. But you have to be realistic about this approach. You can work to improve other people’s conditions sometimes, but that should not include a free ride or a lifetime’s residency on the dole. A taut safety net, not a soft and cushy safety hammock. Human beings are designed by a hundred thousand years of heredity to have personal goals and to seek satisfaction and self-worth through attaining them. No one—not children, not the mentally or physically disabled, nor the socially or economically disadvantaged—benefits from having their personal agency removed by a benevolent parent’s or government’s lifting and carrying them through all the vicissitudes of life.

3. Speaking of clever, I have always favored the chemist’s version: “If you’re not part of the solution, you’re part of the precipitate.” In other words, if you don’t join in this fight, you’ll be part of the fallout. Chuckle, smirk.

4. To quote from the last line of the Declaration of Independence, which for the signers did involve an existential crisis and, right quickly, a civil war.

Sunday, September 27, 2020

Monopoly Power

French marketplace

The Emperor Caligula was quoted by Suetonius as saying, “Would that the Roman people had but one neck!” Apparently so that he could hack through it more easily. Everyone wants to have control of their situation, and on the easiest possible terms.

In the business world, this tendency is represented by monopoly, where for the sake of simplicity, economy, efficiency, or some other perceived value there is only one producer or supplier of a particular category of goods or services, and by monopsony, where for the same set of reasons there is only one buyer. Think of the Defense Department and its need for complementary weapon systems, as opposed to individual purchases by each branch of the service, or by each military unit and base. Or the current drift toward single-payer medical coverage and its promise of cost reductions through the government’s negotiating power and volume purchases.

Monopolies have always enjoyed state support. The English crown, up until the 17th century, regularly granted royal favorites the monopoly trade in certain products, such as sweet wines in the Elizabethan period. And the British East India Company was granted exclusive trade rights in lands bordering the Indian Ocean. Americans did not generally favor monopolies until the widespread distribution of electricity in the late 19th and early 20th centuries, when it became inconvenient to have several power companies stringing wires up and down both sides of the street to reach their customers. It then became necessary to grant regulated monopolies to electricity and gas providers to systematize their distribution.

Generally though, big players do better in any market. If a company making anything, from cars to soft drinks, reaches the position of first, second, or third in the marketplace, it will want to crush its competition and take all the customers.1 And the government likes a marketplace dominated by big players: they are easier to deal with, regulate, and tax.2 Certainly, government regulation tends to work against a field of small players, who do not have the legal and regulatory affairs departments or the budgets to lobby government, respond to regulations, and engage in defensive lawsuits.

While our government has officially been “antitrust” since the days of the Robber Barons and the interlocking directorates of various companies controlling, for example, the markets in coal and steel, government has turned a blind eye to amalgamation and unification in the labor market. There, different unions have banded together into effective monopolies on the labor supply for factory workers, service employees, and truck drivers. Again, big players do better in the market. They swing more weight. When individual union members join together in a giant, amalgamated union, they can speak with one voice. They can get more things done to their liking. They can have their way. And it’s actually a form of democracy—at least for the members of the union.

And where unions don’t exist, or have been withering for decades under our huge economic expansion, they soon may make a comeback as government increases its reach into the economy. For example, the current push for single-payer medical plans, or some version of Medicare for All, would make it easier for the nurses’ union to negotiate a favorable pay rate with a single government entity, rather than with a handful of large hospital corporations or thousands of local hospitals and clinics. And a government monopsony on health care would push the rest of the medical profession—doctors’ associations and collections of other health care specialists—into some form of consolidated negotiation or full unionization. It would also further the amalgamation of hospitals into larger corporations and combinations.

But while bigger may be better for the dominant players in the marketplace, the trend towards monopoly and monopsony isn’t necessarily good for the market itself.

First, when one product or system dominates, it tends to limit invention and technological progress. Success tends to make people conservative. Yes, monopoly players worry about some competitor coming along and beating them at their own game, but then their urge is to buy up, buy out, and shut down that competitor, or simply crush it by temporarily lowering prices on their own products. If AT&T (“Ma Bell”) had retained its monopoly on long-distance telephony and its ownership of the various local telephone companies (“Baby Bells”), its own manufacturing arm with Western Electric, and its research facilities with Bell Telephone Laboratories (“Bell Labs”), how soon do you think cellular phones, which are not dependent on wires at all and are instead a radio product, would have become available? The phone company would have crushed any radio product that needed to touch its phone system and landlines—except, possibly, for automotive radiophones, which would have been expensive and limited to very special users.

Second, in a monopoly situation, or under the conditions forced by a monopsony, employment choices are more limited. If you were a telecommunications technician or inventor in the Ma Bell era, you could either work for AT&T or find some other career. And if you disagreed with the company’s directives, choices, and planning, you could either speak out and find your career truncated, or you could keep your head down, rise in the organization, and hope to one day influence those decisions. Jumping ship to join a competitor or starting your own company with a better idea just wasn’t in the cards. The same goes for employees at NASA or your regulated local utility company.

Third, monopolies and monopsonies are almost always bad for the average person, the individual buyer, the customer, the person at the ultimate end of the supply chain. Where one organization has purchasing and pricing power over the market, the little guy accepts what he gets and pays what is asked. Not everyone wanted a Model T in “any color so long as it was black.” Not everyone wants a single choice of deodorant or sneaker. Not everyone wants the government deciding who will get a CT scan and when, because someone far up the food chain made a nationwide decision about how many CT scanners to buy for each county. People might appreciate efficiency, simplicity, economy, or some other overriding value in the abstract. But not everyone prefers white bread over pumpernickel, plain whisky over flavored vodkas, or the deodorant with a sailing ship on the label over any other brand. People like choices, making their own decisions, and deciding how and where to spend their money.

Fourth, and finally, monopolies and monopsonies almost never last. Sooner or later, the entrenched position becomes so cautiously conservative, so calcified, and so behind the times that a clever inventor can find a work-around: a new and disruptive product, a new marketplace, or a new champion. That’s happening all over the place these days, in the automotive world (hybrids, Tesla), in telecommunications (cell phones), in computers (laptops and tablets), in medicine (genetic analysis, personalized medicine), and in space exploration (SpaceX, Blue Origin). Big players become vulnerable unless they can also become nimble—not just crushing the competition but learning to dance with it.

Caligula’s desire for Rome to have just one neck, to make it easier for him to put his foot on and eventually to hack through, was the cry of every tyrant. But for anyone, even for a Roman emperor, life just isn’t that easy.

1. Unless, of course, the competition is good for the top players. Think of Coca-Cola and Pepsi-Cola, each of which benefited by fostering brand loyalty against the other. Or the “Big Three” auto makers, who sold more cars by competing with the other guys on styling, horsepower, or some other popular enhancement, thus churning the annual sales of new cars.

2. If you doubt this, remember the senator who complained about the inefficiency of a market that offered Americans a variety of products: “You don't necessarily need a choice of 23 underarm spray deodorants or of 18 different pairs of sneakers.” It’s much easier to manage an economy with fewer choices and a monopoly player making all the decisions for the folks doing the buying.