Sunday, December 27, 2020

BC and BCE

Christmas star

The Jewish calendar dates from the biblical date of the world’s—or rather, the universe’s—creation, according to the twelfth-century (AD, in the Western reckoning) Jewish philosopher Maimonides. That’s a pretty important event. For the religious observances of Jews everywhere, the year is now 5781 AM (for Anno Mundi).1

The Islamic calendar dates from the Hijra, the year that Mohammed and his followers left Mecca for Medina to establish their first true community, or ummah. For them, it is now 1442 AH (for Anno Hegirae).

The ancient Romans counted their years from the legendary founding of their city, approximately in the year we would call 753 BC. If we still counted the Roman way, it would now be 2773 AUC (for Ab Urbe Condita). Similarly, the Greeks of approximately the same period took their dating from the first Panhellenic Games held at Olympia in what we would call 776 BC, presumably the first time that all of Greece came together as a people. In that system, it would now be 2796—although the Greeks counted loosely in units of four, the span of years between each set of games, going back to the most recent games or forward to the next set. And the events themselves were suppressed in 394 AD, when the Roman Emperor Theodosius imposed the newly established Christian calendar on the Greeks.

The Chinese calendar was established in the fourteenth century (BC, in Western reckoning) and traced back to the Emperor Huangdi, who is supposed to have worked out the cycle of years in 2637 BC based on astronomical observations of the Sun and Moon. It doesn’t really matter what year it is in China today based on this calendar, other than for religious and festival purposes. The modern Chinese follow the Western calendar.

Western Civilization took the starting date of its calendar from the birth of Christ, a big event for those who had recently converted from their pagan rites to the new religion. And it is now 2020 pretty much around the world, based on the spread of Western trade and business. We used to call this year Anno Domini (from “Year of the Lord”), and any year before the start of the Christian calendar was then “Before Christ.”

Somewhere along the line, probably about fifty years ago, scholars started calling the current years CE (for “Common Era”), because we all use this Western reckoning anyway, and the years from before this start were BCE (for “Before Common Era”). Presumably, scholars of Jewish, Muslim, Chinese, or pagan faiths found it distasteful to identify the date with a Christian reference. So they kept the accounting but changed the words around to be less offensive.

And isn’t that just the ultimate in hypocrisy? Use the Christian calendar but don’t actually acknowledge it. Pfui!

If you want to abandon the Western, Christian tradition, let’s not take the easy path. Here are some alternate datings we might adopt in a totally nonsectarian fashion, basing our counting on something important to today’s crop of irreligious scientists.

We could, for example, celebrate the first use of a printing press with movable type, by Johannes Gutenberg in 1439, which started the revolution in information technology and popular literacy—at least here in the West. Of course, the first thing Gutenberg printed was the Bible, so that’s still kind of a religious holiday.

We might base the calendar on René Descartes and his publication in 1637 of the Discours de la Méthode, the traditional founding of the Scientific Method. His line of thinking cleared out much of the superstition surrounding natural events and led to our technocratic view of the universe. Or we could recognize Isaac Newton, whose Principia Mathematica in 1687 laid the second brick in the foundations of the scientific revolution, and whose other studies led to our better understanding of gravity, light, and optics.

Or we might celebrate Albert Einstein’s theories of Special Relativity (1905) and General Relativity (1916), which turned Newton’s gravity, the simultaneously occurring universe, and our understanding of space and time on their heads.

Taking up any of these developments as the most important events of the modern world would make our age relatively short: just 581 years for the Information Age, or between 333 and 383 years for the Scientific Age, or only 104 to 115 years for the Relativistic Age.
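The spans quoted above are simple subtractions from 2020. A minimal sketch, assuming Python for illustration (the age labels are shorthand for the epochs named in the text):

```python
# Years elapsed as of 2020 since each proposed epoch (dates from the text).
epochs = {
    "Information Age (Gutenberg's press)": 1439,
    "Scientific Age (Descartes' Discours)": 1637,
    "Scientific Age (Newton's Principia)": 1687,
    "Relativistic Age (General Relativity)": 1916,
    "Relativistic Age (Special Relativity)": 1905,
}

for label, year in epochs.items():
    print(f"{label}: {2020 - year} years")
```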

If you prefer hard mechanical achievements to philosophical personalities, you could base the calendar on James Watt’s technical improvements in 1776 on Thomas Newcomen’s original steam engine of 1712, which kicked off the Transportation Age; on the Manhattan Project’s fission bomb of 1945, which started the Atomic Age; or on the first working transistors developed by a team at Bell Labs in 1947, which launched the Computer Age—with the help of Alan Turing and the primitive computational devices of the previous decade. Any of those events would give us an even shorter history—most of them putting us still in the Age of Discovery.2

But for now, until the modern era settles down into a quiet, respectable, backward-looking middle age, we’ll just have to muddle along with “Before Christ” and “Anno Domini” and their secular equivalents. And that’s a good thing, because the computers could change their dating systems more easily and a lot faster than any of us humans could learn to handle a new calendar.

1. For now, we will ignore the fact that some of these reckonings are based on the lunar cycle of about 29.5 days, so the year may have more than 12 months, or fewer days than the scientifically measured 365.25 to which we all are accustomed. And anyway, Christ wasn’t exactly born on December 25.

2. Think of the development of the first truly portable artificial intelligence and its control of existing computer and automation systems. Think of the time when we will develop the first fusion reactor that puts out more energy than it takes to ignite, which will revolutionize our energy production. This age has a long way to go yet.

Sunday, December 20, 2020

Apocalypse Never

Apocalypse meteor storm

All versions of popular politics are to some degree insane. This is because they try to reduce a complex, variegated equation into a simple, understandable formula. And then they try to pass off the formula as some version of reality. Not smart.

I pity the Marxists and their brothers and sisters among the socialists and other collectivists. They believe that if they can reduce economics to a system of distribution governed by dispassionate technical experts in the government, then poverty, unmet needs, and inequality will disappear. Then history—the history of colonial exploitation, aristocratic dynasties, and economic revolutions—will come to an end. By invoking the power of a single-minded state and the obedience of all who dwell within it, they believe they can achieve paradise on Earth and that nothing will ever change after that.

I pity the libertarians and their brothers and sisters among the anarchists1 and rabidly rugged individualists. They believe that if they can end the stultifying maze of laws and regulations raised by an archaic and backward-looking society, then humanity will be freed to become truly dynamic and creative. Then human creativity will be invited to achieve … what exactly? Honest relations among men, between men and women, between parents and children? The opportunity for each man and woman to get—and take—what they want in a paradise where the ripe fruit is always low on the tree and everyone has a set of pruning shears? I’m not sure what the point of anarchy would be.2

The socialists and other collectivists don’t understand that the human condition is compounded of desire, restlessness, and imagination. Oh, and not a little greed and envy. The underlying principle of economics, all economics, is that wants and needs are infinite while resources are finite. From this principle the marketplace naturally arises, where people measure their wants and desires against what they will have to pay in terms of their time, energy, and the stored value of these gifts in currency.

This is true even in fantasy futures like Star Trek’s, where unlimited energy through matter-antimatter conversion and unlimited material goods through energy-to-matter replication can satisfy all human needs. Do you know what it costs to generate a few grams of antimatter? The stuff is more valuable than gold or diamonds. And do you know how much energy it takes to transmute streaming photons or electrons into protons and neutrons and then fuse them into the atoms and molecules of actual matter? Your whole society would go broke in a nanosecond trying to burn enough antimatter to make a replicated dish of ice cream. And none of that will bend the “fabric of spacetime” so that you can go gallivanting through the galaxy with warp drive. But I digress …

Any political program that tells you wants and needs can be satisfied equitably by relinquishing human initiative to a dispassionate government—or to an incorruptible team of robots or artificial intelligences, because we know how inventive humans are and how easily machines can be hacked—is selling you utopia, the place that never was or will be, Heaven on Earth, a dream.

The libertarians and anarchists don’t realize that the human disposition is not kindly—not where matters of life and death and the survival of one’s children are concerned. The underlying principle of society, of all social organizations, is that the individual must give up some measure of freedom to achieve some measure of safety. From this disposition arises the tribe, the village, the nation-state, and the civilization that imposes laws and asks for obedience. And when an individual will not observe the laws and render obedience, it imposes sanctions and penalties.

This is true even in the many post-apocalyptic futures, played out in endless movies and television series, where society has fallen apart due to nuclear war, environmental catastrophe, or other unspecified collapse. Then the average human being—and even those among the main characters with superior intelligence, coping mechanisms, and fighting skills—is free and happy for about five seconds. But if the survivors are lucky and the writers of these stories are smart, they quickly begin to form tribes and villages with laws and protections, recreating civilization all over again. The clean slate is not a happy state but a dangerous place that no one wants to occupy for long.

The political programs that work are those that recognize human and resource limitations, the need for compromise, and an understanding of the balance between personal freedom and cooperation. There are no end states, not on either side of the spectrum. No one will achieve any lasting condition of total control or total freedom. The Nazis and the Soviets tried the former, and their Nirvana lasted twelve years in one case and seventy in the other, and they caused untold suffering in one case and bitter stagnation with a heap of suffering in the other. No one has yet achieved much in the way of unlimited freedom, although the French Revolution tried and quickly—just about four years, with devastation already brewing—devolved into political infighting and the Reign of Terror, complete with daily executions. People simply are not designed, morally or psychologically, to be transformed into paradise all at once while still living.

And the reality is that any society after an excursion into totalitarian control or social anarchy always reverts to some kind of economic and political mean. The average citizen still has to get up in the morning, go to work, take care of the family, and pay taxes. The sun rises and sets. Life goes on as a struggle. And it will be this way until the sun burns out or humanity dies out and leaves the planet to the care of the earthworms and cockroaches.

This country, with its blend of free-market economy and shareholder capitalism balanced against a system of government taxation, regulation, and safety nets, enjoying a huge admixture of technological imagination and creativity, has come the closest to satisfying the most needs of the most people—ever, in the history of the world. Yes, there are pockets of want and misery, although our poor people are rich compared to most of the developing world. This is why other people are willing to walk across deserts and suffocate inside freight containers in order to get here. We can always tinker with the blend of freedom, regulation, and technology, nipping here and tucking there. But we throw away the whole thing and start again on a dream of total control or total freedom at our peril.

And that is the politics of realists, not dreamers.

1. And yes, anarchists often ride along with the Marxists and socialists in the early stages of the political struggle, believing that the fastest way to get to Nirvana is to begin by tearing down the structures that already exist.

2. Although I sometimes vote with the libertarians—being a small-government, more-market, greater-personal-freedom kind of guy—I really do not understand the extreme position or the hunger of the anarchists. (Remember, I’m not a purist or absolutist about anything.) In my view, the anarchists are not moving toward anything good but away from a political and economic situation that they find too complex, too boring, or too troublesome to be allowed to continue. It’s a form of nihilism.

Sunday, December 13, 2020

Life Without Fingernails

Robot head

I tend to keep my fingernails short—not into the quick but close to it. So every month or six weeks, after trimming them, I experience a week or so when I exist like one of those robots with grip-sensitive finger pads, such as we see on television and in advanced science articles, but with not much for cutting or scraping. Articles written about robots deal with joint articulation, wrist movement, and grip pressure. But no one seems to imagine what a humanoid robot without fingernails would go through on a daily basis.1

Take that little spot of dried goo on a mirror, plastic countertop, or other, more scratchable surface. Abrasion with a fingertip or a rubberized grip pad won’t quite wear it away. The keratin in a fingernail—akin to the protein that makes up strands of your hair—has just the right combination of hardness and flexibility, but ultimate softness and “give,” to wipe away that spot without marring the underlying finish.

Take Ziploc bags. When they really stick, so that just pulling on the outside or inside bag material merely tears along the zipper seam, only a fingernail gently inserted between the two lips can tease them apart without compromising the bag’s sealing properties.

Take the snap tabs on soft drink cans and the peelable edges of the stickers on fruit, the price stickers on items intended as gifts, and virtually anything else that adheres or clamps but is intended to come loose with a soft edge and a little pressure. Trying to get a purchase under that edge with a fingertip, using friction from the ridges of your fingerprint, will almost but not quite work. To really get a grip, you need a fingernail.

And, as I read somewhere, while fingernails originated in animal claws, fighting weapons for everything from house cats to bears, their main evolutionary purpose in humans has been to anchor the skin around the fingertip, enhancing our grip on everything we grasp.

Consider every time you use a fingernail in your daily life, every time it has saved you going into the utility drawer for a screwdriver or a spatula, and you can see where lack of nails would hamper the effectiveness of a humanoid household robot. To overcome this disadvantage, designers would have to give the machine an extensible blade of soft-but-not-too-soft plastic—and replace it every time it wore away in use.

About the only thing a human does that a robot doesn’t is scratch him- or herself, and for that fingernails are perfect, too.

But maybe we shouldn’t speak too widely about the usefulness of fingernails. When the apocalypse comes, we can send messages on the underside of peelable stickers and store life’s essentials in Ziploc bags—and then use them to defeat our robot overlords.

1. Of course, most robots in the real world to come will not have humanlike hands, with fingers, opposable thumbs, and gripping pads. They will not be the bipedal, mannequin-bodied, humanoid surrogates imagined in early science fiction like I, Robot. The mechanisms that we wish to automate will have intelligence embedded in their systems. And the artificially intelligent software that guides and makes decisions for them will be so easily reproducible and installable that no one would bother to pay for a multi-function humanoid robot.
    Would you design a robot with just two cameras for binocular vision, two audio receptors for directional hearing, two hands to grip a steering wheel, and two feet to operate accelerator and brake pedals in order to drive a car? And what then—your robot chauffeur would get out, go into the kitchen, pick up a paring knife, and slice carrots? Of course not. You would automate a self-driving car with multiple cameras and other sensors, including radar, and hook its decision-making capacity directly into the engine and brake management systems. And you would have a separate kitchen ’bot to prepare and serve food. A thousand other chores would be handled at the point of function by specifically designed automated systems. That is the future we’re headed toward.

Sunday, December 6, 2020

The End of the Republic

Powerlines

So here we are. A narrowly contested election brought strong hints of vote fraud on many levels and by various means—but with no evidence that either the mainstream media or half of the electorate will admit. An enfeebled candidate for president and an unpopular candidate for vice president have won a huge number of votes in key precincts despite minimal campaigning and public appearances, against a president who aggressively campaigned with rallies attracting thousands. And this is after four years of “resistance,” claims of Russian interference in 2016, an attempted impeachment, and a sudden virus outbreak that has been surrounded by conflicting projections, recommendations, claims, and statistics. And now, failing to get behind the presumed winner and show “unity” is an embarrassment. Something is not right.

And yet … if there was massive and coordinated vote fraud in the battleground states, I can see why most local judges and the media—even nominally conservative reporters and commentators—are reluctant to give it credence and expose it. The consequences for the nation and the democratic process would be just too catastrophic, with civil disruptions and the threat of war such as we haven’t seen in more than a century and a half. Reasonable people would want to draw a curtain on the situation in the same way that the Warren Commission, charged with probing the assassination of John F. Kennedy, sealed its evidence for 75 years: if their probe had exposed evidence of the Soviet Union or Cuba planning and pulling off the murder of a sitting President, the public outcry could have forced Congress into declaring World War III. Better to throw a blanket over the whole thing.

But if there was coordinated election fraud, then I would feel like an old Roman at the end of the Republic, where Caesar had just been declared Dictator for Life on the strength of his personal legions and then assassinated. And the result was not a return to the normal political process but a free-for-all.

The interesting thing about the Roman Imperium is that the term imperator did not originally mean “emperor”—that came later—but simply “field marshal,” reflecting the leader’s military backing.1 And yet, while one man held total control of the state because of a personal military force, all the forms of the republican government were obeyed. The cursus honorum was still in place, and Roman citizens of good background still filled the correct political and religious offices. Each of the Caesars was officially simply consul for the year, elected along with a nonentity whom he named to be his co-consul, but everyone knew where the real power lay. None of the ancient and sacred laws of the Twelve Tables was changed. They simply meant nothing important anymore.

The same thing could happen in the United States. Without the input of the people through a trustworthy voting process, all the forms of the republic could be maintained and still mean nothing. You wouldn’t have to change a word of the Constitution if you simply decided to accept a different meaning for the words.

None of the forms of government put in place by the main body of the U.S. Constitution have changed. We still formally adhere to the separation of powers of the legislature, executive, and judiciary branches and the checks and balances put in place to keep one or the other from taking full control. And yet over the years—and this has happened on numerous occasions—the power of the executive has expanded with administrative offices that are largely unelected and at the operative levels not even appointed, and yet they interpret the laws made by the legislature. And when Congress won’t give a powerful President the results he—so far “he”; we will see in the future—wants, the President whips out a pen and creates an executive order. The judiciary at the state and federal level then interprets the laws, the executive orders, and the Constitution itself according to their own political likes and dislikes. Meanwhile—at least over the last two decades—the legislature has been free to engage in partisan squabbling and gridlock, achieving little of note, while the country’s government keeps evolving and advancing in the direction of administrative and judicial law.

If you were to create a “heat map” of where the actual power and authority in this country exist and compare it to the structure envisioned in the Constitution, I believe you would find massive areas of non-overlap.

The Bill of Rights would be no more sacred. Protections for freedom of speech and religion in the First Amendment would still be guaranteed. But as the past year has shown, your freedom to assemble and worship can be curtailed in the event of a public calamity like the pandemic. Certain words and phrases have long been banned and punished as “hate speech.” And now we know that your freedom of speech may be freely infringed by a privately operated communications system, part of the “social media,” if the operators believe it contradicts the narrative imposed by the ruling majority.

Sure, the Second Amendment guarantees your right to bear arms—but we could interpret those words so that they apply only so long as you are a member of the army, the national guard, a police force, or other “well regulated militia.” There is nothing in there about hunting or defense of home and self.

Nor would it be “cruel and unusual punishment,” according to the Eighth Amendment, if we kept you in permanent solitary confinement or filled you with mind-altering drugs to treat your chronic “social psychosis” and “false consciousness,” brought on by your non-supportive political views. These are medical treatments after all, not punishments.

And so it goes. As Humpty Dumpty said to Alice: “When I use a word, it means just what I choose it to mean—neither more nor less.” And two plus two equals five.

As Benjamin Franklin also supposedly said, in response to an inquiry about the outcome of the Constitutional Convention in 1787: “A republic, if you can keep it.” And yes, like the old Romans of the first century BC, we can keep all the forms, all the laws, all the words written down in ink on parchment or cast in bronze, and still have something different.

Whether we can do anything about this—and whether anyone who benefits from the current situation cares—is another matter. When no one cares, then it doesn’t matter.

1. The early Romans had seven legendary kings, the last one ousted late in the sixth century BC. That started the Republic, with its popularly elected government offices up through the leadership position of the two co-consuls, who together served alternate months for only one year and then could not be re-elected until after another ten years. This was supposedly a guarantee against one man becoming too powerful.
    The experience of being ruled by kings was apparently so awful that the Romans were simply allergic to the title “king.” They probably would have joined the conspirators en masse to tear Caesar apart if he had taken that title for himself. A dictator—the word just means “speaker”—serving for a select period, even for life, was the closest they could allow themselves to come to monarchical authority in a period of crisis. However, when Caesar’s legal heir Octavian first avenged the assassination and then succeeded Caesar as leader of his armies and head of the state, they accepted him as princeps—“leader” or “first citizen,” from which we get the word “prince”—as well as the imperator. And by the time Octavian, who was subsequently styled “Augustus,” was an old man, the Romans accepted his rule as being effectively hereditary. They got themselves a monarch anyway, but by another name.