Sunday, November 20, 2022

End of an Era

Tom on R1800

All things must come to an end, and it would seem that my history and my experiences as a motorcycle rider are among them.

I started riding almost fifty years ago, in 1973. I had been interested in motorcycles ever since some of the cooler boys in my high school class were batting around town on them—well, on 150 cc Hondas, barely better than a scooter, but the vibe was there. I myself had no occasion to ride all through high school and college, but after graduation and with some money of my own for the first time, I decided to try it. And my brother, who was older and had gone into the Air Force, rode motorcycles when he was stationed in Taiwan—again small machines, a Honda Dream and a Yamaha dirt bike—and anything he could do was something I wanted to try.

Because my brother was also enthusiastic for BMWs, I wanted to buy a BMW motorcycle first thing. But my father, who was wiser, suggested that $2,200—the price of an R75/5 at the time—was a lot to pay for something I might not like. So I bought a smaller machine, a Yamaha RD350, which really was considered a “hot bike” but with a frame and seat position too small for my six-foot-six body. But I learned to ride—which meant, in those days, that the dealer let me trundle it around the parking lot a couple of times before I headed off for home and evermore.

I dropped that bike twice. The first time was my first rainy day, first wet manhole cover, and first experience of the rear wheel—which I was precariously overbalancing—going out sideways underneath me. I ripped up a pair of pants, bruised my backside, and did a couple of hundred dollars in damage as the machine slid down the street on its side. The second time was when I stopped on a hill, about to make a sharp left turn across traffic, cocked the handlebars into full steering lock, and tipped the bike over, pivoting on the front axle. No injuries, but more damage to repair.

After a year, and despite these mishaps—including a blown cylinder, because I did not realize that a small two-stroke engine with aluminum pistons behaves differently from a car engine—I was in love with motorcycling. I bought that BMW R75 and rode it for three years, until I got married and needed to sell it for a down payment on our first home together.

I stayed away from motorcycles for about five years that time, but then the bug bit again. I bought a BMW R100, the same basic machine as my previous bike but with a 1,000 cc engine. Both of these machines had opposed twin-cylinder, or “boxer,” engines. I customized that BMW with saddlebags, an oil cooler, a fork brace, and a windshield, and rode it as a commuter bike into San Francisco on sunny days. Four years later, when BMW brought out the first of its in-line four-cylinder motorcycles, designated with the letter K, I traded up. Over the years, I have bought twenty BMW machines, about half of them boxer Rs and half in-line Ks. And I loved each one as if it would be my one and only, forever, last motorcycle. But sometimes I have owned two bikes at once, just for the variety of having different machines with different riding styles.1

Along the way, I have also owned two Harley-Davidsons. They are a different style of riding, with all of the motorcycle’s weight comfortably underneath you. They are good motorcycles, but they lack the precision, the fit and finish, and the power of the BMWs.

Sometimes I have gone off motorcycling for a couple of years. Then I have kidded myself that I am shedding “tachyons,” after the episode of Star Trek where the ship had to be swept with a beam to eliminate these insidious particles. Tachyons are hypothetical subatomic particles, named from the Greek word for fast, that supposedly travel faster than light and are associated with time travel. But the lure of motorcycling always drew me back: the sensation of moving in three dimensions, the mastery of a large and powerful machine, the sense of freedom in the open air, and—yes—the element of danger requiring a high degree of precision and skill.

In the last three years, however, following a hiatus after the death of my wife, something has changed in my riding. In that time I have bought and sold six motorcycles. Some were a bit too big, or too small, or too cumbersome, or too exposed. And I was not vowing, as in the past, to love each one forever, although I was still buying the extended warranties as if each were to be my final choice for years to come.

What I was failing to realize is that I had changed. I was now in my seventies. Yes, many good men and women continue to ride in their seventies and even into their eighties. But the reality is that age takes its toll: you become more distractible; your reactions slow down; your tolerance for stress, cold, and wind buffeting deteriorates; and your strength declines. You can counter this with years of experience and increased care, but … there it is.

In the past month, after buying my last two motorcycles—a cruiser-style R1800 boxer and a sport-touring K1600 GT—just four months apart, and swearing that these were the best compromise of features, representing the summit of my riding experience, I had two upsets back to back.2 In the first, I was test-riding a larger R1800 Bagger, which was a hundred pounds heavier than my own bike but offered more wind protection and luggage capacity, similar to the K, and I dropped it in the parking lot. That resulted in a pulled muscle and about $2,500 in damage.

Then, a month later, I went out to ride my K1600 and rushed a stop sign, thinking that the vehicle approaching at right angles was going to stop.3 I braked in time to avoid hitting the car, but I locked up the bike, cocked the bars, and fell over. I was wearing protective clothing and a helmet, of course, but still suffered bruises, swelling, black-and-blue marks, and another pulled muscle. And the motorcycle itself suffered about $5,000 in mostly cosmetic damage. When you ride expensive bikes, you pay expensive repair bills.

But these two accidents—back-to-back and not years apart—confirmed what I had been suspecting for some time. I am not as sharp and mindful as I need to be to continue riding. I have always accepted that motorcycle riding must be done flawlessly. In a car, you can take a bad corner or come to a sudden stop, and the car doesn’t just fall over. And with all of a car’s safety features—seat belts, air bags, crumple zones, and such—you really have to mess up at speed to become injured. On a motorcycle, a minor brush in the parking lot causes bruises and broken bones, not to mention some thousands of dollars in damage.

So the time has come for me to hang it up for good. Because the automotive and motorcycling worlds are in something of a supply crunch, I’m able to sell the two motorcycles back to the dealer without too much loss. And I can wear my nice selection of leather and Cordura nylon jackets—without the armor padding, of course—as spiffy sportswear. But I no longer get to bomb along the freeways at high speed with the wind whistling around my helmet and the bike easily rolling side to side in turns, like Harry Potter’s broom in ecstatic flight. It’s been a good run, but now I’m done.

1. For the complete list of my machines, and some repetition of this story, see The Iron Stable.

2. Over my nearly fifty years, I have dropped motorcycles five or six times, usually when coming to a stop or at a full stop, when my foot slipped or I cocked the handlebars into a steering lock. My only upset at any speed was that first bike and first wet manhole cover. And I never have been seriously hurt. This works out to an accident about every ten years … which, I must say, is damned lucky.

3. As a rider I developed a number of rules or protocols to guide me in future occurrences of situations I have already experienced. Some are general, and some very specific. But the first three rules are: (1) Nobody’s looking out for you. (2) Even if they look right at you, they don’t see you. (3) Even if they see you, they don’t care. This accident massively violated the first rule.

Sunday, November 13, 2022

On Science

Nautilus shell

As a science-fiction writer, although no scientist myself, I am a fan of science. My grandfather was a civil engineer, and my father a mechanical engineer. I started reading science fiction as a child. For most of my corporate career I worked with and interviewed engineers and scientists. And I still subscribe to the magazines Science, Nature, Scientific American, and Astronomy. I don’t always read the original articles, which can be pretty dense, but I read most of the magazines’ summaries for lay readers.

When I worked at the biotech company, I routinely scanned scientific articles for studies that used our genetic-analysis products, so that I could write my own brief summaries to report back to our employees about how our equipment was being used. That gave me a general view, at least in the realm of the biological sciences, of how far and deep what I now call the “enterprise of science” extends. Principal investigators and teams of researchers in university departments, corporate facilities, and government labs around the world are all at work extending and enhancing our knowledge and sharing their findings through peer-reviewed publications. And I’m sure the same effort goes on in chemistry, physics, and the other, softer sciences.

So in no sense do I consider myself a “science denier.” But still … I am aware that not all of the work that passes for science is conducted at the same level, with the same rigor, or deserves the same respect for results. Science is, after all, a human endeavor, and not all human beings think and act the same, or value the same things. So scientific results are variable.

Biology is pretty sound, I think. The field has progressed over the centuries and in the more recent decades from being a work of pure description and simple cataloguing. Using techniques from chemistry and now genetics, and since the application of evolutionary theory, biologists are discovering not only what things are but how they function and why they are related. Their work is not so much theoretical as analytical and demonstrable. That is, provable beyond doubt. Biology’s stepchild, medicine, with its study of mechanisms, pathogens, and now disease tendencies transmitted through genetics, has advanced human life immeasurably in the past two hundred years or so.

Chemistry is in the same advanced state. Since the breakthrough in understanding atomic structure and the ordering of the Periodic Table, chemists have been able to analyze existing molecules and compounds, understand their structure and even predict their characteristics, propose changes to their nature, and even create novel molecules and materials. The field is creating unheard-of substances like high-temperature superconductors and advanced battery substrates. Chemistry and its stepchild, electronics—and its stepchild, computer science—have created the modern world.

Physics has made remarkable progress in understanding the world around us, focusing on the mechanics of light and sound, energy and inertia, and other, well, physical—that is to say, tactile, observable, and measurable—characteristics of what we might call the “real world.” But in the 20th century, and continuing into the 21st, physicists’ deep dive into the realm of the unseen and only guessable, with quantum mechanics and probability theory, seems to have veered into a kind of mysticism. To my layman’s eye, certain physicists now seem to be playing imaginative games where anything might be real if only the numbers can be made to add up. So we have String Theory, where all subatomic particles ultimately resolve into tiny, vibrating loops of “string”—but only so long as the real world consists of ten or eleven dimensions and not just the observable three dimensions of space plus one of time. And the ruminations of Erwin Schrödinger and his sometimes-alive-sometimes-dead cat have led to the Many Worlds Theory, where every imponderable probability splits the universe into finer and finer branchings of alternate realities. This has just become a game of having fun with numbers.1

The stepchild of physics, astronomy, is on firmer ground—well, no ground, really, because all that the astronomers can work with are various wavelengths of photons and electromagnetic effects from various subatomic particles. Or at least, it’s firm when they can observe and compare physical results. Astronomy has been remarkably perceptive in analyzing these effects and resolving conjectures based on them into perhaps provable truths. For example, we can be pretty sure the Earth and other planets revolve around the Sun, that the Moon revolves around the Earth, and now that other stars have planets similar to our own. But we are still in the conjecture stage about the nature of neutron stars and black holes,2 and cosmology and its theories about dark matter and dark energy are dependent upon our incomplete understanding of the nature of time, space, and gravity.3

Psychology, among the softer sciences, has made great advances in understanding the human mind and behavior since the purely conjectural theories of Freud, but it works best when it pairs with neurology and molecular biology to understand the brain that underlies the functions of mind. Still, psychologists are studying the elusive product of a trillion firing synapses, similar in all human beings according to their genetic nature but different in their chemical environment and learned responses. Psychology is a work of aggregation among many individual samples—and the art lies in picking the samples and applying a priori conjectures. Its stepchild, sociology, is in an even more tenuous state, because unlike psychologists, the sociologist works with aggregates from the start rather than with the personal reports of individual human beings.

And then there is environmental science, which shares a good deal with chemistry and physics, and has benefited greatly from advances in the understanding of geology, plate tectonics, and ocean thermal energy and salt transports, not to mention satellite imaging and widespread use of local ground-based radar. We can now understand most weather phenomena and predict patterns out to several weeks and trends out to possibly next year. But the Earth’s atmospheric systems are still highly complex and subject to many and varied influences. The current view that the entire planet is warming overall due to one human-caused variable, the industrial release of carbon dioxide, and predictions of worldwide temperature increases in fractions of a degree eighty years into the future—well, that is not observable science. Such predictions are science’s bastard stepchild, computer modeling.

Computers can model complex systems by aggregating variables and assigning weights to them. As I understand the current process—having followed the story of “global warming” since James Hansen presented his case at an Energy Daily conference in 1987 (before which “climate change” was called “global cooling”)—carbon dioxide is a weak greenhouse gas, unless it dominates an atmosphere like that of Venus. The climate models all assume that the tiny push from additional carbon dioxide will create enough heating to drive more water vapor into the atmosphere, where that vapor is a much more potent greenhouse gas. And the models are all built on positive feedbacks to this cycle, ignoring negative feedbacks such as the fact that carbon dioxide is a necessary plant food, and that plants draw it back out of the air.
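The feedback logic described above can be sketched as a toy iteration. This is purely illustrative—not a real climate model, and the gain values are arbitrary—but it shows why the sign and size of the feedback term dominate the outcome: a positive gain amplifies a small forcing, a negative gain damps it, and a gain of 1.0 or more runs away entirely.

```python
# Toy sketch of feedback in a forced system (NOT a real climate model).
# A constant forcing F is fed back into the response with gain g each step.
# For |g| < 1 the response converges to F / (1 - g).

def equilibrium_response(forcing, gain, steps=200):
    """Iterate response = forcing + gain * response until it settles."""
    response = 0.0
    for _ in range(steps):
        response = forcing + gain * response
    return response

# Positive feedback (gain +0.5) amplifies a 1-unit forcing toward 2 units.
amplified = equilibrium_response(1.0, 0.5)   # -> about 2.0

# Negative feedback (gain -0.5) damps the same forcing toward about 0.67 units.
damped = equilibrium_response(1.0, -0.5)     # -> about 0.667
```

The point of the sketch is only that the modeled outcome hinges on which feedbacks the modeler chooses to include and how they are weighted, which is the author’s complaint about the models in question.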

The models supporting anthropogenic global warming also tend to ignore the influence of the Sun, which is known to be a variable star. Michael Mann and company produced a global temperature chart based on tree-ring records4 reaching back several thousand years. Their chart showed global temperatures holding steady for more than two thousand years and only rising—in the famous “hockey stick”—during the last two centuries. Mann purported to erase the Roman warm period, the Dark Age cold period, the Medieval warm period, and the Little Ice Age of the late Renaissance. But each of these fluctuations coincides with the 800-year cycles—400 years of rising peaks, 400 years of declining peaks—in the familiar 11-year sunspot cycles. A spotted sun is a warmer sun, because the spots are actually rips in the solar atmosphere, which normally holds back part of the solar output; they also create flares and mass ejections that release excess energy. Each of the historic warm or cold periods corresponds to a peak or a drought in sunspot activity—and sometimes a total disappearance, such as the “Maunder Minimum,” named after the astronomer who first identified it. Since the middle 1700s, the Earth has been climbing out of the minimum responsible for the Little Ice Age. Peak sunspot activity probably occurred in the 1990s, and the peaks during the most recent 11-year cycles since then have been falling off.5

So yes, the Earth’s climate does change, no question about that. But it does so in complex cycles. We have also had ancient periods of “snowball Earth,” when the planet was in the deep freeze, and more recently four great “ice ages” recurring at roughly 100,000-year intervals. None of this had to do with a weak greenhouse gas like carbon dioxide.6 I do not deny climate change: the Earth’s climate changes continually and is certainly changing now. But I doubt that human industrial activity is the only or the most important cause of the current changes, or that drastically altering our economy will have much effect in the next hundred years. Human beings have been adapting to cold and warm periods, along with rising and falling sea levels, for our entire span on this planet—and we will adapt to whatever else we might find out among the stars. We will live through whatever changes Earth throws at us in the next hundred years.

As I said, science is a human enterprise, and human beings are fallible. They make mistakes, and they are also susceptible to peer pressure and “group think.” They sometimes engage their own curiosity to study phenomena and build on ideas that are popular in their scientific circles. And not all of them are above proposing studies and shaping their conjectures to follow the grant money offered by interested parties.

I am not—no, not ever—a “science denier.” But I am a realist, and I know you have to be careful about what you believe, take for granted, and choose to follow.

1. For more on this, see my blogs Fun With Numbers (I) and (II) from September 2010.

2. For example, Stephen Hawking theorized that black holes evaporate, and the small ones that may have theoretically been created in the Big Bang must have evaporated completely. He based these conjectures on the possibility that the sudden creation of particles and antiparticles in the vacuum of space, which is supposed to be happening everywhere and at any time, along with their immediate self-annihilation, must often happen at the very edge of a black hole’s event horizon. The occasional loss of one particle or its antiparticle over the event horizon and into the hole must then draw energy and information out of the hole. And that, over time, would deplete and erase the black hole entirely. In terms of provable, knowable reality, Hawking might as well have said that pixies ate them.

3. See also Three Things We Don’t Know About Physics (I) from December 2012 and (II) from January 2013.

4. However, I always understood tree ring size to correlate more closely with available water, rather than with temperature. And hotter climates are not always dry.

5. To look at recent sunspot cycles, see the Cycle 24 Prediction from the Wikipedia page on this latest cycle.

6. One of the articles I read while scanning for our genetic-analysis equipment (and I don’t have the reference now) said that drilling into and through the glaciers in Greenland had uncovered arboreal duff—twigs and pieces of dead trees—whose genetic drift from current identifiable species was about 700,000 years. This finding suggests to me that the interglacial periods since then—including the one between the last ice age, which ended about 12,000 years ago and peaked somewhere within the last 200,000 years, and the ice age that preceded it—never completely melted the glaciers over Greenland. Otherwise, this ancient duff would have been washed away and replaced with younger stuff.

Sunday, November 6, 2022

On Sincerity

Joker’s smile

Back in the heady days of campus revolution, during the late 1960s—my college days—one of the cardinal sins in the mouths of the radically upset was “hypocrisy.” The idea was that the worst thing a person could do was profess one thought but actually believe something different. Or, more to the point, say one thing and act in congruence with another set of beliefs. The charge was supposed to expose people who were doing bad things, like quietly supporting the war or surreptitiously practicing racism, while claiming to represent peace and justice. “Liar! We see through you!”

The notion of hypocrisy has morphed and twisted but never actually gone away over the past fifty years. The modern form is “transparency.” At every level of public and personal life, a person or an organization, their motives, and their values are supposed to be open to public scrutiny. This is the origin of the “sunshine laws,” where “sunshine is the best disinfectant.” If only we could clear away the smoke-filled rooms and expose every consideration, every decision, and every deal to “inquiring minds” outside the council chamber, then bad things just couldn’t happen.

Although I am not a proponent of big government and secretive deals, I can understand that sometimes the door has to be closed. Court testimony is public, but jury deliberations—where conjectures are raised, and sometimes slanderous points are resolved—are private. Contract negotiations are privileged, although the final agreement, whether for public works or homeowner-association improvements, is public. Diplomatic maneuvers, including assurances and threats, must be secret, although the resulting treaties are public. How and why a public official might have changed his or her mind on a subject is not necessarily a matter for the public’s right to know. Maybe money passed under the table, and maybe a better argument was simply made. We can only judge people by their statements and their actions, not the content of their innermost thoughts and hearts.

During a student “colloquium” at Penn State, where the issues of the day were raised—among them, “hypocrisy”—one of my freshman philosophy professors, Stanley Rosen, stated flatly, “Sincerity is a trivial virtue.”

That has always stuck with me. Sincerity is a virtue, yes, but how important is it in our everyday lives? How much do I care whether you actually believe what you’re saying and doing? The answer is, not much. After all, Josef Stalin and Adolf Hitler were incredibly sincere in their beliefs, and they each killed millions of people—intentionally, knowingly, willfully—in order to remain constant to those beliefs. Today, do we care more that their beliefs were sincerely held, or that they caused great and inhuman damage?

If you are insincere, you may well be lying to yourself. You may also be lying to me. Lying to yourself is pitiable, while lying to me is actionable. But I would also be a fool if I judged everything people said to me as being perfectly true and sincere. Instead, I have to think about past statements, past performance, and what I know of—or can deduce from body language, facial expression, and other subtle measures—the person’s character. Nothing is certain. Everything is open to reasonable doubt.

What counts more than a person’s sincerity is their actions. Do they keep their promises, pay their debts, raise their children consistently, treat animals kindly and the wait staff with respect? Are they law abiding? Can they be trusted? These are the things that matter. These are the virtues upon which a civil society and stable economy are based. One person might think himself a great villain and yet remember to tip the waiter. Another might think herself a great benefactress and yet push past someone destitute sitting on the sidewalk.

We don’t judge people based on what they think of themselves, because that opinion is often secret and sometimes wrong. We judge them based on their words and the effect of those words, on their actions and the effect of those actions. I don’t want transparency into a man’s soul or to be convinced that he is actually as good a person as he thinks he is. I want to know that, if I lend him five dollars, will he pay me back as we agreed?

Sunday, October 9, 2022

Global Population Growth

Dissected man

When I was growing up, the thrust of many popular works of nonfiction, science fiction, and film was the unrestricted growth of the global population.1 And we had the rising populations of China, India, Africa, and South America to prove it. Supposedly, human population would expand indefinitely, until we were packed shoulder-to-shoulder as in a crowded elevator.2

What we have seen since then—but perhaps not yet absorbed at a publicly conscious level—is that population growth in technologically advanced, highly educated, Western-oriented countries tends to level off and then decline. The shorthand version is that people who have a good life, plenty to eat and drink, interesting work and the free time to enjoy themselves, the benefits of modern medicine, and access to modern birth control along with the cultural mindset that supports its use, don’t automatically keep having one child per couple every eighteen months or so during a woman’s fertile lifetime. You need that kind of birth rate if you’re running an agrarian society, where most families tend to be physically isolated and experience a high rate of infant mortality. In the modern world, people just don’t need a lot of surviving children to help with the farm work. And modern women aspire to be something more than brood mares.

I can see this trend in my own family and the people around me, all highly educated with access to modern medicine. My mother and father had two sons, one gay and one in a childless marriage, net result: no replacement in the third generation. My mother had a brother and a sister; the sister was in a childless marriage; the brother had two daughters, who both married and between them bore a total of three children, net result: below-replacement growth. My father had a sister, who married and had two children; they each married and bore two children; net result: generational replacement, but not enough for positive growth. Of the people I know, most are either childless or, if married, have one or two children. I know of only one person who is a member of a big family, six children, but his siblings have either two or one child or none; net result: not enough for positive growth.

The same story could be told across the educated middle class of America, as well as in Europe, parts of South America, and in Japan. India is still growing. Africa is still growing. And China … is a special case: its disastrous One Child policy, which allowed for sex-selective abortion, in a culture that vastly favors male children over females, doomed the succeeding generation to an overwhelming supply of young men without access to marriageable young women. China’s population is predicted to crash in a couple of generations.3

America and Europe are making up for the decline in their educated, advanced population by admitting aliens and “guest workers” from other countries. The United States draws them from Central and South America. The United Kingdom draws them from former British colonies like India, Pakistan, Kenya, and Jamaica. Western Europe draws them from Eastern Europe and the Middle East. For a generation or so, these population additions will bring their agrarian-based cultural instincts and their fertility rate with them. They will provide a generation of support and service workers that the educated middle class is no longer supplying. But by the third generation these “aliens” will become citizens, acculturated, educated, technologically dependent, and they will lower their birth rate accordingly.

And in the high-growth areas like India and Africa, technological advances—which are easily transported and tend to create wealth wherever they land—will create new generations of relatively comfortable, relatively educated people with access to good medicine, birth control, and the popular culture to use it.

Right now, the planetary population stands at about eight billion. Various projections take that up to perhaps nine billion within a generation or two. But if trends in technology and literacy continue as they are currently headed—always allowing for negative feedbacks, planetary disasters, and extraterrestrial intervention—that nine billion may be the peak. China’s population alone stands at about 1.4 billion, but its projected collapse would take it back to about five hundred million. That’s still a lot of people, but it’s also a sobering decline in population that will have a devastating impact on an economy geared to serve more than twice that number.
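The shape of these projections—rapid growth that levels off near a ceiling instead of climbing forever—is the classic logistic curve. Here is a minimal sketch of that idea; the growth rate and carrying capacity are arbitrary illustrative numbers, not a demographic forecast.

```python
# Toy logistic-growth sketch: a trend that looks like steady growth at first
# but flattens as it approaches a carrying capacity K (negative feedback).
# All numbers are illustrative, not a real demographic projection.

def logistic_step(population, rate, capacity):
    """One discrete logistic step: growth slows as population nears capacity."""
    return population + rate * population * (1.0 - population / capacity)

def project(population, rate, capacity, years):
    """Run the logistic step forward for a number of years."""
    for _ in range(years):
        population = logistic_step(population, rate, capacity)
    return population

# Starting near 8 (billion) with an assumed ceiling of 9 (billion),
# the projection creeps upward but never passes the ceiling.
p50 = project(8.0, 0.01, 9.0, 50)
p500 = project(8.0, 0.01, 9.0, 500)
```

This is the mathematical version of the author’s footnoted point that no trendline extends indefinitely: the same equation that produces near-exponential growth early on produces a plateau once the feedback term dominates.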

Declining populations are aging populations, because the old people have all that good medicine to keep them alive, and technologically advanced societies provide social services and transfer payments to keep them comfortable. But those services and payments, let alone the medical expertise, all require a working population of younger people to support them. Right now, that would look like a recipe for disaster: querulous old folks wanting support and services, and rebellious young folks not willing to work eighteen hours a day to provide it—with the net result of economic and social collapse.

But I see another trend at work. Ever since the beginning of the Industrial Age, we have replaced transportation and production systems based on human and animal muscles with machinery based on fossil fuels and steam power, and then with internal combustion and electric motors. Add, in the last generation, computerized information systems and communications, and the rise of “smart” machines. And soon, in the near future, we will bring on individual machines and complex systems guided by artificial intelligence.4

It has been at least two generations since we needed to employ human ditch diggers with shovels—except, maybe, in archeology—when we had backhoes that can do the work of ten men. We no longer need mindless sets of hands to assemble our cars, trucks, refrigerators, and other durable goods piece-by-piece by hand. And soon we won’t need legions of lawyers, accountants, and technicians to maintain the flow of information and business transactions when self-programming systems can do it faster and more reliably. The future is not only mechanized and computerized, but it will also be intelligent.5

Technology is doing away with both the unskilled labor of ditch diggers and, eventually, the skilled labor of knowledge workers. That will leave the survivors relatively free to pursue their personal interests and hone their artistic and creative skills. Humans will always need and crave the personal element—innovation, experimentation, and surprise—in our lives, our entertainments, our cooking, and our sciences. We will pay people to be our entertainers and storytellers, our craftworkers and chefs, and our researchers and inventors. But we’ll leave the physically wearing and mentally numbing work to the machines, where it belongs.

And the political and economic structures envisioned to serve a massive and growing world population—more social control, more distributive scarcity, more government intervention; less freedom, fewer choices, less room—will give way to a more relaxed, more aspiring, more gracious lifestyle. People will count and be valued as individuals rather than as cogs in a machine or, worse, as pests on a dying planet. Or that is my hope.

The intersection between population declines and technological advancements suggests to me a coming Golden Age—at least for those of us who have the children who will survive to enjoy it. And always, of course, so long as a nuclear war, asteroid impact, or extraterrestrial invasion does not take us back to the Stone Age.

1. See, for example: Paul Ehrlich’s non-fiction The Population Bomb—written at the urging of the Sierra Club’s David Brower—and Harry Harrison’s science-fiction Make Room! Make Room!, which became the basis for the movie Soylent Green.

2. Of course, as I learned in a college course about predicting the future, any prediction that extends a current trendline indefinitely becomes immediately suspect. Every system includes both positive and negative feedbacks. And all trends go in cycles. Nothing is forever.

3. See, among others, this analysis from the World Economic Forum.

4. If your notion of “artificial intelligence” is a human-scale mind with an active personality and opinions—such as Skynet, “deciding our fate in a microsecond”—think again. The reality is going to be expert systems capable of analyzing data in greater volumes and at greater depths and speeds than any human mind, with the capability for pattern seeking and self-programming in response to discovered conditions. You will be able to talk to it, give it verbal instructions, and receive reports and information. But it will not ever be your friend—or your enemy.

5. And there goes your Marxist “labor theory of value.” The future will also be capital intensive, with the question being “Who owns the machines?” and “What is the rate of return on their investment?” rather than who gets paid to work and sweat and think and drudge.

Sunday, October 2, 2022

The Real Reactionaries

Robot juggling

Political lines seem to be drawn these days between Progressives, who are supposed to be looking forward to the future and what it will bring, and Conservatives, who are supposed to be looking backward, to past traditions and the security they promised, and away from the future the Progressives want—and so they have been called “Reactionaries.” But, as with most modern memes and the definitions that support them, the real world is more fluid, and the labels sometimes become curiously inverted.

When I talk to Progressives—who used to be called “Liberals,” a label that, in its older sense, I would still claim for myself—they seem to be afraid of the future. For all their claimed allegiance to science and progress, they appear to want to keep everything the same. Oh yes, they believe that the “arc of history” bends toward a desirable future: social justice, socialism, collective consciousness, and some nebulously defined utopia where history itself will just … end, while humanity glides forward in some sort of uneventful but endlessly gratifying state of somnolence. But unless that rainbow is somehow hurried along toward its pot of gold, bad things are gonna happen, and that’s scary.

For example, they talk about climate change (formerly known as “anthropogenic global warming”) and insist that we need to bend our economy out of shape, divesting ourselves of reliable energy sources in favor of renewables, and submit to increasing government control of our lives and our prospects in order to keep the world temperature from rising a couple of degrees. That level of warming will supposedly cause the oceans to rise and the planet’s vegetation to change measurably in the next eighty years. It will supposedly destroy coastal property values, change growing patterns, and disrupt people’s daily lives. But if we act now through stringent regulation, they insist, we can keep the current climate, land masses, farming systems, and economic activities to which we have become accustomed.1

That sounds like a Conservative yearning for the world to remain safe and familiar and not change at all. But change is inevitable. It has been going on for ten thousand years, not just in climate2 but in technology, culture, language, and political and economic affairs. No single step or series of steps will hold back the tide, or time, or the changes that both will bring. Continuous jockeying and adaptation have always been the keys to human survival and prosperity.

For another example, Progressives are afraid of scientific discovery and technology itself and the changes they will bring. Automation and artificial intelligence will disrupt the current patterns of work and economic exchange. Genetic modification of crops will supposedly open the door to human diseases or rampant changes in the environment. But pesticides, antibiotics, and artificial fertilizer are also said to disrupt human health and the environment, so the Progressive vision of agriculture is a return to “natural” and “organic” methods that lower yields and increase crop damage.

The Progressive yearns for a world that operates along current economic lines—the better, I suppose, to further their planning for a Marxist revolution that will institute state control under a change of power. But the world’s economy has been changing for more than a hundred years, as new energy resources and engines replace animal and human muscles, as improved farming techniques reduce the labor input to agriculture, as the Industrial Revolution at first absorbed new hands into assembly-line production methods, and now as the Computer and Communications Revolution is absorbing those displaced workers into a knowledge economy. And when machine intelligence does away with the routine knowledge jobs of accountants and librarians, other fields will open up.

For a third example, Progressives appear to be afraid of the people they presume to champion, of the average person in a differentiated society. Such people’s voices and understandings are expressed, debated, and examined through unfettered speech. Their beliefs and visions carry through to social action with free and open voting. And their needs and desires are expressed in a free market, where eager entrepreneurs working under a capitalist system respond quickly and comprehensively to fulfill them. Instead, Progressives want a world where experts steeped in academic study and trained in policy decisions will project what is best for the society and so arrange its systems.

And again, the Progressive mind is yearning for the safe, the known, the predictable, and the governable. The Progressives have a vision for the future that is based on the experiences of the past, and new ideas, new relationships, new methods upset the status quo. They want to be the disruptive force that brings about change—the vision that Karl Marx and his fellow travelers penned almost two hundred years ago—and they will not be stopped or sidetracked by the natural disruptions that free thinking and unfettered technology are bringing about. In this, the Progressive is just as protective of the familiar and the reliable as any 19th century robber baron trying desperately to hold onto his wealth and power.

Change is inevitable. You cannot predict the future. Humanity’s only hope, then, is to continue developing deeper scientific understanding and more subtle and robust technologies. Yes, there will be disruptions. Yes, a number of people will be put out of work, lose their security, be moved off their property, and be forced to scramble and adapt to changing economic and environmental conditions. And yes, on an individual level this may seem unjust and even cruel. But life itself is about evolution and adaptation to changing conditions.

As Louis Pasteur—a contemporary of Karl Marx—once said, “Chance favors the prepared mind.” That is, while you can’t predict the future, it is certainly possible to take prudent steps—learn everything you can, conserve your resources, stay light on your feet—to meet whatever eventualities the future throws at you. And that about sums up the viewpoint currently held by the rest of us “Conservatives.”

1. I live in the San Francisco Bay Area. Much of the shoreline around the Bay—including many towns like Foster City, where I worked for ten years—was originally, say a hundred and more years ago, tidal marsh and mud flat. By ringing an area with armor rock and filling it with garbage and topsoil, developers made industrial and residential property that today is worth billions. But even now some of that once-reclaimed land—especially some marginal farmland along San Pablo Bay—is being restructured into wetlands for environmental purposes. So … one hundred years ago your property was a mud bank mostly under water, and a hundred years from now it may once again be flooded. Cities rise; property decays and is abandoned; land is developed anew. This is the reality of attempting to own a piece of the ever-changing Earth.

2. Remember that human beings in their current form, Homo sapiens, certainly lived through the latter part of the last Ice Age and emerged into a world with changing shorelines and crop patterns, floods from melting glaciers—coming off an ice cap a mile deep over much of the northern hemisphere—and droughts from higher average temperatures. And they survived—not without loss, of course, not without suffering—while using technology no more advanced than stone spear points and bent sticks. I think we can do better with what we know today.

Sunday, September 25, 2022

Advanced Technological Civilizations

Spiral galaxy

If you have nothing better to do on a rainy Sunday afternoon, you might muse about where our technological civilization is ultimately going. That line of thinking also shapes our expectations about the kinds of civilizations we are likely to meet out among the stars. Fortunately, you don’t have to speculate, because Soviet astronomer Nikolai Kardashev in 1964 proposed classifying advanced civilizations according to their access to and use of energy.

Energy is the basis of modern human civilization. Of course, we need safe ground, clean air and water, and available and nutritious food—and those are givens. But to live as we do in the First World today, we also need energy. And with sufficient energy, we can clear the ground, clean the air and water—and make fresh water from sea water—and grow more food or manufacture it from raw materials.

Previous human civilizations might not seem to have been as energy dependent as we are today, but their energy use is masked by a brutal fact. They did not depend on developed technologies that obtain and use energy from coal, oil, or natural gas, or build hydroelectric dams, nuclear reactors, solar panels, windmills, or any of the other resources we now take for granted. Instead, ancient civilizations depended on animal energy—mostly draught horses, oxen, and human slaves—to provide the work that we get from motors driven by those developed energy resources.

The ancient Egyptians and other “hydraulic empires” dipped water from the local river with counterbalances known as “shadufs,” where the differential driving the action was human muscle. The Romans developed the water wheel late in the game as a method of lifting water using the river’s own flow, and the torque from the axle could then also be used to turn grain mills and for other purposes. The ancients burned wood and charcoal for cooking and heating, and they burned lipids from waxes, animal fats, and vegetable oils to light their homes. They sometimes discovered surface deposits of coal and petroleum and used them for heating. They also burned animal and plant wastes, as well as peat deposits when found nearby. But the major source of motive and industrial power was still animal and human muscles.

On the Kardashev scale, a Type I civilization is able to access and use all of the energy and material resources of a planet. This would imply the capability for interplanetary if not interstellar travel, because it would make sense to preserve the livability of your home planet—making it a habitable safe haven, if not indeed a garden paradise—while you tear apart a neighboring planet. Live on Earth and dismantle Venus or Mars piecemeal.

In this sense, our current First World civilization exists at about level 0.35. We extract coal, oil, natural gas, and uranium from easily accessed geologic strata. We harvest sunlight falling on a mere fraction of the planet’s surface. We have pretty well dammed and developed most of our major inland rivers, but we are only beginning to access the rivers of air that circulate in the atmosphere and have not yet touched the rivers of water found in the Earth’s oceans. We access geothermal energy when a nearby aquifer intersects with a surface hotspot, but we do not dig deeply enough to access the heat energy in the mantle and magma—and we prudently steer clear of most surface volcanos because we can’t yet predict and control their eruptions. We mine surface deposits of iron, nickel, copper, silver, gold, and other metals, but we do not have the technology to dig into the planet’s rich iron-nickel core.
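Kardashev’s original types are discrete steps, but the astronomer Carl Sagan later proposed a continuous interpolation, K = (log10 P − 6) / 10 with P in watts, which pegs Type I at 10^16 W, Type II at 10^26 W, and Type III at 10^36 W. Here is a minimal sketch of that formula; note that fractional ratings for present-day Earth depend heavily on the power figure and normalization assumed, which is why estimates as different as 0.35 and Sagan’s own ~0.7 both circulate.

```python
import math

def kardashev_level(power_watts: float) -> float:
    """Sagan's continuous interpolation of the Kardashev scale:
    K = (log10(P) - 6) / 10, which pegs Type I at 1e16 W,
    Type II at 1e26 W, and Type III at 1e36 W, so each type
    spans ten orders of magnitude of power use."""
    return (math.log10(power_watts) - 6.0) / 10.0

# Reference points on the scale (exact by construction):
print(kardashev_level(1e16))  # Type I  -> 1.0
print(kardashev_level(1e26))  # Type II -> 2.0
```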

We have not yet tried to access—nor are we capable of reaching—any of the nearby planets for their resources. We do, however, have our eyes on the 203-mile diameter asteroid named “Davida,” because it has metallic content—nickel, iron, and cobalt, plus resources of water, hydrogen, nitrogen, and ammonia—valued at $27 quintillion. Several other asteroids have similar metallic resources. If brought home from the Belt, they would either satisfy human needs for thousands of years—or instantly bankrupt every metal-mining operation on Earth and throw our global economy into such disruption that war and revolution would break out.

A Type II civilization makes use of the energy and material resources of an entire star system. Again, this would imply interstellar travel capability, because you don’t want to re-engineer or deconstruct the star that your home planet orbits while you are trying to live there. Your civilization migrates to a different star, or you go to work on a neighboring star. This level of energy use would also suggest that you have developed technologies to transmute atoms and molecules from one useful material to another—or to fabricate subatomic particles, atoms, and molecules out of pure energy.

We already have chemical technologies to transform near-neighbor molecules from one form to another. We make long-chain polymers—that is, plastics—from hydrocarbons, as well as processing one type of hydrocarbon to another during oil refining. We can change the nature of sugar and cellulose. And we make explosives out of various nitrogen compounds. But none of this capability yet reaches the submolecular or subatomic level.

The ability to make full use of a star’s resources would be most readily apparent when a civilization attempts to construct a Dyson sphere. Named for its inventor—or maybe “imaginator” or “conceptualizer” would be a better term—Freeman Dyson, this is a series of rings or a shell to enclose a star and create living space on its interior. Its radius would be that of the Earth’s orbit, so that the received sunlight would approximate that of a habitable planet. And the surface area would be hundreds of millions of times the surface of the Earth. Since the mass of such a structure, even with the incredible tensile strength and thinness of some exotic material, would outweigh all the planets in a richly endowed system like ours, constructing such a sphere would require tearing apart several star systems. And you would have to perform some basic transmutation to create your wonder material for the shell system. These are technologies we can’t even begin to propose as yet.
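The scale of that living space is easy to check with a back-of-the-envelope calculation: surface area grows with the square of the radius, so a sphere with the radius of Earth’s orbit (roughly 1 AU, taken here as 1.496 × 10^11 meters) dwarfs the Earth’s surface by a factor in the hundreds of millions.

```python
AU = 1.496e11      # Earth's mean orbital radius, in meters
R_EARTH = 6.371e6  # Earth's mean radius, in meters

# Surface areas scale with radius squared, so the 4*pi factors cancel:
area_ratio = (AU / R_EARTH) ** 2
print(f"{area_ratio:.3g}")  # roughly 5.5e8, i.e. ~550 million Earth surfaces
```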

Our astronomical instruments, such as the orbiting Kepler observatory, have discovered at least two stars whose brightness dips regularly to a shade of darkness that would not be the shadow of any credible planet. Absent another natural event we might observe or imagine—like the recent mass ejection in the star Betelgeuse, which dimmed its brightness for a while—these stars might be in the early stages of enclosure with a Dyson sphere. The recently launched James Webb Space Telescope, with its greater observing power in the near-infrared, would be a perfect eye to focus on these stars.

A Type III civilization makes use of the energy and material resources of an entire galaxy. Such engineers would use and dispose of stars in the way we build and fuel nuclear reactors. They might—if you want to let your imagination run wild—extract the singularity from their galaxy’s internal black hole and use it in their ping-pong games. Such people with such technologies would be as gods to us.

Kardashev never defined a Type IV civilization, but theorists claim such beings would make use of an entire universe. In that case we would already have met them, because they would have either colonized or destroyed our own galaxy, or our star system, or planet Earth in the process of exercising their powers. And a Type V civilization would have found a way to travel outside our universe and explore or make use of other universes—if such there be. But our small human imaginations cannot even consider, nor our theoretical physics encompass, such a situation … yet.

If there are civilizations at Type II and III levels, then they will be much more powerful than we are now or are likely to become in the near future. Meeting them would either be a wonderful opportunity or a civilization-ending catastrophe—likely the latter. If the meeting between European colonists armed with steel weapons, gunpowder, and other marks of superior technology and resources went badly for the Native Americans, with their stone tools and not even the horse or the wheeled cart to carry them, imagine what it would mean to meet up with a civilization that can snuff out entire stars and galaxies.

If the James Webb Telescope ever sees a star or a galaxy suddenly go dark—but look twice and, please, check your data!—then it might be time for us to become afraid.

Sunday, September 18, 2022

Choosing Sides

Human embryo

I don’t want to be a Cassandra or a conspiracy theorist, but I can see clearly. The political situation in this country has become more divided, more fraught, more frantic, more weaponized than it has been in a long time. Maybe not since the 1930s or the 1850s. And once people’s divisions along lines of their core values and beliefs become deep and unbridgeable, we lose the possibility of reconciliation through political compromise, through voting, through any exercise of good will. The questions become existential: for one side to survive, the other must be destroyed.

That way lies insurrection,1 revolution, and civil war. It happened before in Russia in 1917, and dozens of times in smaller countries during the 20th century. It isn’t supposed to happen here, because we have the pressure-relief valve of a two-party system and the peaceful transition of power through the voting booth and assured elections every two and four years. Except that the last two general elections have been greeted with angry crowds on the Washington Mall, once in 2016 with angry Democrats proclaiming, “Not My President!” against Donald Trump’s election, and then again in 2020 with angry Republicans proclaiming, “Stop the Steal!” against Joe Biden’s election. Tens of thousands of people got up out of their armchairs, boarded busses, rode to the nation’s capital, and gathered shoulder-to-shoulder in bad weather to protest the election results. That’s enough of a crowd that anyone watching might think there was something wrong with the system.

Civil wars tend to have losers on both sides. But eventually one side, one set of values, one point of view comes out on top. And it doesn’t always represent the most reasoned, nuanced, equitable, and charitable version of that mindset going into the conflict. Instead, it tends to represent an absolute validation of the most exaggerated, most hard-line, almost caricatured version of those values. If you think “Elections have consequences—we won, you lost—get over it” was harsh language in 2010, imagine the rejoinder to people complaining in the aftermath of a civil war. Such validation usually comes with a bullet to the back of the head.

If revolution2 and civil war come to this country over our current political divisions, the players about three steps into it won’t be the players who come out of it. The current cast—Biden and company in the current administration, Pelosi and Schumer in control of Congress on the Left; Trump, DeSantis, Abbott, and whoever else is vying for the Oval Office in 2024, and McConnell and McCarthy in the minority in Congress on the Right—won’t be the major players. They will be like Alexander Kerensky and the other liberal democrats who brought about the tsar’s abdication in the March 1917 revolution in Russia: igniters, not finishers.

Fighting the war, maneuvering forces, dodging and weaving, building and breaking coalitions, taking chances, winning and losing will all devolve to the truly hard men, to the Lenins, Trotskys, and Stalins among the Bolsheviks in late 1917 and on into the civil war that followed. They will be the people not afraid of spilling blood and executing prisoners. They will be opportunists and warlords who can see that, once the peaceful, democratic structure of a republic is fractured, then the political and economic situation is wide open and available to whatever you can make of it and take from it. What comes out may not be anyone’s dream of utopia—or even a stable society.

We have not seen any of these potential “hard men” anywhere on the current political stage. They will only emerge once the system is broken beyond repair. In the same way, Lenin, Trotsky, and Stalin were virtual unknowns on the Russian political scene or in the Duma; they were a mere splinter of an exiled revolutionary group until the liberal democrats brought down the monarchy.

I do not want a revolution or a civil war in this country. That is not a solution. It would be like overturning the chessboard rather than playing out the game. But the game of politics, played under normal rules, always involves compromise. One side proposes a program—perhaps even involving extreme measures—then the other side counters, and they meet somewhere in the middle. Everyone gets something; no one is completely satisfied. That is the way a nation’s political thinking and moral feeling move forward, by compromise and the promise of living with the results of negotiation.

Revolutionaries are radicals. To them, compromise equals failure. They want the whole program, root and branch, brought forth in a single convulsive act. And if the opposition remains unconvinced, they will go to war and begin killing in order to achieve their goals. Radicals are by definition unreasonable people. They will achieve their purpose, by any means necessary, or die trying. Theirs is, at bottom, a death cult.

As someone who tries to live in the middle,3 I can usually make peace and live comfortably in the midst of either the Right’s or the Left’s mindset. I believe in obeying the law and minding my own business. But I see around me—not on a personal level, but in the various opposing parties and their captive media—a different feeling. Positions in this country have become solidified, uncompromising, existential, do or die, victory or death. We are becoming like the old European order or a Third World country, ready to ignite.

Under these circumstances, no one can expect a reasonable outcome. The resolution of a civil war will be a brutalized state. The winning side will feel no compulsion to consider the mindset and political feelings of the losers. The only options for the losing side will be total capitulation, exile, or death. We’ve seen this result before, and the damaging effects live on to the third generation.

I only hope that, when we come out the other side of whatever convulsion is coming, we have a country I can live in.

1. For real—not the velvet-rope walk-on at the Capitol building, led by a guy in face paint and a buffalo hat, that we saw on January 6. A real insurrection would bring automatic weapons and rocket-propelled grenades. Think of Trotsky taking the cruiser Aurora to attack the Winter Palace during the October Revolution of 1917. Real insurrections are earnest and tend to be bloody.

2. Radical members of the Left in this country have been talking revolution and political mayhem since I was in college. They sang songs about it (the Beatles’ “Revolution” and John Lennon’s “Imagine”) and called for it publicly (Saul Alinsky’s Rules for Radicals, the Cloward-Piven Strategy, and outright action by the Weather Underground and Symbionese Liberation Army). Now that these elderly radicals and their progressive children have achieved control of urban centers on both coasts and at least two branches of the federal government—but apparently lost control of the third—they seem to be working toward the “orchestrated crisis” and economic and cultural collapse that will bring on a Marxist-style revolution. And how insane is that?

3. When I take those little online tests of political views, or try to find myself on one of those four-quadrant charts (representing, for example, the extremes of Left versus Right politics on the horizontal axis, and Authoritarian versus Libertarian principles on the vertical), I’m always about three points out of a hundred to the right of center along the middle baseline. I am more fiscally conservative and socially liberal, and tend to see the need for both authority and liberty at work in society—or so I have been for the past forty years. But those distinctions are fast becoming obsolete. The trend now is all one way or nothing, up to one corner or down to the other. For example, I now hear it said that one cannot be socially liberal without embracing the rising taxation and resulting government spending needed to achieve progressive programs. The middle ground is sliding away under my feet.