Sunday, November 20, 2022

End of an Era

Tom on R1800

All things must come to an end, and it would seem that my history and my experiences as a motorcycle rider are among them.

I started riding almost fifty years ago, in 1973. I had been interested in motorcycles ever since some of the cooler boys in my high school class were batting around town on them—well, on 150 cc Hondas, barely better than a scooter, but the vibe was there. I myself had no occasion to ride all through high school and college, but after graduation and with some money of my own for the first time, I decided to try it. And my brother, who was older and had gone into the Air Force, rode motorcycles when he was stationed in Taiwan—again small machines, a Honda Dream and a Yamaha dirt bike—and anything he could do was something I wanted to try.

Because my brother was also enthusiastic for BMWs, I wanted to buy a BMW motorcycle first thing. But my father, who was wiser, suggested that $2,200—the price of an R75/5 at the time—was a lot to pay for something I might not like. So I bought a smaller machine, a Yamaha RD350, which really was considered a “hot bike” but with a frame and seat position too small for my six-foot-six body. But I learned to ride—which meant, in those days, that the dealer let me trundle it around the parking lot a couple of times before I headed off for home and evermore.

I dropped that bike twice. The first time was my first rainy day, first wet manhole cover, and first experience of the rear wheel—over which I was precariously balanced—sliding out sideways underneath me. I ripped up a pair of pants, bruised my backside, and did a couple of hundred dollars in damage as the machine slid down the street on its side. The second time, I stopped on a hill, about to make a sharp left turn across traffic, cocked the handlebar into full steering lock, and tipped the bike over, pivoting on the front axle. No injuries, but more damage to repair.

After a year, and despite these mishaps—including a blown cylinder, because I did not realize that a tiny two-stroke engine with aluminum pistons has to be ridden differently from the way you drive a car—I was in love with motorcycling. I bought the BMW I had wanted, by then the R75/7, and rode it for three years, until I got married and needed to sell it for a down payment on our first home together.

I stayed away from motorcycles for about five years that time, but then the bug bit again. I bought a BMW R100, the same basic machine as my previous bike but with a 1,000 cc engine. I customized that BMW with saddle bags, an oil cooler, fork brace, and windshield, and rode it as a commute bike into San Francisco on sunny days. These were both opposed twin-cylinder, or “boxer,” engines. Four years later, when BMW brought out the first of its in-line four-cylinder motorcycles, designated with the letter K, I traded up. Over the years, I have bought twenty of the BMW machines, about half of them boxer Rs and half in-line Ks. And I loved each one as if it would be my one and only, forever, last motorcycle. But sometimes I have owned two bikes at once, just for the variety of having different machines with different riding styles.1

Along the way, I have also owned two Harley-Davidsons. They are a different style of riding, with all of the motorcycle’s weight comfortably underneath you. They are good motorcycles, but they lack the precision, the fit and finish, and the power of the BMWs.

Sometimes I have gone off motorcycling for a couple of years. Then I have kidded myself that I am shedding “tachyons,” after the episode of Star Trek where the ship had to be swept with a beam to eliminate these insidious particles. Tachyons are hypothetical subatomic particles, named from the Greek word for fast, that supposedly travel faster than light and are associated with time travel. But the lure of motorcycling always drew me back: the sensation of moving in three dimensions, the mastery of a large and powerful machine, the sense of freedom in the open air, and—yes—the element of danger requiring a high degree of precision and skill.

In the last three years, however, following a hiatus after the death of my wife, something has changed in my riding. In that time I have bought and sold six motorcycles. Some were a bit too big, or too small, or too cumbersome, or too exposed. And I was not vowing, as in the past, to love each one forever, although I was still buying the extended warranties as if each was to be my final choice for years to come.

What I was failing to realize is that I had changed. I was now in my seventies. Yes, many good men and women continue to ride in their seventies and even into their eighties. But the reality is that age takes its toll: you become more distractible; your reactions slow down; your tolerance for stress, cold, and wind buffeting deteriorates; and your strength declines. You can counter this with years of experience and increased care, but … there it is.

In the past month, after buying my last two motorcycles—a cruiser-style R1800 boxer and a sport-touring K1600 GT—just four months apart, and swearing that these were the best compromise of features, representing the summit of my riding experience, I had two upsets back to back.2 In the first, I was test-riding a larger R1800 Bagger, which was a hundred pounds heavier than my own bike but offered more wind protection and luggage capacity, similar to the K, and I dropped it in the parking lot. That resulted in a pulled muscle and about $2,500 in damages.

Then, a month later, I went out to ride my K1600 and rushed a stop sign, thinking that the vehicle approaching at right angles was going to stop.3 I braked in time to avoid hitting the car, but I locked up the bike, cocked the bars, and fell over. I was wearing protective clothing and a helmet, of course, but still suffered bruises, swelling, black-and-blue marks, and another pulled muscle. And the motorcycle itself suffered about $5,000 in mostly cosmetic damages. When you ride expensive bikes, you pay expensive repair bills.

But these two accidents—back-to-back and not years apart—confirmed what I had been suspecting for some time. I am not as sharp and mindful as I need to be to continue riding. I have always accepted that motorcycle riding must be done flawlessly. In a car, you can take a bad corner or come to a sudden stop, and the car doesn’t just fall over. And with all of a car’s safety features—seat belts, air bags, crumple zones, and such—you really have to mess up at speed to become injured. On a motorcycle, a minor brush in the parking lot causes bruises and broken bones, not to mention some thousands of dollars in damage.

So the time has come for me to hang it up for good. Because the automotive and motorcycling worlds are in something of a supply crunch, I’m able to sell the two motorcycles back to the dealer without too much loss. And I can wear my nice selection of leather and Cordura nylon jackets—without the armor padding, of course—as spiffy sportswear. But I no longer get to bomb along the freeways at high speed with the wind whistling around my helmet and the bike easily rolling side to side in turns, like Harry Potter’s broom in ecstatic flight. It’s been a good run, but now I’m done.

1. For the complete list of my machines, and some repetition of this story, see The Iron Stable.

2. Over my nearly fifty years, I have dropped motorcycles five or six times, usually when coming to a stop or at a full stop, when my foot slipped or I cocked the handlebars into a steering lock. My only upset at any speed was that first bike and first wet manhole cover. And I never have been seriously hurt. This works out to an accident about every ten years … which, I must say, is damned lucky.

3. As a rider I developed a number of rules or protocols to guide me in future occurrences of situations I have already experienced. Some are general, and some very specific. But the first three rules are: (1) Nobody’s looking out for you. (2) Even if they look right at you, they don’t see you. (3) Even if they see you, they don’t care. This accident massively violated the first rule.

Sunday, November 13, 2022

On Science

Nautilus shell

As a science-fiction writer, although no scientist myself, I am a fan of science. My grandfather was a civil engineer, and my father a mechanical engineer. I started reading science fiction as a child. For most of my corporate career I worked with and interviewed engineers and scientists. And I still subscribe to the magazines Science, Nature, Scientific American, and Astronomy. I don’t always read the original articles, which can be pretty dense, but I read most of the magazines’ summaries for lay readers.

When I worked at the biotech company, I routinely scanned scientific articles for studies that used our genetic-analysis products, so that I could write my own brief summaries to report back to our employees about how our equipment was being used. That gave me a general view, at least in the realm of the biological sciences, of how far and deep what I now call the “enterprise of science” extends. Principal investigators and teams of researchers in university departments, corporate facilities, and government labs around the world are all at work extending and enhancing our knowledge and sharing their findings through peer-reviewed publications. And I’m sure the same effort goes on in chemistry, physics, and the other, softer sciences.

So in no sense do I consider myself a “science denier.” But still … I am aware that not all of the work that passes for science is conducted at the same level, with the same rigor, or deserves the same respect for results. Science is, after all, a human endeavor, and not all human beings think and act the same, or value the same things. So scientific results are variable.

Biology is pretty sound, I think. The field has progressed over the centuries and in the more recent decades from being a work of pure description and simple cataloguing. Using techniques from chemistry and now genetics, and since the application of evolutionary theory, biologists are discovering not only what things are but how they function and why they are related. Their work is not so much theoretical as analytical and demonstrable. That is, provable beyond doubt. Biology’s stepchild, medicine, with its study of mechanisms, pathogens, and now disease tendencies transmitted through genetics, has advanced human life immeasurably in the past two hundred years or so.

Chemistry is in the same advanced state. Since the breakthrough in understanding atomic structure and the ordering of the Periodic Table, chemists have been able to analyze existing molecules and compounds, understand their structure and even predict their characteristics, propose changes to their nature, and create novel molecules and materials. The field is creating unheard-of substances like high-temperature superconductors and advanced battery substrates. Chemistry and its stepchild, electronics—and its stepchild, computer science—have created the modern world.

Physics has made remarkable progress in understanding the world around us, focusing on the mechanics of light and sound, energy and inertia, and other, well, physical—that is to say, tactile, observable, and measurable—characteristics of what we might call the “real world.” But in the 20th century, and continuing into the 21st, physicists’ deep dive into the realm of the unseen and only guessable, with quantum mechanics and probability theory, seems to have veered into a kind of mysticism. To my layman’s eye, certain physicists now seem to be playing imaginative games where anything might be real if only the numbers can be made to add up. So we have String Theory, where all subatomic particles ultimately resolve into tiny, vibrating loops of “string”—but only so long as the real world consists of eleven dimensions and not just the observable three dimensions of space plus one of time. And the ruminations of Erwin Schrödinger and his sometimes-alive-sometimes-dead cat have led to the Many Worlds Theory, where every imponderable probability splits the universe into finer and finer branchings of alternate realities. This has just become a game of having fun with numbers.1

The stepchild of physics, astronomy, is on firmer ground—well, no ground, really, because all that the astronomers can work with are various wavelengths of photons and electromagnetic effects from various subatomic particles. Or at least, it’s firm when they can observe and compare physical results. Astronomy has been remarkably perceptive in analyzing these effects and resolving conjectures based on them into perhaps provable truths. For example, we can be pretty sure the Earth and other planets revolve around the Sun, that the Moon revolves around the Earth, and now that other stars have planets similar to our own. But we are still in the conjecture stage about the nature of neutron stars and black holes,2 and cosmology and its theories about dark matter and dark energy are dependent upon our incomplete understanding of the nature of time, space, and gravity.3

Psychology, among the softer sciences, has made great advances in understanding the human mind and behavior since the purely conjectural theories of Freud, but it works best when it pairs with neurology and molecular biology to understand the brain that underlies the functions of mind. Still, psychologists are studying the elusive product of a trillion firing synapses, similar in all human beings according to their genetic nature but different in their chemical environment and learned responses. Psychology is a work of aggregation among many individual samples—and the art lies in picking the samples and applying a priori conjectures. Its stepchild, sociology, is in an even more tenuous state, because unlike psychologists, the sociologist works with aggregates from the start rather than with the personal reports of individual human beings.

And then there is environmental science, which shares a good deal with chemistry and physics, and has benefited greatly from advances in the understanding of geology, plate tectonics, and ocean thermal energy and salt transports, not to mention satellite imaging and widespread use of local ground-based radar. We can now understand most weather phenomena and predict patterns out to several weeks and trends out to possibly next year. But the Earth’s atmospheric systems are still highly complex and subject to many and varied influences. The current view that the entire planet is warming overall due to one human-caused variable, the industrial release of carbon dioxide, and predictions of worldwide temperature increases in fractions of a degree eighty years into the future—well, that is not observable science. Such predictions are science’s bastard stepchild, computer modeling.

Computers can model complex systems by aggregating variables and assigning weights to them. As I understand the current process—having followed the story of “global warming” since James Hansen presented his case at an Energy Daily conference in 1987 (before which “climate change” was called “global cooling”)—carbon dioxide is a weak greenhouse gas, unless it dominates an atmosphere like that of Venus. The climate models all assume that the tiny push from additional carbon dioxide will create enough heating to drive more water vapor into the atmosphere, where that vapor is a much more potent greenhouse gas. And the models are all built on positive feedbacks to this cycle, ignoring negative feedbacks such as the fact that carbon dioxide is a necessary plant food and that plants absorb it back out of the air.
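
To make the point concrete, here is a minimal sketch in Python, a zero-dimensional toy rather than anything resembling a real climate model, of how such a feedback loop behaves. The function name, the half-degree starting push, and the feedback weights are all assumptions invented for this illustration; the only thing the sketch demonstrates is that the result is governed almost entirely by the feedback weight one chooses to assume.

    # Toy illustration only: a zero-dimensional feedback loop, not a climate model.
    # All numbers are invented for the example.

    def total_warming(direct_push, feedback_weight, rounds=50):
        """Sum an amplifying chain: each increment of warming is assumed to add
        feedback_weight times itself in the next round (e.g., via water vapor)."""
        total = 0.0
        increment = direct_push
        for _ in range(rounds):
            total += increment
            increment *= feedback_weight
        return total

    direct = 0.5  # hypothetical direct warming from added CO2 alone, in degrees

    for weight in (0.0, 0.3, 0.6, 0.9):
        print(f"feedback weight {weight:.1f} -> {total_warming(direct, weight):.2f} degrees")

    # For weights below 1 the chain converges to direct / (1 - weight), so the
    # assumed feedback strength, not the initial push, dominates the outcome.

With a weight of 0.3 the chain settles near 0.7 degrees; with 0.9 it approaches 5 degrees from the same starting push. That is the sense in which the output of such a model hinges on the weights assigned to its feedbacks.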

The models supporting anthropogenic global warming also tend to ignore the influence of the Sun, which is known to be a variable star. Michael Mann and company produced a global temperature chart based on tree-ring records4 extending back several thousand years. Their chart showed global temperatures holding steady for more than two thousand years and only rising—in the famous “hockey stick”—during the last two centuries. The chart effectively erased the Roman warm period, the Dark Age cold period, the Medieval warm period, and the Little Ice Age of the late Renaissance. But each of these fluctuations coincides with the 800-year cycles—400 years of rising peaks, 400 years of declining peaks—in the familiar 11-year sunspot cycles. A spotted sun is a warmer sun, because the spots are actually rips in the solar atmosphere, which normally holds back part of the solar output; they also create flares and mass ejections that release excess energy. Each of the historic warm or cold periods corresponds to a peak or a drought in sunspot activity—sometimes a total disappearance, called a “Maunder minimum,” after the astronomer who first identified it. Since the mid-1700s, the Earth has been climbing out of the minimum responsible for the Little Ice Age. Peak sunspot activity probably occurred in the 1990s, and the peaks during the most recent 11-year cycles since then have been falling off.5

So yes, the Earth’s climate does change, no question about that. But it does so in complex cycles. We have also had ancient periods of “snowball Earth,” when the planet was in the deep freeze, and more recently four great “ice ages” occurring at 400,000-year periods. None of this had to do with a weak greenhouse gas like carbon dioxide.6 I do not deny climate change: the Earth’s climate changes continually and is certainly changing now. But I doubt that human industrial activity is the only or the most important cause for the current changes, or that drastically altering our economy will have much effect in the next hundred years. Human beings have been adapting to cold and warm periods, along with rising and falling sea levels, for our entire span on this planet—and we will adapt to whatever else we might find out among the stars. We will live through whatever changes Earth throws at us in the next hundred years.

As I said, science is a human enterprise, and human beings are fallible. They make mistakes, and they are also susceptible to peer pressure and “group think.” They sometimes engage their own curiosity to study phenomena and build on ideas that are popular in their scientific circles. And not all of them are above proposing studies and shaping their conjectures to follow the grant money offered by interested parties.

I am not—no, not ever—a “science denier.” But I am a realist, and I know you have to be careful about what you believe, take for granted, and choose to follow.

1. For more on this, see my blogs Fun With Numbers (I) and (II) from September 2010.

2. For example, Stephen Hawking theorized that black holes evaporate, and the small ones that may have theoretically been created in the Big Bang must have evaporated completely. He based these conjectures on the possibility that the sudden creation of particles and antiparticles in the vacuum of space, which is supposed to be happening everywhere and at any time, along with their immediate self-annihilation, must often happen at the very edge of a black hole’s event horizon. The occasional loss of one particle or its antiparticle over the event horizon and into the hole must then draw energy and information out of the hole. And that, over time, would deplete and erase the black hole entirely. In terms of provable, knowable reality, Hawking might as well have said that pixies ate them.

3. See also Three Things We Don’t Know About Physics (I) from December 2012 and (II) from January 2013.

4. However, I always understood tree ring size to correlate more closely with available water, rather than with temperature. And hotter climates are not always dry.

5. To look at recent sunspot cycles, see the Cycle 24 Prediction from the Wikipedia page on this latest cycle.

6. One of the articles I read while scanning for our genetic-analysis equipment (and I don’t have the reference now) said that drilling into and through the glaciers in Greenland had uncovered arboreal duff—twigs and pieces of dead trees—whose genetic drift from current identifiable species was about 700,000 years. This finding suggests to me that the interglacial period before the last ice age (the one that peaked somewhere in the last 200,000 years and ended about 12,000 years ago) never completely melted the glaciers over Greenland. Otherwise, this ancient duff would have been washed away and replaced with younger stuff.

Sunday, November 6, 2022

On Sincerity

Joker’s smile

Back in the heady days of campus revolution, during the late 1960s—my college days—one of the cardinal sins in the mouths of the radically upset was “hypocrisy.” The idea was that the worst thing a person could do was profess one thought but actually believe something different. Or, more to the point, say one thing and act in congruence with another set of beliefs. The charge was supposed to expose people who were doing bad things, like quietly supporting the war or surreptitiously practicing racism, while claiming to represent peace and justice. “Liar! We see through you!”

The notion of hypocrisy has morphed and twisted but never actually gone away over the past fifty years. The modern form is “transparency.” At every level of public and personal life, a person or an organization, their motives, and their values are supposed to be open to public scrutiny. This is the origin of the “sunshine laws,” where “sunshine is the best disinfectant.” If only we could clear away the smoke-filled rooms and expose every consideration, every decision, and every deal to “inquiring minds” outside the council chamber, then bad things just couldn’t happen.

Although I am not a proponent of big government and secretive deals, I can understand that sometimes the door has to be closed. Court testimony is public, but jury deliberations—where conjectures are raised, and sometimes slanderous points are resolved—are private. Contract negotiations are privileged, although the final agreement, whether for public works or homeowner-association improvements, is public. Diplomatic maneuvers, including assurances and threats, must be secret, although the resulting treaties are public. How and why a public official might have changed his or her mind on a subject is not necessarily a matter for the public’s right to know. Maybe money passed under the table, and maybe a better argument was simply made. We can only judge people by their statements and their actions, not the content of their innermost thoughts and hearts.

During a student “colloquium” at Penn State, where the issues of the day were raised—among them, “hypocrisy”—one of my freshman philosophy professors, Stanley Rosen, stated flatly, “Sincerity is a trivial virtue.”

That has always stuck with me. Sincerity is a virtue, yes, but how important is it in our everyday lives? How much do I care whether you actually believe what you’re saying and doing? The answer is, not much. After all, Josef Stalin and Adolf Hitler were incredibly sincere in their beliefs, and they each killed millions of people—intentionally, knowingly, willfully—in order to remain constant to those beliefs. Today, do we care more that their beliefs were sincerely held, or that they caused great and inhuman damage?

If you are insincere, you may well be lying to yourself. You may also be lying to me. Lying to yourself is pitiable, while lying to me is actionable. But I would also be a fool if I judged everything people said to me as being perfectly true and sincere. Instead, I have to think about past statements, past performance, and what I know of—or can deduce from body language, facial expression, and other subtle measures—the person’s character. Nothing is certain. Everything is open to reasonable doubt.

What counts more than a person’s sincerity is their actions. Do they keep their promises, pay their debts, raise their children consistently, treat animals kindly and the wait staff with respect? Are they law-abiding? Can they be trusted? These are the things that matter. These are the virtues upon which a civil society and a stable economy are based. One person might think himself a great villain and yet remember to tip the waiter. Another might think herself a great benefactress and yet push past someone destitute sitting on the sidewalk.

We don’t judge people based on what they think of themselves, because that opinion is often secret and sometimes wrong. We judge them based on their words and the effect of those words, on their actions and the effect of those actions. I don’t want transparency into a man’s soul or to be convinced that he is actually as good a person as he thinks he is. I want to know that, if I lend him five dollars, will he pay me back as we agreed?