Sunday, November 29, 2015

Phlogiston, Aether, and Dark Matter

Having no formal training in much of anything except English literature and the martial arts, I tend to be an outsider in disciplines relating to science, religion, politics, and other art forms. That means I am not wedded to any particular doctrines or viewpoints. But I am interested enough, and read constantly if haphazardly, so that I tend to have opinions based on some knowledge, if not deep, formal study. This means that, in the eyes of the conventionally trained, I am a dilettante and a meddling fool, while to the lay person and the general public, I am more of a voice lost in the wilderness. For myself, I am a contrarian who has not drunk the Kool-Aid of the formalists.1

As a science fiction writer, this can be awkward if not treacherous. I have to know the general principles of science, understand the nature of scientific inquiry, and be able to identify faulty thinking. But unlike any formally trained scientist, I must also be able to color outside the lines and engage my imagination in speculations beyond the limits of the known. My stories have to take us from what is already proven to what is possible without veering into what has already been shown to be dead wrong.2

A rich field for this sort of speculation lies in astronomy and its co-discipline of cosmology. We humans have been able to learn a remarkable amount about the universe, its origins, and its destiny just by looking outward from the surface of this planet with our eyes, our optical instruments, and now with technologies that can look beyond the spectrum of visible light into all the frequencies of photon radiation and measure a number of flying particles as well. We can analyze these energies, both on the Earth’s surface and from probes in intra-solar space. But the more we look, the more we find questions and contradictions. It’s a place rife with speculation and theory.

One of the theories has to do with gravity. We understand it pretty well at the scales of human beings, planets, and stars—or at least we think we do. But at the smallest scale, that of the elementary particles, gravity seems to have no measurable effect. And at the largest scale, that of galaxies and galactic clusters, gravity seems far stronger than we can account for with the visible matter that makes up these objects. If we judge by the brightly shining stars we can see in the average galaxy, plus the unbright stuff we can infer from our own solar system—planets, moons, asteroids, comets, and dust, all of which are just a fraction of most any star’s mass—then our best calculations cannot account for the motion of the stars gravitationally bound in the galaxy.

With our current understanding of gravity and mass, the visible stars in a galaxy like our Milky Way or our sister Andromeda should move at varying speeds. Those in close to the center of galactic mass should trace their orbits rapidly around the dense galactic center, while those farther out should appear to move more slowly, taking more time to complete their longer orbits. This is the pattern in our solar system, where the inner planets orbit the Sun’s mass in a hurry of days or the Earth’s own year, while the outer planets saunter along in decades and even centuries. For visual effect, imagine chips of wood circling a maelstrom: those closer to the center move faster than those out on the edge.

But this is not what we see in most galaxies that have a pronounced spin. Instead of moving independently, the stars appear to move together in a flat rotation, as if they were painted in a fixed pattern on a spinning disk. It’s almost as if the mass of the galaxy increases with a star’s distance from the center. None of our calculations of the probable amounts of planets, dust, and cold icy bodies can account for this extra mass. No amount of mass attributed to a central black hole—which we now believe lies at the heart of nearly every large galaxy—accounts for this extra mass, either.
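The contrast can be put in rough numbers. Assuming simple Newtonian circular orbits around a central point mass (an illustrative sketch only, not a galactic model), orbital speed should fall off as the square root of distance, while a flat rotation curve forces the implied enclosed mass to grow with radius:

```python
import math

# Illustrative Keplerian-orbit calculation; constants in SI units.
G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30    # solar mass, kg
AU = 1.496e11       # astronomical unit, m

def circular_orbit_speed(central_mass_kg, radius_m):
    """Speed of a circular orbit around a point mass: v = sqrt(G*M/r)."""
    return math.sqrt(G * central_mass_kg / radius_m)

v_mercury = circular_orbit_speed(M_SUN, 0.39 * AU)   # roughly 48 km/s
v_neptune = circular_orbit_speed(M_SUN, 30.1 * AU)   # roughly 5 km/s

# If outer stars instead orbit at the SAME speed as inner stars (a "flat"
# rotation curve), the enclosed mass must grow linearly with radius:
def implied_mass(flat_speed_m_s, radius_m):
    return flat_speed_m_s**2 * radius_m / G
```

Doubling the radius at a fixed rotation speed doubles the implied enclosed mass, and the visible stars cannot supply it; that gap is what dark matter is invoked to fill.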

Another observation suggests this extra mass as well. All massive bodies bend spacetime—the fabric of space and time in which bodies exist and move—and cause light rays to bend around them, either a little or a lot, depending on the amount of mass. Einstein’s prediction of this warping effect was confirmed during the total solar eclipse of 1919: stars whose light passed close to the edge of the Sun’s disk appeared shifted from their known positions. Our Sun’s great mass and deep gravity well was bending the light rays from those more distant stars on their way to Earth. In the same way, galaxies that lie between observers on Earth and more distant galactic objects bend the light from those objects, so that they may appear offset from where we know they actually lie. In some cases, their shapes are distorted into a halo image surrounding the intervening body. This is called “gravitational lensing.” In all galactic cases, the amount of gravity required to achieve the effects we see is more than we can measure in just the stars of the lensing body.
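For light grazing the Sun, general relativity’s weak-field formula gives a deflection of about 1.75 arcseconds—the figure the eclipse measurements confirmed. A back-of-envelope check:

```python
import math

# Weak-field light deflection: alpha = 4*G*M / (c^2 * b), where b is the
# impact parameter (closest approach of the light ray). Illustrative only.
G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30    # solar mass, kg
C = 2.998e8         # speed of light, m/s
R_SUN = 6.957e8     # solar radius, m (light just grazing the limb)

def deflection_arcsec(mass_kg, impact_parameter_m):
    alpha_rad = 4 * G * mass_kg / (C**2 * impact_parameter_m)
    return math.degrees(alpha_rad) * 3600   # radians -> arcseconds

sun_bend = deflection_arcsec(M_SUN, R_SUN)  # about 1.75 arcseconds
```

Galaxy-scale lensing obeys the same relation with vastly larger masses, and the measured deflections come out stronger than the visible stars of the lensing galaxy can explain.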

To account for this flat rotation around a galactic core and its increased lensing power, many cosmologists propose the existence of “dark matter.” They envision this as some kind of invisible mass, perhaps a particle, perhaps not, that sheds no light and does not interact—except gravitationally—with the kind of matter that we know as atoms and their constituent particles.3

In some early theories, dark matter was proposed as a large population of normal interstellar bodies like brown dwarfs, neutron stars, and black holes that just don’t emit much light. The common name for these objects supposedly scattered in a galaxy’s farther reaches was massive compact halo objects, or MACHOs. Sometimes they are called a robust association of massive baryonic objects, or RAMBOs. But we haven’t seen evidence for enough of these dark bodies, and even when you add in a measure of heavy gases in a galaxy, you … still don’t come up with enough mass to satisfy observations.

The most common theory is that dark matter is composed of WIMPs, weakly interacting massive particles, which lie outside the Standard Model of particle physics. They fall into the category of non-baryonic matter—that is, not made of the protons and neutrons of ordinary atoms, nor akin to our other identified particles like electrons, neutrinos, and photons.4 These non-baryonic particles are also identified as “cold,” “warm,” or “hot,” based on how fast they are moving and their effect on the early universe. So, if you follow the WIMP particle theory of dark matter, rather than the MACHO brown dwarf theory, you are opting for massive particles that we cannot see or detect but that pass through every square inch of our bodies, the Earth, and everything else every second. They have no effect on us, except for their mass and inherent gravity—which would add to the gravity effects we can already account for pretty well on the local level from the masses of the Sun, the Earth, the Moon, and other nearby bodies. Uh-huh.
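That “pass through every square inch” claim can be roughed out. Using commonly quoted values—a local dark matter density of about 0.3 GeV per cubic centimeter and a typical galactic speed of about 230 km/s—and assuming a hypothetical WIMP mass of 100 GeV (all of these are assumptions, not measurements of any detected particle):

```python
# Back-of-envelope WIMP flux through a surface, with assumed canonical values.
RHO_LOCAL_GEV_CM3 = 0.3     # assumed local dark matter density, GeV per cm^3
WIMP_MASS_GEV = 100.0       # hypothetical WIMP mass, GeV
SPEED_CM_S = 2.3e7          # typical galactic speed, ~230 km/s, in cm/s

number_density = RHO_LOCAL_GEV_CM3 / WIMP_MASS_GEV   # particles per cm^3
flux_per_cm2_s = number_density * SPEED_CM_S          # per cm^2 per second
flux_per_in2_s = flux_per_cm2_s * 6.4516              # per square inch
```

Under these assumptions, something like a few hundred thousand such particles would cross every square inch of you each second, none of them interacting.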

The more I think about this, the more I am reminded of past theories about phantom substances created to account for unexpected experimental results.

Early naturalists, before the development of commonly accepted principles of chemistry and molecular structure, observed fire consuming a log of wood and reducing it to a pile of fluffy ashes, which have not a tenth of the wood’s weight and content. To account for the missing mass, they proposed that burning released an invisible substance called “phlogiston,” which if recaptured from the air could be reconstituted with the ashes into new, hard wood. Now, of course, we understand that plants and other organic matter are rich in carbon-based molecules like cellulose, which with proper heating in the presence of oxygen can be transformed into carbon dioxide gas and water vapor, leaving only the mineral residue from the wood as ash. A universal fire-like substance like phlogiston is not needed in this process.

Early physicists, before the development of quantum mechanics and special relativity, believed that light traveled in waves. Light couldn’t be a particle, because particles traveled in straight lines, like hurled rocks and fired bullets. Since ocean waves represent regular motion in water molecules, they proposed that light waves were regular motion in an invisible substance—not air, because light also traveled in the vacuum of space—that permeated the cosmos, called “luminiferous aether.” It took an experiment by Albert Michelson and Edward Morley in 1887, in Cleveland at what is now Case Western Reserve University, to show that aether could not exist. If this substance were some kind of universal fluid, then the Earth must pass through it like a fish through water. In the experiment, Michelson and Morley split a beam of light at right angles and measured its travel in two different directions. Clearly, if light was a wave in aether, then the beam traveling in the direction of Earth’s passage, and so encountering a “headwind” in the aether, would be slower than the beam crossing the Earth’s passage and so encountering no headwind. They conducted their experiment on a rotating table, to allow for all possible angular effects. When light showed no preferred direction, they dismissed aether as a substance. This opened the possibility that light was a quantized packet of energy—equivalent to a lightweight particle, the photon—that travels at a constant velocity, with a wave frequency that corresponds to its inherent energy.
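The size of the effect they were hunting can be estimated. With an effective arm length of about 11 meters (the light path was folded by mirrors), Earth’s orbital speed of about 30 km/s, and light near 500 nm, the classical aether theory predicts an interference-fringe shift of roughly 0.4 fringe as the apparatus rotates—while Michelson and Morley measured only a few hundredths of a fringe at most. A sketch of that expected shift:

```python
# Expected Michelson-Morley fringe shift under classical aether theory:
#   n = 2 * L * v^2 / (wavelength * c^2)
# for arm length L and aether-wind speed v. Approximate historical values.
L_ARM = 11.0        # effective (folded) arm length, m
V_EARTH = 3.0e4     # Earth's orbital speed, m/s
C = 3.0e8           # speed of light, m/s
WAVELEN = 5.0e-7    # wavelength of the light used, m (~500 nm)

expected_shift = 2 * L_ARM * V_EARTH**2 / (WAVELEN * C**2)  # ~0.44 fringe
```

The predicted shift was well within the instrument’s sensitivity, which is why the null result was so damning for the aether.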

Now, I may be indulging in faulty reasoning here. But if the phantom substances phlogiston and aether—materials created to explain our imperfect understanding of then-current observations—were later proven wrong, then might not the phantom WIMPs of dark matter also be an easy first approximation of a much more complex situation? In this I am reminded of the scene in one of my favorite movies, Aliens, where the slacker soldier Hudson is tracking the predator aliens with a motion detector. When the device shows them actually inside the room where the group has barricaded itself, Hudson bawls, “Hey, this thing ain’t reading right!” And the more dependable Corporal Hicks replies, “Maybe you aren’t reading it right.”

Maybe we aren’t reading the nature of gravity right. Other approaches have been proposed to account for the missing mass on a galactic scale. One is that the mass might be found in extra dimensions, but this is hardly better than massive, non-interacting particles. “Other dimensions” is becoming the broom closet into which physicists sweep any inconvenient observation that does not jibe with current theory. A possibly better approach is that the birth of the universe may have distorted and deformed the shape of quantum fields as we understand them. This is getting closer to an admission that the universe is strange and we haven’t worked out all the strangeness yet. A third approach is to admit that, as Einstein’s theories about time, space, and gravity enlarged upon those of Newton, perhaps at the galactic scale we are waiting for a new physicist to propose and prove further modifications to, and a refinement of, our understanding.5

I, for one, am more comfortable with that—with knowing our knowledge is incomplete and waiting to be furthered—than with accepting the existence of an invisible particle we can neither detect nor measure. I don’t begrudge the astronomical community spending money on the search for this invisible particle, just on the off-chance it might exist. But I’m not about to become wedded to the idea as some kind of orthodoxy.

Our understanding of the universe is still in its infancy. As a sapient species, we have a lot of growing up to do. I can accept that.

1. My first publisher, Jim Baen, who was a contrarian himself, advocated this position. “Contrarians always win,” he once told me. My experience has been that, at least, they don’t get trampled by the crowd.

2. When I was first starting out as a writer, I submitted a short story—one of my very few, for I am by trade and inclination a novelist—about a harmless-looking glass vacuum bottle, such as you might use for hot coffee or soup. The point was that the cap was stuck on tight and, when the thoughtless protagonist finally succeeded in opening it, the vacuum inside began sucking the air, and then all the matter, out of first the room and then the universe. When the magazine returned this story, the kindly editor told me that science fiction about dead theories like luminiferous aether just doesn’t work. Being adaptable and easily trained, I have since been careful—or at least I hope I have—not to roam into the dead lands of bogus science.

3. Astronomers also speak of “dark energy,” which has to do with the fact that the universe seems to be expanding far more rapidly than you would expect from the impetus given by the Big Bang. Dark energy outstrips the tendency of the universe’s mass—even accounting for that dark matter—to pull everything back together. But let’s save dark energy for another time.

4. Ironically, the term “baryon” has its roots in the Greek word for “heavy.” So while normal baryonic matter is “heavy” stuff, this non-baryonic matter—which supposedly accounts for most of the mass in the universe—is the “non-heavy” stuff. Weird.

5. See my complementary blogs “Three Things We Don’t Know About Physics” from December 30, 2012 and January 6, 2013.

Sunday, November 22, 2015

A Spectrum of Self-Awareness

A recent article from Science Alert1 suggests that animals must have some level of self-awareness in order to project different choices into a future state and make decisions. For example, rats in a maze will pause and consider different possible pathways. Without some sense of self, the animal would not be able to distinguish a projected choice from the memory of an actual experience. This suggests that awareness begins with a sense of the difference between the knower and the known. One might also consider this sense the starting point, the far end of the spectrum, of self-awareness.

Finding that point is useful to writers and thinkers, like me, who ponder the human condition and the issue of creating a mechanical intelligence that has human-like capability.2 Certainly, artificial intelligence is more than simply a test of raw computing power or problem solving. Awareness and reflection—and the self-doubt and second-guessing to which they can lead—would only get in the way of straight-line computational speed and accuracy. But when we look for a mimic of the human mind, we look for that elusive quality of self-awareness.

This is more than the ability of a machine to create verbal responses that a human being cannot distinguish from those of another living intelligence—the Turing test. We already have programs that can do that, and they are not particularly intelligent and certainly not self-aware.3 For a program to become intelligent on the terms we mean by “artificial intelligence,” it would have to be able to distinguish among past experiences, current operations, and future projections. It would have to be anchored in time, as human beings are.

We recently learned about a friend’s cat who had lost the use of its eye in a fight and would soon lose the eyeball as well. My immediate reaction was that most animals adapt to these injuries rather well because they are not burdened by the existential angst of considering their maimed and reduced state: “Oh, no! I’ll never see properly again! And, oh, I look so hideous!” If a cat that loses its depth perception because it has lost an eye even thinks of its handicap at all, it would be with a sense of befuddlement that arises only when the loss becomes immediately apparent: “Gee, it used to be easier to gauge the height of that cabinet. But this time I fell short. What gives?” This is akin to the puzzlement of an animal that’s used to going in and out of a “doggie door” that suddenly becomes blocked: “Gee, there used to be an opening here!”

Animals may be able—like the rats in the Science Alert article—to distinguish between present decisions and past experiences, but they don’t live in a conscious web of time. They don’t live in much of a past, other than to know that the cabinet used to be of a more accurate height or the doggie door used to swing open. Perhaps a dog, seeing its owner start to fill the tub, can recall the unpleasant experience of bath time, or seeing the owner put on his or her coat and reach for the leash, can anticipate the joy of a walk. But they don’t delve into and reflect on past circumstances and actions, and so they can have few regrets … although a dog can experience a limited sense of shame when the owner catches it in known bad behavior and will live with that sense of self-doubt or self-loathing for a few minutes—or until something distracts the dog, like a good shake. Similarly, the animal does not have much, if any, sense of the future. The anxiety the dog displays when its owner leaves the house, taking neither the leash nor the dog, is the generalized anxiety of separation from the adopted pack leader, not an internal estimation of hours or days left alone by the window or the possibility of absolute abandonment.

This opens a new thought, however. How could our dogs, cats, horses, and other pets love us—which I’m sure they do—without a primal sense of self-awareness? The affection they feel is more than just the instinctive following and responding to a known food source upon which the animal has come to depend. The relationship between pet and human includes trust, play routines, demonstrated affection, and emotional involvement—all of which require some sense of self. The animal can distinguish between its situation and that of another independent being. It forms a bond that reflects its own internal emotional state, an awareness of the other’s emotional state, and a sense of the difference between lover and loved one. This is analogous to understanding the difference between the knower and the known.

The IBM computer program “Watson” could compete at the television game Jeopardy! because it could explore the nuances of language, word meaning, historical relationships, causality, and human concepts such as puns and word-play. It had command lines that would drive it forward through its comparative processes. And it had weighting factors that would help it decide when it was detecting a relationship based on logic, historical connection, or word association. It had incredible skill at answering the riddles on the TV show, and, if one thinks about the current company product offering called “Watson Analytics®,” these same techniques are now being used in mining complex data and answering human-inspired questions without specific programming or resort to a structured query language.

But is the Watson machine aware? Does it know that it’s a Jeopardy! champion? And if someone were to tell the program of this fact, would it distinguish between its new status and any other fact in its capacious database? That is, does Watson know what Watson is? Can it know this in the same way that a dog knows it’s different from a chair or another dog or a new human being, and so can place itself in a mentally or emotionally projected relationship of trust or fear, pack dominance or dependence, fight or play, in relation to other animate beings? … Right now, I don’t think so, but at what point might Watson cross over to thinking about itself as a unique identity?

We humans live in a web of time. We have a past, present, and future—and invest a lot of our brain power and emotional stability in examining ourselves in all three temporal domains. We experience exquisite existential questions involving complex tenses represented by the words “might have,” “must have,” “could have,” and “should have” in dealing with the past; “may,” “must,” “can,” and “shall” in the present; and “might,” “must,” “could,” and “should” in the future. We can see ourselves in the past in relation to the deeper past when we employ the pluperfect tense (as in “I had been tried for murder”), and we can anticipate a possible but not certain condition with the future perfect (as in “I will have been tried for murder”). We swim in time as a fish swims in water, and all of this relates to a being we know, understand, and study as ourselves, our own condition, our cherished relationships and advantages, our perceived qualities and shortcomings, our known failings and our expected about-to-fails. We can also extend this awareness to situations and people outside ourselves and imaginatively outside our experience: other people’s strengths and weaknesses, what they did in the past, what they will think and do in the future, and how we did and will relate to them.

Can a computer think like this? Most programs and most processors exist in the now. Ask them to solve for x, and they will respond “x equals three.” Ask again two minutes later, and you get the same result, taking all the steps to arrive at the solution as before. Only if some human programmer has added the capability to assemble a database of previous problems and their solutions, plus a loop in real time that asks if the problem has been encountered in the past, will the program consider its own history. I don’t know if Watson has such a database and loop, but it might save a lot of time—particularly if the database preserved not just identical problems but parsed them into patterns of similar problems and their possibly similar solutions. “Oh, I’ve seen this one before! I know how to do it!”
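That “database and loop” can be sketched in a few lines. Programmers know the pattern as memoization; the stand-in computation here is just a toy:

```python
# A minimal sketch of the "consult past experience first" loop.
def expensive_computation(problem):
    # Stand-in for a long calculation; here, just sum the integers below n.
    return sum(range(problem))

solved = {}  # the program's "database of previous problems and solutions"

def solve(problem):
    if problem in solved:        # "Oh, I've seen this one before!"
        return solved[problem]
    answer = expensive_computation(problem)
    solved[problem] = answer     # remember it for next time
    return answer
```

The second call with the same problem skips the computation entirely. Extending the lookup from identical problems to patterns of similar problems, as suggested above, is the genuinely hard step.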

The next step would be to program the computer to use its excess processing capacity for posing its own problems, possibly leveraging past experience, solving them in advance, and entering the patterns into another database to be consulted in potential future encounters.4 All of this could be done at about as much cost in processing power as operating an elaborate graphical user interface. But would the computer then have a sense of time approaching human-scale awareness? Probably not. We would still be dealing with a Turing-type simulation of awareness.

So, are we humans at the other end of the spectrum of self-awareness? The rat, the cat, and the dog are just beginning to perceive their own states as knower separated from the known. The more advanced species like dolphins can begin to identify themselves in a mirror; apes can recognize words as relational objects, appreciate social relationships and obligations, and communicate new ideas to other members of their troop; and elephants can draw pictures in two dimensions of objects in the three-dimensional world. So are we humans—the only creatures with which we can communicate in complete sentences for the sheer pleasure of this complex intellectual play—the end state of awareness?

I try to think of a more advanced intelligence—and that’s another part of a science fiction writer’s job. It wouldn’t have just better technology or a faster or more complex recall of data. It might become godlike in terms of how human beings imagine their gods: able to perceive past, present, and future as one continuous and reversible flow, because it stands outside of time; able to know all the answers to all the questions, because it invented both knowledge and questions; able to command space and wield infinite power, because it can apply all the possible mathematical formulas and manipulate their consequences in terms of mass and energy. But is this scope of capability actually better than human awareness? Wouldn’t standing outside of time imply an awareness caught in one permanent, eternal now? Wouldn’t absolute knowledge foreclose all possibility of wonder, desire, and choice? Doesn’t complete control of space, mass, and energy suggest an explosion of energy becoming mass reminiscent of the Big Bang? Gods are not superior to human beings because they are fixed in their potential. If they are at an end point, it is an eternal and unchanging stasis. And what fun is there in that?

No, if the rat in the maze is at the delicious beginning of knowing the difference between “I’ve seen this corner before” and “I wonder what’s around that corner,” then we humans are at the beginning—but nowhere near the end—of discerning, defining, deciding, and determining the shape of the maze for ourselves. And that’s a powerful place to be.

1. See Fiona MacDonald, “Humans aren’t the only animals that are self-aware, new study suggests,” Science Alert, June 18, 2015.

2. See, for example, the story line of ME: A Novel of Self-Discovery and certain motifs of the near future in Coming of Age.

3. See Intelligence or Consciousness? from February 8, 2015.

4. Something like this is part of the SIPRE approach to defensive driving in anticipation of the React step. See SIPRE as a Way of Life from March 13, 2011.

Sunday, November 15, 2015

Vegetable Vampires

A recent article in Science magazine1 describes the mechanism by which plants of the genus Striga, including the pretty, lavender-flowered witchweed, live parasitically off other plants.

The host plants, including crops like cotton, corn, and sorghum, release tiny amounts of chemicals called strigolactones into the soil as they grow. For the host, these chemicals are both a growth hormone stimulating the root system and also lures to attract and stimulate fungi, mycorrhizae, that provide nutrients from the soil like nitrogen and phosphorus. Any Striga seeds that lie dormant nearby also sense these chemicals, often at concentrations even lower than the host plant and its fungi can detect. The strigolactones stimulate the witchweed seeds to germinate and then send tendrils to penetrate the host’s root system, sucking out nutrients before they reach the host itself and so blighting the crop. Each witchweed plant then produces up to 100,000 tiny seeds that disperse into the soil and wait, sometimes for decades, until another host is planted.

If that mechanism is not the perfect definition of botanical vampirism, I don’t know of one that would fit better.

Striga is a problem in Africa, where the plant apparently evolved in relation to sorghum, but it also affects crops in Europe and the United States, although to a lesser extent. The article in Science describes how researchers are studying the genes for strigolactone receptors in an effort to control infestations of witchweed and similar parasites.

As I read the article, it occurred to me that a 19th-century naturalist might have learned of such a relationship between host and parasite and then meditated on the grandeur and intricacy of God’s creation. My own reaction was similar but deflected. I meditate on the power of evolution represented by the DNA molecule and how persistent, random mutations can lead one plant to detect the growth hormones of another and turn them to its own uses. The difference, the deflection, is that I cannot think of a god or any intelligently motivated designer who would think up such a horror as witchweed, ichneumon wasps—which lay their eggs in the larvae of other insects—and similar cases of parasitism and still consider them part of a benevolent world. A sensible god wouldn’t make witchweed, because it adds no glory or benefit or abundance to the world. A random force like evolution would make witchweed for the simple reason that it can.

This isn’t symbiosis or communalism or any of those beneficial relationships we learned about in high-school biology. There, for example, the bacteria that live in my gut consume undigested carbohydrates in the food I eat and benefit from the warm, moist environment I provide, while I benefit from their chemical processes that manufacture some of the vitamins I need, support my immune system, and protect me from more harmful bacteria. In contrast, the witchweed does nothing for the corn or sorghum plant but instead robs it of the nutrients its roots work to obtain from the soil and diverts them to the witchweed’s own use.

This is like the parasitism of the bark beetles, corn borers, and other pests which burrow into a plant and feed on its substance. We humans can understand this kind of attack, however, because it’s not too different from a logger cutting down a living tree for lumber or a farmer stripping the ears off a corn plant for food and then baling the stalks for animal forage or burning them for fuel. In the wilds of nature, every animal feeds on either plants or another animal species, and the plants feed on the soil. This seems right and natural. The only time we humans get upset about this arrangement is when the bark beetle kills trees we plan to take for lumber ourselves or ones we cherish for their beauty and their shade, or when the corn borer robs us of a crop we plan to eat.

But I find the parasitism of the witchweed chilling. It’s not a member of the animal kingdom, like a hungry beetle or a caterpillar that will one day grow into a pretty little moth. This is one plant feeding on another, and not like a fungus growing on the underside of an already dead log. This is Dracula sucking the blood of a sleeping maiden.

In most cases of parasitism, we can eventually hope to achieve a state of balance. If the corn borer worm kills off all the corn in this field and the next, then the moth has no place to lay her eggs and the local population of the species Ostrinia nubilalis dies out. If the Ebola virus, which hijacks a cell’s genetic machinery to make copies of itself, is so aggressive that it bursts the cell membrane and then destroys so many cells that the human host dies, that particular strain of the virus dies out unless it can infect another host within hours or a few days, depending upon conditions. Most successful parasites either throttle back their aggression so that the host lives—think of the common cold, which manages to infect us again and again with new strains—or the parasite maintains a life cycle slightly less prodigious in terms of physical size or population than the host’s, so that new victims are always available.

Witchweed avoids this limitation by sowing seeds that can lie dormant for decades. If it kills off all of its hosts in the field, it can wait patiently until a new host arrives tens of generations later. This is the vampire that can sleep in its crypt for, comparatively, centuries.

I challenge any incipient gods out there to conceive of a set of tricks like this: mimicking the chemical receptors used by another plant, converting those chemical growth signals into both a trigger for germination and a tracking device to guide the parasite’s root structure sideways toward the host, and finally adapting its needs so that it can feed on nutrients manufactured by another species. Now couple all that with the ability to produce tiny seeds that can lie dormant for decades. It’s a lot to accomplish through random mutation and genetic drift.

Considering the complexity of the witchweed’s development, one might almost be tempted to think that an inventively inclined god must have had a hand in it. But as a convinced evolutionist, I know that for every Striga alive and functioning in the world today, millions of generations existed before it that had none, or one, or only some of these characteristics. The genus Striga had all of history since the first seed-bearing plants developed, about 390 million years ago, to learn its tricks. Some of the characteristics may have been immediately useful, as soon as the genetic mutation allowed them to occur. Others may have lain dormant inside the plant’s genome, like the seed waiting in the soil, until this mutated protein became useful in combination with those others to create one of the witchweed’s survival tricks.

Evolution is a long, slow process. It is not an act of creation but rather of accretion, of putting together genetic puzzle pieces, and adaptation, of leveraging the resulting proteins when accident finds a fit with the natural surroundings. If that fit works and contributes to the organism’s survival, then the genetic configuration and its protein products are preserved and become the stem for more random revision and development. The only rule concerns what can or cannot survive, given the current environment and its available opportunities.2

Working on evolutionary principles, the universe presents us with more possibilities than any sentient intelligence, no matter how powerful, can ever imagine. The universe is full of surprises. And that gives me hope, even if some of the surprises turn out to be nasty ones, like the vampiric witchweed.

1. See “How crop-killing witchweed senses its victims” by Elizabeth Pennisi, Science, October 9, 2015.

2. For more along these lines, see Evolution and Intelligent Design from February 24, 2015.

Sunday, November 8, 2015

Pounding the Slab

A couple of times a year, I take one of the motorcycles1 up to the Sierra to ride the foothills and high country with my son’s father-in-law. From the Bay Area, this means crossing the Central Valley, which for the most part means riding the freeway in the early morning or late afternoon. And on my pleasure outings and errand rides around the Bay Area, too, I spend a large fraction of my time on the highway. I call this “pounding the slab.”

Riding back roads that twist and turn through the hills is a test of skill and judgment. Your mind is occupied with holding a good line through each curve, applying the right amount of throttle and brake, paying attention to oncoming cars and straying wildlife, and simply enjoying the kinetic sweep, roll, and dive of a powerful machine negotiating random angles and trajectories.

But pounding the slab just takes a measure of watchfulness—of surrounding cars, cracks and potholes, and trash on the roadway. You can do it all with a six-second sweep of the path ahead and the mirrors on each side. And sometimes that sweep extends to fifteen and thirty seconds or more if the highway is relatively empty.2 For most of the time, my brain is on autopilot, exercising basic balance and control, employing subconscious SIPRE awareness,3 and … noodling.

Noodling may mean nothing at all. I’m enjoying the sunshine, the motion of the bike, the clouds in the sky, the feel of the wind. Sometimes, songs play in my head. I will pick up the throb of the motorcycle’s exhaust and the hum of the tires as low organ notes, usually with a subtly embedded rhythm, and compose wordless arias to match. It isn’t music you can reproduce, but it sings inside me.

I used to try to listen to music through speakers mounted inside my helmet. This isn’t illegal, the way ear buds are, because you can still—theoretically—hear traffic sounds, horns, and the bullhorn instructions of approaching police cars. But the problem is that I can’t hear the music. My favorites are usually classical symphonies,4 early rock,5 ballads,6 or movie soundtracks and show tunes—and all of them, like most sensible music, include both loud passages and soft. With wind noises coming off the helmet’s geometry, plus the bike’s exhaust and the rush of traffic, the soft parts completely disappear unless you crank the volume until the loud parts blast beyond recognition. Better to ride unfettered and enjoy the song inside my head.

More often these days, I take a problem that I need to think about and let it run through my mind while cruising at 70 mph. Usually, this is something to do with the next part of the novel I have in hand: working through a plot problem, or groping for the beginning image, effect, or line of dialogue that will kick off—I call it “the downbeat”—the next scene I am going to write.

You might think those Shakespearean coincidences, mistaken identities, cross purposes, and avowed intentions that you find in any interesting play or novel just came to the Bard while he held a quill in his hand, or that they flow naturally while I’m sitting at the keyboard doing what I call “production writing.” Sometimes they do, because the mind is a surprising mechanism when it gets going in the flow.7 But more often these plot points and twists arrive out of the dark, or from your blind side, while you’re driving, taking a shower, falling asleep or just waking up, making dinner, or doing something else completely unrelated to writing. If you’re lucky, you have paper and pen at hand to write them down. If not, I’ve learned to think the idea through in a comprehensive way, compress it as if I were making notes on paper, and assign it a key word that I will remember later. Then, with pen in hand or at the computer, I recall the word and the idea unfolds like a flower. It’s a neat trick for when you’re out on the slab doing 70 mph.

Sometimes, also, I can “prime the pump” by putting a question to my subconscious mind—such as “Well, how will the evil mastermind lure the fair damsels into his lair?”—right before I get in bed, step into the shower, or set out on my ride. Then, distracted by the business of falling asleep, lathering up, or navigating my route, my mind will sometimes churn the problem and toss back its own solution. Maybe other writers can sit down and brainstorm these issues with themselves, or open the Big Book of Plot Complications8 and find something suitable in there. But for me, it’s noodling the problem while conducting the business of life.

And for the rest, my time on the road is spent just pounding along over the pavement, avoiding cracks and potholes, and dodging trucks. Oh, and enjoying the clouds and the wind.

1. See The Iron Stable for the latest inventory.

2. And yes, traveling at my accustomed speed—which is usually a bit over the limit, so that I don’t get run down by a Volvo or a Prius on the California highways—I have sometimes been surprised by a member of the CHP motorcycle patrol, who usually sweeps by me going a good deal faster. I swear, everyone out there is doing 75 to 80 mph in a 65 mph zone.

3. SIPRE is a defensive driving technique that breaks the driver’s mindfulness down into five independent actions: Seeing, Interpreting, Predicting, Reacting—by which I mean choosing from a set of preplanned and practiced responses—and Executing. For more on this, see SIPRE as a Way of Life from March 13, 2011.

4. Beethoven, Brahms, Dvořák, Grieg, Rachmaninoff, Sibelius, Vaughan Williams—all the Romantics.

5. The Beatles, Fleetwood Mac, Journey, REO Speedwagon, Starship—back before popular music became just disorganized noise.

6. Judy Collins, Clannad, Neil Diamond, Enya, Loreena McKennitt—voices that can invoke a sense of time and mystery.

7. See Working with the Subconscious from September 30, 2012.

8. And if you know of a copy, please send it to me.

Sunday, November 1, 2015

If I Had to Live Abroad

If I had to live somewhere other than the United States—and I really have no plans to leave—then I would choose Italy as a place of refuge and exile. Of all the countries of my admittedly limited travel experience, I found Italy to be the most livable.

When we were traveling in the late 1980s to mid-’90s, my wife and I spent several weeks each on different occasions in England, France, the Netherlands (with a side trip into Germany), and Italy. And of the four, the friendliest, most spontaneous people—despite the language difference1—the easiest travel and accommodation arrangements, and the least day-to-day hassle were to be found in Italy. The art, architecture, history, and the food weren’t bad, either.

I know, the Italian government is in a perpetual state of chaos, the economy is in virtual collapse, and it takes forever—plus, it is rumored, a certain amount of discreet emolument—to obtain any kind of licensing or official action. But the Italians always seem to survive. And they do it with a cheerfulness, a flair, a sense of taste and style, that we earnest, plodding Americans can only admire.

Somewhere near the end2 of his massive history, The Rise and Fall of the Third Reich, William L. Shirer compares the German people’s attitudes toward Hitler and the Nazis with those of the Italians toward Mussolini and the Fascists. The Germans, Shirer suggests, belonged to a relatively young culture that only started to come together in the middle of the last millennium, and they believed implicitly in Hitler and his promises of national glory. But the Italians, having survived in a culture dating back two thousand years and more, which had gone from empire and world domination, through abandonment and barbarian invasion, the Renaissance and the Inquisition, territorial partition and foreign occupation, then revolution and reunion, were a relatively older and more sophisticated people. They tolerated Mussolini and his thugs so long as they provided jobs and made the trains run on time, but the majority of Italians never believed in him.

When your folk wisdom goes back two millennia and comprises the worldview of Roman legionaries, Gothic nomads, Byzantine clerics, Papal intriguers, Medici connivers, Carabinieri bravos, and Garibaldi patriots—not to mention the earlier influence of Greek colonizers and Phoenician traders—your personal attitudes become warm and rounded, like the stones in a fast-flowing river.

Although Italy is a constitutional republic with multiple parties, I’m pretty sure most of the people in office are some kind of socialist, and not a few are probably doctrinaire communists. But despite whatever private beliefs a man or woman in modern Italian politics might hold, I can’t see anyone seriously trying to move the country toward the sort of top-down, command-and-control economic system, with state control of all industry and commercial outlets, that the Soviets and the Cubans have tried and failed at, the Venezuelans are trying and failing at, and the Chinese once tried and have now abandoned in all but name. For one thing, the Italians seem to be pretty bad at paying and collecting taxes. For another, they aren’t much better at respecting and obeying government edicts and regulations. The country seems to have two kinds of law: the one we honor, and the one we follow. Everything else is negotiable.

It’s a system that works, more or less. And it’s one that no clique, or party, or national movement will ever co-opt in order to lead the average Italian into some kind of far-off utopia.

Consider the food supply. In the U.S., our food production and distribution system is a wonder of biology, chemistry, market reach, and logistics. For example, when was the last time you or anyone in your family seriously considered the growing season? You want apples? Oranges? Avocados? Do you wait for the local harvest? No, you go to the grocery store. If you’re not buying apples that were fresh picked from the harvest in Washington State in the autumn, you’re getting apples that were cleverly stored for months in the right atmosphere at the right temperature—or were fresh picked from the harvest in Chile, where the seasons are reversed. Our growing, packing, and preserving techniques mean there’s not a flavor or a taste for which you have to wait a minute or a month to enjoy.

The Italians can buy into this same bounty, of course. But when we visited the country, I also saw something uniquely Old World. From the train window as we traveled down the Po Valley, I could see small, obviously family-owned farms with one kind of crop growing in the fields, but also herb and vegetable gardens in the dooryard and fruit trees splayed out against every south-facing wall.

In Florence, we ate at a fine restaurant called Il Latini, where the dining room had hams, cheeses, and salamis hanging from the ceiling and shelves filled with jeroboams of wine. When I asked the waiter about these foods on display, he said they came from a farm owned by the family that ran the restaurant, and that everything on the menu came from there. When your land has been overrun through the ages by the Vandals and Goths, the Austrians, the Spanish, the French, the Germans, and finally the Americans, you like to know—on a personal, familial, obligational basis—someone who has a wheat field and knows how to mill grain, a vineyard and knows how to press wine, and a pig sty and knows how to butcher and cure a hog. It’s just a matter of survival.

When I came to Berkeley in the 1970s, I became aware of “riot architecture.” That is, banks and stores along Shattuck and Telegraph avenues would have brick and cement fronts, and any windows would be narrow and high up. The free-floating protests of the 1960s had changed the way landlords and renters thought. But this style has nothing on the architecture we saw in Italy. In the oldest buildings, dating from the late Middle Ages, you don’t see any windows on the ground floor, and all those on upper stories have real, working shutters in solid wood.

In Rome, in the neighborhoods around the Piazza Navona, you see medieval remnants of the old Roman insulae: whole blocks turned into a single four- or five-story building. The outside, at street level, is all small shops—actually niches cut into the exterior wall—with no access to the building’s interior and protected at night with a roll-up door of steel slats.3 You enter the building itself through just one carriage-wide entry protected by a massive gate. The entire life of the building—which is usually cut up into several apartments, sometimes including a small hostel, a bed-and-breakfast, or an old convent occupying a number of rooms—turns inward to the central courtyard and interior balconies. It’s a huge lifeboat where, at night or in times of social unrest, you can close that gate and the shutters, roll down the shop doors, and survive for days or weeks at a time on your own well water and what’s in the larder.

In Florence, in the early ’90s, I got to talking with the old man who ran the cambio, the currency exchange, underneath the steps at the Uffizi Gallery. A proposal was in the air at the time for dividing Italy into separate countries north and south, and I asked how he felt about that. The man shrugged. “I’m a Tuscan,” he explained. Then he pointed up and down the narrow, cobblestone street. “In fact, this is my street. I would put a chain up at either end.”

You might chide the Italians for being backward, almost tribal, in this way. They have a traditional view of family, property, and personal obligations that is at odds with our modern, cosmopolitan, globalist perceptions and tendencies. And yet, for the most part, the Italians welcome visitors and will make accommodation for almost anyone who stops by—for in their long history they have seen almost everyone come through their country and either be assimilated or moved gently along.

I don’t know whether this is the worst form of primitivism or the most advanced form of sophistication. But I like it.4

1. My native language is English; I took six years of French in junior high and high school; and everyone in the Netherlands seems to speak English as a strong second language. Yet the Italians seemed to make a greater effort to make us feel at home, and they blossomed with what English they had if you just approached them with a shy smile and a cheerful Buon giorno!
       I remember an exchange I had with a security guard at the Sforza Palace in Milan. He had no English and I had no Italian, but I was intensely curious about the rows of square holes I could see on the inside face of the stone walls. Were they for ventilation? Openings to some kind of internal structure? With my pocket dictionary, much flipping back and forth, and a bit of pantomime on his part, we determined they were left over from original construction. Rather than erect a scaffolding, the stone masons just laid beams in the wall as the courses rose. When they were finished, they drew out the beams and left the holes.

2. I can’t find the exact quote now, so I’m paraphrasing here.

3. I saw these same roll-up door and window coverings on many individual houses in other cities. It may be nice to protect your home with an electronic alarm—but better to raise a shield that takes a battering ram or an acetylene torch to breach.

4. In fact, I borrowed this style of social organization—personal, familial, and obligational—as the eventual solution to some of our current problems in my novels of the near future, Coming of Age.