Sunday, April 29, 2018

The Immediacy of Life

Motorcycle on a curve

I’ve been riding a motorcycle—on and off but more on than off—for forty-four years. It is a whole body experience, much more than driving a car or even riding a bicycle. A motorcycle is like Harry Potter’s broom, moving with the speed of thought and requiring you to gauge every curve, every obstacle in the road, the nearness of every car, the gap between fenders if you are splitting lanes, and a dozen other things that happen practically all at once.

One of the things motorcycle riding teaches you is to perceive and move now.1 When you are approaching a curve at fifty miles an hour, and your balance—your very life—depends on making the correct adjustments to throttle speed, front and rear braking, handlebar position and force inputs, distribution of your body weight, choice of apex and your route through the curve to stay in your lane and avoid obstacles like gravel, potholes, and dead animals … you don’t have time to sit down and figure things out. In three seconds or less, you are going to be committed to making the maneuver. If you try to stop, jam on the brakes, or in any way reject the oncoming maneuver, you are going to be in a much worse place, with possibly catastrophic results, than if you miscalculate or misapply any of the above variables.

In such a position, you cannot wish that you were anywhere else, or that the choices before you were something other than what they are. You don’t have the luxury of time to decide. You don’t have the option of a do-over, or taking a mulligan, or apologizing to the gods of speed, balance, gravity, and tire adhesion. You are in the maneuver, and all you can do is face it, make your best approach, and ride it through.

This is one of the things that makes motorcycle riding so exhilarating, so much more a confirmation of life and skill than driving a car or riding a bicycle. You are in the moment, your life is on the line—figuratively and literally—and you have no choice but to succeed. If you stopped and thought about it, paused to dwell upon the consequences of a bad maneuver in terms of broken bones and abraded skin, if not concussion and death, you would probably not be there. But since you are there, you must put those thoughts and fears aside, swallow your heart, and ride it through. The moment is now, and the action is irretrievable.

There must be other experiences that match this: perhaps landing an airplane, hitting a fastball, throwing a long pass, or blocking a punch. These actions all have points in time that require you to commit to your skill and follow through. But even in these moments, you know that you have options. The airplane pilot can always apply power at the last second before the wheels touch down, fly out of a bad landing, and go around for another pass. The hitter with two strikes or fewer can usually let the fastball go by and wait for another pitch that might be better. The quarterback can fake the pass and try a running play. The fighter can duck his head or even let the blow land, because a punch to the face is not likely to be lethal.

But life itself offers us many immediate and irredeemable moments. These are sudden choices, like rounding a curve at speed on a motorcycle, like facing a collapsing bulkhead in Lord Jim, where we have only a second or two to make a decision, with consequences that may affect and alter the rest of our lives, and then ride it through.

In the moment that you are offered a deal—like landing a new job that will require you to relocate, or a contract whose fulfillment will call for more effort or time or money than you were prepared to give, or a new house that is more space and upkeep than you planned for, or a mortgage whose monthly payments are more than your budget allows—you have a sudden decision to make. Sometimes you can say, “Let me think about it,” and go away to discuss the situation with your business partner, your family, your wife. And you believe you can trust that, when the offering party says, “Okay, you think about it,” the parameters of the deal, or the deal itself, won’t change while you’re doing your thinking.

This is the “fish or cut bait” moment. This is the point of commitment. And an honorable person knows that if he or she commits to the deal, there can be no backing out. A person’s word, even if not backed up by oaths, is a bond; a signature is binding, and so is a handshake. The coward or the games player thinks of after-the-fact alternatives: that if things don’t work out, he or she can always get a lawyer, invoke an escape clause, annul the marriage, or just walk away from the obligations and the payments and let a court sort out the breakage. But the honorable person doesn’t have such thoughts. The commitment of a moment, made with a lifetime of careful planning and measuring, is binding for the term of the agreement—or until death do us part.

In this I am reminded of the assessment of managerial style that the God Emperor Leto II delivers to his majordomo Moneo Atreides in Frank Herbert’s God Emperor of Dune: “The difference between a good administrator and a bad one is about five heartbeats. Good administrators make immediate choices. … They usually can be made to work. A bad administrator, on the other hand, hesitates, diddles around, asks for committees, for research and reports. Eventually, he acts in ways which create serious problems. … A bad administrator is more concerned with reports than with decisions. He wants the hard record which he can display as an excuse for his errors. … Oh, they [the good administrators] depend on verbal orders. They never lie about what they’ve done if their verbal orders cause problems, and they surround themselves with people able to act wisely on the basis of verbal orders. Often, the most important piece of information is that something has gone wrong. Bad administrators hide their mistakes until it’s too late to make corrections.”

That is, the good administrator can make a decision, ride through the curve—and have the honorable intention to accept and deal with the consequences. A bad administrator cannot make a decision and will either crash or absent himself—wish himself elsewhere—at the moment of impact.

Life itself offers all of us such moments. How we deal with them is a true test of character.

1. See SIPRE as a Way of Life from March 13, 2011.

Sunday, April 22, 2018

Life as a Mad Scramble

Word pile

Because I had already written one novel in high school—not a good one, a derivative space opera, but still it was a complete story in 472 double-spaced pages—I determined to go into English literature as my major at the university. I believed that, when I got out, I would become a novelist, a fiction writer, in order to make my living.

Most English majors around me at the university wanted to become teachers, either at the high school or college level. To teach in grade school or high school, they would need teaching credits from the College of Education, as it was then called at my university. To teach at the university level, they would need to take a master’s degree and then a doctorate to become a full professor with tenure rights. But this matter of learning English literature in order to teach English literature seemed self-perpetuating to me, like learning Japanese swordmaking in order to teach others to make Japanese swords. So long as society felt a need for students and the adults they would become to have a grounding in the literature of their civilization—or an in-depth knowledge of the artistry of swordmaking—the process might be self-sustaining. But let the faith in either literature or historic arms fade, and the teaching paradigm collapses.

But I was on a different path, learning to write novels. I actually proposed as my senior thesis to write a work of fiction instead of some scholarly dissertation, and my advisor agreed to consider it. But when the time came, I didn’t have an idea for a long story or novella. And when it came time to graduate, when in my dreams I would step away from the university and begin writing the novels that would make my reputation and earn my living, I discovered I didn’t have much of anything to say. That one novel back in high school, the story I had been dying to tell, appeared to have stood alone in my mind.

On a feverish Saturday morning in the winter term before my graduation, I actually went across the street to the building that housed the School of Journalism and found a professor in his office. I described my situation and asked about the prospects for an English major in journalism. He was kind but also, of course, amused. In his world, learning to become a journalist took years of training and practice—and this was back in the day when print ruled supreme and reporting with the written word counted for everything, while the mechanics of radio and television were taught as an interesting sideline, and the internet had not even been invented or imagined. He suggested that I should have started back in my sophomore year, switched majors, and dedicated myself to learning the reporter’s trade.1

In the end, I had the good luck of having impressed two of my professors. One recommended me to an open position at the university press, and the other immediately told me to bring my first editing assignment over to his house where, after dinner, his wife gave me a crash course in copy editing and manuscript markup.2 I held that university press job for all of six months, until the recession of the day hit home, the state university was squeezed for funding—in competition with the budget for plowing the state’s roads—and I was laid off two weeks before Christmas. But it was a start in the publishing world.

After being laid off, I came west to California, where my parents had established themselves the year before, and worked in my father’s business until I found a job editing trade books at a small press that specialized in railroad histories, western history, and Californiana. The job didn’t pay much, but it was interesting and served as a great introduction to my adopted state and a part of the country I knew only vaguely.3

From publishing, I went into technical editing at a local engineering and construction company, and that job turned into a position in their public relations department. And from there, I went to the local gas and electric utility, first as a technical writer and then in internal communications. After a stint of writing and publishing my first eight novels of science fiction—and getting caught up in the tidal wave that swept over the publishing world in the late 1980s and early ’90s (see The Future of Publishing: Welcome to Rome, 475 AD from September 9, 2012)—I went back to the working world at, first, a biotech company that made pharmaceuticals through recombinant DNA and, then, a manufacturer of genetic analysis instruments and reagents. In both cases, I started with technical writing and editing, then worked my way into internal communications, interviewing scientists and managers about new products and company issues.

At every step in this forty-year career, I had to scramble and reinvent myself. And I had to keep learning, asking questions, and absorbing new scientific and business knowledge along the way. But I never resorted to teaching professionally and I never once—in that old joke about English majors—had to ask, “Do you want fries with that?”

In my university days, I often hung out with engineering students. They would deride my English courses because, in their view, all I had to do was “bullshit” my way through a term paper, while they had to solve difficult equations and get the right number. Imagine their surprise, some years later, when I met the same kind of young engineers at work, and they were just discovering that “getting the right number” was the job of junior engineers. To advance to senior positions like project manager and vice president, they would have to write and speak well, entertain clients who were not all engineers or who had grown beyond the engineering techniques they had learned so long ago, and understand a lot more about business and economics. That is, those engineering students would have benefited from a healthy dose of the humanities that any English major or liberal arts student was supposed to take along with his or her major. These talents last longer and go farther than the particular knowledge an engineer learns in school—although you have to keep up with the technical advances in any specialty.4

And when I got into the biotech world, I saw any number of postdocs who had devoted their academic lives to one submicroscopic scientific specialty. When they got into the working world, they were beginning to learn that, outside of academia or a government lab, they would need a lot more generalized understanding of both their area of science and the business principles that make a successful product function and sell. If they remained within their scientific “comfort zone,” they would limit themselves to just the one or two laboratories that did their own specialized line of inquiry.

Liberal arts education used to prepare generalists for work in management and government, where breadth of view, a knowledge of history and its patterns, and a familiarity with different viewpoints and ways of thinking—not to mention the common sense gained from this exposure—were valued more than any particular technical knowledge. Now they teach even social science and public policy as specialties whose students think they can safely ignore literature, art, and music. We are training a generation of termites to fit into the narrow functions of the hive.

And in all of this, I am reminded of the quote from Robert A. Heinlein in Time Enough for Love: “A human being should be able to change a diaper, plan an invasion, butcher a hog, conn a ship, design a building, write a sonnet, balance accounts, build a wall, set a bone, comfort the dying, take orders, give orders, cooperate, act alone, solve equations, analyze a new problem, pitch manure, program a computer, cook a tasty meal, fight efficiently, die gallantly. Specialization is for insects.”

1. I have since learned a bit about journalism, because writing for an employee newsletter or magazine—which I had to do as my main job in internal communications—uses the forms and principles of professional reporting: the 5W lead (i.e., Who, What, Where, When, and Why, or sometimes How) and the inverted pyramid style (i.e., tell the most important facts at the top of the story and then gradually descend into more minute details—because readers may not stay with the story to the end and want to know they are not missing anything important if they stop now, and because editors are constantly repositioning the stories in their newspaper and want the convenience of shortening the text from the bottom up without having to call for a rewrite). An internal communicator, even though he or she is presenting a viewpoint congenial to the company’s interests, must also present the story objectively or risk losing the employee-reader’s trust. And then, articles in a magazine will of course have a different structure—with a traditional beginning, middle, and end—because they are not necessarily jostling for page space and the reader’s attention, time, and interest.

2. Twice since then I have paid the favor forward, taking people who had just been hired into the editing profession, showing them about copy marking and typography, and presenting them with their very own copy of The Chicago Manual of Style, the Bible of the book publishing world.

3. Because this publisher had started out as a job printer, they were one of only three houses in North America at the time that put all the elements of publishing under one roof. Right back of the office where I worked on manuscripts and read galley proofs, we had the linotype machines, which set the copy for all their books; the photography department, which made screened photo reductions and stripped the plates for the printing presses; the small four-color press for book jackets and the large sheet-fed Harris press for the interior pages; the bindery where the folios were folded, sewn, and bound; and the stock room and shipping department, where the books were stored, packaged, and sent out. Manuscripts and cartons of loose photos came in one door, and bound books went out the other. They even had a massive flat-bed letter press in the back, which printed directly from the cast type, one sheet at a time. Working there was a daily lesson in the civilization of the printed word: This is a Printing Office.

4. Except, perhaps, in the study of English literature itself. Since I left the university, the English Department there and everywhere has been taken over by critical analysis and the theory of deconstruction, which focuses more on the mechanics and limitations of language than on the quality of literary expression. As I understand it—and I never studied the deconstructionist philosophy professionally—the approach would have you believe that no modern reader can fully understand, say, Shakespeare, because the language, the meaning of words, and the application of concepts have changed so much since the plays and sonnets were written. This would seem to undermine the idea that great literature presents us with themes and concepts that are universal and that resonate with human nature itself—and that’s a faith I tend to live by.

Sunday, April 15, 2018

Fake News

Total honesty

Back in the late 1960s I remember starting to hear people say that there is no such thing as absolute or objective truth. That truth could only be relative and conditional. That truth and the concept of knowing and believing something to be true applied differently in various spheres of knowledge and belief and among various peoples and societies.

Since this was also the time I was leaving the enclosed certainties of home, family, and high school and going into the wider world of the university, to this day I am not completely sure whether this variable approach to truth was a new concept in the world or just new to me in my suddenly wider reading among philosophies, religions, social theories, and the other grist processed in the mill of the humanities. The slippery nature of truth may long have been a tenet of some modern European and ancient Asian philosophies to which I was just opening my eyes.

But I’m willing to bet that, at the time, the disappearance of absolute or knowable truth was not generally accepted in the wider population. Certainly, our news reporters and commentators, the people who spoke to the public from a position of authority, acted as if the facts they were relating and analyzing descended from some knowable, supportable, and unchanging source. That they were speaking “the truth.”

But now, fifty years later, it seems that the acceptance of relative and conditional truth—as opposed to the absolute kind—has become common in the general population. Variable reality has become, in the modern usage, a “meme.”

Some of this change has been helped along by wider public knowledge of those ancient Asian philosophies. Zen Buddhism, for example, treats the world we can see, taste, smell, and touch as a sensory illusion that overlies a larger reality—or perhaps no reality at all. In the Hindu religion, out of which Buddhism originally arose as a protest against the endless cycles of return and rebirth, the world around us is maya, an illusion, a delusion, a magical trick of the demigods and not the underlying reality.

Some of the change has been adapted from the European existentialists, who taught that philosophic inquiry begins with the subjective, human viewpoint rather than with any universal, Platonic, or Aristotelian examination of an external reality. Later theorists, particularly in the areas of language and literature, brought in the concept of deconstruction, which attempted to separate the language and context of any communication from its meaning and the reader’s understanding. The assumptions of the religious, political, and emotional environment, the language, and the knowledge base of, for example, Shakespeare’s contemporary readers and theatrical audiences are so interwoven and complex that modern readers cannot hope to thoroughly understand his work as it was intended and indeed as it is written.

And some of this rejection of absolute truth comes from a wider knowledge and acceptance of modern physics, particularly quantum mechanics. In the realm of the very small, where bits of matter sometimes act like bits of energy, exact knowledge becomes a chimera, a ghost of the old-style, mechanistic thinking of early physicists like Newton and Galileo. A quantum theorist knows that you can establish by observation either a particle’s position or its momentum and direction, but not both precisely, because to observe the one is to change the other. And, in dealing with such innumerable quantities of particles, the physicist relies on statistics and probability rather than trying to account for each photon, electron, or proton and neutron in an experiment. In the underlying philosophy of physics, reality is what you can actually detect and measure, not some presumed or imagined vision of what’s supposed to be happening.
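
For readers who want the formal statement behind that claim, the standard textbook form is Heisenberg’s uncertainty relation, which sets a hard lower bound on how precisely position and momentum can be known together (nothing here is specific to this essay; it is the general result):

$$\Delta x \, \Delta p \geq \frac{\hbar}{2}$$

where Δx is the spread in measured position, Δp the spread in momentum, and ħ the reduced Planck constant. Squeeze one spread toward zero and the other must grow without limit.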

All of these influences, which started to become mainstream in the intellectual, political, and moral upheaval of the late 1960s, helped erode a public belief in some kind of absolute, definable, true-for-all-people-and-times, knowledge of “the truth.”

In this country, the mainstreaming of this slippery version of reality was helped along by a growing agnosticism, acceptance of public atheism, and outright hostility to established religion. It gradually became fine for a person to be “spiritual” in the manner of the Zen Buddhists1 or Hindu mystics, but it was old-fashioned and absurd to still subscribe to the bourgeois, stifling, close-minded teachings of the Judeo-Christian religions.

And finally, the verbal tactics from the left side of the political aisle were brought to bear in this change. The National Socialists’ use of the “lie big enough, told often enough” to change public perception caught the attention of the entire twentieth century. Add to that the earlier maxim attributed to Lenin, “A lie told often enough becomes the truth,” which embedded the practicality of manipulating the truth in every political tactician’s mind. Challenging the apolitical teachings of the church and creating your own version of reality, especially a version based on statistics and false premises, forever changed the public trust in some externally verified, universal, and absolute truth.

So long as the politicians and their allies in the newsrooms of the big daily newspapers, broadcast television, and the professional commentariat could limit the scope of discussion, it was relatively safe to allow the public to understand that truth was relative, conditional, and negotiable. It served to undermine the old assurances, the established viewpoints, and the life lessons that Rudyard Kipling called “The Gods of the Copybook Headings.” Undermining popular assumptions and creating new myths and legends are what politicians and professional opinion leaders do.

And the process worked, after its fashion, for about forty years. But then in the past ten years or so that funny little science-and-technology-sharing computer circuit called “the internet” sprang into full flower. Not only could every person on the planet—except maybe those in censorship-heavy societies, like the People’s Republic of China—have his or her own Facebook page and Instagram messaging account without doing much more than press a few buttons, but everyone who wanted to put in the effort could publish his or her own books and pamphlets in electronic and paperback form, put up a web page that had all the visual authority of any “legitimate” news source, create podcasts that sounded exactly like radio and television interviews, and go into the news business with the same tools as the big city dailies and broadcast journalists. Add to that the quality of video imaging through your average smartphone being equal or superior to the best handheld equipment of a decade ago—and a lot more available and present on the scene—and you have a dozen eyes and a hundred voices ready to report “the news.”

Then the meme of truth being relative and conditional began to backfire. The established media, with their highly paid teams of reporters, editors, and commentators, their worldwide news bureaus, and their tested and accepted political narrative suddenly had to compete with god-knows-who from god-knows-where. The “news” and the “truth” started coming out of the woodwork and flying around the room—around the internet—like a storm of bats exiting their cavern at dusk.

The paid media can cry “Fake news!” at all the controverting crosstalk of the benighted amateurs, and the million voices of the internet can cry “Fake news!” right back at them.

In this, I am not advocating for a return to an oligopoly of opinion, held by the magnates who own and run the largest printing presses and most expensive broadcast newsrooms. That cat is out of the bag and long gone by now.2 I am not pining for the established voice of religion, or the oldest political parties, or my grandfather’s choice of books in the family library. In a way, the little-d democracy of letting every person have a voice is refreshing. The choice of which truth to believe is now the responsibility of the average citizen. The story is not handed down by the party in power or its media moguls, but flutters on the wind like a thousand little birds. Each of us must use his or her best intelligence and widest reading of history, science, and sociology to determine where an acceptable truth might lie. As Gurney Halleck said in the movie version of Dune, “Now, guard yourself for true!”

But maybe it was a bad idea, in the first place, and for political reasons, to sell the public on the belief that truth was not absolute or discernable but subject to preference, opinion, and varying conditions. In the internally wired society, you end up in the unfortunate position where my lie is just as good as your truth. But there we are.

1. Buddhism and its offshoot Zen are not actually mystical, mysterious, or even spiritual. Buddhism is a philosophy bounded by a practical approach to life and human existence. It is more of a psychological practice than a religion. But that’s just my “truth.”

2. Those national governments that think they can control their public’s opinions and perceptions by editing, filtering, and banning websites on the internet are playing a losing game. No matter how many censors you employ, the worldwide web will have a hundred times as many chattering sources who are well versed in employing euphemisms, creating metaphors, and hacking the code.

Sunday, April 8, 2018

One True Religion

Ancient of Days by William Blake

The popular image of Muslims these days is mostly that of raging fanatics who shout “Allahu Akbar!” while crashing trucks into crowds and blowing themselves and others up with suicide vests. I have no doubt that such people exist and that they are fanatical and dangerous. But the Muslims I have met in the West have been educated, reasonable, thoughtful, middle-class-seeming people.1

I’m thinking, in particular, of the husband of one of my supervisors at the biotech company. He was an Egyptian, formerly in diplomatic service, who was well-read and courteous, with a great sense of humor. He had no more interest in punishing unbelievers and throwing bombs than my grandmother. Of course, he was also from a well-to-do background. But in the matter of Islamic fanaticism, a person’s intelligence, breeding, and background are no guarantee of gentility. Look at Osama bin Laden. Look at the young Saudis who could afford to take piloting lessons in order to fly jetliners into buildings.

In my experience, however, anyone who is well read and liberally educated tends to become more tolerant of other religions, not less. After studying the tenets of Judaism, Christianity, Islam, Buddhism, Hinduism, the ancient Greco-Roman pantheon, and the surviving forms of animism and pantheism, it becomes impossible for an enlightened person to insist that “my reading” of the scriptures, “my sacred book” itself, and the belief system “I was taught as a child” represent the one true religion and that all the rest are pernicious poppycock.

But I understand where and how many modern Muslims have acquired their fanaticism. About 1,300 years after Christ, Christianity went through its own antiheretical phase, where unbelievers and schismatics were bullied, tortured, and sometimes burned at the stake. Much of that rock-hard belief was inspired by external pressure from the previous expansions of Islam into Spain, France, and Eastern Europe, and some of it was a response to internal pressures from the early Reformationist impulses, which arose through political manipulations and frankly heretical tendencies.

Now, about that same 1,300 years after Muhammad, Islam is going through a similar phase, where apostates are executed and unbelievers are beheaded. Islam is responding to much the same set of pressures, too: an encroaching spirit of Western culture that started with the Crusades in the Middle Ages but has vastly accelerated with the blossoming of a worldwide political, commercial, and electronic cultural tide. And Islam has always had tension among its various sects—mostly between Sunni and Shi’a—which is now being pushed by political pressure from Iran against the rest of the Middle East.

I can only hope that most Muslims will come out on the other side of this turmoil and adopt the worldview of most of today’s western Christians: “I know the Bible says I must believe this and follow that—such as the injunction about ‘not suffering a witch to live’2—but … you know … I just believe in a loving God who wants the best for all of us.” In time, their Allah may likewise become a polite, country-club gentleman who makes no life-threatening demands and creates no unpleasant waves.

Among the things that a well-read, thoughtful person is likely to dismiss from the sacred scriptures are the fantastic stories, such as that the God of all creation walked in the Garden of Eden and hobnobbed with Adam and Eve, or that He spoke the only absolute truth for all the ages into the ear of Muhammad in his dreams, or that He had a human son with the Virgin Mary in order to sacrifice him to the world. Such a well-read—and doubting—person will tend to discount these stories as literal truth and instead accept them figuratively, as metaphors for God’s loving personal relationship with all men and women, both as individuals and as a group.

When you have read enough of other religions, the best you can hope for in terms of a relationship with the one true God is a kind of Vedantic liberalism, where the godhead—however defined—is simply the ultimate conscious reality, a metaphysical all-soul that encompasses all human thought and perception, and each of us is merely a part of this reality that has broken off, is now spending time in the changing physical cosmos composed of body and matter, and will one day return with newfound insights to rejoin and add to the whole. That’s a bleak and not very comforting view of the situation, but it works as a kind of bedrock reality that would explain the religious impulse in human beings that we don’t find in, say, spiders or great white sharks.

What I am groping towards here is uncertainty. A well-read and thoughtful person cannot believe that any one religion—in all its specific and fine-grained details—represents the ultimate truth of the whole world, let alone a universe of a hundred billion galaxies.3 Oh, the details may be true, whole, and perfect in the mind of each believer, no doubt! But is it true for the world, or for every world? Is that one story really the way the cosmos—and whatever metaphysics lies behind all that spinning star stuff—actually works?

A thoughtful person, having been exposed to different ideas and points of view, will have doubts about the nature of singular truth. Such a person may still hold to the principles of the religion he or she was taught as a child, delight in and believe as “mostly true” the stories in his or her particular sacred book, and try to follow the precepts and guidelines of its particular morality. This is part of any person’s acculturation. But he or she will have a hard time insisting that his or her own personal religion is the one true religion, the only correct interpretation, and that people who believe otherwise are apostates, infidels, wicked, vicious, and worthy of sudden death.

Tribalism seems to be part of our basic human heritage. But to overcome it in our search for something better, truer, more real, and more universal would also seem to be part of human nature.

1. Our condo complex seems to be a landing spot for people emigrating from societies in distress and arriving in this country, because we are located in an attractive Bay Area community with a good school district. During the 1980s and ’90s, we had an influx of people from Iran and Lebanon, as now we have Russians and Chinese. It makes life interesting.

2. Exodus 22:18.

3. In this, I am reminded of the legends of Britain’s early war chieftain, King Arthur. For a legend arising from the fifth century, it has such well-defined and specific details: the passionate attraction between Uther Pendragon and Ygraine of Cornwall, resulting in the illegitimate and orphaned Arthur; the royal sword Excalibur embedded in the stone by the wizard Merlin; the betrayals of the half-fairy Morgana and her illegitimate son Mordred; the search for the Holy Grail by the Knights of the Round Table; and the illicit love affair between Queen Guinevere and Sir Lancelot, which sundered the perfect kingdom of Camelot. It’s a rich tale that seems too culturally modern to have come down from barbarian Britain in the early years after Rome’s legions had packed up and left. And of course, it’s a tale that has been adapted and extended in the telling, both in English and French legend, reaching a final form with Thomas Malory’s Le Morte d’Arthur in the fifteenth century, further refined by T. H. White’s The Once and Future King in the twentieth century, adapted again in Mary Stewart’s The Crystal Cave series, and put on Broadway as Camelot by Lerner and Loewe. This sort of quirky specificity comes from telling and retelling the culture’s favorite story over the ages. I’m sure the same sort of expansion and enrichment happened to Homer’s Iliad and helped shape most of the stories in the Judeo-Christian Bible.

Sunday, April 1, 2018

Freedom and Compliance

Minotaur vase

When I was growing up, one of the core values of this country was freedom. We celebrated it as a citizen’s natural right, embodied in the Bill of Rights, the first ten amendments to the U.S. Constitution. While the Constitution itself is mostly a procedural guide for how the government will operate, the Bill of Rights describes those powers that remain with the citizen and those actions that the government is forbidden to perform.

It’s not hard to understand why my generation would value freedom. Our fathers had just fought a long and bitter war against cultures that seemed to celebrate tyranny over the individual. And our generation would be fighting—in spirit if not on an actual battlefield—a cold war against oppressive cultures that wanted to force their values on the rest of the world. Freedom of individual conscience was considered a uniquely American value, but it was one derived from the long history of democratic government in Western Civilization, going back to the Greeks and Romans.1

But it’s a wonder that in just two generations, the forty years or so since I was a young man, the core value has shifted from freedom, which is now considered dangerous, to compliance, which is considered to be safe.

“Compliance” in this sense means that the individual is supposed to surrender his or her personal choices about thought and action to the direction and approval of the group. School campuses have—and the students there loudly and actively enforce—speech codes intended to prevent causing others distress or offense. They operate a separate system of justice regulating interactions between the sexes, or rather, governing how males may touch, communicate with, or even look at the females around them. They promote gun-free zones, of course, but the surrender goes even further in that a student is not supposed to defend him- or herself against any physical or verbal attack but must instead seek protection and arbitration from the school’s overriding authority.

And none of this—the limits on speech, action, and defense—is left to the perception, conscience, and direction of the individual. My parents raised me not to give intentional or unintentional offense to people, to think before I spoke, and to be mindful of the feelings of those around me. They taught me to be a gentleman in my dealings with women, to treat them with respect, and not to pressure or bully them with unwanted attention. They taught me to stand up for myself, to stand my ground in an attack, and meet force with necessary force. But none of those precepts would be adequate on today’s campus.

The issue is that compliance with codes of speech and action is not self-directed. It matters not at all what I as an individual might think would give offense to another person. Nor how much I tried to be a gentleman to the ladies around me. Nor that I had a general live-and-let-live policy, but that the freedom of others ended at the tip of my nose. Those would be my values, not those of the larger group.

These days, what is offensive or inappropriate is not left to common sense, good personal judgment, or individual perceptions of fairness. It is governed by a narrow segment of the local society: those with the most anxiety, anger, or insecurity. And where individuals in that delicate class may be too timid to speak up, the dictums are prescribed by a political viewpoint that has a built-in bias favoring anyone who can claim a past history of oppression or present him- or herself as the underdog in the argument. Because such persons by nature tend to represent minority positions in American society—people of any race or cultural heritage not derived from the Caucasian or European, or from an economic background not identified as middle class and above, or subscribing to a sexual orientation not heterosexual male or female2—this bias is dictated by and protective of a small segment of society and hostile to the larger portion of that society. The goal, whether intended or not, would seem to be making the larger part of our country feel outcast, isolated, and, in the terms of 1960s Transactional Analysis, “not okay.” As such, this is an attack on the uninvolved majority by the offended minority and—in some cases, such as the more convoluted sexual orientations—by statistical outliers.3

The existence of these codes of speech and action—and ultimately of thought—renders the individual powerless. Freedom is no longer the state of being able to evaluate and choose for oneself from a limitless variety of possibilities about what to say, do, and think. Freedom is now reduced to the meager opportunity to harm, to offend, to act out, and to violate the directives of the group. That is, to not comply.

Of course, freedom is messy. A society that lets people think for themselves will have a large fraction—perhaps even the majority—thinking and choosing wrongly and adopting absurd, ineffective, and provably false beliefs. A society where people can say anything they want will have a large number of disagreements, hurt feelings, and even a few fistfights. A society where people can do whatever they choose will have a large number of accidents and occasionally ruined lives. It takes resourcefulness, determination, skill, and guts to grow up and function in such a society. The survivor will develop a hard intellectual and emotional shell, a measure of distrust of his or her fellow human beings, and not a little cynicism.

The alternative is the beehive or the ant colony, where individuals function cooperatively, sedately, serenely, and safely because most choices have been removed from their lives and from their minds. It was the sort of society that the National Socialists and the Soviets tried to create and operate. These were societies where the only choices and opinions that mattered were those of the Führer, the Number One, the queen of the colony.

This is not a society worthy of fully functioning human beings.

1. Greeks, Romans, Western Europeans, French, and Britons—they all had their own versions of democracy. But most of them, at least at some stage in their histories, relied on the voice of the common man, the “strength of the people” (δημοκρατια or demokratia in Greek), to direct the government. In order for this voice to have meaning, the common man must have freedom of thought and action. Slaves can only support tyrants.

2. One of the great accomplishments of this political bias has been to conflate women as a group—who represent half of the population—with the minority position.

3. It matters not at all that, when parsed and defined too exhaustively, the sum of these minority positions generally constitutes a majority of the population. The proponents of this political bias tend to use the prefix “cis”—as in “cis-gendered”—in a pejorative way to mean “normal.” This is a political viewpoint that revels in the unconventional, the “other,” and the outré. It is designed, in the words of newspaperman Finley Peter Dunne, to “comfort the afflicted and afflict the comfortable.”