Sunday, July 26, 2015

Victimology

When did it become chic in our society to claim the status of a victim? I know this has happened, because various groups now routinely use words like “oppressed” and “deprived” to describe their general condition.

This instance of victimology is not applied to just one phase of a person’s life or to a particular period in his or her personal history. It has become fashionable to claim that you and your tribe were, are, and always have been victims of great and ignoble forces active in society, history, human psychology, sexual relationships, or some other sphere of interaction, where it has become impossible to name your individual enemies, call them out, and thrash them. Conversely, it becomes all too possible to point toward the nearest representative of some designated über-class and blame him for all your woes.

To my mind, this is a dead end. Going deeper into victimhood just doesn’t work. Good stories and happy outcomes don’t follow from reveling in the state of oppression, deprivation, loss of opportunity, loss of dignity, and impairment of selfhood that the victim mantle bestows. My first question to such a person would be, “Well, what are you going to do to get your own back?” That is, how are you going to repair the damage, reclaim your life, rebuild your fortune, regain your self-respect, punish your enemies, and defend yourself against future attacks and depredations? To me, these would seem to be the next steps in returning to the status of an independent, resourceful, courageous, and fully functioning human being.

To my ear, the status of a victim sounds too much like an excuse … and like whining. “I can’t be a complete person, I can’t be strong, can’t do my job, can’t improve my life, care for my family, learn and grow, because I have been abused, robbed of opportunity, looted of identity, denied good role models, hampered with social pressures, and oppressed by the economic system, by the banks, the patriarchy, the upper class, the one-percent, the middle class, or simply by the majority in society.”

When my generation was growing up, my parents and my various aunts and uncles were not people to suffer complaints and whining from me and my brother and our various cousins. I routinely heard the elders say, “Knock it off!” along with “Go do something! Get busy!” My father’s favorite aphorism was “Those who feel sorry for themselves, should!” And I don’t think we youngsters were ill-used. Their world had just survived a grueling and all-consuming world war, the second in a generation, with a worldwide financial collapse in between, and the end of that war had brought horrifying stories of brutality, torture, and genocide. Their generation considered strength, self-reliance, and preparedness in the face of such adversity to be cardinal virtues. And no one has told me since then that the world has gotten any kinder or less dangerous.

If you attack me and my family, if you rob us or try to bend us to your will, you will not make us victims—combatants, perhaps, and certainly enemies, but not victims. The only person who can turn me into a victim is me, through my own attitudes about myself. And as I’ve said before, that’s a dead end.

Adopting the mantle of a victim is not a survival tactic. Instead, it is an admission of general weakness and lack of purpose. Moreover, it inspires not pity but, among those without compassion or charity, the urge to attack and finish the job. Why would anyone want to make him- or herself a target of unsympathetic scoundrels? If you know your condition or situation to be weak, the best course is not to advertise that fact but rather to conceal it—and then work hard on improving your condition and situation.

Perhaps those who openly adopt the mantle of victimhood are trying to take on a protective coloration against potential criticism. “I’m not an oppressor! I’m not a scoundrel! Look! I’ve been deprived and oppressed myself!” Of course, this argument leads too easily to comparing and contrasting your hurts and degradations with those of the people you are trying to convince. And that’s another dead end.

Adopting a pose of weakness, of oppressed status and a deprived condition, might seem like an expression of humility. But it’s the false kind. The truly humble person does not deny his or her strengths, skills, opportunities, and achievements. The magnanimous person accepts these things calmly, does not flaunt them before others, but uses them to achieve good ends. He or she also wishes to build a future, to grow and learn in order to achieve a better life, more opportunities, more instances of kindness, and an easier passage for his or her family, community, and society. The person who glories in perceived losses, hurts, and damages is actually exercising a kind of pride, focusing on him- or herself and what others may owe to satisfy his or her hurts. The truly strong person puts aside losses and hurts, attempts to take them in stride, and focuses on building for the future.

I’ve always been a fan of Frank Herbert’s Dune novels.1 One element that has stuck with me concerns the Bene Gesserit, one of the Great Schools, whose adepts and mother superiors are the interstellar empire’s genetic preservationists and political manipulators. They operate under the axiom “Support strength.” On the face of it, this would seem to be backwards. After all, our human compassion tells us that the poor and weak need our support and the meek need our protection. Our human paranoia suggests that the strong are more likely to be predators and users, and that their prey is the poor and meek.

But I think that in the long view, in taking care of the entire human race and building for its future, the Bene Gesserit have it right. The goal is not to encourage the ruthless and predaceous, nor to increase the number of the weak and helpless. The greater good is to help those who are still learning, growing, acquiring skills and purpose, becoming self-reliant and able to cope, and teaching others by their example. Preserving, encouraging, and teaching poverty and weakness are a dead end for society and slow suicide for those who accept the status of victim.

Those who would live, prosper, and move forward into the future must be strong and resourceful. These are the people with the good stories and happy outcomes.

1. See my blog The Dune Ethos from October 30, 2011.

Sunday, July 19, 2015

The Risk-Transfer Society

I’ve written before about why I don’t believe this country is actually headed for a Marxist-style socialist government, with aggressive nationalization of various industries, such as the British undertook in the post-war period.1 Basically, it is much more difficult, time-consuming, and risky for government agencies to try running complex businesses like steelmaking, farming, and banking themselves. We in the U.S. find it far easier to impose complicated, interlocking sets of regulations on shareholder-owned and professionally managed businesses in order to achieve the policy goals we want.

You see this every day. The government would like to run any unit of the business sector—manufacturing, retail, financial, or service—both as a productive enterprise and as an instrument of social engineering, employment policy, and wealth distribution. But the government would soon run that factory, bank, or business into the ground—or into bankruptcy. Instead, bureaucrats can more easily impose regulations governing hiring and firing policies, payment and benefit policies, work and safety conditions, and environmental impacts without regard to their actual effects on production or the economics of the business. Sure, these regulations start out as a means of correcting obvious abuses and societal harms: banning child labor, eliminating overt discrimination against women and minorities, or keeping toxic wastes out of nearby rivers. But soon the scope of the perceived harm broadens and the scale of impact diminishes to the point that government regulation is used to favor and reward certain groups and activities while discouraging others.

But this is not just the action of a “laissez-majesté” government. Our entire society has become a game of risk transfer and rent seeking.

Take, for example, the differences between a property owner and a renter. Traditionally, the person who owns his or her home has enjoyed certain benefits. The cost of ownership in terms of mortgage payments is generally known within specific limits, so that the only unknowns are the costs of unexpected damage and loss—which can be insured against—and of routine maintenance—which can be forecast and budgeted—as well as changes in property valuation and assessed taxes. By contrast, the renter of an apartment or house has usually been subject to unpredictable changes in cost and conditions, such as rising rents, unexpected evictions, and deteriorating maintenance, with only a short-term lease contract—typically of six months’ or a year’s duration—as protection.

But now in most big cities, and particularly in California, the government has stepped in on the side of the renter against the property owner. Rents are government-controlled to limit increases, despite property taxes and maintenance costs that keep rising with inflation. State law severely restricts landlords from evicting tenants, and even the Ellis Act (Government Code Section 7060-7060.7), which allows landlords to take a rental property off the housing market without having to sell it, is under fire in San Francisco. These rules give the renter the kind of security that a homeowner has traditionally enjoyed, while leaving the property owner on the hook for the risks of loss due to fire, earthquake, or other catastrophe, along with the responsibility and rising cost of paying for maintenance, upkeep, and taxes.

As another example, the Environmental Protection Agency is continually expanding the scope of the Clean Air Act and the Clean Water Act to encompass more pollutants, more sources, and more protected resources. The government does not have to own the land to prescribe how the owner will manage its productive use, nor does it have to assume the costs and risks that meeting its regulations imposes on that productive use. It is one thing when a property owner is banned from dumping raw sewage or toxic wastes into the river bordering his or her property. It’s quite another when federal law imposes development restrictions on a low-lying property that occasionally collects water after a rainstorm and so has been officially declared a precious wetland.

Or consider the grab bag of goodies that our court system and the practice of tort law have become. A plaintiff with any claim of injury can, usually with the support of a self-interested lawyer and a pending class-action suit, identify a “deep pocket” and profit from it. The take includes not only actual damages, if they can be proved, but damages for putative pain and suffering and sometimes punitive damages meant to serve as a warning to future defendants. I’m not saying that a person does not have a right to seek recompense, be made whole for damages due to malice or negligence, and see wrongs corrected, but it’s become a commonplace in our society to envision six figures with every slip on an icy sidewalk or coffee spill by a harried waitress.

And then we have the case of Kelo v. City of New London, in which the Connecticut city forced a private property owner to sell under the Takings Clause of the Fifth Amendment in order to promote private development by a third party and so boost local economic activity. It is one thing to take a piece of property under eminent domain in order to build a facility designed for public use, like a road or sewer system, but quite another to take the property because you think someone else can make more profit from it and thereby increase the tax base.

In California we have the Coastal Commission, established in the 1970s, which regulates land development and public use along the seashore. Its jurisdiction extends inland from the mean high tide line to a variable distance: some hundreds of feet in urban areas, or up to five miles in rural areas. The commission can establish public rights of way across your private property for beach access. It reviews land-use development plans and issues permits. However, it does not have the power to issue fines or prosecute violations through the courts.

Now, I believe in competition and creative tension. I understand that every disagreement has two sides, and that we can only determine what is right and true when we bring both sides into the open to examine cases, expose facts, and air differences. So the legal struggle between opposing interests such as buyer and seller, labor and management, owner and renter, and other fields of conflict can establish what is fair and equitable in the public view. It would be a sad world if one side or the other automatically took precedence and won every case without contest. That would be a monopoly on the power of justice equivalent to the ancient divine right of kings and their prerogatives.

But still, I sense a movement in this country away from the rights of property ownership, self-expression, and personal interest and accountability. We are drifting toward automatic, reflexive adherence to community interest, public access, and communal property rights. This would all seem to be proper and fair, favoring the little guy and the public interest over the private rights of the landed and wealthy. But I also believe you can take this notion too far.

Ever since 19th-century observers described the overgrazing of England’s common pastures, economists have recognized a principle now known as the tragedy of the commons. The reasoning is simple: when a resource like open land, productive assets, ground water, the fish in the sea, or anything else that people might value is held as a public good rather than as private property, it tends to deteriorate, be exploited, or become depleted. That’s because people are still rational economic beings who tend to maximize their satisfactions and minimize their burdens. So we see the English commons overgrazed; the seas fished with huge drag nets; public buildings, facilities, and infrastructure left dirty, worn, and badly maintained; and untended property defaced by graffiti artists and vandals. People cherish and take care of the things they own personally; they are less careful of property for which they believe others are responsible. It’s a commonplace that some of the greatest environmental disasters have occurred in Communist countries like the Soviet Union and the People’s Republic of China, where all property is theoretically owned in common by the state or “the people”—which means no one in particular owns or cares for it.

Transferring private property to public use through government regulation would seem to be a way around this problem. You simply leave the upkeep and maintenance, the cherishing and the associated risks, to the nominal owner while letting others take what they want or need from the property’s active use. The owner is left holding the basket while everyone else gets to extract the goodies.

The process will continue for a while. But like almost everything else in the universe—except perhaps for the gravitational pull of a black hole—the situation is unstable. If the owner perceives too little value from trying to run the business, develop the land, maintain title to the property, hold the company’s shares, or otherwise participate in the game, he or she will drop out and become one of the public users. People are not stupid, and they will not go through the motions of playing a losing hand forever.

A society that does not value fairness and equity, and that has no appreciation of the essentials of human nature, will not last long. One can fail on the side of protecting and promoting the little guy and the public interest just as surely as one can fail on the side of preserving the rights of the landed and wealthy. In the long view, one can see the Western World in the 19th century as favoring the top tiers of society and vested interests, while the late 20th century and the start of the 21st now favor the dispossessed and the landless.

These are interesting times we live in, and which way the government and economy of this country and the Western world as a whole will go is now open to question. I can only leave you with the Buddha’s blessing, which is also a curse: Nothing lasts forever. Not even black holes.

1. See Why Own When You Can Rent? from October 13, 2013.

Sunday, July 12, 2015

Building Our World

Five years ago, in a meditation on the speed with which our technological base is expanding,1 I wrote the following:

“In the 20th century we started another transformation, from mechanics to informatics. Consider that for 100,000 years of human history a tool was a physical object like a hammer or an axe blade. But in the last 40 years techniques like crossfoot accounting, computerized spreadsheets, scheduling programs, risk analysis, and datamining have spawned a generation that uses nonmaterial tools on a daily basis. For 20,000 years, animal husbandry meant taming and working with whole animals or crossbreeding whole crops of corn. In the last 20 years the manufacture and use of enzymes, proteins, plasmids, and antigens has created a biological industry that employs the barest parts of, rather than whole, organisms.”

But that’s not quite accurate, is it?

Animal Husbandry

Humans have been using parts of animals from the very beginning. Hunter-gatherer cultures represent an advanced technology in themselves, and they leave almost nothing to waste. Consider the uses to which the Native Americans put the buffalo—an animal that they neither domesticated nor herded but hunted—deriving meat for their bellies and skins for their clothing and lodges. They gathered other animal parts, from porcupine quills to eagle feathers, for personal decoration. And the Eskimos use every part of the seal and walrus: blubber and meat for food; bones for needles, knives, and harpoon points; skins for clothing; sinews and nerves for cord and thread.

In a cave called Divje Babe in Slovenia, archaeologists have found a fragment of a cave bear femur, some 43,000 years old, pierced with holes that appear to have been made for a flute. Debate continues over whether this was a Homo neanderthalensis or H. sapiens artifact, but the holes appear to be deliberately made, and their spacing is consistent with a diatonic musical scale.

Humans in ancient times used bees—hardly domesticated as pets, although they lived in wicker and straw hives of human construction that people placed conveniently close to their planted fields—to obtain honey. Similarly, ancient societies harvested shellfish for a particular shade of purple dye and scraped tree sap for incense and perfumes.

All of these were uses of animal parts in advance of any coordinated effort at domestication, care, and feeding such as humans have given to their dogs, horses, cows, and pigs from prehistoric times. And even domesticated animals have yielded more than their meat, hide, and muscle power. Horse hooves have been boiled for glue since antiquity. Sausage casings have been made from layers of the small intestines of sheep, goats, and pigs, while modern edible casings are made of collagen processed from cattle hides or cellulose processed from plants. And various plants have yielded more than their seeds, stalks, and roots for food, too. Consider the antiquity of weaving reeds and grasses into baskets and hats, or flax and cotton fibers into cloth. So people have been using animal and plant parts in industrial processes for thousands of years.

Still, I will stand by the intention of that paragraph above. In the last twenty years or so, especially since we learned to read and manipulate the genotypes of various organisms, our use of animal and plant proteins, enzymes, and antibodies has flourished. We routinely insert fragments of human or animal DNA into yeasts and into cell lines from the Muridae—the family of mice and rats—to create targeted proteins used in medicine and research. And laboratories all over the world are now working on genetic modifications that will let green algae produce and release lipids—various fatty molecules—that can be processed in place of oil drilled from the ground. The goal, of course, is to move directly from sunlight to a fuel precursor in one step that can be repeated endlessly with no net addition of carbon dioxide to the atmosphere.

And I don’t think I was wrong about the speed with which this industrial adaptation has accelerated. Our chemistry and biology give us a power over the carbon molecule and other artifacts from fossil life of which an 18th-century charcoal burner, a 19th-century coal-tar chemist, or a 20th-century petroleum engineer could barely conceive, let alone envy.

Information Processing

If knowledge is power, then humans have been wielding it since at least the invention of writing.

Consider the ancient Chinese classic I Ching, or Book of Changes. As a system of divination, it was no more accurate than reading Tarot cards or flipping coins, but that’s not the point. The I Ching’s arrangement of 64 hexagrams represents a system of thought, a proposed relationship among human characteristics, everyday circumstances, and the workings of chance. People have been using it to order their thinking and inspire their imaginations and insights for more than two millennia.
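As an aside (my own illustration, not anything drawn from the classic itself), the count of 64 is no accident. Each hexagram stacks six lines, each either broken (yin) or unbroken (yang), so the figures simply enumerate every combination of six binary choices. A few lines of Python make the arithmetic plain:

    from itertools import product

    # Each hexagram is six lines, each line either broken (yin, 0) or
    # unbroken (yang, 1); six binary choices yield 2**6 combinations.
    hexagrams = list(product((0, 1), repeat=6))
    print(len(hexagrams))  # prints 64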

Consider Sun Tzu’s Art of War, probably the world’s first and best military handbook. The text is a distillation of principles that, yes, any experienced war leader could probably work out for himself. As in Machiavelli’s The Prince, the teachings are logical and easy to understand, and they build upward from one concept to the next. Like knowing the rules of chess and studying the most common gambits, reading either of these books in advance of going to war or entering politics gives the practitioner an edge over his or her untutored opponents.

Consider the principles of accounting. Crossfooting—the method of verifying a result by adding the totals across columns of numbers—was always possible and useful when working on a paper ledger in pencil or ink. But the technique really took off with the invention first of the computer and then of the spreadsheet, where the process became automatic and could be programmed to raise warning flags if the totals didn’t match.
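For readers who like to see the mechanics, here is a minimal sketch of a crossfoot check in Python, with made-up figures of my own. The row totals and the column totals must reconcile to the same grand total, or something in the ledger has been mis-entered:

    # Crossfoot check: row totals and column totals must agree on the
    # same grand total, or an entry somewhere has been botched.
    ledger = [
        [100.0, 250.0, 50.0],   # illustrative figures only
        [120.0, 200.0, 80.0],
        [90.0, 310.0, 60.0],
    ]

    row_totals = [sum(row) for row in ledger]
    col_totals = [sum(col) for col in zip(*ledger)]

    if abs(sum(row_totals) - sum(col_totals)) > 1e-9:
        print("Crossfoot error: totals do not reconcile")
    else:
        print("Crossfoot OK, grand total =", sum(row_totals))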

Actually, though, the greatest advance in accounting was probably double-entry bookkeeping, which was first codified in print in late 15th-century Venice, in Luca Pacioli’s treatise of 1494. This system simply balances assets against liabilities to track the money and activity in any business. So taking out a loan, for example, puts cash on the asset side of the business ledger and simultaneously enters a debt to be repaid on the liability side.
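To make that loan example concrete, here is a toy sketch of my own, not how a real accounting package keeps its books. The single posting raises cash and the loan payable by the same amount, so the accounting equation (assets equal liabilities plus equity) stays in balance:

    # Toy double-entry ledger for the loan example above.
    accounts = {"Cash": 0.0, "Loan payable": 0.0, "Owner's equity": 0.0}

    def post(increase_asset, increase_claim, amount):
        """Record one transaction; both sides move by the same amount."""
        accounts[increase_asset] += amount
        accounts[increase_claim] += amount

    post("Cash", "Loan payable", 10_000)   # borrow $10,000

    assets = accounts["Cash"]
    claims = accounts["Loan payable"] + accounts["Owner's equity"]
    assert assets == claims   # the books remain in balance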

Another accounting advance was reckoning the “time value of money”—that is, the rate of return that an amount of cash or other asset could earn if it were lent at the current interest rate or converted into some other form of investment over a specified period of time. The time value of money drives businesses to put their excess cash to work and lets the owner compare investment and asset strategies on a reasonable basis. Although this is easier to figure with a computer or calculator, the principle has existed as long as banking and alternative investments have been around, and the formulas for figuring present value and future value can be worked with pencil and paper. The whole realm of accounting is an informational tool that guides the decision-making of alert businessmen and -women.
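Those pencil-and-paper formulas are compact: a sum PV invested at a periodic rate r for n periods grows to FV = PV × (1 + r)^n, and discounting simply runs the same formula in reverse. A short sketch, with illustrative numbers of my own choosing:

    # Time value of money: future value and present value at a fixed rate.
    def future_value(pv, rate, periods):
        return pv * (1 + rate) ** periods

    def present_value(fv, rate, periods):
        return fv / (1 + rate) ** periods

    # $10,000 left idle stays $10,000; invested at 5% for 10 years it grows.
    print(round(future_value(10_000, 0.05, 10), 2))      # 16288.95
    print(round(present_value(16_288.95, 0.05, 10), 2))  # 10000.0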

So, yes, previous cultures have developed and used raw information as a tool. And this tendency only increased when the printing press made paper copies of organized information caches like books, pamphlets, and scholarly treatises widely available. Then it increased again when the computer automated the processing of information.

Still, I stand by the premise in the paragraph above. We are moving from an analysis of simple data patterns—like the movements of armies in the field or the flow of cash through a business—to appreciation of, and predictions based on, much more complex situations. For example, IBM is now selling a service called Watson Analytics. It is clearly based on the associational algorithms that IBM’s computer scientists built into the brain box called “Watson”—algorithms that turned it into a Jeopardy champion, blending knowledge of historical and cultural references, facility with linguistic and logic puzzles, and human-scale memory recall into an instrument for recognizing and using patterns to obtain answers. IBM calls this “cognitive computing”—which means that the computer can analyze unfamiliar data and answer questions about them without a human programmer having to code the questions and the search terms for each step of the analysis.

The future is still coming at us faster and faster, but it all builds logically and progressively on the patterns of human invention that have come before.

1. See Coming at You from October 24, 2010.

Sunday, July 5, 2015

The Truth About Personal Honesty

We all believe in personal honesty. That is, we are honest with ourselves about what is real in our lives, about what we know, what we want, what our intentions are, and where we stand in the world. We may tell fibs and lies to the people around us—even to our nearest and dearest—but with ourselves, by ourselves, we are true.

Many spiritual paths such as Zen and self-help programs such as Alcoholics Anonymous require their followers to practice personal honesty. The Temple of Apollo at Delphi, seat of the ancient world’s most famous oracle, had “Know thyself” (gnōthi seauton) inscribed above the doorway. And that should be an obvious hint: when the most rigorous disciplines of self-examination, and even the voice of a god supposedly telling you the truths of the universe, must urge you to be honest with yourself, perhaps personal honesty is not so commonly practiced as we assume.

You would think this was the easiest thing in the world, to know and understand yourself and your aims. After all, we live inside our own heads. Everything that passes through the human mind at some point must, we think, come before our conscious awareness to be recognized, evaluated, and either accepted or rejected. We can pick over our own thoughts and decide for ourselves what is true. So all of us should—at some point of reflection, confession, or intense personal struggle—be able to realize and state the truth about ourselves.

And yet, too often we don’t. Without the help of a Zen master, an AA sponsor, or the oracle of a god, we are capable of living in an almost complete state of denial, delusion, or personal amnesia—usually for years at a time. We can be cruel to people and call ourselves caring and helpful. We can indulge our senses and call it self-awareness. We can hobble our daily lives with nonproductive obligations and thoughtless debts, our bodies with excess pounds and neglected muscles, and our minds with useless habits and unexamined compulsions … and call it a state of freedom.

Why is this possible? I would contend such denials and delusions are a product of the human mind’s function as a story-making and myth-projecting machine. We all tend to make stuff up. We embellish the known facts. We redirect our minds from what we see to what we think we see. And we fill in the gaps when what we see is not a clear or complete picture.1

In part, this is a survival trait of the human brain and its sensory apparatus, and it operates on many levels. For example, our eyes evolved during generations in which we were both hunter and prey. We use close focus to examine the shapes of things that attract our attention and interest—for example, the type composing the words you are reading right now—and then we study the images in that focus for their possible meaning and importance. But we also use the periphery of our visual awareness—by far the greater span—to detect movements, shadows, and hints of anomaly. These may either give us warning of threats, if we are operating in prey mode, or clues to concealed possibilities, if we are reacting as hunters and gatherers. The vulnerable naked human who does not sense and react to a sudden movement in the bushes can be surprised and killed. The hungry human who does not perceive the spots of a fawn among the sun-dappled shadows of the underbrush, or react to the flash of color indicating ripe berries among the swaying leaves of the bramble, will eventually starve. In the same way, our ears react to the rustle of tree branches, the rhythmic pattern of footfalls, or the sound of our name in the babble of a crowd. Our skin senses the breeze as a touch, and interprets a touch as either potential caress or imminent attack.

Our brains were shaped to work from partial data, to flesh out vague perceptions, to make interpretations and decisions based on less than full knowledge. Sometimes, this processing goes haywire. A chemical imbalance, arising from illnesses like schizophrenia and bipolar disorder, can cause a person to misinterpret the waterfall of sensory impressions that pour in continually from the sense organs and are categorized, sorted, and interpreted in certain parts of the brain. Scrambled auditory signals present themselves as coherent voices, and their speech and suggestions are directed by the memories and fears active in the person’s current thinking. Scrambled visual inputs become hallucinations and visions.

Most of the time, though, this processing from partial inputs works to our advantage. A shadow at the edge of our vision, or a sudden movement in the air against our cheek, causes us to duck and thereby avoid a blow.

By extension, we use the brain’s facility in working from partial data and incomplete analysis to make leaps of intuition and understanding when dealing with puzzles and problems, and in trying to understand the world around us. We work from clues to fill in the complete pattern and arrive at actionable decisions. We are all so good at this that we must be constantly reminded to use standards of evidence, rules of logic, and tests of our assumptions when trying to establish the truth about historical, scientific, or legal facts. We imagine and assume so much in our daily lives that we must be restrained when representing “the truth” in more formal circumstances.

One proof that our minds are, at the most basic level, story-making machines can be found in our dreams. During sleep, and particularly during REM sleep, the brain sorts the day’s impressions, turning short-term experience into long-term memory, salvaging some of the day’s thoughts and ideas, and discarding the rest. During this period, we are not aware of the memory sorting process.2 Instead, the brain entertains our dozing mind with made-up experiences, fictitious personal histories, adventures, fantasies, harrowing escapes, and other real-seeming sequences that bear only a distant relationship to our everyday life. Our brains tell us stories while we sleep.

Even when we are awake, our memories are not perfect reflections of actual experience, taking a faithful image like a piece of carbon paper3 or a mirror. Instead, our memories are associational systems. We tend to join together similar things in our memories—which is why mnemonics and other memory tricks work so well. We also edit our memories slightly every time we recall and think about them.4 That’s why techniques to induce a “false memory”—perhaps an interpretation of some childhood trauma, or the fact of trauma itself—succeed so well: a skilled practitioner can lead us to modify our memories to fit a preconceived notion.

With all of this processing and massaging of facts and memories going on inside our brains, is it even possible to be honest with ourselves? Can we separate delusion and imagination from truth? I believe so … I hope so. If not, then life is a dream, meaning is fragmentary, and nothing much is more important than anything else.

But the snares of memory and delusion mean that finding and keeping the truth about ourselves is going to be hard work. It requires effort, self-examination, and a willingness to be stern with the weaker side of our natures and fearless in the face of dangers and strong temptations. But as the Zen masters, the AA sponsors, and the Oracle of Apollo all suggest, our lives are worth that effort.

1. Such is the nature of conspiracy theories.

2. See, for example, Memory Consolidation and REM Sleep at Serendip Studio, Bryn Mawr College, from January 9, 2008.

3. A relic of the typewriting age, “carbon paper” was used to transfer the impression of a striking key from one sheet of typing paper onto a second or third sheet behind it. A typist sandwiched the carbon-dusted paper between the sheets, where it acted like a supplemental typewriter ribbon. Of course, these days we just press Control-C and Control-V to copy and move pieces of digital text from one place to another. I suppose that, in another twenty years, I will also have to explain for the next generation what a typewriter and paper were.

4. See, for example, When Memories Are Remembered, They Can Be Rewritten at National Geographic from May 20, 2013, or Your Memory Isn’t What You Think It Is at Psychology Today from July 16, 2013.