Sunday, August 30, 2020

Absent the Middle Class

Girl with magic box

I was born just after the Second World War, which means I grew up and became politically aware—or at least what I think of as “aware”—during the Eisenhower, Kennedy, and Johnson administrations. This was a time when the United States was the “last man standing” among the nations that had fought the war, and we came out better than any other country on either side. We had our infrastructure intact and, thanks to the war effort, had built up a huge capacity in raw materials like steel and aluminum as well as in manufacturing. We were on top of the world.

That was also the time when the middle class in America was at its best. Soldiers returning from the war were getting free education on the GI Bill. Homes were being built in newly defined and rapidly expanding suburbs. Business was booming and, even with the returning soldiers, jobs were plentiful. Most people—there were exceptions, of course, especially in the Jim Crow South—were prospering as never before. Those were the good times.

The middle class is a relatively new thing in human history. It didn’t really develop until political and social structures had changed: urban life became commonplace, rather than the exception; and capitalism, the free market, and international trade became encoded with commonly accepted practices and rules, rather than just things that happened casually at the village level. The middle class was the place where people who were not nobles and landowners, yet too ambitious and too well educated to remain peasants, could find profitable employment and eventually riches by engaging in large-scale trade outside of selling butter and eggs on market day, or manufacturing outside of single-family cottage industry, or taking on the new roles in banking and legal transactions that supported these intermediate activities.

The middle class was for people with hopes and ideas, for those who sought independence from the old social classes, for those who wanted to do better than their fathers and grandfathers, for those who hungered to prove that they were as good as anyone and a damned sight better than most. It was the class of the feisty ones.

From Roman times and into the Middle Ages and then the Renaissance, the landed class, the nobles and the gentry, despised these strivers. Going into trade or handling money professionally was all about getting your hands dirty. And while anyone might admire a legally trained mind in the Roman Senate or a lawyer at court doing the king’s business, the sort of person who argued about the price of injury to a cow or the placement of a fence line was little better than a conniver and a con man in his lordship’s domain.

And of course the peasants, lately serfs, still working the land that their fathers had farmed and sharing the proceeds with the lord of the manor, all viewed members of the middle class as social upstarts, the big men from town, whose fathers might have been the local blacksmith or miller, and whose grandfathers had been serfs like the rest of us. People who wore britches and waistcoats rather than the peasant’s smock were already getting too big for themselves.

So the middle class has been under suspicion and under fire for a long time. It wasn’t just idle animosity that made Karl Marx and the other socialists of the 19th and 20th centuries despise the middle class with its striving and materialistic values as “bourgeois”—which is just the French word for this class—or worse, “petit bourgeois,” as if they were too small to be significant. And why not? When the politics you’re selling involves state ownership of the means of production, and puts them all in the hands of appointed technocrats, or the revolutionary vanguard, or the modern equivalent of Plato’s philosopher kings, then the people who know how to handle their own or their neighbors’ business practices and money, who will start new enterprises simply because they think they can make a profit from them, and who will obey rules but not wait patiently on instruction from their betters—these people are the bureaucrats’ natural enemies. These are the people who will upset the serenely floating boat of socialistic doctrine and practice. And so these are the people who must be the first to go up against the wall.

And the peasants, the modern blue-collar workers, the ones who are content to do what they are told and lack the ambition or the education to go out and start their own businesses, even as house painters and contractors—they will be quite happy to work in a factory owned by the moneybags class with protections from their union, or work in the factory owned by the state with those same protections in place according to state law, and still have their union—if that’s even needed. The fate of those middlemen, professionals, and entrepreneurs is irrelevant to the new peasant class, at least at the surface of their minds.1

The middle class has always been in, well, the middle, between two classes that would just as happily see it disappear. And the middle class is disappearing these days. Not only is the upper class getting bigger—in terms of its power if not its numerical size—with wealth beyond the dreams of kings and emperors of old, but the lower class is also getting bigger, with more people finding it harder to get the education and the good jobs that would enable them to enter the middle class as professionals, business owners, and independent traders. It is getting harder to own a house rather than rent, buy a new car instead of a used one or a lease, ensure your children a good education, take annual trips on your vacation—if you even get one while working two jobs—and plan for a comfortable retirement.

The middle class is being squeezed. Whether this is a planned process or just the natural course of modern economics,2 it’s happening. It has been going on in every decade of my life since I became politically aware. And I don’t know if it’s because the upper class and the Marxists do well when the majority of the people are more dependent on government and the largesse of big corporations than on their own initiative, or because we’ve lost something of the entrepreneurial spirit that fed bright and hopeful people into the middle class.

But something’s missing. And neither the top nor the bottom seems to notice or care.

1. If the peasant or the blue-collar worker thinks deeply, however, he will wonder where the technologies and inventions of the modern age—electricity, telephones and televisions, personal computers and smartphones, numerous medical advances, and easy credit and banking—all came from, if not from the entrepreneurial spirit of those who have their philosophical roots, if not their family background, in the middle class. But I digress …

2. As Robert A. Heinlein noted: “Throughout history, poverty is the normal condition of man. Advances which permit this norm to be exceeded—here and there, now and then—are the work of an extremely small minority, frequently despised, often condemned, and almost always opposed by all right-thinking people. Whenever this tiny minority is kept from creating, or (as sometimes happens) is driven out of a society, the people then slip back into abject poverty. This is known as ‘bad luck.’ ”

Sunday, August 23, 2020

Thingness and Nothingness

King Lear fool

The other morning I awoke to a realization: that while my mind, my conscious thought and memory, is a “spirit of the air” and therefore immortal, my body, my brain and the heart, hands, and legs that support it, is still a thing, and like all things it has a finite and ending existence in space and time. This is not, of course, a new thought in the history of human inquiry. But it is usually masked on a personal level by the day-to-day assumptions we all make: that our lives are virtually infinite and that death is far, far away.

As a professed atheist with a predilection for scientific empiricism, I am denied the consolation of belief in any kind of afterlife: transmigration of the soul into another body, translation of the mind into a spirit residing in a heaven or hell, or even transition into some kind of universal all-soul or cosmic consciousness. I have to believe that, like everyone in my parents’ generation and even some in mine, or a once-favored pet, a tree in the forest, or a dead fly on the windowsill, we are creatures of time and space, have existence and activation for a brief period, and then go out. The closest my beliefs come to religion is the Buddha’s description of Nirvana, the end of karma and its oppressive, continuing cycle of rebirths: you go out, like a candle flame. Where do you go? Don’t ask, because it is not defined.

In my view, this “going out” is nothing you have to work for or struggle to obtain, like the Buddha’s Nirvana. Instead, it just comes with the package of a person’s being a thing in time and space.

I have argued elsewhere against this “thingness” aspect of being human. My point was that transporter beaming as in The Fly and the Star Trek series cannot work, simply because people—myself included—are not things but processes. Every cell in our bodies, unlike the fixed materials in a brick or a bowl of potato salad, is in constant motion: absorbing nutrients, processing them, and casting off wastes, as well as transcribing DNA and translating RNA to make proteins, and using those proteins as building blocks and catalysts within the cells, along with chemical messages being sent through the bloodstream, and electrochemical messages sent across a network of 86 billion neurons in the human brain. None of this is static; everything is moving. Even with the fastest supercomputer, you can’t map the position of every atom in each cell, deconstruct it, transmit the information across space, and rebuild the body—and not expect to create a muddle of chemical reactions that got reversed, DNA and RNA replications that got interrupted, nerve impulses that got scrambled in transit, and a resulting accumulation of errors and poisons in each cell, the bloodstream, and the nervous system.
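To put rough numbers on that intuition, here is a minimal back-of-envelope sketch in Python. The figure of roughly 7 × 10^27 atoms in a 70-kilogram human body is a commonly cited estimate; the bytes-per-atom allowance is my own assumption, invented purely for illustration.

```python
# Back-of-envelope: data needed to record ONE static snapshot of a
# human body, atom by atom. The atom count is the commonly cited
# estimate for a ~70 kg person; the 32 bytes per atom (position,
# momentum, chemical state) is an illustrative assumption, not a
# measured requirement.

ATOMS_IN_BODY = 7e27      # commonly cited estimate for a 70 kg human
BYTES_PER_ATOM = 32       # assumed: enough for position, momentum, state

total_bytes = ATOMS_IN_BODY * BYTES_PER_ATOM
zettabytes = total_bytes / 1e21

print(f"One snapshot: ~{zettabytes:,.0f} zettabytes")
# ~224,000,000 zettabytes -- millions of times all the data humanity
# has ever stored, for a single frozen frame of a body whose chemistry
# changes on microsecond timescales.
```

Even with a generous allowance per atom, one static snapshot dwarfs any plausible storage or transmission system, and the body it describes would have changed long before the recording finished.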

But still, much as I am a process, the machinery that is running that process has a fixed place in space and time. My software is firmly anchored to hardware, a squishy, chemically based machine.

The difference between a brick and a human body is that elusive, self-replicating, self-activating electrochemical thing called “life.” A brick is unchanging, except for the abrasions it suffers from use and weathering, and so it is virtually immortal. A living body is constantly changing, renewing cells and tissues as they wear out—at least up to a point—and adapting to its environment physically as well as mentally and emotionally. But when the physical errors mount up—to say nothing of the mental and emotional errors—and the healing process fails, then the electrochemical spark dies, and the body becomes not unlike a brick—except that it deteriorates a lot faster. This is the mortality of all living things.1

Eventually, the brick will wear away—or you can smash it first—and then you have its component molecules of silica, alumina, lime, iron, and so on. But it will no longer be a brick. You can chemically unmake those molecules into their component atoms, then smash them down to protons, neutrons, and electrons, and finally break the neutrons down into protons, electrons, and antineutrinos.2 Protons have never been observed to decay, and their theoretical half-life is many times longer than the estimated age of the universe, so they may in fact be immortal—but proton decay still could happen. So, yes, you can unmake a brick or a human body down to the point of dissipated energy and subatomic fragments. And then neither of them will be a recognized thing anymore—nor, in the case of the human being, any kind of a process—and that’s as close to nothingness as you can get.

When my process dies away to mere thingness, and my thingness disintegrates to nothingness, then nothing will be left. I may hope to be outlived by my children (if any), by my good works, by the love of the family and friends who knew me—all mortal and thing-based processes themselves—and by my books in the form of disintegrating paper and dissipating electrons. But in reality, any whispers of my existence will all have disappeared long before the last molecules of my body have blown away in a dust of subatomic particles.

That’s a grim thought for a weekday morning. I suppose you would call it a depressing thought. But in the long view—and here I’m talking about human history, culture, and many multiples of a human life—we don’t want things to last forever. Yes, I’d like a “lifetime warranty” on my mattress or my refrigerator, maybe not so much on my car, as I tend to enjoy the process and excitement of buying a newer model every couple of years. But we don’t want people, objects, or even stories and ideas to outlive their usefulness, to become meaningless—or worse, a fool’s punchline—for later and later generations. The imagined life of an Elrond or a Dracula, persisting through the ages of men, is indeed a tragic story.

Ozymandias was a warning to all who would yearn for immortality.

1. And yes, literature is filled with virtually immortal creatures such as elves and vampires. But I remind you that these are creations of the human mind and not actually found in nature. Even the Sequoia sempervirens, which can live for up to two millennia, eventually sickens, falls, and decays—if fire doesn’t get to it first.

2. Free neutrons outside an atomic nucleus tend to decay in about fifteen minutes.

Sunday, August 16, 2020

AI and Personality

Grinning dog
Starleth quadruped

So I’ve been following advances in artificial intelligence (AI) ever since I wrote the first of my two novels about a self-aware computer virus. The current computer science isn’t up to actual self-awareness yet—it is more at the stage of complex routines that can establish a goal, accumulate new information, hold it in long-term memory, and act on and repeatedly modify that information. This can simulate a living intelligence in terms of the Turing test, but it still might not be self-aware.
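As a rough illustration of that kind of routine, here is a minimal sketch; the class name, the memory scheme, and the keyword test for “relevance” are my own inventions for illustration, not any real AI system.

```python
# A minimal sketch of a goal-driven routine as described above: it
# holds a goal, accumulates observations into long-term memory, and
# revises its behavior as memory grows. Every name here is an
# illustrative invention; real systems weigh, generalize, and discard
# rather than simply matching keywords.

class GoalDrivenAgent:
    def __init__(self, goal: str):
        self.goal = goal
        self.memory: list[str] = []   # long-term store of observations

    def observe(self, fact: str) -> None:
        self.memory.append(fact)      # accumulate new information

    def act(self) -> str:
        # Act on, and implicitly re-weigh, everything remembered so far.
        relevant = [f for f in self.memory if self.goal in f]
        return f"pursuing '{self.goal}' with {len(relevant)} relevant memories"

agent = GoalDrivenAgent("open the door")
agent.observe("open the door: handle turns clockwise")
agent.observe("unrelated: it rained this morning")
print(agent.act())   # pursuing 'open the door' with 1 relevant memories
```

Nothing in such a loop perceives itself; it only stores and matches, which is the gap between simulating intelligent behavior and being self-aware.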

My little dog Sally is not self-aware, either. That is, unlike humans, dolphins, and perhaps elephants, she does not perceive herself—does not have consciousness of herself—as separate from the reality she inhabits. The test of this is the mirror: when human beings look in a mirror, they recognize themselves, associate the image with what they think of as “me,” and may touch their face or hair to improve that image. If you put a mirror in a dolphin’s pool, then strap a funny hat or put a mark on the dolphin’s head, the animal will go to the mirror to check out its “own” image. But if my Sally approaches a mirror—we have several floor-length mirrors around the apartment complex—she either pays no attention to the small dog coming toward her or is eager or shy about meeting that dog. For her, the image is “other,” not herself.

And yet, Sally has a definite personality. I was reminded of that this morning. We had gone out through the back of the complex for her early walk, where she had dutifully completed her business. There were other dogs out there, she knew, and she is shy about meeting any dog bigger than herself. When we got back inside, at the elevator lobby, I discovered that the up and down buttons on that floor were not working: we would have to go outside again to an external stairway and climb to the second-floor lobby, where I knew the buttons would work. And at the outside door, Sally balked. She would not go back out again. She didn’t have any more business to do; there were big dogs out there; this was not the morning routine; and because … why? It took a lot of coaxing, tugging on her leash—to the point of pulling her collar up around her ears, which she hates—and encouraging noises on my part, until she relented, wagged her tail, and came along willingly.

Clearly, something in her perception of the situation had changed. She either overcame her fear of whatever dogs were outside, or she decided that I—her buddy, walk coordinator, and the fellow holding the other end of the leash—knew best about the decision to go back outside, or she remembered the last time we went out that way because the elevator buttons didn’t work—although she could hardly understand the concepts of “button” and “work.” Whatever, she had a new thought and, after her initial stubbornness, came with me happily.

I’ve been watching the development of agile robots on four legs—dog surrogates—with extensible necks and a jaws-like pincer that can open doors and carry objects. I read recently that some of the Boston Dynamics robots have been assigned to patrol parks and warn any humans they see violating social distancing. That’s pretty sophisticated activity. And the four “paws” of the robot dog are better at moving cross-country than wheels would be. That got me to wondering what it would be like to have an artificially intelligent dog instead of my Sally. It would probably be a demon at defined tasks like patrolling the complex perimeter, herding sheep—if I had any—or fetching my slippers. It would bark less and obey more. And it would never, ever leave a puddle in the bay window. That, and it would never need to be taken for a walk on a leash several times a day in order to do that business, either.

But a robot dog would still be a machine: purpose built, goal oriented, and just a complex of embedded responses. It might be programmed to wag its tail and lick my hand. It might even be taught to use human speech and say endearing things. And it might—if you kind of unfocused your eyes and willingly suspended your disbelief—serve as a therapeutic presence when I was sad or depressed, and be programmed to detect these states and act accordingly. But it would not be “intelligent” in the way a dog is, with its own quirky personality, its own mind about why it will or won’t go through a door, its own reasons for barking its head off at a noise upstairs, its own silly, toothy grin and hind-leg dancing when I come home, and its own need to put its chin on my knee after we climb into bed. It wouldn’t shed—but it wouldn’t be warm, either.

I’m no expert on artificial intelligence, although I can fake it in science fiction. But right now, AI expert systems are perfectly acceptable as automated drivers, button sorters, pattern recognizers, data analysts, and yes, four-legged park rangers. We value them in these roles because they function reliably, give predictable answers and known responses, and they never balk at going through a door because … why? If they did have that endearing quirkiness—the tendency to give inexplicable, self-centered, unexpected responses to unusual situations—and occasionally left a puddle of oil on the floor, we would value them less.

However sophisticated their programming, robot dogs would not be our companions. And much as we liked them, we would not love them.

Sunday, August 9, 2020

That Civil War Meme

War devastation

Almost everyone who is paying attention will agree that the political situation in this country between the progressive Left and the conservative Right is becoming desperate. Families and friendships are being sundered over political differences. The differences represented are existential and encompass radically opposed views of what this country stands for and where it is or should be going. There is no middle ground upon which members from opposite sides of the question can build a workable compromise. The stakes have become all or nothing.

The last time this happened, between the views of the Northern abolitionists and the Southern slaveholders and states’ rights advocates in the 1850s, the only conceivable result was dissolution, secession from the Union, an attempt at a parallel slaveholding government in the South, and ultimately a war to bring the seceding states back into the Union. When the issues are existential and are believed to encompass the survival of one side or the other, when there is no middle ground or possible compromise, then breakup and/or civil war becomes the only answer—terrible as that may be.

Some would say that the “cold civil war” over political and cultural differences—which has been going on in this country for the last dozen or so years and perhaps started as far back as the 1960s—has already grown hot. In the past month, we’ve seen what are supposed to be “peaceful protests” in various cities (Minneapolis, Seattle, Portland) meld into violent riots and attacks on both city and federal properties both there and in other cities (Richmond, Austin, Oakland) in a spreading conflagration. Now the Department of Homeland Security, a recent addition to the federal government based on earlier terrorist activity, is supposedly fielding agents to protect federal buildings and round up the people attacking them. To me, this looks like insurrection. This looks like the earliest stages of an armed conflict.

The supposed “Second Civil War” that is being bandied about in various novels, blogs, and memes right now—including some of mine—is not going to look like the first Civil War of 1861-65. Here is why.

First, the nature of war and the weapons used to fight it have changed drastically in the last 160 years. The Union and Confederate armies were composed of foot soldiers who marched in relatively tight formations and fired muzzle-loading muskets, supported by muzzle-loading cannon and men on horseback scouting ahead of the marching armies. The fastest means of communication was the telegraph wire, usually strung along railroad rights of way. But armies in the field away from the rail lines had to rely on a man riding a horse and carrying a handwritten message. The armies themselves could only meet on suitable ground, a defined battlefield, and usually tried to outflank an opponent to reach their own objective, or ambush an opponent to keep him from reaching his objective. This was all two-dimensional and—except in punitive expeditions like Sherman’s March to the Sea—paid little attention to strategic operations against civilian objectives.

As we’ve watched the progress of war from marching brigade lines to the immobilized trenches of World War I, through the mobile armies of World War II and Korea, to the Air Cavalry in Vietnam, and finally the village and urbanized insurrections of Afghanistan and Iraq—all with the background of a nuclear exchange in the offing—we know that a modern war on the continental United States will not be anything like the first Civil War. What would Lincoln and Grant not have done if they had helicopters and F-16s, let alone the threat of atomic weapons? A civil war today would probably not even be about taking and holding territorial objectives, especially if the war was not preceded by states seceding from the Union. It might be more like Vietnam, Afghanistan, and Iraq, all about winning “hearts and minds” and punishing insurrection. It might be neighbor against neighbor, with the frontlines drawn between cities and suburbs, neighborhood against neighborhood, like the Spanish Civil War of the 1930s.

Second, the looming civil war might well not be one of secession and recapture. Between the first Civil War and today, the nature of our governments at both the state and federal level has also changed. During the 1860s, the state governments were relatively strong, and the federal government was relatively weak. The federal government, at least at the start of the war, was small and funded mostly by customs duties, tariffs, excise taxes, and some direct taxes. There was no national income tax—but neither were there immense federal programs and outlays for Social Security, Medicare, and Medicaid; transportation projects associated with the Interstate Highway System, along with control and regulation of rail and air travel; educational standards and directives, backed up by grants and benefits; environmental projects and regulations; financial audits and controls, including the Federal Reserve and its management of the economy and the money supply; and the thousand other things we depend on the federal government to provide today.

Whether the current “Red States” in the central part of the nation secede from a Union dominated by the “Blue States” along the two coasts and the upper Midwest, or vice versa, one group is going to be left with all those federal programs, along with the Federal Reserve and responsibility for all those Treasury bonds and the federal debt. Maybe everyone in the part of the country that secedes will be comfortable with giving up their Social Security and Medicare contributions and future benefits, all that highway and education money, and everything else we’ve come to rely on the federal government to supply. Maybe forgoing their share of the looming federal debt would be compensation enough. But rewriting those funding and social service obligations under a newly conceived and authored Constitution and code of laws would be a gamble for most people. Some—especially those with much to lose under a new interpretation of the tax code—might think twice about giving up the devil they know for the one that has yet to be born.

And then there are the pesky details of what would become international transactions. For one side or the other, the companies and networks we all expect to function smoothly—the internet and its cloud computing resources; the U.S. Postal Service and delivery services like FedEx and UPS; distribution networks like Amazon and eBay; communications services like AT&T and Verizon; the farming, food processing, and distribution companies that keep the rest of us supplied with flour and bread, vegetables, chicken, beef, and Hostess Twinkies; the electric power pools and their system exchanges; control and security of interstate air travel and railroads; and oil and gas transmission pipelines, to name a few—will all be tossed into a cocked hat and distributed variously between two different countries. Some of these functions will continue smoothly under international agreements. Others will become fragmented, prizes to be pulled apart in the interest of benefiting one party while hurting the other.

Any way you look at it, our country—the whole United States—has become far more interconnected and centrally governed, less regional and local, less independent, than it was 160 years ago. A breakup into Red and Blue, if that is even the correct dividing line anymore, would be far more difficult to pull off, and even more difficult to operate in two halves—especially if the Blue halves were physically separated by a big Red chunk in the middle, with borders, tariffs, and travel restrictions going both ways—than the country that divided in 1861.

All of this is food for careful thought before we embrace the Civil War Meme and start picking sides.

Sunday, August 2, 2020

Beyond Socialism

Puppet master

The progressive far left of the Democratic Party, taking the line from Bernie Sanders and other members of the “Democratic Socialist” persuasion, has put forward a number of proposals designed to appeal to American voters: Medicare for All, Free College for All, Universal Basic Income, and similar direct subsidies from the federal government. To the ears of those of us on the center and right, this sounds like classic socialism. But is any of it really the kind of socialism that Karl Marx, Rosa Luxemburg, Vladimir Lenin, Leon Trotsky, or Mao Zedong would recognize?

Let’s take Medicare for All. Presumably, it would extend Medicare coverage to all Americans, not just those over 65. Presumably, it would continue the payroll withholding that all workers registered with Social Security pay, currently 2.9% of wage and salary income, split between employer and employee (or wholly paid by the self-employed); unlike the Social Security tax, the Medicare tax has no wage cap. Also presumably, it would still cover only 80% of the recipient’s medical and hospital costs, and still allow the recipient to buy supplemental private insurance (“Medigap” plans) to cover the remaining 20%, usually with a modest co-pay for doctor and hospital visits and drug costs. So this is basically a government-run insurance program, managed by the Centers for Medicare and Medicaid Services. It is not philosophically different from private insurance, except that it is supported—at least in part—by payroll tax revenues and does not allow for a profit motive. As the largest single payer in the country’s medical system, this extension of Medicare would be able to dictate to doctors and hospitals the levels of service they might provide and the charges they could bill. These would be “negotiated” to the same extent that any dealings with a monopoly supplier or monopoly consumer—think of the U.S. Department of Defense buying weapon systems—are a negotiation.
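To make that arithmetic concrete, here is a small worked example; the tax rate and the 80/20 coverage split come from the paragraph above, while the salary and the hospital bill are invented numbers.

```python
# Worked example of the Medicare arithmetic described above. The 2.9%
# payroll tax and the 80/20 coverage split come from the text; the
# salary and the hospital bill are invented figures for illustration.

salary = 60_000.00
MEDICARE_RATE = 0.029                     # 2.9% total payroll tax

employee_share = salary * MEDICARE_RATE / 2   # withheld from wages
employer_share = salary * MEDICARE_RATE / 2   # matched by the employer

hospital_bill = 10_000.00
program_pays = hospital_bill * 0.80           # Medicare covers 80%
patient_owes = hospital_bill - program_pays   # 20% gap: Medigap-style
                                              # supplement or out of pocket

print(f"Employee withholding: ${employee_share:,.2f}/year")
print(f"Medicare pays ${program_pays:,.2f}; patient or supplement owes ${patient_owes:,.2f}")
```

On these assumptions, a $60,000 earner has $870 a year withheld (matched by the employer), and a $10,000 hospital bill leaves a $2,000 gap for the supplement or the patient to cover.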

Contrast this with the healthcare offered under socialist systems like the British National Health Service (NHS). There the government health agency owns and operates the hospitals and other facilities, directly purchases medical equipment and supplies, directly employs doctors, nurses, and other staff, and provides service for free to British citizens—other than through the taxes they pay. Some doctors and services may be privately owned and operated, but they must be supported by patients paying their own way or having private health insurance. The Canadian healthcare system, called Canadian Medicare, is funded by taxes through the provinces and pays the bulk of costs associated with hospitals and doctors—which are then owned by private, nonprofit institutions. Unlike the British system, Canadians cannot use private insurance in most of the provinces to pay for government-covered basic services or obtain private, fee-for-service medical care anywhere within the country.

Or consider Free College for All. Presumably, the government—which now guarantees almost all student loans—would continue to offer them but at almost no interest or without payback terms, making the money virtually free as a taxpayer-funded service. I have not heard of any proposal with a plan for nationalizing the universities and community colleges, putting those institutions under government ownership and control, and directly hiring and paying the professors and other staff.1

Neither of these progressive proposals in their most advanced form would involve the government actually owning the means of production (hospitals and colleges), managing the infrastructure for provision of services (administration and billing), or employing the personnel who offer those services (doctors and professors)—which is the classic definition of socialism. Doctors would still have to rent, staff, and furnish their own offices. Hospitals would have to build and equip facilities and maintain a requisite number of hospital beds based on their estimates of the available market. Colleges would still have to acquire land, build classrooms and football stadiums, and determine their own curriculums based on the needs of their projected student population.

The “socialism” offered by the progressive Left in America is not about ownership of the means of production and infrastructure for services.2 It’s not even about financing these means on the producing side. Instead, it’s about providing the individual buyer with the money to pay for the goods and services he or she takes. Building and paying for the factories to make goods, the commercial associations to provide services, the stores and offices to distribute them, and the logistic systems to mediate between them are all left to the individual, privately held providers. This is even less socialist than our current system of providing universal, K-12 education: the local public school, which is owned and operated by the local school district, which in turn is an agency of the local government.

Socialism is messy. As every country that has tried it—nationalizing all means of production and provision of services—has discovered, it’s hard to get politically connected bureaucrats—usually rising party members—to care deeply about or know and understand intimately the facilities and people they’ve been sent to manage. Suddenly, the government is responsible for making things that actually work, getting them into the public’s hands in a timely fashion, and providing the services that people need to go on living and thriving. That’s all hard and takes dedication—and usually a stake in the game, represented by some kind of return on the managing individual’s time and effort, not just a gold star at party headquarters. Owning the business and making people happy are hard when done at the remove of government and party politics.

Even Elizabeth Warren, one of the Democrats’ most ardent progressives, doesn’t want to nationalize the U.S. economy as the Soviets tried to do (and failed) or the Chinese Communists tried (and eventually slid into a form of crony capitalism). Instead of abolishing U.S. corporations and their funding mechanism of venture and shareholder capitalism, Warren would put them under a charter system. The corporations would still own their productive facilities, make investments, and manage themselves, but they would operate under rules and obligations dictated by the federal government. This system would put the government’s social goals and recognition of other stakeholder needs—like communities, minorities, customers, unions, the environment, and whoever else comes to mind—alongside whatever the shareholders and owners of the company are trying to achieve. She wants to control the corporations and the U.S. economy without actually taking responsibility for making wise investments, creating and supplying useful products, offering good service, or running the business without running it into the ground. However, when you’re not responsible for the productive outcome, you can cheerfully make rules without regard for consequences.

That said, in relation to Medicare for All, Free College for All, and similar proposals, there is still the moral equivalent of socialism when the government is the sole buyer—a monopsony—of medical, educational, and other services, through the rules it lays down for what products and options the individual buyer will have access to under the government program and under what conditions they will be provided. The government is then in a position to dictate pricing and supply terms to the independent providers. Some providers, as under the British NHS system, would still exist outside the government-funded products and services, able to charge wealthy clients paying with their own money or with private insurance for non-government–covered services and procedures. However, there may not be enough wealthy people to go around to make offering these products and services a profitable strategy. Other providers, as under the Canadian system, would likely be limited to serving all citizens through the government-funded facilities without a private exception and rationing products and services accordingly. No, he who pays the piper not only calls the tune but, essentially, owns the piper.

Under those conditions, we can only hope the piper survives. And maybe, ultimately, that is the point of these proposals.

1. However, for example, the California State University (CSU) and the Regents of the University of California (UC) systems, both of which operate campuses up and down the state, are largely funded by public money from taxes, supplemented by tuition, fees, and other resources.

2. See, for example, Why Own When You Can Rent? from October 13, 2013.