Sunday, June 11, 2023

Clownface, Masquerades, and Assumed Identities

This is my brain following a random path toward a real thought. So bear with me. What is a clown?

I mean really, what do the full image, aspect, and persona of a clown represent? Are clowns a distinct race of mythical beings? Certainly, they are always a human person dressed up in a particular style. But are clowns representations of imaginary figures, like the black-robed, hooded figure of the Grim Reaper with his scythe? (And why is Death always a male?) Are they like the kachina spirits of the Native Americans? Clown makeup and costumes are varied—in the professional circus, a performer’s face painting is even registered and protected—but the genus or type is always recognized. If I mention a red bulb nose, orange string wig, and absurdly long shoes, you know that I’m speaking about some kind of clown.

The actual presentation of the modern circus and rodeo clowns goes back to the Italian Commedia dell’arte of the 1500s. The theater companies back then employed eight to ten stock characters, each with specific features of face and costume, so that the audience would know what to expect. Not all of the characters were meant to be funny, but all had their place in familiar human situations that were always played for a laugh. And the servant characters—Pierrot, Harlequin, and Pulcinella—were generally the perpetrators of madness. That is, clowns.

Still, over time, and with the aid of the traveling circus in both Europe and North America, the clown itself has become a stock figure. Under the Big Top, clowns provide comic relief between the more daring and dangerous acts like the lion tamers and acrobats. On the rodeo circuit, clowns rush into the action to distract a loose horse or bull and protect the riders. Clowns are now physical actors with no actual speaking parts. They are visually funny while other comedians make jokes with their words and facial expressions.

And for some people, clowns are scary. Clowns are made up to be exotic and absurd. They pantomime humor but with a subtle edge of intent, sometimes of meanness—as all humor can be used meanly, to ridicule and to hurt. To some sensibilities, the exaggerated lines and shapes drawn on an otherwise human face, the essence of a mask to hide the underlying identity, are disturbing. Perhaps the best representation of this feeling of dread is the evil smile of Pennywise the Clown—not really a clown or a human being at all—in Stephen King’s novel It.

What makes us feel uneasy about clowns is also what makes us uneasy at a masquerade ball or a Halloween party. Or, let’s face it, with the whole concept of actors and acting, and why they have been disrespected as a profession—praised but not generally trusted—by polite society. We are accustomed in our daily lives to seeing a person’s face and believing we can tell what that person is thinking and feeling, who they are, and what they will do. We believe that the eyes are the “windows of the soul” and that we can read meaning there. We also believe we can trust smiles and laughs in the people we meet. Acting hides this. Masks hide this. Masks not only conceal identity but they also remove humanity. And a clown’s heavy and exaggerated makeup is more of a mask than the powder, eye shadow, and lipstick that many women put on in the morning.

And there, for many of us, is the difficulty when we encounter a transgender person, a transvestite or a drag queen,1 or even a markedly effeminate man or masculine woman. Our sense that we can tell a person’s true being just by looking at him or her is skewed.

We generally take a person’s sex—male or female, pick one—to be an essential part of their character. We consider it the base, ground-level, first-order characteristic of a person’s makeup. Check this box first: man or woman? And a man made up and dressed to look like a woman, or who acts like and believes he is a woman, befuddles this sense. Even if he has had hormone changes and surgeries so that in some physical dimensions he matches a female body type—or a woman who has undergone similar changes to emulate a male body—we still feel that something is amiss. The width of the hips, the ratio of body fat, some subtle missing part of the whole presentation cues us to the fact that what we are looking at is not what we were led to expect.

When we see a clown, we know that we are not looking at a separate species of being, but a human person who has put on grease paint, string wig, and floppy shoes. We accept the change as striving for a particular type of presentation. But when we see a man sculpted and painted like a woman, or a woman pared and groomed like a man, the presentation strikes deeper into our awareness.

It doesn’t just confuse and disturb us. It makes us feel that our sense of basic human nature has been betrayed. It makes us feel threatened.

1. And are not drag queens sometimes played for comic effect? Certainly, their heavy makeup and exaggerated characteristics are generally played for laughs.

Sunday, June 4, 2023

Not All That Intelligent

I have been writing fiction about artificial intelligence (AI) for most of my adult life.1 In all cases, my intelligences—whether a viral computer spy or a robot pilot from the 11th millennium—are what one science fiction author calls “a little man with a machine hat.” That is, they are multi-capable, self-aware programs able to function like a human being, carry on conversations, have thoughts and opinions, and occasionally tell jokes. The only difference is they aren’t made out of gooey carbon compounds. That is, they’re just another set of fictional characters.

With all the talk and all the hype about AI these days, it is useful to understand what the current crop of programs is and is not. They are not Skynet, “deciding our fate in a microsecond.” They are not functionally equivalent to human intelligence: that is, they are not thought processors capable of thinking through complicated, real-life situations, perceiving implications, and making distinctions and decisions. They do not have a lifetime of experience or what we humans would call “common sense.” They are ambitious children. And they are not all that intelligent. They are also designed—at least for now—for a single function and not the generalized array of capabilities we think of as comprising human-scale intelligence.

I recently heard an interview with Chairman and CEO Arvind Krishna of IBM. He said that programming the Watson computer that became a Jeopardy champion took six months. That was a lot of work for a machine to compete in the complicated but essentially trivial task of becoming a game show contestant. IBM is now selling the Watson model as a way for corporations to analyze their vast amounts of data, like aircraft maintenance records or banking operations. Artificial intelligence in these applications will excel, because computers have superhuman scales of memory, analytical capability, and attention span. But Krishna cautioned that in programming an artificial intelligence for corporate use, the operators must be careful about the extent and quality of the database it is fed. In other words, the programmer’s maxim still holds true: “garbage in/garbage out.”

I can imagine that AI systems will take on large sets of data for corporate and eventually for personal use. They will manage budgets, inventories, supply chains, operating schedules, contract formation and execution, and other functions where the data allow for only a limited number of interpretations. They will be very good at finding patterns and anomalies. They will do things that human minds would find repetitive, complicated, boring, and tiresome. They will be useful adjuncts in making business decisions. But they will not replace human creativity, judgment, and intelligence. Anyone who trusts a computer more than an experienced human manager is taking a huge risk, because the AI is still a bright but ambitious child—at least until that particular program has twenty or thirty years of real-world experience under its belt.

Of recent concern to some creative and commercial writers is the emergence of the language processor ChatGPT, licensed by OpenAI, whose investors include Microsoft. Some people are saying that this program will replace functions like story, novel, and script writers, advertising copywriters, documentation and technical writers, and other “content creators.” Other people are saying, more pointedly, that such programs are automated plagiarism machines. People who have actually used the programs note that, while they can create plausible and readable material, they are not always to be trusted. They sometimes make stuff up when they can’t find a factual reference or a model to copy, being free to hallucinate in order to complete a sentence. They freely exercise the gift of gab. However, I expect that this tendency can be curbed with express commands to remain truthful to real-world information—if any such thing exists on an internet saturated with misinformation, disinformation, and free association.2

The reality is that ChatGPT and its cohort of language processors were created to pass the Turing Test. This was a proposal by early computer genius Alan Turing that if a machine could respond to a human interlocutor for a certain length of time in such a way that the human could not tell whether the responses were coming from another human being or a computer, then the machine would be ipso facto intelligent. That’s a conclusion I would challenge, because human intelligence represents a lot more than the ability to converse convincingly. Human brains were adapted to confront clues from the world of our senses, including sight, sound, balance (or sense of gravity and acceleration), tactile and temperature information, as well as the words spoken by other humans. Being able to integrate all this material, draw inferences, create internal patterns of thought and models of information, project consequences, and make decisions from them is a survival mechanism. We developed big brains because we could hear a rustle in the grass and imagine it was a snake or a tiger—not just to spin yarns around the campfire at night.

The Turing model of intelligence—language processing—has shaped the development of these chat programs. They analyze words and their meanings, grammar and syntax, and patterns of composition found in the universe of fiction, movie scripts, and other popular culture. They are language processors and emulators, not thought processors. And, as such, they can only copy. They cannot create anything really new, because they have no subconscious and no imaginative or projective element.

In the same way, AI developed for image processing, voice recognition, or music processing can only take a given input—a command prompt or a sample—and scan it against a database of known fields, whether photographs and graphic art, already interpreted human speech, or analyzed music samples. Again, these programs can only compare and copy. They cannot create anything new.

In every incidence to date, these AI programs are specialty machines. The language processors can only handle language, not images or music. The Watson engines must be programmed and trained in the particular kinds of data they will encounter. None of the artificial intelligences to date are multi-functional or cross-functional. They cannot work in more than one or two fields of recognizable data. They cannot encounter the world. They cannot hear a tiger in the grass. And they cannot tell a joke they haven’t heard before.

1. See, for example, my ME and The Children of Possibility series of novels.

2. The language processors also have to be prompted with commands in order to create text. As someone who has written procedural documentation for pharmaceutical batches and genetic analysis consumables, I can tell you that it’s probably faster to observe the process steps and write them up yourself than try to describe them for an AI to put into language. And then you would have to proofread its text most carefully.