Salvo 02.09.2023

Loab, a Cautionary Tale


Are AI demons real?

Editors’ Note

Please be advised that some of the links in this article lead to graphic and disturbing imagery.

“Whose image is on this coin?”

When his rivals in Jerusalem presented him with a Roman denarius, Jesus pointed out that the face of Caesar was impressed onto the silver. So Jews could pay taxes to the Roman state: “give back to Caesar what belongs to him, and give God what belongs to God” (Mark 12:13-17; Matthew 22:15-22).

By implication, what belongs to God is you: “God said, ‘let us make mankind in our image’” (Genesis 1:26). The worth of the human soul is as far beyond that of a silver coin as the majesty of God is above that of Caesar. Rulers leave an imprint of themselves on the creations they treasure. On us–on the soul-shaped flesh of humanity, capable of loving and knowing truth–Love and Truth himself has left his mark.

In Greek, one word for such a royal seal is charactēr. By an easy metaphorical jump, this is how we arrive at our English word “character”: the kind of person you are is like an imprint pressed into you. It’s as if your soul were a ball of wax (another very old analogy) molded by the inward and outward forces of culture and belief. 

Now, what if we undertook to remake ourselves in our own likeness, impressed with our own seal? If we shaped a simulacrum of our humanity, not as we are but as we have come to see ourselves, would we end up with a machine run by algorithmic rules? A dead thing that looks and acts like us but cannot think or feel? And whose image would be pressed into this counterfeit coinage–this ragdoll puppet with strings of computer code?

The Lady Will Not Vanish

I thought about these questions recently when a friend told me about Loab, the AI demon woman. Loab–the story goes–appeared when the Swedish artist known on Twitter as “Supercomposite” was playing around with a certain AI image generator. She refuses to say which one, for fear of starting a viral trend. The last thing she wants is for Loab to spread.

Loab is made out of empty space. She is the opposite of something–no one is quite sure what. Supercomposite performed an operation in which she asked her program to generate an image as unlike a given prompt as possible. In this case, she asked for the opposite of the actor Marlon Brando. The result was a futuristic castle in silhouette, overlaid with the kind of text that AI sometimes generates–like the randomly sequenced books in Borges’s Library of Babel, these programs often produce strings of letters that look like words but are not.

When Supercomposite asked for the negative image of this negative image–the opposite of the opposite of Brando–she got Loab. The results showed “a devastated-looking older woman with defined triangles of rosacea(?) on her cheeks.” No one, including Supercomposite, knows quite how this happened. The software works by assimilating, tagging, and recombining existing material, creating a “latent space” or imaginary map in which images and their attributes are related to one another according to various metrics. A picture of a sparrow and a picture of a robin would be “close” to one another on the map. A picture of a sparrow and a picture of a cheeseburger would be relatively “far away.” Loab is what happens when you go as far away from Brando as possible, then as far away from that again.
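The latent-space geometry described here can be sketched with a toy example. Everything below is illustrative assumption–the three-dimensional vectors, the concept names, and the cosine-similarity measure stand in for the much larger embedding spaces of real image generators, whose internals are not public in this story:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity: 1.0 for identical directions, lower for unrelated ones."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def negate(v):
    """A crude 'opposite': point in the reverse direction in latent space."""
    return [-x for x in v]

# Hypothetical 3-dimensional "embeddings" (real models use hundreds of dimensions).
sparrow = [0.9, 0.8, 0.1]
robin = [0.85, 0.75, 0.15]
cheeseburger = [0.1, 0.2, 0.9]

# Related concepts sit close together on the map...
print(cosine_similarity(sparrow, robin))         # high (close)
print(cosine_similarity(sparrow, cheeseburger))  # low (far away)

# In a purely linear space, the opposite of the opposite is the starting point.
round_trip = negate(negate(sparrow))
print(cosine_similarity(sparrow, round_trip))    # essentially 1.0
```

In this naive linear picture, negating twice returns you exactly to where you began. Part of what makes the Loab story unsettling is that real generative models are not linear in this way: "the opposite of the opposite of Brando" can land somewhere no one asked to go.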

But she is right next to what looks like hell. Recombining Loab’s image with other images–starting with a depiction of angels–prompts the AI to spit out gruesome nightmare visions of gore and torment. If one were prone to anthropomorphize, one might be tempted to say that Loab “reacts” to images of divine bliss by tearing in rage at human flesh.

Maybe the most unsettling claim Supercomposite has made is that Loab is “persistent.” The AI has an affinity for her: it very easily “latches on” to this particular image, reproducing her recognizably in scene after scene. And she is at least the kind of image that tools like OpenAI’s DALL-E (made by the people who brought you ChatGPT) often gravitate toward. Shapes generated by DALL-E are often recognizable but somehow distorted, with a kind of twist that recalls the melting surrealist faces painted by Salvador Dalí (whose name combines with that of the Pixar robot WALL-E to produce the moniker DALL-E).

Who is She?

Is Loab real? The categories we currently use to ask such questions seem inadequate. There is of course a serious possibility that Supercomposite made the whole thing up, or exaggerated it for clout, though she swears she didn’t. Let’s imagine for a moment that she’s telling the truth. Loab remains in some sense a fictional character. She certainly can’t see or hear, any more than a chatbot can–these programs make a good show of it, but they don’t have an inner life.

If Supercomposite is right, though, Loab is a name for a very real set of tendencies in the AI software. She is a pattern of preferences, a habit of contorting the human form and of rejecting beauty in the most apparently ferocious of ways. As the product of a negative process–the opposite of an opposite–she is in some sense pure negation. But she has form.

Try though I might, I can find no better set of ideas for talking about this sort of thing than those proposed by Saint Augustine in his Handbook or Enchiridion (4.13). Augustine famously says that evil has no being of its own but is only the corruption of being: “every real thing in nature is good. Nothing evil exists in itself, but only as an evil aspect of some actual entity.” Evil exists, but it exists because good exists: like a parasite, it eats away at creation, making tears and pockmarks in the fabric of things.

A hole in a wall is not the wall itself: the emptiness has a shape of its own, though no substance. It is a purely inverted and destructive parody of charactēr, less a seal pressed into wax than a chunk bitten out of existence, a shape formed by deforming other shapes. The word for such an entity in the Christian tradition is “demon.”

If demons are real, then Loab could be real, at least in principle. Perhaps more importantly, if Loab is real, then demons are. The patterns and tendencies we build into our machines are not value-neutral, nor will they ever be. They have a character all their own, because they are an impression that we have made on the material world. 

If what we think of ourselves is that we are basically already primitive machines–meat hardware running neurochemical software–then the machines we make to represent ourselves will be congenitally perverse. They will be formed chiefly by the negation of our inner lives, and so of everything that is truest and best in us. 

In one of the Bible’s most chilling descriptions of idolatry, Psalm 115 suggests that worshipers of false gods attribute consciousness to human images which in fact have none: “their idols are silver and gold, made by human hands. They have mouths, but cannot speak, eyes, but cannot see. They have ears, but cannot hear, noses, but cannot smell. They have hands, but cannot feel, feet, but cannot walk, nor can they utter a sound with their throats.”

Such worshipers demean their own souls by imputing souls to their creations. The result is not to give consciousness to artifacts but to take consciousness away from humans: “those who make them will be like them, and so will all who trust in them.”

This is not to say that all AI is inherently demonic. It is to say that manufactured goods of any kind are demonic when we trick ourselves into thinking we have made humanoid life simply by reproducing the outward form of our physical selves. To do so is to proclaim that our physical selves are the only thing we are. And the true horror is that if we insist on this lie long enough, we might eventually believe it.

The American Mind presents a range of perspectives. Views are writers’ own and do not necessarily represent those of The Claremont Institute.
