The internet is now predictably awash with two—and only two—videos from this year’s Golden Globes awards ceremony. In the first, British comedian Ricky Gervais uses his opening monologue to announce what is obviously true, painstakingly unspoken, and therefore punishingly funny: that his audience consists of self-satisfied prima donnas who fancy themselves sages but in fact “know nothing about the real world.”
In the second video, Michelle Williams (star of Dawson’s Creek) proves Gervais’s point with a tearfully heartfelt and quietly sinister acceptance speech. Williams declares that she got where she is by “employing a woman’s right to choose.” This of course is the favorite euphemism of a satanically corrupt and corrupting rhetoric, designed to obliterate all record of what Williams actually did: she allowed an infant in her womb to be murdered. Williams, like the clueless starlets who applauded her, is the victim of that rhetoric as much as she is its perpetuator. She has been gulled by it into trading her living child for lifeless gold, even as she now encourages others to do the same.
I and people like me cheered for Gervais and keened for Williams’s lost baby; others who oppose us cheered for Williams and scolded Gervais. Throughout, I couldn’t help feeling like the whole thing—the good and the bad of it—was playing out inside a snow globe, a little dioramic freeze-frame of a now unendurably scripted culture war.
To be sure, Gervais’s routine was original, admirable, and powerful—a delegitimization of Hollywood culture on its own terms, from within one of its major festivals of self-congratulation. But ultimately it’s not enough: the whole putrescent system needs dismantling. If our problem is that vapid celebrities are pontificating at us from the glitzy podiums of obsolescent awards shows, and our only solution is for another celebrity to pontificate at those celebrities from just such a podium, then we are, um, in trouble.
Long ago in the Athens of the mid-300s B.C., Plato had already coined a term for this predicament of ours. He called it “theatrocracy,” or “rule by spectator.” In Laws, Plato’s last dialogue, an imaginary Athenian recounts a potted history of Athens. “Under our old laws,” says the Athenian, “the general public exercised no control over anything but were, in a sense, voluntary servants of the laws.” A result of this was that, in the theaters where tragedians and solo musicians performed, people submitted themselves entirely to the judgment of trained authorities. These authorities considered carefully which new offerings were worthy and which displeasing.
Eventually, though, a new generation of pop stars took the stage, appealing to an adoring fanbase with unconventional new licks and riffs. Plato complains: “by writing and composing things like that, they planted lawlessness in the hearts of the public.” And as in theater, so in government: emboldened by the frenzy of the concert hall, the masses decided they knew what was best merely by gut feeling. They demanded of their leaders only what would satisfy their immediate desires, not what was in their best interests. And so whereas other civilizations—Persia, for example—hamstrung themselves “by forcing the populace into every kind of servitude,” Athens spiraled catastrophically into a senescence marked by “every kind of liberty” (699d-701d).
To me, Plato’s diagnosis of our situation seems exact, his implied solution intolerable. Certainly we are living in a theatrocracy—an aggressively stupid society whose celebrities train its populace to demand their own immediate gratification at the expense of anything with any real value. Even our delight in Gervais’s polemic—satisfying though it may be—is a symptom of our intense addiction to rule by spectacle.
But, as two and a half millennia and a revolution have taught us, submitting to good and wise men is only advisable if good and wise men are on hand and in power. At present they are neither, which is why Plato’s vision of a society ruled by experts sounds like a horror greater even than having to watch the Golden Globes. It is precisely the experts—the pundits and the so-called statesmen—who have failed us so singularly. They continue to do so, which is why we are at the mercy of celebrities in the first place.
The answer to this must surely be an overhaul in education, one that would retrain statesmen to lead and citizens to identify those statesmen. Only a society so trained is equipped for freedom; only statesmen so educated are worthy to uphold the laws; and only the laws may rightly command the allegiance of free citizens in the last analysis. Until then we will remain trapped in our theatrocratic golden snow globe.
People get naked for lots of reasons. When they make love, or go to the doctor’s, or wash themselves; when they’re skinny dipping or sleeping or posing for a painting. In the ancient world Greek athletes stripped naked to compete, and Celtic soldiers did so to wage war. But everywhere nudity occurs it is extraordinary: even in civilizations where it is common, it is not the norm. It marks an occasion.
I do not deny that people exploit and abuse their nakedness as well as that of others. Strippers, streakers, pornographers: whatever our moral judgments may be about such people, we can see plainly that they participate in a cheapened kind of nakedness. They make use of the act so freely that it first shocks and then loses all meaning. But nakedness itself is so charged with significance that whether flippantly or reverently undertaken it commands, at least initially, a very striking power. So, too, mothers who breastfeed in public will often prefer to do so under some sort of cover—only hardline activists will disagree that a bare chest, even for good reason, entails an uncomfortable degree of vulnerability. All modern injunctions to “free the nipple” and “normalize” public nudity are hopeless for this same reason: per se, nudity is more than normal.
We cannot regard nakedness simply as nakedness because our flesh is not simply flesh. Ineluctably, our bodies and our behaviors mean more than the mere fact of themselves. As an act of expression, getting naked always means self-presentation and exposure. We say we “feel naked” when we have lost some item upon which we rely or revealed an intimate secret we had closely guarded. This is as true of a Celtic warrior as it is of a lover in the heat of passion. The warrior lays bare the raw fury of his strength in the sight of his enemies; the lover puts his most intense desires on display for his beloved; but both say of themselves: Here I am. Here is some profound and essential part of me that I have chosen this moment to unveil.
God came naked into the world. We are liable to miss this fact because it is an obvious element of a familiar story, and because the Gospels say only that Mary “wrapped Jesus in cloths” before she laid him in the manger (Luke 2:7). But of course before that, when he made his first appearance, God was naked.
He must surely have meant something by it. Jesus is how God revealed his character in human form. That is why Scripture calls Christ “the visible image of the invisible God” (Colossians 1:15): in his earthly body and life, the supreme being beyond all thought or sight encapsulated himself in the world of time and space so that we, who live there, could know him.
The beginnings of stories are important. It matters, therefore, that God began the story of himself without clothes on. He chose the language of flesh and blood because it is a language we can comprehend, so I must infer that he meant for us to understand by his nudity what we understand by all human nudity. That is, he meant to say at his birth: Here I am. Here is some profound and essential part of me that I have chosen this moment to unveil.
And like our nakedness, God’s nakedness is unusual: usually, he is clothed. Psalm 93 says that he “is clothed with majesty: The LORD has clothed and girded himself with strength.” In Psalm 104 God is “clothed with splendor and majesty” and covers himself “with light as with a cloak.” Often, assertions like this are accompanied by references to the hugeness and elegance of the natural world. Sunlight streaming through mist over clifftops, or the endlessly intricate patterning of stars on a clear night: that is the kind of garment God may be said to wear.
This is right and fitting, just as it is right and fitting for a king to wear a crown. God, who is mighty and beautiful, has chosen the mighty and beautiful things of the world to adorn himself. But kings get naked too. When they strip off their furs and their jewels, the people closest to them see their private depths. Imagine Livia with Augustus, or Catherine with Henry, or Solomon with Pharaoh’s daughter. Whatever was under the robes of those powerful men—delight, naïveté, animal lust, boyish fear—those women saw it.
Christmas was the day on which God let the rich tapestry of the clouds slip from off his shoulders and shucked the mountains from off his feet. It’s the day on which he peeled off the natural world like an undershirt and revealed what lies pulsing forever behind it all, what there is, essentially, underneath the accessories and accoutrements of grass and sea and sky.
It is something much softer and more heartfelt than we would probably have liked to believe, something small and weeping and tender: an infant child. When God pulls back his clothing to reveal the core of himself, we do not find him towering over us in furious strength. Instead we find him helplessly available, deeply and achingly in love. We find him reaching toward us with wide eyes.
There are two other moments in the Gospels when God gets naked in public, both of them moments of sorrow and passion. In mournful longing on the night before he died, Christ stripped off his clothes to wash his disciples’ feet. The next day he allowed his persecutors to strip him naked and beat him before they hung him to die on the cross. Each time he was laid bare, we saw in sharper relief who he is before and beneath everything else, beneath the planets that he spoke into being and the fabric of space-time that he stitched. Deep in the depths of it all there has always been—will always be—this naked and abiding God who is not ashamed, as we are, to make known the whole of himself. And the whole of himself is love.
There is a passage that’s been haunting me for years now, from the History of the Peloponnesian War by Thucydides. It’s about what happens when a society turns in on itself.
Throughout the late 400s BC, Athens and Sparta—the two great city-states of Ancient Greece—were locked in a furious and ever-escalating struggle for dominance. Caught in the wake, smaller communities around the Mediterranean had to choose where they would stand. Practically no one was neutral or immune: beginning with the island city of Corcyra, polities all over Greece went to war with themselves.
Usually it was the lower-class majority that wanted to side with Athens, and the well-heeled minority that wanted to side with Sparta. The consequent struggles attenuated or nullified all of the bonds that make a group of people into a state. Ancient and sacred laws of religious amnesty were violated. Fathers killed their sons. Besides which—and this is what’s been keeping me up at night—words lost their meaning. Here is what Thucydides writes (3.82, my translation):
They even changed the agreed-upon meanings of words to the opposite of the right ones. Rash impudence was called “courageous loyalty”; careful deliberation, “specious cowardice”…. Getting revenge was considered better than never being wronged in the first place…. The cause of all this was hunger for power motivated by greed and ambition.
Social upheaval, in other words, presents an opportunity for would-be autocrats to make a grab for power by weakening the foundations of legitimate rule. Those foundations are: piety, family, and language.
As Thucydides would probably be the first to say were he alive, these basic dynamics of human behavior have not changed in 2,500 years and are unlikely to change in the next 2,500. It is thus no accident that the major Socialist regimes of the 20th century did their best to redefine important words. “Newspeak” was no mere invention of George Orwell in 1984.
In this winter’s Claremont Review of Books, for example, Charles Horner points out that words like “democracy” and “legality” were not abandoned altogether in Stalin’s Russia but repurposed. The USSR claimed to provide “true” “Socialist” democracy in contrast to the “sham” democracy on offer in Western states. The Communist Party of China, Horner continues, does much the same thing today with the word “Chinese”: “Chinese democracy” is the term used by the CPC to describe a system which of course is not democracy at all.
There is a reason why those who lust after power set about redefining, rather than simply eschewing, the words which properly belong to free men and women. It is because concepts like “democracy” and “liberty,” if rightly used, command a reverence that arises naturally from their true meanings. Freedom and popular sovereignty are self-evidently good and noble things, triumphs of the human spirit which Socialism seeks to crush. Since totalitarians cannot produce such triumphs, they must ape them, using the cover of falsely redefined words to lend themselves an air of undeserved gravity.
If this has not started to sound to you like the identitarian Left, you are not paying close enough attention. Last week, after the abject and supremely merited failure of Senator Kamala Harris to obtain the Democratic presidential nomination, commentator Leah Greenberg wrote that “the implicit racism and sexism of ‘electability’ is deeply damaging to democracy.” Harris, Greenberg implied, was unfairly denied the nomination by an electorate which cannot countenance a black woman in office.
Leave aside for a moment the obvious truth that Harris was in fact denied the nomination because she completely lacked both principle and charisma. Imagine that Democratic voters really had snubbed her because of her race or gender. That would be lamentable, but it would plainly not be “damaging to democracy.” It is in fact the very essence of democracy, the most proper possible definition of the word, that the public choose leaders of their own for whatever reasons they see fit.
What Greenberg really means is: “public opinion has produced an outcome of which I disapprove, therefore I declare that opinion not democracy but rather racism and sexism.” The only possible definition of the word “democracy” as used in this sentence is “society as I demand it be organized.” The only possible definition of “racism and sexism” is “forces which oppose me.” Those, indeed, would be pretty good working definitions of the terms as used today by the far Left writ large.
The co-option of words is, Thucydides knew, just as threatening to good government as the devaluation of religion and the dissolution of the family. The three trends go hand in hand, and they remain now what they have ever been: the unholy tools of aspiring petty tyrants.
My generation was raised on many lies. One of the lies that we Millennials were raised on is: you can be anything you want to be. You cannot. Everyone can be some things. In America, wonderfully, many people can be many things. But time, circumstances, the human body, all constrain what any one person can become or accomplish.
This was not emphasized in our upbringing, or else it was outright denied—if not by our parents, then by our teachers and our cultural environs. It was an unspoken mantra behind our social studies lessons, our favorite movies, the advertisements aimed at us: you can be anything. An astronaut. President. Beautiful. If you want to know the wages of this particular lie, observe what people my age (I am 29) are currently doing to our bodies and our society in the name of radical transgenderism.
Another lie we were told was, God can and probably should be kept out of higher learning. I suspect this was one of the most important lies, because it was one of the ones most seldom articulated outright. Instead it was simply acted upon, made a premise of major structural changes about which we were too young to be consulted but by which we were deeply affected.
Recently I visited Calvin University, a Christian school in Grand Rapids. Mostly it looked normal to me: dorms, rec centers, classrooms, the standard college stuff. Except, of course, the whole thing was suffused with God. The banners outside the auditorium said things like “created to create” and “made in His image.”
What shocked me about this was that it shocked me. I realized that I had actually never seen slogans like that plastered openly on the walls of a university. It felt a little transgressive—scandalous, even.
I went to Yale as an undergraduate. I became a Christian shortly before my freshman year, but I would never have dreamed of going to a Christian school. Back then, I still assumed that a religiously “neutral” institution was the best place for a free and open exchange of ideas.
But as I toured Calvin, it dawned on me that my undergraduate education was not, in fact, neutral. Yale’s central library, Sterling Memorial, was purposefully built to look like a Gothic cathedral. Soon after it opened in 1930, one student called it “a bastardized version of the west portal of an abbey.” Behind the old circulation desk—visibly reminiscent of an altar—there towers a painting of Alma Mater, the university herself personified. She stands beneath the tree of knowledge and is gazed upon reverently by her children.
Sterling was part of a shift in the early 20th century, during which formerly religious schools like Yale re-imagined themselves as secular temples of learning. But the library’s own imagery makes clear that this shift did not remove religion so much as supplant it.
How could it be otherwise? Aristotle was right: the human animal is purposive down to his bones. The things he does, he does because he wants to achieve some good. His actions have a why, express or implied, or else they do not happen. Aquinas identified the Christian God as the final and ultimate good toward which all right human actions strive. This outlook, or something like it, was encoded into the medieval universities whose aesthetics helped inspire Yale’s architectural revival in the early 1900s.
By then it was becoming untenable to make religion of any kind into a backdrop for elite academic endeavor. But this did not obviate the need for a final goal. The iconography of Sterling Memorial bears witness to that fact: you cannot found an institution or even build a building without some purpose in mind. If you remove one purpose, another will slip in through the back door.
And, since purpose of the ultimate sort is worship by another name, the result of Yale’s transformation was not irreligion but new religion. The adoration of the university itself, and its supposedly secular sciences, replaced the adoration of God.
The university in 2019 cannot look in every respect like a medieval academy, nor should it. Perhaps not every school can look like Calvin, either. But Calvin, at least, is honest about whom it worships. You can read about Him right on the walls. By contrast, our secular universities are increasingly devoted to that greatest lie of all: here, we worship nothing.
This simulated nihilism—which is actually dogma by sleight-of-hand—is poisonous and cannot endure. It fosters in its students a dedication to self-destruction, most manifest in confused political agitation and desperately miserable sexual exploits. It’s the kind of thing that would drive anyone to drink, or worse, and does.
If the academy is to do anything other than crumble, therefore, its leading lights must own up to the fact that they worship. And then they must ask themselves that ancient question: whom do we serve?
If our present ideological convulsions may fittingly be called a culture war, then the accompanying clashes between corporate brands and consumers may fittingly be called trials by single combat. Perhaps this was inevitable: the purpose of a brand is to define the customer’s identity, and the American public is undergoing an identity crisis. It stands to reason that we would assemble into opposing camps under the sigils of our favored champions, choosing as representatives the companies which most faithfully display our values.
So it was that, when Nike sponsored quarterback emeritus Colin Kaepernick and his disrespect for the symbols of American patriotism, patriots chose to buy their sneakers elsewhere. So too when Gillette released an advertisement opposing “toxic masculinity,” the response was plaudits from radical feminists and disgust from just about everyone else. We root for products to succeed not just on their own virtues, but on the virtues of their creators’ worldview and, by proxy, our own.
In this climate it has become de rigueur for companies seeking high-profile approval to endorse gay marriage and to over-represent homosexual relationships in their marketing campaigns. People who believe marriage is exclusively heterosexual are therefore left with few brands that they can comfortably choose as their avatar in the mêlée of ideals.
That is why the fast-food chain Chick-fil-A became an unlikely white knight of Christian pop culture. The company closes on Sundays, prints Bible verses on its Styrofoam cups, and prioritizes charity in its financial practices. Most notoriously, its leadership takes the traditional view of marriage and has said so publicly. As a result Chick-fil-A has, for over a decade, been protested, boycotted, and denounced as “hateful” by far-left LGBTQ activists.
Predictably, the charges against Chick-fil-A are outlandish in the extreme. At one point the company’s owners were accused of funding a proposed Ugandan law that would have made homosexual acts punishable by death. In fact what they did was help fund a pastor who went to Uganda and spoke against the law. Chick-fil-A did not back down for many years, and so it came to be cherished as a lone bastion of clarity and courage. But this week the restaurant announced that it would discontinue its donations to charities (the Salvation Army and the Fellowship of Christian Athletes) which have been deemed homophobic.
What is meant by this is simply that those charities do not support gay marriage. Since the Salvation Army provides aid to all homeless people regardless of their sexuality, and since homeless youth are disproportionately gay or transgender, the charity actually does a great deal to help LGBTQ people. But because its leadership is old-fashioned on the marriage question, it is considered an “anti-LGBTQ” organization by, among others, CNN and the Los Angeles Times.
There is only one conclusion that may be drawn from this. Agitators of the kind that have targeted Chick-fil-A are not merely, or even primarily, interested in ameliorating the lives of gay people. They are opposed per se to Christians holding their traditional beliefs about marriage. Chick-fil-A and the Salvation Army are under fire purely because they think, and say, that God wants men to marry women and not other men. For the extremists of the LGBTQ movement, it is impermissible that there should exist even one business whose owners believe that.
At the climax of the West’s oldest military epic, Homer’s Iliad, the mortal enemies Achilles and Hector meet to do battle. Hector, manifestly the weaker man, asks Achilles for an oath that the victor will afford his victim a proper burial. But Achilles, blinded with rage and thirsty for blood, sneers, “there can be no trustworthy oaths between men and lions, nor soft-hearted agreement between lambs and wolves” (Iliad 22.261-2). The attack on Chick-fil-A makes plain that radical leftists are not in this for a compromise, or an equitable live-and-let-live solution. They are in an existential battle between lions and men. Destruction, not détente, is the goal.
That is why war—and not, say, business negotiation—really is the proper metaphor for what is going on here. It remains uncertain what Chick-fil-A’s long-term strategy will be. But its adversaries’ strategies are obvious: accept no concessions, take no prisoners. The Advocate, for example, swiftly insisted that “Chick-fil-A Still Isn’t LGBTQ-Friendly, Despite Pledge on Donations.” As long as “LGBTQ-friendly” means not “tolerant of gay people” but “actively in favor of gay marriage,” no conservative Christian organization will ever fit the bill. This is not a silly or a small thing. Christians are invested in Chick-fil-A’s decisions not because they are obsessed with fried chicken, but because they understand the stakes of the confrontation playing out in front of them. In the language of brand warfare, radical gay activists have made their message to the Church quite explicit: nothing but your annihilation will be enough.
A canon is a stick for measuring things with. That’s what the Greek word kanōn, cognate with the Hebrew kanna, means. When ancient Greek music theorists, for instance, wanted to mark out the locations on a string where the notes of the scale were played, they stretched the string along a plank of wood and made notches at the points where the most important harmonies emerged. That instrument was called the kanōn.
The first literary corpus ever described as a “canon” was the Bible. Early Christians referred metaphorically to the scriptures they deemed legitimate as a kanōn: a central set of texts which records what is reliable and indispensable, just as a measuring rod records what is basic and accurate.
That is why a literary canon neither should nor could be merely a list of the most enjoyable books. The books which make it up are usually enjoyable, to be sure. But they are not canonized because they are enjoyable: they are enjoyable because they speak with authority about some centrally important truth. Plenty of books, moreover, are excellent but do not speak with such authority. It is no slight to their excellence to say that it is not of the canonical sort.
For this reason, too, it matters that canons often have national or cultural adjectives attached to them: the Western Canon. The French Canon. The American Canon. It would be futile to try and list all the best books ever, full stop. The impossible enormity and fatal subjectivity of such a task is becoming painfully apparent as digital technology makes it feasible, at least hypothetically, to compile endless but indiscriminate databases of every keystroke ever struck. In the face of such limitless data, the questions which haunt us grow ever more basic and vexing: how do we organize all this? How do we keep track of everything important?
We cannot, and the wisest among us have never tried. That is why the Bible is not an exhaustive list of everything that Christians may profit from reading. It is an essential collection of everything Christians must read to know who they are. The books of the Bible tell us—through poetry, allegory, philosophy, history—what it has meant and should mean to be God’s people. The American Canon can teach us what it means to be the American people.
Canon-building, like nation-building, is therefore an exercise in identity formation. And just like nationhood, canonicity has been plagued of late with anxieties concerning its exclusivity, its self-serving bias, its chauvinism. Why delineate canons at all, if to do so is only to perpetuate the unmerited influence of a well-heeled elite?
This uncomfortable question has an equally uncomfortable answer. That prejudiced, insular, imperfect elite—those dead white men (mostly)—made our society what it is. And their influence is not, in fact, unmerited. We have no other America than the one they built, the one which, over time, has come to enfranchise more kinds of people than any other country on earth.
Those who object to the idea of an American or a Western Canon, who suggest that we refrain from teaching white male authors in schools so we can even up the scales, are ipso facto asking to scrap the American project and start over with a blank slate. Given the history of utopian movements, I am not sanguine about the prospects of such an effort. Far better, in my view, to get out the measuring stick and take stock of who we are.
To do so is not to freeze time or foreclose reform. To the contrary: an appetite for innovation and development is essential to the American spirit. After all, few American novels better express that spirit—few are more unquestionably canonical—than Mark Twain’s Huckleberry Finn. In Twain’s pivotal scene, Huck confronts his certainty that he will be eternally damned if he abets the escape of Jim, a runaway slave. That is what his society—an American society in transition—has taught him. And yet, reflecting on Jim’s kindness and obvious humanity, Huck cannot do it. “All right then, I’ll go to hell,” he says. In the face of every authority, he sides with his friend Jim.
There, in one moment, is a perfect encapsulation of all that America is both in reality and in aspiration. Huck—fierce in his independence, dogged in his determination to do good, entangled in historical injustice yet revolutionary and visionary in his moral intuition—is us. America has no shortage of art that captures this ethos—from Moby Dick to Star Wars, from Louis Armstrong to Eric Whitacre. The richness and complexity of our heritage, the roadmap of our future, the outlines against which we measure ourselves: that is what a canon preserves. That is why we need ours.
In the late 500s BC, the military dictator Aristodemus took over the Greek colony of Cumae. He slaughtered his enemies en masse and undertook to ensure that no Cumaean man would ever be more than his slave.
Here is how he did it, according to the historian Dionysius of Halicarnassus. “To ensure that no noble or manly aspiration would arise in any of the citizens, he decided to feminize every young man by means of his upbringing in the city’s schools.” Aristodemus had the boys of Cumae wear long hair and embroidered gowns; he made them listen to soft music and keep out of the sun; he starved them of adult male guidance. This was so none of them would ever grow up strong enough to stand against him (Roman Antiquities 7.9).
What a paranoid and oppressive autocrat did to the sons of his subjugated people, American mental health professionals now propose we do to ourselves. In August of 2018, the American Psychological Association issued its first-ever “Guidelines for Psychological Practice with Boys and Men,” of which the first directive is that psychologists should “strive to recognize that masculinities are constructed based on social, cultural, and contextual norms.” In other words, treatment of boys and men should begin from the premise that manhood is culturally contingent and therefore alterable. The goal of therapeutic practice then becomes “to help boys and men over their lifetimes navigate restrictive definitions of masculinity and create their own concepts of what it means to be male.”
Graduate programs in psychology cannot gain accreditation or train students for licensing without the APA’s official imprimatur. It stands to reason that schools will feel strongly encouraged, at the very least, to conform their instruction to what the Association dictates. Not that institutions of higher learning typically need such encouragement: at Stony Brook University in New York, for example, the Center for the Study of Men and Masculinities is dedicated to deconstructing “traditional” manhood. Its website offers resources such as an article on “academic efforts to decode men.” And Reddit, the major discussion website, now directs users to these very resources via hyperlink whenever they attempt to access a thread about manhood that it has deemed toxic.
Professional ideologues, then, are making their best efforts to train biological males out of their natural impulses toward strength, endurance, physical courage, and emotional self-control. Boys who find themselves lacking in these characteristics—as every young man does at some point in his development—typically experience a sense of inadequacy. Traditionally, caring adults have tried to alleviate that inadequacy by helping boys grow into themselves—by helping them attain the masculinity that is their birthright but not yet their achievement. The new diagnostic recommendation, however, is to treat all such feelings of self-reproach as needless impositions from an outmoded worldview in need of radical deconstruction.
What conservatives typically emphasize in response is that biological sex does matter, that men’s yearnings to be manly are indeed authentic and spontaneous. This is entirely true. But it misses something, something that Aristodemus knew: there is also a part of gender which is learned and taught. We experience certain natural ambitions, but then we build societies and traditions which honor and channel those ambitions. Most boys are born with an interest in fighting and competing, but no boy is born knowing how to play football or hold a gun. We school one another, generation to generation, in the ways of manhood.
Therefore if you train impressionable boys to disassociate themselves from their sex, they will indeed lose the sense of grounding and orientation that comes with proper instruction—they will indeed become “feminized” like the children of Cumae. That is why the efforts to degender our society are often focused on children. Public schools now teach gender theory using cartoon characters as diagrams. Little girls wearing male clothing were cheered on national television in October of this year at the Democrats’ “Equality Town Hall.” In the same month a seven-year-old boy was very nearly subjected by court order (subsequently amended) to hormonal alteration by a mother who encourages him to consider himself female. If it sounds alarmist to say that “gender theorists are coming for your children,” good. They are coming, and it is alarming.
The answer to this is not only to insist that “male” and “female” are real, natural categories: it is also to acknowledge that one natural component of those categories is aspiration. There is nothing harmful in exhorting a boy to “be a man.” If he is not yet—and no boy is—he will be told by activists and perhaps his teachers that he does not need to be. But the longings of his heart will tell him that he should, that he can. It is the business of gender theory to extinguish those longings. It should be our business to defend them at all costs.
When Kanye West’s new album, “Jesus Is King,” dropped last week, my Twitter timeline resembled a support group meeting for recovering abuse victims. Many of my followers are conservative Christians, and they emerged—wary, skittish, but achingly hopeful—to hear what Ye had to say about his return to Christ.
On the one hand, they were concerned that Kanye would make a mockery of the thing—he is famously arrogant to the point of blasphemy (“I Am A God” is the title of a song from earlier in his career), and his public persona is at best mercurial, at times unhinged. On the other hand, previous offerings—most notably 2004’s “Jesus Walks”—had expressed profound spiritual struggle and yearning: “I want to talk to God, but I’m afraid because we ain’t spoke in so long.” There was every chance that this was a very real, very public and high-profile moment of redemption. But it was equally possible that West might come out with something heretical or at least bizarre.
And on a deeper level, my traditionalist friends were suspicious because rap, hip-hop, and pop music are often flatly antagonistic toward them and their beliefs. Earlier this year Taylor Swift came out with the music video “You Need To Calm Down,” a candy-colored gay fantasia with a message for all who consider homosexuality and transgenderism anything other than normal, beautiful, and worthy of celebration. The message was: “You need to just stop / Like can you just not…?”
T-Swift encapsulates perfectly the venomous disdain, glossed and packaged as youthful positivity, with which most great luminaries of the pop scene treat old-fashioned values. Such disdain is expressed not only directly as in Swift’s work, but also indirectly in the form of aggressively explicit sexual posturing (watch, if you care to, the music video for Nicki Minaj’s “Anaconda” or Katy Perry’s “Bon Appétit”) and in the violent nihilism of artists like Eminem and Tupac Shakur. These giants of the industry present the degradation of self and other with a brazenness that implicitly demands either acquiescence or ostracization: get on board with this or be risible, irrelevant, and out-of-touch.
The attitude of pop music distills and concentrates the attitude of pop culture more generally, so doctrinal Christians have largely inferred that they cannot be cool. Or else that they can only be cool ironically, at a remove. The two available orthodox stances on this issue have been repudiation (“I don’t watch Game of Thrones and neither should you”) and strained tolerance in the form of ostentatious enjoyment (“yeah man, I watch Game of Thrones—how do you do, fellow kids?”). What is simply not available to traditionalists is ownership—they cannot, unless they abandon their convictions, treat mainstream music or television as something to be wholeheartedly endorsed without qualification.
This, I think, is why such people approached “Jesus Is King” with a kind of grim fascination, holding their breaths and cringing even as they compulsively streamed the album: they couldn’t look, and they couldn’t look away. Then slowly, with ever-deepening enthusiasm, it dawned on them that the man had actually been saved. The album is a largely unobjectionable and ostensibly sincere profession of faith.
Christians responded accordingly. “From what I’ve heard so far [it] seems pretty? Sound theologically?” one of my friends ventured cautiously on her first listen. Sohrab Ahmari of the New York Post tweeted: “Kanye’s ‘Closed on Sunday’ [an early track on the album] is one of the most reactionary things ever injected into the mainstream of American culture. I love it.” David French quote-tweeted West approvingly, and Andrew Walker at National Review wrote that Kanye is “just the figure…to bring a needed message that our society should reconsider what it deems praiseworthy.”
In one later track on the album (“Hands On”), Kanye worries that moralizing Christians will be the first to judge him for claiming Christ despite his checkered past. He could be forgiven for thinking so. In the event, though, the reaction from the Christian Right has mostly been gradual but passionate praise. Kanye has been compared to the prodigal son, to the repentant prostitute, to Saint Augustine—to all those heroes of faith whom God used for great things despite their great failings. That’s a bit much, but it tells you something: by and large, the Christian Right is remarkably ready to welcome anyone who can speak the Gospel into the culture from within the culture itself.
Compare this reaction with those of more putatively urbane cultural outlets, which have regarded Kanye’s conversion with slack-jawed incomprehension. Spencer Kornhaber wrote in the Atlantic that “Jesus Is King” consists largely of “pillow stitchings about devotion.” Noting Ye’s newfound commitment to sexual purity, Rawiya Kameir of Pitchfork said that “his interpretation of the gospel has been more dogmatic than faithful” (a meaningless distinction) and that “there is not enough depth here to distract from his [pro-Trump] politics, or to complicate them.”
There actually does not exist any room in mainline cultural commentary for the notion that a true encounter with the risen Lord might have occasioned a genuine transformation in West’s previously embattled and troubled soul. Faced with such a reality, reviewers have only snide belittlement to offer, and fallback accusations of inanity. But “Jesus Is King” is not inane: it is raw, muscular, and direct, a blunt rebellion against an ever more depraved society: “no more living for the culture, we nobody’s slave.”
“We nobody’s slave.” That is what critics cannot abide or understand, and what Christians are now welcoming with gratitude and recognition: the Gospel does not play by anybody’s rules. The wisdom of the cross is folly to those who are perishing, and “Jesus Is King” can only be folly to those who believe that its title is a lie. But this does not deter Christ, who enters into spaces that are furiously hostile toward him—spaces like pop music, or Kanye West’s heart—and claims them as his own.
On September 27th, 2018, the behemoth online discussion forum Reddit “quarantined” 23 of its “subreddits,” subsidiary discussion threads within the larger site. Internet communities “dedicated,” in Reddit’s words, “to shocking or highly offensive content” would be placed on a kind of probation. Quarantined subreddits are disallowed from generating ad revenue. They cannot be found in searches, and it becomes impossible to see how many members subscribe to them. Quite a few of those quarantined in September were subsequently banned altogether.
All subreddits are titled according to the format “r/[nameoftopic],” and the September list included such appalling titles as “r/watchpeopledie” and “r/whitenationalism.” As a group they were ideologically quite diverse: “r/FULLCOMMUNISM” was also put on the watch list, as was “r/theredpill,” a thread infamous for hosting the various fractious splinter groups known collectively as the “Alt-Right.”
Reddit is not the only platform faced with charges of enabling online extremism, and not the only one to take measures meant to blunt them. The censorship of views deemed unpalatable by social media companies is now a matter of regular and tendentious attention, most recently because Facebook founder Mark Zuckerberg was called to testify before Congress on October 23rd of this year. The House raised the same question that the Senate did during Zuck’s appearance there in April of 2018. Namely: what right does Facebook have to make decisions about the sorts of content you ought and ought not to consume?
As a legal matter, the answer seems to be: Facebook (or Reddit, or Google, or Twitter) has no such right, if it wants to continue enjoying the extraordinary protection from libel law that is afforded to platforms (purveyors of content) as distinct from publishers (curators of content).
But from a purely philosophical standpoint, as a measure of the zeitgeist, Reddit’s practice of quarantining actually speaks to something deeper than the law. A quarantine is something you do when you do not want something—usually a disease—to spread. To “quarantine” ideas therefore implies that what you are really worried about is not that those ideas are false, but that they will catch on. Reddit’s choice to conceal membership statistics for quarantined threads betokens the same anxiety: if people know that some view or set of views has a substantial following, perhaps they will start to wonder whether there might just be something to it.
Reddit is therefore not merely trying to weed out falsehoods or incitements to violence. Such projects are already dubious enough when undertaken by an overwhelmingly liberal Silicon Valley in tandem with leftist “fact-checkers,” who are perfectly willing to smear even conservative satire with risible charges of misinformation.
But quarantining isn’t actually framed in those terms. It’s predicated on a complete unbelief in the basic principles underlying the marketplace of ideas. From John Milton to John Stuart Mill and right on into the American republic, the governing consensus has been that free men and women get to decide for themselves what they find convincing. This was the reasoning behind 1977’s National Socialist Party of America v. Village of Skokie, wherein the Supreme Court famously allowed even Nazis to demonstrate in public.
We believe—or used to believe—that if the public is empowered to consider all the options then the best ideas will win and the truth will out. Reddit’s owners have shown that they simply do not think this is the case: they believe that good ideas, as defined by them, need a thumb on the scale to help them compete in the market.
Evidence would suggest that they are actually wrong about this—before being quarantined, all 23 of the threads targeted in September had (on the unlikely assumption that none of them shared any subscribers in common) a collective membership of 1,363,282. That’s 0.42% of the U.S. population.
But let’s imagine that a group which Reddit considered heinous did show signs of gaining real traction. Would that not suggest that the beliefs espoused by that group deserved a hearing, not a health warning? Shouldn’t people be allowed to get swayed, if they are swayed, even by arguments which their “betters” deplore?
Discussing the Zuckerberg hearings, 60 Minutes host John Dickerson said at Slate this week that Facebook should nix posts which present “characterizations and framings and elevations of certain issues that distort political reality…but that don’t have a specific fact that can be pinpointed…to remove them.” Emily Bazelon, in agreement with Dickerson, endorsed censorship of such material “even if it’s not completely false.” Jeff Zucker, president of CNN, went so far as to suggest that Facebook should “sit out” the 2020 election and muzzle political advertising entirely, “given what happened in 2016.”
There is reasonable discussion to be had about how to navigate and manage our new informational ecosystem. But at this point, that discussion is no longer being framed by the Left in the terms essential to free speech. Leftist dogma is instead threaded through with an extremist rhetoric of suppression and control. That control—tyrannous and self-satisfied as it is—should be resisted at every turn.
Let me tell you three stories, and then let me ask you some hard questions.
The first story is this. In February of 2019, after 23 weeks of gestation in her mother’s womb, a baby girl was treated for the developmental abnormality known as spina bifida. Doctors at Cleveland Clinic in Ohio performed a delicate new form of fetal surgery which drastically improved what would otherwise have been lifelong discomfort and impairment. A computer-generated depiction of the procedure went viral and was greeted with joyful wonderment.
Here is the second story, a horror story. Last Friday, also in Ohio, a federal appeals court blocked enforcement of a law which would have prohibited doctors from performing an abortion if the mother’s reason for terminating her pregnancy was a prenatal diagnosis of Down syndrome. It is therefore legal for a mother to extinguish her baby’s life if she feels it will be beset with adversity. To read the ruling is to find oneself staring into the abyss of a society which would rather execute infants than be discomfited by their pain.
The third story is the strangest. There is a kind of bird, the zebra finch, which learns its pattern of song from its father. At UT Southwestern Medical Center, scientists discovered that by interfering with these birds’ neurological pathways they can implant false memories, making finches sing songs they never learned but think they did. Though there are no immediate plans to apply the technique to humans, the New Scientist reported that some researchers “hope we will one day be able to alter memories associated with psychological trauma.” If so, then people haunted by their past may someday be offered treatment to make them forget that past ever happened.
Here come my questions about all this. It seems to me that when science restores to our bodies the wholeness for which our souls are destined, then human resourcefulness reaches its highest consummation. We are born broken into a broken world, but we intuit that we were meant to be well. This is why our hearts leap up to see new technology which can repair spina bifida, or give sight to the blind: viscerally, we know science is here setting something right which was deeply wrong.
It seems equally clear to me that the state-sanctioned murder of babies with Down syndrome is an abomination that should make our children’s children ashamed to own us as their ancestors. Whatever the answer is to the injustices of biology, it is certainly not to spare the comfort of the strong by slaughtering the weak.
But between these two extremes of right and wrong there is a chasm of uncertainty, and that is the chasm of the songbird. Every day the technology with which we can predict and alter the development of biological organisms becomes more sophisticated. We are verging on a world in which all aspects of our physical, hormonal, and neurological constitutions will be up for editing.
The first question is, what should we change, and on what grounds?
For, though we are not merely bodies, we are wholly embodied. Our dreams, delights, and heartaches are encoded into physical structures—which now can increasingly be toyed with. If a man was abused by his father; and if the memory of that abuse lives in some quadrant of his neural network which can be isolated and annihilated; and if by that annihilation the man is freed from a lifelong sense of self-loathing—should he be? Is the ease of his mind worth the erasure of his past? Which of our wounds make us who we are and so must be preserved, no matter how badly they hurt? Which are alien to our essence and so should be healed for the glory of God?
Horrifyingly, there is a strain of thought in the West which holds that suffering should be eliminated at all cost: delete memories, splice genes, kill babies, if only it alleviates the contortion and agony that comes from being as we are born. If this is to remain our governing philosophy then the use to which we put our ever-developing scientific powers will be fearsome indeed.
There must, must be another way. But what is it? Down syndrome, gender dysphoria, homosexuality, depression, anxiety: it is not out of the question that we will soon be able to identify and “correct” all these eccentricities, and others besides, before the patient is even breathing. Once we can, we will have to ask, should we? Which ones? Why, or why not? According to what moral rubric?
Behind all these unknowns lie two fundamental questions: what is it to be human? What is it to thrive? If and only if we think seriously about the answers now, then there is yet hope for the swiftly approaching future.
This morning the Economist ran one of its daily podcasts, one topic of which was Trump voters in mining country. These voters, said Political Editor James Astill, are continuing to support the president even though their jobs have not returned as he promised they would. Astill’s takeaway insight from this phenomenon was that “cultural identity” is more important in American politics than economic identity or class.
That phrase has become the reflexive explanation among the commentariat for why the miners and the Rust Belt think and vote the way they do. And, hearing it, two things suddenly hit me. First, Astill’s remark slammed home what a condescending trope “cultural identity” is as an account of blue-collar American politics. Second, I realized that I had never noticed this before—even though the phrase has long been a staple of my daily news diet.
The “cultural identity” trope is so condescending because it brushes aside the possibility that people in Kentucky might actually have beliefs and ideas which they value above all else. Among the possible explanations for a rural voter’s conservatism which occurred to Mr. Astill—cultural identity, economic identity, class—“intellectual principle” was nowhere to be found. And yet such voters do not oppose illegal immigration, support the right to bear arms, or care about religious freedom because they believe that dumping those views would somehow exclude them from being white, or Christian, or from Kentucky. They do not vote for President Trump because they think it will confer upon them some cultural status. They do it because they agree with him.
Imagine that: the voters in mining country stand by President Trump because he stands for political principles which they think pass logical muster, and which they therefore would like to see represented by their elected officials. They believe that those ideas are even more important than their mining jobs and than their “cultural identity.” Yet because many of them probably do not have advanced degrees or posts at the Economist, their “social superiors” have determined that their “unfathomable” behavior must be attributed to atavistic tribalism.
The grand irony here is that such tribalism is actually a hallmark of the identitarian Left and its warmed-over Marxism. It is Marx and Engels whose materialism teaches that all political action emerges deterministically from the actors’ social milieu, and it is identity politics which turns that philosophy into a worldview of pure groupthink. But, as always, the person doing the explaining is somehow exempted from the explanation: the writers at the New York Times and the Washington Post are clever enough to have ideas. For Trump voters, though, everything must boil down to “cultural identity.”
It is not news that our elites are supercilious about Middle America. What goes too often unremarked upon, however, is that their disdain is not always readily apparent even to those of us who are quite skeptical of our intellectual ruling classes. Skilled media commentators can fold their condescension into phrases like “cultural identity,” which are pronounced glibly and with an air of confidence so that they slip by listeners who might take issue with their implications were they expressly spelled out. This is a mode of avoiding argument: there is an entire ideology, breezily assumed but never made explicit, behind phrases like “cultural identity.” Such phrases are so embedded in the discourse that they threaten to colonize the thought of even intelligent people who might otherwise object very strenuously.
Manipulating the meaning of words is a highly insidious and effective method of pursuing a cultural program without having to defend it. It is a powerful resource of the radical Left, and should be identified wherever possible so it can be defused. Here as elsewhere, sunlight is the best disinfectant.
Last night for the first time in ages I thought about Super Mario Mac & Cheese shapes. Remember those? The Kraft kind. Super Mario Mac came in the usual cardboard box, only instead of the standard curved noodles it had starchy little effigies of Princess Peach, Yoshi, Toad, and the bouncy plumber himself. These you slathered with the requisite “cheese,” prepared from a bag of dehydrated powder plus four tablespoons melted butter and a quarter cup of milk.
They tasted better, is the thing. So help me, Super Mario tasted better than any other form of Kraft pasta. It wasn’t just that the pockets in the shapes could hold more cheese, because Nintendo Mac also tasted better than the other kinds of shapes you could get—Disney characters, say, or dinosaurs. Inexplicably, Super Mario cheese was richer, sharper, more flavorful.
This is what I remembered last night, what I could practically taste as I thought of it, what I tried with Proustian longing to describe as I frantically googled pictures of pasta on my phone to the bemusement of my partner. The subjectivity of the thing struck me. Apparently in the attic of my mind, if you rummage past the Greek verb conjugations and email account passwords, you will find, hanging around like a stuffed animal from childhood, the memory of what Super Mario Mac & Cheese tastes like to me.
It is undeniably the memory of an experience—not a memory of the mere fact that such Mac once existed. Stubbornly, defying all logic, my memory insists with the ferocity of a toddler that this kind of Mac is the best kind.
Upon reflection it occurred to me that this is important. It is important because as a society our focus is fixed almost exclusively upon the ever-growing capacity of machines to do our remembering for us. Etched into the coding of a thousand thousand servers, there hovers all around us a database of such scope and detail that the librarians at Alexandria would have blushed to think of it. Not only every book but every tweet ever written is steadily being digitized and so immortalized; every face and every movement, every word uttered, lives on. This inspires equal parts awe and paranoia, but most of all it obsesses us: we are constantly talking about it, reflecting upon it, wondering what the consequences will be of this new capacity for limitless retention.
What escapes us, often, is how different this kind of machine memory is from the human kind, the Super Mario Mac & Cheese kind. Already in ancient Athens, when the technology of writing was overtaking the spoken word, Plato worried in the Phaedrus that human beings would outsource the work of memory to papyrus with disastrous effects. Words remembered by tools and not by people, says Socrates in Plato’s dialogue, are like mute statues: they cannot move or breathe or answer questions. They do not live.
So too the data and information stored in our cavernous new online archives. Vast though it is, the Cloud has no way of, or interest in, performing the kind of selection and curation that my subconscious performed with the memory of Mac & Cheese. Machines are indiscriminate and factual; they remember that x was done by y at time z. Our human memories do something else: they select and magnify, distort and amplify, according to what an experience is like for us.
C.S. Lewis, in his novel Out of the Silent Planet, has one of his characters say this about the experience of meeting a dear friend for the first time:
When you and I met, the meeting was over very shortly, it was nothing. Now it is growing something as we remember it, what it will be when I remember it as I lie down to die, what it makes in me all my days till then—that is the real meeting. The other is only the beginning of it.
There is a kind of remembering that only we can do, the Mac-&-Cheese remembering which expresses to us some reality of our experience beyond the pure facticity of it. When you meet the love of your life for the first time, perhaps the fact of the experience will be unremarkable. It will be Wednesday, maybe. Maybe you will wear blue socks. None of that will matter. What will matter is when you look back on that day after ten years of marriage, and the memory glows with a kind of warm significance which records the reality of it: that it changed your world.
We know that this kind of reality is real: it is, in fact, the only kind that counts. It is the reality where love lives, and beauty, and desire. Millennials are famous for nostalgia—for making Buzzfeed lists and Facebook posts about that funny kind of pen they liked in middle school, or the way they used to feed their Neopets. I wonder whether this is why: because by remembering the little things that are precious to us we are clinging to the human type of memory. The type that elevates things which are of no significance to machines and infinite significance to us—pokey, subjective, storytelling beings that we are.
Spencer Klavan is assistant editor of the Claremont Review of Books and The American Mind. His book on ancient Greek music is forthcoming with Bloomsbury, and his devotional writing can be found at www.rejoice-evermore.com.