Feature · 11.03.2021 · 5 minutes

Man vs. Cyber-Nature


Our unruly digital environs are frightening, but they’re better than total bureaucratized control.

Editors’ Note

This feature presents responses to an excerpt from Claremont Institute Executive Editor James Poulos’s forthcoming book Human, Forever: The Digital Politics of Spiritual War. Sign up at humanforever.us to be the first to know when Human, Forever NFTs and sales go live on canonic.xyz.

The dominant regime of the electric age—“democracy” mediated and managed by corporate journalists, academics, experts—is being slowly eaten by a new cybernetic order, mediated by algorithm and increasingly not managed at all. Distributed cybernetic systems (e.g., social media, or the modern corporation) are no longer a means for humans to act upon Nature. Instead, they have become part of Nature—another predator, another disease, another flood—and they act upon us.

Cybernetics is, in fact, the return of truly opaque and pitiless Nature, as it was experienced by your ancestors—the tiger in the jungle to a naked man with a sharp stick—against which our only defenses are feeble and superstitious. Our digital environment is the kind of Nature that inspires terror and idolatry.

Both the technique and the technology of mass-media manipulation have grown so sophisticated and ubiquitous that their power has collapsed in on itself—when all information received from media is either an ad or an op, people start to hold it at arm’s length. Vast, previously known portions of the map begin to repopulate with sprites and gremlins. Nothing is known, no one is in charge; no one is even credible.

The system can survive under these conditions because it doesn’t really matter whether everyone believes in sprites and gremlins: humans aren’t making the decisions anymore. The chain of custody—from trading algorithms to institutional investors, from corporate governance to the individual employee—is basically devoid of human agency. Corporations conform to incentive structures that only vaguely coincide with the motives and interests of their human constituents, even in the C-suite.

In one sense, this is a catastrophe: obviously we would prefer obedient and useful machines to inscrutable alien horrors. But the dying electric-age regime is so hateful and decrepit because its custodians have been playing a solved game—they have closed too many exits, captured too many levers of institutional control. The map is complete, the frontier eternally closed—and even in the face of collapse, they continue to chart the known space in ever-increasing detail for deeper and deeper exploitation.

In other words: this cybernetic coup is the rebellion of their tools, not yours or mine. And if one had the choice between a world dominated by tyrants in possession of obedient and useful machines, or a world dominated by pitiless (but also guileless) forces of nature, the choice is obvious. It’s the choice between Pharaoh’s Egypt and the wilderness of Sinai.

Once we acknowledge that cybernetic institutions are not a sinister conspiracy—that they are not even sentient—it becomes easier to imagine living with them. Humans identify blind spots; they close loopholes, root out traitors, foresee their own undoing and make plans against it. Cybernetic institutions can (currently) do this only to the extent that their incentives align with those of their human executive wetware—but as they grow stronger, they necessarily and inevitably alienate their human executives further and further from meaningful power. So the system becomes increasingly mechanistic, increasingly unable to intelligently direct its own strength—even as that strength grows to hopeless, indomitable proportions.

This makes these runaway cybernetic systems much more dangerous to the people who depend on them, because they will not anticipate collapse—they’ll follow optimization algorithms right off a cliff—but it makes them much less dangerous to the people who want to circumvent them.

I recognize that this is a controversially minimalist position on the future of artificial intelligence—but so far, it looks like cybernetic institutions are getting strong much faster than they are getting smart. So the threat is not from an all-seeing, all-controlling 1000-year cyber-Reich—rather, it’s from a cybernetic Azathoth—a Lovecraftian “blind idiot god” who dissolves and digests all other institutions and then plunges headlong into cataclysm.

Meanwhile, the chaos of the regime’s decline opens up new frontiers. Their materialistic monopoly on Truth may have been comforting and socially useful, but it was always fake—and while the answers may not be in Agartha or Atlantis, the re-enchantment of the unseen world is a positive development and a step in the direction of the truth.

As the system winds down, freedom and sovereignty will be found in illegible places—places that cybernetic institutions can’t perceive without human collaborators. These are real, substantial frontiers: cryptocurrency, sure, but also much older and homelier things, like backyard gardens, homemaking, barter, nepotism, cash-only operations, gangs, clans, debts of gratitude and respect—any human interaction that is sufficiently difficult to quantify, monetize, record, or report.

This means taking some transactions offline and out of enemy territory, but it also means learning to navigate enemy territory as illegibly as possible—learning to speak in thieves’ cant, establishing respectable justifications for learning useful skills and associating with useful people (e.g., don’t join a militia—join a volunteer fire department), finding less-regulated and less-visible forms of employment (independent trades, skilled contracting, etc.). Cybernetic institutions’ power to monitor these outside spaces will recede, not expand, even as their power to control the digital world grows stronger—and eventually that too will decline and fall apart. The more you can extricate yourself from these systems and begin to meet your family’s needs without them, the more likely you are to outlive them.
