The New Hatred of Technology

People have never been better, here in the Year of Our Simulation 2024, at hating the very forces underlying that simulation—at hating, in other words, digital technology itself. And good for them. These everywhere-active tech critics don’t just rely, for their on-trend position-taking, on vague, nostalgist, technophobic feelings anymore. Now they have research papers to back them up. They have bestsellers by the likes of Harari and Haidt. They have—picture their smugness—statistics. The kids, I don’t know if you’ve heard, are killing themselves by the classroomful.

None of this bothers me. Well, teen suicide obviously does, it’s horrible, but it’s not hard to debunk arguments blaming technology. What is hard to debunk, and what does bother me, is the one exception, in my estimation, to this rule: the anti-tech argument offered by the modern-day philosopher.

By philosopher, I don’t mean some stats-spouting writer of glorified self-help. I mean a deepest-level, ridiculously learned overanalyzer, someone who breaks down problems into their relevant bits so that, when those bits are put back together, nothing looks quite the same. Descartes didn’t just blurt out “I think, therefore I am” off the top of his head. He had to go as far into his head as he humanly could, stripping away everything else, before he could arrive at his classic one-liner. (Plus God. People always seem to forget that Descartes, inventor of the so-called rational mind, couldn’t strip away God.)

For someone trying to marshal a case against technology, then, a Descartes-style line of attack might go something like this: When we go as far into the technology as we can, stripping everything else away and breaking the problem down into its constituent bits, where do we end up? Exactly there, of course: at the literal bits, the 1s and 0s of digital computation. And what do bits tell us about the world? I’m simplifying here, but pretty much: everything. Cat or dog. Harris or Trump. Black or white. Everyone thinks in binary terms these days. Because that’s what’s enforced and entrenched by the dominant machinery.

Or so goes, in brief, the snazziest argument against digital technology: “I binarize,” the computers teach us, “therefore I am.” Certain technoliterates have been venturing versions of this Theory of Everything for a while now; earlier this year, an English professor at Dartmouth, Aden Evens, published what is, as far as I can tell, its first properly philosophical codification, The Digital and Its Discontents. I’ve chatted a bit with Evens. Nice guy. Not a technophobe, he claims, but still: It’s clear he’s world-historically distressed by digital life, and he roots that distress in the fundaments of the technology.

I might’ve agreed, once. Now, as I say: I’m bothered. I’m unsatisfied. The more I think about the technophilosophy of Evens et al., the less I want to accept it. Two reasons for my dissatisfaction, I think. One: Since when do the base units of anything dictate the entirety of its higher-level expression? Genes, the base units of life, only account for some submajority percentage of how we develop and behave. Quantum-mechanical phenomena, the base units of physics, have no bearing on my physical actions. (Otherwise I’d be walking through walls—when I wasn’t, half the time, being dead.) So why must binary digits define, for all time, the limits of computation, and our experience of it? New behaviors always have a way, when complex systems interact, of mysteriously emerging. Nowhere in the individual bird can you find the flocking algorithm! Turing himself said you can’t look at computer code and know, completely, what’ll happen.
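(If you want that emergence point made concrete, here is a minimal sketch in Python, mine and not Evens' or Turing's, with every name and number in it purely illustrative: each simulated bird follows three strictly local rules, separation, alignment, and cohesion, and something flock-like appears anyway, even though no line of code mentions a flock.)

```python
# Toy illustration of emergence: local rules only, no global "flock" instruction.
import random

N, NEIGHBOR_RADIUS, STEPS = 30, 15.0, 200

# Each bird starts with a random position and velocity on a 100 x 100 plane.
birds = [{"x": random.uniform(0, 100), "y": random.uniform(0, 100),
          "vx": random.uniform(-1, 1), "vy": random.uniform(-1, 1)} for _ in range(N)]

def neighbors(b):
    """Return the other birds within NEIGHBOR_RADIUS of b (purely local information)."""
    return [o for o in birds if o is not b
            and (o["x"] - b["x"]) ** 2 + (o["y"] - b["y"]) ** 2 < NEIGHBOR_RADIUS ** 2]

for _ in range(STEPS):
    for b in birds:
        near = neighbors(b)
        if not near:
            continue
        # Cohesion: drift toward the center of nearby birds.
        b["vx"] += 0.01 * (sum(o["x"] for o in near) / len(near) - b["x"])
        b["vy"] += 0.01 * (sum(o["y"] for o in near) / len(near) - b["y"])
        # Alignment: nudge heading toward the local average velocity.
        b["vx"] += 0.05 * (sum(o["vx"] for o in near) / len(near) - b["vx"])
        b["vy"] += 0.05 * (sum(o["vy"] for o in near) / len(near) - b["vy"])
        # Separation: back off from any bird that gets too close.
        for o in near:
            if (o["x"] - b["x"]) ** 2 + (o["y"] - b["y"]) ** 2 < 4.0:
                b["vx"] -= 0.05 * (o["x"] - b["x"])
                b["vy"] -= 0.05 * (o["y"] - b["y"])
    for b in birds:  # Positions update only after every bird has decided.
        b["x"] += b["vx"]
        b["y"] += b["vy"]
```

No individual update rule knows anything about the group; whatever pattern forms lives only in the interactions.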

And two: Blaming technology’s discontents on the 1s and 0s treats the digital as an endpoint, as some sort of logical conclusion to the history of human thought—as if humanity, as Evens suggests, had finally achieved the dreams of an Enlightened rationality. There’s no reason to believe such a thing. Computing was, for most of its history, not digital. And, if predictions about an analog comeback are right, it won’t stay purely digital for much longer. I’m not here to say whether computer scientists should or shouldn’t be evolving chips analogically, only to say that, were it to happen, it’d be silly to claim that all the binarisms of modern existence, so thoroughly inculcated in us by our digitized machinery, would suddenly collapse into nuance and glorious analog complexity. We invent technology. Technology doesn’t invent us.

Unless, of course, we really are living in a simulation. Sometimes I suspect we are. Two more things to say about that. One: Our simulation is definitely running on analog chips. And two: We’re back to Descartes. He thought, therefore he was, but he couldn’t quite exist separate from that which can’t be stripped away: his simulator, his God. To hate technology is to hate creation itself.

Time Travel

Uh oh. In my 2022 piece about the simulation argument, I suggested that reality boils down to bits. "In the beginning, after all, God created light and darkness," I wrote. "Translation: The simulator created 1s and 0s." Decide for yourself how literal I was being.

Ask Me One Thing

One reader wants to know how nice we should be to the artificial intelligences. Should we thank chatbots? Not kick robots? Say hello and goodbye to self-driving cars? It’s a modern-day version of Pascal’s famous wager: Believe in God just in case. Let me introduce WIRED’s Wager: Respect the computers just in case.

You can submit questions to [email protected]. Write ASK LEVY in the subject line.

End Times Chronicle

Did you read this newsletter all the way through? According to The Atlantic, nobody reads anymore. How do they know? I don’t know. I didn’t read it.

Last but Not Least

The US has long been trying to “reshore” chipmaking. Here’s a look inside Intel’s colossal effort to build a fab in Ohio.

At this billion-dollar startup, robots are finally getting back to the business of folding laundry.

Roughly 40 percent of the posts on the blogging site Medium are AI slop, according to two independent analyses.

Feeling twitchy? Two hotshot security reporters just dropped the WIRED guide to protecting yourself from America’s digital surveillance machine.

Don’t miss future subscriber-only editions of this column. Subscribe to WIRED (50% off for Plaintext readers) today.