In a wide-ranging interview, the author of “The Age of Surveillance Capitalism” talks about why people should pay attention to how big tech companies are using their information.
It was a grenade lobbed into the tenuous peace between Apple and Facebook — a software update that explicitly asked iPhone users whether an app should be allowed to track their digital movements across the other apps and sites that they use.
Apple pitched the feature, App Tracking Transparency, as a triumph for privacy. But for Facebook, it was an attack striking a key source of revenue: the personal data of its users.
The dispute represents a further deterioration in the relations between the two companies and their chief executives, Mark Zuckerberg and Tim Cook.
In a recent episode of The Daily, Mike Isaac, a technology correspondent for The New York Times, asked a question at the heart of this conflict: “Do people care about privacy?” The answer, as he explained, will determine the conflict’s trajectory — and the limitations on Big Tech’s power in a largely unregulated, hypercompetitive fight for market dominance.
After the episode aired, we called someone who thinks deeply about both privacy and the economic forces behind this competition: Shoshana Zuboff, author of “The Age of Surveillance Capitalism.” In the following interview, we ask Ms. Zuboff about the update’s long-term significance for privacy protections, the prospect of platform regulation and her vision for a less extractive digital future.
Our conversation has been condensed and lightly edited.
In your book, you describe personal data as the primary source of economic power for platform companies like Facebook that monetize the intimate details of our digital lives. How significant is App Tracking Transparency in limiting that power?
It’s significant. But what I think a lot of folks reading the headlines may not understand is that the App Tracking Transparency notification on your iPhone restricts applications from tracking us across other applications and across devices.
What this does not do is prevent these very same applications, including Facebook, which is the big whale in this discussion, from collecting data within their own application. This is a massive surveillance empire worth hundreds and hundreds of billions of dollars. But we call it an app.
And App Tracking Transparency has no bearing on Facebook’s, or any other application’s, ability to continue tracking you, collecting every aspect of your behavior, your activities, your thoughts and feelings.
So, yes, it does take a big bite out of some of the things that they currently do, especially as they reach for this rich diversity of data, which is so important to them. But does it limit their ability to illegitimately convert our lives into data, which they then declare as their private property? No.
Do you think of this as a turning point?
The jury’s still out. I think that what this does is it puts Apple at a crossroads at a new level of intensity.
Why does it matter that these big tech companies can mine personal data for profit?
As we allow these companies to amass this huge scale of human-generated data, we’re changing the nature of our society.
Because, first of all, we’re allowing them to create these huge asymmetries of knowledge about people. Instead of this being a golden age of the democratization of knowledge, it’s turned into something very different from what any of us expected. The last 20 years, and especially the last decade, have seen the wholesale destruction of privacy.
And operationally, what happens is they get to a point where they know so much about us that they can fashion targeting mechanisms. We’re not just talking about targeted ads. We’re talking about subliminal cues, psychological microtargeting, real time rewards and punishments, algorithmic recommendation tools and engineered social comparison dynamics.
We have seen the scourge of disinformation on social media. We now know from research that disinformation campaigns drove a huge number of unnecessary Covid deaths and played a huge role in producing the seditious tragedy of Jan. 6. What is so important for folks to understand is that these are all connected points in one process. And the process is called how knowledge becomes power.
Apple now has over one billion active iPhone users. Are you concerned about the company’s growing control of our means of information access?
I’m deeply concerned about it. Apple is the wealthiest and most powerful corporation, certainly in modern history, perhaps in the history of capitalism. And Apple now exerts unilateral, essentially unaccountable control over critical communications infrastructures, by its complete control of the operating system for its smartphones and other devices.
I think it’s important for people to understand that Apple is not a government. Apple is a company. It’s a corporation. And in a corporation, C.E.O.s come and go, boards change their membership. Business cycles and business crises occur. And today, Apple can look privacy preserving. And a year from now, we could be having a conversation about how Apple has reneged on all of these privacy values because there’s an economic crisis, and with a new C.E.O. and a different board, Apple completely changes.
In what ways does Apple uphold or contradict its stated motto that “privacy is a fundamental human right”?
I had a data scientist who said to me the other day, “Look, the underlying norm of all software and apps designed now is data collection.” For all intents and purposes, all of them are designed to engage in surveillance.
Apple still makes the majority of its revenues through its sales of iPhones and other devices. Nevertheless, an increasing portion of its revenue comes from services, and a big chunk of services is selling apps. So even if it’s not a surveillance capitalist, it is a powerful enabler. A powerful accessory to this crime of surveillance capitalism.
And, of course, there are other ways in which Apple and Mr. Cook really violate the principles that he so eloquently states. Apple in China is obviously a huge example of that. Apple’s relationship with Google. So Apple is deeply compromised.
Now, under the spotlight that Mr. Cook has chosen to shine on himself and this corporation, the question in my mind is: Are they going to move to truly fill the shoes of what it would take to be a privacy god, or are they going to continue to sort of fudge this along and play both sides?
But just to clarify, do you think Apple is actually interested in preserving privacy, or do you think the company sees the introduction of updates like App Tracking Transparency as an opportunity to gain a competitive advantage over Facebook?
I don’t think we should ever expect a corporation to do anything that is not self-serving. Corporations are, by definition, self-serving.
They’ve already made it clear that they’re looking at a way to expand their own advertising model, which is different from online targeted advertising. They’re putting down the elements here of an alternative advertising paradigm. This is an opportunity for that new paradigm to converge with their stated values and not rely on the massive-scale collection of human-generated data in secret.
What would you like to see Apple doing differently?
Here is a historic opportunity for Mr. Cook and Apple to say, “We are going to become the hub for an alternative ecosystem.” This alternative ecosystem is waiting for leadership.
Apple is the corporation that can provide that leadership, and it can immediately form alliances with other large, medium and even smaller companies to found an ecosystem where the whole investment profile changes. Investors are ready for this because investors are anticipating regulation that is going to take a bite out of surveillance profits.
And that means Apple has a golden opportunity to start with its own App Store. Most purveyors of products feel that they have a responsibility to sell products that are safe. Apple could finally take responsibility for what it sells in its App Store and say, “We are only going to sell applications that observe privacy-preserving principles.” It can help developers with alternative models of monetization. It can work with investors to develop alternative models of investment.
Apple could become an actual collaborator with lawmakers, making personnel available so that lawmakers and their staffs have a forensic understanding of the kind of enforcement operations new regulations are going to require.
How do you feel about the regulatory possibilities that are emerging at this moment?
I feel great about it. What the E.U. is doing is really taking us to the frontier of the regulatory effort, and I think of it as something that we have to achieve in this decade, the third decade of the digital century.
Over the years, you know, we’ve kind of had to flinch a little bit when we watched our elected officials interrogating the tech executives because they seemed so outmatched. Well, those tables really have turned. And in March, what we saw for the first time were congresspeople who really have grasped the economic model here and the unaccountable power that has accrued to these companies. And for the first time, we heard them saying: We understand this information is a byproduct of your economics. And this is going to end, and we’re going to end it.
Apple’s products are expensive. Is there a premium on privacy that only some can afford?
Android, of course, is by far the dominant smartphone in most countries. We see people who can’t afford privacy. And the idea of privacy as a luxury is a profoundly intolerable idea.
Can you talk about how the pandemic has empowered these tech companies in their data collection?
What’s happening in this remote education space now is truly frightening.
The huge irony here is that the outbreak of the pandemic was exactly the same time frame in which the New Mexico attorney general filed his class-action suit against Google Classroom, citing its illicit data extraction practices aimed at children.
Now you’ve got this whole sector called school safety technology, and then another sector called proctoring systems — for-profit companies that attach to Google Classroom. When you dig into what these systems are doing, school systems and school districts are paying them for these services. The so-called safety systems are tracking everything: notifications from social media, email files, chats, posts, messages, all documents, anything to do with the remote schooling activities. And the proctoring systems are doing facial recognition, tracking gaze and eye movements to measure attention, and producing what they call “suspicion scores.” They’re also taking over microphones and cameras, insisting that cameras record your surroundings and broadcast that to the proctor.
And students and their families have no recourse, because these companies are saying that students can’t even access the data. They’re not even making a performative statement agreeing to limit retention or third-party sharing. They can just do whatever they want.
And during the pandemic, this data collection touches almost every facet of our lives, from remote work to school to socializing. Do you feel like that ubiquity is in some ways immobilizing?
I feel the ubiquity is a vehicle for spreading resistance. Big, professional polls are showing that this decidedly is not the end of the “techlash.” It was at least some confirmation of my hypothesis that the more we’re exposed to it and dependent on it, the more that nexus becomes a vehicle for precipitating resistance, repugnance and revulsion.