On February 4, Google proudly announced that its Nest Secure home security system would now support its voice-activated Google Assistant service. There was a catch, however: No one knew that the Nest Secure actually had a microphone inside it. Google claims the microphone “was never intended to be a secret” and “has never been turned on,” but it’s hard to shake the feeling that hidden mics are a natural step for Silicon Valley giants intent on collecting as much data as possible, no matter the cost to user privacy.
This episode is a perfect encapsulation of the digital threat outlined in a new book by tech critic and Harvard Business School professor Shoshana Zuboff, The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. First sketched in a 2014 essay for a German publication, Zuboff’s argument describes a new economic logic, hatched in corporate America, that aims to extract staggering value from users’ private lives. Cataloguing a dizzying array of sensors and invasive software, Zuboff sketches a vision of an economic future in which companies race to collect data in pursuit of Facebook- or Google-like profits.
Zuboff, who published her first book in 1989 on the future of technology and data in the workplace, warns in Surveillance Capitalism of a “seventh extinction” that threatens to eradicate “what has been held most precious in human nature.” Given the fragility of the global political and economic order, surveillance capitalism amounts to a “coup from above,” Zuboff argues, an assault on democracy by way of subverting the very idea of what it means to be an individual.
I recently spoke with Zuboff about her book, what constitutes surveillance capitalism, how it operates, and what sets it apart from other historical changes in the economy. The following interview has been edited and condensed for clarity and length.
I’m probably just gonna hold the recorder close to you because there’s a lot of ambient noise in here. The first thing I wanted to ask was for a definition of surveillance capitalism for the uninitiated. How would you describe it?
Do you want to come closer? Snuggle up, snuggle up. I would say that surveillance capitalism in many respects diverges from the history of market capitalism, but in one major respect it emulates the pattern of market capitalism, and that goes like this: Historians have long recognized that capitalism evolves by claiming things that live outside the market dynamic and bringing them into the market dynamic so that they can be turned into commodities for sale and purchase. So famously, for example, industrial capitalism claimed nature for the market dynamic. Nature lives in its own space and time; claimed for the market dynamic, it was reborn as real estate, as land that could be sold and purchased. Similarly, industrial capitalism claimed work for the market dynamic, so activities that people did in their fields, in their gardens, in their homes, in their cottages were now subordinated to the market, reborn as labor that could be sold and purchased.
Surveillance capitalism follows this pattern but with what I would call a kind of dark twist. And that is that it claims private human experience as a source of free raw material, subordinated to the market dynamic and reborn as behavioral data. Those behavioral data are then combined with advanced computational capabilities in order to produce predictions of human behavior. So all of these behavioral data are now streamed into our 21st-century dark Satanic mills, which are what we call machine intelligence, machine learning, artificial intelligence, in order to spew out predictions. These are a new kind of product.
There’s data and information collected about people that is used specifically to inform the particular products that are served back to them, but there’s this other data, what you’ve termed behavioral surplus, that doesn’t have an immediate use but that itself confers a kind of control and power on the organizations that possess it, giving them an advantage over other companies. Why is behavioral surplus so critical in surveillance capitalism?
The idea here is that what is being produced are predictions, predictions of future human behavior that are then sold to markets of business customers who have an interest in what people will do now, soon, and later. So that’s the sequence, the mechanisms of surveillance capitalism. When I say claiming private human experience and then translating it into behavioral data, I’m talking specifically about aspects of private human experience beyond what is required for product and service improvement. So, for example, in the world of search, where these mechanisms were first discovered and invented, it was clear that people were searching and people were browsing and you could use the data to improve the search engine and create ancillary services like translation. But there was collateral data that was also produced in these processes that was not behavior that people understood they were sharing. It was an offshoot of their search experience, of their search activity, but not something they knew they were sharing. In the same way, for example, you might do a post on Facebook to arrange to meet your friends and family for dinner, and what becomes interesting from the data point of view is whether you say, “I’ll meet you later” or whether you say, “I’ll meet you at 6:45.”
So the point is that there’s a meta level of these data that has tremendous predictive value that you don’t know you’re communicating when you are posting or searching or browsing, and that’s data that is more than what was needed for product and service improvement. These extra data initially, at the beginning of all of this, were lying around, unused, considered data exhaust, waste material. Eventually it was discovered that they had significant predictive value, and that’s what was used to create the kind of prediction products, of where people would click, that became the basis for these new online advertising markets. So the idea here is that there is behavioral data that companies are collecting about us that is being used to improve what they give us, but there is much more behavioral information that we are communicating that we don’t know about. And this is the surplus that they then take for its predictive value, stream through their production processes to create these prediction products, and all of this is happening without our permission, without our knowledge. It’s happening in a way that is designed to bypass our awareness; it’s happening in a way that is engineered to keep us ignorant.
This is what I call the shadow text. It’s what they can lift from these behavioral flows that gives them tremendous predictive power, and it’s different from what we knew we were giving them. Which is why today, under some of the new regulations, for example GDPR, it says, well, you can go to a company and you can ask them for the data they have on you. But when you go to a company and ask for the data they have on you, what you’re really talking about is the data that you’ve already given them. The important data they have on you is this metadata, this stuff that they’ve been able to pull from your data that you don’t even know about. Like whether you use exclamation points, or whether you said 6:45 or later, or whether you use bullet points instead of just a general paragraph. And a million other things. So it’s this surplus, more than what is needed for product and service improvement, that became critical to the fabrication of these prediction products and laid the basis for this new work.
Just to pick an isolated strand of that data, how does whether or not I prefer to use periods or ellipses or semicolons become something that has predictive power that’s valuable?
So, there’s something called the five-factor personality model, where you can pick behavioral cues from online material, analyze them through the model, and come out with very specific personality assessments. This is at the root of the Cambridge Analytica work. You can take behavioral data, run it through that model, and make very fine-grained predictions. They can tell, for example, you know, if you’re gay, or if you’re likely to vote alt-right, or if you’re likely to be a political malcontent. It’s correlated with all kinds of other behavioral predictions. So, for example, they can take all the people who use ellipses and cross-correlate that with their outcomes on these personality profiles, and in doing that, they can see that, you know, people who tend to use ellipses are people who have uncertainty or don’t like to finish things. These are huge databases that correlate all these different tiny little signals, many of them organized by the five-factor personality model, and then there are other models built on the five-factor model — like IBM’s got 12 factors, and somebody else has 16 factors. There are these immense correlations that are running all the time, so somebody who uses bullet points, you know, they correlate that with a greater tendency toward precision.
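To make that mechanism concrete, here is a minimal Python sketch, on synthetic data, of the kind of correlation Zuboff describes: tiny stylistic signals regressed against a personality-trait score. The feature names, the fabricated trait scores, and the least-squares fit are illustrative assumptions, not a reconstruction of Cambridge Analytica’s or any company’s actual pipeline.

```python
# A minimal, hypothetical sketch: correlating writing-style signals with a
# "five-factor" (Big Five) trait score on synthetic data. Nothing here is
# a real pipeline; it only illustrates how tiny cues become predictive.
import numpy as np

rng = np.random.default_rng(0)
n_users = 1_000

# Hypothetical per-user style signals: counts of ellipses, bullet points,
# and exclamation marks per, say, 100 messages.
ellipses = rng.poisson(2, n_users).astype(float)
bullets = rng.poisson(1, n_users).astype(float)
exclaims = rng.poisson(3, n_users).astype(float)

# Synthetic "conscientiousness" scores, deliberately tied to bullet-point
# use so the example has a signal to recover.
trait = 0.5 * bullets - 0.2 * ellipses + rng.normal(0.0, 1.0, n_users)

# Ordinary least squares stands in for the far larger models described in
# the interview: aggregate enough small cues and the fit becomes usable.
X = np.column_stack([ellipses, bullets, exclaims, np.ones(n_users)])
coef, *_ = np.linalg.lstsq(X, trait, rcond=None)
print("weights (ellipses, bullets, exclaims, bias):", coef.round(2))

# Once fitted, the same weights score any new user from style alone,
# e.g. a heavy ellipsis user who never uses bullet points.
new_user = np.array([5.0, 0.0, 1.0, 1.0])
print("predicted trait score:", round(float(new_user @ coef), 2))
```

Real systems, as Zuboff describes them, run such correlations over millions of signals in parallel; the arithmetic is the same, only the scale differs.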
You could then theoretically arrange that data in such a way as to get something useful to whatever purpose or agenda the possessor of that data has.
You build up real detail about these profiles, real psychological detail, so that you can then begin to predict how a person with this detailed profile is likely to react to a stimulus, right? So this person with this profile and these predilections and maybe this sense of confusion or whatever, and then if I bring a stimulus that says “this thing is happening and you should do this” — whether it’s “you should buy this” or “you should vote this way,” or “this thing is happening and you should be against this” — you’ve built a very robust prediction about how people with this profile are likely to react to that kind of stimulus. And that gets very fine-grained, very fine-tuned. This is happening with parallel processing, with millions and millions of data points, and it ends up being able to predict. As we saw with Cambridge Analytica, just pivoting these methodologies, which are the sort of normal “day in the life” methodologies of any self-respecting surveillance capitalist, just pivoting those slightly from commercial outcomes to political outcomes, you get really robust predictions about how people will react to certain kinds of material, triggers, stimuli, and so on.
Why is surveillance capitalism such a departure from capitalism as it existed before? You address in the book, to some extent, how it departs from managerial capitalism and other traditions of capitalism, but I’m kind of curious: What’s the value of having a particular nomenclature to describe surveillance capitalists?
Yeah, that’s a good question. So you know, theoreticians of capitalism have talked over the years about capitalism as a sort of meta-economic process of capital flows — certain necessities of scale, economic growth, productivity, and so forth. But it’s also been recognized that every era produces its own unique market forms, which participate in these larger principles of capitalism but are specific to a time and place. This is historical materialism, if you will. So the idea is, for example, Adam Smith contrasted mercantile capitalism, which was trade and shipping between countries and so forth, with what became industrial capitalism. He was fascinated with the pin factory, because the pin factory took these large-scale processes of capital accumulation and growth and productivity and brought them into a different market form, which was specifically this form of high-volume, standardized production.
So even early on in the theorizing of capitalism it was understood that capitalism takes on different market forms in different eras, in the context of different technologies. We’ve had mercantile capitalism, and we’ve had factory capitalism, and mass-production capitalism, and managerial capitalism, financial capitalism. And typically what happens with these new concepts is that the modifier, like “mass production,” or in my case “surveillance,” is pinpointing the pivot of value creation in the new market form. So, for example, in mass-production capitalism, what created the value was the minute division of labor and the standardization of parts. The separation of execution from administration and the high-volume, low-unit-cost outcome all depended on these particular characteristics of mass production, which produced the value and resulted in a new economic logic that yielded high volume and low unit cost.
One of the things that big Silicon Valley companies arguably have in common, more than how they’re able to make use of consumer data, is that they’re able to construct software and hardware platforms to which entire economies have had to accommodate themselves. They create their own sort of protected gardens in which they can operate and in which they can discriminate. I’m curious about the ways in which the nature of the platform is actually what informs the economic power and ascendance of this new form of capitalism.
Yeah, well, the platform is a major technological innovation, but what I’m distinguishing here is an economic logic. I’m really making an effort to separate the technology from the economic logic. And it is possible to imagine platform capitalism without the surveillance market form; just as it is possible to imagine digital technology without surveillance capitalism, it is impossible to imagine surveillance capitalism without the digital. So I’m making a distinction about a specific market form that has grown up that is not a necessity of a platform, even though it has grown up on many platforms.
What’s an example of a platform without surveillance capitalism? Are there any current examples that come to mind for you?
This is a very dynamic situation, and there’s nothing pure. Let’s take Amazon as an example. Amazon has always been a very canny and even ruthless capitalist. And it has used the platform for achieving monopoly in many dimensions, and it has used huge amounts of behavioral data for really improving its services. And so it has used the platform technology, the platform system in a very powerful way. But it’s only more recently that Amazon appears to have moved into surveillance capitalism as a market form, as Amazon has moved into personalized services — the Alexa and all the stuff that goes with the Alexa. It’s taken itself out of the realm of just doing really canny, even ruthless platform capitalism, creating monopolies and so forth, to seeking these supply chains of behavioral surplus for personalizing, for creating predictions, for selling into these new prediction markets, in ways that are not a necessary consequence of the platform as a form.
This surveillance capitalism is an economic logic now that is traveling not only across the tech sector but across sectors in the regular economy where platforms are not the prominent constellation. So in the insurance industry, in the health-care industry, insurance companies are using telematics so they know how you’re driving in real time, and can reward and punish you with higher and lower premiums in real time according to whether your driving costs them more or less money, or whether your eating costs them more or less money, or whether your exercise patterns cost them more or less money. Health-care providers who are using telematics, not only feeding into these prediction markets but also collecting all kinds of ancillary data to sell to third parties, and being part of these whole ecosystems now, are behavioral surplus suppliers. You download a diabetes app, and it takes your phone, it takes your microphone, it takes your camera, it takes your contacts. Maybe it helps you manage your diabetes a little bit, but it’s also just a part of this whole supply-chain dynamic for behavioral surplus flows.
And this is an economic logic that is only now made possible because of the technological developments of the last 20 years.
There was a window where we had the internet, we had the World Wide Web, before we had surveillance capitalism, where we imagined the smart home as a closed loop. Devices in the home fed data to the occupant of the home, and the occupant of the home decided how the data were going to be used, whether they were going to be shared, what they meant, all of that. A simple closed-loop system. That was going to be the smart home. That was also going to be telemedicine: simple closed loops between the physician, a server at the hospital, and the patient at home.
Fast-forward 20 years and those simple closed loops are now things like the Nest thermostat, where scholarly analysis indicates that if you are a vigilant consumer and you install a Nest thermostat in your bedroom, you should review a minimum of 1,000 privacy contracts, because everything flows to third parties, which flow to third parties, which flow to third parties. No one takes accountability for the third parties, and, by the way, if you don’t agree to each policy at each stage, then you begin to lose the functionality of the product and the reason that you bought it in the first place.
In your first book, In the Age of the Smart Machine, you outline two distinct paths for how the future might unfold, and the ways technology might evolve. What do you see as having changed since that time — between 1989 and the 2000s — that led to this situation?
In that book I talked about the dilemmas of knowledge, authority, and power, and I talked about what I called at that time “the electronic text,” the textualization of the world — the idea that everything was being digitized, being translated into information, and that in situations where there were no sanctions or constraints, that information was being used to surveil and to control rather than to empower and to teach and to learn and to grow together. But in the 20th century, when I wrote that book, these issues of capital and the digital were targeted on the economic domain. They were targeted on our roles as workers and, you know, employees.
That was a story about how, if we don’t pay attention to the opportunities in what I called an information workplace — where, you know, people need to get educated to be included in this kind of work, and it’s more intellectual work, and it’s less hierarchical and everyone can make contributions — if we don’t pay attention, it’s just going to be top-down information, and it’s going to be used for control and surveillance. That’s what I call the information panopticon, which was the end of that book.
And I talked about going either this way or that way. Then there was neoliberalism and re-engineering and shareholder-value maximization, and no one really giving a toss about educating the workforce and really taking advantage of these new technologies. It was all about offshoring and outsourcing and cost-cutting and down, down, down, down. So, of course, it all went toward surveillance, so that now workplaces have really become the laboratories for surveillance capitalism, because that’s where we have habituated human beings to perpetual surveillance.
Amazon’s a good example of a pioneer in that, to your point earlier. They may have been late to it with consumers but they were absolutely pioneers of it in terms of their own workforce.
Oh yeah, and just about everybody is. I mean, I’ve been on a book tour. We’re just going in and out of buildings for interviews and you gotta — you’ve got the ID, and you’ve got the thing, and then you have to go through this thing and that thing and that thing and that thing, and all these buildings are surveilled now. If you’re actually an employee there, everything is surveilled. Everything is surveilled. And the depth of surveillance in the workplace now exceeds anything I even imagined at the end of the ’80s when I published that book. So that’s all come true.
Whatever darkness I foresaw there has come true, and much, much more. But now, the thing is, we’re in the 21st century, and these same capabilities, this textualization of the world, the idea that everything is now information — well, this [has] spilled over the walls of the factories and the offices and the workplaces. If those struggles were between capital and labor, the struggles today are between capital and society. Now this stuff is bearing down on all of us because we’re the users, you know. It’s not about our roles, our economic roles as workers, as employees — now our interface with these forces of capital is just as regular people in the course of our daily lives. We’re called users.
So our app user is like the new prole — I don’t want to be glib, but it seems like what you’re describing is a major reconfiguration —
It’s not like we’re the new proletariat, but we are the new, like, target of capital, and that’s why I started out by saying private human experience. It’s our human experience, our private experience that is the new target of capital for the raw material that it’s now manufacturing into products that it’s selling.
So the capitalists came for the trees, then they came for the labor, and now they’re coming for individual, private lives. Do you have any idea what could possibly come next?
So what I want you to understand about this — because I feel like I haven’t answered a question that you asked earlier, which is why I call it surveillance capitalism and not platform capitalism — in coming for our human experience … So, let’s recognize the structure of the markets that they’ve created, because they’re coming for our human experience, they’re translating it into behavioral data, they’re creating predictive knowledge out of that and they’re selling that predictive knowledge. But they’re selling it to others, not to us. So it’s knowledge about us but not for us. These are inherent asymmetries, right? The reciprocities now are between the surveillance capitalists and their business customers who figure out that their best way to make money now, where their margins are going to come from, is knowing what we’re going to do in the future, not selling us something that is really, really terrific and is going to make our lives better.
A counterpoint: These companies are really bad at predicting what their consumers will do in the future. One of the biggest places Amazon’s advertising dollars go is retargeting — ads showing you what you’ve already looked at. And they believe that showing you what you’ve already looked at is a good way to get you to buy more of that thing. And that’s where they get the most value now; it’s not a predictive thing, it’s just literally regurgitating what you’ve already bought back at you. Are these companies on the precipice of developing far more sophisticated predictive powers that can make use of the data they’re collecting?
Oh, well, look, online targeted advertising was the place where this started, but this is not the place where it ends. It’s like saying that mass production was only relevant to making Model T’s. This logic is spreading to all these different contexts and all these different predictive markets. So — let’s go back to the Model T, for example. Three or four months ago, the CEO of Ford says: there’s a global auto slump, it’s really hard to sell cars, and we’re getting downgraded in the markets. He says, we want price-to-earnings ratios like Google and Facebook. How do we do that? We’re going to become a data company. We’ve got 100 million people driving around in Ford vehicles. And what we’re gonna do is, we’re gonna now figure out how to get all the data out of this driving experience.
So this is the telematics, this is the stuff where, you know, we can not only know how you’re driving and where you’re driving, we can know the gaze of your eyes — and that’s really important for insurers, to know if you’re driving safely. And we can know what you’re talking about in your car, and, like, Amazon and Google and so forth are already in a contest for the car dashboard, because that’s a way of hearing what you’re talking about and knowing where you’re going. So they talk about, you know, shopping from the steering wheel. So now the automobile itself becomes this little surveillance bubble. We can get all of this information, from your conversation to your shopping to where you’re going to what you’re doing to how you’re driving. And this has predictive value for all kinds of business customers.
It also seems to me, though, to be contingent on the capacity of these companies to make technological developments. How do we know that the surveillance capitalist is going to actually have power going forward? It’s scary, and they’re doing a lot of excessive data collection already that violates any conceivable norm, but it also seems like these companies are struggling to develop the power to get even the data that they claim to want in the future.
Well, like I said, this is dynamic. This is where we are now: a product company saying our margins and our revenue growth are not going to come from our product; they’re going to come from the data that we can scrape from the people who use our product, because those data are going to be valuable and lucrative in these secondary markets that want to know where we’re going to shop, where we’re going to go, what we’re going to do … This is now where capital is shifting, to create these new capabilities, and it’s the telematics; what you read about what’s going on in insurance is very real. You’ve got all the consultancies lined up, advising the insurance companies. This is what is dynamic, this is what is taking root and flourishing now. It began in the online world, in the tech sector, and it’s migrating through all these different sectors.
What you’re seeing is a shift to this idea of: it’s not a product, it’s a smart product; it’s not a service, it’s a personalized service. This is the shift to surveillance capitalism. It’s a shift to this sort of parasitic form where, instead of “We’re putting all our effort into making the best product for you,” or the best service for you, and really solving the problems in your life and really making your health better, helping your financial situation and creating more employment opportunities, you know, it’s this …
I guess part of what I find curious is that some businesses are really good at this and others are really bad. You use these apps and they’re all terrible. They’re not great. Do you think that these laggards in surveillance capitalism pose any threat to the viability of that economic logic?
We’re not anywhere near the endgame, you know, and people say to me all the time, well, it’s too late and we can’t do anything about it, how do we ever undo this? And I say, no, no, no, that’s not where we are.
Mass production began as a very violent form. There was no law to constrain it or impede it, and there were unsafe working conditions and people were paid slave wages and children were working in factories. And it took decades, and public contest, and mustering democratic resources, and eventually law and regulation, to shape and tame it and make it something that was actually palatable for society, in a relative equilibrium that we call market democracy. And I believe that surveillance capitalism has gone very far in some sectors and is moving across other sectors. That’s why Ford to me is such an iconic example: this is what businesses are looking to, here’s where the margins are gonna come from, this is gonna be the new solution for how we make profit in the information age. The dominant form of capitalism becomes one where the main thing that we sell and buy is behavioral futures.
So this is why it’s so critical for us to understand this, because we’re at the beginning of this arc, not the end of it. This has developed over the last 20 years with barely any impediment from law, with barely any impediment from regulation. It has developed while democracy has slept. And I give — I explore 16 reasons, in depth, including some historical reasons, why that is the case, why it has had basically a 20-year free run to develop.
But my argument is, this is why now, you know, the time has come: enough of this is substantiated out there that we can understand it, that we can see how it works. We can see its aims and its goals and its results. When I write about the Facebook contagion experiments and Pokémon Go, I write about these as sort of population-scale experiments in how to do surveillance capitalism in the real world, at scale. So Pokémon Go is a dry run for the Google City. Right? Google City is a population-level experiment in how you herd and tune populations, modify people’s behavior, to steer them toward guaranteed commercial outcomes. The game was built by Niantic Labs, which, of course, is a Google-incubated operation. It hosts its own behavioral futures markets: business establishments, McDonald’s and so forth, pay it for footfall in exactly the same way that online advertisers pay for click-through rates.
Much as Amazon marketplace customers pay to play on that platform —
That’s right. But now we’re in the real world, and we’re learning how to tune and herd people through the real world, through their real lives, in their real cities, to the places where Niantic Labs gets paid because you showed up there, and you bought a pizza, or you bought a drink, or you bought a burger, or you stayed for however many hours, and you played in the Pokémon gym or whatever, in the toilet.
So this is the template for the Google City, the smart city, where the whole digital architecture becomes a global means of behavioral modification that is used to tune and herd populations toward guaranteed commercial outcomes in the city. And this is why citizens in Toronto are now contesting the idea of Google having the waterfront, because the whole idea is that these computational systems replace politics, and that they’re gonna gently, with a smile, tune and herd people toward the outcomes that serve Google’s commercial goals and the commercial goals of its business customers in these prediction markets.
It sounds almost like it’s a means of governance, as an algorithm.
It substitutes computation for politics, so it’s post-democracy. It substitutes populations for societies, statistics for citizens, and computation for politics. And so I write a lot about another experimental zone, the Facebook contagion experiments, where, when they wrote up those experiments in scholarly journals, very smart data researchers from Facebook, combined with very smart academic scholars, boasted that now we know we can use the online world to create contagion that changes behavior in the real world. In the first case it was voting; in the second case it was emotion. And they bragged in their articles that we can do this in a way that bypasses the awareness of the user. Right? Always engineered for ignorance. Because, you know, that’s the surveillance essence of this: you can’t do this by asking permission. You can only do this by taking it in a way that is secret, that is hidden, that is backstage.
I feel like I’d be remiss if I didn’t bring up China. To what extent do you think China offers a model of what the future may look like or how it’s distinct from it?
You’ve asked the question at the perfect time in our conversation, because we’re talking about the substitution of computation for politics, which is to say, for democracy. I’ve just been talking to you about how surveillance capitalism commandeers the digital infrastructure as a global means of behavioral modification, and I ask the question, what is the power to modify population behavior at scale? What is that power? And I answer the question. This is not totalitarian power. No one’s coming after us to murder us, to put us in concentration camps, to throw us in the gulag, to, um, to control us through terror.
No, they just want our money.
Really, they just want our data. They don’t care what we believe. They don’t care if we’re happy, they don’t care if we’re sad. They don’t care if we’re in pain, they don’t care if we’re in love. They only care that whatever we are and whatever we do, we do it in a way that interfaces with their supply chains. So that they’re getting their data flows. They are in this structure, fundamentally indifferent to the content of our behavior. They just want to have the data from our behavior.
I call this instrumentarian power, in contrast to totalitarian power. Instrumentarian power. Two reasons: One, it relies on the instrumentation of the digital milieu because it’s through that medium that we are being tuned and herded and shunted and coached and modified. No one’s coming up to like, you know, slap you or kill you or hurt you …
It sounds kind of totalitarian, in that people are just stripped of agency.
Well, that’s a piece of it. But “totalitarian” is a very specific thing. Totalitarianism is a centralized power that is specifically understood as functioning through the mechanisms of terror and murder. That’s what totalitarianism is. Instrumentarian power wants to control you, but it doesn’t care about hurting you. It wants to control you toward its guaranteed commercial outcomes. You, as Noah, are simply instrumentalized toward the outcomes of its business customers, right? So you’re a means to others’ commercial ends. You’re instrumentalized and you’re surrounded by this milieu of instrumentation that is sort of hands-off. It’s your dishwasher, and your television set, and your car and the telematics, and your phone. It’s this whole digital surround that is now the instrumented medium producing the knowledge that creates the opportunity for the power to modify your behavior. So, um, all right, got to go back to the question that you just asked that I was answering …
China.
Oh, China, let’s go to China! China, China, China. All right. So what I’ve said is that this is a kind of power that has never existed before, just as totalitarianism, when it emerged in the 20th century, was a kind of power that scholars had never seen before. So we have this instrumentarian power, and it’s coming out of the private sphere, the commercial sphere. It’s extraordinary power, under the auspices of private capital. And one of the things that it wants to do, that it aims to do, is to substitute computation for politics.
That’s our pivot. Now look at China. What is China? China is an authoritarian state, not a democratic state. So, authoritarian state. Inside China, these internet companies have also amassed enormous, enormous instrumentarian capabilities, enormous asymmetries of behavioral knowledge, predictive power, analytical capability, able to reward and punish behavior in these very finely grained ways. Buy this, discount that, and so on and so forth. An authoritarian state sees these instrumentarian capabilities and says, “This is perfect. We want to take these: not terror, not murder, not the gulag. We wanna take these instrumentarian capabilities and now pivot them to the political and social outcomes that we as an authoritarian government seek. This is how we want our population to behave and this is how we’re gonna discipline our population. And these are the parameters we’re gonna give them. And this is how we’re gonna restore order in a society that has fallen into chaos because of the complete destruction of social trust in the wake of decades of the Maoist project.” So, what you see in China is the marriage of the authoritarian state with instrumentarian power. And that is a very dark and dangerous endgame.
You describe it as an endgame, and I don’t want to conflate things, but having earlier described the moment we’re at as an initial stage of surveillance capitalism —
A dynamic stage.
Dynamic.
Not initial, necessarily, but dynamic.
And China represents one place where the dynamic could change and where —
And what’s the key variable? The key variable there is democracy. That’s why democracy is such an important concept in this whole conversation. We were talking about the smart city, the substitution of computation for politics and ultimately for democracy, and then you brought up China. I said, “You brought it up at the perfect time,” because this is so important. This is where these pieces connect. Because if we allow computation to substitute for politics, and we allow statistics to substitute for citizens, and we allow populations to substitute for societies, we are destroying democracy as we know it. And if we destroy democracy, all we are left with is this sort of computational governance, which is a new form of absolutism, Noah. It’s a new form of absolutism, computational governance. And that absolutism becomes its own opening to authoritarianism, because now we have eroded and weakened democracy, and hollowed out the resources of our democratic institutions in favor of this sort of vision of a totalistic, certain rationality that optimizes all population behavior toward perfect outcomes, be they commercial or functional or political.
So, this is what surveillance capitalism represents. And this is where my book ends up: the final chapters describe a profound threat to democracy that goes beyond the brief of just another capitalism, one that makes a lot of money, increases mass consumption, employs a lot of people, and gives consumers what they want. Now we’re talking about a form of capitalism that, in order to fulfill its own imperatives of scale, scope, and action, its own imperative of prediction, ultimately replaces society and replaces politics with these computational principles, which right now are aimed toward commercial outcomes.
But those principles could very easily shift.
And that’s what Cambridge Analytica taught us. These were the people who worked out the five-factor personality model on Facebook profiles for years and years, and understood the huge predictive value from behavioral surplus drawn from Facebook profiles. Now, under the new regime of Robert Mercer, the billionaire who bought Cambridge Analytica and owned the Trump campaign, under these new auspices of plutocracy, Cambridge Analytica takes a day in the life of any self-respecting surveillance capitalist, just the ordinary mechanisms that are being used to do this work every day, and just pivots them a few degrees from commercial outcomes to political outcomes, and uses all the same methodologies to use the online milieu to affect real-world behavior. But this time it’s real-world political behavior, not real-world commercial behavior. So, that’s what Cambridge Analytica taught us.
The way this can be repurposed for political ends, not commercial ends.
That these capabilities of mass behavioral modification, instrumentarian power that can be used to tune and herd populations toward commercial outcomes in the real world, in the service of growth of surveillance revenues, can be repurposed for political ends. The Chinese are doing it in the service of an authoritarian state, and what we saw in America already is that anybody with enough money, any ambitious plutocrat, can buy the skills and the data to use these same methodologies to influence political outcomes.
And so, what is under siege here is democracy, from two directions. There is the structure created under the auspices of private capital that is based on unprecedented asymmetries of knowledge — knowledge that is about us but not for us, which gives rise to unprecedented asymmetries of power: a power that is able to shape our behavior but that operates outside of our awareness and is designed always to keep us ignorant. Engineered to keep us ignorant. We’re entering the 21st century in this institutional setting that introduces a whole new axis of social inequality. Not just economic inequality now, but these profound inequalities of knowledge, and the power that accrues to behavioral knowledge that can actually influence our behavior, the behavior of our group, of our city, of our region, of our country, of our society. That’s at the institutional level, and that is disfiguring 21st-century society right from the beginning.
At the individual level, we are the so-called users whose behavior is being intervened upon, touched, coaxed, modified, contaged, influenced in all of these ways that are designed to be undetectable to us, to bypass our awareness. So, this is an intervention at the level of human agency, at the level of human autonomy, at the level of individual sovereignty, at the level where we expect to have an idea about our future that we act on now — I’m gonna meet Noah, so I get up early in the morning and I do a bunch of things that are gonna get me to the place where eventually I’m gonna meet Noah, because that’s what I choose to do. That’s my action. That’s what we call free will, that’s what we call agency, and that’s what we call moral autonomy. We can’t imagine a democratic society without human beings who can act this way. It was part and parcel of the conception of a self-regulating demos from the beginning.
Now we’re entering the 21st century with these threats to democratic society coming both from the structural level and from the individual level, the level of our intimate behavior. And when you put these two things together, that becomes something that’s, you know, bigger than what we normally associate with capitalism. As we said at the beginning of our conversation, Noah, that [was a] capitalism that bore down on the economic domain; it was in our workplaces and in our economic roles. Now, this thing, because of its drive for totality, for total information, for total certainty, for scale, for scope, for action, these imperatives that drive it … It’s into society, it’s into every aspect. When Eric Schmidt heard that the Toronto public officials had given Google the go-ahead to take part of the waterfront to turn it into this smart-city thing, I might be slightly paraphrasing, but Eric Schmidt’s words were, “Oh great, now it’s our turn.”
I remember this.
I quoted that because that seemed so pregnant to me, because “our turn” meant something different from democracy’s turn, like, democracy’s turn was up and now it’s our turn. The turn of private capital, the turn of private surveillance capital. What I want our readers to know is that in the arc of human history, the idea of democracy, the autonomous individual, individual sovereignty, moral judgment, these ideas are young ideas, they’re five minutes old. And humanity sacrificed, for many millennia, for the legitimacy, ultimately, of these ideas. And for us to say that, you know, democracy’s time is up and now it’s surveillance capitalism’s turn, it’s, to me, an abomination. That is an intolerable idea. What we should be doing right now is doubling down on democracy. Every generation has a responsibility for democracy. Every generation has a responsibility to keep it going, to keep that wheel turning, to keep it flourishing, to keep it from dying, from falling over, from breaking, from falling vulnerable to the forces that always want to supplant democracy with certainty and financial advantage and power and so forth.
Surveillance capitalism really, as we said before, moves beyond the economic domain to the societal domain, into our lives, threatening the very basis of a democratic social system. And that’s where, to me, we now recognize that the time has come for us to name it, to understand it, to recognize that it’s an economic logic that is not a necessary consequence of the digital or of the platform mechanism, and that this is the time when, through understanding and through naming, we change our consciousness, we turn to our democratic institutions and reclaim democracy as the source of the law and the new regulatory institutions that are gonna intervene upon, interrupt, and outlaw these mechanisms, so that the age of surveillance capitalism turns out to be a short age. This is not legitimate. And the choices with which it has left us, as 21st-century citizens, are not legitimate choices. We should not have to choose between having our experience scraped for others’ purposes and the basic requirements of effective social participation.
What do you think of the emerging political currents aligned against these kinds of companies pursuing this economic logic?
I mean, the great news is that we now have a political discussion going on. And we haven’t had that, because there has been so much political capture. Google is the largest lobbyist, and Google made itself invaluable in the election process, actually getting people elected.
It also struck me that these were people who were popular in the public imagination because they were paying people really well, and they were offering services that were free, and so on. There was a cultural cachet that’s now cracking. What do you make of the efforts to build political agendas meant to further destabilize it?
Politics is the realm where this is gonna get fixed. It’s not surprising that the loudest political discussions are beginning on the left. I’ve just come back from several weeks in Europe, and there are discussions in Brussels where it’s clear that we’ve gotta confront surveillance capitalism and that’s gonna go beyond the GDPR, it’s gonna go beyond privacy, and it’s gonna go beyond antitrust, because, you know, you take a big surveillance capitalist like Facebook or Google, you break it up, and then you’ve got four smaller surveillance capitalists. Antitrust, privacy, these are critically important, but they’re also 20th-century paradigms that aren’t gonna take us all the way into intervening upon and interrupting and outlawing these new operations. So the political discussion is key.
Ultimately, I see it as a political discussion that’s gotta be driven by a sea change in public opinion. Because everywhere I go, people are fed up with this, but they haven’t known how to name it. It’s a sense of unease, of anxiety, of loss of control, a sense of being manipulated, a sense of loss of freedom. A sense of powers that we don’t understand. And as they are able to name it, the sense of its intolerability becomes truly palpable. These shifts in public opinion are what ultimately are gonna force elected officials to pay attention. This is going to be a new era of law, a new era of regulation, and it’s got to be designed for this 21st-century situation, where we have unprecedented economic mechanisms that are going to require unprecedented legal remedies.
At the same time, I also believe that once we begin to double down on democracy and democracy sort of awakens, the sleeping giant of democracy awakens to actually confront this stuff, we’re creating space for new competitive solutions. Because we need different kinds of companies, and different kinds of capitalists, and different kinds of ecosystems and alliances that are actually capable of reclaiming the digital for the kind of values and functionality that we wanted from it in the first place. Every single piece of research, going all the way back to the early 2000s, shows that whenever you expose people to what’s really going on behind the scenes with surveillance capitalism, they don’t want anything to do [with] it. The only reason we keep engaging with it is because we feel like we have no choice. So, as soon as there is real choice for people that can also provide effective action, these new competitors have the opportunity to have literally every person on Earth as their customer. I mean, that’s really what it’s about right now.
Another way to look at surveillance capitalism, from the market point of view, is as a colossal market failure. Because it is not giving people what people want. It’s giving business customers what they want, the ability to manipulate people, but it’s not giving actual populations of people what we want. So it has severed the traditional reciprocities between capitalism and its societies, where it counts on its society as a source of customers and a source of employees, and reoriented itself toward these prediction markets of business customers. But in the process, you know, our problems are not being solved. Our health-care problems are not being solved. Our climate problems are not being solved. Our employment problems are not being solved. How to build a better car, how to build a better building, how to make a better city that is democratic and still contains pollution, and all those other things. These problems are not getting solved; surveillance capitalism is scraping off data for other purposes that are not solving our core issues.
The trade-off that we’ve sort of been baked into believing is, “Well, the price I pay is that I’m giving up a bunch of shit I don’t know about because I can watch The Simpsons whenever I want with maximum convenience.” We’ve sort of implicitly accepted that there’s a trade-off already. We didn’t even have to be duped, we didn’t even have to be presented with a choice, because it was made for us and we seemed okay with it. I mean, that’s the image that’s presented. I don’t quite buy that.
That’s the image that’s presented, but that’s because everything that’s inside that choice has been designed to keep us in ignorance.
We download a diabetes app. Just downloading it allows x percentage of the diabetes apps to take all the contacts on your phone; x percentage take the contacts and the mic; x percentage take the contacts, the mic, and the camera. The stuff that they’re taking from you has nothing to do with the diabetes functionality for which you downloaded the app. Absolutely nothing. It’s simply siphoning off data to third parties for other revenue streams that are part of these surveillance capitalists’ ecosystems. This is a hidden world that is constructed around us in these elaborate supply chains. Everything we touch that is internet-enabled is essentially a supply-chain interface, but it is all designed to be hidden.
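Her app-permissions point can be checked mechanically. Here is a minimal Python sketch, assuming an Android app whose manifest is available locally; the manifest path, the surplus_permissions helper, and the “expected” permission list are hypothetical illustrations, not a description of any real diabetes app.

```python
# A minimal sketch of auditing an Android app's declared permissions
# against what its stated function plausibly needs. The expected list
# and manifest path are hypothetical assumptions; the permission names
# themselves are real Android identifiers.
import xml.etree.ElementTree as ET

ANDROID_NS = "{http://schemas.android.com/apk/res/android}"

# Permissions a bare-bones glucose-logging app might plausibly need.
EXPECTED = {"android.permission.INTERNET"}

def surplus_permissions(manifest_path: str) -> set[str]:
    """Return permissions declared in the manifest beyond the expected set."""
    root = ET.parse(manifest_path).getroot()
    declared = {
        el.get(ANDROID_NS + "name")
        for el in root.iter("uses-permission")
    }
    return {p for p in declared if p} - EXPECTED

# Usage (hypothetical file): a manifest declaring READ_CONTACTS,
# RECORD_AUDIO, and CAMERA would surface all three here, exactly the
# contacts/mic/camera pattern described above.
# print(surplus_permissions("AndroidManifest.xml"))
```

The design point is the diff itself: any permission an app takes beyond what its advertised function requires is a candidate supply line for the behavioral surplus Zuboff describes.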
This is a conversation that hasn’t happened yet. And if there’s a reason why people are really digging this book I’ve written [for which] I’ve spent so long trying to discern these mechanisms, name them, and bring them forth so that people can have them to think about and talk about, it’s that all of a sudden, here’s this thing I’ve been feeling, this problem I’ve been feeling that I didn’t really know what it was and I didn’t have a name for it, and now I can name it. And when I can name it, I can think about it, and there’s a language. And once you have that, really, the landscape changes and the power dynamics change. And I don’t mean just my book. There are many other inputs to this process. But as we begin to name, this whole power dynamic is gonna change, because this stuff can no longer get away with being secret, it can no longer get away with being surveillance. And when the jig is up on that, then we’re cutting at the core of its economic umbilical cord.