In The Social Dilemma, the documentary that has kickstarted a very important conversation about social media’s harmful effects on society, Tristan Harris not only pulled some neat coin tricks but gave us some chilling insights into how we’re being manipulated by Big Tech. Harris, a former Google executive and currently president of the Center for Humane Technology, tells Rohit Saran and Neelam Raaj that he hopes the documentary will be a much-needed pause button that gives us time to think before we mindlessly click.
The documentary has been quite a hit across the world and in India too. The fact that big tech industry insiders “confessed” to the damage they’ve done, albeit unknowingly, has made people sit up and take notice. Has the response taken you by surprise?
I heard that the film was briefly the number one Netflix film in India, and that was so surprising and exciting to me because this was intended to be a kind of global Silent Spring or ‘An Inconvenient Truth’ moment for the tech industry. An Inconvenient Truth (the Oscar-winning documentary) was about the global existential threat of climate change. Much like climate change, this documentary is about another kind of extractive economy. Instead of extracting oil from the ground, it is extracting attention from our fragile social fabric and minds, and when you have an extractive economy that’s built on commodifying, depleting and extracting from our minds and strip-mining cognition, you’re going to get a kind of social climate change and this kind of digital fallout. It’s great to hear that the film was received well in India…
Tristan Harris on social climate change
Just as climate change is the result of depleting natural resources that cannot be renewed or replaced, social climate change is caused by extracting and depleting our attention and mind space.
I hope the film creates a new shared conversation, a shared reality, a shared understanding about a common thing that has happened to us, which is a breakdown of our shared fabric of reality. I don’t know the equivalent metaphor in India, but in the United States, I would say something like, we’ve been bombed by a business model. There’s been a kind of Pearl Harbor attack that has leveled our cultural infrastructure. And I think that, again, this happened not because these tech engineers were evil and wanted any of this to happen, but through a slow accruing of decisions. It’s not that the people you saw in the film have all the answers, but it’s meant to catalyze a global conversation about what the answers are. Each country, whether it’s Myanmar or Brazil, is experiencing this very differently, but the point is that we’re all caught in it. Even those who disconnect from Facebook are still living in a country where everyone else is still on Facebook or WhatsApp, and the votes and the election will still be determined by these forces. So we’re all in this together in a sense.
The documentary is clear that we need laws and regulations for social media. How do we go about it?
In general, what we’ve seen is that when something like the GDPR (general data protection regulation) is passed in Europe, Facebook and other companies have to implement all of these changes to their platform. And then it becomes easier to turn that same set of data protections on for other countries, because now they can’t say, well, we can’t do that. And it really is about governments forcing them because they’ll keep saying, Oh, we can’t do that, that’ll cost too much, or we could never afford that many content moderators. If you force them to do it, they will find a way to do it.
Tristan Harris on regulating social media
“Social media companies’ profit is directly tied to the mass manipulation of human behaviour and beliefs…. If their business model is attention, then it’s not about truth, it’s just about what gets attention. And making up or exaggerating or distorting facts to serve political purposes is always going to be better for getting attention than delivering the truth.”
You’d be surprised what people figure out when you finally impose the constraint. Obviously, it’s better when those regulations start to align with each other because there is a shared conversation. And so I think that we need to get aligned on the issues. Since we’ve been talking about data protection, obviously, it’s important. And while we do need better privacy and data protections that go into limiting the models — the behavioural, predictive voodoo doll avatar models that they can create of human beings — we’re also going to have to think more about changing their fundamental incentives so that they’re just not operating on an extractive logic.
One joke we used to say is, Facebook is like a priest in a confession booth, where they are listening to 2 billion people’s confessions, and all their private thoughts, but then they also have a supercomputer next to them. And so as you’re making confessions, they’re actually making new predictions about the next confession you’re going to make before you know you’re going to make it and then they sell access to the confession booth to some third party to manipulate you based on that.
Tristan Harris on FB ‘selling’ your confessions
Data protection and privacy are very important, but no less important is preventing the extractive model social media companies use to predict our future behaviour.
Any talk of regulation is met with cries that freedom of speech is being infringed on. How does one deal with that?
So if I were the tech companies, and I wanted the conversation on regulation not to happen, I would strategically move the entire conversation about regulation into the domain of freedom of speech, because I know that would make it impossible for us to proceed, since it’s such a contentious topic. But I think there is a clear distinction between freedom of speech and freedom to reach. We can each have a right to say something to other people, but do we have a god-given right to broadcast to millions of people without accountability? A 15-year-old influencer can now reach as many people as a big newspaper, but with none of the same responsibilities as a newspaper or broadcaster.
I mean, in general, the philosophical principle is that with great power comes great responsibility, and the greater the power, the greater the responsibility. But what we’ve done in the way we’ve designed technology is decouple power from responsibility. To use an example from the US, YouTube recommended a conspiracy theorist by the name of Alex Jones, who has this programme called Infowars, which is just making things up constantly. In fact, after the Sandy Hook shootings in the United States, he claimed that they were all crisis actors, and parents who had literally lost their children were being harassed by his audience because he was spreading these things. So as an example, this person’s videos were recommended on YouTube 15 billion times! If you think about that, that’s more than the combined traffic of The Washington Post, The Times of India, Fox News and MSNBC, all of these channels combined, because YouTube creates this aggregate mass broadcasting power, and there’s no responsibility for what you say.
Tristan Harris on freedom of speech & freedom to reach
“With great power comes great responsibility, and the greater the power, the greater the responsibility. But what we’ve done in the way we’ve designed technology is decouple power from responsibility.”
Not only are these platforms not accountable for the harms that they create, they don’t even have transparency. We don’t know how many users are kids who are using them in problematic or highly addicted ways, though I want to clarify here that addiction is one of the problems, not the main one. But unlike, say, an alcohol or tobacco company, which puts a product on store shelves but doesn’t actually know which people are “addicted” or “underage”, platforms like TikTok and YouTube do know how the product is being used and whether the user is underage. But what if the platforms were required to say, look, of all of our underage users, this is the percentage that is using it in problematic ways?
Instead of just reporting to a board of directors, they could be reporting these harms to a board of the people, because they have to be democratically governed in the long run, and they have to serve the public interest. We’re never going to solve these problems if they’re just maximising for profit, and profit is directly tied to the mass manipulation of human behaviour and beliefs. So we have to also change the fundamental governance structure. I know that sounds like it’ll never happen, or it sounds too big. But I think that this is the conversation that the film is forcing. If the business model is attention, then it’s not about truth, it’s just about what gets attention. And making up or exaggerating or distorting facts to serve political purposes is always going to be better for getting attention than delivering the truth. As Sandy (Parakilas) says in the film, ‘The truth is fairly boring’.
Haven’t social media and search algorithms pushed the fourth estate to follow the same social media model of clickbait and page views?
That’s a very good point, and one which is not well underlined in the film. These platforms have undermined the fourth estate of democracy. But first, let’s address another critique directly. Most people would say, well, hold on a second, Tristan, weren’t television, radio and newspapers always competing for our attention? It’s not just these technology platforms; it’s not technology that’s really the problem here, right? But look at the platforms through which television, radio and especially newspapers now compete: it is social media. Social media has organised the game in terms of clickbait, so even the best newspapers in the world are having to succumb to this kind of clickbaitification.
So one of the subtler effects we can add to the balance sheet of harms is that it has ruined journalism. We have shorter attention spans, and more rapidly published stories with less fact-checking.
Tristan Harris on the curse of the click standard
“The faster you publish, or exaggerate a set of facts, the more rewards you get, the more likes and more followers and more boosting and you feel even better. …you could be increasingly wrong, but you get all this positive feedback that you’re right.”
Twitter has decentralised yellow journalism. Each of us is now a yellow journalist, incentivised to say in all caps, BREAKING: this thing just happened, or Trump or Modi said this. The faster you publish, or exaggerate a set of facts, the more rewards you get, the more likes and more followers and more boosting, and you feel even better. And you can be fed all of that feedback in your own little echo chamber where you could be increasingly wrong, but you get all this positive feedback that you’re right. And that is so subtle, but it has warped our entire media landscape, where we are really decoupled from reality.
Of course, this is not an easy problem to solve, but just as we moved away from the gold standard, where the entire economy was linked to physical gold, we have to move away from the click standard. And you know, I’ve often joked that if I were to write a book, it would be called The Click: The Mistake that Turned the World Upside Down. It would basically be about who can get a human nervous system to emit a signal that leads to your finger pushing down.
A panel led by Democrats has suggested breaking up Facebook and Google as one way of checking their monopoly power. What is your take?
I’m not an expert on those things, but I do think many people have rightly pointed out that if you just break up Facebook and Google into, let’s say, 1,000 different entities that have the same economic mass-manipulation incentives and the same advertising business model, you can still end up with this great quagmire of mass distortion of culture. So that’s where the counter-critique often comes in. I think we need common standards and ethical building codes, just like we have for buildings. To use the urban planning metaphor, right now it’s kind of the Wild West: whoever wants to build whatever form of environment does so, and there are no rules, so it kind of turns into Las Vegas, and everyone’s doing the race to the bottom of the brainstem. We need urban planning to build a more livable and humane digital environment, since more and more of our life is actually taking place in this virtual digital world, especially in Covid times.
What about WhatsApp? That doesn’t seem to have an advertising model and yet in India, fake news has led to a very real human cost before the pandemic and during it. But it didn’t find mention in the film.
This is one of the critiques of the film. WhatsApp doesn’t have ads or feeds, but the problem is that there has been a decentralisation of trust and authority, and more people are getting information from those around them. The psychology is that if lots of other people are saying this is true — group think or social proof — then I start thinking it is true. You can create a kind of surround-sound container, and that’s also based on the hacking of human vulnerabilities, except in this case it’s not an advertiser but our own organic practices. For instance, on Covid, I am not an expert, so I won’t be sharing information on the disease; but in a world where everyone feels compelled to share information first, we have decentralised the basis of who is sharing, who is speaking to us.
And even if you put limits on forwarding, there’s still a viral epidemiological rate for the spread of misinformation.
And each of us is a carrier of the virus. And then you have super spreaders: certain people who are constantly sharing links, and wherever they go, they’re shedding biases. So instead of shedding the virus, they’re shedding biases onto everyone else around them. And I think that’s a helpful way to think about the metaphors we need here, because it is about epidemiology, but information epidemiology.
Tristan Harris on fake news super spreaders
By indulging in mindless forwards and likes, we behave like super spreaders of the fake-news virus.
Imagine we never had WhatsApp and people chatted on video calls instead. I don’t think we would have seen this mass breakdown of shared realities, because people who are yelling crazy things would appear crazy to us, and we would be able to use our own native trust instincts.
And though it doesn’t have a business model of its own, WhatsApp is still part of the infrastructure. For example, just yesterday someone messaged me on WhatsApp, and then suddenly all of their posts showed up for me on Facebook and Instagram, because they’re building a link between the two. So eventually it all links together, Facebook, Instagram and WhatsApp, and they’re wiring them all together to make it harder to break them up.
Are you worried about the forthcoming US elections or you think there are enough safeguards in place?
Just to zoom out, Facebook manages something like 80 elections per year around the world. Think about how many people that they have on staff, and how many of those even speak the languages of the countries that they’re in. From the film and other news stories, we know about Myanmar and the Rohingya killings. That’s an example of digital colonialism — they rush into a country, they do a deal with a telco provider, so that when people get a phone, they get a Facebook account. In the case of Myanmar, the country was not even online at all so they had no internet literacy. So then you have people rushing into a new infrastructure, sharing information wildly without any of the knowledge about whether a website is authoritative or not. What happened in Myanmar is happening in dozens of other countries. Ethiopia is one of the countries that’s on people’s watch as a kind of Myanmar number two. There’s something like six different major tribal languages there. And Facebook doesn’t have moderators for all those languages, obviously. And you have people sharing whatever hysteria and misinformation that they want.
Tristan Harris on FB’s influence on elections
“I believe Facebook’s civic integrity team is actually funded by their antitrust budget, meaning that their efforts to do integrity work in these different countries are backed by their need to protect themselves from government antitrust actions.”
I believe it’s the case that Facebook’s civic integrity team is actually funded by their antitrust budget, meaning that their efforts to do integrity work in these different countries are backed by their need to protect themselves from government antitrust actions.
I think of it like a stove where you have like 200 countries and you have all these different stovetop pots with boiling milk. Suddenly, a bunch of them bubble up and start overflowing. And that’s when you’ve got a major civic conflict situation going on. But then by the time you rush in there, you only have so many resources, so many cooks that can try to go in and fix these things. So ask yourself, is this the structure of a system that can protect any election? You know the largest countries who can speak the loudest and have the largest resources and the largest press are the ones that get the most attention. So does Facebook have a US election war room that they’re working eagerly to try to protect that election? Absolutely. Are they trying to change the ad policies? Yes. Are we, though, 10 years into a mass polarization, mass breakdown of truth, mass conflict, escalating environment on Facebook in the United States? Yes. And most of the damage has already been done, which is why I’m actually excited about The Social Dilemma as a kind of a pause button. Because if we all become aware that this is what’s happening, we can pause it and say, let’s go back and be more skeptical of what we believe in.
What about better and kinder social networks emerging?
For many years, people would say, why don’t you just start another social network? And we would say, well, of course that’s not going to work, because the network effect means the existing monopolies suck all the attention into Facebook and YouTube, et cetera. But now we’re seeing some activity. Between Marco Polo, Telepath and Clubhouse, there are new social networks that are trying to do things differently. Before this film, they would never have been able to attract a critical mass of people, because people would say, well, there’s no problem, why would I want to switch? I like my Facebook account, or I like my WhatsApp account. I think this is the first time there may be hope that other emerging platforms could succeed, and I never expected that to happen. It’s actually really heartening, because I think we need new blood, new culture, new young people, new natives who are trying to figure out how we can make this work better, and hopefully some will come from India.