Hi everyone, I'm Emily Chang, and this is Bloomberg Studio 1.0. As many of you know, I've been covering tech from Silicon Valley for over a decade, which means I've witnessed the epic rise of so many different technology companies, watching them shape our world, both for better and for worse. So when I watched Netflix's hit film The Social Dilemma, where yours truly even makes a small cameo, it really hit home. The film has soared to the top of most-watched lists in multiple countries, waking the wider world up to the alleged perils of technology. The indictment is unforgiving: that social media is a drug, creating an epidemic of addiction and manipulation, and generations more anxious, depressed, and unhealthy than ever before. We built these things, and we have a responsibility to change it. The film profiles multiple Silicon Valley veterans, including Tristan Harris, former design ethicist at Google, now co-founder of the Center for Humane Technology, and Tim Kendall, former head of monetization at Facebook and former president of Pinterest, now CEO of Moment. They joined me for a virtual conversation on this edition of Bloomberg Studio 1.0, along with Safiya Noble, an associate professor at UCLA and author of a best-selling book on bias and technology, to talk about what the movie got right, what the movie got wrong, and chart a path towards the social solution. Each of you has been a voice in the conversation around big tech's unchecked power, and I'm curious: when was the lightbulb moment for you? What was the moment that sparked your desire
to speak out? I think the moment for me was when I was at Google, and it was about two thousand twelve. Having studied at this lab at Stanford called the Persuasive Technology Lab, I saw that more and more of my friends' decisions were about building engagement and growth, and less and less about a genuine, sincere intent for how do we build technology that empowers people and does good in society. And that was really the lightbulb moment for me. Now, Tim, you were focused on how Facebook makes money for five years and then five more years at Pinterest. That's a decade. When did things shift for you? They really shifted when I had my first child, and I realized, I think, as I was trying to manage, you know, the job at Pinterest, trying to be a dad, and then my phone, that I just lost control over my phone
and realized that it was controlling me. Maybe what we were up to in terms of extracting attention, as Tristan will say, might not have a very happy ending on a lot of different dimensions, including, you know, our own mental well-being, but also the impact that ultimately it could have on society. Tim, you recently testified before Congress and said that you believe social media is driving us to the brink of civil war,
which is quite dramatic. What do you mean by that? Well, I just think that the services are polarizing us on almost every dimension. They're polarizing us politically. They're polarizing us in terms of how we view health issues like COVID. They're polarizing us racially. And in a sense, I think it's fair to say that all of these issues around polarization existed to some degree before social media. But I think we can all agree that polarization has accelerated. In fact, I'd say the acceleration is accelerating, which is really scary. It seems to get worse every year and every month. We seem to be more and more divided. And I think that the violence and division that we're seeing, I believe these are precursor events to broader civil unrest that could ultimately lead to civil war. I mean, that's a huge statement. Safiya, would you agree? What we're talking about are histories of violence and oppression in our country that social media exacerbates and weaponizes against the most vulnerable, against people who have historically been in the crosshairs of violence in this country. It's really about the society and the people, and then what the technology is contributing to, or not. Now, there's a tragic irony in that most of the people tasked with talking about the problem, the social dilemma, in the film are white men; Tristan and Tim are two of them. And white men are overrepresented at the companies that are building these products and creating part of the problem.
As a woman of color, what would you have added? Well, I would have backed up probably twenty years in the research, because again, I think that women in particular and people of color have been doing the research in this area for a long time. It's a little bit disheartening for some of us to see the makers of the problems, in terms of people really actually working on product development, all of a sudden see the light when it starts to affect their own communities or their own experiences, or maybe their own moral barometer, and then kind of really narrow the conversation. I think there's an opportunity with this film to in fact open up the landscape for fifty more films and books and graphic novels and television shows, for many, many more voices. And I think that's really the opportunity with a film like this. It's, you know, kind of cracked open the can, but there's a lot more to pour out. How much of this is technology, and how much of this is just human nature that technology and these social platforms
simply amplify? There's a point in the film where Tristan talks about how it's not so much the technologies per se, it's that the technologies really bring out the worst in people and amplify the worst tendencies in people. Racism is endemic, and it's been around for thousands of years, but we've really started to see hate and violence around these issues just explode by virtue of these tools that bring out sort of the worst tendencies, or at least bring out tendencies that are there but are now being brought out into the light in a way
that's really corrosive. Now, we do have a statement from Facebook on the film. Facebook says: we should have conversations about the impact of social media on our lives, but The Social Dilemma buries the substance in sensationalism. Rather than offer a nuanced look at technology, it gives a distorted view of how social media platforms work to create a convenient scapegoat for what are difficult and complex societal problems. Tristan, what's your reaction to that? Yeah, we would expect Facebook to respond in this way. I think the claim that Facebook would make in that statement is: we're holding up a mirror to your society, and if you don't like what you see in the mirror, we're really sorry to tell you, it was already there. But the mirror they're holding up is a funhouse mirror that warps society, because it amplifies certain characteristics and makes some way more visible than others, specifically conflict, outrage, moral righteousness, et cetera,
because those things get more clicks. And it's this race to the bottom of the brainstem, this race to the lizard brain. Whether it is racism, inequality, poverty, or climate change, if we're all walking around as outraged, narcissistic, and addicted people with low mental health and feeling overwhelmed, that's not a society that can heal itself. It amplifies whatever gets the most attention, which is just like driving down the 101. If we sort by what gets the most attention, and I look at a car crash, Facebook and Google say, oh, you must love car crashes, and they give you more and more car crashes. And that's kind of why our society looks like it's sort of gone crazy, because that's now become a virtual reality, and we start making choices based on that reality. Even if you don't use these platforms, you still live in a country that is living in that virtual reality, voting based on those decisions, and bringing guns to street corners based on those decisions. So I think we have to really look at the entire
consequences here. This is my conversation with Tim Kendall, former head of monetization at Facebook and former president of Pinterest, now CEO of Moment; Safiya Noble, an associate professor at UCLA and author of a book on bias and technology; and Tristan Harris, former design ethicist at Google, now co-founder of the Center for Humane Technology. Up next: can we expect the makers of social media to make changes on their own? What will it take? Are the creators of these platforms really bad guys? I'm Emily Chang. This is Bloomberg Studio 1.0. What I want people to know is that everything they're doing online is being watched, is being tracked. Every single action you take is carefully monitored and recorded. A lot of people think Google is just a search box and Facebook is just a place to see what my friends are doing. What they don't realize is there are entire teams of engineers
whose job is to use your psychology against you. That was the co-inventor of the Facebook Like button, the president of Pinterest, Google, Twitter, Instagram. There were meaningful changes happening around the world because of these platforms. I think we were naive about the flip side of that coin. A number of guests in the film are asked, is there one bad guy? And they all say no, no, no, there's not just one bad guy. And at this point,
let's take Mark Zuckerberg, for example. He can see the consequences of what he's built, but he is still building it. Does that mean he's a bad guy? I think it means he's negligent. His understanding is absolutely not keeping up with the impact that his technology is having on the world. And I think the best illustration of this is: the election happens in two thousand sixteen; Facebook, in a number of ways, through bad actors and algorithmic bias, tips the election. A month later, he's asked, what do you think about that? He thinks it's preposterous. And two years later, he acknowledges the impact, acknowledges that Facebook likely did tip the election, and then apologizes in front of Congress. But that's two years later. Tristan, what do you think about Mark, about the Google founders who have recently left the company? Are they bad guys? I think I agree with him that there's a lot of negligence. Each company
is different, and the personalities that are involved are different. I think that even if they were as enlightened as they possibly could be, when they speak, they sound like hostages trapped in a hostage video, where nothing that a hostage is saying makes any sense until you see the person offstage holding a gun to their head, which is their business model. I would say Jack Dorsey appears to be trying to do the best that he can, but in a very, very complex and, you know, unfortunate circumstance. And so that's why we actually need a global cultural movement to address these problems, because the state has to bind the market. This film is exciting to me because it's the first time that we have a hundred and ninety countries and thirty languages, I think, all becoming awake to these problems at a massive scale. Now, Safiya, here in Silicon Valley, we often glorify the founders of these companies. As a Silicon Valley outsider, how do you think we can hold them accountable? I think that we've got to have a big-time structural intervention. It's going to require regulation. It's going to require repatriation of the trillions of dollars they've made off of making these problems, and putting them back into the systems that can actually take care of our society. We can't be so dependent upon this sector or its leaders. Tim, I'm sure you're still friends with a lot of people who work at these companies. How have
your former colleagues reacted? I don't think fundamentally anyone who works there is a bad person. I do believe they're negligent. I think they fundamentally believe that their platforms are doing net good in the world, and so they look for the data that reinforces that. Now, you have had your critics. Brooke Hammerling, a tech communications veteran, said of a number of the folks interviewed in the film: most of them made a great deal of money from the companies they're now criticizing. I mean, they made plane money, guys (she means enough money to buy a plane), and when you have a G5 at your disposal, it's really easy to sit there and criticize. I ask them: why not give a bulk of your wealth away to fight this, to undo the damage you yourselves helped create? Tim, I'm sure you made some money in that decade. How do you reconcile that? I think it's really fair criticism. I hadn't read Brooke's exact statement. I've put ten million dollars plus into this issue, and we've given away millions of dollars to nonprofit organizations that are helping around these issues. And our intention is to keep giving away a lot of money, and a big chunk of the money that we made from these platforms. So look, I think it's really fair criticism. Let's say Mark Zuckerberg wakes up tomorrow and says, I'm done, shuts down Facebook, no more Facebook.
What happens? Is the world better or worse off? I think it would be perfectly fine for Facebook to shut down. I'm on the Real Facebook Oversight Board, you know. I think we're really interested in seeing some important interventions happen quickly before democracy continues to collapse as we hurtle towards the next presidential election in the United States. But, you know, I think it's more complex than just saying Facebook should shut down, although I think there are many, many people who would agree it probably should. But if we stay on the current path, where there's no oversight, no regulation, no law, no responsibility, we can be guaranteed that this is just the tip of the iceberg. So what should the laws be? What should the government do? If these companies aren't going to change on their own, then should they be broken up? Should there be new laws?
I look at the amount of regulatory reform that we need here as being on par with the financial crisis, in the sense that in the financial crisis, you had thinner and thinner slices of sort of junk bonds being sold as if they were, you know, high-quality bonds, leading to a full-scale house-of-cards collapse. I think we've been selling fake, thinner and thinner slices of junk attention assets: fake attention sold for fake clicks to fake users, with fake reporting to advertisers. And we have an entire sort of democratic collapse sitting on top of the fact that this has been a predatory economy based on, you know, high-risk activity, recommending random things to users you don't know and whose language you don't speak, leading to a genocide. So if you think about the scope of reform, we're not just saying let's pass this law or that law. We're saying, how do you get something more like Dodd-Frank: comprehensive humane technology reform? How do you have, you know, comprehensive reform that enables competition, has more transparency and accountability for harms, has tax justice, as Safiya is talking about, repatriating some of the money back into institutions that have fallen apart, and have that all add up to a system of technology that reports to the people, as opposed to reporting to the board of directors, and works in the service of people?
You're listening to my conversation with Safiya Noble, an associate professor at UCLA and author of a best-selling book on bias and technology; Tim Kendall, former head of monetization at Facebook and former president of Pinterest, now CEO of Moment; and Tristan Harris, former design ethicist at Google, now co-founder of the Center for Humane Technology. Coming up, we chart the path towards the social solution: is opting out of social media enough? I'm Emily Chang. This is Bloomberg Studio 1.0. Stay with us. True to its name, the movie talks a lot about the social dilemma, but not a lot about the social solutions. As the credits start to roll, Tristan, you're allowed to offer one suggestion, about turning off notifications. But really, in the short term, while we wait for these big changes that could take years to happen, what do we do? Do we delete? Do we get rid of
our phones? How do we navigate the day to day? Even if you decide not to use these platforms and you delete your accounts, people who are not on these platforms can still experience life-or-death consequences, because other people who live in their country do. The Rohingya, the Muslim minority group in Myanmar who were murdered, didn't have to be on Facebook to experience the consequences and be persecuted because of fake news that was spread about them on Facebook. The mob lynchings in India, where people were spreading fake news about a particular minority group, which led to lynchings: they didn't have to be on Facebook to experience that. And when you think about teenagers who say, hey, I would like to opt out of being on Instagram, I don't want to do that anymore, I don't want to be on TikTok anymore, they still go to a school where all the kids are ranking each other by who's popular based on how successful they are on TikTok. So the point is that this has so entangled itself into our social fabric and into our social status pipelines and into the development of children that even if you opt out, that's not enough. One thing that can change at a fast enough clock rate is culture, meaning a common cultural doctrine for a humane way to live in an inhumane technology infrastructure. And I think that's kind of something we have to
co-create together: a nonviolent communication protocol, or a humane protocol, for living on inhumane platforms. There are many, many more recommendations at humanetech.com. I think there's a lot of ways that we need to intervene. I will tell you, though, that putting it back on the public and saying, you know, it's your fault you didn't take the phone out of your kid's hand, or you, parent, are wrong because you had too much screen time going with your kid, I think is really the most powerfully disingenuous kind of framing there is. It's like, you know, if you don't have any healthy food in your community and you don't have any grocery stores, and all there is is fast food, and then you say, shame on you for feeding your kid McDonald's or corporate industry food. It's just not fair. You know, it only took a small handful of people to really convince Congress and state legislatures and local municipalities that smoking was terrible. You know, my students can't even remember; they weren't alive at the time, but some of us on this call maybe were. You know, I'm sure that my mom probably smoked a pack of cigarettes after she gave birth to me. It's probable that the doctors and the nurses were smoking in the hospital in the seventies when I was born. And so we have to ask the question: how did we go from that to what we have now, which is a total paradigm shift? And so I agree we need cultural shift, but that was just strategic public policy that said, even if you're not a smoker, there are still tertiary negative effects that happen from secondhand smoke, or from these systems that neglect the broader
public concerns. That's the kind of model we need to think about. And I think, as we're looking at the larger tech sector, there's a lot of promise and opportunity to make change. You've got the House and the Senate scrutinizing big tech right now. Should these companies be broken up? Yeah, I think there's a, you know, monopoly conversation that needs to happen. I also think that there is an interesting page to be taken out of the auto manufacturers' book. And what I mean by that is that auto manufacturers were addicted to the fossil fuel industry, and they were incentivized to move to electric because it was better for the world and it was better for people. And so I do wonder if there is a possibility and a path that could be co-created between these tech companies and governments all over the world, where incentives could be put in place to segue off of an attention-based business model, i.e. the fossil fuels, in hopes of finding what is the electric and solar and green model, and how do we work together to get there. If technology is advancing exponentially, only going to get better at manipulating us, can we actually intervene? I'm sorry, I
don't think the companies will change from within. I think that when you have a product that creates so much consumer and public harm, it's actually an unsafe product. Until those are regulated and contained as unsafe, they will continue. Why wouldn't they? I mean, again, we're talking about billionaires adding billions to their coffers right now, so there's no incentive for them to stop the model. And they're also, quite frankly, so insulated, you know, with their wealth, from ever even being affected by the harms. We've seen that they don't even let their own children engage with the technologies that they make. So we know that the makers are not going to be the source of the intervention. It's going to require an informed public, electorates, policymakers, regulators around the world. It's going to take those kinds of actions, and of course the culture shift that we need, in order to see the changes that we want. Tim, if you could say something to Mark Zuckerberg right now, one thing he could do, what would it be? Change your business model. Start having the conversation at Facebook about how the business model is corrosive and what could be the path to segue off of an attention-based business model to something else that is aligned with the
public good, like a subscription model. Possibly a subscription model with governments all over the world subsidizing, in the same way that they subsidize fossil fuel companies shifting to, you know, more green technologies. I think that may be our only path out of this. Where are we in five to ten years? Do you believe we'll be better off or worse off? I don't know if we're going to make it another year, frankly. If we don't change these things immediately, I am deeply concerned. And I think that this thing has eroded faster than any of us ever anticipated. I think, as Tim says in the film, with the changes and the developments we've seen in the last few years, I didn't think we'd get to a world where fringe conspiracy theories would make up a third to a half of major political parties. We're now ten years into this mass hypnosis, this mass warping effect on the collective psyche. So even if we kind of vote in this election, we have to realize that the basis for that vote, the basis for how we're all thinking, believing, and seeing reality, has been warped for many years now because of these forces in the business model. I think we're better off. You know, I'm ultimately optimistic. Given some of the things that we all talked about, I believe in five to ten years, much like what happened with tobacco, much like what happened with seatbelts, which, by the way, were incredibly controversial, and automakers didn't want to put them in, and now we have seatbelts in every car and it's not controversial, hopefully we'll get on the other side of this in the same way we did with those historic issues. I think of abolitionists in my own community. I think of women struggling for the right to vote. I think of so many different movements that have changed paradigms. And again, it doesn't take millions of people to do that, but it does take enough people of conscience, of determination, of love, and of care to organize and stay persistent and relentless. And I think we've been there, and we're going to continue to be there in this
domain too. All right, well, with that, we will leave it there. Tristan Harris, Safiya Noble, Tim Kendall, thank you so much for joining us. Bloomberg Studio 1.0 was produced and edited by Kevin Hines. Our executive producer is Candy Chang. This is actually her last episode; she's moving on to her next adventure, and I just want to say, we love you, Candy. Thank you, Candy. We'll miss you. We are so grateful for the love and hard work you poured into this show, and we'll try to do you proud. Our managing editor is Daniel Culbertson, with production assistance from Mallory Abelhausen. I'm Emily Chang, your host and executive producer. This is Bloomberg.