
Misinformation, health, and social media

Apr 25, 2022 · 22 min · Season 3 · Ep. 8

Episode description

Imran Ahmed, CEO of the Center For Countering Digital Hate, joins the podcast to discuss the proliferation of misinformation across social media, how medical misinformation has become a profitable industry, why it's been allowed, and what can be done.

Read the CCDH's reports, The Disinformation Dozen and Pandemic Profiteers.

You can continue the conversation with Jessamy and Gavin on Twitter by following them at @JessamyBagenal and @GavinCleaver.

Send us your feedback!

Read all of our content at https://www.thelancet.com/?dgcid=buzzsprout_tlv_podcast_generic_lancet

Check out all the podcasts from The Lancet Group:
https://www.thelancet.com/multimedia/podcasts?dgcid=buzzsprout_tlv_podcast_generic_lancet

Continue this conversation on social!
Follow us today at...
https://twitter.com/thelancet
https://instagram.com/thelancetgroup
https://facebook.com/thelancetmedicaljournal
https://linkedIn.com/company/the-lancet
https://youtube.com/thelancettv

Transcript

This transcript was automatically generated using speech recognition technology and may differ from the original audio. In citing or otherwise referring to the contents of this podcast, please ensure that you are quoting the recorded audio rather than this transcript.

Gavin: Hello, welcome to The Lancet Voice. I'm Gavin Cleaver, it's April 2022, and we're very pleased to have you on board. This week, I'm chatting with Imran Ahmed, the CEO of the Centre for Countering Digital Hate, about their recent work looking into how medical misinformation propagates across social media, why it's so widespread, and what can be done about it.

Remember, you can subscribe to The Lancet Voice and our series of specialty-focused In Conversation With podcasts wherever you usually get your podcasts. For now, take care. Let's hear from our guest, Imran Ahmed.

I thought perhaps we could kick off by talking about The Disinformation Dozen, which, when you published that report, seemed to get fantastic traction in politics in the UK and in the US. And that report, of course, found that 12 people are responsible for 65 percent of vaccine misinformation posts.

Tell us a little bit about how that research came about, how valuable you think it's been, and what some of the follow-on from it has been like.

Imran: So you have to remember that this report came out in March 2021, and it was situated in a moment when there was still a debate over whether vaccine hesitancy was a broad social manifestation that had just come out of nowhere, or an organized, disciplined anti-vaccine industry.

Now, we'd published a report in June 2020 on the anti-vaccine industry, warning that if they got their claws into the COVID crisis, it presented the perfect environment for them, because there was high epistemic anxiety: people were unsure where to get information from, what was true, and they were yearning for certainty.

And there was this industry of people who have built sustainable business models, economically very successful business models, for spreading misinformation and using it to generate revenues off the back of it, which then helps them to build more content and more misinformation, et cetera. So we wanted to focus minds on the fact that a small number of bad actors, very organized and disciplined, with 501(c)(3)s, so charities that they own, and limited companies, could build a vast amount of misinformation.

What we found was that 65 percent of the shares of misinformation on social media platforms had links back to material created by the Disinformation Dozen. And I think what that helped to do was to create the sense that there is a cogent counterforce to public health. And that in itself is an amazing thing: to think that there are people out there who are militating against the success of the public health profession in the middle of a pandemic.

So it focused attention on that. And it made people think as well about who these people are, what their motivations are, what their strategies are. In that respect, it advanced the debate. But we also saw the utility in providing inoculation against the misinformation they were spreading.

So one of the ways to inoculate people against misinformation is to educate them about the motivations of the people spreading it. And to show that these were economically motivated spivs was really valuable in that moment.

Gavin: That kind of good faith, bad faith argument is really interesting to me.

Because it seems to me that a lot of people interacting with the Disinformation Dozen, and with posts that come from them, could be said to be acting in good faith themselves, because they're taking on this kind of information, as you said. But I think your follow-up report, where you showed how profitable it actually is for people to spread misinformation, really gets at the heart of the bad faith behind the original spreading of these rumors.

Imran: That report came about because I wanted to produce a sort of deck of cards with the 12 key anti-vaxxers, and I said I'd like to see how much profit they make. So it's almost like Top Trumps, where you could say, my guy makes more profit than yours does. And Colette Camden, who was making a documentary for Channel 4, which eventually became The Anti-Vax Conspiracy documentary broadcast last year, said, can you turn it into a full report?

And so we did. And we went into forensic detail. We have investigative journalists working for CCDH, and I myself have a finance background, so we dug deep to work out: how do they make money from spreading misinformation? Because that was a question I'd be asked all the time: why do you think they're doing this?

And I'd always say it's very curious how much money they make from something that they claim to believe they're doing for the public good. But here we could quantify that, and these are shocking amounts of money. Bullshit is highly profitable.

Gavin: And it really seems to be ingrained in the U.S. at the moment, and it's having what seems to be quite a severe effect on public health.

Imran: Look, I think that there is a nexus of two things happening here. First of all, you've got the veneration of wealth and of, in inverted commas, entrepreneurialism. And the second thing is that you have a culture of impunity that has been bred over two decades in the U.S. around the internet.

There is a sense that the internet is free of the moral boundaries of the normal world, and that in part is driven by legislation. One of the things I bang on about is Section 230 of the Communications Decency Act 1996, a bit of law passed in the early days of the internet, which said that, to help the early internet grow, if something is on your website but you didn't put it up there, you are not legally liable for that content, but you also have full protection under the law if you want to remove that content. So it allows you to moderate, for example, a bulletin board, or, if you're a news site, the comment section.

But the problem is that when social media was introduced, what that created was business models where people essentially profit from millions and billions of other people writing content for them, and they are not legally liable for any of it. And that legal impunity has built a culture of impunity over a generation of tech executives coming forward, where they believe they can make money from anything, and it's none of their business what it is that they are enabling.

Gavin: So what changes would removing that piece of legislation make? How do you think it would change the face of the internet?

Imran: I advocate for all companies having to respect a duty of care to both their users and society. And that's the way the law works generally: if you breach your duty of care, and if a reasonable person would say that you have not followed your duty of care, then you can be liable for harm caused to people as a result of the use of your product.

The only industry in the world that has that sort of immunity is the tech industry, in the U.S. specifically. And various countries are now thinking about how to start to create this duty of care within the industry. The U.K. has the online harms plan, which is now the Online Safety Bill, which literally introduces a duty of care.

And it says: if you fail to enforce it, we're going to have a regulator who's going to analyze whether or not you're doing so, and will punish you for it, will potentially fine you for it. In the U.S., the way that a duty of care is typically enforced is actually through litigation. People have said that if you remove Section 230 it would create a crisis where internet platforms wouldn't be able to survive. That's just not true, because the court system still has a corpus of law and of cases where it can say: this is the point at which you become legally responsible for the harms caused on your platform.

You can't be sued, for example, if something terrible happens and it's beyond your control. But if you haven't put into place the reasonable processes and controls to ensure that it doesn't happen, and you've known that there is a problem, then you may be liable. And I think that would be a good way to change, or at least to change, the cost calculus for these companies, so that they feel that they operate within a system that has accountability.

Gavin: Yeah, to move the needle slightly. Yeah, that makes perfect sense. I think, actually, on the subject of removing and reporting posts: we've all tried to report a particular post to Facebook or Twitter, and then usually you get the response, this doesn't violate our community guidelines.

Why does the system with social media seem so unresponsive when it comes to this content? 

Imran: It's very simple. Until now, the social media companies have played a really cunning game. They are subject to the same pressures as any other company. They realize that they're creating harm.

They realize that there is a moral responsibility, but they know there's no legal responsibility, and that in the end they're also accountable to their shareholders. Their shareholders want them to maximize profits. And if they remove content, they literally reduce their revenues, because every bit of content helps them to generate revenues by attracting eyeballs and therefore allowing them to place ads next to it.

To actually remove the content imposes a cost. For them, to do the right thing both reduces their revenues and increases their costs. To do the wrong thing, which is to sit back and allow this all to happen, is insanely profitable, as we've seen from the enormous market values of the major social media platforms and of the individuals who run them.

Mark Zuckerberg is worth a hundred billion dollars, and he's younger than I am. Which is a phenomenal business achievement, but that model is unsustainable. Because the truth is, post January the 6th and post the pandemic, I think the debate has changed fundamentally.

They are now starting to destroy long-term shareholder value. Their brands are being destroyed. Facebook used to be a place where, if someone went to work there, you'd say, that's so cool. Now you'd say, oh, how are you going to live with yourself? And that is a fundamental shift in these companies.

So I think that we are seeing a tipping point now, where people are starting to become aware. And I can tell you that in my job, I'm on the bleeding edge of the legislation and the social consequences that we can create for these platforms. In the last six months I've done the British Parliament, I've done Congress, I've done a number of other countries, talking about specific legislation that they're seeking to bring forward.

I think the age of impunity is over.

Gavin: Yeah, and those attempts at self-regulation by the social media companies, because we've all seen as well those little badges on COVID-19-related posts inviting you to click through to get fact-checked information. They're just pathetic, aren't they?

Who is clicking those to go through and get the information? 

Imran: Prima facie, they seem weak. And actually, the data these companies have shows that they're weak as well. Ad credits are weak. An internal study at Facebook came out in the documents from Frances Haugen, the whistleblower from Facebook, who brought forward thousands of internal documents.

One of those documents said that they'd analyzed the free ad credits they'd given to international health bodies, and found that those ads were net negative rather than positive, because they got flooded with comments by highly interconnected, highly active trolls, which then undercut the point of the post in the first instance.

So the most-liked comments on there would be people attacking it and spreading misinformation. So the truth is that the platforms are so toxic now that even when they try and give ad credits to the good guys, the bad guys are still able to game those systems to work for them.

Gavin: What kind of different challenges do closed platforms like Telegram and WhatsApp present?

I ask because there's quite a strange campaign going on in the UK at the moment of people turning up at vaccine centres and demanding that they shut down, using common-law excuses. And from what I understand, these are all organised on Telegram.

Imran: They can be organised on Facebook groups as well, but each platform has a different utility.

Twitter is not as widely used as Facebook, for example, or Instagram, but it is used very heavily by elites: journalistic elites, political elites. And so it's a very good way of shaping elite discourse. And also, the way that Twitter works, the sort of mathematics of that platform, means that 50 people can generate 5,000 notifications on someone's phone.

In a couple of minutes, you each tweet, you each quote-tweet each other, and then you like the tweets; that's 50 times 50 times two notifications. And if you got 5,000 notifications on your phone, you'd say it's blown up. With 5,000 notifications, your brain says a hundred people must feel that for every one who said it.
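The arithmetic here can be sketched out quickly; the figures (50 people, a quote tweet plus a like each, and the hundredfold mental multiplier) come straight from the description above, while the variable names are purely illustrative:

```python
# Sketch of the notification arithmetic described above: 50 coordinated
# accounts each tweet, quote-tweet each other's tweets, and like them.
people = 50
interactions_per_person = 50       # each account engages with every account's tweet
notifications_per_interaction = 2  # a quote tweet plus a like
notifications = people * interactions_per_person * notifications_per_interaction
print(notifications)               # 5000 notifications land on the target's phone

# The target's intuition multiplies by ~100 ("a hundred people must feel this
# for every one who says it"), when the honest reading divides by 100 instead.
perceived_crowd = notifications * 100  # feels like half a million people
actual_people = notifications // 100   # in reality: just the original 50
print(perceived_crowd, actual_people)  # 500000 50
```

The same 5,000 notifications are read as half a million people when the true crowd behind them is 50, which is the misreading the interview goes on to describe.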

And so you think there's, you know, half a million people, but it's actually not multiplied by 100; it's divided by 100, down to 50. The mathematics of those platforms, and how poorly they're understood by their targets, is why we've seen this rapid mainstreaming of aberrant, fringe narratives, often amplified

by the misreading of the mathematics by political and journalistic elites. That's what Twitter's used for. Instagram is for targeting younger people; it's for highly visual memes. Facebook is where you have the drip of misinformation into groups, in which you can recolor the lens through which people see the world through frequency bias, showing them aberrant content on a regular basis. Telegram can be used because it's basically only used by people that agree with you, by the fringes, and so it's a great way to operationalize and to further radicalize. Facebook is where you bring in new adherents.

Telegram is where you radicalize them. If you want to organize something really bad, you do it on a peer-to-peer, end-to-end encrypted system like WhatsApp, which allows you to basically be outside the view of even law enforcement. So that's where you organize a terrorist attack.

It's where domestic terrorist attacks were organized: on Signal, on WhatsApp, on these closed platforms that have no visibility. And we don't even know who it is that's interconnected there.

Gavin: You mentioned your background in finance. What drove you to take this particular role on?

Imran: My career's been varied.

I started off at med school in the UK, at UCL, in '96, and then, actually, I couldn't stand it. I quit, and I went into finance. I worked for Merrill Lynch; I worked for various management consultancies. And then 9/11 was the reshaping of my awareness as a human being. I'm one of that particular generation that's grown up in absolute peace.

And that was the first time I realized there are bad things in the world; there's real evil in the world. So I went back to university, to Cambridge, and studied politics. I left and worked in parliament for a number of years. And what really shifted it for me was both the rapid influx of anti-Semitism into UK political discourse on the left and into the Labour Party, and then seeing the instrumentalization of misinformation and hate in the referendum, which led directly to the murder of my colleague, Jo Cox MP, because I was the special advisor to Alan Johnson, who was leading the Labour Remain campaign.

And in that moment, I just thought: no, I can't do this anymore. I can't pretend that we are the locus of political discourse, that it happens in parliament or in the media. This is happening in Facebook groups, in spaces that we aren't even aware of. And I was aware that these spaces existed.

I couldn't persuade politicians to be interested, and so I set up CCDH to work out how to disrupt those spaces from being used, without the help of Parliament. And I said at the time it would take five years before legislatures started to work out that something was seriously wrong. And as it happened, six years later, I was right.

It took five years, but now we're in the phase where legislatures are starting to respond. We've evolved as an organization. We were two people initially; we're now 20 across the U.S. and the U.K., and we have a reputation that means everyone from the Department of Homeland Security's domestic terrorism people, to the White House, the Surgeon General, and the National Security Council, through to legislatures all around the world, takes us seriously, because we've been thinking about this for a long time now, and worrying about the way in which the drip of misinformation and the reshaping of elite discourse allow radical movements to emerge very fast, achieve great scale, and do enormous harm, because they undercut everything we care about: democracy, and the common fights against common problems like pandemics and climate change.

Gavin: How big do you think that gap is, then, between, let's say, institutions and legislatures and government, and the police as well, in understanding how these digital spaces work?

Imran: You know, every now and then I'll find someone that really gets it. I remember speaking to a really senior guy in the intelligence community once in the U.S.

S. And I went through my situation analysis and what solutions I think we need to put into place. And he said, it's like you've got a spy camera in my head. And so there are people that are thinking this in the background, but we help them to articulate it and to bring to life what happens.

You're right, part of my job is education. But part of my job has also been education within the public health industry. We've been working with the American Public Health Association and with UK bodies; I've been advising DCMS during the pandemic on cross-departmental bodies. And it has been about educating people as to the nature of the harm, because until about a year and a half ago, I would get asked the question all the time: isn't this just online, though?

This is just online; people don't really believe this in the real world. But after January the 6th, and the pandemic, and the anti-vax problem, I haven't been asked that question again. Now the question is: oh, God, what do we do about it? You were right. We've taken people along on the journey, and I think we are able to continue to influence that debate.

In May, we've got legislators and civil society from around the world coming together in Washington, D.C. at a conference we're organizing, to talk about sharing learnings, sharing situation analyses, sharing evidence points from around the world. I think there's still an education job to be done, but my charity is there to do that.

That's what we're set up to do. 

Gavin: Is there any advice that you could give to, say, individuals, or even medical professionals, who might not be working in these bodies, of course, we're talking about governments and institutions, but are still concerned about the proliferation of anti-vax information they see online?

Imran: From the beginning, we've been really aware that this is a battle of epistemic authority and weight. It's about who has the right, who has the credentials, to speak to the public and give them authoritative information. One of the anti-vaxxers' key themes is that you can't trust doctors.

They only have three themes: COVID's not dangerous; vaccines are dangerous; and you can't trust doctors or public health professionals. And I think that the real power that you have is the power of the relationships that you have built. The authority, the social influence, the social proof that you have for the words that you say in your communities.

So it's to ignore the anti-vaxxers; to block them, both from your minds and also literally on social media platforms; block them if they're harassing you; and do not let them stop you doing what it is that you do, which is providing convincing, authoritative information to the people that trust you, that have relationships with you.

Because that's how we win this. This is an epistemic battle about who has the right to convey information. And the best thing that we can do is exemplify the professionalism and authority that you hold, every single day.

Gavin: That's it for this episode of The Lancet Voice. If you want to carry on the conversation, you can find Jessamy and me on Twitter, at our handles @GavinCleaver and @JessamyBagenal. You can subscribe to The Lancet Voice, if you're not already, wherever you usually get your podcasts. And if you're a specialist in a particular field, why not check out our In Conversation With series of podcasts, tied to each of the Lancet specialty journals, where we look in depth at one new article per month.

Thanks so much for listening, and we'll see you again next time.
