S2: Bonus Ep 1 - More on "What Went Wrong?" with the New York Times

Jul 20, 2023 · 18 min · Season 2 · Ep. 2

Episode description

New York Times investigative reporters Michael Keller and Gabriel Dance share more about what tech companies and the government are doing and aren’t doing to confront the vast trading of child sexual abuse material across the internet. 

If you would like to reach out to the Betrayal Team, email us at [email protected].  

To report a case of child sexual exploitation, call The National Center for Missing and Exploited Children's CyberTipline at 1-800-THE-LOST 

If you or someone you know is worried about their sexual thoughts and feelings towards children reach out to stopitnow.org 

In the UK reach out to stopitnow.org.uk 

Read the article by Michael Keller and Gabriel Dance, The Internet Is Overrun With Images of Child Sexual Abuse. What Went Wrong?

See omnystudio.com/listener for privacy information.

Transcript

Speaker 1

Topics featured in this episode may be disturbing to some listeners. Please take care while listening.

Speaker 2

This is a crime that thrives in the shadows, and people needed to hear what was actually going on.

Speaker 3

One of the biggest problems reporting on this is nobody wants to hear about the problem because of how awful it is.

Speaker 1

I'm Andrea Gunning, and this is a Betrayal bonus episode. In episode four, you heard from New York Times reporters Michael Keller and Gabriel Dance as they spoke about their twenty nineteen investigative piece on child sexual abuse material. It's called The Internet Is Overrun With Images of Child Sexual Abuse. What Went Wrong? If you have a chance, read it; it's superb investigative reporting. There's a link to the article in our show notes. We wanted to dive a little deeper: How are crimes being reported? What role are technology companies playing? And how is the government responding? Here's Michael Keller.

Speaker 2

This is a crime that thrives in the shadows, and people needed to hear what was actually going on.

Speaker 1

Reporter Gabriel Dance.

Speaker 3

One of the biggest problems reporting on this is nobody wants to hear about the problem because of how awful it is.

Speaker 1

And to be honest, we were nervous. We know from season one of Betrayal that our audience is genuinely interested in letting the light in on dark stories. One of Michael and Gabriel's most important revelations was that our legislators don't really want to hear about it. State lawmakers, judges, and members of Congress have avoided attending meetings and hearings when it was on the agenda. They just aren't showing up.

Speaker 2

One of the big things was the failure of the federal government to live up to the promises it made around two thousand and eight to develop a strong national response. The government had not really followed through on its grand plans. The high-level position at DOJ was never fully created. The strategy reports that were supposed to come out on a regular basis, there have only been two of them over the last decade. You know, reports of this abuse have risen, but federal funding to these special task forces has largely remained flat.

Speaker 3

I mean, there are so many of these offenses going on, there are so many reports, there are seemingly not enough police in the United States to solve this problem.

Speaker 1

ICAC, or the Internet Crimes Against Children Task Force, is working on the front lines every day. There's at least one ICAC in every state. Hearing what they go through daily is truly harrowing.

Speaker 3

What I will say is, speaking with members of these ICAC task forces, I was always in such admiration and awe of their work dealing with this kind of content and this kind of horrible crime, and really the survivors, and how hurt some of them are, sometimes for the rest of their lives. We spoke with an ICAC guy in Kansas who had served in the Iraq War, and he said he would almost rather go back and serve another tour than continue in his position dealing with these types of crimes.

Speaker 2

He had said that he worked in ICAC, and then to take a break, he did a tour in Iraq, and then came back and felt like, all right, now I can go back and keep doing this work.

Speaker 1

I'm in awe of the law enforcement officers. They choose this work because they want to save children, but it really is akin to war in an emotional sense.

Speaker 3

Some people, like, viscerally cannot deal with this issue, because it is truly one of the most awful crimes that we commit against one another. And the descriptions: Michael and I probably read hundreds and hundreds of search warrants and legal documents that would describe videos and photos and the acts in them.

Speaker 1

One strategy ICAC uses to write reports is to turn the video off when documenting the audio, and turn the sound off when documenting the video, because it's too much to handle. At the same time, these ICAC Task Force members can only do so much with what they are given. They triage the cases, often prioritizing the youngest victims, but they can only investigate about a third of all the tips because the caseload is so overwhelming. Of course, predators are the biggest problem and bear the most responsibility, but we need to acknowledge there's another culpable participant when it comes to the explosion of CSAM: technology companies. Before the Internet, the US Postal Service was the leading reporter of CSAM and was stopping the dissemination of material via the mail. However, with millions of images plaguing the Internet, is it time we started holding technology companies responsible for their lack of action?

Speaker 3

What I do think they're certainly responsible for is allowing this problem to get very serious before they started to take responsibility for their role in it. As early as two thousand, tech companies knew this was a very serious problem and were doing nothing to solve it. So I would say that tech companies are certainly responsible for allowing the problem to spiral out of control in the early part of this century, and I'm encouraged that, from what we've seen, several of them have begun to take the problem much more seriously.

Speaker 1

The technology exists to root out criminal behavior. Why aren't tech companies deploying it?

Speaker 2

Microsoft, along with Professor Hany Farid, came up with a technology called PhotoDNA. This takes a database of image fingerprints, and whenever a photograph gets uploaded to an Internet platform, that company can scan it to see if it's in the database of verified illegal imagery. And so that's the main tool that tech companies use, which is great because it's largely automated and easy to use. It's been around for a long time, so a company like Facebook or others that are doing automated scanning can generate a large number of reports just through this software. They also generally have a team of human moderators that review it, and that serves an important role of verifying what was found and also escalating it if there's evidence of actual hands-on abuse.

Speaker 3

Of a child.
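
For readers curious about the mechanics, here is a minimal sketch in Python of the fingerprint-matching idea Michael describes. This is not PhotoDNA itself, which is a proprietary perceptual hash licensed by Microsoft; a SHA-256 digest stands in for the fingerprint, and the database contents and function names are hypothetical.

import hashlib

# Hypothetical set of fingerprints of verified illegal imagery, e.g.
# hashes shared with platforms by a clearinghouse such as NCMEC.
KNOWN_FINGERPRINTS = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def fingerprint(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash. A real system uses a hash that
    # survives resizing and re-encoding; SHA-256 matches exact copies only.
    return hashlib.sha256(image_bytes).hexdigest()

def scan_upload(image_bytes: bytes) -> bool:
    # True means the upload matched a known fingerprint and should be
    # queued for human review and reporting, as described above.
    return fingerprint(image_bytes) in KNOWN_FINGERPRINTS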

Speaker 2

If you talk with most technology policy people, one perspective that you hear a lot is that technology companies don't have that much pressure to get rid of harmful content on their platforms because they don't face any legal liability for it. You know, technology companies, of course, would say, we have every reason to get rid of this harmful content; we don't want to be a place for exploitation. The legislative solutions that have been proposed so far try to go after Section two thirty of the Communications Decency Act, which shields technology companies from any liability for content that users post. There have been a few proposals to try and change that, both from Democrats and Republicans. It's been one of the few areas of bipartisan support. Those proposals have not gone through, but over the last few years you do see people trying to find ways to increase the incentives for tech companies to clamp down on this more.

Speaker 1

Let's take Facebook's parent company, Meta, as an example. Meta is the leading reporter of child sexual abuse material to the National Center for Missing and Exploited Children. Almost all of the illegal content gets transmitted through their Messenger app. That isn't necessarily because it has the most CSAM; it's because they're using PhotoDNA and finding offenders. Currently, Messenger does not end-to-end encrypt its messages. However, Meta has announced that this year it will make end-to-end encryption the default. Meta executives have admitted that encryption will decrease its ability to report CSAM, saying, if it's content we cannot see, then it's content we cannot report. The Virtual Global Task Force, a consortium of fifteen law enforcement agencies, is practically begging Meta not to do it. Meta CEO Mark Zuckerberg stated encryption is a powerful tool for privacy, but that includes the privacy of people doing bad things. When billions of people use a service to connect, some of them are going to misuse it for truly terrible things like child exploitation, terrorism, and extortion.

Speaker 3

The more communications are encrypted, the less capable tech companies are of using these automated scanning tools to find and report CSAM. That's a much broader conversation that should be had, and oftentimes it gets shorthanded to everything should be encrypted or nothing should be encrypted. The encryption conversation is often complicated by this particular issue, held out as a wedge issue by both sides, both by law enforcement and by tech companies and people who believe that all communications should be encrypted. I think there can be more nuance to that conversation, particularly when you come to platforms and social media networks where adults can engage with children. Just by definition, children are at such a disadvantage. Something that's important to note as well is that many of these social networks also give predators an opportunity to engage with children in a way that was never before possible. You have documented cases of grown men going on Facebook pretending to be children and then sexually extorting other children into sending images of themselves, after which they continue to force them to produce more and more imagery.

Speaker 1

Gabe is referring to what is commonly known as sextortion: tricking a young person into sending an image and then essentially blackmailing the child into sending more with threats of exposure or harm. The encryption debate won't be solved anytime soon, but it's clear that protecting children from abuse is not enough of a reason to compel for-profit tech companies to consider changing their approach. Social media websites and messaging platforms are ground zero for the production and sharing of CSAM. Through the dark web and encrypted groups, appalling communities have developed. Take the site Welcome to Video. This darknet site, hosted in South Korea, amassed more than two hundred and fifty thousand child exploitation videos in only two years. Welcome to Video created a community of users who bought and traded appalling content. Videos were sold for bitcoin. According to an April twenty twenty two article in Wired magazine, the site's upload page instructed, do not upload adult porn, the last two words highlighted in red for emphasis. The page also warned that uploaded videos would be checked for uniqueness, meaning only new material would be accepted.

Speaker 2

In a lot of online groups, these images are like a currency. In order to gain access to people's collections, it's required that you produce new, never-before-seen images. So you also have that dynamic where people who want to get images are pushed into abusing children, documenting that abuse, and sharing it online.

Speaker 1

Welcome to Video was brought down by a joint effort between the FBI and the South Korean government. It was the result of dogged detective work and internet sleuthing, and while it was hosted in South Korea, many of its users were United States citizens. There are so many people who don't realize just how big this problem is and how close to home it actually hits. So with all of this information, what can we do to make the public more aware of this problem?

Speaker 3

What I came away with as the clearest call to action from our reporting is spreading awareness: educating parents and encouraging them to educate their children. This is not necessarily a problem that tech companies can solve, and they certainly don't seem determined to solve it.

Speaker 2

We spoke with a few online child safety experts who had a few pieces of advice. One brought up the idea that the industry is not in the business of promoting safety, and she said that she would love to see, whenever she buys a cell phone, a pamphlet that comes along with it that says how to keep your children safe with this device. The key thing is to not keep abuse secret. The less we talk about this, the more the offenders have an advantage. They thrive on the feelings of guilt and blame that a child may have if they were tricked into sending a nude photograph. That shame is really what gives them more power.

Speaker 1

If you or someone you know has been a victim of sextortion, you can get help. Email the National Center for Missing and Exploited Children or call one eight hundred THE LOST. Many thanks to Michael Keller and Gabriel Dance from The New York Times; see our show notes for a link to their article, The Internet Is Overrun With Images of Child Sexual Abuse. What Went Wrong? Since we spoke with Michael and Gabriel, Meta has been caught up in controversy again. A recent investigation by The Wall Street Journal and researchers at Stanford University and the University of Massachusetts Amherst found that Instagram was helping to link predators and people selling child sexual abuse material. Its algorithm connected accounts offering to sell illicit sex material with people seeking it.

According to The Wall Street Journal, Instagram allowed users to search for terms that its own algorithms know may be associated with illegal material, and it's not like they were hiding it. Instagram enabled people to search hashtags like pedowhore and preteen sex, then connected them to accounts advertising CSAM for sale. If that wasn't troubling enough, a pop-up screen warned users these results may contain images of child sexual abuse, and then offered users options. One of them was see results anyway. Meta has set up an internal task force to address the problem. If you would like to reach out to the Betrayal team, email us at Betrayal Pod at gmail dot com. That's Betrayal Pod at gmail dot com. To report a case of child sexual exploitation, call the National Center for Missing and Exploited Children's CyberTipline at one eight hundred THE LOST. If you or someone you know is worried about their sexual thoughts and feelings towards children, reach out to Stop It Now dot org. In the United Kingdom, go to Stop It Now dot org dot UK. These organizations can help.

We're grateful for your support, and one way to show support is by subscribing to our show on Apple Podcasts. And don't forget to rate and review Betrayal; five-star reviews go a long way. A big thank you to all of our listeners. Betrayal is a production of Glass Podcasts, a division of Glass Entertainment Group, in partnership with iHeart Podcasts.

The show was executive produced by Nancy Glass and Jennifer Fason, hosted and produced by me, Andrea Gunning, written and produced by Kerry Hartman, also produced by Ben Fetterman, associate producer Kristin Melcurrie. Our iHeart team is Ali Perry and Jessica Krincheck. Special thanks to our talent Ashley Litton and production assistant Tessa Shields. Audio editing and mixing by Matt Alvecchio. Betrayal's theme composed by Oliver Bains. Music library provided by My Music. For more podcasts from iHeart, visit the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.
