
For Police, Social Media Is Now Part of the Job

Jun 25, 2019 · 19 min

Episode description

When police officer David Gomez was first stationed at a school in rural Idaho, he thought he’d spend his time breaking up fights in bathrooms and scanning the hallways for weed. Instead, he found that almost every problem was either happening on social media or started there. This week on Decrypted, reporter Shelly Banjo explores how age-old dangers like drugs, child predators and school shooters have shifted onto new platforms, and how one school has tried to adapt.

See omnystudio.com/listener for privacy information.

Transcript

Speaker 1

In a small high school in a rural part of Idaho, I'm sitting in a cramped office with a police officer called David Gomez. We're looking at his phone, scrolling through a feed of short videos on the popular app TikTok. This is something David does often. It's part of his job as a school resource officer and one of the ways he keeps tabs on the kids here. By monitoring what students are posting, David can keep an eye out for drug dealers and predators, anyone who could pose a risk.

Right now, we're watching a video of a girl in jeans and a sweater singing along to a song, and to me, it looks pretty normal. So this is a good example of what I show parents, a completely innocent video, right? But guys will copy this and they'll say, hey, look at this. She should wear this kind of bra, she should do this. She's got seventy five thousand... that's right,

seventy five thousand likes. When David started doing this work about seven years ago, he thought it would involve breaking up fights and keeping guns and drugs out of school, but he quickly learned that kids' social lives had moved onto platforms like TikTok, Snapchat, Instagram, and YouTube, and that came with a whole new set of dangers. So these are people she doesn't know at all. They're telling her she's [inaudible]. They're telling her [inaudible].

This is a story about trying to keep up with a generation that spends more and more of its waking hours in a virtual reality, and learning how to protect them from some of its very real world repercussions. I'm Pia Gadkari and I'm Shelly Banjo, and you're listening to Decrypted. Officer David Gomez is forty five and he's a tall, laid back guy. School officials

say he's a good cop and he's incredibly friendly. Hi, it's so nice to meet you. Thank you. He got assigned as a school resource officer because he was good at dealing with kids. Over the years, he's worked at a number of schools. I went there to meet him in his small rural community an hour outside of Boise, Idaho. Before becoming a police officer, David Gomez spent more than a decade working in tech as a contractor for chipmaker Micron.

About ten years ago, I saw an ad in the paper for a law enforcement opening, and I took the job there. My first three years there were on patrol, just street cop like everybody else. And then I became the school resource officer at a middle school. That's where my education

in social media began, he says. At that time, his main experience with social media was an outdated MySpace page, but he had to quickly get up to speed on a bunch of hot apps once he realized just how much time the kids at school spent on their phones. Every problem had something to do with social media, whether it was bullying, drug dealing, doping, suicidal kids, everything was on social media. Early on in his new job, he

decided to run a little experiment. He created a Facebook account with a fake name and photo, pretending to be a fellow student at the middle school he worked at. And when I added some of the popular kids, they all accepted me. Within the hour, they were accepting me, and once I had all the popular kids as friends, because I had friends in common with them, everybody else started adding me. I didn't even have to add people. After that, that first Facebook account opened a

whole new world for David. Now he could see who the kids in his school were talking to and what they were saying to each other. Pretty soon he identified

a threat. We met a predator who had advertised himself as a twenty six year old Marine sergeant. He came to meet us; it was actually a homeless guy. And so to see what kind of a problem we had, we made a bunch of fake accounts, and we realized that the problem was huge, of sexual predators right here locally. All that summer, David helped start

what the police force called a Predator Team. The idea was to understand how adults were targeting kids and to identify this kind of behavior online before it got dangerous for students at the school. It became clear that part of that job would be explaining to parents and students what some of the risks were on these platforms, and that's how he found himself asking parents to tell their kids not to send naked pictures of themselves over the internet.

David says people would be shocked if they knew how often and how easily kids send naked photos of themselves. He says that because they see an unending stream of sexualized photos and music flow through platforms like Snapchat and TikTok, it doesn't even feel weird for young kids anymore when

friends or even strangers ask for them. At the high school level, in any big high school, by eleventh grade, I would say the girls have sent out their naked picture. By eighth grade, meaning in the middle school, we're at about [inaudible], and weekly we're taking reports of eight year olds, nine year olds, ten year olds sending out naked pictures just because they've been groomed to think that

it's okay, it's not that big a deal. While I was in Idaho, I sat down with several students at the schools where David worked, and all of the girls said they had been asked for nude photos. It was like asking if they had heard of Kim Kardashian: of course they had. Here's one high school junior, Alexa Porter. Have you ever been asked for a nude photo? Uh, yeah. Have you ever sent any? No. And do you feel any pressure or anxiety around the kinds of things that

you post or how frequently you post? Yeah, I do. So I kind of monitor my posting range to who I'm catering to, which sounds really bad, but yeah, that's what I do. But not every student is as careful. Some boys will make a fake Snapchat account and they will contact girls on Snapchat and they'll say, hey, I have a naked picture of you, send a new one to me personally, or else. They call it blasting.

I'll blast your naked picture all over the place, meaning I'll put it on Instagram, I'll send it to your family, I'll send it to everybody. Now, so many girls have sent out their naked picture that the chance that there really is a naked picture is plausible. Concerns about what kids are exposed to on the Internet have been around since the early days of AOL chat rooms, but today most US teens use smartphones and say they're online almost constantly, according

to the Pew Research Center. Meanwhile, the social media industry is booming. TikTok has been downloaded more than a billion times globally. Instagram's parent company Facebook is now valued at half a trillion dollars. YouTube and Snap, the company that owns Snapchat, continue to pull in a growing share of ad dollars each year. In some ways these platforms are great because they help bring people closer together, but

in other ways they seem pretty harmful. Besides the potential for abuse, there's also just the hours of screen time and amped up social pressure. Studies have linked social media with anxiety, depression, and even suicide in teens. These problems exist across social media, but three platforms kept coming up during our interviews with David and other law enforcement officials, as well as parents, kids, and regulators. One is Snap, where parents say that disappearing messages make it harder to

keep tabs on their kids. Another is YouTube, which has come under fire for its recommendation system. It can surface unsavory content and also make it easy for pedophiles to watch videos of young children. And the third one is newcomer TikTok, which David says can be risky precisely because

it seems so innocent. So TikTok snuck up on people, and when kids would show their parents the fun video making things, that you could lip sync and you could make funny videos and whatever, parents thought because it's a kids' app, it must be safe for kids, and that kind of bypassed some of the regular questions that parents should ask of applications, like, hey, does it put your GPS on there? Can you private message? Who else is watching these videos? Snap, TikTok, and YouTube all said they

take children's safety very seriously. They emphasize that their platforms aren't designed for kids younger than thirteen, and when they find underage users, they close those accounts. YouTube said in the first quarter it took down eight hundred thousand videos over violations of its child safety policy. Snap told us that its model of private messaging serves to protect kids because it's harder for harmful content to go viral. But social media has completely changed the nature of David's job

in a lot of ways. For example, drug sales work completely differently now. When I visited him in Idaho, he pulled out his phone and opened his fake Snapchat account to demonstrate clicking through stories from local drug dealers showcasing vape cartridges and marijuana for sale. David said some of these accounts are from past students. Others are people who accepted his friend requests even though they don't know him.

So this was posted in the last twenty four hours, this guy. So they're open for business, which means they'll deliver. So here, these are going to be vape cartridges, and they're going to be marijuana cartridges, because nobody pays thirty dollars for one cartridge. They're usually like five dollars, so a thirty dollar one is going to have a full load. Snap said accounts used to sell drugs are an intentional abuse of their terms of service,

and that the company removes these accounts when they're reported. Meanwhile, David says he uses these Snapchat accounts to arrest drug dealers. Once those people are close to campus with cocaine or whatever it is, he can bust them. But there's something else, something even more serious, that's always on his mind: What if someone wants to hurt the kids in his school? Often social media provides the first clues about a school shooter or another kind of attack, and it's these cases

where law enforcement will take extraordinary measures. After spending some time with Officer David Gomez, I realized that drug busts and tracking down predators are stuff he does all the time, but there are other, less frequent incidents that he's always on the alert for, including school shooter threats. David said that not long ago, a parent reached out to him after seeing messages on Snapchat that suggested someone was planning

an attack. David used his Snapchat account to look at the messages, and then the next day he saw another message. This one included some specifics: Hey, I'm going to be at the school tomorrow at this location, and don't be there if you want to live. This is the kind of lead he has to investigate quickly to determine if the threat is real, so he reached out to Snapchat itself. He said, hey, look, I've got a threat to the school.

I need help. So the tech companies go through and say, hey, look, here's the IP address, here's the email address, or here's a phone number. So then I look and they tell me when the account was created too, so I can see if it just got created for this purpose or if it's a mature account. In some cases, this would be enough information to figure out which kid is making

the threat. This time, though, he needed more information, so David called the cable company, gave it the IP address he'd received from Snap, and the cable company matched it to a physical home address. And so once I find the address, we're going there. How long does it take the tech companies to get that information to you? I can probably have the address of where the person is in twenty minutes, as long as you have the right authority and you give

them the right paperwork, afterwards or before. Sometimes I just have to email them on a letterhead that says I will follow up with whatever paperwork is needed. David found the kid making the threats. Luckily, this time it was a hoax. In other situations, where he's found evidence an attack was being planned, he's had to make an arrest, but in this case, he turned matters over to the school,

who suspended the student for a week. Hearing David talk about this made me curious about exactly how this relationship between the police and the tech companies works. So a threat to the school, a suicidal person, a kidnapping, you know, some kind of domestic violence, where imminent danger means there's likely to be great bodily harm or death. So any situation where there's a possibility of imminent danger, which is great bodily harm or death, they will help us very quickly.

Snap says it received nearly seven thousand criminal legal requests for data from July to December of last year, including search warrants and wiretap orders. It complied with [inaudible] percent of them. A spokesman confirmed that in cases where police officers quote swear there is an imminent threat to life or safety, Snap can disclose certain user information right away. Officers are required to follow up with the correct legal process, like a subpoena, after the threat has subsided.

And the same goes for Facebook, which said it complied with [inaudible] of the one hundred and ten thousand data requests from law enforcement during that same period. Across the board, the law enforcement agents and regulators we talked to said the tech companies are mostly responsive and accessible, but some of these tactics also highlight an uncomfortable trade off between safety and privacy. It raises the question of how much surveillance is too much, especially when it comes

to children. Take TikTok, for example. I learned in the course of my reporting that TikTok now uses facial recognition to predict the ages of users and to identify when someone might be underage. That's a protection measure, but it also means it's scanning kids' faces. That in itself

can seem a little creepy to some people. But some experts think that developing new technology will have to be part of the solution, like Julie Inman Grant, who worked at Microsoft and Twitter for two decades and now heads up a commission on internet safety for the Australian government. For a long time, we have been seeing incremental innovations

and changes. I think we're starting to see the deployment of more technology tools like artificial intelligence and machine learning, which have a great ability to help detect and remove abusive and harmful material before it's posted. But I think we need more of that, and we need real leadership to change the culture around how technology and services are developed. Safety, children's rights, and ethics need to be at the core of technology development rather

than just an afterthought. But a lot of children's advocates think tech companies need to do more to keep young kids off the sites in the first place. Earlier this year, TikTok had to pay a record fine to the US government for illegally collecting data of users under age thirteen. Even for all minors using these platforms, Julie emphasized that once tech companies have sensitive information about their youngest users, they have to make sure the data is used appropriately

and is kept completely confidential. But you also need to make sure that if you're collecting this kind of really sensitive personal information, that you're not creating security and privacy issues by creating honeypots of sensitive information that can be exploited in other ways. A TikTok spokesperson told us that promoting a safe and positive environment is a top priority and that this is a complex industry wide challenge. TikTok said it deploys a combination of

technology and moderation to address misuse of its platform. Back in Idaho, Officer David Gomez has started posting tutorials on his own Facebook page, trying to clue parents in to what's going on with their kids' social accounts. He's got a following of around ten thousand people, including parents and educators across the country. He says most parents, even the ones who are really engaged, will never know everything

their kids are getting up to online. There's one thing they can control, though: when they give their child a phone. So I always tell parents the minimum age is thirteen, plus the understanding that the kids are going to look at pornography, not child porn, but just regular pornography. But what we're seeing is about third grade now, so ages seven and eight, many, many kids are already getting their

own smartphones. We're taking bullying reports from third graders already because they're taking pictures, putting them online, posting them on Instagram accounts. Some of my seventh and eighth grade girls have Tinder accounts. Other tips include abiding by app age limits, not allowing kids to keep their phones in their rooms at night, disabling messages and public posting for young children, and making sure kids only accept friend requests from people they know in real life. It can feel

like a never ending task. As one parent I talked to, J. C. Holman, explained: I mean, it is truly a lot of work to try to stay on top of it. And I don't think it's possible to stay ahead of everything your kids are doing unless you are extremely tech savvy and devote a significant amount of your life to it. And it continues to get harder now that tech companies use artificial intelligence to create highly individualized social feeds, and features like disappearing messages and vaults that hide

certain posts. It means that what a parent sees on the app and what their kids see are often totally different. Ultimately, there's really no true way to know exactly what they're doing. It makes you feel a little helpless, because they all have phones, and the only way to do it is to not allow them to have one, and in this day and age, that would be like cutting off the air they breathe, I mean, in their minds.

So it's really a mixed feeling, because it gives me access to her, but on the flip side, it gave everybody else that same access. That's it for this week's episode of Decrypted. Thanks for listening. We always like to know what you think of the show. You can write to us at decrypted at bloomberg dot net. I'm on Twitter at S Banjo and I'm at Pia Gadkari. And please help us spread the word about our show by leaving us a rating or review in

your favorite podcast app. This episode was produced by Pia Gadkari and Lindsay Cradwell. Our story editor was Anne VanderMey. Thank you also to Aki Ito, Emily Busso, and Brad Stone. Francesca Levy is head of Bloomberg Podcasts. This episode marks the end of our spring season, but we'll be back in September with a full slate of new stories. Have a great summer.
