
TikTok Part 4: Are The Kids All Right?

May 06, 2021 · 31 min · Season 2 · Ep. 4

Episode description

While TikTok is known primarily as a breeding ground for funny memes, dance routines and lip-synching videos, there’s a dark side to TikTok that starts with the young children who populate the app. In this episode, reporter Shelly Banjo looks into the consequences of TikTok’s failure to safeguard its adolescent user base. Warning: This episode contains difficult and disturbing reporting, including stories about the sexual exploitation of minors.


Transcript

Speaker 1

The thing that's funny about TikTok is that it even makes a twenty-two-year-old feel old. Here's Amiliano Via. When I first downloaded the app, I posted a video of me like dancing in the club, like to some music. Um, and then I get a comment like five minutes after I posted. In the comment it's like, yes, like teach me your skills or something. And I go to the page and it's like a little girl, like she looks like eleven years old. Um, yeah. And I was like, okay,

so this is really like a kid's app. Like, people aren't lying when they say that. And looking at an eleven-year-old dancing in her bedroom on TikTok felt wrong. I mean, I didn't feel creepy, it felt creepy, like looking at their stuff, because I feel like I'm not the audience for it, and it's a little weird that it's so public. Amiliano isn't the only young person to tell me that he felt protective of kids on the app. Yeah, I would not recommend, like, a young child on the

app unless they have, like, parent supervision. That's Katie Pheeney. We heard from her in a previous episode. She's eighteen years old and spends about four hours a day on TikTok. But her message to the kids: play outside more. I feel like when I was ten years old or nine years old, I was playing outside, and I feel like now, like, these kids are cursing and they're doing super, like, sexy dances, and I just, I can't watch it. I'm like, oh no, please don't do that. Go ride

a bike. And there are some corners of the app that a number of teens say are worth worrying about. Here's nineteen-year-old Mila Dilatore. There was a lot of weird stuff about, like, eating disorders. There was a bunch of really stupid, like, disgusting trends on TikTok of, like, really underage kids. It's just, like, it's such a scary thing, I think, for kids, like fourteen-year-olds, to be on TikTok, because there's such disgusting stuff on there. So it's not just a bunch of old

people that are concerned about TikTok. Even the kids are asking, are the kids all right? Because for all the funny memes, dance routines and silly lip-syncing videos, there's a dark side to TikTok, and it starts with the young children that have been on the app from its very start. You're listening to Foundering. I'm your host, Shelly Banjo. A word of warning: this episode contains some very sad and disturbing information, including stories about the sexual exploitation

of minors. And I did want to mention that I pressed TikTok at least five times to interview someone in charge of child safety. The company declined to make anyone available, but they did give us a number of comments, which we'll include throughout this episode, starting with this statement: TikTok is deeply committed to the safety of minors. So up until now, we've been telling you the story of TikTok's explosive growth and how it took Silicon Valley and Hollywood

by surprise. TikTok was able to achieve this success by taking a number of shortcuts. One of them was letting kids, even kids under thirteen, roam free on their app. That led to a firestorm among parents, police departments, and lawmakers that hurt TikTok's reputation just as it was hitting it big in the US. We'll tell you more after a quick break. TikTok's problems with young children actually began years before TikTok even existed. It goes all the way back

to when TikTok was still called Musical.ly. When the Musical.ly app first started getting popular, investors called it the world's youngest social network, because its audience included kids in elementary school. At the time, Disney songs were popular in the app's music library, and unlike Instagram and YouTube, Musical.ly didn't prompt users to disclose their ages. Here's Josh Constine speaking with Musical.ly's founder, Alex Zhu, at a tech conference. Josh was

a journalist at TechCrunch. So I'm a little bit terrified, because some of the kids that I see on this app seem extremely young. I mean, it's not just under thirteen, or not just thirteen-year-olds, which are actually allowed on most services like Facebook and Instagram, but kids that look like they're eleven, nine, seven years old. And these are kids with millions of followers, and some

of them are acting quite provocatively. Most social media companies draw the line at thirteen years old to avoid running afoul of a federal law called COPPA. That's the Children's Online Privacy Protection Act. COPPA limits how much information apps can collect from kids under thirteen. It's one of the only federal laws in the US that protects kids online. It's meant to keep them away from sexual predators and others who might exploit them. Does Musical.ly abide by

the Children's Online Privacy Protection Rule? Absolutely. How do you guys abide by that? Because you don't, you don't ask users for their age, right? So that's actually, uh, compliant with the regulation. That's not, that's feigning ignorance, though, right? Alex is on stage admitting he intentionally didn't ask kids to enter their ages before signing up, and he defended that decision. Right. But we, we could do something similar to Snapchat, right, collecting the age, the year of birth.

But those kids will anyway say they're over thirteen, right? So that doesn't really change the conversation for us. The most important thing is we have to make sure the environment is safe. We put a lot of resources and technology in place to make sure everything is safe. And going back to your previous point, a lot of users, especially top users, are under thirteen, but they are being monitored by parents. So we do have WeChat communication with their parents on a daily basis. So Alex

had many parents download WeChat. It's a Chinese messaging app, kind of similar to WhatsApp or Facebook Messenger. That way, Alex and his team in Shanghai could keep in touch with the parents by essentially texting back and forth. But as the app hit more than a hundred million registered users in the US, people started asking how they could possibly get permission from all of these parents. But I

still don't understand, though. If these kids are under thirteen and you're providing their, you know, contact information, photographs, real names, user names, all things that are prohibited by the Children's Online Privacy Protection Act, I don't understand how that's not a violation. But the parents give the permission. It is being monitored by the parents. That's the point. But it can't be that for all these kids. I mean, so many parents don't even know what this app is.

But how do, you're telling me that every kid that's under thirteen on your platform has their parents' permission? I can't guarantee, but as far as I know, it is being monitored by the parents. I understand that you guys have, have feigned that you don't know the ages of users before, or that their parents are always involved. Honestly, looking at this content, I think that this is something that you guys should really be looking deeper into. I was surprised by how hard Josh Constine

is questioning Alex here. Tech conferences usually showcase a founder's success, not this. It's also just strange to hear Alex caught so completely on his back foot. From the get-go, Alex seemed to know just how he wanted his app to look and feel. He had a vision for how to handle creators and how to work with the music industry. So why was he so careless about the young users on his platform? The way he handled this problem seems sloppy,

and this would come back to bite him. I remember the founder of Musical.ly had actually been interviewed at an industry conference, where he was asked in detail about their COPPA compliance, and he said, oh yeah, we have a lot of kids under thirteen. But then he said, no, but we don't have a problem, their parents know. And he was really given a hard time, and it was really, I mean, it just was pretty blatant.

So most companies don't do things like that. That's Angela Campbell, an attorney who has spent decades working on child privacy issues. She was among a group of lawyers who filed a complaint against TikTok to the Federal Trade Commission. So in twenty nineteen, about a year after ByteDance bought Musical.ly and folded it into TikTok, it came out that TikTok had become the subject of a federal investigation. I mean, there were just so many little kids on the app

back then. It was unbelievable. Um, such a clear COPPA violation. They had lots of kids' songs; they had, you know, like, Baby Shark was one of the things you could lip sync to. So even though they said they were only intended for over thirteen, it was clear that that wasn't the case. They were directed at children, and also they knew that they had kids on there. The COPPA violations were apparently so clear that the

federal government stepped in. In February twenty nineteen, the FTC alleged the app was collecting personal information from kids without their parents' consent. The US government slapped TikTok's Chinese parent company, ByteDance, with a five point seven million dollar fine. At the time, it was the largest ever child privacy settlement in US history. TikTok's unchecked growth appeared to be

at risk. They agreed to make a number of changes, including deleting the profiles of kids under thirteen, and they put out a statement: Today the FTC announced a settlement agreement related to Musical.ly. It's our priority to create a safe and welcoming experience for all of our users. I

remember thinking that the statement was so backward-looking. It focused on Musical.ly, which at that point had been defunct for almost a year, and it seemed to lay the blame squarely on Musical.ly's founder, Alex Zhu. The company was framing the FTC fine as a problem with its previous self, an ugly issue TikTok left behind when it stopped being Musical.ly. But things didn't work out that way. We'll be right back. Dawn Hawkins runs a group called the National Center on

Sexual Exploitation. She says few things surprise her anymore when it comes to kids on social media. Until one of our family members got on TikTok. He's eight years old. His second-grade teacher, um, suggested that all of the students get on TikTok as a way to kind of bond and stay close. They could, they could make funny videos together, et cetera. Um, obviously I see a lot wrong with that. Kids are not supposed to be on these

platforms until they're thirteen or older. Um, an eight-year-old especially, there's a lot of problematic content on there, um, for their little brains to handle. Dawn says the eight-year-old's parents agreed to let their son on TikTok because they thought TikTok was safe and because his teacher thought it would be a fun thing to try. I found out he was on it and took a look at his account, and I found that thirty-two men had written him private messages, and some of

the messages contained links to disturbing images. But Dawn says they were too subtle for an eight-year-old to understand how and why they were wrong. So often it was just like, hey, you're so cute, how are you? Will you send me these videos? Some of them sent images of their, um, of their penises and asked him for pictures of his. A couple of them did say it's a challenge. Others just said, hey, you should do this, and wear this, and, and he just did it. Yeah,

he just did it. And then things got even more disturbing. Some of the men asked him, I think that they were men, asked him to create some videos in his underwear and gave him exact videos to copy and dance moves to copy, which he did, and um. And we then found that some of them had recorded his videos and then reposted them. And this was perhaps the

worst part. The videos were being shared widely, and so his videos had, you know, like, under twenty views, and um, but the videos that they reposted of him had tens of thousands. And um, and you know, he's in his

underwear, dancing in a very sexually, like, suggestive way, um, not realizing what it means. Dawn broke the news to her young relative about what happened with his videos. Like, his whole body language changed, and he said he's so embarrassed, and he didn't know that it could be sent out there. He posted it on his own channel; how, how is it possible that it could be on someone else's? He didn't give permission. And the rest of that day he kept coming to me asking, how could they copy it?

Why would they do that to him? That was just for him and his friends. His little eight-year-old self is so traumatized that it's, it's out there, and what does that mean? Are his friends at school going to see it? And, and so forth. I spoke to Dawn about nine months after the videos went up. Her family was still trying to get them removed. He's working with some police now. I think the police are asking for those videos to come down. And did he

delete TikTok, or is he still on it? You know what, these kids love it, and it is so fun, right? So once he's experienced it, it's really hard to take it away. So, um, it isn't fully deleted, but he's only allowed to be on it when a parent or an adult is sitting with him. It's pretty telling that after something like this happened, this eight-year-old is still on TikTok. But that just speaks to how

addictive the app is. TikTok started asking users for their date of birth in twenty nineteen, and a spokesperson told me they'll delete the accounts of kids that they find are lying about their age. But Bloomberg has reported that the FTC and the Department of Justice are looking into allegations that TikTok hasn't followed through with these promises. When the story came out, TikTok gave us a statement: We take the issue of safety seriously for all our users, and

we continue to further strengthen our safeguards. TikTok has become so ubiquitous now among teens that what happens on TikTok leaks into life offline. I spoke to a school police officer about this. I see it on TikTok, I see it in real life, and that's all across the world that that's happening. All kinds of challenges that go on that are awful. The duets are awful, right? Some of

the duets, you know, somebody has half a screen, the other person has half a screen, and the duets just make for some awful, uh, sexualization of young kids, and, and they're recording it for TikTok. That's Officer David Gomez. He's a school cop in Idaho. When he started nearly a decade ago, he thought he'd be spending most of his time breaking up fights, stopping school shootings, or keeping

drugs out of bathrooms. But on the job, he noticed that kids' social lives had moved beyond the hallways and parking lots, onto their phones and onto platforms like TikTok. I'll ask kids at my own school, hey, how many people do you think who are juniors have sent out their nude photos? And you know, the answer I get more and more is a hundred percent. I think everybody has sent out their nude photo by eleventh grade, right? And by eighth grade it's about fifty. And that is a question

I absolutely get a lot: hey, everybody's doing it, why is it a big deal that I do it? While it seems that sending a sexy photo is kind of a rite of passage for teenagers these days, it's a big deal to Officer Gomez, because that naked photo could be used as blackmail, as a tool for predators to extort young children. He told me about a recent

case he handled with one of his students. One of them, again, that started on TikTok, was a twenty-four-year-old in Pennsylvania who was collecting girls to send him weekly naked pictures of themselves. One of the girls was a thirteen-year-old student at his school. Once he got their underwear picture, he'd say, hey, look, unless you want all your friends, your school, your grandparents to see this, you're gonna send me a video every week by this time, or else I'm gonna blast this

to everybody. Officer Gomez says she sent the man in Pennsylvania photos of herself because she felt she had no choice and was too scared to ask for help. Gomez has seen this a lot. They feel ashamed, they feel embarrassed, you know, they feel out of control, they feel brainwashed. All those things are the same things that adult predators are using on young girls, who are so much easier to control than an adult woman would be. Officer Gomez says he was able to get Pennsylvania State troopers to

go to the guy's house and warn him to stop. Luckily, in this case, the guy stopped contacting the thirteen-year-old, but Gomez warns that her naked photos are still out there, and other predators could find them one day and try to extort her again. And do you think TikTok in particular has become a bigger problem in recent years? Absolutely, right. It has become the biggest problem, above Snapchat. Now this was news to me. When I first met Officer Gomez, I was working on a

magazine story about TikTok. Back then, he told me that Snapchat was his main concern because of its disappearing messages. But in the past two years, from Officer Gomez's perspective, TikTok has overtaken Snapchat as the most dangerous app for kids. Oh really, why, why above Snapchat? Because parents think that TikTok has some redeeming values, with them making videos, and I would agree that it does have some redeeming values. Yeah, videos, lip syncing, singing, dancing around, outside activity, okay,

I see that stuff. But parents are just not understanding how many predators are on TikTok. Parents underestimate TikTok because they don't see what their children see. Alex Zhu explained this back at that conference: people will have a very personalized experience, see very, very different content based on the age, based on the preference. So the parents' experience will be completely different from the kid's

experience. Personalization means that a parent can't exactly monitor their child's feed, and it means that they're not clued in to some of the big issues on TikTok. One of those problems is bullying. And it happens often, all the time, all the time, and it's definitely with, like, people who aren't conventionally pretty or even, like, just look slightly different than, like, the average beauty standard. That's Liz, an eighteen-year-old from Florida. She didn't want us

to use her last name. Liz says TikTok is worse than other social media apps she uses. They're always coming onto my live streams, just, like, calling me names, like, oh, you're so ugly, you're so fat, you cow, and I'm like, okay, cool, thanks, thank you, thank you for the input. Bullying happens on all social media sites, but it can be more damaging when we're talking about the really young users on TikTok.

And because TikTok is still so new compared to other companies, there are fewer people working on many of these safety issues. I spoke to a girl in the UK named Talia about this. She's thirteen and had a horrible experience on TikTok. I lost all my friends that I thought I'd have for a long time, and it was the worst time, because I felt alone a lot, so I felt, like, kind of trapped, if that makes sense. It all began when Talia

started getting a flurry of messages from her classmates. So I was doing online work and suddenly my phone got a message saying, did you do this to this girl? And I went, no, of course not. And then I got, um, six messages from the same person, it kept going, kept going, and I realized that it was her mom, because of the things that were getting said to me. And then Talia just came up to me and she said, something's happened on TikTok, and I was like, what do you mean? She, she starts to explain it, and I'm like, just give

me your phone. That's Talia's mom, Shelley Deal. She says someone online was pretending to be Talia. They set up an account with Talia's face and Talia's name with the sole purpose of trolling a popular girl at their middle school. It's what many TikTokers call a hate account. So this person pretending to be Talia was calling this other girl really mean things, calling her a pig, fat, and ugly, and all of Talia's friends were confused. Why was Talia

doing this? So then all of their classmates started to hit back at the real Talia, asking how she could have done such a thing. I had no proof it wasn't me. I can't really do much, because I thought I was really good friends with these people, so the people that were messaging me, they just seemed so angry at me for something that I didn't do. Um, they were just saying that she's jealous, that she's a bitch, she's a slag, just awful things. Talia's mom tried reaching out to TikTok.

When she couldn't reach anyone, she reported the issue to the school, who sent her to the police, but the police said that TikTok refused to turn over any information. I'd already reported it to TikTok and said someone is pretending to be my daughter, and didn't hear anything back. Um, but I was just shocked. And then, the more the TikToks were coming in, I just felt sick. I just

felt physically sick. This was happening to her and we just didn't know who it was and why, and we couldn't stop it, because I just thought, I could report it to TikTok, they'd take it down and that would be the end of it. But it was a long time before they took it down. She said the police officers told her this was a particularly bad case of online abuse, but they didn't have the resources to go after TikTok, even though cyberbullying is illegal in

the UK. On TikTok, you can't speak to anyone. They don't reply. There's no response. I've given them a crime reference number. Um, I said, it's serious, the police are dealing with it. You don't even get a response, they just ignore you. A few months after I spoke to Talia and her mom, TikTok finally did something about the incident, because Talia's story appeared in the Daily Mail, and then TikTok banned the hate account for violating its guidelines.

A TikTok spokesperson said, bullying and harassment are strictly against our community guidelines, and content of that sort is removed. But the hate account stayed up for eight months, and this incident changed Talia's life to the point where her mom decided Talia should move schools. Obviously, yeah, we've had times where she's just been sobbing and really down. You know, she's lost even her best friends that she's had since

primary school. They all turned against her. Um, and it was actually her that said, I could never go back to that school. I was surprised to find out that Talia stayed on TikTok. Her mom allows it because it's helped Talia make friends at her new school. Talia says that what happened still haunts her, but she's moving on. Yeah, I want to find out who it is. But if not, I want to leave it behind and move on with my friends

and just forget about what happened. Dawn Hawkins, the child safety advocate we heard from earlier, the one who had the eight-year-old relative, she thinks that problems like this shouldn't just be up to parents and the police. She wants TikTok to take a heavier hand in protecting its young users. Look, these tech companies know pretty much everything there is to know about us. They know us

better than we know ourselves. They can detect age lying. They probably already have all of that information on us already. Dawn is speculating here, but many tech experts I've spoken to agree with her that tech companies can do more to protect kids, but they don't because it would bite into ad revenue. TikTok has made some strides in improving the safety of its app. The company barred kids under sixteen from sending messages, and right after we contacted TikTok for

comment about this episode, in January, they stopped users from being able to duet or download videos posted by kids under sixteen, and they stopped strangers from being able to comment on those videos. They also introduced a big change, so that if you're under sixteen, your account is set to private by default. That means kids have to individually approve who gets to see their videos or deliberately change

their account to public. TikTok also released a statement. They wrote: These changes are aimed at driving higher default standards for user privacy and safety. But Dawn says it's clear that child safety still isn't TikTok's number one priority. She believes these problems emerged because TikTok became so successful so quickly. I do give TikTok the benefit of the doubt. I think probably it was a problem of scaling and growing.

I mean, their company exploded in growth across the world, um, and they just didn't, they didn't prioritize child safety as they were growing. At this point, child safety was turning into a disaster for TikTok and for ByteDance. The CEO, Zhang Yiming, just as he got his wish of taking TikTok worldwide and surpassing one billion registered users, suddenly all these problems started to emerge: child exploitation, cyberbullying. He gets slapped with the heftiest FTC fine of all

time for violating child privacy laws. The UK and India launched their own investigations. You can't exactly go out and celebrate when everyone knows this is going on on your platform. I spoke to a number of people who worked with Yiming. They told me that he has an obsession with the technology side of his business, but he never really wants to wade into PR problems or government relations. He's a practical person who often doesn't understand the

emotional side of his business. He's often at a loss when it comes to dealing with people. That includes both his colleagues and the users on his apps. People said that Yiming believed he could get past the issues presented by the children on TikTok by programming his way around them. For ByteDance, based on the success of the products they already had in China, they were pretty confident that they would be able to expand beyond this very

young demographic and make the app mainstream. That's a former tech banker in China and Silicon Valley. She's spent years following ByteDance, TikTok's parent company. She says that ByteDance may have felt confident about their ability to move past the FTC investigation because Yiming had been in hot water before. He had big run-ins with regulators in China, where laws about child internet

use are much stricter than in the US. So even before the US government was looking at TikTok for having underage users, ByteDance had actually gone through a whole series of, you know, investigations with the Chinese government. Um, that's because the platform was accused of having inappropriate content.

Um, one of its actually first apps, called Neihan Duanzi, was an app that featured jokes and memes, and it was permanently banned, um, from the app stores by the Chinese government back in April two thousand eighteen, so a year before the FTC fine. Apparently Yiming felt

shaken by the Chinese government's actions. I think in some ways you could call it existential, yeah, because, like, at that point, that was the harshest punishment that had been doled out, leading to Zhang Yiming actually having to come out with a very public apology. Yiming wrote a public letter that he posted on one of his social media accounts at four in the morning, apologizing to Chinese regulators, his users, and his employees. I asked my colleague to read a

version that was translated by the China Media Project. I've been filled with remorse and guilt, entirely unable to sleep. Our product took the wrong path, and I'm personally responsible for the punishments we've received. We placed excessive emphasis on the role of technology, and we have not acknowledged that technology must be led by the socialist core value system

broadcasting positive energy. At the time, some media analysts speculated this letter was Yiming's way of saying that he would fall in line with the Chinese government's demands, and he did. Yiming created several child safety features for his Chinese users. Kids were blocked from the Chinese version of TikTok from ten p.m. until six a.m., and he used facial recognition

to predict their ages. But these changes were made in China, and Yiming realized that he still needed to make some big changes at TikTok globally, and that would start with a shake-up in leadership. Yiming wanted to put someone else in charge of TikTok, someone who could handle all the complexities of government investigations, someone who could talk with officials around the world, someone who could speak

English well. So Yiming made a surprising move. He decided to hand over the reins of TikTok back to Alex Zhu. At that point, Alex was still on a sabbatical of sorts, roaming around and listening to jazz music. Yiming calls Alex up and asks him to come back to work. He wanted Alex to deal with their preteen users, repair TikTok's reputation after the FTC fine, and fly to the US to set things straight with government officials. And it's not just that Yiming asked Alex back because he

suddenly gained a conscience and wanted to protect kids. Multiple people have told me that Yiming actually blamed Alex for creating the problem in the first place. After all, it was Alex who appeared to ignore the issues of having so many young users on the app for years. So Yiming wanted Alex to clean up the mess. But soon after Alex came back to TikTok, he found out that TikTok's young users were just the start of his problems. He would find TikTok in the crosshairs of US

lawmakers and the president. Politicians wanted to know where all this American data was going and what exactly was ByteDance's relationship with the Chinese government. That's next time on Foundering. Foundering is hosted by me, Shelly Banjo. Shawn Wen is our executive producer. Raymondo is our audio engineer. Molly Nugent is our associate producer. Additional reporting by Isabel Lee. Special thanks to Zheping Huang, Edwin Chan, Josh Brustein, and Doors

Schloton's class at Portola High School in Irvine, California. Her students gave us so many great insights on what it's like to be a teenager on TikTok. Our story editors are Mark Milian, Anne VanderMey, and Alistair Barr. Francesca Levy is the head of Bloomberg Podcasts. Share and subscribe, and if you like our show, leave a review. Most importantly, tell your friends. See you next time.
