
Young Kids Love TikTok–and That’s a Serious Problem

Nov 30, 2022 · 39 min
Listen in podcast apps:
Metacast
Spotify
YouTube
RSS

Episode description

A heads up: This episode is on a difficult subject and some of it isn’t easy to listen to. You might want to listen with headphones if children are nearby.

In just a few short years, TikTok has become as ubiquitous as Instagram, Facebook and Twitter. The brief videos and fun dance challenges that TikTok’s one billion users post on its app often become viral sensations. 

But there’s a darker side to the platform, and one TikTok is having a hard time fixing. Young children who aren’t supposed to have full access to the app are finding ways around the company’s safeguards and logging on. When they do, they're exposed to some content that's not suitable for kids, including viral videos that challenge users to do dangerous things–sometimes with tragic results.

Bloomberg senior reporter Olivia Carville joins this episode to discuss her investigation of how kids are using TikTok–and what the company is doing about it.  

You can read Olivia’s full investigation here: https://bloom.bg/3isBcmo 

Listen to The Big Take podcast every weekday and subscribe to our daily newsletter: https://bloom.bg/3F3EJAK 

Have questions or comments for Wes and the team? Reach us at [email protected].

See omnystudio.com/listener for privacy information.

Transcript

Speaker 1

From Bloomberg News and iHeartRadio, it's The Big Take. I'm Wes Kasova. Today: why TikTok and young kids are a bad mix. Just a note about today's episode: it's on a difficult topic and parts of it are not easy to listen to. If there are children nearby, you might want to use headphones. Bloomberg senior reporter Olivia Carville spent months investigating the deaths of children who participated in dangerous challenges that can be seen on TikTok, the enormously

popular video-sharing social network. Her reporting shows how the company has tried and failed to stop young kids, some of them under ten years old, from joining the platform, even though they're not supposed to be allowed on it. I asked Olivia to tell us what she found in her reporting for Bloomberg Businessweek, starting with why TikTok has such appeal for kids. What kids want to be when they grow up is an influencer, more so than an astronaut or a teacher or a journalist. They

want to be a YouTube or TikTok influencer. They want to be social media famous. They are communicating with a community of like-minded people who are creating cool content, and they're looking at people that they aspire to be. We're talking more than two thirds of all teenagers in America use TikTok. Millions of kids every day are on this app, more than Facebook, more than Snapchat, more than Instagram. TikTok is the coolest place to be on the Internet

right now. One thing that's made TikTok so popular is the viral video challenges users post for others to copy, like a dance or a song. Overnight, millions of TikTokers are posting videos of themselves doing their own version of it. So the challenges on social media apps like TikTok really started in 2020, when the pandemic shut down schools and kids were at home just on their phones. Initially, these challenges were pretty harmless. They were kind of cute.

I saw pets being dressed up in different outfits and four generations of one family dancing in the same way. But as is usually the case when children or teenagers are involved, they started to get edgier and more dangerous. There was the crate challenge, where kids would climb stacked milk crates and some of them would fall and injure themselves. We're talking milk

crates that were like eight crates high. Or they'd do challenges like the cinnamon challenge, where they'd eat a spoonful of cinnamon, which can actually cause minor lung damage if they inhale it into their lungs. They were doing things like dropping pennies into a partially plugged-in phone charger, waiting for it to spark, and in some cases this

resulted in things catching alight and property destruction charges. We'd see even more dangerous challenges, like the skull breaker challenge, where two people would encourage a third to jump in the air and then they'd kick their legs out from under them and they'd land on the ground, sometimes on their head, and this would result in concussions. We also saw the fire accelerant challenge, where kids would actually

light themselves on fire accidentally. We've seen children who are seven, eight, nine years old, who are too young to be on TikTok, participating in dangerous and, tragically, in some cases deadly challenges. You heard Olivia say just now that children this young are too young to be on TikTok. That's because TikTok, like other social media platforms, has an age gate: kids under the age of thirteen aren't allowed to have full

access to the app's content. What's interesting about this age gate is that the number thirteen is largely arbitrary. There's no scientific research or pediatricians or child psychologists that say kids over the age of thirteen are mature enough to handle this kind of content. The reason why platforms say you have to be over the age of thirteen to get access to them is actually because they don't want to violate already existing child privacy laws in the US.

Children under thirteen are supposed to be directed to a walled-off version of the platform where the videos are scrubbed of content that TikTok says isn't appropriate for kids. But kids are smart, they're digitally savvy, and they don't want to be on the child version of TikTok. They want to be on the cool version of TikTok. They want to be on the adult version of TikTok. So they're lying about their age to gain access to the platform. Millions of children today are on TikTok who are under

the age of thirteen, when they shouldn't be. If you're under the age of thirteen and you try to sign up for TikTok and you just say that you're older, is that it, or does TikTok do anything to try to verify your age? The platform does many things to try and weed out these underage accounts, but it's difficult. Some of the things they do is use text-based analytics to scrape through, you know, bios and people's profile pages. So say you create an account and you write in your bio, I'm eleven years old. The artificial intelligence is good enough to detect that and remove you. They also get community users to report suspected underage accounts. If you're nine years old and on TikTok and posting regularly as the account holder, other people who use TikTok can report you to the platform and you'll get taken down. They also train their moderators to look out for suspected child accounts, and every three months the platform removes twenty million suspected child accounts from TikTok. This is a huge number, but TikTok has one billion users today worldwide. Millions of children are on the platform and lying about their age to access it. So all these measures that they're adopting to try and find those suspected underage accounts largely aren't working.
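To make that kind of text-based screening concrete, here is a minimal sketch of what scanning a bio for a self-declared age could look like. The pattern, the age threshold, and the function name are illustrative assumptions for this episode page, not TikTok's actual system, which is far more sophisticated.

```python
import re

# Illustrative sketch only: flag bios that contain a self-declared age under 13.
# The regex, threshold, and function name are assumptions, not TikTok's real system.
AGE_PATTERN = re.compile(r"\b(?:i am|i'm)\s+(\d{1,2})\s*(?:years?\s*old|yo)?\b",
                         re.IGNORECASE)

def looks_underage(bio: str, minimum_age: int = 13) -> bool:
    """Return True if the bio text self-declares an age below minimum_age."""
    match = AGE_PATTERN.search(bio)
    return match is not None and int(match.group(1)) < minimum_age

print(looks_underage("I'm 11 years old and I love dance trends"))  # True
print(looks_underage("I'm 16, follow for skits"))                  # False
```

In practice a signal like this would be just one input among many, alongside photos, behavior, and user reports, which is part of why, as Olivia notes, these measures still miss millions of accounts.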

Olivia's story is centered on an especially gruesome video challenge that's been making the rounds on TikTok: the Blackout Challenge. What it is is children choking themselves until they lose consciousness, blacking out, and then waking up and getting that adrenaline rush or that high of regaining consciousness. They're filming this and uploading it to social media as if it's a fun, cool thing to do. The challenge requests that you use a household object to try and choke yourself. So in some cases I've seen children using belts, using neckties, using dog leashes, using shoelaces to choke

themselves until they pass out. If this sounds familiar, it might be because choking games like the Blackout Challenge didn't start on TikTok. The Blackout Challenge actually dates back to, you know, as far as the 1930s. It was around before we even had an iPhone. But social media has amplified its spread. Not only is it easier to reach a huge range of kids, it's reaching children who are younger and younger, kids who are too young to

understand the risks. One of those children was a little girl named Arianni Arroyo. Arianni Arroyo was a nine-year-old girl who lived down in Milwaukee, Wisconsin. Ari was fashion, Ari was glitter and glam, and she just had her own little style and everything. That's Arianni's mother. Olivia spoke to both of her parents. My name is Cristal Arroyo Roman, my name is Edilberto Arroyo Roman,

and my daughter is Arianni Jaylene Arroyo. She loved dancing, and dancing was the first trend that she really started engaging in on TikTok. She would do all the dances, she'd record herself, and, um, it just grew from there. When COVID-19 shut down her school and she was at home all the time, she became obsessed with TikTok. She

was using it every day. She was trying all of these challenges with her younger brother, doing things like holding water in your mouth until you start giggling, or one challenge they tried was only eating red food for twenty-four hours. They loved it. They thought it was really fun. It was addicting. TikTok is like something that lures you in, makes a kid believe that they're going to be famous. One evening in early 2021, Arianni was playing with her little brother

before bedtime. It was a Wednesday, it was a normal day. Her father was downstairs in the basement in his workshop. From what we understand, she used a dog leash. This was a metal dog leash. Her parents had recently purchased it because the kids really wanted to get a puppy, and they bought the dog leash to show their children

that they were willing to buy them a dog. And they were playing with this dog leash, and Arianni, she wrapped the metal dog leash around her neck and she hooked the buckle onto the hinge of a wardrobe door, and she slipped off the toy chest that she was standing on, and she was left hanging about two feet from the ground. She couldn't scream out. She couldn't cry out. Her father was out of earshot anyway, and her little brother had to run downstairs and tell his dad that

Sissy's tangled. Her father came upstairs to see what he was talking about, and that's when he found Arianni lifeless, hanging by this dog leash from the hinge of the wardrobe door. He tried CPR. She never regained consciousness, and she died a week later. As you can imagine, Arianni's parents are heartbroken. They told me that their happiness had died with their daughter. That this feeling just weighs

heavy on your chest. That it's, um, hard to move on and hard to continue living when you know you lose a child in that way. One of the things her mother said to me is that she knew that she had to be protective of the outside, but what she didn't realize is that she had to be protective of the inside. That this danger actually came through her child's phone, a social media app. I never thought in my mind that that could ever result in us

losing a daughter. We've done everything to protect our children, and you think they're in the safety of your home. We're home, my kids were home, everybody was home. A week after Arianni died, her parents asked her younger brother what they were doing the night of the accident. He said they were playing a game like they'd seen on TikTok. My son told us, like, we played hang the chee. Like, what's hang the chee? He was only four years old, so we didn't know the name of the game.

And it wasn't until after her funeral that they were asking her friends, you know, who were her age, who had known what she was doing around that time, and then her friends told me, oh, that's the Blackout Challenge. And then I'm like, what's the Blackout Challenge? I didn't know. We didn't know anything like that existed. After doing some research, they discovered another child had died from allegedly participating in

this challenge, a ten-year-old girl in Italy. Our story continues after the break. I asked Olivia what more TikTok could have done to stop a child like Arianni Arroyo from viewing videos like the Blackout Challenge. TikTok doesn't want this kind of content on its platform. It causes all kinds of issues, like this exact one that we're discussing.

So TikTok has done a bunch of things to try and reduce Blackout Challenge videos. One of those things is hiring people to sit and watch hours and hours of TikTok videos and decide if they're okay or not. The people who do this are called content moderators. Content moderators for social media apps are essentially watching all of the videos uploaded to platforms like TikTok or YouTube and trying to find any kind of content that may violate

community guidelines, that may be problematic for the platforms. They're training their human content moderators to take down any content that promotes or glorifies dangerous behavior, like the Blackout Challenge. They have also banned hashtags connected to the Blackout Challenge in the app. When you type in blackout challenge on TikTok now, you're automatically sent through to a page that says

your safety matters, you know, you shouldn't be looking for dangerous content like this. Unfortunately, though, children have found ways to get around that. They have renamed the Blackout Challenge. They have replaced the O of blackout with a zero, or used a one instead of the L, misspellings, code words, secret phrases to keep the content spreading, keep the challenge on the app, because they like participating in it and they want to evade or avoid or

be smarter than TikTok. It's hard for them to train moderators on what is dangerous, because the word dangerous is subjective. Maybe what you think is dangerous is different to what I think is dangerous. So should you remove content of children climbing stacked milk crates? Some may say no, let the kids do what they want to do. Don't be overly paternalistic and take this content down. They're just having fun.

But what happens if a child dies? Is that now the time to remove milk crate challenge content from TikTok? Trying to decide from the outset how dangerous a challenge is, is one of the difficulties that TikTok's trust and safety team faces when it comes to the blackout challenge. All along, TikTok has not wanted this content on its platform.

The difficulty comes down to removing it. How do you remove this content when you have more than a billion users from, like, almost every country around the world uploading videos on a daily basis and doing what they can to get around the restrictions that you're placing on the app to try and keep kids safe? You mentioned trust and safety, and that's another term that you hear in connection with social media platforms quite a bit. What is a trust and safety team? Trust and safety are

effectively the cleaners of the Internet. These guys are the fixers. When things go wrong, trust and safety comes in to clean up the mess. So for the past year, I've been talking to people who work within the trust and safety industry, trying to figure out how hard the job is and what are some of the complexities or challenges that they face. A lot of people who enter the trust and safety industry actually come from social justice or law enforcement backgrounds.

Their ultimate goal in joining these big tech platforms is to try and protect users. But once they're inside the company, and once they actually have a seat at the table and work there, they realize that their job is twofold: protect the users and protect the company's reputation.

One of the most challenging things for the dozens of trust and safety workers that I spoke with is this feeling that you're effectively cleaning up for the company rather than doing everything you can to protect users on the front end. And on TikTok in particular, which is a platform that has seen explosive growth, there's really high turnover within the trust and safety team. That's because the job is so damn hard, you know, trying to moderate the

platform and keep users safe when some of the kids are as young as six or seven years old. What do you do when they're lying about their age? What do you do when the parents allow them on the platform and allow them to lie about their age? How do you handle that, as not only TikTok the company, but you, the trust and safety worker who was told to detect child accounts on the platform? How are you

going to go about doing that? This question of age verification is now front and center as pressure increases on social media companies to do something about underage children on their platforms. We've seen a sea change in Silicon Valley in the last year to eighteen months, where platforms are aware that the age verification issue has kind of become the elephant

in the room. This is a massive issue not only for TikTok, but for every social media platform that is trying to detect underage kids and keep those who use the

platform in the correct and age-appropriate experience. We're seeing regulatory shifts and legislation being written calling out tech companies for putting their own profits over the best interests of children, and these new laws that are being written actually force tech companies to ensure that they know the age of their users with a reasonable level of certainty. And how exactly are they supposed to do this? Well, it turns out

there are companies that specialize in it. These are third-party companies. One's called Hive, and they're based in San Francisco. There's another company called Yoti that's based in the UK, and they specialize in age estimation. And we've seen different platforms like Twitter, BeReal, Picsart, Reddit and Instagram partner with these third parties to try and ensure that their age verification efforts are as good as they possibly can be. Well, I really wanted to know, like, how effective are these

third parties? Does it actually work, or, you know, are they just making it up? Like, how do you know that a company that says it's going to be able to guess a user's age can actually do it? And if so, why isn't every platform in the world using them? The third-party company Olivia mentioned just now, Hive, is building artificial intelligence tools to verify the age of users. How do we build AI models that are very focused on

solving real-world business problems? Problems of content moderation, age detection, object detection, hate speech, bullying detection. These are the types of things where we saw that there was a lot of human labor involved and it wasn't very scalable, especially when you consider the amount of digital content being produced on

the Internet. That's Kevin Guo, the CEO and founder of Hive. You define what it is you're trying to detect and then train the model on that data by actually having millions or hundreds of millions of examples of what that means. Imagine training a model is kind of like teaching a small child concepts, right? If you were to take a different problem, like recognizing, you know, animals on a farm, cats and dogs, right? When you teach the small child,

you show them examples in a picture book. Here's what a cat looks like, here's what a dog looks like, here's what a pig looks like. And generally the child will learn from those examples. In our world, how we view this problem is, you take a person, it's on camera, it's in a video, and we run a model that looks at a few things. It looks at their face, but it also looks at the rest of their body, and it makes the prediction of

age based on those features. So it's using the input of the person's facial features and their body, and in general anything else that we can say is related to the physical makeup of that person, and that produces a result on age. It's kind of like when you look at someone, you're not totally sure exactly why that person seems like they're nine years old, but your instinct is that they are, because of enough pattern recognition. That's what we've trained this

model to do as well. We're actually predicting the age without knowing identity whatsoever. One of the things that I found really surprising during the reporting of this story is, as I was looking into Arianni Arroyo's case, I actually saw a screenshot of her TikTok account. She was only nine years old, so she wasn't meant to even be on the platform. But when you look at her account, you can see she's a kid. You can see in her bio photo that it's her nine-year-old face.

She wasn't trying to hide herself or trying to lie to the platform. Sure, she entered the wrong date of birth to access the app, but so did all her friends. Olivia decided to put Hive's technology to the test, so she sent them a short video clip Arianni had filmed of herself for social media. And the way our models work on videos is we look at frames and we sample; we usually do one frame per second.
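To make the mechanics he's describing concrete, here is a minimal sketch of that per-frame approach: sample roughly one frame per second, run an age-estimation model on each sampled frame, and average the per-frame guesses. The function names and the stub model are assumptions for illustration; Hive's actual system is not public.

```python
from statistics import mean

def estimate_age(video_frames, predict_age, fps=30):
    """Sample about one frame per second, run the age model on each sampled
    frame, and return the average of the per-frame predictions."""
    sampled = video_frames[::max(1, fps)]        # roughly 1 frame per second
    predictions = [predict_age(frame) for frame in sampled]
    return mean(predictions) if predictions else None

# Example: an 11-second clip at 30 fps yields 11 sampled frames; a stub model
# that always guesses 10 gives an averaged estimate of 10.
clip = [object()] * (11 * 30)                    # stand-in for decoded frames
print(estimate_age(clip, predict_age=lambda frame: 10.0))
```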

In this case, I believe the video was something like eleven seconds, so we produced eleven results, and then we simply take the average of that to see what our model would predict, and we predicted that her age was around ten years old. It took Hive's software three seconds to determine that Arianni was ten years old. She died three months shy of her tenth birthday. Kevin Guo says that level of accuracy is fairly standard

for age prediction software. There's really no reason, in our opinion, that humans should be looking at any of this. Our models are going to handle this at a decent accuracy, really near-perfect. Yeah, one of the stats we like to give is, you know, we'll catch the bad content while making a mistake on less than one percent of it. The way that we've thought about it with all of our partners and platforms we work with today is, we've never yet encountered a problem where their scale was too

much for us to handle. And this includes some of the largest platforms in the world. So I'm very confident that if a platform were actually aligned in trying to solve this problem, the hardware and the costs can keep up. Accordingly, this is intended to be a real-time product. In this case, you could imagine a world where, for users that are underage, given the sensitive nature of that, you would still want to escalate to human review to, like, take final action, but at least flag that this

content is something worth investigating. We believe that should be the status quo for every platform out there. This technology exists. The technology exists, but is TikTok using it? I'm aware that TikTok actually met with both Hive and Yoti throughout 2021 to try and understand how their technology works and if there could be a potential partnership there. TikTok ultimately decided not to deploy this technology

and not to partner with those companies. So I went to TikTok's communications team and asked them specifically, why did the company decide not to partner with these third-party age estimation software providers? I knew that TikTok had met with representatives from both Hive and Yoti throughout 2021, and that the company had decided not to deploy this technology, but I didn't know why. TikTok

declined to comment on that particular question. I did talk to a number of people who worked inside the trust and safety space at TikTok to try and figure it out, and I was told that the company was afraid of the public relations backlash or the reputational risk of collecting biometric data of children when there were so many suspicions and fears around TikTok and its parent company, ByteDance, which is based in Beijing, spying on children or taking data from places like the US and the UK and

giving it to the Chinese government. So the company was afraid that there'd be an additional PR backlash around suspicions of China spying on child users around the world. It's important to take a moment here to spell out the difference between the age prediction software that Hive makes and facial recognition software, because it sounds like they do the same thing. Facial recognition software tries to figure out the identity of a person. That's why it raises all kinds

of privacy concerns. Age prediction software only tries to determine how old an anonymous person is. It doesn't collect any data or try to find out who that person is. It turns out they are related, but they're by no means the same. And of course, in the database that we even trained on, there are no identities. We don't know their names. Those pictures could have been taken who knows when, many years ago, you know. So it's nothing to do with real-time identity tracking. Every new

image or video that comes to us, it's like we're starting over from scratch and we're just making a guess as to how old we think that person is. So content moderation isn't effective, age prediction technology isn't always being used, and that means kids younger than thirteen are coming into

contact with all kinds of risky challenges, including the blackout challenge. Now, most adults might see it as common sense not to copy a video where someone is choking or hitting their head on pavement or lighting things on fire, but the brain of a child processes those videos differently. I refer to it as a three-part recipe, um, in the digital world. That's Dr. Adam Pletter. He's a child psychologist

whose work focuses on how kids use social media. If we talk about children or adolescents, what we're talking about is the prefrontal cortex, the front part of your brain. That's the control center. It controls emotions, it helps with all those executive functions that people often talk about, prioritizing, sequencing. And so that part of the brain, the front part of your brain, develops all through childhood into adolescence. So that's the braking system of the brain. The emotional part

of the brain, um, more of the middle part, more primitive structures in the brain, are, you know, really overactive in childhood, which again, if you think about, you know, if you go out to a playground or, you know, see a bunch of children playing, it's a lot more active.

And then the third part of the recipe is the now, the digital world, and the amazing, emotional, often on-demand content, often in their pockets, you know, two, three clicks away, at their fingertips literally. So if you put all those three together, a weak braking system, an overactive emotional part of your brain that wants, you know, that's encouraging you to go out and seek out information, and then the digital world that is so focused and targeted

on pulling at our emotions, it really creates, you know, a real struggle, to say the least. We'll hear more from Dr. Pletter a bit later. The struggle he's talking about is one that's playing out across the globe. In August of this year, two children in the UK died allegedly participating in the Blackout Challenge. Arianni's family, along with two others, has launched a lawsuit against TikTok, claiming wrongful death. The case is led by the Social Media Victims Law

Center, based out of Seattle. I went out to Seattle to meet with Matthew Bergman. He founded the Social Media Victims Law Center and is actually the lead attorney on the Arianni Arroyo case. It's clear that this is an epidemic, and it's equally clear that this is not a coincidence and it's not an accident, but rather a clear result of the design decisions that TikTok made when they set up their algorithms. And that's why we're bringing

these cases. I asked Olivia what TikTok says about the lawsuit and about the problem of children on its platform. TikTok says that it cannot comment on pending litigation, which is what most companies say when they have lawsuits filed against them. It does say that the blackout challenge predates the platform, that it's been around for decades before

TikTok was even created, and that's all true. It also says that the blackout challenge has never been a trend on the platform, that they've never found evidence of the blackout challenge actually trending or going viral on TikTok. The company says that it does everything it can to remove content linked to the blackout challenge, that it's against its community guidelines to post harmful content that glorifies or promotes

dangerous behavior like the blackout challenge. They're not seeing how their algorithms lead to these challenges, and not seeing how the addictive nature of their product leads to these outcomes, and this is why we need them to be held accountable in a court of law. The lawyers from the Social Media Victims Law Center want to hold these companies liable when things go wrong, and that was one of the challenges of reporting this: how do you prove that the child died because they saw a

video on TikTok? To prove which social media platform sent that content to that child is incredibly difficult. You need to do a forensic analysis of their cell phone. You need to find when that specific piece of content was sent through to the child, where did they watch it, when did they watch it, and how do you know that, when they died, what they were trying to do was emulate that challenge? Even when the evidence may seem cut and dried, the law may not be on the

side of parents. There was the case of a ten-year-old girl in Pennsylvania who died allegedly participating in the Blackout Challenge. She accidentally hanged herself with the purse strap in her mother's wardrobe. In that case, her lawyers say that police did a forensic analysis of her cell phone and that there is evidence that TikTok's algorithm sent her a video showing her how to hang herself

with the purse strap of a handbag. TikTok filed a motion to dismiss this case, arguing that it couldn't be held liable for her death because, under the Communications Decency Act, platforms cannot be held liable for the content that is posted on their sites by third parties. It is an immunity. It is a shield for technology platforms from being held

responsible when things like this happen. A judge in that case actually dismissed the lawsuit, saying even if TikTok had sent the blackout challenge content, it could not be sued because Section 230 of the Communications Decency Act shielded it from being held liable. This problem exists across social media platforms. We've seen challenges like the blackout challenge on YouTube, for example. I've also seen reports that it's been

on Facebook, Instagram, Snapchat. What makes TikTok different, though, is that this is a highly performative app, where in order to gain followers and build your presence on the app, you're encouraged to do extreme things. The algorithm really rewards extreme behavior. Thanks to Olivia Carville for sharing her reporting. You can read Olivia Carville's Businessweek story about TikTok on Bloomberg dot com. When we come back, what parents can do to protect their kids. Obviously this is pretty

difficult to listen to, and if you're a parent of young kids or even a teenager, you might be wondering how you're supposed to protect them from seeing or

possibly copying challenges like these. Dr. Pletter, who you heard a little earlier, specializes in helping parents talk to kids about social media, so I asked him what advice he would give. So, finding ways of connecting around the concerns without overdoing it, helping the children think things through, having a plan of how they're going to handle certain things. So much of the concerns on TikTok and other places are predictable, so I try to come at it

less reactively. You want to start off with understanding that these concerns are there, and having some plan, whether it's a written-down media plan, where here are some rules, having basic parental controls. I usually lean on the ones that are built into the phones. Um, not that any of the parental control systems are perfect, and you can't just sort of set it and forget it, but once it's set up, it does serve a good role, the parental controls, that is, in encouraging, I'll say, if

not forcing, the dialogue. Family sharing, that's something that you want to do when you're first setting up the phone, but you can do it at any point, where each phone, the young child's phone, even a young, young kid, has their own Apple ID, and this way it's all under the same umbrella. If you go under Settings, you just click on your name and basically it says, like, set up your Family Sharing, and so that's all through the iPhone setup. And then once you have

that set up, you set up Screen Time. Uh, if you want WiFi controls, most of the larger internet providers have their own apps now, whether it's Xfinity, Verizon, or whatever the company is that you use for your at-home WiFi. And I also recommend setting that up separately, because then you can turn other devices, Chromebooks, Xboxes, off during the school day, during study hours. You can easily just toggle things on and off, which

again forces the dialogue. If your child wants to play Xbox, they can come say, hey, can I play Xbox? And you say, sure, what are you playing? Who are you playing with? How long do you think you're gonna play? There's a little bit of that dialogue back and forth, and then you take out your phone, it's all remote, and you turn it on, and then when they're done, you turn it off. Do the things you're describing here,

uh, work practically? Um, have you seen that this is an effective way for parents to keep their kids safe while not turning their kids away from them? So I want to say yes, because this is what I do. You know, typically, parents, um, providing supervision and providing a regulatory system, one that, uh, progresses and evolves

over time, works. So it's not just that parental reaction to rein things in, protect your kid by simply turning it off, saying no, you're not allowed to have it, but doing it in conjunction with talking to the kid about why, so that they are kind of part of a conversation. You know, the counter of what you just suggested is not possible, especially coming out of the pandemic. But I would have said this if we spoke about this five years ago, but especially now, screens are everywhere.

It's for education, it's for work. You know, every aspect of that young child's life is connected in some way to some digital format. Some of the kids who are engaging in the Blackout Challenge and other things like that are much younger. They're ten years old. So they're not even teenagers yet. How do you talk to those kids? How do you protect those kids who don't even have the sophistication of a young teen when it comes to

understanding what's dangerous and what's not? So I'd be talking to the parents more than the kids when you get below ten, and then the parents, um, having to up

that dialogue with their younger kids. But that is the path: for the parent to be talking and knowing that all of these dangerous things are out there, not to just be freaked out, but to use that as motivation, protective anxiety, parental anxiety, to talk and to make sure that some of these safeguards are in place. Where, you know, if you have a nine-year-old, you know, and they're on TikTok, there has to be an awareness, from my point of view, that that nine-year-old might

come in contact with not only strangers reaching out, but also some of these challenges that everyone else is doing, so it can pull them in in very, very scary, dangerous ways. Social media is changing so quickly. There are always new sites, and the ones that we're all familiar with suddenly are kind of less attractive to kids, and they're going to something else. That's always been the case with everything. Um,

how should parents keep up? Because they're often the last to know. You know, by the time all the kids are on TikTok, they're just kind of first hearing about it. And that's a little bit by design, because kids don't want to be on the same platforms as their parents a lot of the time. How do you even keep up, as someone who makes it a practice to keep up? So part of my secret is that I work and talk to kids and teenagers all day long, so I'm hearing that firsthand. Um, I hear from my own kids.

I've got two teenagers at home, so I try my best. There are some great websites out there that do some of the work for you. Common Sense Media comes to mind; that's probably my go-to, um, if I have a question or just to kind of see what's on their front page. And then, you know, ironically, if a parent is on, you know, one of the social media apps, whether it's Instagram or Facebook, there's tons and tons of Facebook groups, you know, focused on parental controls, or often

run by a certain parental control software. But, you know, it's not just for that software. Um, and there's a lot of forums out there where people are talking, and sometimes you can get pretty anxious reading some of the stuff. But as long as you go in with the right mindset, that you're looking to educate yourself and not just be freaked out, it should serve that purpose. Dr. Adam Pletter, thanks for speaking with me. Thank you. Thanks for listening to us here at The Big Take, the daily podcast

from Bloomberg and iHeartRadio. For more shows from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen. Read today's story and subscribe to our daily newsletter at Bloomberg dot com slash Big Take, and we'd love to hear from you. Email us with questions or comments to Big Take at Bloomberg dot net. The supervising producer of The Big Take is Vicki Vergolina. Our senior producer is Katherine Fink. Our producer is Rebecca Chasson.

Our associate producer is Sam Gebauer. Hilda Garcia is our engineer. Original music by Leo Sidran. I'm Wes Kasova. We'll be back tomorrow with another Big Take.

Transcript source: Provided by creator in RSS feed.