You're listening to Comedy Central.
Ew.
I'm talking about photo filters.
Yes, they've helped mankind realize its dream of puking rainbows.
But some of the most popular filters just help you look more attractive, which may sound harmless, but it could be anything but.
Cutting edge apps and social media filters are allowing ordinary people to enhance their online photos to impossible perfection.
In some cases, it's sparking a concerning phenomenon.
With apps like FaceTune, you have the power to completely transform yourself. Bigger eyes, skinnier nose and jawline.
Smaller butt or flatter belly, whiter teeth, smoother skin. You can do it right on your phone.
When I take a selfie, I always use filters.
I wish I could look like my filtered self in real life.
This obsession with personal appearance that selfie culture encourages may have darker implications for mental health.
A study in the Journal of the American Medical Association says filtered pictures can take a toll on self esteem, body image, and even lead to body dysmorphic disorder.
I do feel like we're losing touch with what reality looks like. We're already getting there to the point where we're expecting people to look as unhuman as possible.
Yeah, photo editing filters set unrealistic expectations for beauty the same way Fruit Ninja sets up unrealistic expectations of how easy it is to slice floating fruit.
Da.
And once you have this filtered version of yourself in your head, you become dissatisfied with what you really look like. So in essence, we're basically catfishing ourselves. But if these editing apps can turn adults into quivering blobs of insecurity, just imagine what they're doing to kids.
Psychologists warn these photo filters can be particularly troubling for teens and young people who are still developing their sense of self.
Eighty percent of girls in one survey say they compare the way they look to other people on social media.
On Instagram, like, I follow people like Kendall Jenner and Kylie Jenner, and they all have this, like, ideal body image that everyone is expecting from this generation.
Young girls on social media have a negative body perception, with one in seven girls reporting being unhappy with the way they look at the end of elementary school, and that number almost doubling to nearly one in three by age fourteen.
Eighty percent of young girls are using photo retouching apps to change the way they look before posting pictures.
And those with high scores for manipulating their photos also had high scores for body-related and eating concerns.
Any of you ever question your body because of what you see on social media? Shame, man.
This is a vicious cycle for teenagers.
Social media makes them unhappy with how they look, so then they use filters, which perpetuates the unrealistic expectations for themselves and others. Plus, they're teenagers doing all of this while they're driving, which puts everyone at risk. And all the insecurity this creates is harmful for teenagers, because I know it's hard to tune all of this out,
but teens shouldn't be obsessing over this stuff. Like I honestly wish I could sit all teenagers down and say, hey, don't worry about how you look.
The planet's going to die out before you're thirty. It doesn't matter.
Now, it's bad enough when people wish they had the perfect Instagram look in real life. What's worse is when they actually try to make it happen.
The more people look at doctored up images, the more likely they are to actually start seeking out cosmetic procedures at younger ages.
These cosmetic procedures are becoming so popular with teens that plastic surgeons have coined a new syndrome for it, Snapchat dysmorphia, and the number of kids getting nip-tucks may astound you. In twenty seventeen, nearly two hundred and thirty thousand teens had cosmetic procedures. Kids as young as thirteen are getting them.
Doctors are seeing an influx of people of all ages turning to plastic surgery to look more like their filter.
Sixty-two percent of plastic surgeons reported their patients wanted to go under the knife because of dissatisfaction with their social media profile. Fifty-seven percent said their patients wanted to look better in selfies.
Absolutely, it's becoming more and more common that people will show me images on their Instagram, or even something that's posted on Facebook, like, this is really how I want to look.
Just last week, I had a patient come in and ask me for more of an anime eye, and she couldn't figure out why it's not possible.
Okay, man, this is really disturbing. Thirteen-year-olds in particular should not be getting plastic surgery. I mean, when you're thirteen, your physical appearance is already naturally changing. That's what puberty is. It's like long-term plastic surgery. I mean, this is what I looked like when I was thirteen. You gotta let that shit play out. Honestly, though, I don't blame the teenagers. I blame the parents and the plastic surgeons. I mean, how are you going to let them do this to themselves? They can't even buy cigarettes, but you're gonna let them buy a new face?
This is getting out of hand, which is why there's now a movement not just against filters, but all the ways that people have been distorting reality on social media.
Many influencers have started speaking up on this issue, admitting that they've presented altered images in the past and are opening up the conversation. Some are even posting raw, totally unedited photos of themselves and breaking down how people on your Instagram feed may be manipulating their angles and lighting to get that quote unquote perfect selfie.
There are many celebrities exposing the dangers of digital distortion. They are posting images of themselves online unedited and unfiltered, and this is a great example for young girls.
Pop star Lizzo made a big splash when she posted a selfie in the nude and unretouched.
There's no shame anymore, and I just kind of post myself.
It's like, you take me as I am, you gonna have to love me.
British MP Luke Evans has proposed the Digitally Altered Body Image Bill, which would require advertisers and publishers to display a logo whenever a person's face or body has been digitally enhanced.
Okay, first of all, I love the idea of putting disclaimers on photos of people who have been digitally altered. I love it. And honestly, I don't think we should stop there. We need to do this with everything that's been digitally altered, like food ads. Those are the worst. Every fast food burger looks great on TV, but then when I order it, it looks like it fell asleep in a hot tub. But I'm glad that we're finally learning the truth about what celebrities look like. Now, personally,
I'm waiting for SpongeBob to join this movement. I mean, no way that guy is naturally that square. Have you seen that? Maybe he's not even real. Then how does he have a TV show? Now, I'm not naive enough to think that society is gonna stop creating unrealistic beauty standards anytime soon, all right? But what I do hope is that we can better educate our kids and ourselves that our own natural bodies are all beautiful. Except for that flap of old-people skin we have on our elbows. Like, that shit is gross. I don't care who you are. It looks like a melted bull sack. But everything else is beautiful. But because this movement could take a while, we here at The Daily Show decided to come up with a filter of our own that might help.
Are social media filters giving you body image issues? Are you depressed you don't look as good as your filter? Then good news: you'll never have to worry about living up to your filter again. With Rudify, a brand new filter that turns your face into Rudy Giuliani, you'll never be happier with how you look in real life.
And obviously this wouldn't be effective if you could turn it off, so Rudify overrides all other filters. And just to be safe, Rudify retroactively applies itself to every face in every photo on your phone. The best part is the filter is permanent, just like Rudy himself.
You can never get rid of it.
Rudify. You'll be overflowing with self-esteem. Warning: faces distorted by Rudy Giuliani may rupture the fabric of space and time.
My guest tonight is former Facebook product manager turned whistleblower Frances Haugen. She's here to talk about how Facebook prioritizes profits over public safety and teen mental health. Frances Haugen, welcome to The Daily Show.
Thank you for having me.
Happy to have you here. Your name isn't as big as the story that you, really, I mean, leaked to the world. You know, a lot of people, you say "Frances Haugen," they'd be like, who is that? But then if you say to people, hey, remember how we all learned that Facebook, and what social media is doing, is essentially destroying teenagers' brains and harming all of us?
Yeah, that report happened because of you. So let's start with the question of why. You've worked for Google, you've worked for Pinterest, you've worked for Yelp, and yet you blew the whistle on Facebook.
Why did you feel like, oh, this is something I can't sit on?
So when I joined Facebook, I thought I was going to work on misinformation in the United States, and I was surprised. I actually was charged with working on misinformation outside only.
Outside the United States. Okay.
And like many people and many technologists, I had never really focused on Facebook's impact internationally. And very rapidly I realized kind of the horrifying magnitude of the danger that we were facing: that Facebook's algorithms give the most reach to the most extreme and divisive ideas, and that process is destabilizing some of the most fragile places in the world, like Ethiopia, or what happened in Myanmar.
You see, when you say that, there are some people who will accept it immediately, and then there are some people who'll be like, that's not true at all.
But I'm sure there are a lot of people who will say, I mean, aren't there always going to be extreme people in the world? You know, there's always somebody saying something. You know, Facebook themselves, they go like, hey, we're not part of the problem. We're merely a platform that people put their views on. We are not part of the problem. And you disagree with that.
So let's imagine you had a relative who had particularly extreme ideas. You know, we often say, like, the crazy uncle.
Right, correct. Right?
Some of us have those. If we're talking to that person one on one, or we're talking to them at a family gathering, that scale allows for resolution of ideas.
Okay. What's happening on Facebook right now is that the ideas most able to elicit a reaction, and the shortest path to a click is anger, those get the most reach. And ideas that are more moderate, or that try to bring us into synthesis or help us find a middle path, those aren't as likely to elicit a comment from you or elicit a like, right? And so Facebook doesn't give them as much distribution.
So Facebook is essentially propelling certain ideas out there. So they may say, we're the platform, but really what they're also doing is advocating for certain viewpoints, because they push those out, because those get people engaged and spinning in the cycle.
So I think it's the thing where it's not that they sit down and go, how can we have more extreme ideas? I don't think that's what they're trying to do. But they do know that there are lots of solutions that are not about censorship. It's not about picking good or bad ideas, or good or bad people. It's about, how do we change the dynamics of these systems so that good speech can counter bad speech?
How do we give those ideas more of a say at the table?
You know, it's interesting that you lay it out like this, because I remember talking to my friends about this, saying, have we become angrier in the world? Have we become less able to have disagreements? Because I sit with my friends and I go, like, I don't agree on so many things with my friends, but I don't remember a time when that meant that I couldn't be friends with them. Now, in the world of social media, it feels like if I disagree with you on this, I disagree with you on everything. It seems like Facebook is, if you disagree on this, we're going to find more ways to make you disagree.
So some people have said Facebook is always optimizing for engagement, right? So a high-quality piece of content is one that gets lots of engagement. And it turns out an angry comment thread, where, you know, it causes us to yell at each other, that is viewed as a higher-quality piece of content.
And so, you know, during COVID, I had friends who wrote very long, thoughtful pieces that went in and digested lots of information. That's going to cause less of a knee-jerk reaction than something that really offends you.
So, "masks don't work." People are clicking on that.
"Here's the thing about masks we have to consider in our society." Boom.
Facebook's not pushing that.
So what's interesting, you know, in everything that you've done, is even though you're a whistleblower, even though you came out with this information, you've never said, I'm anti-Facebook. You've never said, shut Facebook down. You've never said this thing needs to go away. What you're arguing for is that we change our relationship with not just Facebook, but social media companies as a whole. Explain what that means.
So often when people talk about this idea of changing our relationship with social media, it's more about, like, should we use our phones so much? But I'm encouraging us to have a conversation about what's our relationship with these companies, or what's the companies' relationship with us.
Okay.
Facebook knows that they have a level of unaccountability that's very different from either Google or Apple or other big platforms. Because in the case of Google, we can scrape, we can download their results and analyze them and see if there are biases. In the case of Apple, we can take apart the phones and put up YouTube videos saying Apple phones do or don't work the way they claim. But in the case of Facebook, researchers and activists have been telling Facebook, hey, we have found all these examples. We think there's a pattern here. We think there's too much human trafficking, we think kids are suffering.
Right. And Facebook keeps coming back, because they know no one can call them on it, and saying, that's just anecdotal, that's not real.
But when you released the information, we realized it wasn't anecdotal, it was real, because Facebook had done the basic research, right? You see, now this is interesting, because this reminds me of the tobacco companies. Yes, it reminds us of, you know, the fossil fuel companies. They do the research, they find out something really bad, and then, I mean, obviously, they go like, we're not going to put this out there,
but we know. One of the biggest things that's really gotten people worried is how social media affects younger brains.
I mean, it affects my brain.
I don't consider myself younger anymore, but young people seem like they're at the highest risk.
Yeah.
Facebook's internal research says that the highest rates of problematic use peak at age fourteen. So you're not allowed to be on the platform until you're thirteen. Uh huh. So I guess it takes a little bit of time to
form that habit. But in any of the age cohorts they sampled, and they've done studies that have one hundred thousand people in them, so these are not tiny studies, they find that the largest fraction saying, I can't control my usage, and I know it's impacting my health, my employment, or my school,
Wow.
is at age fourteen. And so there's a real thing where young people, you know, they're struggling with issues in their life, yes, and they don't have the self-control, and they see this, they say this to researchers. They say, I know this is making me feel bad, and I can't stop. I fear I'll be ostracized if I leave. And those factors mean it's not easy for kids to just walk away.
It's also not easy for people to understand how to fix it. You have lawmakers now who are trying to figure this thing out. Silicon Valley is way ahead of them, right? Like, way ahead of the lawmakers. You see these hearings, they'll be like, why won't my Facebook app install?
They don't understand what's happening.
So how do we begin creating a world where we're not destroying the companies, but we're regulating them the way they should be.
So the main thing I'm advocating for is around transparency, because one of the problems today is that Facebook is the only one that holds the cards. Like, they can see whether or not they're holding a royal flush; we can't check what they claim. We need to move to a world where we have more access to data. And that can be aggregate data, it can be privacy-sensitive data, it doesn't have to expose people.
It's a false choice of privacy versus transparency. And once we have the ability to have conversations, we can stop talking about boogeyman social media, I see, and we can start having conversations on how do we solve these problems.
I think what we need to do, and this is what I want to spend twenty twenty-two doing, is we need to start organizing people. Like, I want to plant a youth-directed movement where we can begin putting pressure on Facebook to release this information, because I don't think Facebook is just going to do it automatically out of the goodness of its heart. We have to force them to, and I think there's lots and lots of opportunities where we can begin putting pressure on them, either socially or financially.
As a user of social media, as a person who speaks to my friends on social media, I have my enemies on social media. I appreciate this conversation, because I think if we're not careful, we end up in a world where we're all fighting, really, just because an algorithm is trying to make us fight. Half of those fights, I've noticed, even with the show. If I say a thing on the show, this has been, like, the most interesting thing for me: depending on what I say, Facebook sends it to different people.
One time we had it on the show, where literally I'd laid out both arguments.
I said, here's one way to see it, here's another way to see it. But then on Facebook, only one way went to each group.
Yeah, and I was fascinated by that, because then people are like, why didn't you say the other side? And I'm like, but that's exactly what I said. They're like, oh, well, I didn't see that. And then you realize, you know...
So one of the things that happens is, let's say you take a video clip. Out of any given one of your shows, people go and put lots of different little chunks on YouTube.
Uh huh.
People can reseed those onto Facebook, over and over again, and they do. And each time it enters into Facebook, you know, someone posts it to a group, someone does this, it enters into the network in a different way, and the algorithm begins picking up data on that, and it says, oh, interesting, these kinds of people engage with this kind of content. And because it's entering from so many different points, it actually gets a chance to learn what kinds of people
or communities it engages the most with. And so that echo chamber is real. Like, the algorithm pulls us towards homogeneity.
It's almost like Facebook knows that you react most to your neighbor when they play loud music. There's no reaction from you when your neighbor's doing other things, but when your neighbor plays loud music, that's when you react.
Actually, Facebook, I'm just gonna...
It's even worse than that.
It's actually gonna be worse?
Imagine there's music that you like, and there's music you don't like. It actually knows you really don't like, I don't know, reggaeton or something.
Oh wow.
And one of the things that I think is really sad, one of the things that was in the documents, is that people talk about, especially people from marginalized communities, that they will go and correct people who are spreading racist rhetoric, homophobic comments.
And guess what?
Now Facebook knows you engage with those keywords, so they send you more.
Yeah, yeah. I didn't know they were doing that.
They don't do it on purpose. It's a side effect.
It's a side effect of the way the algorithm works.
So the algorithm is designed to get as much engagement as possible, and then engagement, unfortunately, means...
You're more likely to... there's a car accident, I'm going to engage.
Yeah, or more of it. If they had an option of showing you a stream of different things, and they know you look at car accidents, they show you more.
Man, this is hard because on the one hand, I go like, well, I need to stop looking at car accidents.
But the brain is the brain.
It's the brain.
You know, the AI is not intelligent, right? We like to say "artificial intelligence," yes, but people who actually study it call it machine learning, because it's not intelligence. It's just a hill climber. It's optimizing. And, you know, think of all the different pieces of content you get exposed to. It's not trying to show you the most extreme content; it just happens to, to fulfill its goal function. It mindlessly pushes you towards it. And so this is about Facebook making choices to not fix it. Facebook has lots and lots of options that don't let that machine run wild.
But they're all in on the machine, right? Well, money, money, that's what it is. Thank you so much for being here. This has really been enlightening. Good luck on the rest of your journey.
Thank you so much.
Explore more shows from The Daily Show podcast universe by searching "Daily Show" wherever you get your podcasts. Watch The Daily Show weeknights at eleven, ten Central on Comedy Central, and stream full episodes anytime on Paramount Plus.
This has been a Comedy Central podcast.