A quick note. This is episode six of a six part series. If you haven't heard the prior episodes, we recommend going back and starting there. It should also be noted that this series explores sexualized imagery involving minors and violence. Please take care when listening. Levittown, the place, was
itself an innovation. William J. Levitt of Levitt and Sons, inspired by the production assembly line, broke down mass produced housing into just twenty six steps, with construction workers building a house in just one day. Quick, cost saving production of identical tract housing in mere months. In nineteen forty seven, thousands of houses were built, and white families only were moving in, restricted by racial covenant. The innovation was for some,
but it was a kind of progress. And yet today the place that was once the pinnacle of innovation is, like everywhere else now, struggling to keep up with the incredible force of technological change.
When I first got here, we were working off of VCR tapes.
Lieutenant Detective Devon Ross has been at the Nassau County Police Department since the late nineties.
Tape-fed camcorders and eight millimeter cassette tapes for audio, and now we're in a world that the people who worked with me back then wouldn't even recognize today.
Ross is the head of the Cyber Crimes Unit, investigating crimes like cyber extortion, online fraud, and deep fakes.
They call it an arms race between law enforcement slash legislation and technology, and it's one we're losing. We are absolutely losing. The things that we're trying to think about solving that happened six weeks ago, they are already ten months ahead of us, and it's a shame.
The Patrick Carey case is sort of a perfect example. In twenty nineteen, he was essentially using glorified Photoshop to make the images. Basically, he was creating a digital collage, a face here, pasted on a body there, with some rudimentary AI to help blend the pieces, and presto, a whole new picture. But now, would-be Patrick Careys can use an app or a website, and a few swipes and a few seconds later, boom, something new that looks real.
So I've opened up my laptop and I've got a few windows open of several of the nudifying apps which we're on. So Olivia and I decided to try it for ourselves, this tech that in many ways brought us to the story. So Draw Nudes is the app we're going to go visit.
I'd love to see.
We were doing some reporting in California together and we decided to do a test run of some of the latest tools using artificial intelligence to modify pictures, which make it so people who were dressed now appear naked. They're called undressing or nudify websites and apps. For our experiment, we used one of the most popular websites, Draw Nudes, which had more than twelve million visits in the first half of twenty twenty four. These sites let you get
started quickly. You don't need to enter an email or a name to use them, and you can make your first few fakes for free. So it's saying, have someone to undress? Upload a photo. So I'm going to upload a photo from my
Library.
So this is a photo of me. I am in a desert in Nevada. Oh, it looks so cool and I have been sort of going for a jog.
Okay, in this photo, I can see Margie. She looks really happy. She's got her arms stretched out, she's standing. It looks like you're wearing running shoes, maybe a casual pair of shorts, a sports bra and a bucket hat and sunnies, and you're in the middle of the desert.
So done. I'm going to pick auto, so AI paints over the clothes on its own. Okay, so you just paint over where your clothes are. So it's basically, on my phone, I'm just basically tapping on my sports bra and my shorts, and then it's highlighted in orange. Says processing photos.
Almost there. Do I close the page? Oh my god. Oh, oh my god, it looks so real.
Yeah, I look completely naked. I kind of couldn't believe it. In a matter of seconds, this website did what it likely took Patrick Carey and others hours to accomplish.
And it's proportionally correct to your body. Oh my god.
Welcome to the future, where you really can't believe your eyes. From iHeart Podcasts, Bloomberg and Kaleidoscope, this is Levittown. I'm Margie Murphy, and I'm Olivia Carville. Nothing can quite prepare you for this. One moment, a smiling vacation picture, the next you're naked. It's not really your body and it's no longer your photo. It's completely surreal to see it happen, and a little weird to share it with a coworker. I found it hard to look Olivia in the eye
for a moment afterward. I can't imagine what it would be like to have your classmates looking at something that feels so intimate. Now, I created that particular image. I might not have known exactly what I would look like once the tech undressed me, but I knew basically what was coming. That's not often the case. Most people who make deep fake pornography haven't asked their subjects whether they're okay with it. They just do it. It's pretty much
the reason these apps exist. So when a person learns that their image has been altered this way for someone else's pleasure, it can feel like a violation. For many of the women we spoke to, it was a violation, a new kind of violence. As a society, we haven't even begun to grapple with this. Currently, it isn't a crime under federal law to create and share deep fake pornography based on adults. When it comes to fake nudes of children, the law is narrow and pertains only
to cases where children are being abused. In March of twenty twenty four, the FBI did issue a public announcement saying child sexual abuse material, even if created by AI, is illegal, but again, the law they are referring to
is quite narrow. So far, at least twenty five states have tried to address this with their own laws and regulations, but it should be noted that federal law protects the hosting websites, whether it's a forum, social media platform, or internet provider, from being held liable for any content, even if it's illegal, posted to their sites.
Meanwhile, app developers are racing to meet a growing demand to make better nudes faster, and many of the people who are using this tech aren't adults. They're teens wielding this technology against other teens.
Deep fake porn crushing high school girls in Westfield, New Jersey. Two students are in some serious trouble after being accused of using AI to create inappropriate images.
Beverly Hills middle school students are in trouble for using AI to create and share fake nude images of their classmates.
Parents are at a loss, and so are school officials. In so many places, there are no guardrails to make sure what happened in their school, in their town, doesn't happen again. Since I started reporting on Levittown in twenty twenty three, I've followed as cases like these have mushroomed across the country, but one in particular stood out.
I didn't actually realize that it was such a big thing across the country.
I think.
Detective Laurel Beer is one of two investigators in a small police department in Lancaster County, Pennsylvania.
That's probably what surprised me the most was that there's been so many cases that haven't ever gone to prosecution simply because they didn't know how to, they didn't know how to use the laws to their advantage, or they didn't have a law that they could use to press charges for something like this.
Lancaster is a place synonymous with farmland and the Amish, where it's pretty normal to see a horse and buggy clopping down the street.
Because we only have two of us, we pretty much do every kind of investigation that comes into our office, from your property crimes like burglary or trespassing, all the way up to homicide, if we were to have it. But pornography cases, child pornography cases, stuff like that.
That's something that we handle between the two of us. We handle several a year.
In early twenty twenty four, Laurel got a call from a patrol officer. Parents whose children attended a nearby private school reported that one of the students had accidentally shared a photograph in a group text messaging thread. That photo was pulled from the social media account of a female classmate, but she had been undressed with technology. It was a deep fake. The officer didn't know whether they could arrest the student, but Laurel had worked a similar case the
year before. She knew that in Pennsylvania, any quote, depiction of a child in a sexual manner, created by AI or not, can lead to a child abuse charge.
It really helped move things along, I think, at a much faster pace because we knew where to go.
The pictures in this case weren't being traded on a tributing website like they were in Levittown. This time, they were being shared between two classmates over a messaging platform called Discord, until one of them messed up and sent a deep fake to a larger group thread. But because the rest of the pictures were being privately exchanged, no one knew exactly who'd been deep faked, so students and families were panicking, and the detective wanted to provide answers.
Like the police and prosecutors in Levittown, Laurel got a search warrant for online records and chat histories. It was a painstaking process to identify each victim. Laurel created spreadsheets, collected and cross checked the pictures against yearbooks, interviewed witnesses, and contacted families. In all, she was able to identify sixty victims, fifty nine of whom were minors.
I was surprised over the course of the investigation to find out exactly how large scale this case actually was.
It's one of the largest cases of deep fake pornography in the US that we know about. And just like in Levittown, these young women knew the alleged suspects.
Which seems to be the case in most of the ones that I've researched and seen across the country, is that most of the time it's not somebody that these kids don't know. Sadly, it's someone that they do know.
Just knowing that day in and day out, this person was among them in their day to day life. While they ate lunch, while they went to class, while they played sports, you know, whatever it was. They might have had a class with one of the students.
School officials in Lancaster first learned about the deep fake images several months before Laurel got involved.
In November twenty twenty three, an anonymous tipster told school officials that a student had created AI generated nude images of their classmates. The school launched an internal investigation, but couldn't corroborate the tip. They did not go to the police. Months later, after another anonymous tip, they opened a new investigation. After that, one of the suspects was identified and withdrew from the school. The incident was reported by the school
to the state agency responsible for child abuse reporting. The police had no idea this was happening until spring twenty twenty four, when a mother told them.
Lancaster Country Day High School students staged a walkout this morning in protest of the administration's handling of the private school's ongoing AI nude photo scandal.
In November twenty twenty four, hundreds of students and faculty walked out in protest of how the school had handled the situation. Classes were canceled. One student said she thought the action sent a strong message to the school, that students are upset with how the adults handled the AI situation and that girls don't feel safe at school. Parents were outraged about what they believed was negligence on the part of the school, and they sued. As of the
end of January, the case is ongoing. Officials from the school told us they cannot discuss the details of the matter. They said they are working on caring for their students and are in the process of reviewing school policies. They also added that they've begun additional staff training on child protection and
mandatory reporting. In the meantime, the suspects in the Lancaster case have been charged with felony offenses, including sexual abuse of children and the dissemination of child sexual abuse materials. They're due to enter pleas soon. Police haven't shared their ages, but according to local news reports, at least one of them was in ninth grade at the time. They will be tried as minors. In Laurel's experience, that could mean jail time or being sent to a rehabilitation facility for.
A facility that helps with the sexualized behaviors. It's more of a treatment than a punishment, I guess you could say.
So that's one possibility.
The other possibility is just a detention center like you would expect there to be. But it's nice that we have that option, that we have done that before, where you send them to a treatment facility to, again, fix the behavior so it doesn't continue into adulthood.
Because that's the other issue in these cases that have popped up across the country. The people who are behind deep fakes, non consensual pornography and the harassment are often the same age as their victims. As deep fake technology becomes ubiquitous, intuitive, and easier to use, the users become younger and younger, coming to these sites right when their desires and their concept of sexuality are starting to be hardwired,
and while they're exploring this part of themselves. Instead of grabbing a Playboy from underneath dad's bed or sneaking a glance at Skinemax, they can now make and trade pictures of people that they actually know without necessarily understanding the consequences, both psychologically and legally.
These kids can do it in the privacy of their bedroom.
No one knows that they're doing it, and that just creates this start down the line of deviancy, where now it's, okay, well, if I can look at that, then maybe I should try girls my age, which is what we see a lot, is that they're looking at pornographic images of kids their age, which for them doesn't seem wrong because it's their age, but it is still child pornography.
Of course. And I guess if they can't find pictures on the Internet of girls their age, now there are apps allowing them to create it themselves. Yes. At this point, it isn't a crime under federal law to create and share deep fake pornography, and the Supreme Court hasn't ruled on what can be deep faked legally and what can't be. There is a constitutional prerogative to protect free speech, but stopping non consensual deep fake pornography is one of the
few things that nearly everyone in Congress openly supports. In fact, in twenty twenty four, the Senate unanimously passed a bipartisan measure that would regulate deep fake pornography, making it a federal offense to make and distribute non consensual deep fake porn, and it would require social media companies to remove the images within forty eight hours of being reported. Companies like Meta, Google,
and TikTok all came out in support. The bill died in the House at the end of twenty twenty four, but its backers say it's a priority for them to reintroduce and pass it into law in the new session. And while states are trying to pass their own laws, many are written so that only people under the age
of eighteen can be considered victims of deep fake pornography. Still, other states, like New York, have now expanded existing revenge porn statutes to include deep fakes, but this can mean that a victim has to prove intent, that the person who made and posted the image actually wanted to do them harm. In Pennsylvania, law enforcement officials like Laurel are using child sexual abuse statutes to charge minors in non
consensual pornography cases. Late last year, Pennsylvania passed a bill expanding the definition of the quote unlawful dissemination of an intimate image to include deep fake pornography. So whatever legal protections do exist are patchwork, uneven and untested. But there are other officials going after the people profiting from this technology. If you could just start by introducing yourself, that would be perfect.
Sure. I'm David Chiu. I'm the City Attorney of San Francisco.
David Chiu had first heard about nudify apps through the women in his office, who were both lawyers and mothers of teenage girls. He said he realized how dangerous these websites and apps could be, and he decided to act.
We have brought a first of its kind lawsuit against the world's largest websites engaged in AI facilitated exploitation. We have sued sixteen of the most visited websites that create and distribute AI generated non consensual pornography.
Chiu's office filed the suit in August twenty twenty four, arguing the nudify sites not only violate federal and state criminal statutes against revenge pornography and child pornography, but also California's unfair competition law, which regulates how companies can advertise online.
Collectively, these websites have been visited over two hundred million times in the first six months of this year in twenty twenty four. These operators are profiting from this content. They require users to subscribe or pay to generate images, and they know exactly what they're doing. They know that
this is non consensual. These images are incredibly realistic. There's one website that specifically promoted the non consensual nature of this, and I quote, imagine wasting time taking her out on dates when you can just use website xxx to get her nudes. And so our lawsuit, among other things, is seeking to shut them down.
Have you heard from any of the companies named in the lawsuit?
It will be interesting to see if they scurry out of the darkest corners of the Internet to defend this.
He says his office hasn't heard from many of the companies named in the lawsuit, and his office might be waiting a while, because these sixteen websites and apps are being developed all over the world. One of these websites is Draw Nudes, the same one Olivia and I used to make a deep fake of me. You can find it easily, but finding out who runs the app, that's trickier. We tried following the money when it came to the
Draw Nudes app. We paid eight dollars for their premium service, which allowed us a window into how they collect the cash. Making a payment isn't as easy as typing in your credit card. So when I was trying to pay, it takes you off the app and you talk to these dealers. It's a live chat, and you'll say to them, here, you have a specific link that corresponds to your account,
and they'll say, okay, here's a PayPal link. Go to that PayPal account, give me the transaction ID, and then I will go and kind of put those coins that you get, the eight dollars worth of coins into your app.
Why are there so many steps to get coins to do this?
I think there's a couple of reasons. It could be that they are just trying to hide their tracks. If you are paying people to use their PayPal accounts, there's no link back to you, so there's no paper record. The other reason that you know that it's not just a simple use-your-credit-card thing is because a lot of companies, payment processing companies, are now banning these
kind of apps. The truth is, it's not really complicated for the consumer, but those extra steps make it harder to determine the identity and the finances of the developers. And that's exactly who we wanted to find. So we went looking for the guy whose name appeared to be connected to Draw Nudes to ask him about a lot of things, including the lawsuit. I was looking for a guy named Cyril. What I know about him is he appears to be in his early thirties and he's from Ukraine.
His hair is long enough to tuck behind both ears. He looks more like a DJ than a tech mogul. He has an active social media presence and appears to have a model girlfriend. He moved to the United States before the war, in twenty twenty one, and lives somewhere in the US. Cyril has several businesses. He's talked publicly about one of them, which acts as a sort of broker for internet ads. For a fee, Cyril's company helps businesses advertise on the web. Here he is at a conference in Thailand in December talking about marketing supplements. This one promises everything from weight loss to bigger muscles and better erections.
I run the agency and we help, like, we, and we helped some nutra guys, like, to scale or, like, to build the nutra offers or nutra white hat brands.
But it was another one of Cyril's companies that caught our attention, Sole Ecom, which was listed on the bottom of the Draw Nudes homepage and in the David Chiu lawsuit. Sole Ecom has a registered address in Los Angeles. I spent weeks trying to contact Cyril and anyone who might know him. That's when we tried knocking on the place listed as Sole Ecom's office. Hi, my name is Margie.
Are you Pilo? Oh?
Okay, so I'm a reporter from...
The person who answered said they'd never heard of his name or the business. But more than a month after I started looking for him, Cyril finally picked up the phone. We talked for maybe half an hour. During the conversation, Cyril was adamant that Sole Ecom and Draw Nudes were separate companies. He said that, like his other clients, the developers of the Draw Nudes app had come to him
for help marketing their product. He also said he had actually met the app's owners, two men, he says, one a Russian living in Cyprus who was big in the porn industry, the other a Belarusian who lived in the States. Cyril said that after reading several negative articles about the Draw Nudes app, he told the developers he didn't want to work with them anymore. Cyril told me that he, and Sole Ecom, had not received any form of payment
from Draw Nudes or its developers. We spoke several times, but after our last conversation, promising to speak again, we hung up and then he ghosted me. I emailed, rang, sent Telegram messages, booked time on his Calendly. All the while, I watched Cyril attend a tech conference in Thailand, travel through Italy, and spend Christmas on a luxury ski vacation in France. As Cyril was off avoiding me and living his best life, twenty twenty four came to a close,
and with it, the Draw Nudes app and website went offline. Meanwhile, I was looking at other nudify apps named in the California lawsuit, one of which is called Clothoff. It's even bigger than Draw Nudes, with almost twenty seven million visitors last year alone. As I clicked around the site and the app, I was struck by how similar Clothoff, Draw Nudes, and another site called the Undress app were. They all have the same interface, same graphics, same text, same design,
the same support assistant infrastructure. In fact, many of the top nudifying apps seemed to be shell websites that hyperlinked to each other, moving Internet traffic around to get people to pay up for a smaller group of services that all worked in exactly the same way. And when I was using Draw Nudes and had a question for customer service, up popped the Clothoff name and logo. Same customer service,
same company. I contacted customer service again to ask if Clothoff or Draw Nudes had been bought or sold recently. I was assured that the owners hadn't changed, but that the Clothoff and Draw Nudes websites had moved to a different domain for, quote, greater online recognition and to speed up its performance, which may very well be true. But another effect of switching domains is to make it really difficult to trace the owners and hold them legally accountable.
All signs point to one likely explanation. Clothoff, Draw Nudes, and the Undress app, all sites named in the California suit, may be the work of the same people. We attempted to contact the sites via their service email addresses, but got no response. Again, a handful of people may be behind three of the most widely used nudify apps in the country, and the impact of their product is at best terrorizing young women and girls and at worst changing young men and women. It's changing how they relate
to the world and each other. While Apple and Google have banned a number of nudify products, apps, and websites from their app stores, their sites can readily be found on Google or Safari. Developers keep putting them out, and anyone, anyone, can use them. If this lawsuit fails to stop the influx of these apps, David Chiu from California says there are other ways to defang them.
There is a universe of other technology players and technology infrastructure in and around these websites.
He's talking about Internet service providers, search engines, login authenticators, or payment processors. It sounds boring, but this is how tech works: software working in tandem with online businesses to make even the most basic sites run. After being called out for their relationships with the nudify apps, some of these tech companies have already said they're going to stop working with them.
But there are also richly capitalized tech companies like Stability AI and Midjourney that are creating powerful generative AI tools. These companies say they've put guardrails in place to prevent anyone from using the tech to create deep fake porn, but there are ways to get around them, by using older open source versions of the tech or using special prompts to circumvent the word filters. Stability AI declined to be interviewed and we didn't hear back from Midjourney.
And nudify apps aren't being pushed into the shadows. Giants like Meta are allowing these companies to buy ads and market to consumers on their massive platforms. By some reports, nearly ninety percent of traffic to certain nudify apps is driven by ads on Facebook or Instagram. We emailed their owner, Meta, which told us these apps and these ads
break their rules. They remove the ads when they become aware of them, disable the accounts responsible, and block links to the host websites, but the spokesperson noted this is an ongoing battle. The people behind these apps are constantly evolving their tactics to stay ahead of the game. If blocked, they just pop back up again under a new domain name.
Our hope is, with more public attention to what's happening here, as well as letting people know about the lawsuit that we're involved in, that other players will step up to do their part to help all of us crack down on these bad actors and their use of AI exploitative technologies to abuse real people.
Artificial intelligence is part of our reality now. It's everywhere, and it touches every single part of our lives. But just because it exists, it doesn't mean it can't be controlled. This isn't science fiction. Robots aren't taking over, at least not yet.
I have to say one of the personal reasons why I am so motivated for our team to get to the bottom of this is I have an eight year old son, and I'm worried in a few years that if we don't get a handle on deep fake pornography, that his generation is going to be impacted in ways
that we cannot anticipate today. Not just the impact on his female classmates, but what this will do to boys and teenagers as they grow up, learning about different norms in how to engage with their classmates, their friends, people they're going to have relationships with. This is horrifying and unfortunately, this AI facilitated exploitation, I'm worried has opened up a Pandora's box and if we don't act quickly, it could warp an entire generation of young people.
The Levittown case showed us that the stakes are high, much higher than many of us realize. While their story was one of the first, it took us beyond deep fakes and tributing websites, beyond the hurt and fear experienced by a couple dozen school kids. Even now, it stands as a warning.
I'd hope that after this case, and his conviction, something's changed in the way that they approach these kinds of oversteppings of moral boundaries.
You know. I asked Cecilia, one of the Levittown women, what she thought may help other people who were experiencing what she did.
I'd hope that maybe after this we have more of an idea of what to do or how to approach it when someone does use AI or technology in a way to harm others. You know, it was one of the worst things I've ever been through, and not everyone is either healed enough or lucky enough to have an outlet to talk about that. But man, I really hope that something can be done in terms of regulation so that other people don't have to experience any aspect of how it feels to have that done to you.
All the women we spoke to agreed to share these stories because they recognized that there are still very few safeguards in place to protect them or others, even now. Patrick Carey was released from prison early for good behavior in August twenty twenty three. He spent four months behind bars. Cecilia found out the hard way. She drove past him walking down the street one day and nearly had a panic attack. Kayla has also seen Patrick just around the
corner from her house. She has become, in a way, a de facto spokesperson, a role she may not have asked for, but because of her experience, she says, she can't stay silent. How do you feel that there are still no federal laws preventing this today?
It's very frustrating, very frustrating. I don't understand any of it. I really don't understand how hard it is for them to not see that. There's plenty of cases you can pull up and be like, this, this is this. Like, how could you not make a change there? A change needs to be made. It's unfortunate that we have to go through this, but that doesn't mean you can't stop it. There's something you can do. I mean, I don't know how hard it is, but think about how hard it
is for me to speak out on this. I mean, I shouldn't have to do this, but I am because it's worth it, because it's worth getting the word out there, because something needs to change.
This series is reported and hosted by Olivia Carville and me, Margie Murphy. Produced by Kaleidoscope, led by Julia Nutta, edited by Netta to Luis Semnani. Producing by Darren Luckpott. Executive produced by Kate Osby. Original composition and mixing by Steve Bone. Fact checking by Lena Chang. Our Bloomberg editors are Caitlin Kenney and Jeff Grocott. Some additional reporting by Samantha Stewart. Sage Bauman is Bloomberg's executive producer and head
of Podcasting. Kristin Powers is our senior executive editor. Levittown archive clips provided by Screen Ocean. Clips and footage from iHeart. Our executive producers are Tyler Klang and Nikki Ettore. Special thanks to Robert Friedman, Andrew Martin, Gilda Decale, Andrew Ressio, Victor Evez, and Sean Wenn at Bloomberg. Thanks also to Alex Sarmas, Henry Ida, Matt Torre, Paul Manton, and the Levittown Historical Society, Nassau County District Attorney Anne Donnelly, and Nicole Turso. And thanks to all the young women and parents in Levittown who spoke to us as we reported the story.