Search Engine is sponsored by Vuori. Vuori is a new perspective on performance apparel. It's perfect if you're sick and tired of traditional old workout gear. Everything is designed to work out in, but it doesn't feel or look like it. It's extremely comfortable. You'll want to wear it all the time. I promise you it is more comfortable than whatever you're wearing right now. The product is incredibly versatile. It can be used for just about any activity: running, training, swimming, yoga. But it is also great for my favorite form of exercise, which is lounging on a sofa.
They're also offsetting their carbon footprint and using better, sustainable materials for their products to empower your best active life. Vuori is an investment in your happiness. For our listeners, Vuori is offering 20% off your first purchase. Get yourself some of the most comfortable and versatile clothing on the planet at Vuori.com slash PJ Search. That's Vuori, V-U-O-R-I, dot com slash PJ Search. Not only will you receive 20% off your first purchase,
but you'll also enjoy free shipping on any US orders over 75 bucks and free returns. Go to Vuori.com slash PJ Search and discover the versatility of Vuori clothing. Not all search engines are good. Some are morally dubious. A couple months ago, I found myself using a morally dubious search engine. This search engine lives on a website. I'm not going to tell you the name of it, for reasons that will become clear. But I am going to describe to you how it works.
So you open the site, you upload a photo of a person's face, I can upload a photo of myself right now, wait about 30 seconds for the search to run, and then I get taken to this page, which includes all these different photos of my face from all these different places on the internet. I can click on any of the photos and it'll take me to the site where the photo lives. This is Shazam for faces.
If I put in a stranger's face, I'll almost always get their real name because it'll take me to one of their social media profiles. From there, I can typically use a different search engine to find a physical address, often a phone number. On the site itself, I also usually see stills from any videos the person has appeared in. When I first learned about this site, I did what you do when you get Google for the first time.
I looked myself up and then I started looking at my friends and it took about 30 seconds before I saw things that made me pretty uncomfortable. I was just seeing stuff I should not be seeing. I don't know the most delicate way to say this except people I knew had compromising stuff on the internet. Stuff they had put there, but not under their real names. And I don't think they knew, I certainly hadn't known, that the technology to search someone's face was right around the corner.
I decided to stop using the search engine. The line between general internet curiosity and stalking, this felt like the wrong side of it. It felt seedy. But now, even just knowing this tool existed changed how I thought in the real world. I found myself trying to reach for it, the way any digital tool that works begins to feel like another limb.
I found a driver's license on the floor of a dance club. The person had a name too common to Google, like Jane Smith. But I realized, let me just find their face with the search engine. Another night, two people at a restaurant were talking. One of them, the guy, was telling what sounded like a very personal story about the vice president of America. Who was this guy?
I realized, if I snapped a photo of him, I now have the ability to know. We take for granted the idea that we have a degree of privacy in public, that we are mostly anonymous to the strangers we pass. I realized, this just wasn't true anymore. Right now, there are a lot of discussions about AI chatbots, about the ethics and problems of a very powerful new technology.
I feel like we should also be talking about this technology, these search engines, because my feeling using one was, we are not at all ready for this. This thing that is already here, and I wanted to know, is it too late? Is there a way to stop these tools or limit them? And I especially wanted to know, who unleashed this on us? So I call the person you call when you have questions like this. Can you introduce yourself?
Sure, I'm Kashmir Hill. I am a technology reporter at the New York Times. And I've been writing about privacy for more than a decade now. Kashmir is one of the best privacy and technology reporters in America. She published a book a few months ago about these search engines, and about the very strange story of how she discovered that they even existed.
It's called, appropriately, Your Face Belongs to Us. Her reporting follows a company called Clearview AI, which is not the search engine I was referencing before. Clearview AI is actually much more powerful, and not available to the public. But in many ways, Clearview created the blueprint for copycats, like the one I had found. Kashmir told me the story of when she learned of Clearview AI's existence, back when the company was still in deep stealth mode.
So I heard about Clearview AI. It was November 2019. I was in Switzerland doing this kind of fellowship there. And I got an email from a guy named Freddie Martinez, who worked for a nonprofit called Open the Government. And he does public records research. And he's obsessed with privacy and security, as I am. And I had known him for years. And he sent me this email saying, I found out about this company that's crossed the Rubicon on facial recognition. That's how he put it.
He said he'd gotten this public records response from the Atlanta Police Department describing this facial recognition technology they were using. And he said, it's not like anything I've seen before. They're selling our Facebook photos to the cops. And he had attached the PDF he got from the Atlanta Police Department. It was 26 pages. And when I opened it up, the first page was labeled privileged and confidential.
And it was this memo written by Paul Clement, whose name I recognized because he's kind of a big-deal lawyer, was solicitor general under George W. Bush, now in private practice. And he was talking about the legal implications of Clearview AI. And he's describing it as this company that has scraped billions of photos from the public web, including social media sites.
And that it offers a facial recognition app where you take a photo of somebody and it returns all the other places on the internet where their photo appears. And he said, we've used it at our firm. It returns fast and reliable results. It works with something like 99% accuracy. There are hundreds of law enforcement agencies that are already using it.
And he'd written this memo to reassure any police who wanted to use it that they wouldn't be breaking federal or state privacy laws by doing so. And then there was a brochure for Clearview that said, stop searching, start solving. And that it was a Google for faces. And as I'm reading it, I'm just like, wow, how have I never heard of this company before?
Why is this company doing this and not Google or Facebook? And does this actually work? Is this real? Because it is violating things that I had been hearing from the tech industry for years now about what should be done about facial recognition technology.
I flashed back to this workshop I'd gone to in DC, organized by the Federal Trade Commission, which is kind of our de facto privacy regulator in the United States. And they had a bunch of what we call stakeholders there. Google was there, Facebook was there, little startups, privacy advocates, civil society organizations, academics. Good morning, and I want to welcome all of you, both here in Washington DC and those watching online, to today's workshop on facial recognition technology.
This workshop that Kashmir remembers, it happened in 2011. It was called Face Facts. The video of the workshop on the FTC's website shows a string of speakers presenting at a podium in front of a limp-looking American flag. People focus on the commercial use, that is, on the possibilities that these technologies open up for consumers, as well as their potential threats to privacy.
Most of you know this, but, as the FTC, we're going to have to talk about the nitty-gritty of facial recognition technology. What safeguards need to be put in place around this technology that's rapidly becoming more powerful? Everyone in the room had different ideas about what we should be doing. Google and Facebook were just tagging friends in photos. And there were some people there saying we need to kind of ban this.
But there was one thing that everybody in the room agreed on, and that was that nobody should build a facial recognition app that you could use on strangers to identify them. Since day one, we asked ourselves, how do we avoid the one use case that everybody fears, which is to go and identify people? That's the CEO of a facial recognition company that would soon be acquired by Facebook. He was saying they had to prevent the use case no one wanted: Shazam for faces.
So the input into our system is both the photos and the people that you want to have identified, and we'll give you back the answer. So, in fact, you can never identify people you do not know. That's our mantra, right? This is the one thing that we wanted to make sure doesn't happen.
And so now I'm looking at this memo that says that has happened. And so, yeah, I was very shocked. And I told Freddie, I'm definitely going to look into this as soon as I fly back to the United States, and that's what I did. So, at this point, in late 2019, here's what Kashmir knows about Clearview AI. It's supposedly a very powerful technology that has scraped billions of photos from the public web, and it's being used by the Atlanta Police Department.
She doesn't know who's behind the company, but she has ideas about how to find them. She starts calling their clients. And so, I FOIA'd the Atlanta Police Department. They never responded. Other FOIAs were starting to come in that showed other departments using Clearview, and I just did a kind of Google-dorking thing where I searched for Clearview and then site:.gov to see if it showed up on budgets. Oh, that's really smart.
And so, I started seeing Clearview, and it was really tiny amounts, like $2,000, $6,000, but it was appearing on budgets around the country. And so, I would reach out to those police departments and say, hey, I'm looking into Clearview AI. I saw that you're paying for it. Would you talk to me?
And eventually, the first people to call me back were the Gainesville Police Department, a detective there named Nick Ferrara. He's a financial crimes detective, and he calls me up on my phone. He said, oh, hey, I heard that you're working on a story about Clearview AI. I'd be happy to talk to you about it. It's a great tool. It's amazing. And he said he would be a spokesperson for the company.
So, he just loved it. He's just like, this is great. He loved it. He said he had a stack of unsolved cases on his desk where he had a photo of the person he was looking for, like a fraudster. And he'd run it through the state facial recognition system and not gotten anything. And he said, he ran it through Clearview AI, and he got hit after hit.
And he just said it was this really powerful tool. It worked like no facial recognition he'd used before. The person could be wearing a hat, glasses looking away from the camera. And he was still getting these results.
And this is sort of the positive case for any of this, which is that if a dangerous person who, like, has committed violent crimes is out in the world, and there's some photo of them where maybe they were like robbing a bank. And their mouth was covered, and there's a hat over their head. And if a cop can take that surveillance still, plug it into a big machine and find this person's name, we live in a safer world.
Right. This is the ideal use case: solving crimes, finding people who committed crimes, bringing them to justice. And so Nick Ferrara, the detective, said, yeah, it works incredibly well. And I said, well, I'd love to see what the results look like. I've never kind of seen a search like this before. And he said, well, I can't send you something from one of my investigations. But why don't you send me your photo, and I'll run you through Clearview, and I'll send you the results. So I do that. I send some photos of myself. How did you pick the photos?
I tried to choose hard photos. So I had one where, like, my eyes were closed, one where I was wearing a hat and sunglasses. And another that was kind of like an easy photo, in case those other two didn't work. And then I waited to hear how it went and see for myself how well the software works. And Nick Ferrara ghosts me. He just totally disappears.
Disappears, won't pick up when I call him, doesn't respond to my email. Kashmir says she tried this again with a different police officer in a different department. And the same thing happened. They were friendly at first. Kashmir asked them to run a search on her face. They agreed. And then they were gone.
And so eventually I kind of recruited a detective in Texas, a police detective who was a friend of a friend at the Times, and said, oh, you're looking into this company? I'm happy to download the tool, tell you what it's like. And so he requests a trial of Clearview. And at this point, Clearview was just giving out free trials to any police officer, as long as they had an email address associated with their department.
And so it's like what Facebook did when they first opened up with college campuses. Exactly. It was exclusive, just for government workers. And so he goes to their website where he can request a trial, and within 30 minutes, he's got Clearview on his phone. And he starts testing it, running it on some suspects whose identity he knows, and it works. He tried it on himself.
And he had kind of purposely not put a lot of photos of himself online, because he was worried about exposure and people coming after him, people he had been involved in catching and sending to jail. And it worked for him. It found this photo of him on Twitter where he was in the background of someone else's photo.
And he had been on patrol, wearing a uniform that actually had his name tag on it. So it would have been a way to get from his face to his name. And he immediately thought, wow, this is so powerful for investigators, but it's going to be a huge problem for undercover officers. If they have any photos online, it's going to be a way to figure out who they are. Yeah.
And so I told him about my experience with other officers running my photo, and he ran my photo. And there weren't any results, which was weird, because I have a lot of photos online. It just came up like nothing. Nothing. And then within minutes, he gets a call from an unknown number. And when he picks up, the person says, this is Marco with Clearview AI tech support, and we have some questions about a search that you just did. Oh my god.
And he says, why are you running photos of a lady from the New York Times? And the detective kind of plays it cool. And he's like, oh, I'm just testing out the software. How would I know somebody in New York? I'm in Texas. And anyways, his account gets suspended. Oh wow. And this is how I realize that even though Clearview is not talking to me, they have put an alert on my face. And every time an officer has run my photo, they've gotten a call from Clearview, telling them not to talk to me.
Just to spell out what Kashmir believed was going on here, these police officers may have thought they were using a normal search engine like Google. But what they hadn't counted on was that someone on the other end of that search engine seemed to be watching their searches, surveilling the cops who were using the surveillance technology.
It was a moment where Kashmir saw clearly how this mysterious company, by being the first to build the tool no one else would, had granted itself immense power: to monitor Kashmir, to monitor these cops. This company, whose product would reduce the average American's privacy, was keeping quite a lot of privacy for itself.
Of course, Kashmir is, fortunately for us, a nosy reporter, so all this cloak-and-dagger behavior just made her more curious. She tries to crack into the company a bunch of different ways. She's reaching out to anybody online who might have links to the company. She finds an address listed on Clearview AI's website. It's in Manhattan. But when she goes there in person, there's no such address. The building itself doesn't exist. It's a real Harry Potter moment.
Finally, she tries something that does work. On the website PitchBook, she can see two of Clearview AI's investors: Peter Thiel, no luck there, but also an investment firm based in New York. They're north of the city, and they weren't responding to emails or phone calls, so I got on the Metro-North and went up to their office, to see if they had a real office.
It was an adventure being there. The office was empty. All their neighbors said they never came in. I hung out in the hallway for about an hour. A FedEx guy came. He dropped off a box. He says, oh, they're never here. I thought, oh, my gosh, this is a waste of a trip. But then I'm walking out of the building. It was on the second floor. I'm coming down the stairs. These two guys walk in.
They were wearing lavender and pink. They just looked moneyed. They stood out. I said, oh, are you with Kirenaga Partners? Which is the name of this investment firm. They look up and they smile at me. They said, yeah, we are. Who are you? I said, I'm Kashmir Hill, the New York Times reporter who's been trying to get in touch with you. Their faces just fall.
I said, I want to talk to you about Clearview AI. They said, well, Clearview AI's lawyer said that we're not supposed to talk to you. I was around seven months pregnant at this time. I opened my jacket and just clearly displayed my enormous belly. I was like, oh, I've come so far. It was cold. It was raining out. David Scalzo, who's the main guy, the main investor in Clearview at Kirenaga, he says, okay.
So Kashmir and the two investors go inside the office. Kashmir tells them all this not-talking is making Clearview AI look pretty nefarious. She has a point. And so one of them agrees to go on the record and starts talking about his vision for the company that he has invested in. David Scalzo said, right now they're just selling this to, kind of, like, retailers and police departments. But our hope is that one day everybody has access to Clearview.
And the same way that you Google someone, you'll Clearview their face and be able to see all the photos of them online. He says, yeah, I think we think this company is going to be huge. And now they give Kashmir the information that she'd really been looking for, the names of the people who are actually responsible for this tool.
And they said, oh, yeah, we're really excited about the founders. And they say it's this guy Richard Schwartz, who's kind of a media-politics guy, worked for Rudy Giuliani when he was mayor. And then there's this tech genius, real mastermind, young guy, and his name's Hoan Ton-That. And I, we're in a conference room, so I'm like, can you write that out on a whiteboard for me?
And so we write it out: Hoan Ton-That. And this is the first time I figure out who the people are behind this. After the break, the story of how Hoan Ton-That and his engineers got your face and my face and three billion photos' worth of faces into this powerful new search engine.
Search Engine is brought to you by June's Journey. June's Journey is a mobile game you can download on your smartphone. Everybody loves a good family mystery, especially one with as many twists and turns as June's Journey. Step into the role of June Parker and search for hidden clues to uncover the mystery of her sister's murder. Engage your observation skills to quickly uncover key pieces of information that lead to new chapters of mystery, danger, and romance.
Where will each new chapter take you? June's Journey is a hidden object mystery game with a captivating detective story, taking you back to the glamour of the 1920s with a diverse cast of characters. Each new scene takes you further through a thrilling murder mystery story that sets the main protagonist, June Parker, on a quest to solve the murder of her sister and uncover her family's many secrets.
I have never met an iPhone game that I don't like to play instead of thinking. Try June's Journey. You can download it for free on iOS or Android. Discover your inner detective when you download June's Journey for free today on iOS and Android. Search Engine is brought to you by Seed probiotics. Small actions can have big benefits, like how taking care of your gut can support whole-body health. Seed's DS-01 Daily Synbiotic benefits your gut, skin, and heart health in just two little capsules a day.
My relationship with my body is a bit of a nightmare. Probiotics can help with things that are important to me like digestion and skin health. Your body is an ecosystem and great health starts in the gut. Your gut is a central hub for various pathways through the body. And a healthy gut microbiome means benefits for digestion, skin health, heart health, your immune system, and more.
Probiotics and prebiotics work best when used consistently, like other routine health habits. Seed's subscription service easily builds DS-01 into your routine, with no refrigeration needed. Trust your gut with Seed's DS-01 Daily Synbiotic. Go to seed.com slash search and use code 25SEARCH to get 25% off your first month. That's 25% off your first month of Seed's DS-01 Daily Synbiotic at seed.com slash search. Code 25SEARCH.
Now that Kashmir had a name to search, Hoan Ton-That, she learned Hoan had an internet trail. This guitar you're hearing? Part of the trail. Hoan had a habit, in one chapter of his life, of posting YouTube videos of himself pretty capably playing guitar solos. In the videos he doesn't speak, but you see him: tall and slender, with long black hair. Fashionable. Hoan has Vietnamese roots, was raised in Australia. He moved to San Francisco in 2007.
His internet breadcrumbs suggest a strange collage of a person. A Bay Area tech guy who presents in a slightly gender-fluid way, has photos from Burning Man. But then also seems like a bit of a troll. In a Twitter bio, he claims to be, quote, an anarcho-transsexual Afro-Chicano American feminist studies major. What is clear is that Hoan had come to America with big dreams of getting rich on the internet.
He started in the FarmVille era of Facebook apps, when you could make money building the right stupid thing online. Nothing he tried really took off, though. Not apps like Friend Quiz or Romantic Gifts. Not later efforts, like an app that took an image of you and Photoshopped Trump's hair onto your head. In 2016, Hoan would move to New York. At some point he'd delete most of his social media. But Kashmir found an old artifact of who he was on the internet back then.
I found an archived page of his from Twitter on the Wayback Machine. And it was mostly him kind of retweeting, like, Breitbart reporters and kind of saying, why are all the big cities so liberal? He doesn't have a Twitter account. He doesn't have a Facebook account. It seemed like, wow, this is weird. This guy is in his 20s, I think.
But he doesn't have a social media presence beyond a Spotify account with some songs that he apparently had done. It was a strange portrait, but I came away thinking, wow, this person is a character. Yeah. And I really want to meet him. At this point, it seemed like the company understood that Kashmir Hill was not going to go away. A crisis communications consultant reached out and eventually offered to arrange an interview with Hoan Ton-That.
When I met him, he was not what I expected him to be. He still had the long black hair, but now he had these glasses that felt very, like, office worker glasses. And he was wearing this blue suit. And he just looked like a security startup CEO. Okay. Which, again, just wasn't what I expected based on everything else I'd seen about him. We met at a WeWork, because they didn't have an office. I would find out that he mostly kind of worked remotely, did a lot of the building of Clearview AI.
At the time he lived in the East Village and kind of just did it in cafes, like, places with free Wi-Fi. So they booked a room at a WeWork for our meeting. The crisis communications consultant was there. She'd brought cookies.
Chocolate chip. And I feel like they were Milanos or, like, Sausalito cookies. I can't remember the brand. But yeah, we had lattes at the WeWork cafe and we sat down. And I just start asking my questions. And for the most part, he's answering them. And we had a couple of hours to talk. And he really was telling me a lot. And so it was this complete 180.
In person, he's very charismatic, very open. Now, he would be evasive about some things, wouldn't describe anyone else involved in the company besides Richard Schwartz, his co-founder. But yeah, I mean, he was open, and I was like, you have built this astounding technology. Like, how did you do this? How did you go from what you're telling me about Facebook apps and iPhone games to building this?
He said, well, I was standing on the shoulders of giants. And he said, there's been this real revolution in AI and neural networks, and a lot of research that kind of the most brilliant minds in the world have done, they've open-sourced. They've put it on the internet. Hoan told Kashmir that in 2016, in the early days of building what would become the Clearview AI facial search engine, he taught himself the rudiments of AI-assisted facial recognition by, essentially, just googling them.
He'd gone on GitHub and typed in face recognition. He read papers by experts in the field. He told her, quote, it's going to sound like I Googled flying car and then found instructions on it. Which wasn't too far off. Until pretty recently, facial recognition existed, but it was somewhat crude. What Hoan was learning on the internet was that machine learning neural networks had just changed all that.
Now computers could teach themselves to recognize a face, even at an odd angle, even with a beard, provided that the computer was given enough images of faces, training data to learn on. We reached out to Clearview AI for this story. We didn't get a response. But in the years since his interview with Kashmir, Hoan has done plenty of interviews with the press. One thing I do respect is the fact that you decided to come here live for an interview, so I appreciate you for taking the time.
And thanks, Pat, for having me on. Here's one with the YouTube show Valuetainment. Hoan's dressed as he was with Kashmir, in a suit, looking again like a standard tech exec, just with unusually long hair. Here he describes what his research process for this search engine was like. I was looking at the advances in AI. So I saw ImageNet, which is a competition for recognizing things in images. Is this a computer? Is this a plant? Is this a dog?
Yeah. And the results got really good. And then I looked at facial recognition, and we read papers. So Google and Facebook both had deep learning papers on facial recognition. I said, hey, can I get this working on my computer? And we ended up getting it working, and what we realized was, getting more data to train the algorithm, to make it accurate across all ethnicities, Iranian people, Black people, white people, brown people, that was really key to improving performance.
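The mechanics Hoan is gesturing at here can be made concrete. The kind of network described in those papers maps each face photo to a vector of numbers, an embedding, trained so that two photos of the same person land close together. Below is a toy numpy sketch of just the comparison step; the four-dimensional embeddings and the 0.4 threshold are invented for illustration (real networks emit hundreds of dimensions and tune the threshold on data). This is a sketch of the general idea, not Clearview's actual code.

```python
import numpy as np

def cosine_distance(a, b):
    """Distance between two face embeddings; smaller means more similar."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    return 1.0 - np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

def same_person(emb1, emb2, threshold=0.4):
    """Decide whether two embeddings belong to the same face.
    The threshold here is illustrative, not a tuned value."""
    return cosine_distance(emb1, emb2) < threshold

# Toy embeddings standing in for a network's output.
alice_photo_1 = [0.9, 0.1, 0.3, 0.2]
alice_photo_2 = [0.85, 0.15, 0.32, 0.18]   # same person, slightly different photo
bob_photo     = [0.1, 0.9, 0.2, 0.7]       # different person

print(same_person(alice_photo_1, alice_photo_2))  # similar vectors -> True
print(same_person(alice_photo_1, bob_photo))      # dissimilar -> False
```

The "more data" point Hoan makes is about training the network that produces these vectors in the first place: the more varied the faces it learns from, the more reliably different photos of one person collapse to nearby points.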
This would be Hoan's real innovation, a somewhat dark one. His advantage was how he would find the training data he needed. He built a scraper, a tool that would take, without asking, photos of human faces pretty much anywhere on the public internet they could be nabbed. He also hired people who built their own scrapers to hoover up even more photos.
He said part of our magic here is that we collected so many photos and he built the scraper. He hired people from around the world to help him collect photos. And so it's similar to like when people talk about like large language models right now and companies like OpenAI, some of what they're doing is tuning their neural networks.
But a lot of what they're doing is, like, feeding their neural networks. It's like they have to find every text that's ever been published in every library, and then they run out of all the library texts, and they have to find, like, transcripts of YouTube videos, which, like, maybe they shouldn't be loading in there. But it's part of what he had done correctly to get his product ahead of where the other ones were. It's just, like, he was not a genius at making the underlying AI.
That was mostly open source. He was passionate about finding faces on the internet to put into it. Yes. And so where was he looking? Oh, man. So the first place he got faces was Venmo. This is funny to me, because as a privacy journalist, I remembered people being upset at Venmo's approach to privacy, which at the time was: if you signed up for Venmo, your account was public by default. Yeah. And your profile photo was public by default.
And so he built this scraper, you know, this bot that would just visit Venmo.com every few seconds. And Venmo at the time had a real-time feed of all the transactions that were happening on the network. And so he would just hit Venmo every few seconds and download the profile photo, the link to the Venmo profile. And he got just millions of photos this way, from Venmo alone. And this is essentially what he was doing, but with, I mean, thousands, millions of sites on the internet.
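The bot Kashmir describes is, in shape, a simple polling scraper: hit a public feed every few seconds, pull out the photo links, keep what you haven't seen before. A minimal sketch of that shape in Python; the feed URL and JSON field names here are invented for illustration, not Venmo's actual (long since closed) public feed.

```python
import json
import time
import urllib.request

# Hypothetical endpoint -- stands in for whatever public feed a scraper targets.
FEED_URL = "https://example.com/api/public_feed"

def extract_profiles(feed):
    """Pull (profile_url, photo_url) pairs out of one feed response.
    The 'transactions' / 'photo_url' field names are invented for this sketch."""
    return [(item["profile_url"], item["photo_url"])
            for item in feed.get("transactions", [])
            if "photo_url" in item]

def poll_forever(interval_seconds=5):
    """The basic scraper loop: fetch the feed, record any new photo links,
    sleep a few seconds, repeat indefinitely."""
    seen = set()
    while True:
        with urllib.request.urlopen(FEED_URL) as resp:
            feed = json.load(resp)
        for profile_url, photo_url in extract_profiles(feed):
            if photo_url not in seen:
                seen.add(photo_url)
                print("new photo:", profile_url, photo_url)
        time.sleep(interval_seconds)
```

Because the feed updated in real time, simply re-fetching it on a short interval and deduplicating was enough to accumulate millions of photos; the same loop pointed at thousands of sites is, structurally, the whole collection operation.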
Facebook, LinkedIn, Instagram, employment sites. Yeah, just anywhere they could think of where there might be photos. I just want to butt in here to say: all of this is completely astonishing. I knew people at the dawn of social media who just didn't want to join Facebook, or didn't understand why you would voluntarily offer your private life to the public.
But I don't think anyone, or at least anyone I knew, had an imagination sufficiently tuned to dystopia to know that if you had the brazenness to upload your photo to Venmo or to LinkedIn, you could one day be feeding your face into a machine. A machine that today, if you go to a protest, is capable of using a photo of your face to find your name, your email, your employer, even your physical address. Who knew this was the future we were fumbling our way towards? I asked Kashmir about all this.
I used Venmo, I used Facebook. I'm fairly sure that when I signed up, I signed terms of service I did not read very carefully. I don't think there was a section in there, like, okaying that my face could be used in photo scrapings. So, was what he did legal? So, Venmo and Facebook both sent Clearview AI cease-and-desist letters saying stop scraping our sites, and erase the photos that you took, delete the photos that you took.
But they never sued. So this hasn't been tested in court, whether it's illegal or not. So it's still a bit of a gray area, and it hasn't been tested with Clearview, because none of these companies have sued them. In one interview, shot just a month after his conversation with Kashmir, Hoan sat down with a CNN reporter who asked about this, the legality of his project.
Is everything you're doing legal? Yes, it is. So we've gone through and have some of the best legal counsel, from Paul Clement, who used to be the Solicitor General of the United States. He's done over 100 cases in front of the Supreme Court. And he did a study independently, saying the way it's used is in compliance with the Fourth Amendment.
All the information we get is publicly available, and we have a First Amendment right to have public information on the internet. And you have to understand what it's also being used for. We're not just taking your information and selling ads with it, or trying to get more page views. We're actually helping solve crimes with this. So. So your counsel is making the argument that there's a First Amendment right to information that is publicly on the internet?
Yes. And so if you take something like Google: Google, you know, crawls the internet, collects all these web pages, and you search it with keyword terms. We're the same thing. We take all this information on public web pages, but search it with a face. Juan's making a pretty radical argument here, even though his tone doesn't suggest it. He's saying that someone being able to take your face and use it to run a search that will pull up your name, possibly your address and more, is nothing new.
It's just like Google. His point is that Google collects every instance of your name on the internet. Clearview AI is just doing that with your face, and, you know, attaching it to your name. Whether you agree with this idea or not, it has happened, and it has fundamentally changed how privacy works. Kashmir says that most of us are just not prepared for this brave new world. I just don't think that most people anticipated that the whole internet was going to be reorganized around your face.
Right. So a lot of people haven't been that careful about the kind of photos they're in, or the kind of photos they've put up of themselves, or the kind of photos they've allowed to be put up of themselves. And Juan actually did a Clearview search of me there, and I said, oh, well, last time this happened there were no results for me. And he said, oh, there must have been some kind of bug. He wouldn't admit that they had put this alert on my face.
They had changed my results. But he ran my face, and there were just tons of photos. Lots of photos I knew about. But in one case, there was a photo of me at an event with a source, and I was like, wow, I hadn't thought this through: now, if I'm out in public with a sensitive source and somebody takes a photo and posts that on the internet, that could be a way of exposing who my sources are.
And it was really stunning how powerful it was. I mean, for me, there were dozens if not hundreds of photos. Me, kind of in the background of other people's photos. I remember, I used to live in Washington, D.C., and there were photos of me at the Black Cat, which is a concert venue there, just in the crowd at the show. It was incredibly powerful.
And I remember asking him, I was like, you've taken this software somewhere no one has before. You've created this really powerful tool that can identify anybody, you know, find all these photos of them. You're just selling it to law enforcement. But now that you've built this, and you've described to me the accessibility of building this, there are going to be copycats. And this is going to change anonymity, privacy as we know it. What do you think about that?
And I remember he kind of was silent for a little bit. And he said, that's a really good question. I'll have to think about it. And it was just this stunning moment of seeing, in action, people who are making these really powerful technologies and really just are not thinking about the implications, who are just thinking, how do I build this? How do I sell this? How do I make this a success?
Since Juan's interview with Kashmir, it seems like maybe he's had more time to think through better answers to hard questions. We've watched a lot of these subsequent interviews. What you notice is that now, he'll say that as long as he's CEO, he'll make sure his tool is only ever in the hands of law enforcement and in some cases banks. And he'll point again and again to the one strong reason why Clearview AI does need to exist.
Without Clearview AI, there are so many cases of child molesters that would have never been caught, or children who wouldn't have been saved. A child predator will be extorting your children online and you won't even know about it. Extortion. Child abuse. Child crimes. Crimes against children. Dark web. Troves and troves of children's faces. These are kids that would have been identified.
This is why our customers are very passionate about keeping the technology and making sure it's used properly. It's hard to take the other side of that argument. But of course Clearview AI is not just being marketed as an anti-child predator tool. A Clearview AI investor told Kashmir he hoped one day it would be in the hands of regular people. And potential investors in the company were given the tool to use on their phones, like just to use as a party trick.
Will Clearview AI actually, ultimately, roll this tool out for wide use? Well, it sort of doesn't matter whether they do or not. Because remember, the copycats already have. After the break, how people are using and abusing this technology right now. Search Engine is brought to you by NetSuite. Okay, quick math. The less your business spends on operations, on multiple systems, on delivering your product or service, the more margin you have and the more money you keep.
But with higher expenses on materials, employees, distribution, and borrowing, everything costs more. So to reduce costs and headaches, smart businesses are graduating to NetSuite by Oracle. NetSuite is the number one cloud financial system, bringing accounting, financial management, inventory, HR into one platform and one source of truth. With NetSuite, you reduce IT costs because NetSuite lives in the cloud with no hardware required. Access from anywhere.
You cut the cost of maintaining multiple systems because you got one unified business management suite. And you're improving efficiency by bringing all your major business processes into one platform, slashing manual tasks and errors. Over 37,000 companies have already made the move. So do the math. See how you'll profit with NetSuite. Back by popular demand, NetSuite has extended its one of a kind flexible financing program for a few more weeks.
Head to NetSuite.com slash PJ. That's NetSuite.com slash PJ. Search Engine is brought to you by Squarespace. Squarespace is the all-in-one website platform for entrepreneurs to stand out and succeed online. Whether you're just starting out or managing a growing brand, Squarespace makes it easy to create a beautiful website, engage with your audience, and sell anything from products to content to time. All in one place, all on your terms.
Despite reporting a lot of stories about the internet, I find websites hard to make. Squarespace offers Blueprint AI and SEO tools. You can start a completely personalized website with the new Guided Design System, Squarespace Blueprint. They also have flexible payments. You can make checkout seamless for your customers with simple but powerful payment tools.
Squarespace also offers the Fluid Engine, the next-generation website editor from Squarespace. It's never been easier for anyone to unlock unbreakable creativity. Go to squarespace.com slash engine to save 10% off your first purchase of a website or domain. That's squarespace.com slash engine. So you leave that conversation. At that point, do you write your story? Yeah, I think the story came out about a week after that interview.
And what was the response to the story? Did people understand the size of the thing? Yeah, so it was a front page Sunday story, and it was a big deal. I remember it landed and my phone was just blowing up because I was being tagged. This was back when Twitter was still a healthy space for conversation. So my Twitter's blowing up. I'm getting all these emails. People want to book me to talk about it on TV on the radio.
Like, it was just this huge deal. People were stunned that it existed, that it was using their photos without consent. Just the way Clearview had gone about it, the fact that they had surveilled me as a journalist, tried to prevent the reporting of the story. It was a huge deal. I thought, this is going to be one of the biggest stories of the year. This was January 2020. I see. Shit. So the pandemic happens. It just kind of... Yeah. Like, I was hearing that there were going to be hearings in D.C.
They start getting cease and desist letters, lawsuits happen, but then March 2020, COVID hits. And it just instantly changed the conversation in the US, and around the world, to health concerns, safety concerns. And then I started seeing people talking about, can we use facial recognition technology to fight COVID? Can we start tracking people's faces, see where people were with other people?
If there's a known exposure, can we track people? And there was all this talk about using facial recognition technology. So it's like we almost skipped the scared outrage phase of the technology, because the needle kind of juddered on everything with COVID? Yeah, we did a little bit. I mean, for certain groups, European privacy regulators, they all launched investigations into Clearview, and they essentially kicked Clearview out of their countries.
They said it was illegal. I mean, there were repercussions for Clearview, but I feel like the bigger conversation, what do we do about this right now? It just got pushed aside by that larger concern around COVID. Somehow our debate about these search engines was just one of the infinite, strange casualties of COVID. A conversation we never quite got to have.
In the meantime, Clearview AI's copycats have continued to go further than the company itself, offering their search engines online for the public to use at a small cost. None of these search engines is as powerful as Clearview, but all of them are powerful enough to do what privacy advocates were worried about back in 2011.
The tool I would end up finding online was one of those copycats. I have been noticing more and more people using them mainly to settle scores with strangers on social media. I asked Kashmir where she has noticed people using these search engines in the wild since she published her book. I've seen news organizations using it. One of the more controversial uses was a news organization that used it after October 7th to try to identify the people who were involved in the attacks on Israel. Oh wow.
And I was a little surprised to see it used that way. Why were you surprised? I was surprised just because it's still controversial whether we should be using face recognition this way. And the same site that was using it had published stories about how controversial this is, and about how these search engines have scraped the public web and how they invade privacy. Yeah, so I think it's still complicated.
It was a news outlet that had done the maybe-we-shouldn't-have-this stories, and then they were also using the tech. Yeah, like, maybe this technology shouldn't exist, but also, it's there, so we're going to use it. Which sort of feels like the story of every piece of technology we've ever had a stomach ache about: we say we don't want it to exist, and then some contingent circumstance arises in which at least some of us feel like, well, it's okay for me to use this here.
Even if I don't think it should exist. Right. Case by case basis. I did ask Kashmir whether she'd seen these search engines used in a clearly positive way. I've heard of people using it on dating sites to figure out if the person they're talking to is legit. Make sure they're not being catfished. Make sure this person is who they say they are. I've heard about it being used by people who have compromising material on the internet.
So they have an OnlyFans, or just something they don't want to exist on the internet. And they've used these search engines to figure out how exposed they are. And some of the search engines do let you remove results. And so they've done that: they've gotten rid of the links to stuff they don't want the world to know about. I've talked to parents who have used these search engines to figure out if there are photos of their kids on the internet that they don't want to be out there. Oh, wow.
So one woman I talked to, who's an influencer, she gets a lot of attention and she didn't want it kind of blowing back on her kids. So she stopped featuring them in any of her videos. And she searched for them with one of these search engines and found out that there was a news photo of one of her kids. A summer camp, I think, that one of the kids had gone to had posted photos publicly. And so she asked them to take it down.
But yeah, I mean, there are some positive use cases for these engines. So what Kashmir is saying is that the most positive use case for these search engines might just be finding compromising content about yourself on the internet first, before someone else using one of these search engines does. Which seems like a questionable upside. Kashmir has also seen facial search engines used in a way that, I have to say, was just breathtaking in its pettiness.
She recently reported on how the owner of Madison Square Garden was using facial recognition and surveillance to ban from the venue anyone who worked at a law firm his venue was in litigation with. Kashmir even tested this. She tried to go see a hockey game with a personal injury lawyer, something one used to be able to do freely. So I bought tickets to a Rangers game and brought along this personal injury attorney. Her firm was on the ban list. Just because I wanted to see this for myself.
And yeah, so I met her, I was meeting her for the first time that night. We stood in line. Just a ticket. Yeah. Just a seat. There we go. We were walking in, we put our bags down on the conveyor belt, just thousands of people streaming into Madison Square Garden. But by the time we picked them up, a security guard walked over to us. He said, I need to see some ID from her. She shows her driver's license, and he said, you're going to have to wait here. One moment.
So they said they actually would stand by, and she just has to come speak with you, and we appreciate it. Just hang out for me, okay? A couple minutes here, we'll get someone down to you, too. And a manager came over and gave her this note and told her she wasn't allowed to come in. Wow. She had to leave. So, your firm is involved in litigation with the Garden, so we're going to need you to go.
It was insane to see just how well this works on people just walking around in the real world. Yeah. It was so fast. God, that's crazy. When Kashmir reported this story, she actually heard from facial recognition companies. She said they were upset that Madison Square Garden was doing this. It was making their tools look bad. It was not how they were supposed to be used. But misuse of any technology is almost a given.
And facial recognition is being misused not just by corporations, but also by individuals. So there was a TikTok account where the person who ran it, if somebody kind of went a little viral on TikTok, he would find out who they were and expose them. The one video that really struck me is during the Canadian wildfires when New York City kind of turned orange. Yeah. Somebody had done a TikTok that just showed Brooklyn and what it looked like. It looked like something from Blade Runner.
Yeah. And this guy walks by, and he became the hot Brooklyn dad. And so the TikTok account found out who hot Brooklyn dad was, and then found out who his son was, and said, if you want to date somebody who's going to look like hot Brooklyn dad one day, here's his son's Instagram account. That is wildly bad behavior. That's crazy. Because that person didn't even consent to being in the first video. But I'm sure people sent it to him and were like, hey, the internet thinks you're hot.
Don't worry, they don't know who you are. And they not only invaded his privacy further, but invaded his kid's privacy. Yeah, just for fun. And so that account was doing a lot of that. And 404 Media wrote about it, and eventually TikTok took the account down. I mean, the thing that sort of hovers around all of this is that prior to the invention of these things, it was like the internet had taken a lot of everyone's privacy.
But the one thing we had was the idea that if people didn't know your name or if you did something under a pseudonym, there's a degree of privacy. And now it's like, your face follows you in a way it wasn't supposed to. Or the internet follows your face. That's how I think about it. It feels like there's a world in which technology like this would just be like fingerprint databases, where law enforcement would have it, the general public wouldn't have access to it.
Isn't that one way this could be going instead of the way it's happening? Yeah, that is definitely a possible future outcome where we decide, okay, facial recognition technology is incredibly invasive. Kind of in the same way that wiretapping is. So let's only let the government and law enforcement have access to this technology legally. They're allowed to get a court order or warrant and run a face search in the same way that they can tap your phone line with judicial approval.
And the rest of us shouldn't have the right to do it. I think that's one way this could go. It seems preferable to me. It seems good, but then you also think about governments that can abuse that power. So recently here in the US, Gaza has been such a controversial issue, and you have people out protesting. And there was a lot of talk about, well, if you are on this side of it, then you're aligned with terrorists and you are not going to get a job.
We're going to rescind job offers to college students who are on this side of the issue. And it's very easy to act on that information now. You can take a photo of that crowd of protesters and you can identify every single person involved in a protest. And then you can take their job away. Or if you're police and there's a Black Lives Matter protest against police brutality, you can take a photo and you can know who all those people are.
But I think you notice now when you see photos from the protests, all these students are wearing masks. They're wearing COVID masks or they're wearing something covering their face. And it's because they're worried about this. They're aware of how easily they can be identified. And the thing is it might work, but I have tested some of these search engines. And if the image is high resolution enough, even wearing a mask, somebody can be identified.
Really? So just from, like, nose and eyes and forehead? Yes. I did this consensually with a photo of my colleague Cecilia Kang, who covers tech policy in D.C. She sent me a photo of herself with a medical mask on. I ran it through one of the search engines, and it found a bunch of photos of her. There's a world you can imagine where someone passes a law and these tools are no longer offered to the public. They become like wiretaps, something only the police are allowed to use.
We would get some of our privacy back. But, and this might not come as a surprise, there have been problems when the police use these tools as well. These search engines sometimes surface doppelgangers: images of people who look like you, but who are not you. Which can have real consequences. Kashmir reported the story of a man who was arrested for a crime he was completely innocent of. The crime happened in Louisiana. The man lives in Atlanta.
The police department had a $25,000 contract with Clearview AI, though the cops wouldn't confirm or deny that they used Clearview AI to misidentify him. How do these search engines deal with errors? Do they correct things if they make a mistake? Is there a process? So in the minds of the creators of these systems, they don't make mistakes. They aren't definitively identifying somebody. They are ranking candidates in order of confidence.
And so when Clearview AI talks about their technology, they don't say they're identifying anyone. They say that they are surfacing candidates. And ultimately it's a human being who's deciding which is a match. It's the human making a mistake, not the system. So if I were running for local office somewhere, and there was a video of someone who looks like me doing something compromising, and someone wrote a news story being like,
Hey, we put his face into the thing and this is what we found. And I went, hey, you're smearing me. They'd be like, we're not smearing you, we're just pointing out that you look like this guy doing something he's not supposed to do in a video. Right. It's the news outlet that covered it that smeared you, not the facial recognition engine. But for the person in jail, they know that they would not have been in jail if this technology didn't exist.
Yes, exactly. So there's this power of the government, right? Power of corporations. And then just as individuals. I think about this basically every time I'm at dinner now at a restaurant, and there are people sitting around me, and I start having a juicy conversation, whether it's personal or about work. And I think, wow, I really need to be careful here, because anybody sitting around me could, if they got interested in the conversation, snap a photo of my face.
And with these kinds of tools, find out who you are. That's what I always think about. I was at a restaurant recently, it was like outdoor dining, and I was with a friend. And in the next sort of enclosed booth, there was this person, they took a phone call, and they were like, one sec, this is Kamala Harris. And I think they were joking, but I could, like, hear them. And I was like, oh, I could just search their face. I could kind of figure this out.
And I thought I might be able to find out privileged stuff about, like, a conversation with a very, very high-up government official. I was like, this is... I felt real nausea. I felt nausea at the possibilities. Yeah, I mean, I think there are just so many moments in our daily lives where we just rely on the fact that we're anonymous. You know, you're at dinner, you're having this private conversation, and then creepy PJ is going to be sitting there, looking up your connection to the vice president.
Are you more careful, are you different in public now? Yeah, I mean, I just think that is the risk of facial recognition technology. The same way that we feel this concern about what we put on the internet, the tweets you write, the emails you write, the texts you send, just thinking, am I okay with this existing, and possibly being tied back to me, being seen in a different context?
That is going to be our real-world experience. You have to think all the time: is something that I'm saying right now something that could be overheard by a stranger, something that could get me in trouble, or something I would regret? And I don't know, that just terrifies me. I don't want to be on the record all the time, every minute, anywhere I am in public.
You just kind of assume that these things that you're doing aren't going to haunt you for the rest of your life, or follow you for the rest of your life, or be tied to you. Yeah, unless you're a celebrity or have a very famous face. And it's been funny, because I've talked with various people who do have famous faces. And I talk about this dystopia where it's like, everything you do in public will come back to haunt you. And usually after the interview, they'll say, that's my life.
And I'm like, yes, what this technology does is it makes us all like celebrities, like famous people, minus the upsides. What do you do if you don't want to be in these databases? Don't have photos of yourself on the public internet. It's hard not to get into these databases. These companies are scraping the public web. So we can't get out of Clearview's database, and there's no federal law yet that gives us the right to do that.
European privacy regulators have said that what Clearview AI did was illegal, and that Clearview needed to delete their citizens' data. And Clearview basically said, we can't tell who lives in Italy or who lives in the UK or who lives in Greece, so there's not really much we can do. It's funny, though, because I'm not a technology CEO, but if you asked me to actually fix that problem, I think I actually could fix that problem.
Like, you could say, anybody can email us and ask to be taken out if they prove that they live in Greece. You would think they could actually do something. Yeah, this work is so complicated. For a while, Clearview AI was honoring requests from Europeans who wanted to be deleted from the database. But then at some point they just stopped and said, actually, we don't feel like we need to comply with European privacy laws, because we don't do business in Europe anymore.
God. They're, like, ungovernable. Yeah. In some jurisdictions, you can get the company to delete you. In the US, there are a few states that have laws that say you have the right to access and delete information a company has about you. California is one of those states. If you live in California, you can go to Clearview AI and give them your driver's license and a photo of you, and they'll show you what they have of you in the database.
And if you don't like it, you can say, delete me. But there are only a few states that have such a law. For most of us, like here in New York, we don't have that protection. So we can't get out of Clearview's database. Facial recognition is hard, because these companies are based in places that don't have great privacy laws, like the United States, and they're making people around the world searchable. It really is a hard problem.
And in a larger sense, as a country, a society, a world, if we were like, we just don't want this technology to exist. I know this is kind of a child's question, but what would it look like to put the genie back in the bottle? I mean, you'd make it illegal, make the companies delete the algorithms. And you have to decide: are we talking about all facial recognition, your iPhone opening when you look at it? Right.
Or are we talking about just these big databases that are searching for your face among millions or billions of other faces? I don't think that's going to happen. I don't think it's going away. But I do think we have this kind of central question about facial recognition. Should these companies have the right to gather all these faces from the public internet and make them searchable? I think that is something that could be shut down if we wanted it to be.
Kashmir Hill. She's a reporter at The New York Times and author of the very excellent book, Your Face Belongs to Us. Go check it out. Stick around after the break. We have some show news. Welcome back. So, quickly before we go this week: we are heading towards the end of season one of Search Engine.
Is there going to be a season two of Search Engine? How has season one gone? Great questions. We will be answering them, all of them, and whatever other questions you have about Search Engine's present and future, in questionably transparent detail, at our upcoming board meeting. The date is Friday, May 31st. We will be sending out the details with the time and a Zoom link to join.
This is only for our paid subscribers, people who are members of incognito mode. If you are not signed up, but you want to join this meeting, you've got to sign up. You can do so at searchengine.show. You get a lot of other stuff too. You can read about all the benefits on the website. Again, that URL is searchengine.show. If you're a paid subscriber, look out for an email from us next week and mark your calendar, May 31st, 2024. Search Engine is a presentation of Odyssey and Jigsaw Productions.
Search Engine was created by me, PJ Vogt, and Sruthi Pinnamaneni, and is produced by Garrett Graham and Noah John. Fact checking this week by Holly Patton. Theme, original composition, and mixing by Armin Bazarian.
Our executive producers are Jenna Weiss-Berman and Leah Reis-Dennis. Thanks to the team at Jigsaw, Alex Gibney, Rich Perello, and John Schmidt, and to the team at Odyssey, J.D. Crowley, Rob Morandi, Craig Cox, Eric Donnelly, Kate Hutchison, Matt Casey, Maura Curran, Josefina Francis, Kurt Courtney, and Hilary Schuff. Our agent is Oren Rosenbaum at UTA. Follow and listen to Search Engine with PJ Vogt, now for free on the Odyssey app, or wherever you get your podcasts.
Thank you for listening. We will see you in two weeks when we'll have a double episode for you. It's our version of the wall.