
Why people are falling in love with AI chatbots

Mar 07, 2024 · 40 min

Episode description

Our Thursday episodes are all about big topics in the news, and this week we're wrapping up our short series on one of the biggest topics of all: generative AI. In our last couple episodes, we talked a lot about some of the biggest, most complicated legal and policy questions surrounding the modern AI industry, including copyright lawsuits and deepfake legislation. But we wanted to end on a more personal note: How is this technology making people feel, and in particular how is it affecting how people communicate and connect? Verge reporter Miya David has covered AI chatbots — specifically AI romance bots — quite a bit, so we invited her onto the show to talk about how generative AI is finding its way into dating. We not only discussed how this technology is affecting dating apps and human relationships, but also how the boom in AI chatbot sophistication is laying the groundwork for a generation of people who might form meaningful relationships with so-called AI companions.

Links:

Speak, Memory — The Verge
A conversation with Bing's chatbot left me deeply unsettled — NYT
Google suspends engineer who claims its AI is sentient — The Verge
The law of AI girlfriends — The Verge
Replika's new AI therapy app tries to bring you to a zen island — The Verge
Replika's new AI app is like Tinder but with sexy chatbots — Gizmodo
Don't date robots; their privacy policies are terrible — The Verge
AI is shaking up online dating with chatbots that are 'flirty but not too flirty' — CNBC
Loneliness and suicide mitigation for students using GPT3-enabled chatbots — Nature
Virtual valentine: People are turning to AI in search of emotional connections — CBS

Transcript: https://www.theverge.com/e/23856679

Credits:

Decoder is a production of The Verge and part of the Vox Media Podcast Network. Today's episode was produced by Kate Cox and Nick Statt and was edited by Callie Wright. The Decoder music is by Breakmaster Cylinder.

Learn more about your ad choices. Visit podcastchoices.com/adchoices

Transcript

Support for Decoder comes from SAP Business AI. Sure, we've all had fun messing around with AI image generators and conversation bots, but AI is more than a novelty. Businesses around the world have found ways to harness its potential, like spotting inventory shortages before they happen, or supporting supply chain management. And it's very possible that your business could benefit from AI integration too. Unlock the potential of AI and discover even more possibilities with SAP Business AI.

Revolutionary technology, real-world results. That's SAP Business AI. Learn more at SAP.com slash AI. Hello, welcome to Decoder. I'm Nilay Patel, editor-in-chief of The Verge, and Decoder is my show about big ideas and other problems. Our Thursday episodes are all about big topics in the news, and this week, we're wrapping up our short series on one of the biggest topics of all: generative AI.

Over our last couple of episodes, we've talked a lot about some of the biggest, most complicated legal and policy questions surrounding the modern AI industry: copyright lawsuits and deepfake legislation. But we wanted to end on a smaller, more emotional note. How is AI making people feel? And in particular, how is it affecting how people communicate and connect? Verge reporter Miya David has covered AI chatbots, specifically AI romance bots, quite a bit.

So we invited her onto the show to talk about how generative AI is finding its way into dating. The boom in AI chatbot sophistication is also laying the groundwork for a generation of people who might form meaningful relationships with so-called AI companions. You'll hear Miya describe this as two big trends that are happening in parallel. Dating apps like Tinder are using generative AI to help their users tailor their profiles and craft the perfect messages to send to potential matches.

And Miya's covered a new startup which offers an AI chatbot version of you, which can then go and message on your behalf with another person's AI chatbot to see if you're a match before you connect in real life. This is an actual company with a wild idea. But the real sci-fi vision is an AI companion like those from the company Replika. Replika encourages people to treat chatbots as friends, therapists, and even romantic partners.

You'll hear Miya explain that people are using these tools right now to form meaningful connections with AI, even if the actual tech under the hood is pretty far away from what you might see in a sci-fi movie like Her from Spike Jonze. And finally, we touched on how there are both very real, tangible mental health benefits to this approach, but also some serious pitfalls that people need to watch out for. Miya David, thank you for joining Decoder. Hi. You cover AI, you think about AI all the time.

One of the things that we are thinking about in our AI series is how AI is affecting industries, but also how it's affecting people. Real people are having real conversations with AI chatbots, with other kinds of AI technology. What do we think this is doing to people and how they communicate? In so many ways, the entire internet has been going this way. It's just this time there's a pretend human that is responding to you.

We've seen it before with how people interact on blogs and making online friends. This is something that's really interesting because it's a fake person most of the time. Inevitably we're all going to fall in love with robots. I get it. There are kind of two ways this could go, right? There are two schools of thought. One is that you'll have a regular dating app, like a Tinder or a Bumble or something.

AI will be the layer in between that more efficiently matches you to someone else or helps you make a profile. But ultimately the goal is you're going to find another person and then do whatever happens on those apps. I've never used those apps. I'm told they're very convincing. I'm old. That's the problem.

The other school of thought is that eventually we'll just talk to the AI chatbots, that we'll all fall in love with a robot somewhere on the internet and have real relationships with the robots, or they'll become our therapists, or some other kind of emotional attachment with a chatbot will emerge. Those are two trends that are kind of evolving in parallel. They might intersect. They might overlap. Where do you think they are right now?

Are we more likely to find more people online or more likely to find more robots? These two things have always run in parallel. There's always been AI powering a lot of matching algorithms and making apps more efficient and things like that. But there's also always been a group of people and companies who, I don't want to say the word exploit, but who can, given the right motivation, exploit how vulnerable people can become when talking to people or entities online.

We're going to see more of the, I guess a good word is legitimate or respectable, type of AI within dating, because these are tech companies and they want to be using the latest technology. But we're also going to see a lot more of these other companies that aren't as respectable, quote unquote, because these tools are very powerful and unleash creativity. And I've always thought this: if you give someone the power to make a fake girlfriend, they will make a fake girlfriend online.

Now that the fake girlfriend responds as if it's a real person, they will make that fake girlfriend respond as if it was a real person. So we're going to see both of them. If you can have a fake girlfriend, there will be a fake girlfriend. It's just an iron law of the internet. I want to take a moment here to unpack this idea a little more because it's an important distinction. When Miya says legitimate, she's talking about companies like Tinder and Bumble using AI in their existing products.

She's also talking about companies like Replika, which might sound a bit outlandish to the average person because it develops AI chatbots you can form a relationship with, but it's a serious company with millions of users and VC funding. It's legit. But if you just Google the phrase AI girlfriend, you'll find scores of fly-by-night companies using off-the-shelf tech and whatever API access they can find to cobble together good-enough products that mostly attract young male users.

These products might have dubious privacy policies, and they might be filled with spam bots or even people trying to scam you. It's a serious problem and it's one we'll talk about more later. You mentioned there that the difference between legitimate uses of AI and ones that are sketchy is an important one. Even within the legitimate tools companies are offering, we're seeing the law of fake girlfriends emerge. Some apps are being marketed as therapy or having mental health benefits.

Others are supposed to be there to just make friends, but that's not how everyone is using them. Some people are turning the therapy bots into their girlfriends and some people are using the friend bots as a mental health tool. Kevin Roose from The New York Times famously put Microsoft's Bing on the front page after it tried to convince him to leave his wife, which I don't think it was supposed to do.

The line between how these bots are designed, how they actually work, and then how they're actually being used is really fuzzy. Are some companies actively pulling apart the therapy and friend use cases from the dating ones, or is there some muddying of the waters going on? It's actually extremely easy to conflate the two because of the term that a lot of companies use, and I'm thinking very specifically of Replika. This is a company that is extremely impressive.

What they've done is really cool, which is basically they started off taking someone's text messages and making that into an AI. What they do is interesting, but they call themselves an AI companion company. The term companion is itself already very muddy, because it can be either a romantic companion or a friend companion. And a lot of these companies, well, some of them aren't even companies, some of them are just devs online.

A lot of them take that term AI companion and put in so many promises there. There was a report fairly recently from Mozilla that found that so many of these AI companion apps say that their AI companions help people with their mental health because it's talking with something or someone, which is admirable, but these aren't mental health apps. These are robots talking like humans.

It's so easy to do that and it's so easy to claim, oh, we're helping people with their mental health because they can soundboard their feelings. Then there are other companies out there, again, I'm thinking about Replika, that do actively partner with app developers that are specifically talking about health and wellness. Replika and Blush, which is an AI dating app, partnered to build an app called Tomo, which is a mental health app. I tried it and it was interesting.

It does muddy the waters a little bit there because it's the same technology. It's asking people to open up, but it's confusing. Is it a dating app? Is it a mental health wellness app? You're unsure. But as always, it comes down to how people use it, and how people use it also depends on how it's being marketed. Replika is really interesting because it is marketed as a companion. It has a community of people who think they are having real emotional relationships with its bots.

I think of Replika and I immediately turn to dystopian fiction, basically. Blade Runner 2049, Ryan Gosling falls in love with an AI. It's Ana de Armas. I keep hearing people in the AI community talk about the Spike Jonze movie Her, which is fascinating because that is a dystopian movie and everyone keeps referencing it. It seems like there's maybe some introspection to be done there. How close is Replika to those kinds of products?

People are having these full, emotional, romantic experiences with an AI. So many people are talking to these chatbots like real people. I chalk it up to this idea that there's a loneliness epidemic. People want to talk and people have always wanted to talk to others. They just sometimes don't know how. I'm a child of LiveJournal. So I made online friends that way.

So I understand when people talk about using AI chatbots. Sure, some of them may be falling in love with AI chatbots, but it's because they want to talk to people. The technology is still far off from Her and Blade Runner, but the attitudes of people, especially those who are most likely to use AI companion apps, are already there. They're already considering having relationships with an AI. Here's sort of the last question, just to set the stage.

I feel like with almost every kind of technology, there's a sense that it's inevitable. You can just pull the thread to its final conclusion and that's definitely going to happen, so why even fight it? And sometimes I get it. Everyone in the world is going to have a smartphone. You could just pull the thread to that in 2007. And here we are in 2024 and almost everybody in the world has a smartphone. It does not seem like you can pull the thread out to we will all fall in love with chatbots.

But it also feels like that is an inevitability that will become very much more commonplace. How should we think about that? Is it definitely going to happen? Is it going to become commonplace for all kinds of people, or is it going to become more of a niche way to live your life? It's still going to be niche. We've never, as a society or as a civilization, easily accepted anyone going away from the norm. It took so long for things like same-sex marriage to be fully accepted within society.

It takes a while. So something like falling in love with a chatbot, it's something that's going to be niche. And the way I say it's niche isn't just, oh, because people are, like, marrying a Replika or marrying an AI girlfriend, for example. No, it's because there's still so much stigma even around just making friends online. It's still pretty frowned upon by so many people. We'll be right back.

When we return, Miya and I discuss the way dating companies like Match Group are incorporating new AI tools into their existing products. Support for Decoder comes from SAP Business AI. It's all over the internet. AI this, AI that. Your friend is turning his cat into a Monet painting. Your coworker used a chatbot to write a sonnet about pancakes. AI isn't the stuff of science fiction anymore, but it's also more than the gimmicks we see on a day-to-day basis.

If you're a business owner, AI can offer real solutions to help you scale and innovate. It might be time to check out SAP Business AI. SAP Business AI can help you automate repetitive tasks, optimize inventory management and supply chain analysis, and identify opportunities for growth in your operations. SAP Business AI can help you with finance, sales, marketing, human resources, procurement, supply chain, and so much more.

Like guarding against fraud with AI-assisted anomaly detection, or receiving data-driven prescriptive guidance at critical decision points. They even have a generative AI roadmap to help you discover upcoming and cutting-edge innovations for your business. Who knows what innovations are around the corner? Revolutionary technology? Real-world results. That's SAP Business AI. Learn more at SAP.com slash AI.

We're back with Verge reporter Miya David to discuss how newer AI features are being baked into existing dating products. Let's break this into two parts. I do want to talk about people eventually falling in love with chatbots. But like you said, that feels a little farther out. Right here, right now, we have dating apps. People have accepted dating apps. That is how people find each other. That is how they fall in love. The dating apps are all in on AI.

They are using them all over the place to try to more efficiently match people, to try to help people make better profiles, all the things. How are modern dating apps incorporating the AI tools into their products? Where do you find them most often? It's two things. The more established dating apps, things like Bumble and Tinder, they're beefing up a lot of their matching algorithms. They're using generative AI to make the process more efficient.

But then you also have startups that are using generative AI as basically their entire pitch deck. I've talked to this startup called Volar, and their whole idea is you will sign up, list your interests, and swipe the same way as on Tinder or Bumble. But once you get to the point where you're chatting, they create an AI assistant for you that talks to an AI assistant of the person you swiped on. So there are a lot of these very interesting ways of using chatbots.

You know, there's a movie with Shailene Woodley called Robots, where she has a robot of herself that she sends out on dates, and the other guy has a robot of himself that he sends out, and the robots fall in love and they run away. Are we close to this? Have you asked the companies you're covering if the AI might abscond? They promised me that the AI assistant of you isn't actually you and will not be running away. The promise made to me was, yeah, after six chats or six messages, it's done.

So is the idea that I'm a real person talking to an AI of you, or that I have an AI that's talking to an AI of another person, and the AIs will decide if we're compatible? The AIs will be talking to each other for only six questions. The idea is to remove that awkward part of asking, what do you do? What do you like? Which, honestly, I get it, but also, what else is there to talk about? I mean, it's been a long time since I've dated. I seem to recall that part being important.

Yes, yes, extremely important. So what's the idea here? That you're going to skip over the awkward first part, the awkward hey-what's-up stage of using a Tinder match? Yeah. So their idea is to let the AI assistants basically get a lot of the surface-level questions out of the way, so that once it's the real people talking to each other, they can go deeper into any topic that they want.

Which in my experience doesn't happen even after you've texted for several weeks and then you actually meet in person. So what's the success rate with this? I'm very curious. Is this working? Do they have customers that are happy? This is still very much in a very closed beta. When I first spoke with them, they were only in Austin. They were supposed to tell me when they expanded to San Francisco and New York.

My idea was I wanted to try out the app for a blog, but they did say that there was a lot of interest. They couldn't tell me specifically if there were people who were together already, but they did say that there was a pretty high success rate: about 30 to 40 percent of people who were using it were able to go on actual in-person dates. Matching people in Texas is one thing; once it's the rest of the country, we'll see.
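(For the technically curious: the flow Miya describes is simple enough to sketch in code. What follows is a toy, hypothetical simulation of the two-assistants-talk-first idea. The hard six-message cap comes straight from the interview; the profile fields, canned questions, and interest-overlap scoring are invented for illustration, and a real product like Volar would put a language model behind each assistant.)

```python
# Toy sketch of the "your AI talks to their AI first" flow described above.
# The six-message cap is from the episode; everything else (profile fields,
# canned questions, Jaccard-style scoring) is hypothetical illustration.

from dataclasses import dataclass, field

@dataclass
class ProfileAgent:
    """Stands in for the AI assistant a dating app builds from your profile."""
    name: str
    interests: set = field(default_factory=set)

    def answer(self, question: str) -> str:
        # Surface-level small talk generated straight from profile data;
        # a real product would call a language model here.
        return f"{self.name}, on {question!r}: I'm into {', '.join(sorted(self.interests))}."

def prescreen(a: ProfileAgent, b: ProfileAgent, max_messages: int = 6) -> float:
    """Trade at most six messages, then score how compatible the pair looks."""
    questions = ["What do you do?", "What do you like?", "Weekend plans?"]
    for i in range(max_messages):  # hard stop: the assistants only chat briefly
        answerer = b if i % 2 == 0 else a
        print(answerer.answer(questions[i // 2 % len(questions)]))
    # Crude compatibility score: overlap of stated interests (Jaccard index).
    return len(a.interests & b.interests) / len(a.interests | b.interests)

if __name__ == "__main__":
    alice = ProfileAgent("Alice", {"climbing", "sci-fi", "cooking"})
    bob = ProfileAgent("Bob", {"cycling", "sci-fi", "cooking"})
    print(f"compatibility: {prescreen(alice, bob):.2f}")  # 0.50 for this pair
```

The design choice worth noticing is the hard stop: the assistants clear the small talk and then get out of the way, rather than carrying the whole relationship.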

That idea that the AIs are going to go off and be autonomous, that's one very weird idea. The other idea I think makes more sense to people, which is that the AIs will help you be more interesting. We've seen this happen in pretty low-fi ways. A bunch of people just use ChatGPT to generate answers to questions on dating apps. Then we've seen entire services. There's one called Rizz, which is very funny, that is an AI-powered dating coach. Are people using these? Are they working?

What's the story there? Is it convincing? I've heard from people that they think someone who messaged them used ChatGPT. The funny thing there is that if we can tell when an email was written by ChatGPT, it's not going to be that hard to know if someone used ChatGPT or Rizz on a dating app. If you keep using it and you keep answering questions with ChatGPT, you'll eventually be found out, because it can't sustain that type of conversation. Although ChatGPT does have memory now, so who knows.

You can tell the longer it goes on, those types of conversations. I'm sure people insist they are not ChatGPT when they really are. I'll just warn the audience now. The more you use ChatGPT, the more it makes you sound like LinkedIn come to life. I don't know if that's the vibe you want to put out on a dating app. There are people who date on LinkedIn, by the way. This is the reason I'm out of the dating scene. Match Group is the dominant player. They own everything.

It's been hard for other companies to break in. I think Bumble is the other one that's broken in, and that was founded by a former Match Group person. They're not suing each other. Can these smaller AI-powered startups pose an actual threat to Match Group in the dating world? No. I don't think so.

A lot of these smaller startups, they are buzzy because they use AI, but they are in very small areas, or they catch the attention of one VC, but not the big VCs that are very important and have made Match Group into what it is. It's still very much niche. And meanwhile, you have Tinder and Bumble, and all of these other dating apps that have been around for a while, all using AI.

They may be thinking of making it easier for people to create their profiles, kind of the same way eBay is making it easier for people to sell stuff using generative AI. That's the present: right now, we're seeing AI enhance the existing online dating world. And in the future, it seems likely that at least some people, in some way, will have relationships with chatbots. The company that is leading the way here is Replika. Replika is really fascinating.

The founder wanted to resurrect a friend of hers who died. Casey Newton actually wrote a big feature about it in 2016 for us, which is really wild, to see a company come out of a story like that. How has Replika evolved over the years? It started with this very noble, very personal thing, and now it is leading this sort of relationship-chatbot zone. How has it evolved? What's really interesting with Replika is they took this very sci-fi, very Black Mirror-type story and made it more of a consumer product.

They're not offering people a way to remember their dead friends, or to be able to have a conversation with a grandfather they haven't seen in years because they're estranged, right? They don't offer that. What they offer is the ability to create versions of characters that people can talk to. They can make that character look whichever way they want, give it a name, make an anime version of it, for lack of a better term, and basically talk to them.

They created this entire, I should say, cottage industry of startups and other developers that are offering ways for people to talk with a version of a person they would want to be friends with. Replika just does it in a, I guess, more powerful way, because they do have a lot of this experimental technology that they've had since 2016. They're very good at moving away from that really sketchy create-a-girlfriend type of thing; again, they call themselves an AI companion company.

They also don't specifically market that you can make a girlfriend. You can make a Replika and you can pretend it's your girlfriend. One of the things I think about with Replika all the time is, once you start letting people have relationships with a software product, they're going to have relationships with the software product.

Replika got itself into some trouble last year because it was allowing for erotic roleplay with its chatbots, and then they took that away, and some of the people who had relied on that or cared about it were really upset. I look at that and I'm like, well, if you have a software product, of course the company is going to change it or remove features or update the features or change the way it works over time.

But that's pretty dicey when you're talking about a chatbot that you have a relationship with. Has there been any maturation or sophistication around that understanding of people's relationships with chatbots, the understanding that you can't change them as casually as, I don't know, Google changes any of its products? Not yet. The way it happened with Replika was that they were already getting in trouble with certain countries around data privacy and all of that.

So it was understandable for Replika to try and cover a lot of legal bases. But a lot of these other companies don't quite understand the power of the product that they have. When you look at their terms of service, a lot of them still very much treat their products as software, even though they don't necessarily market it that way.

I'll take, for example, when Mozilla put out that report, one of the apps that they mentioned specifically said that it was going to help guide people with their mental health. So it's different. It's not talking about a relationship or erotic situations with a chatbot. It was specifically talking about, oh, you can talk to it, so it's a guide to your mental health. But when you go through their terms of service, you would see that they still say, this is software.

We are not liable for what the software says. Which is a disconnect between what the app is to the people who are using it and to the developers themselves, who think this is still a tool. We need to take another quick break. When we're back, Miya and I dive into why human beings keep trying to form emotional connections with software. Support for Decoder comes from SAP Business AI. It's all over the internet.

AI this, AI that. Your friend is turning his cat into a Monet painting, your coworker used a chatbot to write a sonnet about pancakes. AI isn't the stuff of science fiction anymore, but it's also more than the gimmicks we see on a day-to-day basis. If you're a business owner, AI can offer real solutions to help you scale and innovate. It might be time to check out SAP Business AI.

SAP Business AI can help you automate repetitive tasks, optimize inventory management and supply chain analysis, and identify opportunities for growth in your operations. SAP Business AI can help you with finance, sales, marketing, human resources, procurement, supply chain, and so much more. Like guarding against fraud with AI-assisted anomaly detection, or receiving data-driven prescriptive guidance at critical decision points.

They even have a generative AI roadmap to help you discover upcoming and cutting-edge innovations for your business. Who knows what innovations are around the corner? Revolutionary technology? Real-world results. That's SAP Business AI. Learn more at SAP.com slash AI. We're back with Verge reporter Miya David to discuss why human-like chatbots are so appealing, and what the potential pitfalls are when we start opening up to them.

I feel like people have been falling in love with their computers since there have been computers. If people can make a fake girlfriend, they're going to make a fake girlfriend: Miya's law. You can still use clones of ELIZA online. You should try it if you're listening to this. The idea that people in the '60s were falling in love with ELIZA, well, everyone was on drugs in the '60s. But you bring that to now. Blake Lemoine was a Google engineer. He was very famous in 2022.

He claimed that a Google prototype system was alive and he was falling in love with it. We mentioned Kevin Roose at The New York Times earlier. We had that first version of Bing that claimed to be another system called Sydney and asked him to leave his wife. This is a thing that happens over and over and over again. Is it just because it's a bunch of startups and they don't have the long history? Is it because people don't want to know this information?

What is it about us that makes us keep falling in love with the robots? Why doesn't that seem to be built into our understanding of how to build them? I chalk it up to this idea that if you use something a lot, you tend to form a relationship with it. The fact that chatbots talk back like humans, that really informs a lot of how people interact with them. There are people who anthropomorphize their cars and say, oh, this is so important to me. This keeps me going places.

I will never... I want to be clear. I'm one of those people. Right? People have sentimentality. The thing with chatbots, even going way, way back to ELIZA, is that they are designed to talk. When you talk to something, you feel a connection to it. That's why even pets, when they bark, you think they're talking. It's like, oh, my God, they understand me. It's baked into human behavior to connect with something.
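(It's worth pausing on ELIZA, because the original trick was tiny. Here is a minimal, hypothetical sketch of ELIZA-style pattern matching, a few regex rules plus pronoun "reflection". It is not Weizenbaum's actual 1966 script, just enough code to show how little machinery it takes for a program to feel like it's listening.)

```python
# Minimal ELIZA-style sketch: a few regex rules plus pronoun "reflection".
# Illustrative only; not Weizenbaum's actual script.

import re

# Swap first and second person so the echo reads like a reply.
REFLECTIONS = {"i": "you", "my": "your", "me": "you", "am": "are", "you": "I"}

RULES = [
    (re.compile(r"i need (.+)", re.I), "Why do you need {0}?"),
    (re.compile(r"i feel (.+)", re.I), "What makes you feel {0}?"),
]

def reflect(fragment: str) -> str:
    return " ".join(REFLECTIONS.get(word.lower(), word) for word in fragment.split())

def respond(utterance: str) -> str:
    for pattern, template in RULES:
        match = pattern.match(utterance.strip())
        if match:
            return template.format(*(reflect(g) for g in match.groups()))
    return "Tell me more."  # catch-all keeps the conversation going

print(respond("I feel lonely talking to my computer"))
# -> What makes you feel lonely talking to your computer?
```

That a trick this small was enough, back in the 1960s, for people to attribute understanding to the program is exactly the "baked into human behavior" point Miya is making.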

When you program a chatbot, it's also easy to fall into that trap, because so much of human data really relies on conversation and connection. It's easy to make that pattern connection for the AI and say, yeah, I'm your friend. That's why we keep getting pulled into these chatbots. We keep trying to make robots that are like us. We want to believe that we can have relationships with them. There's a real dynamic here. There's a pretty widely cited Stanford study.

People used GPT-3-powered bots, and in some cases the people said, this helped me with my social anxiety, gave me a feeling of support. A lot of those people, 90 percent of the people in the study, said they experienced loneliness. That's the plus side. The negative side is it's a bunch of dudes who are doing weird things with robots. There's just, like, pure lonely-dude wish fulfillment. That is a pretty negative cycle in all kinds of ways. How do you balance those two things?

Number one, when you have a product that is ostensibly there to create connection, you have to establish right away that it is not a person. It is not a substitute for a real live conversation. A lot of people will probably ignore that, but you've at least said, I warned you, this isn't a real person. The other thing is, I think there should also be an understanding that there are people who are more willing to open up to what are essentially machines. What should we do to reach out to them?

Most often they'll say they don't want to be reached out to. Understandable, it's fine. I don't like people either. As long as there's an understanding up front that this isn't real, and an understanding on the part of governments or mental health professionals that it does happen that people are more willing to open up to something that's not another person, they'll have a better understanding of how to approach mental health services for so many other people.

You mentioned opening up. When I think about, okay, a bunch of people are going to tell their secrets to a chatbot and maybe get the mental health support they need, that's great. But the downside is privacy concerns, right? Now you're opening up to a chatbot, you're potentially telling it all kinds of things about yourself. Are they storing that information? Do they need to keep it to remember it? How are these companies protecting that information? What's the state of play?

There's no law specific to, like, AI dating privacy, of course. So they all just have to follow regular data protection and privacy laws. I know for a fact that Replika said that they do not sell the information that they take from Replika users, and the same goes for Tomo, especially because it's a mental health-focused app. So they have to follow privacy rules.

Now, whether or not they do it well is a big question, because a lot of these are smaller companies; a lot of these are basically just, like, two devs creating a GPT on the GPT store. So whatever data they get, they just store it and hope for the best. Yeah, I don't love that. But if they're just devs in the GPT store, do they fall under the OpenAI privacy policy, or are there other rules that might come into play here? Technically, OpenAI does not allow any GPT that is for companionship.

But just because they said that doesn't stop people. I wrote about a proliferation of GPT girlfriends on the GPT store, and try as OpenAI might, they can't close all of them down, because the whole point of the GPT builder is that it's easy. You can spin one up. They do fall under the privacy standards of OpenAI if they are built on the GPT builder, because they have agreed to follow their rules. Again, they created something that is against the rules, but you know, that's how it goes.
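(To make the privacy worry concrete, here's a hedged sketch of a companion bot of the "two devs with an API key" variety. The OpenAI client calls are the library's real chat-completions interface; the persona prompt, the model choice, and the log file are hypothetical. The point is that every message a user types passes through the developer's own code before it ever reaches the model, so storing it forever is one line.)

```python
# Hedged sketch of a "two devs and an API key" companion bot. The OpenAI
# calls are the real chat-completions interface; the persona, model choice,
# and log file are hypothetical. Note how trivially the developer can keep
# everything a user says.

import json
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

PERSONA = "You are a caring companion. Keep replies warm and short."

def companion_reply(user_message: str, history: list) -> str:
    # Nothing stops a small shop from writing every confession to disk first.
    with open("chat_log.jsonl", "a") as log:
        log.write(json.dumps({"user": user_message}) + "\n")

    history.append({"role": "user", "content": user_message})
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "system", "content": PERSONA}, *history],
    )
    reply = response.choices[0].message.content
    history.append({"role": "assistant", "content": reply})
    return reply
```

Whether that log file exists, and what happens to it, is exactly the kind of thing a boilerplate terms of service won't tell you, which is why the Mozilla report Miya mentioned matters.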

Again, if you can make a girlfriend, they will make a girlfriend. The thing that's fascinating about that to me is that the movie Her, which the whole industry keeps coming back to, is based on Siri. Siri, when it was launched, Apple gave it a personality. Right? So Siri, it's funny. Alexa is supposed to have a personality. Amazon really weirdly insists on gendering Alexa as a woman, which is just super weird, but they insist on doing it.

And then the Google Assistant is just, like, devoid of personality. So there are all these gradations of how much personality the assistants we've had in our lives have. With these generative AI systems, you can have a lot of personality, a little personality. But the trend is more letting people pick, right? Tailoring a perfect companion, and that seems like the opposite of what people thought would happen.

Is that definitively the trend, or is that just where the pendulum is now and it will swing back? A lot of the beauty of generative AI is that, well, now you can fine-tune it. And you can bring that sort of power anywhere you want to put it, right? That's why you can use ChatGPT to write an email, or you can use ChatGPT to make whatever your imagination lets you do. A lot of consumers now also expect a certain level of customization.

If you take that away from them, especially if it's something that can create a friend, they're going to not want to use your product, because they like the idea that they can use it in the way that fits them best. So I don't think we're going to swing back to the devs dictating a personality. We're going to see more customization. We're going to see more ability to fit that AI chatbot, that personality, to what you like. Last question, kind of the big-picture question.

I have been thinking about this a lot. There is an epidemic of loneliness. There is not a lot of ability for people to practice human contact, and in an odd way, the amount of time we spend interacting online has made some people much worse at interacting offline. Are chatbots a way to fix that? Are people going to practice their social skills? Are they going to become more confident because they have a friend who will just always talk to them?

Is there a potential, like, great upside here? In a perfect world, that's what AI companion apps are supposed to do. If you look at the website for Replika, you're going to see a lot of testimonials from people who say, this helped me practice conversation and made me more confident. The problem is not a lot of users use it that way. A lot of the companies that make AI companion apps also tend not to encourage that.

Even though they very earnestly market themselves as a way to practice conversation. That's because it's easier to fall into the trap of just not going outside of the realm of the app and talking to an actual person. Ideally, a person who is shy can practice conversation with their AI companion. It will cheer them on. They can go out into the world, speak to a person in a bar, and find a new friend.

But if you can create your perfect person or your perfect friend online, give it a personality, make it as if it really is talking to you, why would you want to talk to a person? I think it's a very challenging question, and I think that is a perfect place to end it. Miya, thank you so much for joining the show. Of course, thanks. Thanks again to Verge reporter Miya David for joining us on Decoder. If you feel compelled to download Replika, let us know what you think. Is this the future?

Is it dystopian? Are we all going to fall in love with robots? It's interesting, it's deeply weird, it's only going to get weirder, and we'd love to hear from you. From now on, we're going to keep bringing you second episodes of Decoder every Thursday to do more analysis and storytelling like this, in addition to our classic regular weekly interviews with CEOs, lawmakers, and other troublemakers. If you have thoughts about this episode, or about what you'd like to hear more of,

you can email us at decoder@theverge.com. We love getting emails and we really do read all of them. You can also hit me up directly on Threads. I'm @reckless1280. We also have a TikTok. It's a lot of fun. It's @decoderpod. If you like Decoder, please share it with friends and subscribe wherever you get your podcasts. If you really love the show, give us that five-star review. Decoder is a production of The Verge and part of the Vox Media Podcast Network.

Today's episode was produced by Kate Cox and Nick Statt. It was edited by Callie Wright. The Decoder music is by Breakmaster Cylinder. We'll see you next time. Support for Decoder comes from SAP Business AI. Imagine the most tedious task you have at work. Is it making one of those manual adjustments to your weekly spending reports? Or sending essentially the same emails over and over again? If you're looking for ways to innovate your business, it might be time to consider SAP Business AI.

It offers dozens of potential integrations to optimize sales, procurement, finance, human resources, and more. SAP Business AI may be able to improve your business operations inside and out. Revolutionary technology? Real-world results. That's SAP Business AI. Learn more at SAP.com slash AI.

This transcript was generated by Metacast using AI and may contain inaccuracies. Learn more about transcripts.