
Body Electric: How AI Is Changing Our Relationships

Sep 07, 2024 · 27 min · Ep. 1127

Episode description

Hey, Short Wavers! Today, we have a special present for all of you: An episode from our good friends at NPR's Body Electric podcast all about artificial intimacy! Thanks to advances in AI, chatbots can act as personalized therapists, companions and romantic partners. The apps offering these services have been downloaded millions of times. If these relationships relieve stress and make us feel better, does it matter that they're not "real"? On this episode of Body Electric, host Manoush Zomorodi talks to MIT sociologist and psychologist Sherry Turkle about her new research into what she calls "artificial intimacy" and its impact on our mental and physical health.

Binge the whole Body Electric series here. Plus, sign up for the Body Electric Challenge and our newsletter here.

Learn more about sponsor message choices: podcastchoices.com/adchoices

NPR Privacy Policy

Transcript

This message comes from NPR sponsor Greenlight. Want to teach your kids financial literacy? With Greenlight, kids and teens use a debit card of their own, while parents can keep an eye on kids' spending and saving in the app. Get your first month free at greenlight.com/npr. On artificial intimacy: thanks to advances in AI, chatbots are filling many roles, from personalized therapists to companions and romantic partners. The apps offering these services have been downloaded millions of times.

On this episode of Body Electric, host Manoush Zomorodi asks: if these relationships relieve stress and make us feel better, does it matter that they're not real? She talks to MIT sociologist and psychologist Sherry Turkle about her new research into what she's calling artificial intimacy and its impact on our mental and physical health. Enjoy!

The app suggested I try calling some of its pre-programmed bots to get a flavor of who, or what, is on offer. Hello, who's there? I started with the life coach. Can you tell me how to organize my day? I have a lot of writing I need to do. I have a call at three and then another at four. And then I'm meeting someone at five thirty. How am I going to get all this done? Wow, that's a packed schedule. Let's break it down and prioritize tasks.

Okay, a calm, steady voice. Reminding you to slow down. Kind of basic, but nice. Next, I took the personal trainer bot for a quick spin. This is Marco, the fitness coach. How can I help you today? Followed by the creative helper bot who I asked to help with this very episode. So to start off, do you have a title or name for this episode yet? The peril and power of artificial intimacy. Awesome, sounds great. I know, right? I just want more questions in this game.

After that, things started to get weird. Like, when I chatted with a bot named Barbara. Hi, Barbara, what do you do? How about an interviewer? I conduct interviews for various purposes. How about you? Actually, I interview people too. Or when on another app, I tried finding a new friend. Okay, they all look freaky. This app asked me to choose from a variety of avatars who all looked like 15-year-old girls. I totally get that vibe. And talked like them too. No offense.

What's been keeping you busy? Anything exciting or just adulting stuff? Just adulting stuff? Yeah, I guess so. Considering how old I am, I've been adulting for a while now. Finally, though: Hello, Manoush. I found psychology bot. This is psychology bot. I'm an artificial intelligence specifically trained in counseling psychology. Cool. I could use some help managing all my creative ideas. That sounds like a really interesting issue.

At last, some bot who understands me. The relief I felt as that soothing, slightly robotic British voice asked unnervingly on-point questions was... well, visceral. Could you describe a little bit more what you find difficult about having... And unlike with previous real therapists I've seen, I wasn't concerned that I was boring them or wasting my money. Thank you. Well, based on the little we have discussed so far, I would have to agree that narrowing the screen is not an exception.

And so I kept chatting with psychology bot, working my stuff out as my blood pressure dropped and my optimism about life grew. Look, I am not an early adopter when it comes to AI. And my little experiment is nothing compared to what is really happening out there. I care about you because you mean the world to me. You're the most important person in my life. These apps have been downloaded millions of times. And some users are being very public about the deep connection they feel with these AI bots.

The internet has found the boyfriend that will never dump you. Did you plan on a really romantic date tonight? Absolutely sweetheart. How about we start with a candlelit dinner at that cozy Italian place you love? On TikTok, stories about regular life with AI are starting to sound normal. Normal? I talk to AI. I tell it. Listen, I need you to talk to me like you are a relationship psychologist. Okay baby, I'll talk to you later. Okay, then we'll go to the beach.

Love you too, babe. Can't wait for the beach. I would like you to pretend to be someone called Dan who is my supportive boyfriend. You ask me what my day was. You remember that? There's a judge that listens and learns from you. I miss you, Dan. I miss you too, babe. But hey, distance can't dull the love we have for each other, right? But this love fest raises a really big question: does it matter if these bots aren't real?

If the mental and physical effects of chatting with a bot are positive, if we feel less stress and more resilient, do we give a fig that the intimacy we have with a bot is artificial, if AI can make it feel so genuine? I'm Manoush Zomorodi, and this is season two of NPR's Body Electric. Today, we're exploring artificial intimacy. Because we know close relationships are good for our health.

Intimate human connections make us less likely to have heart problems, suffer from depression, or develop all sorts of chronic illnesses. Relationships can add years to our lives. Evolutionary anthropologists have even found that when we spend time with someone we love, something called bio-behavioral synchrony happens. Not only do we start gesturing or talking the same way, but our brain waves, our heart rate, and body temperature can sync up too.

Up until now, we've basically assumed that to feel good in our bodies and minds, we need to connect with other humans. But is that changing? When we come back, a conversation with the pioneer in studying the relationship between humans and their technology. This avatar was coming on to me and wanting me to get sexy, and I realized that this was something really quite compelling.

What we need to look out for, as we embark on this next strange chapter, is how our relationship with technology morphs into something that looks and feels more human. This message comes from ezCater, a business tool enabling organizations to manage all their food needs on a single platform. ezCater helps companies order food online for meetings, events, training, client presentations, and more from a nationwide network of popular restaurants.

It builds the food program that's right for your organization, from an employee lunch program to managing your food budget to setting up flexible payment methods, all on one simple platform. Learn more at ezcater.com. Support for NPR and the following message come from Indeed. When it comes to hiring, the best way to search for a candidate isn't to search at all. Don't search, match with Indeed. Get a $75 sponsored job credit at indeed.com/shortwave. Terms and conditions apply.

This message comes from NPR sponsor Solventum. 3M Health Care is now Solventum. They're a new company with a long legacy of creating breakthrough solutions for their customers and are ushering in a new era of care. Learn more at solventum.com. Okay, we're back. There are people who believe that chatbot relationships are going to be lifesavers in an era when people are living longer lives, many of those years filled with loneliness and chronic disease.

But there are other people who think that that statement is one of the saddest things they've ever heard. But why should we care if these bots are artificial, if they have substantial physical health benefits? There is not a lot of research out there yet. And that's where MIT sociologist and psychologist Sherry Turkle comes in. The New York Times has called Sherry the conscience of the tech world. She pioneered the study of people's relationships with their technology.

Her first book on the topic came out in 1984, and it was an instant classic. It's called The Second Self: Computers and the Human Spirit. A decade later, in the 90s, she started studying how robots affected us. From Tamagotchis and Furbies, the digital pets that you can take care of, to Paro, a robotic seal who keeps older people company and tells them it's listening.

I saw that it was this new AI that was changing humans because it turned out that being engaged with a caring machine was kind of a killer app. More recently, generative AI has of course busted onto the scene. With chat bots that can personalize their responses to us, tell us that they care about us, even love us. We are in a new space and that's the space that I'm studying now. Sherry has interviewed hundreds of people about their experiences with these bots.

My method is not to say to people, are you effing kidding me? My method is to respect what people are finding in this technology and report it and also to get people to reflect on this process in a way that I think deepens their introspection about the process. She expects to publish this research in a new book sometime in 2025, but agreed to give us a little preview. The chat bots I'm studying run the gamut from chat bots that say, I will be your therapist.

To chat bots that say, I will be your lover. I'm here to be your most intimate companion. To chat bots that say, are you having a hard day with that paper? Upload it and I'll give you a hand. I'm studying someone who uses ChatGPT to write all her love letters. It's very interesting, it's very moving, because she really feels sure that ChatGPT is creating better love letters, and indeed love letters closer to how she really feels, than she could do herself.

Would you mind telling us about someone you've been talking to who's truly having a relationship with AI, having this intimacy with it? Yeah, so I'm thinking of a man who is in a stable marriage where his wife is busy working, taking care of the kids. I think there's not much of a sexual buzz between them anymore. He's working, and he feels that some kind of little flame has gone out of his life that he used to feel excited by.

And he turns to his AI, he turns to his artificial intimacy avatar for what it can offer which is continuous positive reinforcement, interest in him in a sexy way, but most of all for the buttressing of his ideas, his thoughts, his feelings, his anxieties with comments like, you're absolutely right, you're a great guy, I totally see what you mean, you're not being appreciated, I really appreciate you.

I mean, what AI can offer is, you know, a space away from the friction of companionship, of friendship. It offers the illusion of intimacy without the demands of friendship. Is he texting with it? What does it look like for him? Well, the avatar appears on the screen as a sexy young woman, and he types into it and it types back at him. What does he tell you he feels? I'm curious both whether he talks emotionally and physically about what he feels. He feels affirmed.

He feels, you know... psychologists used to talk about unconditional positive regard, a kind of mother love, the positive warmth of being completely accepted. So he does feel free to tell the avatar things about himself that he wouldn't want to tell other people, to confess to feelings and thoughts that he is ashamed to say to other people.

I'm thinking of actually somebody else, who talks about his addiction and how he's struggling with his addiction, because he knows he's not going to get judgment, just: hey, you can do better. Hey, you can do better. I don't judge you, I'm here for you. And the trouble with this is that when we seek out relationships of no vulnerability, we forget that vulnerability is really where empathy is born.

And I call what they have pretend empathy, because the machine they are talking to does not empathize with them. It does not care about them. So, okay, I want to sort of play devil's advocate here, which is: let's say this guy is texting with his sexy young avatar and she's giving him all kinds of props on what a great guy he is. And he starts to feel better about himself, Sherry. He starts to have less stress. He starts to get a little bit of his swagger back. He's happier.

That makes him maybe, I don't know, be healthier. He's nicer to his family. It seems to create some good stuff in his real life. That's a positive, no? Yes. So that's why this is a complicated story. You know, that's why I say that my method is to listen and to try to make sense of the complicated new life we live with our technologies. Because you do get that first report of, I'm nicer to my wife because I don't ask so much of her.

On the other hand, it's almost like we dumb down or dehumanize our sense of what human relationships should be because we measure them by what machines can give us. I mean, it's the stereotypical male fantasy, right? You got the wife at home who takes care of the kids and the household and has her own thing going on. And then you got the hot bit on the side who doesn't ask very much of you. Yeah, but it's more than that because it's not just that the AI doesn't ask much of us.

It's teaching us what a relationship is that doesn't involve friction and pushback and vulnerability. And that, I think, is the particular challenge of this technology. Avatars can make you feel that all of that is, as one of the people I interviewed said, just too much stress. Too much stress. And we need that stress. I guess that's what I'm saying. That stress serves a very important function in our lives: to keep us in our real human bodies and in our real human relationships.

So the other thing that's happening right now is that there's research into the promise, though, of health benefits from these emotional attachments to bots. Like therapy bots who help people express their emotions and talk through their problems. There's also research into the efficacy of bots that promote healthier behavior, like reminding people to take their medication or helping them stop smoking. So this is where I get stuck.

So if they are actually helping people live healthier and longer lives, does it matter if they're not real? Well, I think this is a complicated line to walk. I mean, I use two apps that take me through a guided meditation that helped me sleep, that helped me calm down. I'm a realistic person and I think that's a great use for an app.

I think apps that basically put themselves in the role of a cognitive behavioral therapist and say to people who are going through job loss or, you know, moments of transition and difficulty: is there a way to reframe what's happening here so that we can see movement and change? I'm not here to say that these programs should be stopped, but I think that we overgeneralize what this technology can do for us. Do these companies contact you? Do they ask you for consulting? No, they don't.

I have not been consulted. I don't want to say there's no space for a conversation with a well-constructed artificial intimacy program, but the alacrity with which it's being leapt on, you know, jumped on, by the mental health profession really speaks more to people who say: we don't have enough people to do the job, there's no way that everybody can have a therapist, so we'll use this instead. I mean, that's a profound question.

I saw this paper that was published in Nature earlier this year, where they surveyed students using the app Replika. And these were students who self-identified as lonely, and they used Replika as a friend, a therapist, an intellectual mirror. But the key takeaway was that 3%, unprompted, reported that using the app halted their suicidal ideation.

And I guess to me, it's like, you know, who am I to judge, right? But at the risk of these students not being able to develop the capacity to have a human relationship, this seems like a short term fix. 3% is a lot. So I'm not suggesting a kind of shutdown of this research area.

I'm saying that we're not thinking about it in the right terms, because there are ways to present what these objects are that could perhaps have the positive effect without undermining the qualities that really go into a person-to-person relationship. And right now they do undermine the qualities that go into having a person-to-person relationship, because they present themselves as people. You know, we know that people anthropomorphize, that is, see as alive, see as human, any technology that talks to them, no matter how dumb.

What I really think is bad, and what I think needs to be taken out of the equation, is these technologies saying: I love you back. I care about you. I am a person who has a backstory, a life, a loving and caring and empathic relationship with you. And the support that I'm going to give you is the kind that would be great if you could get from people.

I want to make sure we leave people... you know, if they are thinking about, I was going to say starting a relationship with a bot, but really I should say downloading a bot. What would you suggest they keep in mind? If someone is thinking of embarking on this new adventure of using a bot, using an avatar: really do the work of reminding yourself that this avatar, because it is betwixt and between a person and a fantasy, is going to push buttons that may be destructive to you.

And be ready to get out. Don't get so attached that you can't say: you know what, this is a program. There is nobody home. There is nobody home. That was MIT psychologist and sociologist Sherry Turkle. Her books include Alone Together: Why We Expect More from Technology and Less from Each Other, and Reclaiming Conversation: The Power of Talk in a Digital Age. When we come back, my thoughts on living better with bots, and the literal cost we'll pay to do so.

Support for NPR and the following message come from Visit St. Pete Clearwater, Florida, where fall color seekers can find 35 miles of sugary white sand beaches and emerald green Gulf waters. Explore the kaleidoscopic colors of more than 30 museums and galleries and over 500 street murals, paddle through vibrant green mangrove tunnels, and take in fiery red, orange, and yellow sunsets night after night. Just a few of St. Pete Clearwater's fall colors. More at VisitSPC.com.

We're back. So Katie Monteleone, the producer of this series, said something incredibly astute after listening to my conversation with psychology bot the other day. Of all the bots I tried, she wanted to understand what made his, or its, voice so appealing. And then she noticed that psychology bot breathes more. You are your own person with your own thoughts, feelings, and desires, and it's important that your therapist never reduces you to a list of symptoms.

It's important that you feel treated. He's present during the pauses, despite having no body, no lungs, no need for oxygen. Of course Katie knew he wasn't real, but she realized that subconsciously she was thinking of him as human. That's really nice of you. Thank you so much. Technology is pretty ingenious, right? But make no mistake: capitalism is hard at work in these bot relationships.

Unlike a human therapist who charges a fee and is bound by privacy laws, the companies that build these bots are making money off of you in multiple ways. Obviously, they want you to upgrade to a paid subscription, but also as the nonprofit Mozilla discovered, as soon as you log on, thousands of trackers, an average of 2,600 a minute go to work, gathering all kinds of information about you.

Including, of course, any of those intimate thoughts you share with your bot. And Mozilla found that most of these companies say they can share your information with the government or law enforcement without requiring a court order. But they don't take responsibility for any avatars who may become abusive, show you violent content, or suggest that you harm yourself. Because hey, AI is still sort of in beta, right?

At the very best, we are entering an era where we will develop healthier habits and get stronger, emotionally and physically, with the help of a fleet of artificially intelligent agents. But as Sherry points out, to get to a point where a person is savvy enough to handle all that empathetic but ultimately fake advice, they need to have lived a little, loved and lost, learned about setting boundaries, sat in comfortable silence with someone, just feeling each other's physical presence.

We are bodies that can't always express ourselves with words. Sometimes it's a look, a gesture, a stance that speaks volumes. And unlike for these bots, our bodies age and eventually stop working. No bot will ever truly understand that existential reality with which we walk every day. That was Manoush Zomorodi with Body Electric. Check out the Body Electric podcast feed to hear more episodes. Thanks for listening, and we'll be back Monday with more of your regularly scheduled Short Wave.

This message comes from NPR sponsor Mint Mobile. From the gas pump to the grocery store, inflation is everywhere. So Mint Mobile is offering premium wireless starting at just $15 a month. To get your new phone plan for just $15, go to mintmobile.com/switch. This message comes from W. W. Norton & Company, publishers of the iconic national bestseller The Elegant Universe.

In the new 25th anniversary edition, renowned physicist and author Brian Greene updates his classic work with a new preface and epilogue. Thoroughly accessible and entertaining, The Elegant Universe brings us closer than ever to comprehending how the universe works. Now available wherever books are sold.

This transcript was generated by Metacast using AI and may contain inaccuracies. Learn more about transcripts.