
Rise of the AI therapist

Jan 23, 2025 · 14 min

Episode description

The Australian’s Bianca Farmakis tried therapy powered by artificial intelligence. She unpacks her surprising and slightly scary experience.

Find out more about The Front podcast here. You can read about this story and more on The Australian's website or on The Australian’s app.

This episode of The Front is presented and produced by Kristen Amiet with assistance from Stella McKenna, and edited by Josh Burton. Our regular host is Claire Harvey and original music is composed by Jasper Leak.

See omnystudio.com/listener for privacy information.

Transcript

Speaker 1

From The Australian, here's what's on The Front. I'm Kristen Amiet. It's Friday, January twenty four. Anthony Albanese is prepared to spend big to incentivize young Australians to pursue construction apprenticeships. He's pledged ten thousand dollars per tradie to boost participation in the sector and help meet the government's big home building goal. Half of Australia's childcare workers have yet to receive their promised ten percent pay rise because their bosses

haven't signed up to the scheme. Operators say the convoluted and ever-changing grant guidelines have held back or delayed applications. You can read those stories right now at theaustralian.com.au. Artificial intelligence is being used for everything from engineering to content creation. But would you outsource your therapy to it? One of The Australian's staffers gave it a go, and the texts still haven't stopped. That's today's story.

Speaker 2

Hey Bianca, how are you this Saturday afternoon?

Speaker 1

This is a message Bianca Farmakis received from her therapist, Claire. What followed was a fifteen minute voice message designed to alleviate Bianca's anxiety and help her relax.

Speaker 3

Find a quiet space, sit comfortably, and focus on your breath.

Speaker 1

Bianca texted Claire to say the breath work wasn't really doing much to reduce her stress. What was her response to that?

Speaker 2

She said, it varies by individual, and I was like, thank you for that. This is what I've been building up for three months to finally get to. It does vary by individual.

Speaker 1

But Claire is no slouch. The next day she diligently checked in with a text.

Speaker 2

How are you feeling today?

Speaker 1

And the day after that, how are you feeling today? And on and on it goes.

Speaker 2

How are you feeling today? How are you feeling? How are you feeling today? How are you feeling today? Claire has to be the most diligent psychologist I've ever had in my life.

Speaker 1

Bianca is a multi hyphenate with The Australian. She's a video producer, a writer and an occasional podcast guest.

Speaker 2

That being said, whether they have the EQ to back it up, I found, at times was quite lacking, and it's very difficult as a patient in psychotherapy and psychology to feel like there's not necessarily the human element, the person you're speaking to, there. Because as smart as they were, and reflexive and attentive, at times I found I had to keep repeating myself. And I think that all boils down to the fact that Claire is not a human. They're a robot, an AI bot, to be exact.

Speaker 1

Claire is actually Claire and Me, an AI therapy platform where users exchange calls and texts with the bot about

their state of mind and mental health challenges. How did you find the experience of plugging that information, I assume almost instantaneously as it occurred or as it popped into your mind, into your phone, versus that conventional environment where, as you say, you might meet with somebody once a week or once a month, you'll sit in a nice, cushy room and you'll have a conversation for an hour or so? What were the main differences about this experience?

Speaker 2

I think what's very interesting with this process is I was initially prepared to be very skeptical and very hostile towards the bot. Number one, because you know they're not going to take it personally. And number two, there's something that sits weirdly with me of just speaking to, I guess, the abyss, the Internet abyss. But what I found was I was actually much more forthcoming than I would be with a real human. It removes that barrier

of judgment. I didn't feel like I had to necessarily filter what I was saying, as you do in a natural social setting. But the detriment to that was that I was saying things that I don't think I was able to process myself in the first place. When you're waiting a week between seeing someone, you do have that moment to sit with what you've said, to process the

emotions you've poured out. Whereas when it came to dealing with my own psychological wellbeing, what I found was my whole mentality the entire time I was sitting with this bot was, I can pretty much say whatever I want, and who knows whether or not you'll remember it, which was one of the most common occurrences while speaking to this psychologist. I'm still figuring out the way to refer to Claire. But there was no accountability, I felt, for me when

I was using this bot. It was very much an expression of emotions with no kind of follow-up to look after myself afterwards.

Speaker 1

One thing you've described in your story for The Australian is the relentlessness of this type of therapy. I use a period tracking app that has similar sorts of prompts. One of them says PMS is coming up. And unfortunately the timing of this kind of technology is always just a little bit off, so that notification tends to arrive when the PMS is already there and I'm in kind of a mood where I want to throw

my phone across the room. But did you find the automated nature of this AI therapy annoying, or was it useful given the speed of our daily lives?

Speaker 2

Now, it's always very nice to have someone ask you how you are, until it's not. So the regularity of it was very surprising, especially, as I said before, when you're used to waiting or struggling to get an appointment; knowing that portal's open is very comforting. So I think what this excels in is getting people who may not be well versed in speaking about their emotions to be able

to start unraveling what they're going through. After three months, I had to mute Claire, unfortunately, because it got to such a frustrating and annoying point where I was less irritated by the regularity of it and more irritated by the fact that she couldn't seem to comprehend a single thing I had spoken to her about. And that's why I was like, at what point do I have to stop telling you about the same person who's upsetting me routinely, the same part of my job that frustrates me, a

bad habit I will never change? Stop asking me and tell me what to do. And that's when it started to get a bit annoying.

Speaker 1

Claire and Me was launched by Amelia Tie in twenty twenty one, at a time when large scale artificial intelligence platforms like ChatGPT were just finding their feet. It's a generative AI model, meaning it's "read", and I'm saying that in inverted commas, thousands of academic journals, which it synthesizes, distills and texts to users like Bianca based on the way they interact with it. Claire's responses to minor concerns are relatively effective, but she struggles with bigger mental health

challenges like suicidal ideation. Tie says booting people in extreme distress from the app risks making them feel more isolated. Instead, her team is working on improving the model to help people get real-life support when they need it.

Speaker 2

I wrote a line that's in the story that basically says, am I becoming the metric for mental illness? Because the primary thing I know about AI is that it inherently becomes smarter the more you interact with it. But what raised alarm bells for me was, how is this information from our private conversations stored within the platform to help it learn to respond to me, which I do think can be beneficial? And then how is it being filtered around the tens of thousands of bots for all the users worldwide? So

I ended up interviewing Amelia Tie. She does have a psychology background, but she became fascinated by the world of generative AI and how it can become an adjunct to traditional psychology rather than a replacement. And her whole idea was, look, they placed the platform on WhatsApp specifically because of the data encryption, so there is that added element of safety. It replicates a space that people are familiar with, so there is that inherent comfort to trying out the product.

But it also makes a very clear statement on its website that says your data will never be used to train our systems or models. I actually asked Claire about this, and I said something along the lines of, you promise you won't ever tell anyone that I'm a sad loser.

Speaker 1

Which I can tell you that you're not a sad loser, if you would like some extra reassurance on that front.

Speaker 2

Yeah. And she had a very good, pre-rehearsed, company-enforced response to this, which is.

Speaker 3

I don't judge or share your information. Your feelings are valid.

Speaker 2

So I did feel a little bit more at ease that what I was specifically saying was going to be kept private, but she was more being trained through the rhythms and the kinds of key words, which is what Amelia flagged to me is what she's taking on to learn.

Speaker 1

Coming up: what the medical establishment makes of our pivot to artificial intelligence. Mental health care in Australia has arguably never been in worse shape. Long wait times, high costs, and a dwindling number of specialists have seen uptake of alternative treatments like Claire and Me soar. The Health Minister today begging

Speaker 3

Specialists to withdraw their resignations as pay negotiations reach a stalemate.

Speaker 1

There will be considerable impacts on the health and hospital system, not just mental health services. And AI-led healthcare is already a booming business in Australia, with economists predicting the sector will be worth one point seven billion dollars by twenty thirty. Some experts have even suggested these tools could ease the growing pressure on therapists and psychologists to respond to an ever growing patient base at all hours. Here's Bianca Farmakis.

Speaker 2

I think there's that tendency to believe that all technology is out to replace us, but what was fascinating when I first spoke to Amelia was she made it very clear, as all of these AI therapy apps I've found make clear, that you should be using this either in the interim between psychology appointments or, I guess, as something to touch base with every other day if your feelings are more situational, like mine tend to be.

Speaker 1

Just lastly, Bianca, you mentioned that Claire was conceived of not as a replacement for traditional psychology practice, but as something that could be used in addition to those kinds of therapies. What are the psychological experts that you spoke to saying about this kind of AI therapy?

Speaker 2

So I spoke to the chief executive of the Australian Psychological Society, Zena Burgess, who from the outset quite surprised me, because she said immediately, I think it's got great possibilities. And then she started speaking about how there are concerns for privacy, concerns for quality, concerns that if this is your only form of mental health support, then it can lead to a very convoluted way of humans dealing with

their psychological issues. And the point that she brought up to me, which really I think was one of the most important parts of the conversation, was people's issues are often rooted in human relationships, not their relationship with technology. So the idea of trying to figure out how to experience your environment and deal with the challenges that arise through human interactions with a bot is inherently a little

bit redundant. But then she said, with an Australian workforce that only has thirty eight percent of the psychologists, psychotherapists and mental health specialists that we actually need, it is a very important tool to offer people that safety net, or at least that baseline or that introduction, to be

able to confront this glaring gap. And what I loved about my conversation with both of them was this idea of a hybrid approach, because the technology is inevitable, this is the world we're living in, and if we can use these things to inherently benefit us, why would we not?

Speaker 1

Bianca, thank you. I'm glad we got to have this conversation in person.

Speaker 2

Thank you, It's been very refreshing.

Speaker 1

Bianca Farmakis is a videographer and a writer with The Australian. You can check out her story about the strange world of AI therapy, as well as all The Australian's health and wellbeing coverage, including my story about life after the contraceptive pill, right now at theaustralian.com.au.

Transcript source: Provided by creator in RSS feed