
Deepfake porn and covert recordings - is NZ law keeping up with technology?

Jun 10, 2025 · 15 min

Episode description

Is New Zealand’s legal system moving fast enough to adapt to new technologies?

It’s a question being asked by some of our top academics and MPs.

The conversation around covert recordings has made headlines this week – as well as questions around whether it’s illegal.

And Act MP Laura McClure made global headlines after holding up a photo of herself naked in Parliament. It was an AI-generated ‘deepfake’, which McClure said took her only moments to create.

So do our existing laws protect victims from being abused through rapidly developing technology?

First on The Front Page, we discuss that viral deepfake moment with Act’s Laura McClure. Then, on the rise of new technology, University of Canterbury professor of law, Cassandra Mudgway.

Follow The Front Page on iHeartRadio, Apple Podcasts, Spotify or wherever you get your podcasts.

You can read more about this and other stories in the New Zealand Herald, online at nzherald.co.nz, or tune in to news bulletins across the NZME network.

Host: Chelsea Daniels
Sound Engineer/Producer: Richard Martin
Producer: Ethan Sills

See omnystudio.com/listener for privacy information.

Transcript

Speaker 1

Kia ora. I'm Chelsea Daniels, and this is The Front Page, a daily podcast presented by the New Zealand Herald. Is New Zealand's legal system moving fast enough to adapt to new technologies? It's a question being asked by some of our top academics and MPs. The conversation around covert recordings has made headlines again this week, as well as raising questions around whether it's actually illegal. And Act MP Laura McClure made global headlines after holding up a photo of

herself naked in Parliament. It was an AI-generated deepfake, which McClure said took her only moments to create. So do our existing laws protect victims from being abused through rapidly developing technology? We discuss that later with University of Canterbury Professor of Law, Dr Cassandra Mudgway, but first on The Front Page, we discuss that viral deepfake moment with Act's Laura McClure. So, Laura, what motivated you to create this nude deepfake of yourself?

Speaker 2

Yeah, I wanted to highlight the growing, or emerging, trend that is deepfake pornography. Realistically, it's now starting to happen to our youth in particular, and it's becoming a huge problem at schools. I'm our party's education spokesperson, so I'm out there talking to principals, teachers, and parents alike,

and I'm also a parent myself. So I've noticed this concerning trend over the last few years, and I thought that I would create a member's bill in order to tackle the loophole that we have and see if we can do something about this.

Speaker 1

Are you surprised by the international reaction to this? I've seen it picked up by a lot of international news outlets.

Speaker 2

No, look, I'm actually not, and I think this is a worldwide trend. When we see the increase in technology like AI, we often find that it is sometimes used by bad actors, while most of the time it's used for good, as we know. But I think that this is a growing trend worldwide, so I'm not surprised to see this picked up internationally.

Speaker 1

I saw that while Justice Minister Paul Goldsmith said they're not currently considering adopting the bill, he'll meet with you to discuss it. Has that meeting taken place?

Speaker 2

Not just yet, but I do believe I am meeting with him next month, which I think is a positive step. I understand the government has a very busy workload, but this is a really big problem, and the stories that I've been hearing are, like, harrowing, to be honest. Is this government going to wait until somebody actually commits suicide over this, or are we going to actually do something about it?

Speaker 1

Well, I know that you need to convince sixty MPs from across the House to support it so it can skip that member's bill ballot process. How's that all going?

Speaker 2

Look, I think it's going really well. There are positive signs from all of the parties in government that if this bill was pulled, they would definitely support it. As far as bypassing the tin goes, I've got great support from the Greens, which is sometimes a bit unusual coming from Act, but they also share the same concerns. Labour too, and definitely New Zealand First, and I'm talking with Hannah Rafferty

to discuss if she's on board too. So I've written to all the party whips saying, look, I think this is something a bit more serious that requires a lot more urgency. Can we get this out of the tin and see if we can get some action on it?

Speaker 3

This image is a naked image of me, but it is not real.

Speaker 1

This image is what we call a deepfake.

Speaker 3

It took me less than five minutes to make a series of deepfakes of myself. Scarily, it was a quick Google search for the technology that's available. In fact, I didn't need to enter an email address. I just had to tick a box to say I was eighteen and that it was my image and I could use it.

Speaker 1

Do you think we're moving fast enough as a country to adapt to new technologies? If we wait for your bill to go through the biscuit tin, get drawn, get debated, go through three readings, there could be a brand new tech or ways of doing things by then.

Speaker 2

Oh, well, absolutely there could be. But I believe my bill would protect from any synthesized images. So whether it's done via the current tech or future tech, I think it would cover us into the future. And I do believe it's something that we absolutely need to do. And this tech is moving really fast, like day by day there are new sites, new apps that can do all of this kind of thing. And I think, you know, as lawmakers, we really need to consider, one, how

can we adopt AI for good? I think that's a real positive thing. But what can we do when people use it for malicious purposes?

Speaker 1

What are some of the horror stories you've heard from parents and students and teachers?

Speaker 2

Yeah, I've heard quite a lot of harrowing stories that are really motivating me to push on with this and actually show how urgent it is. One of the main stories, or the key stories that I've been hearing, is about a thirteen-year-old girl. I've got a twelve-year-old boy, so thirteen still feels awfully young to me. She was deepfaked and it was shared amongst her peers at school, just year nine, and she attempted suicide

on school grounds. It was absolutely terrifying and traumatizing, not just for the individual, but for her peers and the school, and the lack of support and resource around this is really terrifying. That would be on the extreme side of things, but other than that case, I've heard of many, many other young people, predominantly females, where this is happening to them, whether it's at school, university for example, or within the workplace.

Speaker 1

Thanks for joining us, Laura. Thank you. Last week, Michael Forbes, a former deputy press secretary in Christopher Luxon's office, was exposed in a Stuff investigation for having allegedly made audio recordings of sex workers and photographed women on the street without their consent. This case, alongside the calls for a crackdown on deepfake porn, has highlighted the issues facing our legal system in keeping up with new technology. To discuss further, we're joined now on The Front Page

by Dr Cassandra Mudgway from the University of Canterbury. First off, Cassandra, what are some of the current laws in place around covert audio recordings? I imagine they must be illegal in some way.

Speaker 4

So there are some crimes under the Crimes Act around covert audio recordings, but they only cover audio conversations where the person who's intercepting, which is what it's called, interception, is not party to the conversation itself.

Speaker 1

Right. So if I was to covertly record someone, just in a conversation between us two, that's okay?

Speaker 4

Yes, yes. So if one party consents to that, then that would be legal.

Speaker 1

What about in the case where, I don't know, say it was a covert recording of a session with a sex worker?

Speaker 4

So that is a sort of intimate audio recording, and that is lawful, so it's not illegal under the Crimes Act. The Crimes Act only covers intimate visual recordings.

Speaker 1

What about recording or filming in public places, things like a woman walking down a street, at the gym, extreme close-ups, that kind of thing?

Speaker 4

Yeah, so intimate visual recordings are prohibited under the Crimes Act, meaning the non-consensual creation, possession, or distribution of those. That definition requires, first, that the person you're covertly recording is in a place where they have a reasonable expectation of privacy, and secondly, that the person is in a state of undress or engaged in sexual activity,

or otherwise maybe showering or going to the bathroom. With the covert taking of photographs or videos of women in public places, the obvious fish hook that I see is the place where the women are being captured. Generally when you're out using public spaces, so walking down the road, or in a supermarket, or even using a public gym, you have a lesser expectation of privacy

than you do in your own private home. And even where you could argue that perhaps zooming in on certain body parts, where that becomes the subject of the image or video, could be recognized as a situation where you have a higher expectation of privacy, the image itself, according to the definition, must be of those body parts being exposed in some way, so naked or in your underwear. So it's not easily captured under that slate of criminal offenses.

Speaker 1

Does any of this count as harassment, potentially?

Speaker 4

It really depends on the circumstances. So under the Harassment Act, harassment involves a pattern of behavior, so at least two instances, and that includes specified acts, which does include following or watching, as well as other behavior that causes a person to fear for their safety. So covert filming could meet this definition

if it's repeated and/or if it causes fear or distress. However, proving intent or knowledge regarding that fear can be a legal hurdle, I think, especially when the conduct is framed as more voyeuristic rather than threatening. It's probably also difficult to prosecute if the women being followed or watched never become aware of it. I can see that as being difficult to prosecute.

Speaker 1

Yeah, does it seem like the laws are a bit outdated here?

Speaker 4

Yeah, I think that this situation has really exposed some gaps in the law, particularly when it comes to those intimate audio recordings, which the women who brought this forward were shocked to find wasn't a criminal offense. Generally, this is a really good opportunity, I think, to step back and have a look at our current laws. While they address some forms of non-consensual recording, as we've discussed, they were designed with specific technologies and

behaviors in mind. That slate of criminal offenses I'm talking about was created in two thousand and six. We're talking about the influx of digital cameras. It wasn't created with everyone having high-powered cameras on their phones like we do now with smartphones. So as a result, the criminal offenses that we have fail to capture newer, equally harmful conduct like covert intimate audio recordings, but also other things like synthetic media abuse,

like non-consensual sexualized deepfakes. So I think that we need to look at law reform in a different way, something that is more technology neutral, and perhaps look at what kind of harm we want to protect people from. So what do we want to protect? We want to protect sexual autonomy, bodily integrity, privacy, and a real focus on that lack of consent, I think.

Speaker 5

Initially we figured that there was a gap in the law. It's something that was raised as far back as twenty eleven.

Speaker 6

It's not a criminal harassment offense. I feel this case is so extreme, it's the most extreme end of cyber harassment. But it could be used to say, okay, this probably doesn't have any...

Speaker 5

And it has been a constant query from the victims I've spoken to.

Speaker 7

Why, why can the government not step in and put protocols in place to protect people? Surely there must be something out there that, you know, the government can do.

Speaker 5

Well, sorry, it turns out our laws are pretty good.

Speaker 1

Last year, the Herald released Chasing Ghosts: The Puppeteer, which looked at the case of catfish Natalia Burgess. It looked at the legal side of trying to stop her and found there were laws that could target her, but not a specific anti-catfishing law or anything like that, despite a lot of the victims wanting something like that put forward. Do you think we trend towards being, I suppose, too specific in some of our laws on issues like this?

Speaker 4

Yeah, because catfishing is a kind of impersonation scam, so it was sort of like a romance scam, colloquially, that's kind of how we refer to it. And if you are too specific, too tech-specific, it could create gaps later on. I would assume that would be something around fraud or blackmail. But part of the purpose of criminal law, I think, is to signal public condemnation of certain harmful behavior. There is a purpose to explicitly criminalize

a specific behavior. You are making it very clear that this particular behavior is criminal behavior, and if you are convicted of that offense, it carries the stigma of committing that particular behavior. That's what we call fair labelling. So I can see how that could be a really good argument for making a specific catfishing offense or a

specific deepfake offense. You know, I can see that being very useful. But again, it runs the risk of being too tech-specific, and, you know, you can find gaps in the law later when new tech is developed and there are new ways to commit this kind of harm in the future.

Speaker 1

What kind of laws would you put forward to capture all of the things that we've been talking about, and to make sure that it's future-proofed, I guess?

Speaker 4

Yeah, I think it is just going back and thinking about, as I said before, law reform in a principled kind of way, focusing on what we want to protect and what is the harmful conduct that we want to criminalize, and that focus on a lack of consent. I think that's a whole-of-society sort of conversation to be had.

Speaker 1

It's interesting as well, because these things happen all the time. Look at the stalking laws that have just come through, or been passed. I mean, we do have the ability to look at these kinds of consent issues and create legislation to cover it, which covers things like covert recordings and taking photos of someone in the street, ay?

Speaker 4

Yep, yep, the new stalking law, although we don't know what its final form is going to be, probably would have been helpful in this situation, and could have meant an earlier police intervention as well.

Speaker 1

Thanks for joining us, Cassandra. No worries, thank you. That's it for this episode of The Front Page. You can read more about today's stories and extensive news coverage at nzherald.co.nz. The Front Page is produced by Ethan Sills and Richard Martin, who is also our sound engineer. I'm Chelsea Daniels. Subscribe to The Front Page on iHeartRadio or wherever you get your podcasts, and tune in tomorrow for another look behind the headlines.
