Today, I’m talking to Verge policy editor Adi Robertson about a bill called the Take It Down Act, which is one in a long line of bills that would make it illegal to distribute non-consensual intimate imagery, or NCII. This is a real and devastating problem on the internet, and AI is just making it worse.
But Adi just wrote a long piece arguing that giving the Trump administration new powers over speech in this way would be a mistake. So in this episode, Adi and I really get into the details of the Take It Down Act, how it might be weaponized, and why we ultimately can’t trust anything the Trump administration says about wanting to solve this problem.
Links:
The Take It Down Act isn’t a law, it’s a weapon | Verge
A bill combating the spread of AI deepfakes just passed the Senate | Verge
Welcome to the era of gangster tech regulation | Verge
FTC workers are getting terminated | Verge
Bluesky deletes AI protest video of Trump sucking Musk's toes | 404 Media
Trump supports Take It Down Act so he can silence critics | EFF
Scarlett Johansson calls for deepfake ban after AI video goes viral | Verge
The FCC is a weapon in Trump’s war on free speech | Decoder
Trolls have flooded X with graphic Taylor Swift AI fakes | Verge
Teen girls confront an epidemic of deepfake nudes in schools | NYT
Credits:
Decoder is a production of The Verge and part of the Vox Media Podcast Network.
Our producers are Kate Cox and Nick Statt. Our editor is Ursa Wright.
The Decoder music is by Breakmaster Cylinder.
Learn more about your ad choices. Visit podcastchoices.com/adchoices