
What AI anime memes tell us about the future of art and humanity

Apr 03, 2025 · 1 hr

Summary

Brian Merchant joins Nilay Patel to discuss the ongoing AI art debate, focusing on the controversy surrounding Studio Ghibli memes and the broader implications for art, artists, and copyright. They explore the economic pressures AI places on artists, the moral and ethical considerations of using AI art generators, and the tech industry's engagement with art. The conversation also covers potential solutions for supporting artists in the age of AI and the future of creative work.

Episode description

Today, we're diving headfirst into the AI art debate, which, to be honest, is an absolute mess. If you've been on the internet this past week, you've seen the Studio Ghibli memes. These images are everywhere — and they've widened an already pretty stark rift between AI boosters and critics. Brian Merchant, author of the newsletter and book Blood in the Machine, wrote one of the best analyses of the Ghibli trend last week. So I invited him onto the show not only to discuss this particular situation, but also to help me dissect the ongoing AI art debate more broadly.

Links:
OpenAI's Studio Ghibli meme factory is an insult to art itself | Brian Merchant
Seattle engineer's Ghibli-style image goes viral | Seattle Times
OpenAI just raised another $40 billion round from SoftBank | Verge
ChatGPT "added one million users in the last hour." | Verge
ChatGPT's Ghibli filter is political now, but it always was | Verge
OpenAI, Google ask the government to let them train on content they don't own | Verge
Studio Ghibli in the age of A.I. reproduction | Max Read
OpenAI has a Studio Ghibli problem | Vergecast
AI slop is a brute force attack on the algorithms that control reality | 404 Media
The New Aesthetics of Fascism | New Socialist

Credits: Decoder is a production of The Verge and part of the Vox Media Podcast Network. Our producers are Kate Cox and Nick Statt. Our editor is Ursa Wright. The Decoder music is by Breakmaster Cylinder.

Learn more about your ad choices. Visit podcastchoices.com/adchoices

Transcript

Support for the show comes from AlixPartners. The market is evolving at breakneck speed, and we're only starting to understand how disruptive forces such as AI, cyber threats, and tariffs will change the game. The winners will be those who prioritize execution and know when to adapt. And for unparalleled insights, they can turn to the AlixPartners Disruption Index. Stay tuned to hear more about it later in the show.

In the face of disruption, businesses trust AlixPartners to get things done when it really matters. Read more on the latest trends and C-suite insights at disruption.alixpartners.com.

I can't prove, and neither can anyone else, that a computer is alive or not, or conscious or not, or whatever. I mean, all that stuff is always going to be a matter of faith. What I can say is that this emphasis on trying to make the models seem like they're freestanding new entities does blind us to some ways we could make them better. So how can we make artificial intelligence better? That's this week on The Gray Area. New episodes every Monday.

Scientists find weird kinds of life all the time. And normally they can run experiments. If I hypothesize life can live in bleach, well, I can get bleach and see if life lives in it. But what if the weird thing about the life they find is that it lives for millions of years? Time. I don't have any control over that. I can literally do nothing with time. This week on Unexplainable: intraterrestrials. Aliens on Earth, deep beneath the seafloor. Follow Unexplainable for new episodes every Wednesday.

Hello and welcome to Decoder. I'm Nilay Patel, editor-in-chief of The Verge, and Decoder is my show about big ideas and other problems.

Today, we're going to talk about AI, art, and the controversial collision between the two. A debate which, to be honest, is an absolute mess. And if you've been on the internet this past week, you undoubtedly know that controversy just kicked up a notch because of the Studio Ghibli memes that you can create using OpenAI's new image generator.

These images, which unmistakably lift the style of the legendary Japanese film studio, are everywhere. They're even being shared by OpenAI executives like CEO Sam Altman. And they've widened an already pretty stark rift between AI's boosters and its critics. Brian Merchant is a good friend of The Verge and author of the newsletter and book Blood in the Machine.

He wrote one of the best analyses of the Ghibli trend last week. So I invited him onto the show not only to discuss this particular situation, but also to help me figure out the ongoing AI art debate more broadly, as it continues to collide with legal frameworks like copyright. Now, Brian and I tend to agree more than we disagree when it comes to tech and culture, so I did my best to really take the other side here and push on these ideas as hard as I could.

Technology and art and culture have always been in a dance with each other. That's part of the founding ethos of The Verge. And I think it's important to put AI in that context.

Not least because we can see the obvious joy regular people find in using some of these tools to express themselves in ways they might not otherwise be able to. But there's expressing yourself, and then there's churning out AI slop that's designed to evoke classics like Spirited Away and My Neighbor Totoro in a way that devalues and even outright steals from actual human artists.

Add in the way that the Trump White House jumped on the trend by Ghiblifying a deportation photo, and it's not hard to see why a lot of people perceive these tools as utterly grotesque and offensive. Or as, quote, an insult to life itself, as Ghibli co-founder Hayao Miyazaki once famously said of an AI demo he witnessed back in 2016. So you'll hear Brian and I really go back and forth on what this all means for art and artists,

and for a creative economy that has long since transitioned from the world of physical scarcity to one of limitless digital supply. And most importantly, we spend a lot of time talking about how we should feel using these tools when they might pose very real threats to people's livelihoods and the ongoing climate crisis.

I'm going to warn you, there are no easy answers here, and I don't think Brian and I came to a single conclusion. I don't even think we wanted to. But I do think this conversation helped me consider more clearly how to think about AI in art. But you let me know what you think. Okay, the AI art debate. Here we go.

Brian Merchant, you're a journalist. You're the author of Blood in the Machine, which is both an excellent book and an excellent newsletter. Welcome to Decoder. Thanks for having me, Nilay. Every time we talk, it's an existential crisis, and I think this one's going to be no different. It's a never-ending stream of crises in 2025, isn't it?

Yeah, here's what I want to talk to you about the most, but I also want to broaden this out to maybe the nature of art itself. Last week, obviously, OpenAI released its newest model. Everyone went nuts doing Studio Ghibli memes and creating art. It is a time of crisis and conflict and tension and drama, and that's usually when art gets made. Like, those things go together.

It's weird to me that a tool that lets so many more people create so much more art is itself a source of this much drama and tension and conflict. How have you been thinking about that?

Well, first I would put a little caveat there and say that I think that, and not to dive right into the deep end on the very first question right away, but I think a lot of people would question the premise of that question, right? Is this... creating art, if you're just typing something into a prompt box and kind of getting a result, and it kind of looks like art? I mean, I'm not saying it's not. I'm saying, like, this is the knock. This is sort of

the, I think, justified critique of, like, this practice that so often the AI folks claim is democratizing art. That's the line, of course. But is it just, like, automating art, which kind of... automating the production of art-like products? That seems to be a more accurate descriptor of what's happening.

And not that there's not tons of gray area, right? There's a huge spectrum. There are a lot of different use cases. But what really sort of lurks at the heart of the issue people have with this is the ease and the volume by which now you can just sort of... flood the zone with stuff that's just sort of a reproduction of previous art, right? Okay, we're in the deep end. We're going all the way there.

I'm ready for it. And I will tell you and the audience, Brian and I largely agree on so many things. So I'm going to do my level best to actively disagree with you through the course of this conversation, because I think it's important to challenge these assumptions from both sides. So here's the hard question that I've been really wrestling with.

Do you have a working definition for art? Now, as someone who's like not an artist, right, I'm a writer, I'm a creator, I produce things that sort of live in the world. I, you know, I don't claim to have sort of my... special corner of expertise be on what art is. The short answer to that is no, I don't really have a working definition of art. And I'm not a scholar or a theorist on the subject. I just spend a lot of time talking to artists.

And my sort of interest and concern tends to be less with, you know, whether or not you can use an AI tool to produce art, which it seems to me like you can. But a lot of the stuff, obviously, that's just being flooded on social media, I don't even think the people producing it would say this is art. I think that they would say, like, this is a meme that I made to get in on this viral craze. Or if you're, like, you know, the White House social media manager, then it's a way to sort of

use a trend to broadcast a policy objective or to make a statement, right? As we saw them getting in on the Ghibli, Ghiblification trend as well. I think for me, it's less important, you know, whether or not, you know, the actual end product is considered art. But I'm more looking at, you know, what's happening to sort of the market for art, for, you know, the livelihoods of artists, as our economy, our world, gets flooded with all of this stuff and it does things like...

push down sort of wages for artists, sort of replaces a lot of use cases for commercial artists, right? Like, I've seen a lot of artists talking about how they've just already lost huge chunks of their income because people are using Midjourney for stuff that's not public-facing. Because, you know, obviously if you're going to publish something with a magazine, you don't want to use AI art, because you're going to get blowback.

A lot of, you know, PowerPoints, internal presentations, even, like, trade mags, stuff that used to be graphic design. A lot of that stuff that used to be the province of artists and illustrators is now sort of being replaced by the AI-produced stuff. And that's more what I'm looking at, what's going on. And that's why I think the critique sort of resonates with me. The reason I started with what is art is because I tend to live in the same space you're talking about, right?

In copyright law, in markets, and what is the business of this stuff, who is making the money, where's it going. And it just strikes me that, you know, the copyright law perspective is really about scarcity, right? Copyright law is there to create scarcity. It's to say, you own this thing, you can buy it and sell it. No one else can buy it or sell it, even though on the internet, everything is copies of copies all the way down.

We've manufactured a legal fiction that creates some scarcity. And that is under a great amount of stress because of the internet, right? The original scarcity was literally in the physical goods. There was only one copy of this book, and you can buy it and trade it however you want. There's only one copy of this vinyl record. That's all gone. So now we live fully in this legal fiction of...

Well, that's mine, because I and the government have said that's mine. And then you get to this next turn, and it's, okay, Miyazaki has a style. Studio Ghibli has a style. And now we're saying that belongs to him. And that's a stretch. That's a new one for me. And it feels like you need to have a framework outside of copyright

to say, okay, now we're going to say the style belongs to someone. And that, that feels like it challenges a lot, many, many more presumptions in how we make art and who gets to make art. I think you're absolutely right. And I think, I mean, you're the copyright law expert, you know, that's not protected, like, explicitly. Like, the style of a studio is not protected. And I think this example is interesting because... I think, you know, Miyazaki and his works are not

particularly threatened by AI, right? Like, they are the sort of embodiment of handcrafted, sort of laborious, meticulously created work. And they've built up that reputation over decades and decades. So people are going to go see a new Studio Ghibli film if it comes out. It's not like they're at real risk of competition from AI-generated stuff. But I think it's more, like, the point

that this is surfacing, like, that this is making, that that's sort of resonating with people: the ease by which it is possible to flood the zone with this kind of stuff. And I think obviously there is just sort of... the indignity that seems omnipresent in it. Because, you know, Miyazaki has that famous line when he's shown an AI-generated

CGI kind of zombie thing walking, and sort of the developers and the designers of that program kind of show it to him. And then he responds with some of the strongest sort of revulsion you could possibly imagine. And then, so he's kind of... He says this is an insult to, you know... I can read you a line. He says, I'm utterly disgusted. I would never wish to incorporate this technology into my work at all. I strongly feel that this is an insult to life itself. Which is just one of the hardest

things you can ever say. You should say that all the time. You get a bad cheeseburger, you should be like, I'm utterly disgusted. Yeah, that's how I'm sending back my food from now on. But then, like, the camera kind of, like, zooms in on the guys who are, like, presenting it to him, and they're just, like, sort of frozen in horror. I mean, you've got to feel bad for them.

I think that is all kind of wrapped up in it. And I think this particular cycle, there's a number of ways to read it, but I do think that this particular burst of outrage was more over sort of, like, that moral quality than the copyright dimension, which I, which you are absolutely right, like, that is the toughest ground to navigate. I hope we talk about that more in a second. But, like, yeah, I think this is, like, the sort of, like, the moral indignity, the fact that, like, everybody's aware

of Miyazaki's sort of critique, or at least a lot of, you know, the AI guys are, the most sort of ardent, you know, pro-AI-art-generation. And, you know, certainly Sam Altman has been forwarded this critique before. And so seeing all these guys kind of... throw the avatar, you know, in their X bios. I think a lot of the artistic community felt like this was a real affront, and I heard from a number of artists who were really upset in a way that...

They're used to contending with the proliferation of AI tools, and have been for a couple of years now, but this was kind of a new level. It really kind of felt like the AI community was trying to rub their noses in it. I'll just put this in context. Just recently,

Graydon Carter, who was the famous celebrity editor of Vanity Fair for years and years, put out his memoir. And that's had a lot of ex-Vanity Fair writers and editors from over the years coming forward to talk about the heyday of magazine journalism. Bryan Burrough, who was an investigative journalist there from 1992 until 2017, wrote: For six months a year, I wrote three pieces, 10,000 words each, and I got paid $500,000 for that work. That's a lot of money for three pieces a year.

We're all listening to the depth of the quality. That's a lot of money for three pieces a year. Bryan even says, I know this is obscene. That is a reflection of supply and demand. There wasn't so much supply, but there was still the same amount of demand for new stories, new information. And because

Vanity Fair at that time was a print magazine. You had to go and buy it. The economics of that worked out. I don't know that the economics of Condé Nast, per se, actually worked out, but they worked out enough for that to happen from 1992 to 2017. That's all gone now, right? There's an infinite amount of free content. And the thing that creates value for musicians, for artists, whoever, is some sort of personal relationship

with the audience. For Substackers, most of the Substackers I know are tweeting all day long for free and then giving away their best work for free in order to go build an audience and build a relationship with the audience. That's the other model, right? That's the one that everyone always comes and argues with me about. The other model is built on relationships, not scarcity.

I don't know how AI fits into that mix, because it seems very obvious to me that anybody with enough of a relationship could just start slipping a bunch of AI work into their feeds. Yeah, I mean, I think you could get away with it, right? I think because, yeah, I mean, it's all about parasocial. That's the word. And it's, you know, it's sort of explicitly recommended to folks doing this stuff. Yeah, as you, you know, sort of gestured towards, I recently started kind of

taking a swing at doing this newsletter thing as kind of a full-time experiment for at least the next six months or so. And it's interesting, and you get all the metrics built in and everything, and you can see sort of... the result of putting hours and hours and hours of human labor into a story versus

something that you blast out in an hour or two, and it just happens to catch the right algorithmic wave on a social media network, and then you get rewarded for that one. So you can already see the incentives are kind of built in. So, like, you know, if that's the case, right? And you could even talk about how the middle piece of that evolution from magazine journalism

to AI-generated content is blogging and aggregation and the period of digital media that we lived through where the demands by readers were broken down a little bit. You know, it wasn't AI. It was like a CMS or like a blogging platform. But it enabled, you know, more writers to produce more work with less labor and less cost, you know.

That's how I broke into it. And now, yeah, maybe, you know, there are people who are explicitly trying this now. 404 Media's Jason Koebler wrote a piece about sort of this new sort of AI slop economy where people are just... flooding the zone with AI-generated stuff.

And if you pair that maybe with just a strong personality and you're talking about the slop that you're creating, that could be a new formation that we've stumbled on here too. But I think you're right. I think this is a further sort of... breakdown in the labor process of journalism. And I mean...

Say what you will about creating art or creating opinion pieces or whatever, which I think there's immense value to all of that stuff. But, I mean, journalism especially, right, stands to be hit by this wrecking ball because, A, AI cannot really do good journalism. You can find other stuff that's been written online and maybe aggregate it, but even then, you still need somebody who's actively seeking out new information.

And things like journalism and reporting stand to be hit the hardest by all these winds. Obviously, I only want to talk about the threat to journalism, in a very self-serving way. That's all I care about. But I'm looking broadly at art, like, art, visual art, music, film, all of that. And the first thing that I think about always is, whenever you go to an art museum... I would always go to the Art Institute in Chicago when I was in school.

You would just see the students sitting in front of the great works, copying them, like literally copying them in order to make their bodies feel the brushstrokes that the masters had felt so that they would learn those moves and learn those feelings. And then inevitably, that is the basis from which they might go on to become great artists. And that is repeated all over the place, right? You buy a guitar, you learn to play your first three chords.

You're going to feel like Johnny Ramone for five minutes, and maybe you learn some more chords and you go on from there, right? Like, that is the basis by which all great artists are made. Is there something meaningfully different in saying, I wish to communicate like Miyazaki, I'm going to go use this tool and try to make people feel the way I want them to feel? The whole thing is that you're not only sort of cutting out those fundamental steps, and this is

This is the complaint that I hear from educators as well, right? When they're, you know, like, seeing AI tools being adopted in the classroom, where it's like, well, if I can just get the answer and get a fully formulated paragraph, then I don't have to think through it and I don't have to flex the cognitive muscles that are necessary to begin to construct paragraphs. And 100%, I think that that's... that's the risk here, is that you're eliminating all these steps

that are necessary in trying to sort of process a thought, sort of free a thought, like, connect with a thought, express the thought, right? If you can just hit a button and you can just sort of get the automatic, automatically generated output, then you are sort of dissociated, sort of by definition, from that output, right? You don't have a whole lot

to do with it, especially if it is sort of this aggregate amalgam of previously generated works. I think where the artistic... potential might come in is then if you do something else with this output, or you work to get interesting outputs from it. I think that that's, like, a relatively narrow slice of, you know, how these tools are actually being used. You know, there's always

going to be, like, a Holly Herndon, who's kind of, like, really doing, you know, the extra work to try to make this stuff interesting. And then you have, you know, the other 95%, which is the zone being flooded with sort of people just...

churning out output. Ted Chiang has written about this too, quite interestingly, I think, in that... A lot of the sort of jobs or tasks that AI stands to automate are also ones that, you know, we might not like bemoan or sort of say, oh, it's really important that we have junior copywriters or like society needs those to function.

A lot of aspiring writers, aspiring artists, you know, they take these sort of jobs to hone their craft. They're writing a lot. They're figuring it out. A lot of craft, a lot of art, all of it takes repetition, takes practice, takes being in the trenches, figuring it out.

If we remove those sort of opportunities for people, then we're going to have a lot fewer artists in the long run who can actually sort of make a living or hope to make a living practicing doing art. And I think that's another sort of... element that often gets overlooked in the AI art debate. Again, and it comes down to its relation to material conditions. We need to take a quick break. We'll be right back. Support for Decoder comes from Vanta. Trust isn't just earned.

It's demanded. Whether you're a startup founder navigating your first audit or a seasoned security professional scaling your GRC program, proving your commitment to security has never been more crucial or more complex.

That's where Vanta comes in. Businesses use Vanta to establish trust by automating compliance needs across over 35 frameworks like SOC 2 and ISO 27001. Vanta also helps centralize security workflows, complete questionnaires up to five times faster, and proactively manage vendor risk. Vanta not only saves you time, it can also save you money. A new IDC white paper found that Vanta customers achieve $535,000 per year in benefits,

and that the platform pays for itself in just three months. You could join over 9,000 global companies like Atlassian, Quora, and Factory who use Vanta to manage risk and improve security in real time. For a limited time, our audience gets $1,000 off Vanta at vanta.com slash decoder. That's V-A-N-T-A dot com slash decoder for $1,000 off. Support for Decoder comes from Kinsta.

Running a business can be overwhelming, especially if you have no idea how to manage a website. That's where Kinsta comes in. They can help you manage your WordPress site and your hosting so you can focus on the millions of other things you've got going on. Their expert team handles it all. They've bundled up all the essentials so you can easily manage your WordPress site without having to be a one man support team. That includes security that never sleeps.

and a dashboard that's incredibly intuitive. And when you hit a snag, you'll be able to talk to real humans, 24-7, 365. Actual people who get it, not AI chatbots. In short, Kinsta is perfect for those who want professional results without needing that technical background. So if you're tired of being your own website support team, you can switch your hosting to Kinsta and get your first month free. And don't worry about the move.

They'll handle the whole transition for you. No tech expertise required. Just visit kinsta.com slash decoder to get started. That's K-I-N-S-T-A dot com slash decoder. Vox Creative. This is advertiser content from Mercury. Hey, I'm Josh Muccio, host of The Pitch, a Vox Media podcast where startup founders pitch real ideas to real investors. I'm an entrepreneur myself. I know and love entrepreneurs. So I know a good pitch and a good product,

especially if it'll make an entrepreneur's life easier. So let me tell you about a good product called Mercury, the banking service that can simplify your business finances. I've been a Mercury customer since 2022. From the beginning, it was just so clearly built for a startup. Like there's all these different features in there, but also they don't overcomplicate it. Here's your balance. Here are your recent transactions. Here you can pay someone or you can receive money.

These days, I use Mercury for everything like managing contractors, bill pay, expense tracking, creating credit cards for my employees. It's all in Mercury. Mercury, banking that does more. Mercury is a financial technology company. Banking services provided by Choice Financial Group, Column N.A., and Evolve Bank & Trust, members FDIC.

We're back with journalist and author Brian Merchant discussing AI in art. Before the break, we were going over the Studio Ghibli saga and how so much of what we're arguing about regarding generative AI today has to do with scarcity. Before widespread digital tools and internet distribution, it took a lot of specialized human labor to make art, whether that art was hand-drawn anime or in-depth magazine journalism.

That means some of the people who were fortunate enough to be professional artists or writers got paid handsomely, while for others it was difficult, maybe even impossible, to break in. But now we live in a world of digital abundance.

where anyone can make content and distribute it for free to an audience that might be infinitely large. That's pushed prices way down and made it harder to be a professional creative. And as you heard Brian and I discuss, generative AI might just be the next turn there, even if it's accelerating the economic pressures around art in ways that might push it to a breaking point.

But I wanted to keep playing devil's advocate here, because I think it's important to really stress test the argument against AI art. I don't think that it's just going to go away, after all, and the forces that want society to accept it are expanding well beyond just the AI companies themselves.

All right, I'm going to piss everybody off by asking this question in this way, but I'm doing it to be provocative. I look at the history of that. I look at the history of famous magazine editors and people who got to come up in the film industry, and famous photographers. I'm like, oh, that's a bunch of nepo babies. Like, just straight up. That's who they are, right? They are rich people who had rich parents or parents with industry connections. And before the internet, I had no shot

of going to work for a magazine in New York to write about anything. Before the internet, there was effectively a 0% chance that any major radio company would have given you and me an hour to talk about the implications of technology and art. It simply would not have happened.

And all of this technology has democratized that, right? It has made it possible for us to create, for us to make livings, maybe at lower numbers than the nepo-baby magazine editors of yore, but it's made it possible to some extent. And this is just the next turn, right? Everyone has within them an artist.

Most people have jobs and families and other commitments that make honing their craft impossible. And if what you really want to do is express what's in your heart, maybe it's not so bad to just type a prompt and get a piece of artwork that reflects your innermost desires. I don't know.

I struggle with that. I see that. Yeah, I mean, I would say that, like, yeah, a lot of the debate around AI always tends to obscure the fact that the situation wasn't particularly great for artists beforehand, right? This was, again, it was a very precarious, very difficult sort of way to make a living. And that's the reason that a lot of them are fighting, you know, quite valiantly, I think, this rearguard effort against this new sort of

economic force that's being levied against them. So, yeah, I don't think the core problem is or ever was AI. I think it is the very sort of... foundation of the way that we've sort of organized our economy to treat artists in general. And we, especially in the United States, we don't quite value them as much as many other sort of nations that have programs that support artists.

that have, you know, better social welfare systems that can support artists while they're getting going. All these things... we make it incredibly difficult to become artists or journalists, or, you know, anything that isn't corporate law or, you know... Applications to law school are a recession indicator, by the way, and they're skyrocketing. So I hear what you're saying. Right. But yeah, I mean, there is, I think,

a tendency to push back on that, which I think that sentiment is true. I don't know that we necessarily would be able to have had, like, the career trajectories that we've both had, both through, you know... I went through Vice, you went through... The Verge. AOL. It was so much less cool than Vice. I want to be clear about that. Well, it's still around. So The Verge is still standing. It didn't get gobbled up by a private equity firm and set on fire. So it has that going for it.

AOL did, though. I want to be clear about that. That absolutely happened. Yes, AOL, yes. And now, I mean, The Verge... Technology can sometimes, yeah, provide an opening. It does disrupt standard economic practice, right, where you have a new entry point into an industry that was, you know, gatekept. And there is, as always, a kernel of truth to sort of the complaint, or the counter-complaint, from the tech industry, which is that

All the artists are just gatekeepers. And of course, they then go on to say, like, well, we should just be able to sort of use their work and replace them anyways. It is a difficult industry to break into. And that's part of the reason why a lot of the artists that feel threatened by this are sort of speaking out the most loudly and are sort of, like, hanging on for dear life. And that's understandable, but it's because it's precarious in the first place. It's because...

It's so rarified that it inspires a lot of passion. I think that there's been a lot of work and a lot of effort to get where they are. And I think that's understandable. But I think the answer is not to say we figured it out with digital media last time because...

We see now that we didn't really, as you know, it's a constant stress. I've been, you know, part of, like, three mass layoffs at this point. And, you know, private equity has swooped in and sucked up digital media firms left and right. And it's an ongoing disaster. But, you know, to say that with AI, you know, things will eventually kind of sort themselves out, I think, is the wrong response, too. I think it's an opportunity to say,

What do we really want? Like, how do we really want to nurture artists? Like, if we really do want to have a thriving artistic community that can, say, then be, you know, part of creating new works for the next generation of LLMs or whatever, then shouldn't we be finding ways to support them? So I do think this is an opportunity to say, like: artists feel like they're facing an existential threat.

The answer isn't probably to abolish AI. That cat is kind of out of the bag, right? But what do we do instead? How do we get artists the resources they need? One of the things you talk about is the idea that... In particular, OpenAI released their newest model with Studio Ghibli memes as a provocation.

To say, we can just take this from the artistic community. There's a writer, Gareth Watkins, who wrote, for a magazine called New Socialist, that AI art itself represents what he calls the new aesthetics of fascism. You can just flood the zone on these platforms and devalue things like rigor and skill and craft by saying we can just take it, we can use it, we can use your art to express whatever we want, and that you can't distinguish it from them.

That's a lot of ideas, right, all wrapped up into basically a power dynamic, into the big tech companies saying, okay, we have the power to just take this. Maybe you'll sue us. Maybe we'll win or lose a couple of copyright lawsuits. Maybe we'll get the White House to change the law, which Google and OpenAI have asked for. That's all just a power dynamic. Is that, at heart, what artists are reacting to, this massive imbalance in power?

Yeah, I think that's a big part of it. And I think that it's not just sort of relegated to artists. I think that is maybe, you know, along with sort of the direct threat that many people perceive to their livelihoods, the source of the sort of, you know, blanket unease or resistance to AI. Especially now, right? Especially now that we have Elon Musk in the White House and his DOGE lieutenants talking about introducing AI-first strategies,

building AI chatbots while they're firing, you know, thousands of federal workers. So we have this, like, logic, you know, on the biggest stage on one hand. And then, yeah, we have these firms, like OpenAI and, you know, to some extent Google and Anthropic and the rest of them, operating by this logic that, yeah, we can take whatever... Whatever has been recorded is ours for the taking, ours for the reappropriating,

ours for the transmuting into profit, right? And I think that last part is also especially sort of angering to folks who've, you know, spent the last 10 years sort of watching big tech sort of amass more monopoly power over the internet, over the tools that they use, over general conditions, and saying, like, well, now the scope has been expanded, right? Now, like, everything's up for grabs. They say they can take that and they can use it how they see fit.

And then, yeah, and then Elon Musk and the White House do give us, like, as we mentioned earlier, perfect examples of that in action. Like, they make the Ghibli memes. They just sort of... spit them out and sort of offer, like, kind of a final stamp or an exclamation point on the argument. So, yeah, I do feel like seeing examples like this is further fomenting anger and sort of an expectation that it's going to tilt the scales toward

inequality further, and I think that intuition happens to be right. I think that's exactly what's happening right now. And people have a real reason to sort of feel disempowered and angry. And artists... in particular, right? This is going to be ridiculous, but I want you to go with me on this. Do it. Every time I end up in a conversation like this, I think about my Midwestern aunts who have live, laugh, love signs in their kitchens.

And I have a lot of Midwestern aunts on both sides of the family who have live, laugh, love signs in their kitchens. I'm from the Midwest. Do you have a live, laugh, love sign? You should get one. I'm going to make you one with ChatGPT. I regard these words with actual empathy, because it's always obvious to me what the point is. They just want you to feel that way in their kitchen. Like, they're instructions. Like, please laugh in this kitchen. Like, love, I command you. Love in this room.

And, like, that is as blunt of an instruction as you can give someone. And I think it's very silly and fully cringe, and it's certainly not my personal style, but I kind of get it. I see why those things happen, why those objects permeate the Target stores of our lives. And I think about this industry and the characters in it, and most of them have about a live, laugh, love capability in terms of emotional communication, right? That's about as far as they can go. They're like...

enjoy, like, is basically what they can do. And the last CEO that could actually connect emotionally with people, artistically with people, was Steve Jobs. He was great at it. But I think he was the height that this industry has ever seen. He might be the height this industry will ever see. And he was very into, hey, I made a new startup sound for the Mac. I hired the entire New York Symphony Orchestra for a week to make these samples for GarageBand.

He was proud of the money he was spending on art to infuse his computers with humanity. And he would stand in front of that intersection sign at the end of all his events and be like, Apple is at the intersection of technology and the liberal arts. And everyone was always like, the liberal arts? Like, Vassar? Like, no one ever really understood what he meant, right? But he meant art. Like, just generally, art and the humanities was the point that he meant. And that seems to have gone away.

Right. It's... we're not talking about that anymore, except these AI art generators are a reemergence of that. And they seem to have given a bunch of characters with approximately live, laugh, love levels of emotional communication ability, like, a level up to say, okay, now we can do art.

And I'm just, like, I'm sitting here just, like, trying to wrap my head around that. Like, are they just jealous of Taylor Swift, that she can command the hearts and minds of millions and they can't because they run search engines? Yeah, I mean, that's the question. I wrote, in the piece that kind of is the jumping-off point for a lot of this discussion, that I was long very skeptical of sort of the argument in many artistic communities that...

The AI industry folks actively have contempt for art. There's this line: they hate us. They hate art, and they want to prove that they don't need it, or that they can do it, or that they don't need us and they can do it without us. And before this latest sort of brouhaha, I sort of...

wrote that off as an overreaction of the artistic community under tense situations. But now I can kind of, like, empathize with that, and I can kind of see, like, why. You know, when I posted the story, I should mention, like, on X and on Bluesky, but only on X, I got reactions from a ton of sort of folks in the AI industry who literally were embodying that ethos. I got the quote, it was literally, fuck Miyazaki, literally sort of embodying that. Like, we can do what we want.

He's outdated. He may have made some stuff 20 years ago that was good, but he doesn't matter anymore. This is the new age. And real sort of, like, Italian Futurist vibes, where, like... out with the old art, in with the new. And I don't think that it's sort of a coincidence that, like, that really did become sort of the aesthetics of fascism. And that might

sound strong. That might be a lot. But as you said, there's been a lot of sort of talk lately about, you know, why the White House, why Elon Musk, is gravitating towards the use of this stuff. Because it is kind of being used, you know, in this way to subsume or to sort of conquer or to domineer over artists. And I do think, like, you're absolutely right. I think this element has completely fallen off. I know exactly what you're talking about with Steve Jobs. And it was always sort of

front and center. It was a top concern that things both be sort of usable and beautiful. And, you know, you can debate the extent, you know, to which, you know, that became a core, you know, principle of Apple after he left. And I know there's been a lot, you know, a common, like, refrain now when Apple does AI stuff, as you know, is, like,

Steve Jobs would have hated this, right? Apple Intelligence, all that. But it's because it was there. And I think it's true across the board, where there used to be at least more... interest in the aesthetics, in how it intersects or interacts with the humanities. I think it used to be at least... something that people in the tech industry were thinking about. And I don't know if it's sort of just

a logical outgrowth of monopolization or like cold bureaucratic expansion. And we just don't need that anymore. We have all these different product categories that we can just... keep updating and marching out. But whatever it is, I do feel like we are at a point now where we do have this very, very shallow, almost thoughtless, maybe some would argue worse, engagement with art.

or with the concept of art, or even the concept of making something that, you know, people would really care about. I think it does come down to the logic of automation. And I argue this a lot. There are a lot of cool and very good things that can be done with LLMs and with AI. But by and large, the argument that a lot of the industry is making is that you should be using this to automate stuff. Automate this job. Automate the production of an image, you know, video editing.

Automation is inherently sort of antithetical to art, right? You're just automatically producing something. You're not taking the care to stamp it with history and personality and your own individual thought process. And so we have this... tool of mass automation that is now sort of the central focal point of the entire tech industry at the moment. We need to take another quick break. We'll be right back. It's been a rough week for your retirement account, your friend who...

and also Hooters. Hooters has now filed for bankruptcy, but they say they are not going anywhere. Last year, Hooters closed dozens of restaurants because of rising food and labor costs. Hooters is shifting away from its iconic skimpy waitress outfits and bikini days, instead opting for a family-friendly vibe. They're vowing to improve the food and ingredients, and staff is now being urged to greet women first when groups arrive.

Maybe in April of 2025, you're thinking, good riddance? Does the world still really need this chain of restaurants? But then we were surprised to learn of who exactly was mourning the potential loss of Hooters. Straight guys who like chicken, sure. But also a bunch of gay guys who like chicken. The Nintendo Switch 2 is basically guaranteed to be the most interesting gadget of 2025.

And we learned a lot of new stuff about it this last week or so. Some of the games that are coming out, some of the specs of the new device, and the fact that it's going to cost $449.99. Except maybe it's not, because the other thing going on right now is tariffs. And tariffs...

threaten to change just about everything about tech: what it is, how it's made, where it comes from, and, crucially, how much we have to pay for it. So that's what we're talking about on The Vergecast all week, wherever you get podcasts.

We're back with journalist and author Brian Merchant. Before the break, we were discussing the tech industry's history with the arts, and specifically how Steve Jobs represents a kind of high watermark for the intersection of Silicon Valley and the kind of obsessive craftsmanship and detail you might associate with Studio Ghibli. That has of course changed over the years.

Instead of Steve Jobs, we now have Sam Altman, Elon Musk, and the White House's social media team using AI art memes in ways that seem purposefully designed to devalue the human creative labor these images are built on. But while the backlash to AI art is quite vocal, and Brian has done a great job articulating why we should be sympathetic to many of those arguments against the tech, there's one really big problem I've been struggling with for a long time now.

These AI tools are really popular. People like to use them, and they find them fun. So I've been having a hard time making sense of how we balance the fact that this technology might be antithetical to art, to artists, and their livelihoods with the fact that a lot of people get utility out of these tools, and that maybe one day we're going to be talking about this debate the same way we talked about Photoshop or Pro Tools.

The thing that comes up on this show a lot in response to that criticism specifically is, yeah, but when you look at the data, people are using this stuff a lot. The CEO of Adobe, Shantanu Narayen, was on the show, and he told us that generative fill is used at approximately the same rate as layers, which basically means at the same rate as Photoshop itself. We just had the CEO of Splice on. She pointed out to us that people are using the tools at high rates in Splice to make music.

You can just keep going down the line. I use generative AI noise reduction in Lightroom. I think it is fantastic. None of the text stuff, I'm too picky about text, I won't use it there, but it has breathed new life into my old cameras. And you can just see it. For a lot of people, there's an enormous amount of utility here. Right. There's this big society-level automation debate that's happening about the value of art and artists,

whether there's enough scarcity in personal styles or parasocial relationships to support people who make creative work. I don't know if that's true or not, but that's the debate. And then on the other hand, you have a lot of people who have very boring jobs being like, yeah, just write me the email.

You have a lot of people who have very boring jobs saying, yep, just clone that background for me, just knock it out. Like, these assets need to go out today. I'm just going to push the button and get it done. And then you have people just gleefully making Studio Ghibli memes, right? Sam Altman says the GPUs are melting. Sundar Pichai, they released the newest version of Gemini the other day, he said the TPUs are running hot. How do you square this circle?

Yeah, there's two things I want to say to that. Number one is, I think, the biggest problem that I am convinced we have with technological development and deployment, especially in the United States, where we don't like to do a lot of regulations or safeguards or, sort of, you know, legislation to dig into this, is that we don't have good tools to decide as a society what we want to automate and what we don't.

Right. There's this famous line that keeps coming up, that it's like, oh, you know, they promised that technology would automate doing laundry, but it's automating artists. Right? To some extent, like, that does seem out of whack. And I think most people would agree that we want more laundry-folding automation and less, you know, replacing the artists,

in general, as a society. So we lack, like, the levers to sort of decide. Clearly, you know, the market is going to decide in favor of production and having more stuff, for now anyways. And that leads into the sort of the second point that I want to make, which is that, yeah, I just have to register a little bit of skepticism, because, you know, of course Adobe has an interest in sort of saying that these tools are so popular. And I'm sure people are using them. I'm sure they are.

But, like, you know, where are they placing the tools? Are they advertising the tools? Are they promoting them? You know, Google came out and said, oh, AI Overviews, the most popular feature we've ever released. And it's like, yeah, of course it is. You put it right up top on every Google search that everyone's already using. It's not exactly a huge success story when you already have 90% of the search market and you're just throwing a feature up in front of it.

That said, are people automating things because they're squeezed for time and they want to, you know, just get that layer in there and sort of shoot out the thing? Are they, as you said, like, are they just like writing the emails on autopilot because they have 50 other things? Would they like to take the time to write an email out to somebody and respond in detail?

Is it more sort of, like, the logic of our economic system at large that is further accelerating the use of a lot of these trends, or in some cases necessitating them, right? If you have a boss that now is aware of the productive capacity of AI, and, you know, you have a junior colleague that can send 800 emails a day, suddenly that's the standard for you, too, if you have to send those marketing emails. So I think we just have to question every layer of this. Now, are some of them good uses?

Using AI to edit your photos? That seems great, right? You were never going to, like, hire somebody to, you know, edit photos and go through your camera roll and sort of just, like, edit them. No, I mean, I don't know if you've ever used the noise reduction. That's not something you could hire someone to do. Right. Right. I need to light up the GPU on my computer to 100% for five minutes to get this done. Like that is, to me, generative fill too, not really something

anybody at any scale could just hire people to do all the time. And so those are in work contexts, right? Those are professional pieces of software, by and large. Yes, they advertise the features, but professionals only use them because they make the jobs better or faster or cheaper.

I would just contrast that to, we're all going to make a bunch of Studio Ghibli memes and post them on X, which is just a consumer behavior. People were just delighted to do this thing. And that seems like it's reflective of something else. Delighted to do this thing, or having access to a tool that allows them to create some content and maybe chase some likes and follows? It's hard to delineate. Once again, our incentive structures are so scrambled. But yeah, I take your point.

People are having fun doing it. Of course, I am used to this, as somebody who's critical about a lot of this stuff a lot, chiming in with my wet blanket. Like, you know, I get a lot of, like, here comes the fun police posts. But, you know, I think we need to be thinking about this stuff. And, you know, I do think the bottom line, again, is that there's got to be a way to have this technology

without, you know, artists feeling like they're existentially threatened, that they're insulted, that they're suffering indignities day in and day out. I mean, we have this incredible technology. Can't we also have, like, a system, an economic system, that can accommodate, you know, artists too, while using this? And right now, there's a case to be made that we don't, right? That it's just... the free market is pretty ruthless to artists and to creatives, especially if they're freelance, like most of them are.

And it's that that it keeps coming back around to, that vulnerability. People like to do it. People like to do a lot of things. You know, it's not always healthy for society to sort of, you know, indulge personal mass consumption or mass production. So we do have to figure out the balance. And, you know, right now it just doesn't seem like we are equipped to do that at all. And then, so you get, you know, guys like me on there showing up to throw the big wet blanket on the fun.

You're kind of gesturing at one of the big criticisms here, right? There's the economic criticism. You're putting a bunch of artists out of work. Where are you going to get your new training data from? No one has an answer to that question. Then there's a set of moral concerns, right? The base layer of them is

You're inventing all this technology. You're taking all this stuff. That feels like stealing. Oh, also, you might be destroying the climate, right? Like, this might not be worth it in the end for all of the associated costs, especially if it makes some group of people demonstrably poorer along the way.

Does that argument strike you as something that is effective? I hear it a lot. We hear it in our own comments. Even experimenting with these tools extracts too high of an environmental or social or moral price. And we shouldn't use them at all. Yeah. I mean, once again, it's an argument that I'm sympathetic to because we are, you know, looking at the broader trend lines, right? Like there's been a lot of great reporting from Bloomberg and from the New York Times.

The power plant contracts that were set to expire, fossil fuel power plants, have been renewed specifically because we need more juice for the data centers. So, you know, broadly speaking... We absolutely need to come to grips with the fact that this is requiring a lot more energy, and the easiest energy to tap for it right now is often natural gas, in some cases

coal, because that's what we've had built out in our energy delivery infrastructure for so long. I think, to me, what makes it a little more compelling of an argument is the fact that we have, like, seven giant firms all striving to build the biggest sort of arrays of data centers, to process the most data, to use the most compute. It's just this more-is-better sort of strategy that looked like it was, for a second, kind of under threat by DeepSeek, but then was kind of...

quickly brushed aside. Now we're doing Stargate, you know, just keep shoveling it in. And most recently, this enormous funding round being poured into OpenAI with SoftBank. This is a historic level of investment to sort of... validate and to continue this model of more is better. So, like, that to me is alarming: that we have all of these sort of different corporations with just so much capital, with so much power, you know, just adopting this expansionist sort of mindset. So yeah, like...

Data centers are projected to consume a larger and larger percentage of the U.S. power grid. And just like everything else, you know, it's not that, like, it should be zero percent of this going to AI, but it's just, like, this seems, at a moment when we should be worried about things like the climate crisis, like kind of a bad way to do this, right? Because, again, we don't have a good way

to put checks on it in this country. We don't have a good way to say, like, well, wait, does society really, like, need all of this AI? Do we need that many Ghibli memes? Like, do we need so much? Like, do we, or how do we, you know, how do we navigate this terrain? It does come down to what I like to call the true issue of democratization. Because right now, all these calls are being made by a handful of guys like Sam Altman and Sundar Pichai.

Not really, in any meaningful sense, a democratic approach to actually building these tools and asking how we might integrate them into society. I think what they would argue is that they're going to make the tools, and some people will use them and some people won't. And then this bubble will pop. And then there will be a universe of dead companies and wasted effort and failed attempts along the way. I'm not sure that it will happen. These companies are...

so big that they tend to just fail up, in a way. But that is the argument, right? That we're going to invent a bunch of stuff, and there'll be winners and losers, and people will decide. The thing that makes me really struggle with that argument in particular is the sense that people are using the tools and they say they hate them.

People are using the tools and they say it's morally wrong for other people to use the tools. People are using the tools and they happen to be heads of major Hollywood studios, and they know they're going to put all the filmmakers out of business. And that set of contradictions, I think, is still unresolved, but it's the thing that will come to a head far faster than we will learn how to regulate the tech industry.

Yeah, I think there's truth to that. And that's what happened 20 or more years ago, right? When the first .com bubble burst, and that's how it... According to the theories of creative disruption, that's how it should happen. But now, I think you're right. There's question marks everywhere. These companies have gotten so big, they can graft onto one another. They've gotten better at integrating with the state.

And so there could be, you know, measures to prevent a full-on, you know, bubble bursting before it happens. Or, you know, or it could burst if people, you know, if, oh, the market cannot bear, you know, eight competing massive, you know,

chatbots. There's no need for that. Maybe it does end up sort of bursting. But I agree with you. And again, I would just sort of highlight this question: how much is it that, you know, some people like to use the tools? Sure, yeah, people absolutely are interested by them and are compelled by them. But how many people are also sort of compelled to use them by those broader economic forces? How many people, you know, again, if it was up to them, would just sort of...

spend more time on these tasks rather than automating them. We just have to question whether or not people are using them because they want to, or because it's part of this broader set of impulses that they're being compelled by for different reasons. No, I fully accept that people are...

happy to use these. I personally don't really use them all that much. I use them as a reporter. I mess around whenever there's a new model. I get in there and I try to see what they're capable of and what they do. But I, yeah, I've never used one to automate an email. I've certainly never used one to write anything. But again, I don't necessarily think that's a moral

dimension for me, personally. Maybe partly, but it's more just because I just don't find that, you know, useful for me. And that's, that's just me. Obviously, the kids are out there. Brian, this was amazing. Thank you so much for being on Decoder. Hey, my pleasure, Nilay. Cheers. I'd like to thank Brian for joining me on the show, and thank you for listening. I hope you enjoyed it.

If you have thoughts about this episode of Decoder, or really anything else, you can email us at decoder@theverge.com. We really do read all the emails. You can also hit me up directly on Threads or Bluesky, and we have a TikTok and an Instagram account now. Check them out. They're @decoderpod. A lot of fun.

If you like Decoder, please share with your friends and subscribe wherever you get your podcasts. If you really like the show, hit us with that five-star review. Decoder is a production of The Verge and part of the Vox Media Podcast Network. Our producers are Kate Cox and Nick Statt. Our editor is Ursa Wright. The Decoder music is by Breakmaster Cylinder. We'll see you next time.

This transcript was generated by Metacast using AI and may contain inaccuracies.