¶ Intro / Opening
The last few episodes of the show have been about attention, but about attention in terms of something else. Attention in terms of Zohran Mamdani and Donald Trump. Attention in terms of the Big Beautiful Bill. But I wanted to do an episode that was about attention in terms of itself. If we're going to say attention is currency, if we're going to say it's power, well, what kind of currency is it? What kind of power is it? Where is it strongest?
And where is it weakest? I'm a pretty loyal reader of Kyla's Newsletter, by Kyla Scanlon, which often feels to me like it's being sent back in time from some future economy. And in a way, it is. Kyla is very much a member of Gen Z, and the economy she is reporting on, the economy she is theorizing about, is one that Gen Z has been turned out into, an economy that works much more digitally, an economy where attention drives capital as opposed to capital driving attention. And so she has sort of emerged, I think, as a leading theorist of the economics of attention. Not how they work everywhere, but how they work in the part of the economy and the part of the society that is most exposed to attentional dynamics.
Kyla is also the author of the book In This Economy?, and you might have heard some of her coinages, like the vibecession. She's a fascinating person to talk about this with. As always, my email is ezrakleinshow@nytimes.com. Kyla Scanlon, welcome to the show. Thanks for having me.
¶ Gen Z's Economic Disillusionment
So what is different about the economy that you live in, that you see, that you feel Gen Z is experiencing, from the way us 40- and 50-year-olds described or understood the economy? So one of my pieces that went far was Gen Z and the End of Predictable Progress. And it was based on my research over the past year. I've been traveling, like, on a quasi book tour, you know, going to a lot of college campuses and going to a lot of conferences. And I've been talking to young people about how they experience the economy and how they think about their future. And so for them, it doesn't feel like there's that path of predictable progress that maybe their parents had or their grandparents had. And of course, every generation has had its own challenges. But for Gen Z, you don't have that predictable return on a college education anymore, because of things like AI, because of how expensive college has gotten. You don't necessarily have a path toward buying a house that feels even remotely approachable. And, you know, if you think about retirement or just moving through a career path, that also feels really far away. And so that whole piece was like, well, what does it look like if this path that everybody has followed has kind of disappeared, more or less?
Let me stay on the topic of the feel of it. What does it feel like if you can't see your way to a house, if you're not sure what the career paths are, if you're not sure what... AI will do? Like, if you had to describe the emotional structure of the conversations you have with people on the tour, what are the dominant emotions?
I think there's a lot of worry. I mean, I think everybody picks up on this. Like, there are so many think pieces about, like, the kids are not all right, right? Like, there's a lot of nihilism. There's a lot of concern. There's a lot of fear. There's a lot of anxiety.
And you sense that when you talk to people, because they don't really know what to do. This path that has sort of been instilled in you from the time you were little, go to college, graduate, buy a house, it's just out of reach. And so when you can't get that, it feels far away, right? And so David Brooks had this piece talking about the rejection generation that I think encapsulated it really well.
Because he was like, the Gen Zers are facing rejection after rejection. Like, it's hard to get into college. And then when you graduate college, it's hard to get a job. And so I think that element of always being rejected from everything, or at least feeling like you're being rejected from everything, creates those elements of nihilism that show up in how Gen Zers might spend or save or invest. Tell me about your barbell theory of Gen Z.
¶ The Gen Z Barbell and Data
Essentially, there are two ways that people seem to be responding to the uncertainty in the economy and the lack of a path of predictable progress. One path is tool-belt pragmatism. So people going back to the trades, becoming a plumber, an electrician, right? Taking a path that isn't as speculative and uncertain as, you know, taking on a bunch of debt and going to college.
Other people are going the meme coin, gambling, sports betting type of route. And so you have these two ends of the extreme of risk. But both of those are responses to that path of get a college education, get a white-collar job, and go off into the sunset not really working anymore. How much is this a narrative and how much is it a reality? So if you look at unemployment for Gen Z or new college graduates, recently the unemployment among new college graduates is up a bit, but it's not, you know, 30 or 40 percent. I've seen people debating the housing question. You know, were millennials, or now Gen Zers, really so far behind other generations on housing? And won't all those houses get passed on anyway? So is this really a problem? Is there a divergence, in your view, between
the data on the economy that people in Gen Z are in and how they feel about it? Or does it match? I still think there are some elements of a disconnect. Like, people might be feeling a certain way because of social media that doesn't always match the data. But I think when we look at some data points, so, basically, how much more you make with a college degree than not, that's eroded over the past couple of years. And so you're graduating with a college degree, very expensive, and you're not making as much as you might expect. Housing is still quite expensive. Like, eventually those homes will pass on, but I think the median age to buy a house now is like 54 years old, and in the 1980s, I believe it was 34.
And so there are some data points that support the fact that there are these elements of nihilism, and perhaps reasons for feeling nihilistic, that weave into how Gen Z experiences the economy. But I think it's all exacerbated by things like social media, for sure. Like, there are elements of narrative that might sweep beyond the data, but the data tells the story too. You also talk about how there's not just one Gen Z. There are multiple. Yeah. What are they?
¶ Segmenting the Gen Z Generation
Yes. So I'm an ancient Gen Zer, like the elder millennial of Gen Z. I graduated right into the pandemic. And so I'm part of the Gen Z 1.0s. So those are the ones who, you know, remember a time in college when you weren't on Zoom, who were pre-pandemic in college and in the workforce during the pandemic.
And then Gen Z 1.5 would be where, like, my little brother is. He was in college during the pandemic, and that shaped both his relationship with institutions and his relationship with the digital. He got a lot of his education via Zoom and relied on digital tools, as we all did during that time, to forge friendships and to forge connections. Then Gen Z 2.0 is the people who are in college and high school now.
The first part of that generation that is entirely digital. So for them, the digital seems to be an extension of reality. Rachel Janfaza was the first person who came up with bucketing the Gen Zs. She had Gen Z 1.0 and Gen Z 2.0, and I stuck a Gen Z 1.5 in the middle because I think it deserves a bit more splicing. But that's how you can think of it: the relationship with technology and the relationship to institutions.
The pandemic is such a big player in that, right? To graduate before the pandemic, to be in school during the pandemic. Is the implicit argument here that Gen Z is just going to be a very different generation because it had this shock that other generations didn't, right? That the pandemic hit at a much more formative time for, eventually, all Gen Zers, and what came before them and what's going to come after them is going to be less volatile than what they went through. I think so. I think Gen Z is the beta generation, the beta tester generation, rather. So they're the ones where it was like, is TikTok good for the brain? We'll see. And, you know, can students do Zoom classes and still get an education? We'll see. Is unfettered access to the internet okay?
We'll see. We've tested a lot of things on Gen Z. Yeah, smartphones in schools. Right. And we're learning. We're learning. A lot of schools are starting to ban smartphones. But yes, I think Gen Z is the generation that hopefully will take a lot of lessons forward for Gen Alpha and perhaps not put them through what the Gen Zers have gone through, from a digital and mental health perspective. How does AI fit into the job market that young people are both seeing and sensing?
¶ AI and Job Market Uncertainty
So AI is interesting, because there are all these stories about how AI is going to replace entry-level jobs and these companies are going to automate all this work with AI. Like Salesforce, I think, has automated, so they say, like 30 to 50 percent of their workforce, or something like that. And so if you're a young person and you're looking at that, you're like, oh no, these entry-level jobs that I rely on to get on the bottom rung of my career ladder are going to be taken away from me. And so I think that also creates an element of fear. It's like, well, where do you even enter the workforce if the entry-level jobs are supposedly all going to be done by AI? This is where I think the feeling of it is important. I both feel this a bit myself, but when I talk to people who are starting out more in their careers, my sense of the moment
is that it's a little bit, for a lot of people, like looking at a wave in the distance. And you turn on the news and they keep telling you you have a tsunami warning. And you're like, well, is that wave going to turn into a tsunami? And if so, how quickly? And do I have time to get out? Or am I in a high enough area? And you don't really know, but it has become this fog between you and the stable vision of your own future. When I started out in journalism, it wasn't clear that I could succeed in journalism. But it was never because there might be a massive technological shock to journalism in which a computer would do my job instead of me. And obviously things like that happened in manufacturing with both globalization and automation. But I think it's some of that, that there is just this ambient uncertainty. It feels like that moment of liquidness, or I think fog is probably the better metaphor, is very present for people. Yeah, and if you look at the money that the companies are spending, like, that's concerning too. When you see Mark Zuckerberg spending billions of dollars
to poach people from OpenAI, it's like, well, clearly something is happening here. Clearly he has some sort of plan. Like, will that plan work? Who knows? Because the metaverse didn't work out, right? And so I think it is that fog, as you're saying, but the overwhelming narrative is that, like, oh gosh, well, if you majored in computer science, you're out of luck, man, good luck out there. Or if you majored in the arts, or if you're trying to be an artist, too bad, AI is going to do your job. And so I think right now we're at this phase of technology where we're trying to establish the human part of it. And we're not doing a good job at figuring out where humans need to be in the technology equation. And instead we're like, no, we're just going to automate everything.
¶ Policy Void on AI Disruption
Sorry, we don't have a plan for people. This really frightens me, actually. So if AI was going to hit like COVID hit and just put 50 percent of the population out of work, we would do something about it. But if it moves slowly and just eats a category and a tranche of jobs at a time, we're going to blame the people who don't have jobs, right? We're going to say, well, most people who graduate with a marketing degree got a job. So you're just not working hard enough. You're not smart enough. You're not one of the good ones, right? We're so used to doing that in this economy, blaming the individual for what happens when it's often, you know, very sectoral, very technological. And I talked to a lot of politicians about this. And they're kind of abstractly worried about it. And none of them have an inkling of a policy answer for it. Like, what do we do?
If AI, which seems totally possible, just doubles or triples 18-to-24-year-old unemployment in the next six years, what policy would we deploy for that? And nothing. I don't know if anybody has a good answer. Like, the common answer you'll hear is UBI, universal basic income, right? Like, well, just set aside $1,000 a month for everybody and that'll be that. And I don't think that's the right answer either, because to be human is to work and to have meaning and to have purpose. And so if we all of a sudden just say, no, you don't have meaning and you don't have purpose, that feels really bad. And so I worry, you know, even if we do have a policy response, the upheaval, the societal upheaval that could happen if all of a sudden we're like, no jobs for everybody, could be really bad. And then there's the other side of the coin, where there's a good paper that was published by the National Bureau of Economic Research that talks about nails. Like, hammers and nails were once, I think, 0.5 percent of GDP, in the 1800s, just the nail part. So, a lot of GDP. Yeah. Right. But that just shows technology is always shifting and changing, and nails are no longer that big a part of our economy. We develop new parts of our economy, just like we did with the internet. And so that's kind of what we have to hope for, that new jobs will be established with AI
rather than no jobs at all. Which maybe they will, but it's that disruption period that is very scary, I think. You mentioned UBI. I've been around a lot of UBI. Of course, my wife wrote a book on giving people money a couple years back. My old colleague Dylan Matthews, I thought, had a great line on this, which is that UBI is simultaneously too much and too little of a solution for AI.
Because imagine you're a unionized truck driver, right? An interstate truck driver. You're making $88,000 a year. And your job is automated by a driverless truck, which we are currently pretty close to being able to do. In fact, maybe we already can do it. And then what? A UBI, in a far-fetched scenario where we actually pass one, is going to give you $22,000 a year and also going to give me $22,000 a year. So it's not enough of a solution for you. It didn't replace your income or give you your dignity back. And it's too much of a solution for me. Like, I didn't lose my job. And the UBI thing always just struck me as a very strange and fanciful response to AI. It might be good for other reasons, but AI is not going to put everybody out of work all at once.
AI is going to make workers more productive, which will slow down the hiring of new workers. And that's just, it's actually just a much harder problem to solve or to address. How do you think we should address it? I have no idea. So even in talking to the policymakers, there hasn't been anything floated? Like, is this just something where we're all just going to stare at each other until it's here? I think that, going to some of your work on attention, you're going to need a focusing event. And we don't know what it will be yet. Where something happens that clicks into place that this is a problem we need to deal with, that people are really losing their jobs now. And it's going to depend on how big that event and that trend are. So a lot of people lost their jobs to
the movement of factories to China. We didn't really do that much for them. No. We just failed to respond to that, and it was very destabilizing over time in our politics. I was talking to somebody who's in this world, and he was a big skeptic that AI was going to take away jobs, even though he was a big believer in AI. And his argument to me was like, look, you're going to have so much capital investment in data centers and all this. We're going to need so many electricians. We're going to need... And it struck me as very fanciful, this idea that we're going to turn all of these comms majors into electricians really quickly. So to me, I just haven't heard a solution that really
makes sense. And my worry, it's weird to be hoping for much more disruption rather than less, but I think we would have a much better chance of responding to it well if the disruption is significant enough to be undeniable than if it is slow, a lot of little pieces happening kind of all at once, but nobody can quite prove where it's coming from. My confidence is higher in an almost emergency scenario of this than it is in the accretion that I think we're likely to get. Yeah, so you'd rather have a flood than a slow drip. I think so. Yeah. What do you think would work? I don't know either. That's the hard part of all of this. It's so easy to diagnose problems and it's much harder to come up with solutions, it turns out.
But I don't know. I mean, I think the physical-world element of AI is also interesting. Like, it does require a significant amount of resources. The data centers are massive, and they're quite loud if you're near them. I think we are speed-running it. I wouldn't be surprised if we see a flood rather than a slow drip, just based on the amount of money that's going into it. My version of what I think is going to happen, or one, let's call it plausible, scenario for what will happen, is that it's going to be during the next economic downturn, when companies need to squeeze their labor forces, that they're going to make a big transition to AI. And so if you imagine the mixture, attentionally, of a recession, which focuses a lot of coverage and interest on the economy, and watching companies do what they've done in past recessions, which is using the recession as a moment to make a technological jump
into some higher-productivity technology like AI, that, I think, is going to be the kind of scenario where this becomes a really dominant conversation. And by the way, you could imagine another category of answer to it, which is regulations about how you can and cannot use AI to replace people, and various kinds of protectionism. There was a somewhat famous interview from a couple years back between Tucker Carlson and Ben Shapiro where Carlson basically says, trucking is one of the most common jobs for men in most states. I would absolutely outlaw driverless trucks. And Shapiro's like, you would? And Carlson's like, yeah, like, what's wrong with you that you wouldn't? I think debates like that, should we actually welcome this productivity increase or should we stop it, will become much more salient, and in a way that people are not yet
ready for, because they're so used to technology just being adopted as opposed to debated. Yeah, not all progress is...
¶ AI, Truth, and Digital Noise
Because what's also interesting about AI is, like, yes, it's being used to replace some jobs, but it's also being used in ways that are spreading misinformation and kind of capturing people. Like, people have sent 3 million messages to the AI chicken on Instagram, right? And so you also have to question, like... It's just a hell of a sentence that I don't really understand, to be honest. Really? I mean, I can do it. I guess there's a chicken that is AI-generated on Instagram. And you can message it and talk to it. I don't actually know quite how it messages back, but there have been 3 million messages exchanged with the chicken. And there's all sorts of AI slop videos on TikTok. And that's a big part of it too. Like, you know, being a person, you find meaning within work, but you also find meaning in how you spend your leisure time. And increasingly people spend it scrolling, understandably, right? Like, I was on a plane ride yesterday for five and a half hours, and the woman next to me was very nice, but she was scrolling on TikTok for the entire five and a half hours. And you can just only imagine how AI will accelerate the addictiveness of stuff like that, as well as the impact it'll have on the labor force. You talk about how AI is going to create this abundance of intelligence that will create a scarcity of truth. Tell me about that. So, yeah, I think truth is really valuable. It's the most important commodity of the present moment. And it's something that is increasingly scarce. And once you lose it, it's very difficult to
regain it. And so I think AI is going to create a lot of information and a lot of noise, and it'll be increasingly important for people to be able to sort the truth out from that. Because the AI does hallucinate quite a bit. If you've ever, you know, talked to ChatGPT, it does make stuff up, and you can be like, hey, you made that up, and it'll correct itself. But you still have to be able to source what the truth is and what that means.
I think that's also the problem with social media, too: those algorithms are designed so the incentives are perhaps not aligned to the user. They're aligned toward the corporation. And so anything that people can get addicted to, where there's a monetary incentive for them to get addicted to it, it's going to happen. And so I think there is a world where the AI can be a source of truth, but right now I don't think it is. But people take everything it says at face value.
Like the number of "@grok, is this true?" posts. That happens on Twitter, where people are asking the AI to validate a tweet rather than go and do the research themselves and work that muscle in their brain. It's concerning, because you do have to have a radar for truth. It's so easy to get taken advantage of right now. There's just so much information, there's so much noise, it's just nonstop, and it's very easy to make mistakes, and a lot of people do. And you have to be able to know what's true and what isn't, and have your own moral and value compass. So I guess maybe here would be an optimistic version of this, which is
I think a very standard story about social media is that it shattered the thing we now call consensus reality. And how much we ever had a consensus reality, I think you can debate, but probably more at some other points in history than at this point in history. And what is AI but an articulator of consensus reality? What is AI but, when you say, hey, is this true, something that gives you this sort of middle-common-denominator vision of truth, which is going to miss perspectives that are maybe valuable at the margins, right? I think there's a worry that AI is a technology of intellectual mediocrity, that it coheres everybody around the same set of consensus ideas. But we've been sitting here for so long lamenting the destruction of that consensus reality. Maybe this world where everybody's asking ChatGPT or Grok, is this true, is exactly the thing we've been yearning for. Maybe.
Can't blame me for trying. Yeah, I think there's a lot of value in optimism. But I think what you're maybe talking about is sort of like dead internet theory, almost, where there is this intellectual flattening of the public, and then maybe people don't come up with new ideas and they don't challenge themselves. There's, you know, a bunch of articles talking about how college students are having a tough time because AI is making things a bit too easy. It's a little too frictionless. And I think I worry about that too, like the cognitive effect it could have. I notice it in myself. If I overuse AI in terms of research, which I do use, I notice the lack of sharpening of my own toolkit.
You've written a lot about this world in which it seems to you like the social incentive for thinking is being diminished. Well, I mean, I have to be cautious of broad generalization. So, like, most of my... I make videos about the economy, right? And newsletters. And newsletters, and wrote a book. You're multi-platform. But I spend a lot of time on social media, and the reason I make videos on social media is so I can understand the mechanisms. So every day I post a video on Instagram or TikTok. And that way I can see how people respond, and I can sort of get a radar for how social media is interpreting one side of a conversation. And so then, in terms of the social incentive for thinking, I mean, I do worry that social media has created elements of polarization, which has been very widely discussed by a lot of people. And I think AI could exacerbate that and make it more challenging for people to go out and seek information. There's a lot of value in memorizing things, so that you can pull forward that fact rather than going in and Googling. But, you know, these arguments are also old, right? Like, people did say the same stuff about Googling, that it's making people dumb. And so perhaps it's just that same argument rehashed.
So I think AI is also a bridge to this other argument you're making, which is that attention is infrastructure.
¶ Attention as New Economic Infrastructure
And one of the things you write in that is that the traditional economic substrates are land, labor, capital, bedrock inputs to make stuff. But now the foundational input is attention. Walk me through that. So that's an argument that we used to need things to, kind of, raise money or move through the world. And now, increasingly, you can just have attention, right? And the idea is that attention is increasingly becoming an infrastructure that people have to build upon. So no longer is the economic foundation something like land, which is very physical, labor, a person, or capital, actual money. It's attention, and then narrative. The story that you tell to gain that attention is the capital that inflates the attention itself.
¶ Speculation Layered on Attention Economy
And then, increasingly, we have speculation on top of both of those. So speculation is kind of like the operating layer, right? It operationalizes the attention and makes it move throughout the world, because now you can attach actual dollars to how much attention you're gathering.
So things like prediction markets would be the best example. Polymarket has a Substack, and they wrote about a man who bet on Mamdani in the mayoral race in New York. And he made like $300,000 off of it. And what he was doing was essentially seeing where attention was going and trying to make inverse bets off of that, seeing where the story, the narrative, was wrong. And he was able to operationalize that attention going in the wrong direction, the stories being incorrect,
through speculation via prediction markets. So this is sort of a weird way, it sounds to me, in which, I think we think of the attentional economics revolution as being primarily about digital media. But it sounds to me, the way you're describing it, and I think this is true, that it's like the weird bastard child of digital media, etc., and the financialization of everything, that it's the ability to endlessly bet, right? The venture capitalists are betting, the day traders are betting, the crypto people are betting. In a way, when we just give things our time, we are sort of making a bet about what's important. But people always talk about the attention economy, and you're also talking about the sort of speculative economy layered on top of that.
So influencers, you get money through views. That's the monetization of the attention economy, more or less. But now we're able to bet on where attention goes. So, like, an influencer, it's a very boxed situation. You have a video that does a million views, you make your money, blah, blah, blah. But people can bet on how many views your video might get. And so that creates this multidimensional aspect to the attention economy.
Well, and it creates a feedback loop, too. I mean, polling has always had a bit of this quality. But I really watched, with Mamdani and Trump, the way the betting markets now drive feedback loops, where people see something happening and then they begin posting more on it. And then the thing begins happening more, and you can see it in the betting market. And so they begin posting more on it, or giving more money to it, or whatever it might be. One, it turns attention into capital, because money follows it.
Two, it just makes it realer, right? It is a way that something that starts small can become exponential just through these feedback loops, you know, again, if it's sufficiently viral and people keep picking it up. Yeah, and they do, because when money goes, attention follows, and when attention goes, money follows. And so it does create this, really quite nice in terms of structure, not nice in terms of effects, but nice feedback loop that's really interesting
to watch. And it was very interesting to watch what happened with Mamdani and the prediction markets and how that moved him. And increasingly, prediction markets, I think it's always kind of been this way, but it's not what people think.
They're not betting on what they think. They're betting on what they think other people think. And so they're essentially betting on the attention economy itself and using the stories that people are telling about said attention economy to determine where their money should go, and more money follows. It's crypto, in a sense. But so you said attention is a foundational input. That's clearly true for a certain set of products. Your theory is that attention is a major infrastructural input.
¶ Attention Economy vs Physical World
Does it say anything about the big parts of the economy that we just don't pay that much attention to, that don't have big narratives around them? Or is that just sort of not part of this theory? So there's the physical world and there's the digital world.
The attention economy really applies to the digital world. There are elements of attention that can serve the physical world. Like, if care workers need a new policy passed around them, the more attention they can have on things like that, the better, because that's how you get things done: a bunch of people talking about it. But yes, the attention economy primarily sits within the digital world, and the physical world, for now, is free of it.
There's a real thing happening here, but there's also a very speculative, attentional thing happening here. One thing about AI as a technology is that the leading figures of it are very big influencers on social media. Sam Altman is probably the most masterful of the CEOs alongside Elon Musk at driving attention to whatever he wants it to be on.
But the world of AI people are just extremely dominant on social media, on YouTube, right? AI is a technology, but it's also a very compelling storyline, in a way that very few other technologies have been. How do you think that plays into this? Yeah, I mean, the entire S&P 500 is a bet on whether the AI will make it, right? And so there is a lot of money making really big bets, and there is a lot of incentive for them to keep attention on it and to hype it up as much as possible. Like, I remember this one interview that Sam Altman did where he talks about how AI is going to require a reordering of the social contract. And so, to go back to what we were talking about at the beginning of the conversation, when you hear something like that, you're like, oh, what does that mean for me and, you know, the 40 years I have left on earth, or 60 years, or however long. And so I think, for the AI universe, they do a great job at keeping attention on them. And that's actually one very useful part about the image generation and the video generation: you're able to direct a lot of eyeballs toward your AI model and then also use that AI model in a workplace. Hype is very valuable. So there's a company called Cluely
that raised $15 million from a16z. They're interesting. They have a lot of really, how would you say, engaging videos. And, you know, they were able to raise a bunch of money just on the back of those videos for what is essentially an AI note-taker. But they were able to capture attention. They were able to capture the $15 million because, in this world where ChatGPT, or OpenAI, and Anthropic are so far ahead, the people that will win in the AI world are those that can differentiate. And the way that you differentiate is through attention. And then you get the speculation. What do they do to capture attention? Those videos. Roy Lee, who's the CEO, he drew a... Can I say dick? Sure. He drew a dick on the whiteboard. So it's just very... They have this line, cheat on everything, right? Yeah. So it's just...
Cheat on everything is very inflammatory in itself, right? And so they mastered attention. They got two newsletters out of me, and plenty of think pieces out of other people as well. And it was just based off these videos. It's Jake Paul. Jake Paul, the boxer, he was a big YouTuber. He was like one of the first Gen Z influencers. And he got very popular through stunts and virality. And you see startups now copying this Jake Paul playbook
of, we're going to be in your face, we're going to be loud. Because in a world where it's difficult to differentiate AI products, because it's all essentially the same technology underneath, you have to be able to draw eyeballs. And that's where you get the speculative capital from venture capital.
Here's a question I have, even when I just try to think about the way attention and speculation matter in the economy. You know the old book, James C. Scott's Seeing Like a State, about how the state has to make things legible to itself? The algorithm has to make things legible to itself. And so those of us following along in life through the algorithm will see the things that are legible to the algorithm. Cluely is clearly very legible to the algorithm, right? Because it has this high-engagement strategy. On the other hand, 15 million bucks from a16z, it's nothing. On the list of AI investments, it's pennies. And most of these companies making a ton of money are raising a ton of money. Many of them are in stealth mode, because they don't want other people copying them, right? There's a lot happening that is less attentionally salient. But because it's not visible to the algorithm, we can't even really discuss it. It is very hard to tell when you are overestimating the role of attention in the economy.
Because the nature of the things that get attention is you can see them more clearly. And there are all these things that don't. Like an economics of attention needs to be able to distinguish when attention is also misleading you. How do you think about that?
Yeah, no, it's something I've thought a lot about because I do, you know, I do make videos on social media. And so I can fall into the trap of thinking that that's the entire universe. I think most people who use social media can fall into that trap. But there are a lot of companies outside.
¶ Trump as Human Algorithm Hybrid
the attentional universe that are creating very real things. But I think what's been really interesting about the Trump presidency is that he's expedited the value of attention. And so I think, for him, attention is the value creation. It's not the path to value creation. And so he has also established some sort of playbook within politics that people are starting to follow. And so there are always going to be pockets of the economy that are outside of the attention economy, right? But more and more things are falling within its purview, including politics, increasingly. You have this line on Trump. You said, Trump is the first human-algorithm hybrid president, governing via Truth Social truths, bond market reactions, and direct market signals. A feedback loop in a suit. Yes. Tell me about that. I mean, the guy, he uses Truth Social as his policy platform. Like, we got an announcement about a bombing of Iran
via Truth Social. And so I think, for him, he's able to use this platform that he owns as a way to spread messages and get his point of view out there, to move markets, which are quite responsive to him, to give his opinions on Elon Musk, to give his opinions on the big, beautiful bill and tariffs. He uses Truth Social just as an extension of the presidency in a way that we've never really seen a president do. Like, we've never had a president who tweets so much. Or truths so much, I suppose. I refuse to use these stupid bespoke verbs. You had another line that I thought was very perceptive on this, where you said that the way things typically have worked is that events create narratives. And the way they seem to work in the Trump presidency is that narratives create events. Talk me through that.
I think it's all kayfabe, to a certain extent. What is kayfabe? It's the theatrics, the show-must-go-on type of thing in WWE. There's this great essay by Roland Barthes that I always talk about, called The World of Wrestling. And he talks about how they're always in character, they're always doing stunts and performance, and it's just always a show. Increasingly, elements of politics, you know, have elements of wrestling. There's this theatrical pursuit of justice, this theatrical pursuit of truth. And you can kind of align it with how wrestling has a heel.
And there's always a bad guy that you have to defeat. And then you defeated the bad guy and you did a great job, and now on to your next opponent. Which is kind of how Trump moves throughout his presidency. Like, he got bored of the war, essentially.
¶ Politics of Spectacle and Decay
Yeah, I feel like this is one of the really interesting things, that it feels like we get a spectacle and then we move on from it, because the policy never had a strategy. So did we destroy Iran's nuclear program? There was like a three-day news cycle here, it felt to me, or a four-day news cycle. We bombed Iran. Then there was an intelligence leak that we only set it back by a few months. Then there was the Trump administration yelling about the intelligence leak. And now we've just moved on to debating the big, beautiful bill.
Yeah. It's not that past policy efforts didn't create spectacles. They did. But the spectacles were yoked to some kind of strategy and end goal. I mean, the tariffs have had this feeling too. Like, what is our tariff policy at this point?
And what is it trying to achieve? Does anybody actually know? Could anybody give me an account of our global tariffs and where they think they're going to be in 30 days or 60 days or 90 days? There's something with the way things rise and fall now that genuinely feels different. It's like, once you've gone through the big storyline, the thing has gone, even if the policy has not been decided or the goal has not been reached. It's like the decay rate, yeah, has really accelerated.
Yeah, no, it feels that way. And I think it creates so much fatigue in people who are paying attention. And I think a big part of that is the attention economy. Like, Trump, he gets it. He gets how people work. He gets how people consume information. And he knows that in order to keep people engaged, you have to keep moving the storyline along. Like, the guy is a reality TV star, literally. And so I think he understands that better than any president we've ever had. And there are valuable parts to that playbook, right? There's value in using social media as a tool to spread a message and to capture people's attention and then, you know, send them somewhere, rather than just dragging them along to the next storyline. But yeah, I mean, I'm curious, for you, you know, you said past policies have had some sort of spectacle. Like, have you ever seen anything quite like what we're seeing now? No, but it's hard for me to describe the difference. Yeah. Because... I don't know, it's like the Obamacare fight was like a year. But it was driving for a year to an outcome. There is this way in which it feels like the Trump administration loses interest before the outcome is reached.
What you just said a minute ago was that Trump gets us better than anybody alive. He gets how to engage you. Let me try this on you, because I do believe it. Everybody always says, well, everybody meaning, you know, pundits: he's trying to distract you. The Trump administration is trying to distract you. Muzzle velocity. They're flooding the zone. Trump isn't trying to distract you. Trump is himself distracted.
He doesn't understand this better than anybody else does. He embodies it. Like, as you said, a feedback loop in a suit, but I don't even think it's that. He's just very, very, very distractible. Listen to him talk. The way his mind works, he cannot follow, oftentimes, the thing he is talking about for more than a couple of sentences before he wanders down another pathway. In the first term, somebody who worked for him told me that to try to brief Donald Trump on policy is like chasing squirrels around a garden. Yeah. They just keep running in different directions. And so I think there's a sense sometimes that Donald Trump is conceptually in control of events, right? That he has got some plan. But I don't think it's that at all. I think that he starts things and then loses interest.
Now, in cases where there's another part of the administration or the government that will try to drive the thing to its conclusion, it keeps going, right? Congress has a process to try to decide if this bill is going to pass or not pass. Or Stephen Miller is very focused on his hatred of immigrants, and he's going to keep using the machinery of government to prosecute that as long as he can. But Trump just sort of phases in and out of how much he cares about things.
Like, what happened to DOGE? That was like a big project. And Elon Musk was running it. It very much could have continued at its level of aggression after Elon Musk left. A lot of people thought it would. But we actually don't hear nearly as much about government reform and waste and fraud and abuse as we did a couple months ago, because clearly Trump has moved on, and other parts of the administration have moved on, and nobody actually owns the thing to drive it. I think that the difficulty with Trump is that he really is like the algorithm. And the algorithm just lives for the moment. It's about the current thing. Yes. And when the current thing moves on, it moves on, and so does he.
Yeah, no, I mean, that's a good point. So it sort of refutes the idea, like, oh, he's not doing it on purpose, it's just how he is. And I think part of it might be that. Perhaps I have a hope that some elements of it are on purpose, because then it feels like there's more control happening, versus it really just being a funhouse going on in the government, because, like, obviously things have very lasting impacts. And I think, to your point, he is the algorithm. There is that element of instant gratification that you see within him. Like, because he wasn't able to fix the war, he was like, ah, never mind. But now, what's going to happen to Iran? Are they going to continue to develop these weapons? And what does that mean? There are all of these real answers that we still have to seek. And because, I think, everyone does get kind of yanked around in terms of trying to follow what's happening next, it does create a very distracted, overwhelmed and fatigued public. And then you have things like the big beautiful bill, which a bunch of Americans haven't even heard about. So when you then get to thinking about, okay, imagine Trump as a golem summoned by the attention economy. We are being governed by the human embodiment
of the Twitter algorithm. I actually believe this. So then it creates this interesting question, because, on the one hand, Trump is completely dominant over events. And on the other hand, he's not that... He's unpopular. I mean, we're going to see what happens with the big, beautiful bill. But his ability to actually drive all the way to the outcomes he wants is often weak. It looks like Republicans are headed for a pretty bad midterm. Tom Tillis, the North Carolina senator, just said he's not running for reelection. Because Trump threatened to primary him. Because Trump threatened to primary him. But that's exactly what I mean. Trump got mad at Tom Tillis for, like, a second, because Tillis is not going to support the level of Medicaid cuts in the bill, threatened to primary him, putting at risk a North Carolina Senate seat that could become the deciding line between Democrats recapturing the Senate and not. That's not a person acting out of anything more than a momentary incentive. That is a person who is not thinking forward into his own future.
Yeah. Like, the whole Trump experience, in a way, is: What would it be like to really live in the algorithm? What does the algorithm miss? What can't it see? Like, attention and its opposite, yeah, are simultaneously more salient than they ever were before.
Yeah, totally. I mean, so Trump, yes, he's very focused on, you know, I guess he's a squirrel, so the next nut or whatever. And, you know, that gets into the digital economy aspect of things, and then the physical world kind of falls apart, because then we get science cuts, we get cuts to solar power, all these things that have real-world consequences. But because it's not within that algorithm, because it's not aesthetically pleasing or whatever, you know, it gets ignored and forgotten about. And I think that is the biggest consequence of the Trump presidency, that he seems to have forgotten that the physical world needs investment just as much as his digital presence.
¶ Theory of Digital Friction
Give me your theory of friction. So basically the idea of friction is that there is value in things being a tiny bit difficult. And when we use digital tools, there really isn't a lot of friction. Dating apps make dating very easy. DoorDash makes things coming to your house very easy. So you can kind of have this frictionless existence. You also have an algorithm that's designed around you, and so you can be served anything within your echo chamber, and you don't ever have to, if you don't want to, think outside of whatever you want to be thinking about.
Whereas in the physical world, there's a lot of friction. So if anybody's flown over the summer, it might have been a challenging experience, because we don't have enough air traffic controllers. We don't have enough people doing maintenance on the planes. Like, I was trying to get off my plane yesterday and the wheels of the jet bridge were facing the wrong way. And it's just little things like that that feel like they should be simpler, and feel like they should be running smoother, and they're not, because they're not part of the digital universe that gets so much money and investment. And so it's this idea that there's a bunch of friction in the physical world and perhaps not enough
in the digital world. Do you think those things are connected? That there's become so much friction in the real world, the real world of building things and accomplishing things, that it's created this flight to the digital world? That people are overinvested in the digital world because we have done such a bad job managing friction in the real world?
Yeah, I mean, it's easier to be online. You don't have to reckon with stuff. And so I think you can create this curated experience. Like, my boyfriend talks about it all the time. He's like, you can just have this perfect appearance, and people write about this too, like, your Instagram page is your best moments ever. And so you can kind of have this life that isn't your life, whereas in the physical world, you might just be dealing with headache after headache.
¶ Friction, Meaning, and Nihilism
How do I pay this bill? How do I make rent this month? How do I make sure I'm still succeeding in my job? All of those things. So for you, there's a connection between friction and meaning, or frictionlessness and meaninglessness. Yeah. How do you draw that? I think the idea is that when things are a little too easy, it's tough to find meaning in it. The way I think it might be best to visualize it is WALL-E. So if you're able to kind of lie back and watch a screen on your little chair and just have smoothies delivered to you, it's tough to find meaning within that kind of WALL-E lifestyle, because everything is just a bit too simple. The good parts of life often come through the hardest struggles, and I think that that is where people do find a lot of meaning. And that's what all the greats, you know, write about: the struggle is the path, keep pushing the rock up the hill, more or less. That's how I think about it, but I don't have data to back that up. No, but you do have literary analysis, which I've enjoyed. You have a great piece, or I thought it was a great piece, on The Screwtape Letters. Best book. Well, why don't you describe what that book is for people who have not heard of it? Yes, it's one of my favorite books in the whole world. Have you read it? I have not read it. Oh, you should. You should really read it. I will, yeah. It's this
demon called Screwtape. And he's writing these letters to his demon nephew, Wormwood. And it's all about how you demonize this human. So Wormwood has a human within his care, and his whole thing is to bring him into hell and away from God. And the way that you demonize somebody, you would think you'd want them to go kill somebody, right? Like, that's how you would think of it. But it's really just keeping that person stagnant. So all of Screwtape's letters to Wormwood are like, no, just keep him not feeling anything. Keep him in one place. Don't let him fall in love, don't let him get passionate, just keep him baseline, keep him bored, don't let him keep up his prayers. Let him forget all meaning and purpose within his life, and that's when he'll come to us, the demons. And then toward the end, you know, Screwtape gets mad at Wormwood and they eat each other, essentially. Spoiler. Yeah. But it's a good metaphor for how badness moves through the world. Badness does usually end up eating other forms of badness. Not to call Trump and Musk demons, but they're infighting now. Like, Trump is threatening to deport Musk. And so, not that they're bad actors per se, but people who are trying to find, perhaps, the bad parts of people are always going to find that in another person, and they turn on each other eventually. Did you listen to my colleague Ross Douthat's podcast with Peter Thiel? I have seen clips. I've not listened to the whole thing yet. So this is Thiel's view of the Antichrist. This is sort of his theory.
We will be lulled into the Antichrist one-world government by the environmentalists promising us peace and safety. And there's something very strange about it. He's out there funding Palantir and surveillance technologies that sure seem to me like what would create a dangerous one-world government. But I guess, from this Screwtape Letters view, if you're going to imagine that the worst thing for society is stagnation, that the true path away from godliness is a kind of decadence, then there's a connection between those two theories. Yeah. I would say the way that C.S. Lewis is writing Screwtape, or at least the way that I interpreted it, is that if you do have people who are numb to the world around them, it becomes difficult for them to push back, like if there is, you know, a Palantir takeover or something like that. And there's also the sense that there's some kind of
corrosive meaninglessness in the society we're building, in the absence of certain kinds of struggles, certain kinds of social structures. I'm not sure we know how to pull those things apart. But when you said at the beginning of this conversation that the thing you hear a lot from young people is nihilism. And the thing, in a way, that AI feels to me like it pushes is nihilism, right? It is a kind of sense of, well, if this can do so much of my work for me, what am I, exactly? I don't think any of us know how to put this on a chart. But it certainly feels to me like people are acting from a sense of its truth. Yeah, it's like one of those things that you can feel when you talk to people. Like, there is this element of fear and worry. And, you know, one thing about the U.S. is we don't really have a social safety net. And so if you do fall because of AI, how do you climb your way back up? Who knows? So I think people are trying to find truth and meaning while having, you know, this technology that's incredibly hypnotic, and will get more hypnotic with things like AI, that also might take your job. And so it's a bunch of forces all at once that
is perhaps a bit too much for humans. Like, we really haven't evolved that much since we were hunter-gatherers, and now we're being tasked with all of this. Like, in this hotel I stayed in in New York, they had all these headlines on the wall from The New York Times, and it was like... Great paper.
Great paper. But it was, you know, people are looking for Amelia Earhart, and basically everything from like 1912 to now, they had just these headlines in this picture. And I was looking at it, and it was like the saddest headlines ever. And I was like, I feel like I see all of these headlines in just one day now, you know? And that was like a hundred years of headlines. That's a lot for the human brain to process, considering we haven't really kept up evolutionarily. You know, there's something in this. You often make the argument that everything feels like crypto now.
¶ Hollowness and Broken Ladders
And that feels relevant somehow here, where crypto has this quality of so much energy and hustle and money and attention, all circling a kind of nothing at the core. Particularly as, I think, the more idealistic dreams of what it could do have faded, and now it's just pretty clearly a vehicle for speculation. And there's this way in which I think a lot of things are developing a little bit of that quality. Like, politics sometimes develops that quality. Like, it does really matter. But there is this kind of nihilism people can sense in it that I think is really depressing. Right. It really, really depresses people. And the economy, right? If you feel like you've put in all this work in your education and your learning, and now the deal you were promised isn't really there, and we're inventing a computer program that can write your essays and do your learning and produce the kinds of things we are asking you to produce better than you can. So you're still running really fast. You're still trying to get grades. You're still, you know, trying to achieve the thing. But what is at the core of that if it becomes a little bit less obvious? And I think hollowness is something people sense really well. And it is very, very, very dangerous. Yeah.
So I used to work in crypto and was, like, once very excited about it. And I was like, this thing's going to be cool, and we're going to have alternatives. And it has increasingly become just purely a speculative asset, and there is that element of hollowness in it. I know a lot of people still work in crypto, and they're working on very cool projects, and there's a lot of optimism sometimes.
But you do see a lot of people in this space being like, well, what is behind the screen door or the curtain? And I think that sort of speculative hollowness, as you've pointed out, does show up in how people are trying to navigate their lives, because it's like, what is next? What does this mean? And I think part of it is this, like, broken ladder problem. So you graduate from college. Do you get a job?
23% of Harvard MBAs were unemployed three months after graduation. Those are the cream of the crop, right? If they can't find jobs, that might be tough for you. And you're like, you just see all this data and you're like, oh no. Housing. Career progression, because people are taking longer and longer to retire. And so I think all of those points can create that feeling of hollowness, like, what is it all for? I have this sense that we are
¶ Re-Engaging with Real World
undergoing a kind of correction, where interest and attention are moving back into things that are just more difficult and more real. I would not have thought I would publish a book on zoning reform and the policies under which we can build transmission lines, and it would become, like, a big national bestseller and throw off endless discourse. And we're talking today after Gavin Newsom signed a bill really, really changing the California Environmental Quality Act. There's so much energy around
even just a conversation about how do we do things in the real world, right? It was pent up, right? It's not my book that did this. It's that people wanted there to be attention on this, and the book was a focusing event. And even with my thing with Abundance, I've noticed the discourse that you would find if the only place you followed it is people screaming at each other on X. It's a very polarized discourse:
aggressive defenders and, you know, aggressive critics. Whereas then, in, like, the world as I watch it actually filter through, it's like, I find moderates picking up parts of it. I find leftists like Zohran Mamdani picking up parts of it. I find Gavin Newsom doing parts. It's just, everything is much more complex than the simulacra.
Somehow you need to fuse the dynamics of the attention economy and the real economy, of performative politics and real politics. So eventually, you know, the next Democratic, or maybe even non-Democratic, president will be better at managing both performative policymaking and spectacle, which obviously Joe Biden was not even in the game on, and
strategies to actually try to get things done. Right. Which he was better on. I think that attention has become, like, more like capital, and you need to know how to raise it. Right. But just like with capital, to succeed in the long run, you also need to know how to spend it. Yeah. And I think right now there's a lot more sophistication in the raising of attention than in the spending of attention. Sure. And I think that's going to be a...
It's like a societal learning we're all going through together, and not painlessly. But I wonder if, like, pretty impressive things can't lie on the other side of it, right? If we could get some politicians who understood both sides of it, more business leaders who understood both sides of it. The thing I'll say about OpenAI and Sam Altman, as I said a few minutes ago, and it's true for Elon Musk too:
Sam Altman, very, very good with attention, but OpenAI keeps releasing impressive products. Elon Musk, very good with attention, but SpaceX keeps doing impressive things in space. That there is this way in which, like, the future belongs not to the influencers, but to the influencers who can deliver. It's maybe not enough to deliver without being an influencer.
¶ Attention: Precursor to Power
but it's also not enough to be an influencer without being able to deliver. I think that's my optimistic take. Yeah, I mean, I think attention is a precursor to power rather than a byproduct now. So it's like, as you're saying, the more attention that you can raise... Narrative is a form of capital. Memetic storytelling is really useful. And then that way you can build all of that around yourself, and then it makes getting into power a bit easier.
The way that I think about it is, there's extractive speculation and then there's strategic speculation. So extractive would be, like, you know, Trump kind of dragging people from one thing to the next, or, you know, Cluely's marketing campaign just sort of, like, dragging
people along, and then they end up somewhere where they're like, where am I? But strategic would be, you know, like your book, which got a lot of attention, got a lot of discourse, reaching Gavin Newsom, right? And, like, that was a really good use of the attention
economy. I don't think it was intentional on y'all's part, but it was incredible, like, the amount of discourse. And then now you've changed policy, which is the ultimate goal, right? I don't want to take too much credit for this. It was...
hopefully helpful, but a lot of people... He said directly, shout out to Ezra. Fair, but a lot of people were working on this for a long time before me. Of course, but do you get what I'm saying? Yeah. Right? As you're saying, there has to be an end goal, but you can still use the framework of the attention economy to get people to that end goal.
And that's what has to happen. The bones are still important. And I think they're not going away for a bit. The question will be, where do they go from here? And you have to be able to send them somewhere. And then, always, our final question:
¶ Book Recommendations
What are the books you'd recommend to the audience? Yes. So I really like C.S. Lewis, so I'd recommend The Screwtape Letters. And then I'd also recommend A Grief Observed, which is by him also. It's like a really nice read on...
grief, because grief can feel very lonely when you're going through it. Like, if you've ever gone through a loss, it's just, it's devastating, and you feel like you're the only person that's kind of experienced something so big and heavy, and people are telling you it's going to be fine, and you're like, no, it's not.
And he just writes in a way that makes you realize you're not alone in that. And my last book recommendation is Jonathan Livingston Seagull by Richard Bach, which is about this seagull. Everyone's like, you're never going to be able to fly that far. You're never going to be able to do all of this. And he just takes it upon himself to fly. And I won't ruin the ending, but it's a really nice read on the power of people taking a chance,
but then what happens when we forget why we took that chance in the first place. Kyla Scanlon, thank you very much. Thank you. This episode of The Ezra Klein Show is produced by Rollin Hu, with fact-checking by Michelle Harris. Our senior engineer is Jeff Geld, with additional mixing by Aman Sahota. Our executive producer is Claire Gordon.
The show's production team also includes Marie Cascione, Annie Galvin, Elias Isquith, Marina King, Jan Kobal, Kristin Lin, and Jack McCordick. Original music by Pat McCusker. Audience strategy by Kristina Samulewski and Shannon Busta. The director of New York Times Opinion Audio is Annie-Rose Strasser.