Hey, guys, ready or not, twenty twenty four is here, and we here at Breaking Points are already thinking of ways we can up our game for this critical election.
We rely on our premium subs to expand coverage, upgrade the studio, add staff, and give you guys the best independent coverage that is possible. If you like what we're all about, it just means the absolute world to have your support. But enough with that, let's get to the show. Joining us now is venture capitalist Marc Andreessen. He's out with a stunning new essay, Why AI Will Save the World. Counterintuitive, or is it? Marc, it's great to see you.
Hey, good morning, Marc.
You go through a couple of the myths, and I actually just thought, as a fun interview, you know, we have a general audience. Some of these people may not all be familiar with your work. You were just on the Joe Rogan Experience. But the most common ones, I thought, you do a fantastic job of breaking these down. The first and foremost is: will AI kill us all? Why won't it?
It will not, in fact, kill us all. It's a very exciting new technology, but it's a technology. We build it, we control it. You know, it runs on computers that we own. If we don't like what it does, we turn the computers off. That part, I think, is just straightforward engineering.
One of the reasons why I think it's important for people to understand what you're talking about, about the level of control, is that there's been a lot of, I don't want to say fear-mongering, but maybe there is, about the singularity, about the ability to, you know, pass the Turing test and all of this. As someone who's a technologist who was there from the very beginning of so much of the Internet and all that, why do you think that's something we shouldn't be concerned about?
Yeah, so as you may know, we Californians are famous for our cults. We love our cults. We have produced many thousands of them over the decades and continue to do so. And there's this concept in religious studies of a millenarian cult, sort of an end-of-the-world cult. And the way they start is with the idea of creating heaven on Earth, which is a term you may be familiar with. It's an old Bill Buckley term, immanentizing the eschaton, which is sort of bringing about heaven on Earth. And this is the idea of the singularity, right. So techies who get into this stuff come with this idea of heaven on Earth through the application of technology. But of course the flip side of heaven on Earth is creating hell on Earth, right. And so these sort of utopian cults tend to transform over time into apocalyptic cults. And you hear this term, you know, summoning the demon, and then things go horribly wrong.
So these are ideas that are very deeply rooted in Western theology, Western philosophy, and whatever new thing is happening in the world, we tend to build up these cults around it. Maybe I'm, I don't know, a simpler guy. I'm an engineer. You know, I got where I am by writing software. Writing software is hard. You know, there are lots of practical aspects involved in getting any of this stuff to work.
I mean, the funniest thing right now is you can't even buy chips to do AI. There's this massive global chip shortage for AI chips, and so I have this kind of fantasy that there's a baby, you know, malevolent artificial intelligence running in a lab somewhere that has a huge order out to Nvidia and is just incredibly frustrated that it can't take over the world because it can't get the chips.
That's well said. Yeah, it turns out the limiting principle here is actual factories, not the idea itself, which is kind of amazing. Let's also talk about AI ruining our society, as you said: hate speech, misinformation, this is why we need regulation, we need it now, we need to shut it down, the government's got to get involved. What do you make of that argument?
Yeah, so after we get through the apocalyptic cult, then we get to sort of the mode, basically, of any significant new technology, where it threatens existing power structures. Right, it threatens the hierarchy of who's in charge and who gets to decide, you know, what people think.
And of course the Internet, you know, you guys are an example of this. The Internet has had a shattering effect on the traditional authority of people who used to be in a position to gatekeep access to information and used to be able to determine what people say. You know, once upon a time, if it was said on the major three TV networks, that's what everybody believed. And then, you know, guys like you come along and kind of blow that up.
And so, look, we're on the tail end now of, you know, fifteen years of very dramatic change, and by the way, on both sides of the political aisle, in terms of who gets to basically decide what truth is. And of course the people who used to decide what truth is are extremely upset about that. And of course a lot of ordinary people, who now have a chance to learn alternate points of view and get information they would not have had otherwise, are obviously very excited about that.
I think this concern is like another turn on that, right. Because in the future, people are going to be talking to AI, learning from AI, getting information from AI at least as much as they do that with other people. And so what the AI says is going to have a big impact on society. I think that's going to be an overwhelmingly positive impact, because I think people are going to have access to a lot more information. It's going to be like the Internet on steroids. But if you're a traditional gatekeeper, you don't like this at all, and so that's when you bring out, you know, the sort of scare factors like hate speech and misinformation.
Yes, smart. And one of the questions here, though, is about development, and this is one of the only ones I still have some questions around: given that we have the existing players, Google, Facebook, the bigger corporations involved, should we worry about monopolies being the most transformative in this space and carrying over existing policy, or will the technology itself be able to trump those concerns?
Yeah, so that's a very open, live question right now, and I can kind of argue all sides of it. I don't know the answer. You know, this is a very unusual situation. Usually new technologies get a running start of a decade or so before they become super political, before people start to lobby for regulatory capture. This technology is unusual in that the major players are already in Washington, and they're already lobbying aggressively for regulatory capture. And, you know, I said in the piece, and I've said other places, the big companies here are trying to form a cartel. They're trying to basically get the status that the big banks have in financial services, right. They're trying to get protected, you know, as institutions that are too big to fail, and they're trying to get a regulatory wall established so that startups can't compete with them.
And look, they have an early head start on that. They've made excellent progress. They've used these scare messages like end of the world and hate speech and all the rest, you know, as levers to try to get this kind of regulatory capture. You know, I and others have been pointing out that this is a transparent grab for regulatory capture. It's ironic that I feel like I have to encourage Washington to not, in fact, in this case, trust big tech, and instead to let the technology run. How should they let the technology run? Competition. The big companies should be subject to competition from startups and from open source, and open source should be subject to competition from big companies. And of course that is the time-honored way to discipline companies: make sure they're subject to actual competition.
Yes, I think that's an important point. You also touch on AI taking all of our jobs in the essay. There's been some talk here about how it turns out white collar jobs are a little bit more at risk than blue collar ones, even after decades of self-driving car discourse. But even there, Marc, you kind of make the case that AI taking all of our jobs is the doom, you know, that people have been crying about with new technology for a long time. You know, Yuval Noah Harari and many others have been like, no, no, no, this time is different. So why is it not different?
Yeah. So if you look historically, this is the cry, sort of the paranoid cry on the economic front, that's come up with basically every new wave of technology for three hundred years, and it's always the same cry. It's based on a fallacy in economics called the lump of labor fallacy, which, by the way, is at the heart of Marxism. One of the reasons Marxism doesn't work is because this fallacy is false. And the fallacy basically is that there's a fixed amount of labor, and either humans are going to do the labor and have jobs, or machines are going to do the labor and then humans won't have jobs.
And so the proposal then always is that new technology will lead to the immiseration of human beings. Of course, what happens in practice is the opposite of that. New technology drives economic growth, new technology drives job creation, new technology drives wage growth. The sectors of the economy with higher levels of technological development show higher job growth and higher wage growth. And the reason for that is very simple: technology is a lever on human effort.
If you take a human being and you add technology to what they do, they become more productive, right. They generate more output with the same amount of human effort. That leads to higher wages, that leads to, you know, basically growth in the economy. If AI in fact is allowed to run through the economy, I have no doubt that the result is going to be massive economic growth, massive job growth, and massive wage growth. I think actually the bigger threat is that AI won't be allowed to run, that it'll be outlawed in some form. Just to give you a couple of examples of that: you know, will the medical boards allow there to be AI doctors? Will the bar associations allow there to be AI lawyers? You know, I think it's actually equally likely we'll see the opposite problem, which is AI will not be allowed to actually run through the economy, and as a consequence it will have less economic impact even than I would hope. I hope that's not the case, but I do worry about that.
Right. Well, I mean, you've got classic job preservation going on there, which actually does lead to the next kind of myth that you talk about here, about AI leading to crippling inequality. So the obvious answer is like: you're right, Marc, it will unleash all this economic growth, but all of the gains will aggregate to the top. You don't need that many employees actually involved in these companies, and they will capture the trillions of dollars in value, and a new underclass will form. That's why we need UBI. So, see how quickly I got there on this one. What's your view?
Yeah, so this is again a fallacy, right, at the heart of Marxism. And the way I always kind of put it to my friends is: if the answer is UBI, then the question was communism, right. It's amazing how quickly sometimes people get to, oh, we just need to give everybody money. It's like, okay, that sounds familiar. So the reason that doesn't happen, and what people miss — this is again a fallacy that's repeated over and over — is that technology democratizes. And with AI it's striking that it's so obvious that that's already happening. You and I and anybody else in the world today can get access to, and use in our daily lives, the very best available AI. Anybody watching this can use the same. You know, I cannot pay a million dollars and get a better AI than anybody watching this can. And the way that you do that, if you want it, is you can go on ChatGPT for twenty bucks a month and you get the good stuff, or you can go to Microsoft Bing, which is a version of GPT-4, or you can go to Google Bard, which is also getting quite good.
Those are actually free, right, So Microsoft and Google are already bringing this stuff to market for free.
Why are they doing that?
It's because they're trying to use it to boost their existing search businesses or whatever. But that's not your problem; they're giving it away for free. And so the best AI is already available to anybody who wants it. Over one hundred million people are already using it in their daily lives. That number is going to ramp very fast from here, and so you'll have it, I'll have it, everybody else will have it. Everybody will have it in their lives, everybody will have it in the workplace. Everybody will be able to build businesses based on it. Everybody will be able to upgrade any aspect of how they operate based on it. And so it democratizes. And the reason it democratizes is actually sort of an Adam Smith refutation of Marxism.
The reason it democratizes is because these companies, companies like Microsoft and Google and OpenAI, want to become big, important companies.
Right.
The way to be the biggest possible vendor of technology is to provide that technology to the largest possible market, and the largest possible market is everybody on the planet. And so basic commercial self-interest actually leads to the opposite result, which is everybody gets the technology. That happened obviously with the smartphone, it happened obviously with the Internet, it happened with the microchip, it happened with the car. You know, it's happened with every other new form of technology, and that's what's happening here.
Marc, one of the points you often make, which I think is very important, is that right now the industries that are actually driving the most inequality are housing, education, and healthcare, all of which are the most resistant to technology. As you alluded to earlier, lawyers are able to keep effectively a moat around their fees, doctors in the same way, even if technology exists which could quite easily give the consumer a much better product. Could you touch on that, though, about how tech, and not even just tech but really real competition, has not been allowed to penetrate the regulatory moat around these three sectors, which are increasingly driving the vast majority of the inequality in US society?
Yeah, so we live in a bifurcated economy, by sector. The way I describe this is there are what I call the fast sectors, which are the sectors that are allowed to incorporate new technology. And these are sectors like consumer electronics, television sets, right, media, software, and actually, by the way, food and drink, interestingly, where you're allowed to have very rapid technological evolution, you're allowed to have free market capitalism, you're allowed to have tons of competition, tons of new entrants, tons of startups. Then you've got what I call the slow sectors, and the slow sectors are the sectors the government is very involved in, where the government establishes all these regulatory barriers and constraints on what companies can do, and the result of that is rapidly escalating prices with very little technological change.
And you mentioned it, but the three big slow sectors are housing, education, and healthcare. And by the way, you could put a couple of other sectors in there, and I would. You could say, also, government is in there: government agencies are generally also very resistant to new technology. And then I would also say the entire field of law, essentially, the entire field of the administration of the economy, which of course has been, you know, growing tremendously, is also very resistant to new technology. And so you have this bizarre situation if you're trying to raise a family in the US right now, which is that you can buy, you know, a television set that covers your entire wall, right, in spectacular 4K, 8K visual splendor to watch Netflix, for like three hundred bucks, but if you want to send your kid to college for four years, it's three hundred thousand dollars.
Right, and those prices bifurcate further every single day. And actually, my analysis is that this is why it feels like our politics have gone so wild on both the left and the right. I think this is the underlying reason why there's so much populism on both sides of the aisle.
It's because the experience of the typical American voter, citizen, is that the American dream, as defined by the very basics of buying and living in a house, you know, sending your kids to college, and having excellent health care, the prices of those things are running completely out of control. And this is one hundred percent a function of government involvement. The government in all three of those sectors subsidizes demand while restricting supply. And so those are sectors that have a cartel-like structure, right. It's virtually impossible to start a new university, it's virtually impossible to build houses in lots of places, it's virtually impossible to start a new hospital or get a new doctor licensed. And so the government basically has a set of policies in place that restrict the adoption of technology and drive prices up in those sectors.
I think they should stop doing that. You know, one of the things we're trying to do in our business is inject more technology into those sectors, but it is very hard, because the barriers there to new technology are very high.
Yes, they're outrageous. And finally, actually, one point that you touch on is about national competition vis-à-vis China. There have been some arguments here about the Pentagon and whether we should have completely state-subsidized AI in the way that the Chinese companies do it. You've argued in the essay that a robust private sector is actually enough in order to, quote unquote, beat them. Given, though, their massive investments inside of China on quantum computing and more, why is it still enough for our private sector to be able to come out on top eventually?
Yeah.
So, first of all, there's sort of an incongruity in how Washington is dealing with AI, and this is something I experience a lot. I talk to people in Washington on Tuesday, right, and they're very worried about all the arguments that we've already discussed, sort of internal to the US. I talk to the same people on Thursday, and they're very worried about China, and all of a sudden they're like, wow, we in the private sector need to team up and advance AI as fast as we can, because we have to compete with China, and we might be in a future war with China, and that war might be won or lost on the basis of the application of AI. And so there is this weird thing, of course, where societies kind of turn on themselves when there's no external threat. The Chinese external threat, you know, maybe it's showing up at a good time, sort of weirdly, not in the way that we would prefer, but maybe it's going to get us a little bit more focused on things that matter.
Look, both the Chinese state and the American state have decided that AI and autonomy are the future of warfare. China has been very public on this. The great thing about China is they just publish their plans. They just tell you what they're doing. It's very straightforward, kind of how they communicate. And then look, in the US, the Pentagon, several years ago now, basically decided that what they called the third offset, sort of the third big national strategy for technological advance in military affairs, was going to be AI and autonomy. And so they've actually already been marching down this path. So it's pretty clear that both of the major powers in the world believe the future of warfare is going to be based around AI. In China, they have a system where the public sector and the private sector are completely intertwined, right. The private sector in China is owned by the public sector; it's owned by the Communist Party. And so they have the advantage that they can directly control the entire productive capability of their system against their military needs, which is what they're doing. We don't have that. We have, at least in theory, a free market system. The advantage of the free market system historically is a much faster rate of innovation, right. If you get all these companies actually competing with each other, all these new entrepreneurs that we back competing with each other, competing with the big companies, you get a much faster rate of innovation. Historically, the US system has led to a much, much faster rate of innovation than the centralized Soviet system did or the centralized Chinese system has. And so my argument is very straightforward: we should unleash the animal spirits of our private sector in our defense sector in the same way we did during the Cold War. We should have an absolute determination to just completely swamp the Chinese centralized system in terms of the quality of our AI, and then in any future military conflict we should be set up to just beat them decisively right out of the gate with superior technology.
Yes.
And then my last question, Marc: you're one of the best-read people I know. What are some great books that people can read to prepare for this transition on a social level, a technological level, and a political level?
Yeah, so, I mean there's tons.
I mean, my number one recommendation on the shifts of technology is a great short book written by an MIT professor fifty years ago called Men, Machines, and Modern Times. It goes through this sort of cycle of adoption: what happens when a new technology arrives that threatens to, you know, kind of upend the established order.
The book is interesting because it has a specific focus on historical precedents in military technology, and so it has this incredible story of the first naval gun that was able to automatically compensate for the roll of ships, like a hundred-some years ago, that changed naval warfare forever. And the level of institutional resistance from both the US and the UK navies to adopting this technology was, you know, completely wild. So anyway, it's a very good book on the societal impact of new technologies. I also can't help but recommend, on the fiction side, what I'm reading with my kid right now, The Hitchhiker's Guide to the Galaxy. And I bring that up because Douglas Adams, the author, had maybe the funniest explanation of how new technologies are received by society. He said, if you're below the age of fifteen, the new technology is just obviously the way the world works, why wouldn't it be, which, by the way, is how my eight-year-old kind of uses AI: yeah, of course the computer answers questions, why wouldn't the computer answer questions? And then Adams says, if you're between the ages of fifteen and thirty-five, the new technology is extremely exciting and you might be able to make a career in it. And if you're above the age of thirty-five, the new technology is unholy and scary and it's going to destroy society. So I think that framework is exactly what's applying to AI right now.
All right, well, we'll have links to those down in the description. Marc, we really appreciate your time. I think the audience will get a lot out of this.
Awesome. Thank you man, Thanks man.
We're always looking for a good take out there, and this one is genuinely stunning. Scott Galloway of Pivot fame, co-host of that podcast with Kara Swisher, usually confined to the tech world and, like, cringe liberal politics, has decided to wade into the Republican primary. Here's his prediction.
I think President Trump is not going to run for president. Yeah, under the auspices of a plea deal.
Where is that coming from? No one says that stuff.
Actually, Governor Christie said it, or at least components of it.
You're going to be on the stage if Trump shows up. He qualified thanks to my donation to him. I can't believe I gave him money.
I actually think Governor Christie is going to surpass DeSantis to be the number two. But I don't think it'll matter.
I think Trump — I don't, I can't empathize with President Trump, but I know how old rich men think. He has a very nice life, and his life can be going back to golf and sycophants and having sex with porn stars, which I think is a good thing. I, being cynical, would like to do more of that at some point in my life.
I'm going to hit you when this is over, just saying.
Or he can live under the threat of prison.
Okay, that's certainly a take. Here's why I think Scott's wrong. If that were true, then he would not have run again in twenty twenty four. Everything that he said, I've heard that, by the way, from a lot of people who know Trump. They're like, Trump likes to golf, he doesn't like to do work. All that's true. Guess what: he can do all of that as president and still get all of the attention he's always craved.
So I think the analysis is spurious. I also think that, given the threat of legal indictment, being president is the best possible scenario, because it shields you legally from a lot of it, gives you legal immunity in many of these cases, and gives you the best political case to wage in order to keep you out of a prison cell.
Okay, so I agree with you, but I'm going to make the other case just to make it interesting.
So if he didn't run for president — like, the crimes were already committed. The writing was on the wall that he was likely to face indictment in a variety of cases with regard to January sixth and the documents and the thing in New York as well. So since that writing was already on the wall, if he doesn't run for president, then he doesn't have any leverage to try to secure the plea deal that Scott Galloway is floating here, which others have floated as well, which is like: listen, you agree to not run for president, you go back to your private life, and this all goes away.
So it would make sense in that context of like: I get to hold the threat of me being president again over their heads, so I can secure some sort of a deal to make sure that my ass does not end up in prison, and then, you know, the fallback plan is to actually win the presidency again, and then he can make his own problems go away.
The reason that I don't think this is correct is, you know, if they strike this kind of a plea deal with Trump, it really validates the case that this all was political to start with, that the only thing they cared about was keeping him out of the White House to begin with. And the lawyers that I've talked to — because, you know, I don't have full insight into some of the mechanics of this — the lawyers I've talked to really think that this is a very unlikely outcome, because you're offering a political solution to what is ultimately a legal problem, and those two things don't normally mix. So that's why I don't see it happening, frankly. But hey, it's interesting.
Yeah. Look, what do you think of the Chris Christie take? Oh, totally wrong. I mean, look, no offense, Scott, but this is just not your wheelhouse, really, at all. I think here's the problem for him: Scott genuinely is like a resistance liberal, and he hates Trump, and he wants to see somebody like that ascend. But you and I are familiar with how actual GOP primary voters think and act, and they love Trump, and the thing that they hate the most about Chris Christie is that he's anti-Trump. Chris Christie, if he were running on literally anything else, probably would be doing fine. If you're somebody who is genuinely anti-Trump, the best lesson you can learn is to be like a Nancy Mace or — what was the guy who got elected governor of Georgia? I'm blanking — Brian Kemp, Brian Kemp. If you want to be like Brian Kemp, in those two instances, what do you do? You run against the left, and you just kind of hope that nobody really notices that you do have differences with Trump. That's not the Christie tactic. There's no way you can be number two like that, not ever. That's like running against Ronald Reagan in nineteen eighty. It's just not going to happen.
I do think — I agree, he's not going to be the nominee. I don't know if he could pull into second, although actually I did see, I mentioned this, a New Hampshire poll that has him actually tied with DeSantis for second place in New Hampshire.
I do think that there is something, just on a basic human level, that is compelling about the fact that he feels, just like Trump does, like he's saying the things that he really thinks. Whereas DeSantis, Tim Scott, Nikki Haley, all this other cast of characters, it's so naked that they're posturing, they're trying to get the right answer, they're using whatever their little consultants and focus groups told them to say. And obviously Chris Christie has his own advisors and is doing his own version of that game, but he has much more of that feel of: listen, I know this is uncomfortable and unpopular, I know the audience that I'm speaking to, I don't care, I'm just going to say it anyway. There is something compelling about that. Is it 'win the GOP nomination' compelling? No. Is it 'do a little bit better than I think people are giving him credit for' compelling?
Yeah, I could see that. I could see that. All right, Well maybe we'll find out. Maybe I'll look like an idiot. You could roll my clip and say I'm the actual cringe person who doesn't know what I'm talking about. That's always fun. So we'll see you guys later.
So some fascinating charts were posted to Twitter showing increasing political polarization around some key issues of just, like, basic self-esteem, basic self-worth, and these things seem to be splitting. There seems to be more of an ideological divide in these feelings over time. Let's go ahead and put this up on the screen so you can get a sense of what we're talking about here. So you've got some questions, like "On the whole, I'm satisfied with myself," where you have a spike in Democrats who disagree with that sentiment relative to Republicans. "How satisfied are you with your life as a whole these days?"
You again have Democrats really dissatisfied with their lives and an increasing divide there. Same thing with "Life often seems meaningless." "I enjoy life as much as anyone" — this one has potentially the biggest spike here, which I actually thought was kind of revealing. "The future often seems hopeless" — you've got nearly forty percent of Democrats who agree with that. Again, that has really spiked in recent years, whereas the Republican number has shown a recent spike, but not nearly as much as the Democratic number. And that's one of the biggest divides. So it's kind of fascinating that there seems to be a real split in assessments of the future and in personal satisfaction — how I feel about myself, how I feel about my life, how I feel about the future, etc. — between liberals and conservatives.
I do wonder how much of this is generational, though, just because so many young people are liberal; they are going to self-select for more doomerism. And I mean, you can't blame them, on one hand. But I do think there is something to this. You can see it in the attitudes around COVID, you can see it around safetyism, you can see it around optimism, around the feeling about the country in some cases. You know, I'm on various different sides of those, so I do tend to understand. But I think the one that is very sad is "Life often feels meaningless," with a huge spike in the agree number for self-identified liberals, and "I enjoy life as much as anyone," with the disagree number also spiking as well. So in general, I think this is probably skewed most by generational gaps and by wealth, but there is definitely something, I'm not going to say spiritual, I don't know what the word is, epistemological, something about what's going on here.
Yeah, I think the generational point is well taken. I do think it's worth noting that on almost all of these metrics, even though the liberal numbers go up at a faster clip, the conservative numbers are rising as well. So you do have increasing levels of personal dissatisfaction and pessimism rising across the population, and in some ways, in a lot of ways, I think that's justified. You know, the one about the future often seeming hopeless, that one really struck me, because listen, if you're a young person looking at, you know, the fact that the air is unbreathable from wildfires, and there are increasingly extreme temperatures and climate catastrophes, and it just seems like there's no end in sight and this is going to continue to spiral out of control — yeah, it's pretty justified to have some negative feelings around that.
And then, in terms of personal material conditions, if you are a person who benefited from growing up and starting your career at a certain age in America, where now you've got a home and you feel pretty solid, pretty stable, and you've got a nice little nest egg, and you're of that generation, yeah, you're probably going to feel pretty good about things, at least better about things than young people who are like: I'm never going to be able to buy a house, I'm not going to be able to have a family, I'm never going to be able to have kids, never going to be able to have some basic measure of stability. And rather than feeling like, yeah, but things will get better over time, it just seems like we're facing down a landscape of decline. You know, to bring it back to politics, because that's what we do here: just the specter of, we're really going to have Donald Trump and Joe Biden as, like, the best we can do in politics? I just think that's so symbolic of this sense of national decline, which pervades people's overall feelings of optimism and also, you know, can lead into their own personal feelings of how their life is likely to go over the coming years.
Yeah, I think that's true. I think a lot of it focuses on future conditions, and that's where, you know, the most immediate things you can track are cost of living and, not even educational attainment, but at least a feeling that you can ascend to a higher social class if you work hard enough. That's really, you know, the basic American contract that was there. It's exactly why those numbers were much more equalized in the past. So I do think this is a very generational chart, and I think that's actually even more unfortunate than when you divide it by partisanship, although there is something, even amongst people who are of the same age, in terms of some sorting on these issues, which is not good, by the way — for people to have totally different worldviews.
I think there's always been a divide between liberals and conservatives. Conservatives tend to rank higher on levels of happiness, and so there's always been a bit of a split here. But the fact that you have these generational divides coming to the fore, I think, is very noteworthy. Agreed. All right, we'll see you guys later.