Support for this episode comes from The Current Report. From data privacy to the future of TV, retail, media, and beyond, the world of digital marketing is constantly in flux. So how can you keep up? Well, The Current Report is there for you. Each week, marketing leaders on the cutting edge give you the latest insight. So if it's creating a buzz, they'll be talking about it. Subscribe to The Current Report wherever you get your podcasts.
Hello and welcome to Decoder. I'm Nilay Patel, editor-in-chief of The Verge, and Decoder is my show about big ideas and other problems. We've been talking about AI a lot on the show lately. It's unavoidable. And there's one piece of feedback we keep getting that I really wanted to spend some time on: how the lightning-fast explosion of AI affects the climate. After all, to run AI at scale, we need to build a lot of data centers and pack them full of power-hungry GPUs.
That takes a lot of energy, and whether using all that juice is worth it comes up frequently whenever we talk about AI. It's both a matter of practical concern (can our aging grid support all of this?) and a moral objection: we shouldn't build these systems because they'll wreck the
planet. And what's particularly complicated is that big tech companies like Amazon and Google and Microsoft have all spent the last few years working with governments around the world to set ambitious goals around sustainable energy usage, so that we might slow the rate of climate change to simply bad instead of catastrophic. But now with AI, all of them are blowing past their emissions targets, and they're actually getting worse over time instead of better. And it's not great.
But putting a bunch of computers in a data center and running them at full tilt is how basically everything works now. If you have a moral objection to AI based on climate concerns, you might also have a moral objection to TikTok and YouTube, which are constantly ingesting and encoding millions of hours of video. You might have a moral objection to video games, which both run on power-hungry GPUs in people's homes and often also require intense data center
workloads for online multiplayer. And I'll take a guess, but I feel pretty certain that anyone with climate concerns about AI has a pretty harsh assessment of crypto as well. And think about it this way: the Nvidia H100, which is the gold standard for AI GPUs, is pretty similar to the gaming-focused Nvidia RTX 4090 in terms of power draw. What frameworks should we use to evaluate the climate impact of those cards and how we feel about how they're used? It's messy and it's complicated
and there are a lot of apparent contradictions along the way. So really, it's perfect for Decoder. To help sort it out, I've invited Verge senior science reporter Justine Calma on the show to see if we can untangle this knot. Justine Calma, welcome to Decoder. Thanks for having me. There's a lot going on here, and we're going to get into a lot of the moving parts. But I want to start with some really basic facts just to set the stage, because we can't talk about a problem getting worse, or not getting worse, if we can't name it to begin with. There are a lot of data centers in the United States. There are more of them every day. How much power are they using right now? Is that a reasonable baseline to set? There are about 8,000 data centers around the world as of the latest estimates from the International Energy Agency from this year, and about a third of those are in the US. So the US has more data centers than any other country in the world.
We know that globally data centers use about 2% of electricity. Again, that's for those 8,000 data centers globally. The US has about a third of those data centers that use roughly 4% of US electricity. It's a small but meaningful chunk. Is that number growing? It is. I mean, we've seen data centers become more energy efficient, but we've also seen their energy use expand since 2000, and that's expected to grow a whole lot more in the coming years, particularly with data centers
that are used for crypto mining and for AI. When you say data center, does that encompass everything from Amazon's big fancy AWS data center or Microsoft's big fancy Azure data center all the way down to a guy in Texas who rented out a barn and stuffed it full of crypto ASICs?
I mean, data centers have been around for a long time, but I think what is worrying some energy analysts now is where there's a lot of fast growth, which is outpacing both the capabilities of our aging grid as well as our ability to try to meet climate goals that have been set in the US and globally. So data centers are expected to use about twice as much energy, electricity in particular, by 2026 as they did last year. And AI in particular could use about 10 times
as much electricity in 2026 as it did last year. So that's huge growth in demand. Where's that growth estimate coming from? Is that an independent observer, or is that what the companies are saying? That's coming from the International Energy Agency, which is an international organization that was started back in the 1970s to monitor the world's energy supply and use. It was founded in the wake of the 1970s oil crisis, and so they're really concerned with making sure
that we have enough energy to make the world run. Lately, they've also done a lot of tracking around the clean energy transition as well. And when you have an estimate from them that says AI will use 10 times as much power as it does today, what is the methodology there? Is that something we can go look at? Do they really detail it? They're looking at a few different things. When it comes to a data center, they're looking at the power that the machines are actually using. So that
accounts for about 40% of the electricity demand of a data center. Cooling requirements are another 40%, and then the remaining 20% comes from other associated IT equipment. And then they do put out a lower-bound and a higher-bound estimate in terms of how much electricity data centers could use in the future. And so that's where you get a couple of different estimates in terms of what that might look like. So the upper bound, the most, would be equivalent to adding the electricity demand of Germany. That's on the larger end: an additional 590 terawatt hours of electricity demand. And then the lower end would be around 160 terawatt hours of electricity demand, which would be more in line with the electricity that a country like Sweden uses. Let's put that into some context. Here in the United States, there's an entire federal agency, the Energy Information Administration, that exists to gather, analyze, and publish information
about energy usage in the United States. They publish all the reports you could ever want about how much electricity is used in this country, how it's made, and where it goes. In 2022, the EIA says, total electricity production in the United States was about 4,180 terawatt hours, and total electricity consumption was 4,070 terawatt hours. Those numbers are very close. And the amount we use is going up faster than the amount we produce, which leaves very little margin for error.
This is what people mean when they worry about the grid. Companies make power plants of whatever kind: gas, solar, nuclear, hydroelectric, and so on. And then all of that electricity gets transmitted through a bunch of systems, up until the last-mile utility cables on your street bring it to your house. The grid is in a pretty shaky spot, with demand growing faster than supply across the board.
There are some pretty obvious examples of this, like the blackouts or brownouts you see when there's a severe cold snap or heat wave and some local part of the grid just can't keep up with heating or cooling demand. All of our data centers basically just plug into the grid the same way that any house, store, or office building might. And there's just not a lot of wiggle room left for all that extra demand. I think it's important to understand the broader context here of why we're sweating over
small single-digit percentages of electricity demand increase. And part of the reason why is that our power grid is already operating on a razor's edge. You're constantly trying to generate just enough electricity to meet demand to avoid power outages, right? We also don't have a whole lot of energy storage on the grid to safeguard against the kind of energy crunches that we're starting to see more of whenever there are stronger storms or heat waves putting a lot more pressure
on the grid as well. We also really don't have a lot of time to meet the climate goals that have been set at the global level, for the US, and also that many of these tech companies have set for themselves. And so when we're talking about trying to thread the needle, there's a lot at stake. That's where people start to sweat over numbers like this. We have to take a short break. When we come back, we'll dig into where all that energy in the
grid comes from, and if big tech companies are keeping their promises for how to deal with it. Support for this show comes from ARM. Have you ever thought about the technology that makes this podcast possible to listen to on your phone, in your car, or on your laptop? And then there's the data centers that make it all work. One company is at the heart of it all. It's the same company that powered the smartphone revolution and is helping define the AI revolution. That company
is called ARM. ARM designs compute platforms for the biggest companies in the world so they can create silicon and solutions to power global technology. ARM is proudly NASDAQ listed and became a NASDAQ 100 company within a year of its IPO. ARM touches nearly 100% of the globally connected population. 99% of smartphones are built on ARM. Major clouds run on ARM as well as all major mobile and PC apps. Now, ARM's engineers are tackling the insatiable demand for compute and
power efficiency that AI is creating. AI-enabled ARM CPUs are able to provide the compute platform for the global AI revolution in the years to come. But for now, relax and enjoy this podcast. It's very likely running on your own ARM-powered device. Visit arm.com slash discover to learn more. Support for Decoder comes from The World As You'll Know It, a podcast about the forces shaping our future. Most people understand the reality of climate change. You can see and experience the effects.
But what's harder to see is what's being done to combat it. On this season of The World As You'll Know It, they go deep inside the technological revolution to replace fossil fuels with clean energy. It's a monumental task, and it's a story full of big successes, major failures, difficult grinding work, and enormous promise. You can meet the scientists, engineers, and entrepreneurs taking this challenge on. And they dig into questions about technologies: Can you make clean steel at scale?
Can you capture carbon from the atmosphere and lock it underground? The show looks at how solar power became cheaper than gasoline, at batteries that can power longer and longer drives, and at air conditioners that might be able to keep us cool without heating up the planet. It's the new season of The World As You'll Know It: The Great Rebuild. Available wherever you get your podcasts. This episode is brought to you by Shopify. Forget the frustration of picking commerce platforms
when you switch your business to Shopify, the global commerce platform that supercharges your selling wherever you sell. With Shopify, you'll harness the same intuitive features, trusted apps, and powerful analytics used by the world's leading brands. Sign up today for your $1 per month trial period at Shopify.com slash tech, all lowercase. That's Shopify.com slash tech. Welcome back. I'm talking with Verge senior science reporter Justine Calma about how energy
use and climate commitments are colliding with the growth in generative AI. We've talked about the grid a little bit and how there's not a lot of storage in it. Supply is constrained, demand is going up. The big companies like Amazon, Microsoft, and Google, which are building all of these data centers, love to talk a big game about how they're using renewable or sustainable energy and reducing their emissions and fossil fuel usage. What kind of power are most data centers using
right now? Right now, power grids in the US and globally are still running on a lot of fossil fuels. In the US, it's still a majority of fossil fuels, gas in particular. And so while we do see solar and wind energy ramping up, it's not ramping up at a pace that right now is going to get the US to meet its climate goals under the Paris Agreement. We honestly just don't have enough
renewable energy on the grid right now. When you plug into the grid, you don't really know where that electricity is coming from, whether it's coming from a gas-fired power plant or from a solar farm, and that makes it hard to make sustainability claims. When you hear a big company say, we're going to have a 100% renewable data center, is it actually that they've got enough solar panels on the roof to run the data center? Are they just trading credits back and forth? Are they just making these power purchase agreements, or is it adding up and they get to claim it? Who audits that? A lot of these companies have made really big climate pledges, particularly to reach net zero or net negative carbon dioxide emissions. And a big part of that is to be able to run on renewable electricity. For them to say that they are running on renewable energy is tricky. Again, if they're plugging into the grid, they don't know where that electricity is coming from. All the
electrons get jumbled up in the grid, right? We don't know if it's coming from a gas-fired power plant or if it's coming from a solar farm. And so typically companies will buy renewable energy certificates to claim the environmental benefits of generating renewable energy. And they buy enough of those to match their electricity use. That's generally how a lot of
companies are meeting their sustainability goals. They're also trying to enter into power purchase agreements to facilitate the building of new solar and wind farms to bring more renewable energy capacity online. But yeah, generally you're not seeing a data center that is just running off of solar power on its rooftop. It's important to say that these are part of broader climate goals. So you have, under the Paris Agreement, a commitment to stop global temperatures from rising beyond roughly
1.5 degrees Celsius above where they were before the industrial revolution. And that is really meant to save homes, that's meant to save lives, that's meant to save species. You know, if you move beyond that, you're looking at things like losing 99% of the world's coral reefs, losing a lot of coastlines to sea level rise. And so the goals that they've set aren't out of nowhere. They're meant to help get us to these science-based targets. The companies that have set these big sustainability goals have also absolutely blown past those goals as they build out more and more data centers. Microsoft's emissions, for example, were 30% higher in 2023 than they were in 2020, when the company made its first big climate pledge. Amazon is pumping out 34% more emissions than when it made its sustainability pledge back in 2019. And Google's 2023 emissions were up by 48% compared to 2019.
So these companies are just going for it. There are not going to be fewer data centers anytime soon. And the hard part is that they're not just for AI. Pretty much everything we do these days happens in the cloud. And the cloud is really just someone else's computer. If you're watching video on Netflix or YouTube, listening to music on Spotify, talking to someone on Zoom, or playing an online multiplayer game, it's all using some amount of computing power in a data center somewhere. But the
where specifically is really important. When you get right down to it, an Nvidia H100 in a data center rack is drawing a comparable amount of power to a consumer-grade Nvidia 4090 in a gaming PC in someone's living room. But the H100 in the data center is sitting next to hundreds of other H100s. And that's where it starts to become a problem. On top of that, data centers themselves tend to be clustered in certain regions for various financial and logistical reasons. That means all
the energy usage isn't distributed evenly around the country. Instead, it's putting a deep strain on certain parts of the grid. Location matters. And so you're seeing a lot of concentration of data centers being built in, say, Virginia, for example, or, for crypto mining, in Texas. So when you have a lot of data centers that are pulling electricity from the same local power grid, that has a much bigger impact than if it were diffused across the US. Right. So you do run into more issues if all of this is
going down in one location. You have a lot more stress on that local power grid. You have that utility potentially looking to extend the life of, or bring on more, fossil fuel power plants. And then also facing more energy crunches when it's trying to balance the energy use for those data centers with what other businesses and residents need in their day-to-day lives. We've seen in some
places folks' electricity bills go up a little bit. You also see concerns, again, about building out new fossil fuel plants and more pollution from that, as well as what happens if there is an extreme heat wave and you need to power a lot of people's air conditioning, but also all of these new data centers, and you don't have enough electricity supply during a demand peak. I've definitely seen a lot more concern about what's going on with AI and crypto. Data centers have been energy
hogs for a long time. Right. And these companies have done a lot to say that they're addressing that. They've looked for more efficient ways to cool down these data centers, because cooling is a huge electricity cost. They've tried to make more efficient chips. I think right now it's just that we're seeing AI throw a wrench into those kinds of best-laid plans. Here's the complexity for me. I can't think of a point in modern human history, especially the history of humans with computers,
where energy usage has gone down. It only ever goes up. That's what we do here. And it feels like AI might accelerate that. It feels like crypto definitely accelerated how much energy we're using to power computers. But underneath it is just a linear increase. And we're not getting too mad about Netflix every so often re-encoding its entire library, which uses a lot of power and runs a lot of computers at 100%. Gmail just gets bigger. We're never going to make Gmail smaller. YouTube,
we just keep adding an infinite amount of video to YouTube every single day. Our cell networks are getting bigger, and we're putting more computers on them every day. There's that. And then there's AI and crypto. And it feels like the difference to me is maybe, yes, maybe they're going faster. But we as a society have not assigned them a value that evens out the amount of usage they take up. I feel like we could get really mad about how much power YouTube uses every day versus the value
of YouTube. But actually, most people think YouTube is really valuable. That is a big part of the question: just what are we using these AI tools for, and is it seen as something that is worth the risk? But I do think there are some other factors at play here too. Where we are now, we're seeing the world just had four of the hottest days on record in the past week. We're seeing more intense heat waves, storms, fires. We're running really close to not being able to meet a lot
of climate goals to keep those things from growing much worse. And so right now it's just that the pressure is on, and we are really supposed to be using less energy. That is a big piece of the puzzle too: trying to be more strategic about our energy use and figure out how we can use energy much more efficiently. Let's take another short break. When we come back, we'll really dig into that tension a bit more. What is the value of AI right now? And will the industry be able to realize its grand ambitions around all of its mounting energy use? My dad works in B2B marketing. He came by my school for career day and said he was a big ROAS man. Then he told everyone how much he loved calculating his return on ad spend. My friends are still laughing at me to this day. Not everyone gets B2B, but with LinkedIn you'll be able to reach people who do. Get a hundred dollar credit on your next ad campaign. Go to LinkedIn.com slash results to claim your credit. That's LinkedIn.com slash results.
Terms and conditions apply. LinkedIn: the place to be, to be. I'm Preet Bharara, former US attorney for the Southern District of New York. Second gentleman Doug Emhoff joins me this week on my podcast Stay Tuned, and he had some choice words for JD Vance and Donald Trump. These sycophants, these opportunists like JD Vance and others. And it shows. It's buffoonery. We discuss Vice President
Kamala Harris's historic road to the White House, Mr. Emhoff's important advocacy work, and what life has been like in the eye of the storm that is this year's election. The episode is out now. Search and follow Stay Tuned with Preet wherever you get your podcasts. Welcome back. I'm talking with Verge senior science reporter Justine Calma about the incredibly messy collision of climate goals and our reliance on the cloud, especially for AI.
Right before the break, Justine asked a really important question. What are we using these AI tools for and are those uses worth the risk? And that's where we really start to feel the tension around AI and energy. We all understand there's a value in being able to watch Netflix in your living room or talking to your coworkers or friends on Zoom or FaceTime. But when we ask about AI's value, we're really talking about two different things. One is its financial and economic value. Big tech is
all but tripping over its own feet to invest in AI as fast as it can. But so far, they're not getting any returns. Every time I've talked to a CEO about AI on Decoder, they all say the same thing: the AI investments are for the long term. Everyone is making a huge bet they hope will pay off in the long run. It might be that all those H100s just sit idle. This is where the comparison
to crypto gets really complicated. An unusual aspect of crypto mining, especially Bitcoin mining, is that it's guaranteed to use 100% of the computers you throw at it, since the underlying problems are designed to get harder to solve over time. The more computers you throw at Bitcoin, the harder they all have to work, and you can basically predict their usage in a straight line. But we don't actually know how much anyone is going to be using AI, or if they ever will at a meaningful
scale. We can't really quantify how correct or incorrect our projections are, because we don't really know what the value of AI is going to be for individuals or businesses over the long run. We might be stuffing H100s into data centers to create an enormous supply of AI compute power that there isn't enough demand for. And that's really affecting the quality of the research and the estimates we can get. I think right now, again, a lot of the research has been focused on
the energy consumption of training AI. There are really big questions about actually using all of these AI tools, the energy intensity of the inference part of AI. And that's where we're still seeing research kind of starting to come out now. And there are questions about how that might materialize. The question that a lot of energy experts and researchers are posing is what do we want to use AI for and what do we see as something that's worth the energy and environmental costs and what is
potentially something that maybe is just not worth it? It's complicated, because AI is a pretty bubbly industry. The energy usage estimates we're talking about are based on things like Nvidia's own sales projections, on the market saying the line will just keep going up forever. If that's true, well, then figuring out how much energy will be required to support AI as it grows is a huge challenge. But is that line going to keep going up forever? If you read some of
the emails from our audience, it definitely won't. But Nvidia and the rest of the industry are betting that everything will be successful and all those AI server farms will keep running at 100% forever. Maybe the answer is somewhere in the middle. We just don't know. And that brings us to an even more complicated problem, which is the moral question. You can gauge financial value pretty easily, and
you can look at something like monthly active users as a metric for success. But what good is AI to me or to humanity is a lot less quantifiable, and we're going to be arguing about it for a long time. There are a lot of moral objections to AI. We hear them every day at The Verge whenever we cover the subject. Artists, writers, and other creatives are mad about having their work used as training data, and they're filing copyright infringement lawsuits left and right. People who care about
democracy and elections are really worried about deepfakes, a problem which is being exacerbated by the very leaders of the tech industry on their own platforms, like X. A lot of people are deeply concerned about losing their jobs, and executives at some of the major AI companies have spoken pretty callously about that idea. There's just a long list of moral objections you can raise about AI. And a lot of people who do raise objections feel pretty powerless in the face of systems built by
billionaires. And because climate concerns are inherently existential, they can feel like a powerful argument against the technology that might cause a lot of other disruptions along the way. So it's useful to compare how we talk about AI energy use and its climate impact with how we talk about other big energy transitions to see how different the conversation is. Because as a society, we've assigned very different moral values to different kinds of energy use. For example, the EV
transition is broadly considered to be good, even though all those cars have big batteries that need charging, which will add up to more strain on the grid. Anti-EV activists, including Donald Trump, bring this up all the time. But the arguments don't have the same impact. When you're looking at EVs, they're displacing existing fossil fuel energy demand, right? So this demand that was being met by gasoline, by fossil fuels, that is ostensibly supposed to
switch to demand for renewable energy. So that's one distinction. Is that going to put a lot of stress on the power grid? Yes, it is expected to put a lot of stress on the power grid. And that is something that utilities and energy regulators are preparing for. It's also seen as something that is going to be a more steady increase. A lot of climate advocates will say that we actually need to depend less on cars in general, right? So we don't want to replace every gas-powered
car with another EV. We want to actually make our cities more walkable. We want to make our cities more bikeable. We want to have better public transit. Those are all solutions to pollution from
transportation. It's not just EVs. And then the other thing that I think is worrying about the kind of sudden growing demand for electricity for data centers in particular is that utilities are considering either keeping coal-fired power plants online longer, and coal-fired power plants are a lot more polluting than even the methane gas power plants that are more dominant in the US now, or perhaps even expanding their gas-fired power plants as well. And that, again, is moving in the opposite direction of where we're supposed to be headed with our climate goals. And scale really matters. Sure, your energy use at home will go up if you start charging a new EV, or if you have a high-powered GPU in a gaming PC and you play games for 10 hours a day. No judgment. And AI will start adding to that use too, in subtle ways. For example, Apple is promising that most of its upcoming AI features will run locally on its phones. But the laws of physics still apply,
and all that extra computation will still use energy. It will just be the energy in your phone's battery, which you might have to charge more often. It's hard to know how all of this will play out. Those features haven't even shipped yet. And it's hard for us as individual consumers to evaluate our choices at home compared to the massive scale of the problems out there in the world. I try to walk this line with consumers, because what we do individually as consumers, as gamers, whatever, is important. But I don't think we should let regulators and big companies off the hook, because what folks are really concerned about are these giant new hyperscale data centers, right? Not necessarily what you're doing in your home. The bigger impact is what's happening on that much larger scale. And pressure for large-scale change notoriously needs to be top-down, not bottom-up, because consumers only have so much power. Regulators and governments
in theory have a lot more. But they also have incentive to think more broadly than just one sector. I don't know that regulators are really trying to single out any particular industry just because of whatever their moral sense is on that industry. I think really what it comes down to is making sure that there are science-based targets to hit. And honestly, even with just what we have right now, even just with EVs and electrifying homes, that's already going to be a huge,
difficult lift that power grids and policymakers are struggling to implement. And so, again, we're talking about a system that's really teetering here. And that's where a lot of the concern comes in: when there are these big unexpected factors that add more pressure to an already really, really strained system and energy transition. There's one other big lever to push here, and that's cost. Energy isn't cheap. And the more constrained the supply gets, the more expensive it will be to use.
If there's one thing all of these companies don't love to do, it's spending more money than they have to. Money itself is a pretty big incentive for the companies investing in AI to also invest in making that AI as energy efficient as possible in the long run. Of course, the solution everyone would prefer would be to magically develop an abundant source of cheap energy. Inside the Silicon Valley VC set, the crypto boosters have all started talking about nuclear power as a solution.
They start proposing we build lots more nuclear power plants. Is that a reasonable idea? Is that something on the horizon in the United States? So, nuclear energy is a heavy topic. I mean, that's really controversial. On the one hand, you have some environmental advocates that say that it's the biggest single source of carbon dioxide-free energy that we have in the U.S. today.
It also doesn't fluctuate with the weather and time of day like solar, wind, and hydro do. And so, it's seen as something that can provide a stable source of carbon dioxide-free energy, and the Biden administration has been really all on board with nuclear energy as well, as a way to hit its climate goals. You also have, on the other side of the issue, a lot of environmental justice advocates that are really wary and skeptical of nuclear energy, because it still comes with nuclear waste
issues, right? And there are concerns about other environmental impacts from things like mining uranium, which has left a toxic legacy on the Navajo Nation in decades past. And so, you have concerns about nuclear energy along its supply chain, from where you get your fuel to what you do with that fuel. It's also been really, really expensive historically to deploy in the US. The last new reactor to come online in the US came in hugely over budget and way past its initial deadline. There are efforts right now to design next-generation nuclear reactors that are smaller, modular, and supposed to be easier and cheaper to build and site in different locations. However, we're still likely years away from seeing any of those commercial reactors. And again, there are still concerns about how much waste those are going to generate and where you store that radioactive waste for safekeeping, potentially for hundreds of years.
There's the other side of nuclear, which is the hype that one day we'll build a fusion reactor. Various companies love to talk about this as their magic bullet solution. Sam Altman has talked about it. Sam Altman talks about spending trillions of dollars to bring the world artificial general intelligence. So it feels like that's in the same bucket. Right. It's just hype, pure hype. But is there any meaningful progress toward fusion that
people should be aware of? Folks have been saying for decades that we're on the cusp of a nuclear fusion reactor, and you also have Microsoft making a pretty audacious deal to purchase electricity from a future nuclear fusion generator. And it believes that generator might be online by 2028, which is very, very soon. And you have other folks who are saying we're still, like, decades away, if at all close, from having a nuclear fusion generator plugged into the grid. We did have in 2022 the biggest breakthrough in nuclear fusion in years, when scientists accomplished something called fusion ignition, meaning they were able to trigger a fusion reaction that resulted in a net energy gain, because prior to that, you actually ended up needing to use more energy to trigger that reaction than you would get out of it. That's where we're at right now.
We're still just trying to get a net energy gain. So I hope that gives a sense of how much further there is to go with that. If we achieve fusion, if Microsoft spins up a fusion reactor in 2028, does that solve it? Do we stop worrying about AI? Does blockchain take over the world? Would I love to see nuclear fusion? I'd be super excited about that. And a lot of the folks that I've talked to who are even skeptical about it happening anytime soon say the same thing.
But it's generally not really thought of as a solution for the most pressing issues we're facing today. We'll see you next time.