The Surging Problem of AI Energy Consumption

May 09, 2024 · 27 min

Episode description

On April 9th, Rene Haas, CEO of Arm Holdings, a British semiconductor and software design company, made a statement about data center energy consumption that most people would find shocking.

He said, “by the end of the decade, AI data centers could consume as much as 20% to 25% of U.S. power requirements. Today that’s probably 4% or less.”

Everyone wants to talk about AI, but this reality is something we don’t discuss nearly enough. AI may be the greatest unrecognized threat to the environment today, because AI is an energy hog.

Example: An Internet search through ChatGPT requires nearly 10 times as much energy as the same search on Google. Are the added benefits or the improved experience worth it? What about at scale?

In this episode of the Art of Supply podcast, Kelly Barner takes an honest look at the very real problem of AI-driven energy consumption:

  • Why AI requires so much energy to operate
  • Projections for the growth of AI usage and therefore AI energy consumption
  • How the use of AI should change given today’s sensibilities about sustainability
