#371 – Max Tegmark: The Case for Halting AI Development
Lex Fridman Podcast
Apr 13, 2023 • 3 hr 54 min • Transcript available on Metacast
Episode description
Max Tegmark is a physicist and AI researcher at MIT, co-founder of the Future of Life Institute, and author of Life 3.0: Being Human in the Age of Artificial Intelligence. Please support this podcast by checking out our sponsors:
- Notion: https://notion.com
- InsideTracker: https://insidetracker.com/lex to get 20% off
- Indeed: https://indeed.com/lex to get $75 credit

EPISODE LINKS:
Max's Twitter: https://twitter.com/tegmark
Max's Website: https://space.mit.edu/home/tegmark
Pause Giant AI Experiments (open letter): https://futureoflife.org/open-letter/pause-giant-ai-experiments
Future of Life Institute: https://futureoflife.org
Books and resources mentioned:
1. Life 3.0 (book): https://amzn.to/3UB9rXB
2. Meditations on Moloch (essay): https://slatestarcodex.com/2014/07/30/meditations-on-moloch
3. Nuclear winter paper: https://nature.com/articles/s43016-022-00573-0
PODCAST INFO:
Podcast website: https://lexfridman.com/podcast
Apple Podcasts: https://apple.co/2lwqZIr
Spotify: https://spoti.fi/2nEwCF8
RSS: https://lexfridman.com/feed/podcast/
YouTube Full Episodes: https://youtube.com/lexfridman
YouTube Clips: https://youtube.com/lexclips
SUPPORT & CONNECT: - Check out the sponsors above, it's the best way to support this podcast - Support on Patreon:
https://www.patreon.com/lexfridman
- Twitter:
https://twitter.com/lexfridman
- Instagram:
https://www.instagram.com/lexfridman
- LinkedIn:
https://www.linkedin.com/in/lexfridman
- Facebook:
https://www.facebook.com/lexfridman
- Medium:
https://medium.com/@lexfridman
OUTLINE: Here are the timestamps for the episode. On some podcast players you should be able to click a timestamp to jump to that time.
(00:00) - Introduction
(07:34) - Intelligent alien civilizations
(19:58) - Life 3.0 and superintelligent AI
(31:25) - Open letter to pause Giant AI Experiments
(56:32) - Maintaining control
(1:25:22) - Regulation
(1:36:12) - Job automation
(1:45:27) - Elon Musk
(2:07:09) - Open source
(2:13:39) - How AI may kill all humans
(2:24:10) - Consciousness
(2:33:32) - Nuclear winter
(2:44:00) - Questions for AGI