428. The Future of Large Language Models, Open vs. Closed Sourced, and Why 2024 is the Year of Inference (Ashu Garg)

Apr 08, 2024 · 37 min · Ep. 428

Episode description

Ashu Garg of Foundation Capital joins Nate to discuss the future of large language models, open vs. closed source, and why 2024 is the year of inference. In this episode, we cover:

  • Venture Capital Investing Strategies and Ideal Founder Characteristics
  • AI Model Development and Its Future
  • AI Models for Legal Tech and Their Potential Applications
  • Tech Giants' Positioning in the AI Shift
  • Google's Innovation Dilemma and AI Adoption

The hosts of The Full Ratchet are Nick Moran and Nate Pierotti of New Stack Ventures, a venture capital firm committed to investing in founders outside of the Bay Area.

Want to keep up to date with The Full Ratchet? Follow us on social.

You can learn more about New Stack Ventures by visiting our LinkedIn and Twitter.

Are you a founder looking for your next investor? Visit our free tool VC-Rank and we’ll send a list of potential investors right to your inbox!
