771: Gradient Boosting: XGBoost, LightGBM and CatBoost, with Kirill Eremenko

Apr 02, 2024 · 1 hr 55 min

Episode description

Kirill Eremenko joins Jon Krohn for another exclusive, in-depth teaser for a new course just released on the SuperDataScience platform, “Machine Learning Level 2”. Kirill walks listeners through why decision trees and random forests are fruitful for businesses, and he offers hands-on walkthroughs of the three leading gradient-boosting algorithms today: XGBoost, LightGBM, and CatBoost.

This episode is brought to you by Ready Tensor, where innovation meets reproducibility, and by Data Universe, the out-of-this-world data conference. Interested in sponsoring a SuperDataScience Podcast episode? Visit passionfroot.me/superdatascience for sponsorship information.

In this episode you will learn:
• All about decision trees [09:17]
• All about ensemble models [21:43]
• All about AdaBoost [36:47]
• All about gradient boosting [45:52]
• Gradient boosting for classification problems [59:54]
• Advantages of XGBoost [1:03:51]
• LightGBM [1:17:06]
• CatBoost [1:32:07]

Additional materials: www.superdatascience.com/771
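As a companion to the gradient-boosting discussion, here is a minimal sketch of the core idea shared by XGBoost, LightGBM, and CatBoost: start from a constant prediction, then repeatedly fit a small tree (here, a one-feature decision stump) to the residuals of the current model, i.e. the negative gradient of the squared-error loss, and add it with a shrinkage factor. All function names below are illustrative, not taken from the episode or from any of those libraries, and the real libraries add many refinements (regularization, histogram binning, ordered target statistics) on top of this loop.

```python
import numpy as np

def fit_stump(x, residuals):
    """Find the threshold split on a 1-D feature that minimises squared error."""
    order = np.argsort(x)
    xs, rs = x[order], residuals[order]
    total, n = rs.sum(), len(rs)
    best = None
    left_sum = 0.0
    for i in range(1, n):
        left_sum += rs[i - 1]
        if xs[i] == xs[i - 1]:
            continue  # can't split between identical feature values
        right_sum = total - left_sum
        # SSE of a two-leaf fit equals a constant minus this score, so minimise it
        score = -(left_sum**2 / i + right_sum**2 / (n - i))
        if best is None or score < best[0]:
            thresh = 0.5 * (xs[i] + xs[i - 1])
            best = (score, thresh, left_sum / i, right_sum / (n - i))
    return best[1], best[2], best[3]  # threshold, left-leaf mean, right-leaf mean

def predict_stump(x, stump):
    thresh, left, right = stump
    return np.where(x <= thresh, left, right)

def gradient_boost(x, y, n_rounds=100, lr=0.1):
    """Squared-error gradient boosting with decision stumps as base learners."""
    base = float(y.mean())                    # initial constant model
    pred = np.full_like(y, base, dtype=float)
    stumps = []
    for _ in range(n_rounds):
        stump = fit_stump(x, y - pred)        # fit the residuals (negative gradient)
        pred = pred + lr * predict_stump(x, stump)  # shrunken additive update
        stumps.append(stump)
    return base, stumps

def predict_boosted(x, base, stumps, lr=0.1):
    pred = np.full_like(x, base, dtype=float)
    for stump in stumps:
        pred = pred + lr * predict_stump(x, stump)
    return pred
```

For example, boosting stumps on a noiseless quadratic target drives the training error well below that of the initial constant model; the production libraries follow the same additive-residual-fitting recipe, just with deeper trees, second-order gradients, and far faster split finding.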