
In this episode, I'm going to talk about ensemble models, particularly bagging and boosting. Bagging is mainly used to reduce variance, while boosting is used to reduce bias. The most common bagging algorithm is Random Forest, and the most common boosting algorithm is Gradient Boosting, whose most popular implementations are XGBoost, LightGBM and CatBoost. A minimal sketch of both families follows below.
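The following is a minimal companion sketch (not taken from the episode) that fits one model from each family on synthetic data using scikit-learn's RandomForestClassifier (bagging) and GradientBoostingClassifier (boosting); the dataset, hyperparameters, and variable names are illustrative assumptions.

```python
# Sketch only: compares a bagging ensemble (Random Forest) with a boosting
# ensemble (Gradient Boosting) on a synthetic classification problem.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier

# Synthetic binary classification data, used purely for illustration.
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42
)

# Bagging: many deep trees trained on bootstrap samples and averaged,
# which mainly reduces variance.
bagging_model = RandomForestClassifier(n_estimators=200, random_state=42)
bagging_model.fit(X_train, y_train)
print("Random Forest accuracy:", bagging_model.score(X_test, y_test))

# Boosting: shallow trees added sequentially, each one correcting the errors
# of the current ensemble, which mainly reduces bias.
boosting_model = GradientBoostingClassifier(n_estimators=200, random_state=42)
boosting_model.fit(X_train, y_train)
print("Gradient Boosting accuracy:", boosting_model.score(X_test, y_test))
```

The same pattern applies to the other boosting implementations mentioned in the episode (XGBoost, LightGBM, CatBoost), which expose similar fit/predict interfaces.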
Home Page: https://www.yourdatateacher.com