
In the final episode of this mini-series, Shea and Anders cover the other common tree-based ensemble model, the Gradient Boosting Machine. Like Random Forests, GBMs rely on a large number of decision trees, but they combine them with a "boosting" approach that cleverly uses "weak learners" to incrementally extract information from the data. After explaining how GBMs work, the hosts compare them to Random Forests and go over a few examples where they have used GBMs in their own work.
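The boosting idea described above can be sketched in a few lines: each shallow decision tree (a "weak learner") is fit to the residuals the ensemble so far gets wrong, so every round incrementally extracts a little more information from the data. This is a minimal illustration, not the episode's own code; it assumes scikit-learn, and the names `n_rounds` and `learning_rate` are illustrative.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

# Toy regression data: noisy sine wave.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + rng.normal(scale=0.1, size=200)

n_rounds, learning_rate = 50, 0.1
prediction = np.full_like(y, y.mean())  # start from the global mean
trees = []
for _ in range(n_rounds):
    residuals = y - prediction               # what the ensemble still misses
    tree = DecisionTreeRegressor(max_depth=2)  # deliberately weak learner
    tree.fit(X, residuals)
    prediction += learning_rate * tree.predict(X)
    trees.append(tree)

def predict(X_new):
    """Sum the mean baseline and every tree's scaled contribution."""
    out = np.full(len(X_new), y.mean())
    for tree in trees:
        out += learning_rate * tree.predict(X_new)
    return out
```

The shrinkage factor (`learning_rate`) is what keeps each tree "weak" in effect: no single tree can dominate, so the ensemble improves gradually rather than overfitting in one step.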
Rating: 4.6 (3,030 ratings)