In the final episode of this mini-series, Shea and Anders cover the other common tree-based ensemble model, the Gradient Boosting Machine. Like Random Forests, GBMs rely on a large number of decision trees, but they take a "boosting" approach that cleverly combines "weak learners" to incrementally extract information from the data. After an explanation of how GBMs work, they compare them to Random Forests and go over a few examples where they have used GBMs in their own work.
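As a rough illustration of the boosting idea described in the episode, the sketch below fits a sequence of shallow trees, each one trained on the residuals the ensemble has not yet explained. It assumes a regression setting with squared-error loss; the dataset, the learning_rate, and the n_trees values are illustrative choices, not anything specified in the episode.

```python
# Minimal gradient-boosting sketch: each shallow "weak learner" tree is fit
# to the residuals left by the ensemble so far, so information is extracted
# incrementally rather than by one deep model.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.tree import DecisionTreeRegressor

X, y = make_regression(n_samples=500, n_features=5, noise=10.0, random_state=0)

n_trees = 100        # number of boosting rounds (illustrative value)
learning_rate = 0.1  # shrinks each tree's contribution
trees = []

prediction = np.full_like(y, y.mean(), dtype=float)  # start from a constant model
for _ in range(n_trees):
    residuals = y - prediction                 # what the ensemble still gets wrong
    tree = DecisionTreeRegressor(max_depth=2)  # a deliberately weak learner
    tree.fit(X, residuals)
    prediction += learning_rate * tree.predict(X)
    trees.append(tree)

def predict(X_new):
    """Combine the constant baseline with every tree's shrunken contribution."""
    out = np.full(X_new.shape[0], y.mean())
    for tree in trees:
        out += learning_rate * tree.predict(X_new)
    return out

print("training MSE:", np.mean((y - predict(X)) ** 2))
```

The contrast with a Random Forest is in the loop: here each tree depends on the trees before it and only nudges the prediction, whereas a forest grows deep trees independently and averages them.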