
In the final episode of this mini-series, Shea and Anders cover the other common tree-based ensemble model, the Gradient Boosting Machine. Like Random Forests, GBMs make use of a large number of decision trees, but they use a "boosting" approach that cleverly combines "weak learners" to incrementally extract information from the data. After an explanation of how GBMs work, they compare them to Random Forests and go over a few examples where they have used GBMs in their own work.
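The episode is audio-only, but the boosting idea it describes can be sketched in a few lines of code: fit a shallow tree (a weak learner) to the residuals the ensemble has not yet explained, add its scaled predictions, and repeat. The snippet below is a minimal illustration assuming squared-error loss and synthetic data, not the hosts' own implementation.

```python
# Minimal sketch of gradient boosting for regression (squared-error loss):
# each weak learner is fit to the current residuals, and its predictions
# are added to the ensemble with a small learning rate.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(500, 1))            # illustrative data (assumed)
y = np.sin(X[:, 0]) + rng.normal(0, 0.3, 500)

learning_rate = 0.1
n_trees = 100
trees = []

prediction = np.full_like(y, y.mean())           # start from the mean prediction
for _ in range(n_trees):
    residuals = y - prediction                   # what the ensemble still misses
    tree = DecisionTreeRegressor(max_depth=2)    # deliberately weak learner
    tree.fit(X, residuals)
    prediction += learning_rate * tree.predict(X)
    trees.append(tree)

def predict(X_new):
    """Sum the scaled contributions of every boosted tree."""
    pred = np.full(X_new.shape[0], y.mean())
    for tree in trees:
        pred += learning_rate * tree.predict(X_new)
    return pred
```

In contrast to a Random Forest, which averages many deep trees grown independently, each tree here is built sequentially and only corrects the errors left by the trees before it.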
By Society of Actuaries (SOA)
4.6 · 3131 ratings
