
In the 30th episode we review the bootstrap method. Introduced by Bradley Efron in 1979, it is a non-parametric resampling technique that approximates a statistic's sampling distribution by repeatedly drawing with replacement from the observed data, allowing estimation of standard errors, confidence intervals, and bias without relying on strong distributional assumptions.
Its ability to quantify uncertainty cheaply and flexibly underlies many staples of modern data science and AI, powering model evaluation and feature stability analysis, inspiring ensemble methods like bagging and random forests, and informing uncertainty calibration for deep-learning predictions, thereby making contemporary models more reliable and robust.

Efron, B. "Bootstrap Methods: Another Look at the Jackknife." The Annals of Statistics 7, no. 1 (1979): 1-26.
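As a rough illustration of the idea discussed in the episode, here is a minimal Python sketch of a percentile bootstrap. The sample data, the resample count, and the `bootstrap` helper are illustrative choices, not material from the episode or from Efron's paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical observed sample: 40 draws from a skewed distribution.
data = rng.exponential(scale=2.0, size=40)

def bootstrap(sample, statistic, n_resamples=5000, rng=rng):
    """Approximate the sampling distribution of `statistic` by
    repeatedly resampling the observed data with replacement."""
    n = len(sample)
    estimates = np.empty(n_resamples)
    for i in range(n_resamples):
        resample = rng.choice(sample, size=n, replace=True)
        estimates[i] = statistic(resample)
    return estimates

boot_means = bootstrap(data, np.mean)

# Bootstrap standard error and a 95% percentile confidence interval.
se = boot_means.std(ddof=1)
ci_low, ci_high = np.percentile(boot_means, [2.5, 97.5])
print(f"mean = {data.mean():.3f}, SE ~ {se:.3f}, "
      f"95% CI ~ ({ci_low:.3f}, {ci_high:.3f})")
```

The same loop works for any statistic (median, correlation, a model's validation score): only the `statistic` callable changes, which is what makes the method so broadly applicable.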
By Mike E
