Machine Learning Bytes

Bias, Variance, and the Bias-Variance Tradeoff

07.23.2019 - By Erik Partridge


The bias-variance trade-off is a key problem in model selection. Bias represents how far your model's average prediction strays from the true values — in other words, how poorly it captures the salient details of a problem. More complex algorithms generally reduce bias, but that reduction comes at the cost of variance. Variance is the degree to which individual predictions stray from the model's mean prediction when it is trained on different samples of the data. High variance means a model has overfit: it has learned noise from the training set rather than the underlying problem. As a rule of thumb, high bias = underfitting, high variance = overfitting.
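As a concrete illustration (a sketch, not code from the episode): the snippet below fits polynomial models of increasing degree to many resampled training sets, then estimates squared bias and variance empirically from the spread of their predictions. The data-generating function, noise level, and polynomial degrees are all illustrative assumptions.

import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
true_fn = lambda x: np.sin(2 * np.pi * x)    # assumed ground-truth signal
x_test = np.linspace(0, 1, 100)[:, None]     # fixed evaluation grid

for degree in (1, 4, 15):                    # simple -> complex models
    preds = []
    for _ in range(200):                     # 200 independent training sets
        x_train = rng.uniform(0, 1, (30, 1))
        y_train = true_fn(x_train).ravel() + rng.normal(0, 0.3, 30)
        model = make_pipeline(PolynomialFeatures(degree), LinearRegression())
        model.fit(x_train, y_train)
        preds.append(model.predict(x_test))
    preds = np.array(preds)
    # squared bias: gap between the average prediction and the truth
    bias_sq = np.mean((preds.mean(axis=0) - true_fn(x_test).ravel()) ** 2)
    # variance: how much predictions scatter across training sets
    variance = np.mean(preds.var(axis=0))
    print(f"degree {degree:2d}: bias^2 = {bias_sq:.3f}, variance = {variance:.3f}")

Running this, you should see the degree-1 model show high bias and low variance (underfitting), while the degree-15 model shows low bias and high variance (overfitting) — the trade-off in miniature.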

Please consider joining the conversation on Twitter. I also blog from time to time. You can find me at erikpartridge.com.

For more academic sources, consider reading the slides from this fantastic Carnegie Mellon lecture.

---

Send in a voice message: https://podcasters.spotify.com/pod/show/mlbytes/message
