


Classically, entropy is a measure of disorder in a system. From a statistical perspective, it is more useful to say it's a measure of the unpredictability of the system. In this episode we discuss how information reduces the entropy in deciding whether or not Yoshi the parrot will like a new chew toy. A few other everyday examples help us examine why entropy is a nice metric for constructing a decision tree.
By Kyle Polich · 4.4 (475 ratings)
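The decision-tree idea from the episode can be made concrete with a small calculation: measure the entropy of the "does Yoshi like it?" outcomes, then see how much asking a question about the toy reduces it. The sketch below uses a made-up set of observations and feature names ("color", "texture") purely for illustration; the data and features are assumptions, not anything stated in the episode.

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy (in bits) of a list of class labels."""
    total = len(labels)
    return -sum((c / total) * log2(c / total) for c in Counter(labels).values())

def information_gain(rows, labels, feature):
    """Reduction in entropy from splitting the rows on one feature."""
    base = entropy(labels)
    total = len(rows)
    remainder = 0.0
    for value in set(row[feature] for row in rows):
        subset = [lab for row, lab in zip(rows, labels) if row[feature] == value]
        remainder += (len(subset) / total) * entropy(subset)
    return base - remainder

# Hypothetical chew toys Yoshi has already tried (invented for illustration).
toys = [
    {"color": "red",   "texture": "soft"},
    {"color": "red",   "texture": "hard"},
    {"color": "green", "texture": "soft"},
    {"color": "green", "texture": "hard"},
    {"color": "red",   "texture": "soft"},
    {"color": "green", "texture": "hard"},
]
liked = ["yes", "no", "yes", "no", "yes", "no"]

print(entropy(liked))                            # 1.0 bit: the outcome is maximally unpredictable
print(information_gain(toys, liked, "texture"))  # 1.0 bit: texture fully predicts the outcome
print(information_gain(toys, liked, "color"))    # ~0.08 bits: color tells us very little
```

A greedy decision-tree learner would split on "texture" first here, since that question removes the most entropy; that is the sense in which information reduces the unpredictability of the decision.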
