Understanding Probability Distributions and Entropy: A Dive into Uncertainty


Our story begins with the simple act of flipping two fair coins. Together they can produce four equally likely outcomes: HH, HT, TH, and TT. So how do we measure the uncertainty of these outcomes using information entropy? A bit can represent two possible states, 0 and 1, so each fair coin flip carries exactly one bit of uncertainty. With four equally likely outcomes, the entropy is log2(4) = 2 bits. This is the starting point of our journey into entropy.
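
To make the arithmetic concrete, here is a minimal Python sketch of the two-coin calculation (the outcome labels and probabilities are our own illustration, not from the episode):

from math import log2

# Two fair coins give four equally likely outcomes: HH, HT, TH, TT.
probs = [0.25, 0.25, 0.25, 0.25]

# Shannon entropy in bits: H = -sum(p * log2(p)) over all outcomes.
entropy = -sum(p * log2(p) for p in probs)
print(entropy)  # prints 2.0, i.e. 2 bits of uncertainty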


What exactly is entropy? Entropy is a measure of the disorder or uncertainty within a system. Originally introduced in thermodynamics, the concept also plays a crucial role in information theory, where Shannon entropy quantifies the average uncertainty of a probability distribution. Simply put, when all possible outcomes are equally likely, entropy reaches its maximum value; the more lopsided the distribution, the lower the entropy. This gives us a way to gauge how uncertain a given system is.
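
As a rough sketch of that claim (the specific distributions below are our own examples, assumed for illustration), the Shannon entropy H = -sum of p * log2(p) is largest when all outcomes are equally likely and shrinks as the distribution becomes skewed:

from math import log2

def shannon_entropy(probs):
    # H = -sum(p * log2(p)); zero-probability outcomes contribute nothing.
    return -sum(p * log2(p) for p in probs if p > 0)

# Uniform over four outcomes: the maximum possible, log2(4) = 2 bits.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0

# A skewed distribution over the same four outcomes is less uncertain.
print(shannon_entropy([0.7, 0.1, 0.1, 0.1]))      # about 1.357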
