LessWrong (Curated & Popular)

"Introduction to abstract entropy" by Alex Altair


https://www.lesswrong.com/posts/REA49tL5jsh69X3aM/introduction-to-abstract-entropy#fnrefpi8b39u5hd7

This post, and much of the following sequence, was greatly aided by feedback from the following people (among others): Lawrence Chan, Joanna Morningstar, John Wentworth, Samira Nedungadi, Aysja Johnson, Cody Wild, Jeremy Gillen, Ryan Kidd, Justis Mills and Jonathan Mustin. Illustrations by Anne Ore.

Introduction & motivation

In the course of researching optimization, I decided that I had to really understand what entropy is.[1] But there are a lot of other reasons why the concept is worth studying:

  • Information theory:
    • Entropy tells you about the amount of information in something (see the short numerical sketch after this list).
    • It tells us how to design optimal communication protocols.
    • It helps us understand strategies for (and limits on) file compression.
  • Statistical mechanics:
    • Entropy tells us how macroscopic physical systems act in practice.
    • It gives us the heat equation.
    • We can use it to improve engine efficiency.
    • It tells us how hot things glow, which led to the discovery of quantum mechanics.
  • Epistemics (an important application to me and many others on LessWrong):
    • The concept of entropy yields the maximum entropy principle, which is extremely helpful for doing general Bayesian reasoning.
  • Entropy tells us how "unlikely" something is and how much we would have to fight against nature to get that outcome (i.e. optimize).
  • It can be used to explain the arrow of time.
  • It is relevant to the fate of the universe.
  • And it's also a fun puzzle to figure out!
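As a minimal sketch of the information-theory bullets above (not taken from the post itself), here is how the Shannon entropy of a discrete distribution can be computed; the example distributions and the helper name shannon_entropy are illustrative assumptions, not anything defined in the original.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H(p) = -sum_i p_i * log2(p_i)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries 1 bit of information per flip.
print(shannon_entropy([0.5, 0.5]))       # 1.0

# A heavily biased coin carries far less; its outcomes compress well.
print(shannon_entropy([0.99, 0.01]))     # ~0.081

# A uniform 8-sided die needs log2(8) = 3 bits per roll.
print(shannon_entropy([1/8] * 8))        # 3.0
```

The same quantity is the lower bound, in bits per symbol, on losslessly compressing a source with that distribution, which is the sense in which entropy gives "limits on file compression."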

I didn't intend to write a post about entropy when I started trying to understand it. But I found the existing resources (textbooks, Wikipedia, science explainers) so poor that it actually seems important to have a better one as a prerequisite for understanding optimization! One failure mode I kept running into was that other resources tended to be concerned only with the application of the concept in their particular sub-domain. Here, I try to take on the task of synthesizing the abstract concept of entropy, to show what's so deep and fundamental about it. In future posts, I'll talk about things like:

...more

