


Audio note: this article contains 212 uses of LaTeX notation, so the narration may be difficult to follow. There's a link to the original text in the episode description.
Our posts on natural latents have involved two distinct definitions, which we call "stochastic" and "deterministic" natural latents. We conjectured that, whenever there exists a stochastic natural latent (to within some approximation), there also exists a deterministic natural latent (to within a comparable approximation). Four months ago, we put up a bounty to prove this conjecture.
We've been bottlenecked pretty hard on this problem, and spent most of the last four months attacking it. At long last, we have a proof. As hoped, the proof comes with some qualitative new insights about natural latents, and we expect it will unbottleneck a bunch of future work. The main purpose of this post is to present the proof.
This post [...]
---
Outline:
(01:14) Recap: What Was The Problem Again?
(01:22) Some Intuition From The Exact Case
(02:53) The Problem
(02:56) Stochastic Natural Latents
(03:59) Deterministic Natural Latents
(05:02) The Goal
(05:22) The Proof
(05:25) Key Ideas
(06:39) Math
(06:42) Assumptions & Preconditions
(07:42) Resampling Conserves Naturality
(09:47) Pareto Minimization - Single Objective Minimization
(11:38) Lagrangian & First Order Conditions
(12:41) Putting The Pieces Together & Solving The Equations
(15:26) A (Non-Strict) Pareto Improvement Via Coarse Graining
(17:28) Finally, A Deterministic Natural Latent
(18:03) Can we do better?
(18:26) What's Next?
The original text contained 2 footnotes which were omitted from this narration.
---
First published:
Source:
---
Narrated by TYPE III AUDIO.
---
Images from the article:
Apple Podcasts and Spotify do not show images in the episode description. Try Pocket Casts, or another podcast app.
By LessWrong
