


Audio note: this article contains 182 uses of latex notation, so the narration may be difficult to follow. There's a link to the original text in the episode description.
Suppose random variables \(X_1\) and \(X_2\) contain approximately the same information about a third random variable \(\Lambda\), i.e. both of the following diagrams are satisfied to within approximation \(\epsilon\):

"Red" for redundancy

We call \(\Lambda\) a "redund" over \(X_1, X_2\), since conceptually, any information \(\Lambda\) contains about \(X\) must be redundantly represented in both \(X_1\) and \(X_2\) (to within approximation).

Here's an intuitive claim which is surprisingly tricky to prove: suppose we construct a new variable \(\Lambda'\) by sampling from \(P[\Lambda|X_2]\), so the new joint distribution is

\(P[X_1 = x_1, X_2 = x_2, \Lambda' = \lambda'] = P[X_1 = x_1, X_2 = x_2]\,P[\Lambda = \lambda' \mid X_2 = x_2]\)

By construction, this "resampled" variable satisfies one of the [...]
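The resampling construction above is easy to check numerically. Below is a minimal sketch (not from the original article) using a hypothetical toy joint distribution over small finite supports; it builds the resampled joint and verifies the two properties the construction guarantees: the \((X_1, X_2)\) marginal is unchanged, and \(\Lambda'\) is conditionally independent of \(X_1\) given \(X_2\).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy joint distribution P[X1, X2, Lambda] over small finite
# supports (array shape: |X1| x |X2| x |Lambda|). Any normalized array works.
P = rng.random((3, 3, 2))
P /= P.sum()

# P[Lambda | X2]: marginalize out X1, then normalize over Lambda.
P_x2_lam = P.sum(axis=0)                                   # shape (|X2|, |Lambda|)
P_lam_given_x2 = P_x2_lam / P_x2_lam.sum(axis=1, keepdims=True)

# Resampled joint: P'[x1, x2, lam'] = P[x1, x2] * P[Lambda = lam' | X2 = x2]
P_x1x2 = P.sum(axis=2)                                     # shape (|X1|, |X2|)
P_resampled = P_x1x2[:, :, None] * P_lam_given_x2[None, :, :]

# Property 1: the (X1, X2) marginal is unchanged by construction.
assert np.allclose(P_resampled.sum(axis=2), P_x1x2)

# Property 2: Lambda' is conditionally independent of X1 given X2,
# i.e. the diagram X1 -> X2 -> Lambda' holds exactly, since
# P'[lam' | x1, x2] = P[lam' | x2] regardless of x1.
P_x2lam_new = P_resampled.sum(axis=0)
P_lam_given_x2_new = P_x2lam_new / P_x2lam_new.sum(axis=1, keepdims=True)
assert np.allclose(P_resampled,
                   P_x1x2[:, :, None] * P_lam_given_x2_new[None, :, :])
```

This is the exact-diagram half of the claim; the article's proof concerns the harder half, bounding how well \(\Lambda'\) satisfies the other diagram given the \(\epsilon\)-approximate redundancy of \(\Lambda\).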
---
Outline:
(02:07) Notation
(02:27) Proof
(03:36) Step 1: Scaling Down The Errors
(05:37) Step 2: Second Order Approximation
(05:42) Validity
(06:40) Expansion
(07:36) Step 3: Good Ol' Euclidean Geometry
(07:59) Jensen
(09:12) Euclidean Distances
(10:11) Empirical Results and Room for Improvement
The original text contained 2 footnotes which were omitted from this narration.
---
First published:
Source:
---
Narrated by TYPE III AUDIO.
---
Images from the article:
Apple Podcasts and Spotify do not show images in the episode description. Try Pocket Casts, or another podcast app.
By LessWrong
