
I can be a nightmare conference attendee: I tend to ask nitpicky questions and apply a dose of skepticism to a speaker's claims that is healthy when doing one's own research, but probably not optimal when everyone else is trying to follow a talk. I'm working on being better at this, but for now I blame my background.
There is one nitpick that comes up again and again. In fact, at one conference I brought it up so often that Jake Mendel coined a term for it: "Dmitry's koan".
In koan form, the nitpick is as follows:
There is no such thing as interpreting a neural network. There is only interpreting a neural network at a given scale of precision.
On its face, this observation is true but a bit banal. Indeed there are two extremes:
---
Outline:
(04:13) Elucidating the spectrum of precision
(04:17) Step 1: coming to terms with imprecision
(10:57) Step 2: Factoring in the memorization-generalization spectrum
(18:31) Natural scale and natural degradation
(18:56) Sometimes reconstruction loss is not the point
(20:10) Degradation as a dial
(27:49) Natural scale
(30:31) Natural degradation
(35:08) Possible issues
(36:08) Experiment suggestions
The original text contained 12 footnotes which were omitted from this narration.
---
First published:
Source:
Narrated by TYPE III AUDIO.
