
A Surprising Development in the Study of Multi-layer Parameterized Graphical Function Approximators
As a programmer and epistemology enthusiast, I've been studying some statistical modeling techniques lately! It's been boodles of fun, and might even prove useful in a future dayjob if I decide to pivot my career away from the backend web development roles I've taken in the past.
More specifically, I've mostly been focused on multi-layer parameterized graphical function approximators, which map inputs to outputs via a sequence of affine transformations composed with nonlinear "activation" functions.
(Some authors call these "deep neural networks" for some reason, but I like my name better.)
It's a curve-fitting technique: by setting the multiplicative factors and additive terms appropriately, multi-layer parameterized graphical function approximators can approximate any function. For a popular choice of "activation" rule which takes the maximum of the input and [...]
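To make the excerpt's description concrete, here is a minimal sketch, not taken from the post itself, of such an approximator's forward pass: alternating affine transformations (multiplicative factors and additive terms) with a nonlinear "activation" rule. The two-layer shape is an illustrative assumption, and the activation shown is the rule the excerpt cuts off at, which presumably takes the maximum of the input and zero (commonly called ReLU).

```python
import numpy as np

def activation(x):
    # The "activation" rule: the maximum of the input and zero.
    return np.maximum(x, 0.0)

def forward(x, layers):
    # Each layer is a (weights, bias) pair: an affine transformation,
    # composed with the nonlinearity on all but the final layer.
    for i, (W, b) in enumerate(layers):
        x = W @ x + b              # affine: multiplicative factors + additive terms
        if i < len(layers) - 1:
            x = activation(x)      # nonlinear "activation"
    return x

# Illustrative example: a tiny two-layer approximator mapping R^3 -> R,
# with randomly initialized parameters.
rng = np.random.default_rng(0)
layers = [
    (rng.standard_normal((4, 3)), rng.standard_normal(4)),  # hidden layer
    (rng.standard_normal((1, 4)), rng.standard_normal(1)),  # output layer
]
print(forward(rng.standard_normal(3), layers))
```

Curve fitting then amounts to adjusting those weight and bias parameters until the composed function approximates the desired input-output mapping.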
---
Outline:
(00:07) A Surprising Development in the Study of Multi-layer Parameterized Graphical Function Approximators
(04:00) Multi-layer Parameterized Graphical Function Approximators Have Many Exciting Applications
(06:05) An Example of Applying Multi-layer Parameterized Graphical Function Approximators in Success-Antecedent Computation Boosting
(10:35) Risks From Learned Approximation
---
Linkpost URL: http://zackmdavis.net/blog/2024/03/deep-learning-is-function-approximation/
Narrated by TYPE III AUDIO.