
Despite our best efforts, sometimes what we say we want isn’t precisely what we mean. Nowhere is that felt more acutely than when we’re giving instructions to a machine, whether that’s coding or offering examples to machine learning systems. We unpack the alignment problem in the latest installment of our oral history project.
We Meet:
Brian Christian, University of Oxford researcher, and author of The Alignment Problem
Credits:
This episode of SHIFT was produced by Jennifer Strong with help from Emma Cillekens. It was mixed by Garret Lang, with original music from him and Jacob Gorski. Art by Meg Marco.