Despite our best efforts, sometimes what we say we want isn’t precisely what we mean. Nowhere is that felt more acutely than when we’re giving instructions to a machine, whether that’s coding or offering examples to machine learning systems. We unpack the alignment problem in the latest installment of our oral history project.
We Meet:
Brian Christian, University of Oxford researcher, and author of The Alignment Problem
Credits:
This episode of SHIFT was produced by Jennifer Strong with help from Emma Cillekens. It was mixed by Garret Lang, with original music from him and Jacob Gorski. Art by Meg Marco.
