Despite our best efforts, sometimes what we say we want isn’t precisely what we mean. Nowhere is that felt more acutely than when we’re giving instructions to a machine, whether that’s coding or offering examples to machine learning systems. We unpack the alignment problem in the latest installment of our oral history project.
We Meet:
Brian Christian, University of Oxford researcher, and author of The Alignment Problem
Credits:
This episode of SHIFT was produced by Jennifer Strong with help from Emma Cillekens. It was mixed by Garret Lang, with original music from him and Jacob Gorski. Art by Meg Marco.