The Jefferson Exchange

OSU researcher works to screen the bias out of AI



(Eric Slyman. Photo by Johanna Carson.)

Artificial intelligence has raised concerns, at least in the abstract, since the first "Terminator" movie, and maybe before that. We'll set aside the end-of-the-world stuff for a bit and focus on the biases that can show up in AI... because AI is built by humans, and humans have biases.

An Oregon State University researcher, working with the software company Adobe, has created a training technique that can filter some of those biases out of AI. The conversation could go way over our heads, but Eric Slyman keeps it down-to-earth and understandable in his chat with the JX.


The Jefferson Exchange, by Mike Green