Artificial Intelligence is radically changing how we work, learn, play and socialize, from virtual assistants helping organize our day to bots that can score Taylor Swift tickets or write college-level essays. But that vast computing capability may also come at a cost, generating results that are rife with bias if the data that was used to train AI systems is itself biased against or excludes certain groups of people. To counter this issue, we hear about the efforts of two engineering and computer science doctoral students in the Pacific Northwest.
At the University of Washington, Kate Glazko led a team of researchers on a study that found that the popular AI application ChatGPT routinely ranked job seekers lower if their CVs mentioned an award or recognition that implied they had a disability such as autism or blindness. At Oregon State University, Eric Slyman developed computing instructions that can be used to train AI to be less biased against marginalized groups when generating image search results. Slyman and Glazko join us for more details.