
In this episode of In-Ear Insights, the Trust Insights podcast, Katie and Chris discuss how to mitigate gender bias in AI systems by assessing risk, implementing human review processes, and building an equitable company culture.
Watch the video here:
Can’t see anything? Watch it on YouTube here.
Listen to the audio here:
Download the MP3 audio here.
What follows is an AI-generated transcript. The transcript may contain errors and is not a substitute for listening to the episode.
In this week’s In-Ear Insights, let’s talk about gender bias and fairness, specifically in the context of artificial intelligence. It is no secret, obviously, that large language models and diffusion models both have biases in them on things like gender, race or ethnicity, background, religion, etc.
Because these models are cobbled together from the entirety of the public’s content on the internet.
And the reality is, these are in many ways mirrors of humanity itself, and humanity contains biases. So when we talk about gender bias and fairness, if you wouldn’t mind starting off: in your mind, what is fairness? What does that mean to you, particularly since you’re of a different gender than I am? And how do you see that manifesting in the uses of artificial intelligence?
So at its core, fairness is giving equal opportunity to any one person, of any one gender, of any one background or ethnicity.
So Chris, if you and I were both presented with a speaking opportunity, to be fair, we would both be given equal opportunity to apply for it, to win the opportunity, whatever the thing is. Unfortunately, what tends to happen is an unconscious bias or conscious bias of, well,
Chris is a man and Chris has been doing this longer.
And Chris has data scientist in his title, so therefore, he must be better suited for this.
And I’ve never heard of Katie, even though she’s spoken at a bunch of events and doesn’t have scientist in her title.
So I’m going to pick Chris, because I know him versus Katie.
So that’s sort of where the unfairness comes in.
When you start to create gender bias, that’s when you’re like, well, this audience is going to be more men, so they probably respond better to a man speaking versus a woman.
And so, therefore, Chris, you…