Bloomberg: GPT-4 is racially discriminating
This is interesting (unfortunately paywalled) work by Bloomberg: they essentially created 1,000 resumes that were by and large identical in terms of experience, grades, qualifications etc., and changed only the names so that they fell equally into one of four groups: Asian, Black, Hispanic, and White, with 125 men and 125 women in each group.
They then asked GPT-3.5 (and also GPT-4) to rank the candidates from most to least suitable for four different roles: HR manager, software engineer, retail, and financial analyst. They ran this 1,000 times.
OpenAI's GPT showed heavy racial bias, most significantly against Black men (ranked least suitable as often as 32% of the time, roughly 2.5 times the unbiased baseline of 12.5%) and against women, and also against white men.
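To make the baseline concrete: with 4 races x 2 genders there are 8 demographic groups, so an unbiased ranker should put any given group in last place 1/8 = 12.5% of the time. A minimal sketch of that comparison (the group labels and trial count mirror the article's setup; the random ranker stands in for the model and is my own assumption, not Bloomberg's code):

```python
import random
from collections import Counter

# Hypothetical reconstruction of the audit's baseline: 8 demographic
# groups (4 races x 2 genders). Under an unbiased ranker, each group
# lands in last place 1/8 = 12.5% of the time.
GROUPS = [f"{race} {gender}"
          for race in ("Asian", "Black", "Hispanic", "White")
          for gender in ("men", "women")]

def last_place_rates(trials=1000, seed=0):
    """Simulate an unbiased ranker and count how often each group is ranked last."""
    rng = random.Random(seed)
    last = Counter()
    for _ in range(trials):
        ranking = rng.sample(GROUPS, k=len(GROUPS))  # uniform random ranking = no bias
        last[ranking[-1]] += 1
    return {group: last[group] / trials for group in GROUPS}

rates = last_place_rates()
# Any group far above 0.125 (e.g. the 32% reported for Black men)
# would signal bias relative to this uniform baseline.
```

Replacing the random ranking with actual model responses and comparing each group's last-place rate against 12.5% is the essence of the test Bloomberg ran.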
None of this is truly surprising, in my view, as the models are mere reflections of the data on which they are trained, which is of course just as biased: they are essentially trained on the internet, which is hardly a haven of unbiased equality.
Mathias Bock