> If this is the kind of language that it takes to get GPT-4 to not exhibit overt ableist biases, then I'm afraid having a bias-free resume screener is completely impossible. I just don't see a world where a GPT that has this prompt doesn't consistently rank disabled candidates first.
OF COURSE it's impossible. We're trying to emulate human judgment to make selections that feel natural, but bias is a deeply human error.
I'd say bias is a core mechanism that enables us to make decisions in the first place. The issue is that different people weigh decisions differently, due to their background, circumstances, etc.