
AI bias occurs when systems produce systematically unfair outcomes for certain groups — often reflecting and amplifying biases already present in training data or system design.
Amazon scrapped an AI recruiting tool in 2018 after discovering it systematically downgraded resumes from women because it was trained on historical hiring data dominated by male candidates.
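The group-level disparity described above can be quantified. A minimal sketch, using hypothetical screening data (not real Amazon figures) and the common "four-fifths" disparate-impact guideline, which compares positive-outcome rates across groups:

```python
# Illustrative sketch with hypothetical data: measuring one common notion of
# bias -- a gap in positive-outcome rates between groups (demographic parity).
from collections import Counter

def selection_rates(decisions):
    """decisions: list of (group, selected) pairs; returns selection rate per group."""
    totals, selected = Counter(), Counter()
    for group, picked in decisions:
        totals[group] += 1
        if picked:
            selected[group] += 1
    return {g: selected[g] / totals[g] for g in totals}

# Hypothetical resume-screening outcomes, for illustration only.
decisions = [
    ("men", True), ("men", True), ("men", False), ("men", True),
    ("women", True), ("women", False), ("women", False), ("women", False),
]

rates = selection_rates(decisions)
# Disparate-impact ratio: lowest group rate divided by highest group rate.
ratio = min(rates.values()) / max(rates.values())
print(rates)            # {'men': 0.75, 'women': 0.25}
print(round(ratio, 2))  # 0.33 -- well below the 0.8 "four-fifths" guideline
```

A ratio below 0.8 is a widely used red flag for adverse impact in hiring contexts; it does not prove bias by itself, but it signals that outcomes differ enough by group to warrant investigation.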