The world is designed around white men
Some things, you might think, are obvious. For example, if you design a device which shines light through someone’s fingertip to measure the oxygen level of their blood, then the colour of the skin through which that light is shining should be a factor when the device is calibrated.
But no. Research suggests that, with honourable exceptions, pulse oximeters, the machines which do this, overestimate oxygen levels three times more frequently (12% of the time) in people with black skin than in those with white skin.
When this informs decisions on whom to admit to hospital during a pandemic, more black than white patients are sent home in the mistaken belief that their blood-oxygen levels are within a safe range. This could have fatal consequences.
The pulse oximeter is only the latest example of an approach to design which fails to recognise that human beings are different from one another.
Other recent medical cases include an algorithm that gave white patients in America priority over those from racial minorities, and the discovery that implants such as prosthetic hips and cardiac pacemakers cause problems more often in women than in men.
Beyond medicine, there are many examples of this phenomenon in information technology: systems that recognise white faces but not black ones; legal software which recommends harsher sentences for black criminals than white; voice-activated programs that work better for men than women. Even mundane things like car seat-belts have often been designed with men in mind rather than women.
The origin of such design bias is understandable, if not forgivable. In the West, which is still the source of most innovation, engineers have tended to be white and male. So have medical researchers. That leads to groupthink, quite possibly unconscious, in both inputs and outputs.
Input bias is particularly responsible for the IT cock-ups. Much of what is commonly called artificial intelligence is actually machine learning. As with any learning, the syllabus determines the outcome.
Train software on white faces or men’s voices, and you will create a system that handles those well but stumbles over everyone else. More subtle biases are also in play, though. The faulty medical algorithm used prior medical spending as a proxy for current need.
But black Americans spend less on health care than whites, so it discriminated against them. Sentencing software may similarly conflate poor social circumstances with the propensity to reoffend.
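The proxy problem described above can be made concrete with a toy simulation. This sketch is purely illustrative and not the actual algorithm from the study: it assumes two groups with identical distributions of true medical need, but one group spends less on care for the same level of need. An algorithm that flags high spenders for priority care then flags the lower-spending group far less often, despite equal need.

```python
import random

random.seed(0)

def simulate(n=10_000):
    # Illustrative assumption: every patient has a true "need" score
    # drawn from the same distribution, but black patients spend
    # proportionally less on care for the same level of need.
    patients = []
    for _ in range(n):
        need = random.uniform(0, 1)
        group = random.choice(["white", "black"])
        spend_rate = 1.0 if group == "white" else 0.7  # assumed gap
        spending = need * spend_rate
        patients.append((group, need, spending))
    return patients

def priority_rate(patients, group, threshold=0.5):
    # The flawed logic: treat *spending* above a threshold as a
    # proxy for need when deciding whom to prioritise for care.
    in_group = [p for p in patients if p[0] == group]
    flagged = [p for p in in_group if p[2] > threshold]
    return len(flagged) / len(in_group)

patients = simulate()
# Both groups have the same distribution of true need, yet the
# spending proxy flags white patients for extra care far more often.
print(priority_rate(patients, "white"))
print(priority_rate(patients, "black"))
```

The numbers (roughly 50% of white patients flagged against under 30% of black patients, under these assumed parameters) show how a proxy that correlates with group membership reproduces discrimination even when the group label never enters the algorithm.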
lethal [ˈliːθl] adj. deadly; fatal
prosthetic [prɑːsˈθetɪk] adj. of or relating to an artificial body part
mundane [mʌnˈdeɪn] adj. ordinary; worldly
syllabus [ˈsɪləbəs] n. course outline; curriculum
conflate [kənˈfleɪt] v. to combine; to merge