Improving reading ability for the postgraduate entrance exam (考研) English section takes steady, accumulated practice before progress shows. To help candidates shore up weaknesses in English reading, we have prepared the following passage — 2022 Postgraduate Entrance Exam English, from The Economist: A World Designed for White Men — for reference.
2022 Postgraduate Entrance Exam English, The Economist: A World Designed for White Men
The world is designed around white men
Some things, you might think, are obvious. For example, if you design a device which shines light through someone’s fingertip to measure the oxygen level of their blood, then the colour of the skin through which that light is shining should be a factor when the device is calibrated.
But no. Research suggests that, with honourable exceptions, pulse oximeters, the machines which do this, overestimate oxygen levels three times more frequently (12% of the time) in people with black skin than in people with white skin.
When this informs decisions on whom to admit to hospital during a pandemic, more black than white patients are sent home on the mistaken conclusion that their blood-oxygen levels are within a safe range. This could have fatal consequences.
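The miscalibration described above can be sketched in a few lines of Python. Everything here is invented for illustration — the function names, the 5-point `pigment_shift`, and the admission cutoff are hypothetical numbers, not taken from the article or from any real oximeter:

```python
# Toy model (not a real oximeter): suppose skin pigment shifts the raw
# optical signal upward, but the calibration curve was fitted only on
# light-skinned subjects, so the shift is never corrected for.

def raw_reading(true_spo2: float, pigment_shift: float) -> float:
    """Hypothetical sensor: pigment absorbs extra light, inflating the signal."""
    return true_spo2 + pigment_shift

def calibrated_spo2(raw: float) -> float:
    """Calibration fitted on light skin implicitly assumes pigment_shift == 0."""
    return raw  # identity: the shift is silently ignored

true_level = 88.0  # genuinely below a hypothetical 92% admission cutoff

light = calibrated_spo2(raw_reading(true_level, pigment_shift=0.0))
dark = calibrated_spo2(raw_reading(true_level, pigment_shift=5.0))

print(light)  # 88.0 -> correctly flagged as low
print(dark)   # 93.0 -> overestimated; patient wrongly judged safe
```

The device is not "wrong" on its training population; the error only appears for inputs the calibration never saw — which is exactly the pattern the article goes on to describe.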
The pulse oximeter is only the latest example of an approach to design which fails to recognise that human beings are different from one another.
Other recent medical cases include an algorithm that gave white patients in America priority over those from racial minorities, and the discovery that implants such as prosthetic hips and cardiac pacemakers cause problems more often in women than in men.
Beyond medicine, there are many examples of this phenomenon in information technology: systems that recognise white faces but not black ones; legal software which recommends harsher sentences for black criminals than white; voice-activated programs that work better for men than women. Even mundane things like car seat-belts have often been designed with men in mind rather than women.
The origin of such design bias is understandable, if not forgivable. In the West, which is still the source of most innovation, engineers have tended to be white and male. So have medical researchers. That leads to groupthink, quite possibly unconscious, in both inputs and in outputs.
Input bias is particularly responsible for the IT cock-ups. Much of what is commonly called artificial intelligence is actually machine learning. As with any learning, the syllabus determines the outcome.
Train software on white faces or men’s voices, and you will create a system that is focused on handling them well. More subtle biases are also in play, though. The faulty medical algorithm used prior medical spending as a proxy for current need.
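"The syllabus determines the outcome" can be made concrete with a toy sketch. Assume a voice detector "learns" an accepted pitch range from the average of its training corpus; all the numbers below are illustrative, not real acoustic data:

```python
# Toy sketch of input bias: a detector learns the typical pitch of speech
# from its training set. With a male-dominated corpus, the learned range
# centres on male pitch and rejects many female voices.

def train_accepted_range(training_pitches, width=60.0):
    """Learn a crude accepted band: corpus mean +/- width (Hz)."""
    centre = sum(training_pitches) / len(training_pitches)
    return (centre - width, centre + width)

# Skewed corpus: five male fundamental frequencies (~120 Hz), one female (~210 Hz).
corpus = [110, 115, 120, 125, 130, 210]
lo, hi = train_accepted_range(corpus)  # (75.0, 195.0)

def accepts(pitch, lo=lo, hi=hi):
    return lo <= pitch <= hi

print(accepts(120))  # True  -> male-range voice handled well
print(accepts(215))  # False -> female-range voice rejected
```

Nothing in the learning rule mentions sex; the bias enters purely through the composition of the training data, which is the article's point.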
But black Americans spend less on health care than whites, so it discriminated against them. Sentencing software may similarly conflate poor social circumstances with the propensity to reoffend.
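The proxy problem the article describes can also be shown in miniature. The patient records and numbers below are fabricated for illustration; the point is only that ranking on spending diverges from ranking on need when one group historically spends less:

```python
# Toy sketch of proxy bias: ranking patients by past spending as a stand-in
# for medical need. Two patients with identical need get different priority
# because one has historically spent less on care.

patients = [
    {"name": "A", "need": 0.9, "past_spending": 12000},  # higher historical spending
    {"name": "B", "need": 0.9, "past_spending": 4000},   # same need, lower spending
]

# Flawed ranking: spending stands in for need.
by_proxy = sorted(patients, key=lambda p: p["past_spending"], reverse=True)

# Ranking on the true target would treat them identically.
by_need = sorted(patients, key=lambda p: p["need"], reverse=True)

print([p["name"] for p in by_proxy])  # ['A', 'B'] -> A prioritised despite equal need
```

The proxy is statistically convenient (spending data exists; "need" is hard to measure) but it imports every historical inequity baked into the spending record.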
Key vocabulary:
lethal [ˈliːθl] adj. deadly; fatal
prosthetic [prɑːsˈθetɪk] adj. prosthetic; artificial (of a body part)
mundane [mʌnˈdeɪn] adj. ordinary, commonplace; dull
syllabus [ˈsɪləbəs] n. syllabus; course outline
conflate [kənˈfleɪt] v. to combine; to mix up (two distinct things) as one
That concludes "2022 Postgraduate Entrance Exam English, The Economist: A World Designed for White Men". We hope it helps 2022 candidates with their English reading. Good luck on the exam!