Improving your English for the postgraduate entrance exam takes more than memorizing vocabulary; it also requires reading foreign-language source material. Below, for 2024 exam candidates, is an article from the July 2024 same-source foreign press series: Why do you keep getting recommended similar short videos?
July 2024 Same-Source Foreign Press Reading: Why Do You Keep Getting Recommended Similar Short Videos?
From Thanksgiving dinner conversations to pop culture discourse, it’s easy to feel like individuals of different political ideologies are occupying completely separate worlds, especially online. People often blame algorithms—the invisible sets of rules that shape online landscapes, from social media to search engines—for cordoning us off into digital “filter bubbles” by feeding us content that reinforces our preexisting worldview.
Algorithms are always biased: Studies have shown that Facebook ads target particular racial and gender demographics. Dating apps select for matches based on a user’s previous swipe history. And search engines prioritize links based on what they deem most relevant. But according to new research, not every algorithm drives political polarization.
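No platform's actual ranking code is public, but the feedback loop the article describes—an algorithm learning from swipes and clicks until it mostly serves content the user already agrees with—can be sketched in a few lines. The following toy simulation (all names and numbers are invented for illustration) boosts whichever topic a simulated user clicks, and drifts toward a "filter bubble":

```python
import random

def recommend(catalog, weights):
    """Pick a topic in proportion to learned preference weights, then an item from it."""
    topics = list(weights)
    topic = random.choices(topics, weights=[weights[t] for t in topics])[0]
    return random.choice(catalog[topic]), topic

def simulate(catalog, liked_topic, steps=200, boost=0.5, seed=42):
    """Run a click-feedback loop and return each topic's final share of recommendations."""
    random.seed(seed)
    weights = {t: 1.0 for t in catalog}      # start with no bias toward any topic
    for _ in range(steps):
        _item, topic = recommend(catalog, weights)
        if topic == liked_topic:             # the user only clicks items they agree with
            weights[topic] += boost          # feedback: clicked topics get shown more often
    total = sum(weights.values())
    return {t: w / total for t, w in weights.items()}

catalog = {"left": ["L1", "L2"], "right": ["R1", "R2"], "neutral": ["N1"]}
shares = simulate(catalog, liked_topic="left")
```

Because every click raises the clicked topic's weight, the loop is self-reinforcing: after a few hundred rounds the user's preferred topic crowds out the rest of the mix, even though the algorithm started out unbiased.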
Most studies analyzing algorithm-driven political polarization have focused on social media platforms such as Twitter and Facebook rather than search engines. That’s because, until recently, it’s been easier for researchers to obtain usable data from social media sites with their public-facing software interfaces. But Ognyanova and her co-authors found a way around this problem. Rather than relying on anonymized public data, they sent volunteers a browser extension that logged all of their Google search results—and the links they followed from those pages—over the course of several months. Ultimately, the team found that a quick Google search did not serve users a selection of news stories based on their political leanings. Instead, strongly partisan users were more likely to click on partisan links that fit with their preexisting worldview.
This doesn’t mean that Google’s algorithm is faultless. The researchers noticed that unreliable or downright misleading news sources still popped up in the results, regardless of whether or not users interacted with them. “There’s also other contexts where Google has done pretty problematic stuff,” Robertson says, including dramatically underrepresenting women of color in its image search results.
Still, the findings reflect a growing body of research that suggests that the role of algorithms in creating political bubbles might be overstated. “I’m not against blaming platforms,” Trielli says. “But it’s kind of disconcerting to know that it’s not just about making sure that platforms behave well. Our personal motivations to filter what we read to fit our political biases remain strong.”
Vocabulary:
1. discourse
/ˈdɪskɔːs/
n. speech, treatise; conversation, communication; discourse (connected language)
v. to expound; to converse; to play (music)
2. swipe
/swaɪp/
v. to hit or strike (with a swinging arm or object); to steal; to swipe (a card); to swipe (across a screen)
n. a criticism, an attack; a swinging blow; a swipe
3. downright
/ˈdaʊnˌraɪt/
adj. (emphasizing something bad or unpleasant) utter, complete; (of a person's manner) blunt, brusque
adv. (emphasizing something bad or unpleasant) utterly, thoroughly
4. disconcert
/ˌdɪskənˈsɜːt/
v. to unsettle, confuse, or embarrass
The above is “Why do you keep getting recommended similar short videos?” from the July 2024 same-source foreign press series. We hope it helps candidates preparing for the 2024 postgraduate entrance exam. Ride the wind, break the waves, and reach the far shore. Good luck!