{"id":2807,"date":"2021-02-03T20:11:04","date_gmt":"2021-02-03T20:11:04","guid":{"rendered":"https:\/\/thenextweb.com\/?p=1337440"},"modified":"2021-02-03T20:11:04","modified_gmt":"2021-02-03T20:11:04","slug":"scientists-invented-an-ai-to-detect-racist-people","status":"publish","type":"post","link":"https:\/\/www.londonchiropracter.com\/?p=2807","title":{"rendered":"Scientists invented an AI to detect racist people"},"content":{"rendered":"\n<div><img decoding=\"async\" src=\"https:\/\/img-cdn.tnwcdn.com\/image\/neural?filter_last=1&amp;fit=1280%2C640&amp;url=https%3A%2F%2Fcdn0.tnwcdn.com%2Fwp-content%2Fblogs.dir%2F1%2Ffiles%2F2019%2F07%2Fspiderman-spot.png&amp;signature=83d25bf648a7cb9f6b39b545e000b173\" class=\"ff-og-image-inserted\"><\/div>\n<p>A team of researchers at the University of Virginia have developed an AI system that attempts to detect and quantify the physiological signs associated with racial bias. In other words, they\u2019re building a wearable device that tries to identify when you\u2019re having racist thoughts.<\/p>\n<p><b>Up front:<\/b> Nope. Machines can\u2019t tell if a person is <i>a racist<\/i><span>. They also can\u2019t tell if something someone has said or done is <\/span><i>racist. <\/i><span>And they certainly can\u2019t determine if you\u2019re thinking racist thoughts just by taking your pulse or measuring your O2 saturation levels with an Apple Watch-style device. <\/span><\/p>\n<p><span>That being said, this is fascinating research that could pave the way to a greater understanding of how unconscious bias and systemic racism fit together. <\/span><\/p>\n<p><b><span>How does it work? <\/span><\/b><\/p>\n<p><span>The current standard for identifying implicit racial bias uses something called the Implicit Association Test. Basically, you look at a series of images and words and try to associate&nbsp;them with \u201clight skin,\u201d \u201cdark skin,\u201d \u201cgood,\u201d and \u201cbad\u201d as quickly as possible. 
You can try it yourself <a href=\"https:\/\/implicit.harvard.edu\/implicit\/takeatest.html\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">here<\/a> on Harvard\u2019s website. <\/span><\/p>\n<p><span>There\u2019s also research indicating that <a href=\"https:\/\/www.nytimes.com\/2004\/04\/20\/health\/hard-wired-for-prejudice-experts-examine-human-response-to-outsiders.html\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">learned threat responses to outsiders<\/a> can often be measured physiologically. In other words, some people have a physical response to people who look different from them, and we can measure it when they do.<\/span><\/p>\n<p><span>The UVA team combined these two ideas. They took a group of 76 volunteer students and had them take the Implicit Association Test while measuring their physiological responses with a wearable device. <\/span><\/p>\n<p><span>Finally, the meat of the study involved developing a machine learning system to evaluate the data and make inferences. Can identifying a specific combination of physiological responses really tell us if someone is, for lack of a better way to put it, experiencing <\/span><i>involuntary feelings of racism<\/i><span>? <\/span><\/p>\n<p><span>The answer\u2019s a muddy maybe. <\/span><\/p>\n<p><span>According to the team\u2019s research paper:<\/span><\/p>\n<blockquote readability=\"6\">\n<p><span>Our machine learning and statistical analysis show that implicit bias can be predicted from physiological signals with 76.1% accuracy.<\/span><\/p>\n<\/blockquote>\n<p><span>But that\u2019s not necessarily the bottom line. 76% accuracy is a low threshold for success in any machine learning endeavor. And flashing images of different-colored cartoon faces isn\u2019t a 1:1 analogy for experiencing interactions with different races of people. 
<\/span><\/p>\n<p><span><b>Quick take: <\/b><\/span><span>Any ideas the general public might have about some kind of wand-style gadget for detecting racists should be dismissed outright. The UVA team\u2019s important work has nothing to do with developing a wearable that pings you every time you or someone around you experiences implicit bias. It\u2019s more about understanding the link between mental associations of dark skin color with badness and the accompanying physiological manifestations.<\/span><\/p>\n<p><span>In that respect, this novel research has the potential to help illuminate the subconscious thought processes behind, for example, radicalization and paranoia. It also has the potential to finally demonstrate how racism can be the result of unintended implicit bias from people who may even believe themselves to be allies. <\/span><\/p>\n<p><span>You don\u2019t have to <\/span><i>feel<\/i><span> like you\u2019re being racist to actually be racist, and this system could help researchers better understand and explain these concepts.<\/span><\/p>\n<p><span>But it absolutely doesn\u2019t actually <\/span><i>detect bias; <\/i><span>it predicts it, and that\u2019s different. And it certainly can\u2019t tell if someone\u2019s <\/span><i>a racist<\/i><span>. It shines a light on some of the physiological effects associated with implicit bias, much like a diagnostician would initially interpret a cough and a fever as being <\/span><i>associated<\/i><span> with certain diseases while still requiring further testing to confirm a diagnosis. This AI doesn\u2019t label racism or bias; it just points to some of the associated side effects. <\/span><\/p>\n<p><span>You can check out the whole pre-print paper <a href=\"https:\/\/arxiv.org\/ftp\/arxiv\/papers\/2102\/2102.01287.pdf\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">here<\/a> on arXiv. 
<\/span><\/p>\n<p class=\"c-post-pubDate\"> Published February 3, 2021 \u2014 20:11 UTC <\/p>\n<p> <a href=\"https:\/\/thenextweb.com\/neural\/2021\/02\/03\/scientists-invented-an-ai-to-detect-racist-people\/\">Source<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>A team of researchers at the University of Virginia have developed an AI system that attempts to detect and quantify the physiological signs associated with racial bias. In other words, they\u2019re building&#8230;<\/p>\n","protected":false},"author":1,"featured_media":2808,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":[],"categories":[1],"tags":[],"_links":{"self":[{"href":"https:\/\/www.londonchiropracter.com\/index.php?rest_route=\/wp\/v2\/posts\/2807"}],"collection":[{"href":"https:\/\/www.londonchiropracter.com\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.londonchiropracter.com\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.londonchiropracter.com\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.londonchiropracter.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=2807"}],"version-history":[{"count":0,"href":"https:\/\/www.londonchiropracter.com\/index.php?rest_route=\/wp\/v2\/posts\/2807\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.londonchiropracter.com\/index.php?rest_route=\/wp\/v2\/media\/2808"}],"wp:attachment":[{"href":"https:\/\/www.londonchiropracter.com\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=2807"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.londonchiropracter.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=2807"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.londonchiropracter.com\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=2807"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true
}]}}