{"id":1907,"date":"2020-12-22T13:00:25","date_gmt":"2020-12-22T13:00:25","guid":{"rendered":"https:\/\/thenextweb.com\/?p=1332148"},"modified":"2020-12-22T13:00:25","modified_gmt":"2020-12-22T13:00:25","slug":"algorithms-behaving-badly-2020-edition","status":"publish","type":"post","link":"https:\/\/www.londonchiropracter.com\/?p=1907","title":{"rendered":"Algorithms behaving badly: 2020 edition"},"content":{"rendered":"\n<div><img decoding=\"async\" src=\"https:\/\/img-cdn.tnwcdn.com\/image\/neural?filter_last=1&amp;fit=1280%2C640&amp;url=https%3A%2F%2Fcdn0.tnwcdn.com%2Fwp-content%2Fblogs.dir%2F1%2Ffiles%2F2020%2F12%2F1-copy-52.jpg&amp;signature=e923d7faa36b6c2bc9516ed01a3a328e\" class=\"ff-og-image-inserted\"><\/div>\n<p>The perils of leaving important decisions to computer algorithms are pretty easily imagined (see, e.g., \u201cMinority Report,\u201d \u201cI, Robot,\u201d \u201cWar Games\u201d). In recent years, however, algorithms\u2019&nbsp; job descriptions have only grown.<\/p>\n<p>They are replacing humans when it comes to making tough decisions that companies and government agencies prefer to say are grounded in statistics and formulas rather than the jumbled calculations of a human brain. Some health insurers use algorithms to determine <a href=\"https:\/\/themarkup.org\/ask-the-markup\/2020\/03\/03\/healthcare-algorithms-robot-medicine\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">who gets medical care<\/a> and in what order of priority, instead of leaving that choice to doctors. Colleges use them to <a href=\"https:\/\/www.usnews.com\/education\/best-colleges\/articles\/how-admissions-algorithms-could-affect-your-college-acceptance\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">decide which applicants to admit<\/a>. 
And prototypes of self-driving cars use them to weigh how to <a href=\"https:\/\/insights.techreview.com\/who-should-decide-how-algorithms-decide\/\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">minimize harm during a traffic accident<\/a>.<\/p>\n<p>Some of that computational outsourcing springs from high hopes\u2014that computer algorithms would <a href=\"https:\/\/www.nytimes.com\/2020\/09\/18\/business\/digital-mortgages.html\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">take bias<\/a> out of the lending process, for instance, or <a href=\"https:\/\/www.nature.com\/articles\/d41591-020-00027-9\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">help researchers develop a safe COVID-19 vaccine<\/a> in record time.<\/p>\n<p>But it\u2019s been proven again and again that formulas inherit the biases of their creators. An algorithm is only as good as the data and principles that train it, and a person or people are largely in charge of what it\u2019s fed.<\/p>\n<p>Every year there are myriad new examples of algorithms that were either created for a cynical purpose, functioned to reinforce racism, or spectacularly failed to fix the problems they were built to solve. We know about most of them because whistleblowers, journalists, advocates, and academics took the time to dig into a black box of computational decision-making and found some dark materials.<\/p>\n<p>Here are some big ones from 2020.<\/p>\n<h2><strong>The racism problem<\/strong><\/h2>\n<p>A lot of problems with algorithmic decision-making come down to bias, but some instances are more explicit than others. The Markup <a href=\"https:\/\/themarkup.org\/google-the-giant\/2020\/07\/23\/google-advertising-keywords-black-girls\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">reported<\/a> that Google\u2019s ad portal connects the keywords \u201cBlack girls,\u201d \u201cAsian girls,\u201d and \u201cLatina girls\u201d (but not \u201cWhite girls\u201d) to porn. 
(Google blocked the automated suggestions after The Markup reached out to the company. In the meantime, Google\u2019s search algorithm briefly sent our story to the first page of search results for the word \u201cporn.\u201d)<\/p>\n<p>Sometimes the consequences of such bias can be severe.<\/p>\n<p>Some medical algorithms are racially biased\u2014deliberately. A paper in the New England Journal of Medicine identified 13 examples of <a href=\"https:\/\/www.nejm.org\/doi\/full\/10.1056\/NEJMms2004740\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">race \u201ccorrections\u201d<\/a> integrated into tools used by doctors to determine who receives certain medical interventions, like heart surgery, antibiotics for urinary tract infections, and screenings for breast cancer. The tools assume patients of different races are at different risks for certain diseases\u2014assumptions not always well-grounded in science, according to the researchers. The result: <a href=\"https:\/\/www.consumerreports.org\/medical-tests\/medical-algorithms-have-a-race-problem\/\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">a Black man who needs a kidney transplant<\/a> was deemed not eligible, as Consumer Reports reported, among other disasters.<\/p>\n<p>A related issue emerged in a lawsuit against the National Football League: Black players allege it\u2019s much <a href=\"https:\/\/www.nytimes.com\/2020\/08\/25\/sports\/football\/nfl-concussion-racial-bias.html\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">harder to receive compensation for concussion-related dementia<\/a> because of the way the league evaluates neurocognitive function. 
Essentially, they say, the league assumes Black players inherently have lower cognitive function than White players and weighs their eligibility for payouts accordingly.<\/p>\n<h2><strong>Algorithms that make renters\u2019 and lower-income people\u2019s lives more difficult<\/strong><\/h2>\n<p>If you\u2019ve ever rented a home (and chances are you have, as renting has skyrocketed since the 2008 financial crisis), a landlord has likely run you through a tenant screening service. Whatever results the background check algorithms spit out generally constitute the difference between getting to rent the home in question and getting denied\u2014and, The Markup found, those reports are <a href=\"https:\/\/themarkup.org\/locked-out\/2020\/05\/28\/access-denied-faulty-automated-background-checks-freeze-out-renters\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">often faulty<\/a>. The computer-generated reports confuse identities, misconstrue minor run-ins with law enforcement as criminal records, and misreport evictions. 
And what little oversight exists typically comes too late for the wrongfully denied.<\/p>\n<p>Similarly, MIT Technology Review reported, lawyers who work with low-income people are finding themselves <a href=\"https:\/\/www.technologyreview.com\/2020\/12\/04\/1013068\/algorithms-create-a-poverty-trap-lawyers-fight-back\/\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">butting up against inscrutable, unaccountable algorithms<\/a>, created by private companies, that do things like decide which children enter foster care, allocate Medicaid services, and determine access to unemployment benefits.<\/p>\n<h2><strong>Policing and persecution<\/strong><\/h2>\n<p>There\u2019s an enduring allure to the idea of predicting crimes before they happen, even as police department after police department has discovered <a href=\"https:\/\/themarkup.org\/ask-the-markup\/2020\/08\/20\/does-predictive-police-technology-contribute-to-bias\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">problems with data-driven<\/a> models.<\/p>\n<p>A case in point: the Pasco County Sheriff\u2019s Department, which The Tampa Bay Times found routinely <a href=\"https:\/\/projects.tampabay.com\/projects\/2020\/investigations\/police-pasco-sheriff-targeted\/intelligence-led-policing\/\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">monitored and harassed people it identified as potential criminals<\/a>. The department \u201csends deputies to find and interrogate anyone whose name appears\u201d on a list generated from \u201carrest histories, unspecified intelligence and arbitrary decisions by police analysts,\u201d the newspaper reported. Deputies appeared at people\u2019s homes in the middle of the night to conduct searches and wrote tickets for minor things like missing mailbox numbers. Many of those targeted were minors. 
The sheriff\u2019s department, in response, said the newspaper was cherry-picking examples and conflating legitimate police tactics with harassment.<\/p>\n<p>Facial recognition software, another policing-related algorithmic tool, led to the faulty arrest and detention of a Detroit man for a crime he did not commit, The New York Times reported in an <a href=\"https:\/\/www.nytimes.com\/2020\/06\/24\/technology\/facial-recognition-arrest.html\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">article cataloging the technology\u2019s privacy, accuracy, and race problems<\/a>.<\/p>\n<p>And in a particularly chilling development, The Washington Post reported, the Chinese tech company Huawei has been testing tools that could scan faces in crowds for ethnic features and send \u201c<a href=\"https:\/\/www.washingtonpost.com\/technology\/2020\/12\/08\/huawei-tested-ai-software-that-could-recognize-uighur-minorities-alert-police-report-says\/\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">Uighur alarms<\/a>\u201d to authorities. The Chinese government has detained members of the Muslim minority group en masse in prison camps\u2014persecution that <a href=\"https:\/\/www.nytimes.com\/2020\/09\/24\/world\/asia\/china-muslims-xinjiang-detention.html\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">appears to be expanding<\/a>. Huawei USA spokesperson Glenn Schloss told the Post the tool \u201cis simply a test, and it has not seen real-world application.\u201d<\/p>\n<h2><strong>Workplace surveillance<\/strong><\/h2>\n<p>Big employers are turning to algorithms to help monitor their workers. 
This year, Microsoft <a href=\"https:\/\/www.theguardian.com\/technology\/2020\/dec\/02\/microsoft-apologises-productivity-score-critics-derided-workplace-surveillance\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">apologized<\/a> after it enabled a Microsoft 365 feature that allowed managers to monitor and analyze their <a href=\"https:\/\/www.theguardian.com\/technology\/2020\/nov\/26\/microsoft-productivity-score-feature-criticised-workplace-surveillance\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">workers\u2019 \u201cproductivity.\u201d<\/a> The productivity score factored in things like an individual\u2019s participation in group chats and the number of emails sent.<\/p>\n<p>Meanwhile, Business Insider reported that Whole Foods uses heat maps, which weigh things like the number of employee complaints and the local unemployment rate, to predict <a href=\"https:\/\/www.businessinsider.com\/whole-foods-tracks-unionization-risk-with-heat-map-2020-1\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">which stores might see unionization attempts<\/a>. 
Whole Foods is owned by Amazon, which has an <a href=\"https:\/\/www.vice.com\/en\/article\/5dp3yn\/amazon-leaked-reports-expose-spying-warehouse-workers-labor-union-environmental-groups-social-movements\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">elaborate apparatus<\/a> for monitoring worker behavior.<\/p>\n<h2><strong>Revenge of the students<\/strong><\/h2>\n<p>Anyone searching for inspiration in the fight against an algorithm-dominated tomorrow might look to students in the United Kingdom, who <a href=\"https:\/\/twitter.com\/HUCKmagazine\/status\/1294985562106015750\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">took to the streets<\/a> after the education system decided to use an algorithm to <a href=\"https:\/\/www.wired.com\/story\/an-algorithm-determined-uk-students-grades-chaos-ensued\/\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">give grades based on past performance during the pandemic<\/a>. Or <a href=\"https:\/\/www.theverge.com\/2020\/9\/2\/21419012\/edgenuity-online-class-ai-grading-keyword-mashing-students-school-cheating-algorithm-glitch\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">these kids<\/a>, who discovered their tests were being graded by an algorithm\u2014and then promptly figured out how to exploit it by essentially mashing up a bunch of keywords.<\/p>\n<p><em>This article was <a href=\"https:\/\/themarkup.org\/2020-in-review\/2020\/12\/15\/algorithms-bias-racism-surveillance\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">originally published on The Markup<\/a> and was republished under the <a href=\"https:\/\/creativecommons.org\/licenses\/by-nc-nd\/4.0\/\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">Creative Commons Attribution-NonCommercial-NoDerivatives<\/a><a rel=\"nofollow noopener\"> license.<\/a><\/em><\/p>\n<p class=\"c-post-pubDate\"> Published December 22, 2020 \u2014 13:00 UTC <\/p>\n<p> <a 
href=\"https:\/\/thenextweb.com\/neural\/2020\/12\/22\/algorithms-behaving-badly-2020-edition-syndication\/\">Source<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>The perils of leaving important decisions to computer algorithms are pretty easily imagined (see, e.g., \u201cMinority Report,\u201d \u201cI, Robot,\u201d \u201cWar Games\u201d). In recent years, however, algorithms\u2019&nbsp; job descriptions have only grown. They&#8230;<\/p>\n","protected":false},"author":1,"featured_media":1908,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":[],"categories":[1],"tags":[],"_links":{"self":[{"href":"https:\/\/www.londonchiropracter.com\/index.php?rest_route=\/wp\/v2\/posts\/1907"}],"collection":[{"href":"https:\/\/www.londonchiropracter.com\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.londonchiropracter.com\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.londonchiropracter.com\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.londonchiropracter.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=1907"}],"version-history":[{"count":0,"href":"https:\/\/www.londonchiropracter.com\/index.php?rest_route=\/wp\/v2\/posts\/1907\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.londonchiropracter.com\/index.php?rest_route=\/wp\/v2\/media\/1908"}],"wp:attachment":[{"href":"https:\/\/www.londonchiropracter.com\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=1907"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.londonchiropracter.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=1907"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.londonchiropracter.com\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=1907"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}