{"id":3353,"date":"2021-02-27T14:00:35","date_gmt":"2021-02-27T14:00:35","guid":{"rendered":"https:\/\/thenextweb.com\/?p=1340435"},"modified":"2021-02-27T14:00:35","modified_gmt":"2021-02-27T14:00:35","slug":"can-auditing-eliminate-bias-from-algorithms","status":"publish","type":"post","link":"https:\/\/www.londonchiropracter.com\/?p=3353","title":{"rendered":"Can auditing eliminate bias from algorithms?"},"content":{"rendered":"\n<div><img decoding=\"async\" src=\"https:\/\/img-cdn.tnwcdn.com\/image\/neural?filter_last=1&amp;fit=1280%2C640&amp;url=https%3A%2F%2Fcdn0.tnwcdn.com%2Fwp-content%2Fblogs.dir%2F1%2Ffiles%2F2021%2F02%2F1-copy-57.jpg&amp;signature=7a2f63dc6d0361a4560cc12f8c6694b2\" class=\"ff-og-image-inserted\"><\/div>\n<p>For more than a decade, journalists and researchers have been writing about the dangers of relying on algorithms to make weighty decisions: <a href=\"https:\/\/www.propublica.org\/article\/machine-bias-risk-assessments-in-criminal-sentencing\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">who gets locked up<\/a>, who gets a job, <a href=\"https:\/\/www.npr.org\/2018\/11\/24\/670513608\/how-some-algorithm-lending-programs-discriminate-against-minorities\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">who gets a loan<\/a> \u2014 even <a href=\"https:\/\/www.technologyreview.com\/2020\/12\/21\/1015303\/stanford-vaccine-algorithm\/\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">who has priority for<\/a> COVID-19 vaccines.<\/p>\n<p>Rather than remove bias, one algorithm after another has codified and perpetuated it, as companies have simultaneously continued to more or less shield their algorithms from public scrutiny.<\/p>\n<p>The big question ever since: How do we solve this problem? Lawmakers and researchers have advocated for algorithmic audits, which would dissect and stress-test algorithms to see how they work and whether they\u2019re performing their stated goals or producing biased outcomes. 
And there is a growing field of private auditing firms that purport to do just that. Increasingly, companies are turning to these firms to review their algorithms, particularly when they\u2019ve faced criticism for biased outcomes, but it\u2019s not clear whether such audits are actually making algorithms less biased \u2014 or if they\u2019re simply good PR.<\/p>\n<p>Algorithmic auditing got a lot of press recently when HireVue, a popular hiring software company used by companies like Walmart and Goldman Sachs, faced criticism that the algorithms it used to <a href=\"https:\/\/epic.org\/privacy\/ftc\/hirevue\/EPIC_FTC_HireVue_Complaint.pdf\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">assess candidates through video interviews were biased<\/a>.<\/p>\n<p>HireVue called in an auditing firm to help and in January touted the results of the audit in a press release.<\/p>\n<p>The audit found the software\u2019s predictions \u201cwork as advertised with regard to fairness and bias issues,\u201d HireVue said in a <a href=\"https:\/\/www.hirevue.com\/press-release\/hirevue-leads-the-industry-with-commitment-to-transparent-and-ethical-use-of-ai-in-hiring\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">press release<\/a>, quoting the auditing firm it hired, O\u2019Neil Risk Consulting &amp; Algorithmic Auditing (ORCAA).<\/p>\n<p>But despite making changes to its process, including eliminating video from its interviews, HireVue was widely accused of using the audit \u2014 which looked narrowly at a hiring test for early career candidates, not HireVue\u2019s candidate evaluation process as a whole \u2014 as a PR stunt.<\/p>\n<p>Articles in <a href=\"https:\/\/www.fastcompany.com\/90597594\/ai-algorithm-auditing-hirevue\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">Fast Company<\/a>, <a href=\"https:\/\/venturebeat.com\/2021\/01\/30\/what-algorithm-auditing-startups-need-to-succeed\/\" target=\"_blank\" rel=\"nofollow noopener 
noreferrer\">VentureBeat<\/a>, and <a href=\"https:\/\/www.technologyreview.com\/2021\/02\/11\/1017955\/auditors-testing-ai-hiring-algorithms-bias-big-questions-remain\/\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">MIT Technology Review<\/a> called out the company for mischaracterizing the audit.<\/p>\n<p>HireVue said it was transparent with the audit by making the report publicly available and added that the press release specified that the audit was only for a specific scenario.<\/p>\n<p>\u201cWhile HireVue was open to any type of audit, including one that involved looking at our process in general, ORCAA asked to focus on a single use case to enable concrete discussions about the system,\u201d Lindsey Zuloaga, HireVue\u2019s chief data scientist, said in an email. \u201cWe worked with ORCAA to choose a representative use case with substantial overlap with the assessments most HireVue candidates go through.\u201d<\/p>\n<p>But algorithmic auditors were also displeased with HireVue\u2019s public statements on the audit.<\/p>\n<p>\u201cIn repurposing [ORCAA\u2019s] very thoughtful analysis into marketing collateral, they\u2019re undermining the legitimacy of the whole field,\u201d Liz O\u2019Sullivan, co-founder of Arthur, an AI explainability and bias monitoring startup, said.<\/p>\n<p>And that is the problem with algorithmic auditing as a tool for eliminating bias: Companies might use audits to make real improvements, but they might not. 
And there are no industry standards or regulations that hold the auditors or the companies that use them to account.<\/p>\n<h2><strong>What is algorithmic auditing \u2014 how does it work?<\/strong><\/h2>\n<p>Good question \u2014 it\u2019s a pretty undefined field. Generally, audits proceed in a few different ways: by looking at an algorithm\u2019s code and the data from its results, or by viewing an algorithm\u2019s potential effects through interviews and workshops with employees.<\/p>\n<p>Audits with access to an algorithm\u2019s code allow reviewers to assess whether the algorithm\u2019s training data is biased and create hypothetical scenarios to test effects on different populations.<\/p>\n<p>There are only about 10 to 20 reputable firms offering algorithmic reviews, Rumman Chowdhury, Twitter\u2019s director of machine learning ethics and founder of the algorithmic auditing company Parity, said. Companies may also have their own internal auditing teams that look at algorithms before they\u2019re released to the public.<\/p>\n<p>In 2016, an Obama administration report on algorithmic systems and civil rights encouraged the <a href=\"https:\/\/obamawhitehouse.archives.gov\/sites\/default\/files\/microsites\/ostp\/2016_0504_data_discrimination.pdf\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">development of an algorithmic auditing industry<\/a>. Hiring an auditor still isn\u2019t common practice, though, since companies have no obligation to do so, and according to multiple auditors, companies don\u2019t want the scrutiny or potential legal issues that that scrutiny may raise, especially for products they market.<\/p>\n<p>\u201cLawyers tell me, \u2018If we hire you and find out there\u2019s a problem that we can\u2019t fix, then we have lost plausible deniability and we don\u2019t want to be the next cigarette company,\u2019\u201d ORCAA\u2019s founder, Cathy O\u2019Neil, said. 
\u201cThat\u2019s the most common reason I don\u2019t get a job.\u201d<\/p>\n<p>For those that do hire auditors, there are no standards for what an \u201caudit\u201d should entail. Even a proposed <a href=\"https:\/\/legistar.council.nyc.gov\/LegislationDetail.aspx?ID=4344524&amp;GUID=B051915D-A9AC-451E-81F8-6596032FA3F9&amp;Options=Advanced&amp;Search\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">New York City law<\/a> that requires annual audits of hiring algorithms doesn\u2019t spell out how the audits should be conducted. A seal of approval from one auditor could mean much more scrutiny than that from another.<\/p>\n<p>And because audit reports are also almost always bound by nondisclosure agreements, the companies can\u2019t compare each other\u2019s work.<\/p>\n<p>\u201cThe big problem is, we\u2019re going to find as this field gets more lucrative, we really need standards for what an audit is,\u201d said Chowdhury. \u201cThere are plenty of people out there who are willing to call something an audit, make a nice looking website and call it a day, and rake in cash with no standards.\u201d<\/p>\n<p>And tech companies aren\u2019t always forthcoming, even with the auditors they hire, some auditors say.<\/p>\n<p>\u201cWe get this situation where trade secrets are a good enough reason to allow these algorithms to operate obscurely and in the dark, and we can\u2019t have that,\u201d Arthur\u2019s O\u2019Sullivan said.<\/p>\n<p>Auditors have been in scenarios where they don\u2019t have access to the software\u2019s code and so risk violating computer access laws, Inioluwa Deborah Raji, an auditor and a research collaborator at the Algorithmic Justice League, said. 
Chowdhury said she has declined audits when companies demanded she allow them to review the results before public release.<\/p>\n<p>For HireVue\u2019s audit, ORCAA interviewed stakeholders including HireVue employees, customers, job candidates, and algorithmic fairness experts, and identified concerns that the company needed to address, Zuloaga said.<\/p>\n<p>ORCAA\u2019s evaluation didn\u2019t look at the technical details of HireVue\u2019s algorithms \u2014 like what data the algorithm was trained on, or its code \u2014 though Zuloaga said the company did not limit auditors\u2019 access in any way.<\/p>\n<p>\u201cORCAA asked for details on these analyses but their approach was focused on addressing how stakeholders are affected by the algorithm,\u201d Zuloaga said.<\/p>\n<p>O\u2019Neil said she could not comment on the HireVue audit.<\/p>\n<p>Many audits are done before products are released, but that\u2019s not to say they won\u2019t run into problems, because algorithms don\u2019t exist in a vacuum. Take, for example, when Microsoft built a chatbot that <a href=\"https:\/\/spectrum.ieee.org\/tech-talk\/artificial-intelligence\/machine-learning\/in-2016-microsofts-racist-chatbot-revealed-the-dangers-of-online-conversation\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">quickly turned racist once it was exposed to Twitter users.&nbsp;<\/a><\/p>\n<p>\u201cOnce you\u2019ve put it into the real world, a million things can go wrong, even with the best intentions,\u201d O\u2019Sullivan said. \u201cThe framework we would love to get adopted is there\u2019s no such thing as good enough. There are always ways to make things fairer.\u201d<\/p>\n<p>So some prerelease audits will also provide continuous monitoring, though it\u2019s not common. 
The practice is gaining momentum among banks and health care companies, O\u2019Sullivan said.<\/p>\n<p>O\u2019Sullivan\u2019s monitoring company installs a dashboard that looks for anomalies in algorithms as they are being used in real time. For instance, it would alert companies months after launch if their algorithms were rejecting more women applicants for loans.<\/p>\n<p>And finally, there\u2019s also a growing body of adversarial audits, largely conducted by researchers and some journalists, which scrutinize algorithms without a company\u2019s consent. Take, for example, Raji and Joy Buolamwini, founder of the Algorithmic Justice League, whose <a href=\"https:\/\/medium.com\/@bu64dcjrytwitb8\/on-recent-research-auditing-commercial-facial-analysis-technology-19148bda1832\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">work on Amazon\u2019s Rekognition tool<\/a> highlighted how the software had racial and gender bias, without the company\u2019s involvement.<\/p>\n<h2><strong>Do companies fix their algorithms after an audit?<\/strong><\/h2>\n<p>There is no guarantee companies will address the issues raised in an audit.<\/p>\n<p>\u201cYou can have a quality audit and still not get accountability from the company,\u201d said Raji. 
\u201cIt requires a lot of energy to bridge the gap between getting the audit results and then translating that into accountability.\u201d<\/p>\n<p>Public pressure \u2014 or audits that weren\u2019t performed at the behest of the tech firm and so aren\u2019t covered by a nondisclosure agreement \u2014 can at times push companies to address the algorithmic bias in their technology.<\/p>\n<p>Raji said the <a href=\"http:\/\/gendershades.org\/\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">Gender Shades study<\/a>, which found gender and racial bias in commercial facial recognition tools, named companies like IBM and Microsoft to spark a public conversation around the technology.<\/p>\n<p>But it can be hard to create buzz around algorithmic accountability, she said.<\/p>\n<p>While bias in facial recognition is relatable \u2014 people can see photos and the error rates and understand the consequences of racial and gender bias in the technology \u2014 it may be harder to relate to something like bias in interest-rate algorithms.<\/p>\n<p>\u201cIt\u2019s a bit sad that we rely so much on public outcry,\u201d Raji said. \u201cIf the public doesn\u2019t understand it, there is no fine, there are no legal repercussions. 
And it makes it very frustrating.\u201d<\/p>\n<h2><strong>So what can be done to improve algorithmic auditing?<\/strong><\/h2>\n<p>In 2019, a group of Democratic lawmakers introduced the federal <a href=\"https:\/\/www.booker.senate.gov\/news\/press\/booker-wyden-clarke-introduce-bill-requiring-companies-to-target-bias-in-corporate-algorithms\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">Algorithmic Accountability Act<\/a>, which would have required companies to audit their algorithms and address any bias issues the audits revealed before the algorithms were put into use.<\/p>\n<p>AI For the People\u2019s founder Mutale Nkonde was part of a team of technologists that helped draft the bill and said it would have created government mandates for companies both to conduct audits and to follow through on them.<\/p>\n<p>\u201cMuch like drug testing, there would have to be some type of agency like the Food and Drug Administration that looked at algorithms,\u201d she said. \u201cIf we saw the disparate impact, then that algorithm wouldn\u2019t be released to the market.\u201d<\/p>\n<p>The bill never made it to a vote.<\/p>\n<p>Sen. Ron Wyden, a Democrat from Oregon, said he plans to reintroduce the bill with Sen. Cory Booker (D-NJ) and Rep. Yvette Clarke (D-NY), with updates to the 2019 version. It\u2019s unclear if the bill would set standards for audits, but it would require that companies act on their results.<\/p>\n<p>\u201cI agree that researchers, industry, and the government need to work toward establishing recognized benchmarks for auditing AI, to ensure audits are as impactful as possible,\u201d Wyden said in a statement. \u201cHowever, the stakes are too high to wait for full academic consensus before Congress begins to take action to protect against bias tainting automated systems. 
It\u2019s my view we need to work on both tracks.\u201d<\/p>\n<p><em>This article was <a href=\"https:\/\/themarkup.org\/ask-the-markup\/2021\/02\/23\/can-auditing-eliminate-bias-from-algorithms\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">originally published on The Markup<\/a> and was republished under the <a href=\"https:\/\/creativecommons.org\/licenses\/by-nc-nd\/4.0\/\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">Creative Commons Attribution-NonCommercial-NoDerivatives<\/a><a rel=\"nofollow noopener\"> license.<\/a><\/em><\/p>\n<p class=\"c-post-pubDate\"> Published February 27, 2021 \u2014 14:00 UTC <\/p>\n<p> <a href=\"https:\/\/thenextweb.com\/neural\/2021\/02\/27\/can-auditing-eliminate-bias-from-algorithms-syndication\/\">Source<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>For more than a decade, journalists and researchers have been writing about the dangers of relying on algorithms to make weighty decisions: who gets locked up, who gets a job, who 
gets&#8230;<\/p>\n","protected":false},"author":1,"featured_media":3354,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":[],"categories":[1],"tags":[],"_links":{"self":[{"href":"https:\/\/www.londonchiropracter.com\/index.php?rest_route=\/wp\/v2\/posts\/3353"}],"collection":[{"href":"https:\/\/www.londonchiropracter.com\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.londonchiropracter.com\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.londonchiropracter.com\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.londonchiropracter.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=3353"}],"version-history":[{"count":0,"href":"https:\/\/www.londonchiropracter.com\/index.php?rest_route=\/wp\/v2\/posts\/3353\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.londonchiropracter.com\/index.php?rest_route=\/wp\/v2\/media\/3354"}],"wp:attachment":[{"href":"https:\/\/www.londonchiropracter.com\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=3353"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.londonchiropracter.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=3353"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.londonchiropracter.com\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=3353"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}