{"id":1539,"date":"2020-12-02T18:06:11","date_gmt":"2020-12-02T18:06:11","guid":{"rendered":"https:\/\/thenextweb.com\/?p=1330230"},"modified":"2020-12-02T18:06:11","modified_gmt":"2020-12-02T18:06:11","slug":"study-shows-how-ai-exacerbates-recruitment-bias-against-women","status":"publish","type":"post","link":"https:\/\/www.londonchiropracter.com\/?p=1539","title":{"rendered":"Study shows how AI exacerbates recruitment bias against women"},"content":{"rendered":"\n<p>A <a href=\"https:\/\/about.unimelb.edu.au\/__data\/assets\/pdf_file\/0024\/186252\/NEW-RESEARCH-REPORT-Ethical-Implications-of-AI-Bias-as-a-Result-of-Workforce-Gender-Imbalance-UniMelb,-UniBank.pdf\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">new study<\/a> from the University of Melbourne has demonstrated how hiring algorithms&nbsp;can amplify human gender biases against women.<\/p>\n<p>Researchers from the University of Melbourne gave 40 recruiters real-life&nbsp;resum\u00e9s for jobs at UniBank, which funded the study. The resum\u00e9s were for roles as a data analyst, finance officer, and recruitment officer,&nbsp;which Australian Bureau of Statistics data shows are respectively male-dominated, gender-balanced, and female-dominated positions.<\/p>\n<p>Half of the recruitment panel was given resum\u00e9s with the candidate\u2019s stated gender. The other half was given the exact same resum\u00e9s, but with traditionally female names&nbsp;and male ones interchanged. For instance, they might switch \u201cMark\u201d to \u201cSarah\u201d and \u201cRachel\u201d to \u201cJohn.\u201d<\/p>\n<p>The panelists were then instructed to rank each candidate and collectively pick the top and bottom three&nbsp;resum\u00e9s for each role. 
The researchers then reviewed their decisions.<\/p>\n<p><em>[Read: <a href=\"https:\/\/thenextweb.com\/readme\/2020\/11\/26\/how-to-build-a-search-engine-for-criminal-data\/\">How to build a search engine for criminal data<\/a>]<\/em><\/p>\n<p>They found that the recruiters consistently preferred&nbsp;<span>resum\u00e9s from the apparently male candidates<\/span>&nbsp;\u2014 even though they had the same qualifications and experience as the women. Both male and female panelists were more likely to give men\u2019s&nbsp;<span>resum\u00e9s a higher rank.<\/span><\/p>\n<figure class=\"post-image post-mediaBleed aligncenter\"><img decoding=\"async\" loading=\"lazy\" class=\"size-full wp-image-1330306 lazy\" src=\"https:\/\/cdn0.tnwcdn.com\/wp-content\/blogs.dir\/1\/files\/2020\/12\/Screenshot-2020-12-02-at-16.02.47.png\" alt width=\"888\" height=\"882\" sizes=\"(max-width: 888px) 100vw, 888px\" data-lazy=\"true\" data-srcset=\"https:\/\/cdn0.tnwcdn.com\/wp-content\/blogs.dir\/1\/files\/2020\/12\/Screenshot-2020-12-02-at-16.02.47.png 888w, https:\/\/cdn0.tnwcdn.com\/wp-content\/blogs.dir\/1\/files\/2020\/12\/Screenshot-2020-12-02-at-16.02.47-96x96.png 96w, https:\/\/cdn0.tnwcdn.com\/wp-content\/blogs.dir\/1\/files\/2020\/12\/Screenshot-2020-12-02-at-16.02.47-211x210.png 211w, https:\/\/cdn0.tnwcdn.com\/wp-content\/blogs.dir\/1\/files\/2020\/12\/Screenshot-2020-12-02-at-16.02.47-272x270.png 272w, https:\/\/cdn0.tnwcdn.com\/wp-content\/blogs.dir\/1\/files\/2020\/12\/Screenshot-2020-12-02-at-16.02.47-136x135.png 136w, https:\/\/cdn0.tnwcdn.com\/wp-content\/blogs.dir\/1\/files\/2020\/12\/Screenshot-2020-12-02-at-16.02.47-796x791.png 796w, https:\/\/cdn0.tnwcdn.com\/wp-content\/blogs.dir\/1\/files\/2020\/12\/Screenshot-2020-12-02-at-16.02.47-192x192.png 192w\"><figcaption>Credit: The University of Melbourne<\/figcaption><figcaption>Data suggest 70% of data analysts in Australia are men. 
If an algorithm is trained to rank candidates based on these statistics, it could assume that a male name is a desirable quality for the position.<\/figcaption><\/figure>\n<p>The researchers then used the data to create a hiring algorithm that would rank each candidate in line with the panel\u2019s preferences \u2014 and found that it reflected their biases.<\/p>\n<p>Read:&nbsp;<a href=\"https:\/\/thenextweb.com\/readme\/2020\/11\/26\/how-to-build-a-search-engine-for-criminal-data\/\">Amazon\u2019s sexist hiring algorithm could still be better than a human<\/a><\/p>\n<p>\u201cEven when the names of the candidates were removed, AI assessed resum\u00e9s based on historic hiring patterns where preferences leaned towards male candidates,\u201d said study&nbsp;<span>co-author Dr Marc Cheong in <a href=\"https:\/\/about.unimelb.edu.au\/newsroom\/news\/2020\/december\/entry-barriers-for-women-are-amplified-by-ai-in-recruitment-algorithms,-study-finds\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">a statement<\/a>.<\/span><\/p>\n<p>\u201cFor example, giving advantage to candidates with years of continuous service would automatically disadvantage women who\u2019ve taken time off work for caring responsibilities.\u201d<\/p>\n<p>The study relied on a small sample of data, but these types of gender biases have also been documented in large companies. 
Amazon, for example, had to <a href=\"https:\/\/www.reuters.com\/article\/us-amazon-com-jobs-automation-insight-idUSKCN1MK08G\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">shut down a hiring algorithm<\/a> after discovering it was discriminating against female applicants because the models were predominantly trained on resum\u00e9s submitted by men.<\/p>\n<p><span>\u201cAlso, in the case of more advanced AIs that operate within a \u2018black box\u2019 without transparency or human oversight, there is a danger that any amount of initial bias will be amplified,\u201d added Dr&nbsp;Cheong.<\/span><\/p>\n<p>The researchers believe the risks can be reduced by making hiring algorithms more transparent. But we also need to address our inherent human&nbsp;biases \u2014 before they\u2019re baked into the machines.<\/p>\n<p class=\"c-post-pubDate\"> Published December 2, 2020 \u2014 18:06 UTC <\/p>\n<p> <a href=\"https:\/\/thenextweb.com\/neural\/2020\/12\/02\/study-shows-how-ai-exacerbates-recruitment-bias-against-women\/\">Source<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>A new study from the University of Melbourne has demonstrated how hiring algorithms&nbsp;can amplify human gender biases against women. 
Researchers from the University of Melbourne gave 40 recruiters real-life&nbsp;resum\u00e9s for jobs at&#8230;<\/p>\n","protected":false},"author":1,"featured_media":1540,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":[],"categories":[1],"tags":[],"_links":{"self":[{"href":"https:\/\/www.londonchiropracter.com\/index.php?rest_route=\/wp\/v2\/posts\/1539"}],"collection":[{"href":"https:\/\/www.londonchiropracter.com\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.londonchiropracter.com\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.londonchiropracter.com\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.londonchiropracter.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=1539"}],"version-history":[{"count":0,"href":"https:\/\/www.londonchiropracter.com\/index.php?rest_route=\/wp\/v2\/posts\/1539\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.londonchiropracter.com\/index.php?rest_route=\/wp\/v2\/media\/1540"}],"wp:attachment":[{"href":"https:\/\/www.londonchiropracter.com\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=1539"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.londonchiropracter.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=1539"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.londonchiropracter.com\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=1539"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}