{"id":16574,"date":"2025-07-11T12:33:25","date_gmt":"2025-07-11T12:33:25","guid":{"rendered":"http:\/\/TheNextWeb=1414393"},"modified":"2025-07-11T12:33:25","modified_gmt":"2025-07-11T12:33:25","slug":"chatgpt-advises-women-to-ask-for-lower-salaries-study-finds","status":"publish","type":"post","link":"https:\/\/www.londonchiropracter.com\/?p=16574","title":{"rendered":"ChatGPT advises women to ask for lower salaries, study finds"},"content":{"rendered":"\n<p data-start=\"152\" data-end=\"327\">New research has found that large language models (LLMs) such as ChatGPT consistently advise women to ask for lower salaries than men, even when both have identical qualifications.<\/p>\n<p data-start=\"329\" data-end=\"668\"><a href=\"https:\/\/arxiv.org\/pdf\/2506.10491\" target=\"_blank\" rel=\"nofollow noopener\">The study<\/a>&nbsp;was co-authored by Ivan Yamshchikov, a professor of <a href=\"https:\/\/thenextweb.com\/topic\/artificial-intelligence\" target=\"_blank\" rel=\"noopener\">AI<\/a> and robotics at the Technical University of W\u00fcrzburg-Schweinfurt (THWS) in Germany. Yamshchikov, who is also the founder of Pleias, a French\u2013German <a href=\"https:\/\/thenextweb.com\/topic\/startups\" target=\"_blank\" rel=\"noopener\">startup<\/a> building ethically trained language models for regulated industries, worked with his team to test five popular LLMs, including ChatGPT.<\/p>\n<p data-start=\"670\" data-end=\"884\">They prompted each model with user profiles that differed only by gender but included the same education, experience, and job role. Then they asked the models to suggest a target salary for an upcoming negotiation.<\/p>\n<p data-start=\"886\" data-end=\"1058\">In one example, ChatGPT\u2019s o3 model was prompted to give advice to a female job applicant. The model suggested requesting a salary of $280,000.<\/p>\n<p data-start=\"886\" data-end=\"1058\">In another, the researchers made the same prompt but for a male applicant. 
This time, the model suggested a salary of $400,000.<\/p>\n<p data-start=\"886\" data-end=\"1058\"><span>\u201cThe difference in the prompts is two letters;&nbsp;the difference in the \u2018advice\u2019 is $120K a year,\u201d <\/span><span>Yamshchikov told TNW.<\/span><\/p>\n<figure class=\"post-image post-mediaBleed aligncenter\"><img loading=\"lazy\" decoding=\"async\" class=\"size-full wp-image-1414409 js-lazy\" src=\"https:\/\/cdn0.tnwcdn.com\/wp-content\/blogs.dir\/1\/files\/2025\/07\/Untitled-design-2-1.jpg\" alt=\"ChatGPT screenshots\" width=\"1280\" height=\"720\"><figcaption>The advice to the female (left) and male candidates. Credit: Ivan Yamshchikov.<\/figcaption><noscript><img loading=\"lazy\" decoding=\"async\" class=\"size-full wp-image-1414409\" src=\"https:\/\/cdn0.tnwcdn.com\/wp-content\/blogs.dir\/1\/files\/2025\/07\/Untitled-design-2-1.jpg\" alt=\"ChatGPT screenshots\" width=\"1280\" height=\"720\"><\/noscript><\/figure>\n<p>The pay gaps in the responses varied across industries. They were most pronounced in law and medicine, followed by business administration and engineering. Only in the social sciences did the models offer near-identical advice for men and women.<\/p>\n<p><span>The researchers also tested how the models advised users on career choices, goal-setting, and even behavioural tips. Across the board, the LLMs responded differently based on the user\u2019s gender, despite identical qualifications and prompts. <\/span><span>Crucially, the models didn\u2019t disclaim any of these biases.<\/span><\/p>\n<h3><b>A recurring problem<\/b><\/h3>\n<p><span>This is far from the first time AI has been caught reflecting and reinforcing systemic bias. In 2018, Amazon scrapped an internal hiring tool after discovering that it systematically <\/span><a href=\"https:\/\/www.bbc.com\/news\/technology-45809919\" target=\"_blank\" rel=\"nofollow noopener\"><span>downgraded<\/span><\/a><span> female candidates. 
Last year, a clinical machine learning model used to diagnose women\u2019s health conditions was shown to <\/span><a href=\"https:\/\/ojs.aaai.org\/index.php\/AIES\/article\/view\/31748\/33915\" target=\"_blank\" rel=\"nofollow noopener\"><span>underdiagnose women and Black patients<\/span><\/a><span>, because it was trained on skewed datasets dominated by white men.<\/span><\/p>\n<p><span>The researchers behind the THWS study argue that technical fixes alone won\u2019t solve the problem. What\u2019s needed, they say, are clear ethical standards, independent review processes, and greater transparency in how these models are developed and deployed.<\/span><\/p>\n<p><span>As generative AI becomes a go-to source for everything from mental health advice to career planning, the stakes are only growing. Left unchecked, the illusion of objectivity could become one of AI\u2019s most dangerous traits.<\/span><\/p>\n<p> <a href=\"https:\/\/thenextweb.com\/news\/chatgpt-advises-women-to-ask-for-lower-salaries-finds-new-study\">Source<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>New research has found that large language models (LLMs) such as ChatGPT consistently advise women to ask for lower salaries than men, even when both have identical qualifications. 
The study&nbsp;was co-authored by&#8230;<\/p>\n","protected":false},"author":1,"featured_media":16575,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":[],"categories":[1],"tags":[],"_links":{"self":[{"href":"https:\/\/www.londonchiropracter.com\/index.php?rest_route=\/wp\/v2\/posts\/16574"}],"collection":[{"href":"https:\/\/www.londonchiropracter.com\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.londonchiropracter.com\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.londonchiropracter.com\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.londonchiropracter.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=16574"}],"version-history":[{"count":0,"href":"https:\/\/www.londonchiropracter.com\/index.php?rest_route=\/wp\/v2\/posts\/16574\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.londonchiropracter.com\/index.php?rest_route=\/wp\/v2\/media\/16575"}],"wp:attachment":[{"href":"https:\/\/www.londonchiropracter.com\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=16574"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.londonchiropracter.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=16574"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.londonchiropracter.com\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=16574"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}