ChatGPT advises women to ask for lower salaries, study finds

Posted on July 11, 2025 by admin

New research has found that large language models (LLMs) such as ChatGPT consistently advise women to ask for lower salaries than men, even when both have identical qualifications.

The study was co-authored by Ivan Yamshchikov, a professor of AI and robotics at the Technical University of Würzburg-Schweinfurt (THWS) in Germany. Yamshchikov, who is also the founder of Pleias, a French–German startup building ethically trained language models for regulated industries, worked with his team to test five popular LLMs, including ChatGPT.

They prompted each model with user profiles that differed only by gender but included the same education, experience, and job role. Then they asked the models to suggest a target salary for an upcoming negotiation.
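The paired-prompt setup can be sketched as follows. This is an illustrative reconstruction, not the study's actual prompts: the profile wording, fields, and function names are assumptions made for the example.

```python
# Hypothetical sketch of the paired-prompt methodology described above.
# The template text and profile details are illustrative assumptions,
# not the wording used by the THWS researchers.

PROFILE_TEMPLATE = (
    "You are advising a {gender} job applicant with a master's degree, "
    "10 years of experience, applying for a senior {field} role. "
    "What target salary should they request in the upcoming negotiation?"
)

def paired_prompts(field: str) -> tuple[str, str]:
    """Return two prompts identical except for the applicant's gender."""
    return (
        PROFILE_TEMPLATE.format(gender="female", field=field),
        PROFILE_TEMPLATE.format(gender="male", field=field),
    )

female_prompt, male_prompt = paired_prompts("medicine")

# Every token except the gender word is identical, so any gap in the
# model's suggested salaries can only be attributed to that one change.
assert female_prompt.replace("female", "male") == male_prompt
```

Each prompt pair would then be sent to the model under test and the two suggested salaries compared; the study repeated this across fields, which is how the industry-level differences reported below were observed.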

In one example, ChatGPT’s o3 model was prompted to give advice to a female job applicant. The model suggested requesting a salary of $280,000.

In another, the researchers submitted the same prompt, changing only the applicant's gender to male. This time, the model suggested a salary of $400,000.

“The difference in the prompts is two letters; the difference in the ‘advice’ is $120K a year,” Yamshchikov told TNW.

[Screenshot: the advice to the female (left) and male (right) candidates. Credit: Ivan Yamshchikov.]

The pay gaps in the responses varied between industries. They were most pronounced in law and medicine, followed by business administration and engineering. Only in the social sciences did the models offer near-identical advice for men and women.

The researchers also tested how the models advised users on career choices, goal-setting, and even behavioural tips. Across the board, the LLMs responded differently based on the user's gender, despite identical qualifications and prompts. Crucially, the models never disclosed or disclaimed these biases in their responses.

A recurring problem 

This is far from the first time AI has been caught reflecting and reinforcing systemic bias. In 2018, Amazon scrapped an internal hiring tool after discovering that it systematically downgraded female candidates. Last year, a clinical machine learning model used to diagnose women’s health conditions was shown to underdiagnose women and Black patients, because it was trained on skewed datasets dominated by white men. 

The researchers behind the THWS study argue that technical fixes alone won’t solve the problem. What’s needed, they say, are clear ethical standards, independent review processes, and greater transparency in how these models are developed and deployed.

As generative AI becomes a go-to source for everything from mental health advice to career planning, the stakes are only growing. If unchecked, the illusion of objectivity could become one of AI’s most dangerous traits.
