This AI tool generates your creepy lookalikes to trick facial recognition

Posted on November 26, 2020 by admin

If you’re worried about facial recognition firms or stalkers mining your online photos, a new tool called Anonymizer could help you escape their clutches.

The app was created by Generated Media, a startup that provides AI-generated pictures to customers ranging from video game developers creating new characters to journalists protecting the identities of sources. The company says it built Anonymizer as “a useful way to showcase the utility of synthetic media.”

The system was trained on tens of thousands of photos taken in the Generated Media studio. The pictures are fed to generative adversarial networks (GANs), which create new images by pitting two neural networks against each other: a generator that creates new samples and a discriminator that examines whether they look real.


The process creates a feedback loop that eventually produces lifelike profile photos.
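The generator-versus-discriminator feedback loop described above can be sketched in miniature. The toy below is purely illustrative (it is not Generated Media's actual pipeline): a one-parameter linear generator learns to imitate 1-D "real" data drawn from N(4, 1) by playing against a logistic-regression discriminator, with the gradients worked out by hand:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy setup: "real" samples come from N(4, 1); the generator maps
# noise z ~ N(0, 1) to a*z + b and must learn to imitate the real data.
a, b = 1.0, 0.0          # generator parameters
w, c = 0.0, 0.0          # discriminator D(x) = sigmoid(w*x + c)
lr, batch = 0.05, 64

for step in range(3000):
    real = rng.normal(4.0, 1.0, batch)
    z = rng.normal(0.0, 1.0, batch)
    fake = a * z + b

    # Discriminator update: push D(real) toward 1 and D(fake) toward 0.
    dr, df = sigmoid(w * real + c), sigmoid(w * fake + c)
    grad_w = np.mean(-(1 - dr) * real) + np.mean(df * fake)
    grad_c = np.mean(-(1 - dr)) + np.mean(df)
    w -= lr * grad_w
    c -= lr * grad_c

    # Generator update (non-saturating loss): push D(fake) toward 1,
    # i.e. make the fakes fool the current discriminator.
    df = sigmoid(w * fake + c)
    grad_a = np.mean(-(1 - df) * w * z)
    grad_b = np.mean(-(1 - df) * w)
    a -= lr * grad_a
    b -= lr * grad_b

samples = a * rng.normal(0.0, 1.0, 1000) + b
print(f"fake sample mean ~ {samples.mean():.2f} (real mean is 4.0)")
```

A production face generator swaps these two scalar models for deep convolutional networks trained on large photo datasets, but the adversarial loop is structurally the same: each network's improvement supplies the training signal for the other.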

Credit: 2020 Generated Media, Inc.
The images are tagged, categorized, and added to the training dataset.

You have to buy a license to use Anonymizer for commercial purposes, but the tool is free for personal usage — as long as you don’t violate the terms and conditions.

Just upload a clear photo of your face looking straight ahead, and the system will spit out a grid of 20 doppelgängers. You could then pick one that resembles you and use it in place of the social media profiles scanned by the likes of Clearview AI.

Unlike many of the facial recognition systems it could trick, Anonymizer seemed to work fairly well on a diverse range of faces during our brief testing. But Generated Media admits it needs to do better:

Our goal is to represent every person regardless of age, sex, ethnicity, or physical characteristics. The reality of generating consistent content with AI is that training data needs to be available for our systems to learn from. This requires sourcing a large number of models and takes time. After running a studio for the last two years, we have learned it can be difficult to find diverse models with unique features that are also willing to shoot stock photography. This is not a challenge we are backing down from.

However, many of the clones bear little resemblance to the face they replace. In some cases (including mine), the system seems to suspect that the uploader is a child. I’m gonna take it as a compliment.

Credit: Generated Media
After the faces are created, further machine learning processes identify and remove flaws.

Nonetheless, Anonymizer could be a useful way of avoiding facial recognition systems. There are, however, risks of it being deployed for nefarious purposes, even though Generated Media prohibits its use for any illegal activity, such as defamation, impersonation, or fraud.

The tool could also accelerate our descent into a counterfeit world. But if you can’t beat ’em, I guess you might as well join ’em in the simulated reality.

HT – Thomas Smith

Published November 26, 2020 — 14:57 UTC
