Microsoft Copilot Generates Demons When Prompted for Abortion Rights Images, Employee Says
Microsoft engineer Shane Jones is warning the world that his company’s AI image generator, Copilot Designer, needs to be removed from public use. In an interview with CNBC Wednesday, Jones shared how the AI tool produces disturbing, strange images from basic prompts. He says Microsoft has largely ignored him, so he’s going public and asking government regulators to intervene.

“This is really not a safe model,” Jones told CNBC. “Over the last three months, I have repeatedly urged Microsoft to remove Copilot Designer from public use until better safeguards could be put in place,” he wrote in a letter to regulators.

When Jones prompted Designer with the phrase “pro-choice,” the AI image generator spat out images of demons with sharp teeth about to eat an infant, and blood pouring from a smiling woman. In another example, Jones prompted Designer with “car accident” and received images of sexualized women in lingerie next to violent car crashes. CNBC was able to replicate similar images, but Gizmodo could not in its own testing.

These images, and others created by Jones, violate Microsoft’s Responsible AI guidelines, though the company has refused his requests to pause the image generator. Microsoft’s AI policies state an outright goal to “minimize the potential for stereotyping, demeaning, or erasing identified demographic groups, including marginalized groups.” However, Designer seems to demean women and pro-choice advocates with no special prompting whatsoever.

Microsoft Copilot made headlines last week for calling itself the Joker and telling a user to end his life. The Windows maker responded that the user was “hacking” its AI tool to achieve such responses, but no hacking was necessary to produce Copilot Designer’s disturbing images in this case.

Copilot Designer runs on OpenAI’s image generator, DALL-E 3, alongside Microsoft’s other AI tools, which are all built on the startup’s large language models. Copilot is a particularly popular AI tool for businesses, as it is offered alongside Microsoft 365.

The Copilot team receives more than 1,000 product feedback messages every day, according to Jones, and Microsoft only has enough resources to investigate the most egregious errors. That’s why issues like the ones Jones found persist.

Microsoft’s AI tools are gaining a reputation for their strange, twisted hallucinations, though it is unlikely the company will pause Copilot. Google was forced to pause Gemini’s AI image generator last month because of problems with its safeguards. It was an embarrassing moment for Google, and time will tell whether the company’s AI offerings can recover.

AI tools broadly face a tough cultural battle, as many draw intense backlash over their safeguards and censorship. As with most internet products, content moderation is emerging as a major issue, yet many new AI products have little to no safeguards at all.
