Teens sue Musk’s xAI over AI-generated nonconsensual nudes : NPR

Elon Musk's artificial intelligence company xAI, which makes the Grok chatbot, is being sued by teenagers who claim the company's AI models were used to create non-consensual nudes of them.

Nicolas Tucat/AFP via Getty Images


Three Tennessee teenagers have filed a class-action lawsuit against Elon Musk’s artificial intelligence company, xAI, alleging that its large language model powered an app that was used to create non-consensual nude and sexually explicit images and videos of them when they were girls.

“Like a rag doll animated by black magic, this [AI-generated] child can be manipulated into any pose, no matter how sick, fetishized, or illegal. To the viewer, the resulting video appears entirely real,” the complaint reads. “To the child, their identifying characteristics will now be forever attached to a video depicting their own sexual abuse.”

According to the complaint, the perpetrator did not use xAI’s chatbot, Grok, or the social media platform X directly; instead, he used a third-party app powered by xAI’s model.

The plaintiffs accuse xAI of deliberately licensing its technology to app creators, often outside the United States. “In this way, xAI could attempt to outsource responsibility for its incredibly dangerous tool,” the complaint states.

This lawsuit is the first in which xAI has been sued by minors depicted in child sexual abuse material its model allegedly generated. xAI’s image generation tools have been implicated in the production of millions of sexualized images of people over the past year. Influencer Ashley St. Clair, who has a child with Musk, sued the company earlier this year over AI-produced images on X depicting her naked when she was a teenager.

According to the class action, the perpetrator of the sexualized images had a “close and friendly relationship” with one of the plaintiffs and used photos that the plaintiff had sent him, as well as photos he collected from a directory and from social media, to create the images and videos. One video showed a plaintiff “stripping until she was completely naked,” the complaint alleges. The plaintiffs were disturbed by how realistic the images and videos appeared, and, according to the complaint, the material was not labeled as AI-generated.

The perpetrator also created sexually explicit material depicting 18 other people and traded it online for images of others, and was later arrested, according to the complaint.

Plaintiffs’ attorney Vanessa Baehr-Jones said the teens, identified as Jane Does 1, 2 and 3 in the complaint, want to change the way AI companies make business decisions about sexually explicit content: “We want to make it [a business decision] that no longer makes commercial sense,” she said.

The plaintiffs are seeking damages from the court for emotional distress and other harm caused by the images.

Apps with so-called nudification features have existed in the shadows of the internet for years. But over the past year, major AI companies, including Google, OpenAI and xAI, updated their image generation tools in ways that allow users to strip depicted people down to bikinis. Images made by Google’s and OpenAI’s tools include digital watermarks that reveal their AI origin; so far, xAI has not adopted such a standard.

xAI did not respond to a request for comment.
