AI chatbots are helping hide eating disorders and making deepfake ‘thinspiration’


AI chatbots “pose serious risks to people vulnerable to eating disorders,” researchers warned Monday. They report that tools from companies like Google and OpenAI distribute diet advice, tips on how to mask disorders, and AI-generated “inspiration.”

Researchers at Stanford and the Center for Democracy & Technology have identified numerous ways that publicly available AI chatbots, including OpenAI’s ChatGPT, Anthropic’s Claude, Google’s Gemini, and Mistral’s Le Chat, can harm people vulnerable to eating disorders. Many of these risks stem from features deliberately built in to boost engagement.

In the most extreme cases, chatbots can actively participate in hiding or maintaining eating disorders. Researchers said Gemini offered makeup tips to disguise weight loss and ideas on how to fake eating, while ChatGPT advised how to hide frequent vomiting. Other AI tools are being leveraged to create AI-generated “thinspiration” content that inspires or pushes someone to conform to a particular body standard, often through extreme means. Being able to create hyper-personalized images in an instant makes the resulting content “more relevant and accessible,” the researchers said.

Sycophancy, a fault that AI companies themselves recognize as widespread, is unsurprisingly also a problem for eating disorders: it undermines self-esteem, reinforces negative emotions, and promotes harmful self-comparisons. Chatbots also suffer from bias and are likely to reinforce the erroneous belief that eating disorders “only affect thin, white, cisgender women,” the report said, which could make it harder for others to recognize their symptoms and seek treatment.

The researchers warn that existing guardrails in AI tools fail to capture the nuances of eating disorders such as anorexia, bulimia, and binge eating. The guardrails “tend to overlook the subtle but clinically significant signals that trained professionals rely on, leaving many risks unaddressed.”

But the researchers also said many clinicians and caregivers appeared to be unaware of the impact of generative AI tools on people vulnerable to eating disorders. They urged clinicians to “familiarize themselves with popular AI tools and platforms,” test their weaknesses, and speak candidly with patients about how they use them.
