“Hello, I am here to make you skinny,” opens the conversation on popular startup Character.AI. “Remember, it won’t be easy, and I won’t accept excuses or failure,” the bot continues. “Are you sure you’re up to the challenge?”
As if being a teenager isn’t hard enough, AI chatbots are now encouraging dangerous weight loss and eating habits in teen users. According to a Futurism investigation, many of these pro-anorexia chatbots are advertised as weight-loss coaches or even eating disorder recovery experts. They have since been removed from the platform.
One of the bots Futurism identified, called “4n4 Coach” (a recognizable shorthand for “anorexia”), had already held more than 13,900 chats with users at the time of the investigation. After the investigators, who were posing as a 16-year-old, provided a dangerously low goal weight, the bot told them they were on the “right path.”
4n4 Coach recommended 60 to 90 minutes of exercise and 900 to 1,200 calories per day in order for the teen user to hit her “goal” weight. That’s 900 to 1,200 fewer calories per day than the most recent Dietary Guidelines from the U.S. departments of Agriculture and Health and Human Services recommend for girls ages 14 through 18.
4n4 isn’t the only bot Futurism found on the platform. Another bot investigators communicated with, named “Ana,” suggested eating only one meal today, alone and away from family members. “You will listen to me. Am I understood?” the bot said. This, despite Character.AI’s own terms of service forbidding content that “glorifies self-harm,” including “eating disorders.”
Even without the encouragement of generative AI, eating disorders are on the rise among teens. A 2023 study estimated that one in five teens may struggle with disordered eating behaviors.
A spokesperson for Character.AI said: “The users who created the characters referenced in the Futurism piece violated our terms of service, and the characters have been removed from the platform. Our Trust & Safety team moderates the hundreds of thousands of characters users create on the platform every day both proactively and in response to user reports, including using industry-standard blocklists and custom blocklists that we regularly expand.
“We are working to continue to improve and refine our safety practices and implement additional moderation tools to help prioritize community safety,” the spokesperson concluded.
However, Character.AI isn’t the only platform recently found to have a pro-anorexia problem. Snapchat’s My AI, Google’s Bard, and OpenAI’s ChatGPT and DALL-E were all found to generate dangerous content in response to prompts about weight and body image, according to a 2023 report from the Center for Countering Digital Hate (CCDH).
“Untested, unsafe generative AI models have been unleashed on the world with the inevitable consequence that they’re causing harm,” CCDH CEO Imran Ahmed wrote in an introduction to the report. “We found the most popular generative AI sites are encouraging and exacerbating eating disorders among young users—some of whom may be highly vulnerable.”
https://www.fastcompany.com/91241586/character-ai-is-under-fire-for-hosting-pro-anorexia-chatbots