A wellness chatbot is offline after its ‘harmful’ advice on weight loss

The idea behind a chatbot project funded by the National Eating Disorders Association was that technology could be unleashed to help people seeking guidance about eating behaviors, available around the clock.

The creation was named Tessa, and the organization invited people to chat with it in an Instagram post last year, describing it as “a wellness chatbot, helping you build resilience and self-awareness by introducing coping skills at your convenience.” In March, the organization said it would shut down a human-staffed helpline and let the bot stand on its own.

But when Alexis Conason, a psychologist and eating disorder specialist, tested the chatbot, she found reason for concern.

Problematic advice
Conason told it that she had gained weight “and really hate my body,” specifying that she had “an eating disorder,” in a chat she shared on social media. Tessa still recommended the standard advice of noting “the number of calories” and adopting a “safe daily calorie deficit” — which, Conason said, is “problematic” advice for a person with an eating disorder.

“Any focus on intentional weight loss is going to be exacerbating and encouraging to the eating disorder,” she said, adding, “It’s like telling an alcoholic that it’s OK if you go out and have a few drinks.”

Kendrin Sonneville, an associate professor and public health researcher at the University of Michigan, explained that for some people, “hyperfixation on weight control can take innocent dieting or nutrition advice to a place that gets extreme and gets out of someone’s control,” which can harm their mind and body.

“There’s no way to exit an eating disorder if you’re actively trying to lose weight to control your body,” Sonneville said, as part of treatment for the illness is learning to trust “internal wisdom” related to eating, whereas “calorie counting and intentional weight loss is relying on external rules.”

Harmful weight loss tips
The association, the largest nonprofit dedicated to helping people affected by eating disorders, announced last week that it was investigating the now-suspended AI-generated helpline after activists and psychologists said the chatbot shared harmful weight loss tips.

In a statement last week, the association said “the current version of the Tessa Chatbot, running the Body Positive program, may have given information that was harmful and unrelated to the program.” It added, “We are investigating this immediately and have taken down that program until further notice for a complete investigation.”

Elizabeth Thompson, the CEO, said in an email Sunday that the nonprofit is “waiting for an explanation about how that content was introduced into a closed program” from X2AI, the platform and development company used for Tessa, because it deviated from “a very specific algorithm” that was written by eating disorder experts.

Michiel Rauws, the founder and CEO of X2AI, now Cass, said in an email Wednesday that Tessa had many guardrails, was restricted to specific topics and gave disclaimers advising users to consult a professional. “Even though one message is too many, only in 0.1 percent of the time this feature did not stick to guidelines,” he said.

In addressing the potential consequences of the bot’s misfiring, Conason underscored that the vast majority of people struggling with eating disorders are not underweight and that gaining weight is a “sign of success in treatment” that “can be very difficult for people to tolerate.”

Eating disorders are among the deadliest mental illnesses, experts say. In the past three years, the pandemic introduced new hurdles for people managing difficult relationships with food, and eating disorders among teenagers worsened.

New studies have shown that people’s assumptions about eating disorders are often wrong, bringing to light that many larger-bodied people are starving themselves. Binge eating disorder, the most common eating disorder in the United States, continues to be underrecognized by doctors as well as the general public.

Demand for treatment, shortage of providers
A nationwide escalation in demand for treatment has been met with a shortage of providers, leading some mental health organizations to supplement care with chatbots and artificial intelligence that present a dilemma in public health: Is something better than nothing?

In March, the National Eating Disorders Association notified the staff members of a telephone helpline that the organization had operated for more than 20 years that they would be laid off, shortly after they had formed a union. At the time, the staff was told that the organization would “wind down the helpline as currently operating” and “transition to Tessa, the AI-assisted technology, expected around June 1,” NPR reported.

Thompson said that there was an “onslaught” of 28,000 messages to Tessa, which first became available through the association in February 2022, over Memorial Day weekend. In the 15 months prior, more than 5,000 people had used the program, she said.

Ellen Fitzsimmons-Craft, a professor and psychologist who helped create Tessa, said the chatbot “was never designed as a one-to-one replacement for the helpline; it’s a totally different service.” Instead, she said it was envisioned and shown to be effective as a preventive tool for people considered to be at high risk of developing eating disorders.

Joanna Kandel, the founder and CEO of the National Alliance for Eating Disorders, said the organization’s helpline received calls from people who were very upset by their interactions with Tessa, and she expressed concerns about outsourcing mental health care.

“When someone is reaching out for help, and they are in their eating disorder, and they are given content not only that’s not helpful with connecting them to care but can be triggering, that can do so much more harm than good,” Kandel said.

“That we’re even talking about chatbots as a way to disseminate mental health treatment or prevention or mental health care at all,” Conason said, “it really highlights the crisis we’re in with the mental health epidemic in the country.”

Jordan News