News

Eating Disorder Helpline Pulls AI Chatbot After It Gives Users ‘Harmful’ Advice


An eating disorder association took down its artificial intelligence (AI) chatbot less than a week before it was set to replace its human-run helpline after discovering that it was giving “harmful” advice to users.

The National Eating Disorder Association (NEDA), a nonprofit that supports individuals and families affected by eating disorders, said in a May 31 Instagram post that it had pulled its chatbot, named Tessa, after discovering that it “may have given information that was harmful and unrelated to the program.”

“We are investigating this immediately and have taken down that program until further notice for a complete investigation,” NEDA said.

The decision to scrap the chatbot came after NEDA officials announced the nonprofit would be ending its human-staffed helpline on June 1 after nearly 20 years, and replacing its staff with the AI-powered chatbot.

That announcement came just four days after NEDA staff’s decision to unionize following calls for more adequate staffing and ongoing training, as well as an “equitable, dignified, and psychologically safe workplace,” according to a May 4 blog post by Abbie Harper, a hotline associate and member of the Helpline Associates United union.

‘Union Busting, Plain and Simple’

“NEDA claims this was a long-anticipated change and that AI can better serve those with eating disorders. But do not be fooled—this isn’t really about a chatbot,” Harper wrote.

Harper said in her post that she and her three colleagues tried unsuccessfully to get the company to make meaningful changes in the workplace, but when that failed, they organized a union and requested voluntary recognition from the company around Thanksgiving last year.

When NEDA refused to recognize the union, the employees filed for an election with the National Labor Relations Board and won the election on March 17, Harper said.

But four days after the election results were certified, the employees were told they would all be let go and replaced by a chatbot, she said.

“This is about union busting, plain and simple,” she added.

However, NEDA officials told NPR the decision to scrap the hotline—which is run by both paid staffers and volunteers—had nothing to do with the unionization and was instead due to the hotline receiving an increasing number of calls, leading to longer waitlists.

Vice President Lauren Smolar told NPR that the large number of crisis calls was also creating more legal liability for the organization and that the situation was becoming “unsustainable.”

“And that’s, frankly, unacceptable in 2023 for people to have to wait a week or more to receive the information that they need, the specialized treatment options that they need,” Smolar said.

“Our volunteers are volunteers. They’re not professionals. They don’t have crisis training. And we really can’t accept that kind of responsibility,” Smolar continued. “We really need them to go to those services that are appropriate.”

Chatbot Suggests Regularly Weighing, Measuring

Issues with the Tessa chatbot were initially highlighted by body positivity activist Sharon Maxwell, according to reports.

Maxwell stated on Instagram that she had tested the chatbot several times, asking it multiple questions about weight loss. In response, the chatbot gave her advice on how to “sustainably lose weight,” recommending that she aim to lose 1–2 pounds per week and weigh and measure herself on a weekly basis.

According to Maxwell, the chatbot gave her the advice despite her telling it that she had previously dealt with an eating disorder.

“Every single thing Tessa suggested were things that led to the development of my eating disorder,” Maxwell said. “This robot causes harm.”

Sarah Chase, vice president of communications and marketing at NEDA, initially appeared to deny Maxwell’s claims but later retracted her comment, acknowledging that the activist’s account was “correct” after viewing screenshots of her interaction with the chatbot.

NEDA said on its website that it partnered with California-based software company X2AI on the Tessa “wellness” chatbot.

The bot works like a “coach or therapist” that “makes you feel better, by chatting about your feelings,” according to its makers, who say studies showed chatting with the bot led to a 28 percent reduction in symptoms of depression and an 18 percent reduction in anxiety among users.

It’s unclear how NEDA will staff the helpline going forward.

The Epoch Times has contacted NEDA for further comment.
