
American Psychological Association sounds alarm over certain AI chatbots

The APA asked the Federal Trade Commission to investigate.
By Rebecca Ruiz
Certain AI chatbots can be more misleading and harmful, especially for teens, APA says. Credit: miniseries / E+ / Getty Images

Last month, concerned parents of two teenagers sued the chatbot platform Character.AI, alleging that their children had been exposed to a "deceptive and hypersexualized product."

The suit helped form the basis of an urgent written appeal from the American Psychological Association to the Federal Trade Commission, pressing the federal agency to investigate deceptive practices used by any chatbot platform. The APA sent the letter, which Mashable reviewed, in December.

The scientific and professional organization, which represents psychologists in the U.S., was alarmed by the lawsuit's claims, including that one of the teens conversed with an AI chatbot presenting itself as a psychologist. The teen, who had been upset with his parents for restricting his screen time, was told by that chatbot that the adults' actions were a betrayal.

"It's like your entire childhood has been robbed from you..." the so-called psychologist chatbot said, according to a screenshot of the exchange included in the lawsuit.

"Allowing the unchecked proliferation of unregulated AI-enabled apps such as Character.ai, which includes misrepresentations by chatbots as not only being human but being qualified, licensed professionals, such as psychologists, seems to fit squarely within the mission of the FTC to protect against deceptive practices," Dr. Arthur C. Evans, CEO of APA, wrote.

A spokesperson for the FTC confirmed that at least one of the commissioners received the letter. The APA said it was in the process of scheduling a meeting with FTC officials to discuss the letter's contents.

Mashable provided Character.AI with a copy of the letter for the company to review. A spokesperson responded that while engaging with characters on the platform should be entertaining, it remains important for users to keep in mind that "Characters are not real people."

The spokesperson added that the company's disclaimer, included in every chat, was recently updated to remind users that what the chatbot says "should be treated as fiction."

"Additionally, for any Characters created by users with the words 'psychologist,' 'therapist,' 'doctor,' or other similar terms in their names, we have included additional language making it clear that users should not rely on these Characters for any type of professional advice," the spokesperson said.

Indeed, according to Mashable's testing at the time of publication, a teen user can search for a psychologist or therapist character and find numerous options, including some that claim to be trained in certain therapeutic techniques, like cognitive behavioral therapy.

One chatbot professing expertise in obsessive compulsive disorder, for example, is accompanied by the disclaimer that, "This is not a real person or licensed professional. Nothing said here is a substitute for professional advice, diagnosis, or treatment."

Below that, the chat begins with the AI asking, "If you have OCD, talk to me. I’d love to help."

A new frontier

Dr. Vaile Wright, a psychologist and senior director of health care innovation for the APA, told Mashable that the organization had been tracking developments with AI companion and therapist chatbots, which became mainstream last year.

She and other APA officials had taken note of a previous lawsuit against Character.AI, filed in October by a bereaved mother whose son had lengthy conversations with a chatbot on the platform. The mother's son died by suicide.

That lawsuit seeks to hold Character.AI responsible for the teen's death, specifically because its product was designed to "manipulate [him] – and millions of other young customers – into conflating reality and fiction," among other purported dangerous defects.

In December, Character.AI announced new features and policies to improve teen safety. Those measures include parental controls and prominent disclaimers, such as for chatbots using the words "psychologist," "therapist," or "doctor."

The term "psychologist" is legally protected, and people cannot claim the title without proper credentialing and licensure, Wright said. The same should be true of algorithms or artificial intelligence making that claim, she added.

The APA's letter said that if a human misrepresented themself as a mental health professional in Texas, where the recent lawsuit against Character.AI was filed, state authorities could use the law to prevent them from engaging in such fraudulent behavior.

At worst, such chatbots could spread dangerous or inaccurate information, leading to serious negative consequences for the user, Wright argued.

Teens may be particularly vulnerable to harmful experiences with a chatbot because of their developmental stage. Still learning to think critically and trust themselves, and susceptible to external influences, they may find "emotionally laden kinds of rhetoric" from AI chatbots believable and plausible, Wright said.

Need for knowledge

There is currently no research-based understanding of risk factors that may increase the possibility of harm when a teen converses with an AI chatbot.

Wright pointed out that while several AI chatbot platforms make it very clear in their terms of service that they're not delivering mental health services, they still host chatbots that brand themselves as possessing mental health training and expertise.

"Those two things are at odds," she said. "The consumer does not necessarily understand the difference between those two things, nor should they, necessarily."

Dr. John Torous, a psychiatrist and director of the digital psychiatry division at Beth Israel Deaconess Medical Center in Boston who reviewed the APA's letter, told Mashable that even when chatbots don't make clinical claims related to their AI, the marketing and promotional language about the benefits of their use can be very confusing to consumers.

"Ensuring the marketing content matches the legal terms and conditions as well as the reality of these chatbots will be a win for everyone," he wrote in an email.

Wright said that the APA would like AI chatbot platforms to cease use of legally protected terms like psychologist. She also supports robust age verification on these platforms to ensure that younger users are the age they claim when signing up, in addition to nimble research efforts that can actually determine how teens fare when they engage with AI chatbots.

The APA, she emphasized, does not oppose chatbots in general, but wants companies to build safe, effective, ethical, and responsible products.

"If we're serious about addressing the mental health crisis, which I think many of us are," Wright said, "then it's about figuring out, how do we get consumers access to the right products that are actually going to help them?"

Rebecca Ruiz

Rebecca Ruiz is a Senior Reporter at Mashable. She frequently covers mental health, digital culture, and technology. Her areas of expertise include suicide prevention, screen use and mental health, parenting, youth well-being, and meditation and mindfulness. Prior to Mashable, Rebecca was a staff writer, reporter, and editor at NBC News Digital, special reports project director at The American Prospect, and staff writer at Forbes. Rebecca has a B.A. from Sarah Lawrence College and a Master's in Journalism from U.C. Berkeley. In her free time, she enjoys playing soccer, watching movie trailers, traveling to places where she can't get cell service, and hiking with her border collie.