October 22, 2025

AI ‘companions’ pose risks to student mental health. What can schools do?

Children’s media safety and mental health organizations strongly advise anyone under the age of 18 to stay away from popular artificial intelligence companions: social chatbots programmed with human-like features to build relationships with their users.

Still, 72% of teens reported using AI companions at least once, according to a July survey by Common Sense Media, a research nonprofit that advocates for children’s online safety. More than half of teens also said they interacted with these platforms at least a few times a month.

Additionally, 1 in 3 teens said they’ve used AI companions “for social interaction and relationships, including role-playing, romantic interactions, emotional support, friendship, or conversation practice,” Common Sense Media found. 

But parents and researchers are sounding the alarm that AI companions pose serious risks to children and teens, including the potential to intensify mental health conditions such as depression, anxiety disorders, ADHD and bipolar disorder.

Megan Garcia testified last week about the suicide of her 14-year-old son, Sewell Setzer III, at a U.S. Senate Judiciary Subcommittee on Crime and Counterterrorism hearing on the harms of AI chatbots. Setzer, she said, “spent his last months being manipulated and sexually groomed by chatbots designed by an AI company to seem human, to gain trust, and to keep children like him endlessly engaged by supplanting the actual human relationships in his life.”

Last fall, Garcia said, she became the first person in the U.S. to file a wrongful death lawsuit against an AI company over her son’s death. Her suit against Character Technologies, the company behind the AI companion tool Character.AI that her son used, is still pending in the U.S. District Court for the Middle District of Florida, Orlando Division. Other defendants in the case include the company’s founders and Google, which holds licensing rights for Character.AI.

Meanwhile, the Federal Trade Commission announced in September that it is seeking information from seven tech companies about how their AI companion tools “measure, test and monitor potentially negative impacts of this technology on children and teens.” Companies involved in the FTC’s probe include Character Technologies, OpenAI, X and Meta.

Notably, OpenAI also announced it will begin implementing guardrails this month to better protect teen ChatGPT users, including new parental controls.

What can schools do?

The Jed Foundation, a youth suicide prevention nonprofit that also warns against minors’ use of AI companions, recently penned an open letter calling on the AI and technology industry to “prioritize safety, privacy, and evidence-informed crisis intervention” for children and teens using these tools.

AI companions are “a serious issue,” said Laura Erickson-Schroth, chief medical officer at The Jed Foundation. “It’s really providing this kind of emotional support that isn’t coming from a human being, and it’s also providing incorrect guidance, frequently, to young people, giving them misinformation.”

As K-12 leaders navigate the prevalence of AI companions among their students, Erickson-Schroth recommends that they first develop a districtwide AI strategy in partnership with parents, students and community members. That strategy should include conversations about how certain AI tools may help or misinform users in schools, and it should address concerns around student data privacy and security, she said.

When it comes to the use of AI-based mental health tools in schools, Erickson-Schroth stressed that the technology “should always augment and not replace the caring adults in a young person’s life.” 

Some AI tools show promise for supporting student mental health, she said, such as personalized apps for meditation, mood tracking or gamified experiences that promote self-care. Others can even supplement therapy, helping young people develop cognitive behavioral therapy skills.
