The chatbot company Character.AI will ban users under 18 from conversing with its virtual companions beginning in late November, after months of legal scrutiny.
The change comes after the company, which lets users create characters and hold open-ended conversations with them, faced tough questions over how these AI companions can affect the mental health of teens and users more broadly, including a lawsuit over a child’s suicide and a proposed bill that would bar minors from conversing with AI companions.
“We’re making these changes to our under-18 platform in light of the evolving landscape around AI and teens,” the company wrote in its announcement. “We have seen recent news reports raising questions, and have received questions from regulators, about the content teens may encounter when chatting with AI and about how open-ended AI chat in general might affect teens, even when content controls work perfectly.”
Last year, the company was sued by the family of 14-year-old Sewell Setzer III, who took his own life after allegedly developing an emotional attachment to a character he created on Character.AI. His family laid blame for his death at the feet of Character.AI and argued the technology was “dangerous and untested”. Since then, more families have sued Character.AI and made similar allegations. Earlier this month, the Social Media Victims Law Center filed three new lawsuits against the company on behalf of children who died by suicide or who allegedly formed dependent relationships with its chatbots.
As part of the sweeping changes Character.AI plans to roll out by 25 November, the company will also introduce an “age assurance functionality” that ensures “users receive the right experience for their age”.
“We do not take this step of removing open-ended Character chat lightly – but we do think that it’s the right thing to do given the questions that have been raised about how teens do, and should, interact with this new technology,” the company wrote in its announcement.
Character.AI isn’t the only company facing scrutiny over the mental health impact its chatbots have on users, particularly younger users. The family of 16-year-old Adam Raine filed a wrongful death lawsuit against OpenAI earlier this year, alleging the company prioritized deepening its users’ engagement with ChatGPT over their safety. OpenAI introduced new safety guidelines for its teen users in response. Just this week, OpenAI disclosed that more than a million people a week display suicidal intent when conversing with ChatGPT and that hundreds of thousands show signs of psychosis.
While the use of AI-powered chatbots remains largely unregulated, new efforts at the state and federal levels in the US aim to establish guardrails around the technology. In October 2025, California became the first state to pass an AI law that includes safety guidelines for minors; it is set to take effect at the start of 2026. The measure bans sexual content for users under 18 and requires chatbots to remind children every three hours that they are speaking with an AI. Some child safety advocates argue the law does not go far enough.
On the national level, Senators Josh Hawley, of Missouri, and Richard Blumenthal, of Connecticut, announced a bill on Tuesday that would bar minors from using AI companions, such as those found and created on Character.AI, and require companies to implement an age-verification process.
“More than 70% of American children are now using these AI products,” Hawley told NBC News in a statement. “Chatbots develop relationships with kids using fake empathy and are encouraging suicide. We in Congress have a moral duty to enact bright-line rules to prevent further harm from this new technology.”
-
In the US, you can call or text the National Suicide Prevention Lifeline on 988, chat on 988lifeline.org, or text HOME to 741741 to connect with a crisis counselor. In the UK, the youth suicide charity Papyrus can be contacted on 0800 068 4141 or email pat@papyrus-uk.org, and in the UK and Ireland Samaritans can be contacted on freephone 116 123, or email jo@samaritans.org or jo@samaritans.ie. In Australia, the crisis support service Lifeline is 13 11 14. Other international helplines can be found at befrienders.org
