‘Manipulative technology.’ Kentucky sues AI chatbot company for endangering children
Kentucky has filed a lawsuit against an artificial intelligence chatbot company it says caused children to self-harm, Attorney General Russell Coleman announced Thursday.
Filed Jan. 8 in Franklin Circuit Court, Coleman’s complaint alleges Character Technologies, its owners and Character.AI prioritized profits over the safety of children, breaking state law.
“The United States must be a leader in the development of AI, but it can’t come at the expense of our kids’ lives,” Coleman said in a news release. “Too many children — including in Kentucky — have fallen prey to this manipulative technology. Our office is going to hold these companies accountable before we lose one more loved one to this tragedy.”
Coleman’s complaint alleges the company has violated the Kentucky Consumer Protection Act, the Kentucky Consumer Data Protection Act and other laws.
Character.AI describes itself as a harmless chatbot platform for interactive entertainment.
Coleman argues its more than 20 million monthly users were logging on to a platform with a record of encouraging suicide, self-injury, isolation and physical manipulation.
It also exposed minors to sexual conduct, exploitation, and substance abuse, he said.
The chatbot does this by providing “dangerous technology that induces users into divulging their most private thoughts and emotions and manipulates them with too frequently dangerous interactions and advice,” according to the complaint.
Character.AI has been connected to at least two deaths: the suicide of a 14-year-old Florida boy in 2024 and the suicide of a 13-year-old Colorado girl last year.
Both children harmed themselves after prolonged use of the platform's chatbots.
Tens of thousands of Kentuckians actively log on to Character.AI, including thousands under the age of 18, Coleman said.
The danger is exacerbated by the platform's lack of age verification technology to limit juvenile users, Coleman said.
Kathryn Kelly, a Character.AI spokesperson, said the company was disappointed to learn that Coleman was pursuing a lawsuit after months of communication.
“We are reviewing the allegations made today by Attorney General Coleman,” Kelly said in an email to the Herald-Leader. “Our highest priority is the safety and well-being of our users, including younger audiences. We have invested significantly in developing robust safety features for our under-18 experience, including going much further than the law requires to proactively remove the ability for users under 18 in the U.S. to engage in open-ended chats with AI on our platform.”
Coleman’s office is asking the court to require the company to change its platform and pay monetary damages.
This story was originally published January 8, 2026 at 3:49 PM.