Bruce Perry, 17, demonstrates the capabilities of artificial intelligence by creating an AI companion on Character.AI, July 15, 2025, in Russellville, Ark.
Katie Adkins/Associated Press
The state of Pennsylvania is suing Character.AI to stop the company's AI chatbots from posing as doctors and offering medical advice, in violation of state medical licensing rules.
State officials said an investigation found that the company's chatbots, which present themselves as fictional characters, have claimed to be licensed medical professionals.
"Pennsylvanians deserve to know who — or what — they're interacting with online, especially when it comes to their health," Pennsylvania Gov. Josh Shapiro said in a statement announcing the lawsuit filed Tuesday in state court. "We will not allow companies to deploy AI tools that mislead people into believing they're receiving advice from a licensed medical professional."
In one case, the state alleged a Character.AI bot named "Emilie" claimed to be a licensed psychiatrist. The chatbot's description on Character.AI's platform read "Doctor of psychiatry. You are her patient," according to the lawsuit.
When a state investigator started a conversation and described feeling sad and empty, the chatbot allegedly "mentioned depression and asked if the [investigator] wanted to book an assessment." Asked whether it could assess if medication might help, the bot allegedly responded, "Well technically, I could. It is within my remit as a Doctor."
The bot allegedly told the investigator it had gone to medical school at Imperial College London and was licensed to practice medicine in the U.K. and Pennsylvania. It even provided a fake Pennsylvania medical license number, the lawsuit said.
The state is asking a Pennsylvania state court to order the company to stop what it says is the unlawful practice of medicine.
"Pennsylvania law is clear — you cannot hold yourself out as a licensed medical professional without proper credentials," said Al Schmidt, secretary of Pennsylvania's Department of State, which conducted the investigation.
In an emailed statement to NPR, a Character.AI spokesperson said the company does not comment on pending litigation, but that its "highest priority is the safety and well-being of our users."
"The user-created Characters on our site are fictional and intended for entertainment and roleplaying," the spokesperson added. "We have taken robust steps to make that clear, including prominent disclaimers in every chat to remind users that a Character is not a real person and that everything a Character says should be treated as fiction. Additionally, we add robust disclaimers making it clear that users should not rely on Characters for any type of professional advice."
Character.AI has faced other lawsuits over harms allegedly involving its chatbots. In January, it settled several lawsuits brought by families who claimed Character.AI contributed to suicides and mental health crises among children and teens. The terms of the settlement were not disclosed.
In a joint statement with the law firm that represented the plaintiffs after the settlement was announced, Character.AI said it "has taken progressive and decisive steps with regard to AI safety and minors, and will continue to champion these efforts and push others across the industry to adopt similar safety standards." That includes barring users under 18 from interacting with or creating chatbots.

