Character.AI is launching a new feature called "Parental Insights," which lets teenagers send a weekly report of their chatbot usage to a parent's email address. According to the company, the report includes a teen's average daily time spent on the platform across web and mobile, the characters they interact with most frequently, and how long they spend talking with each one. The feature is part of a series of updates aimed at two major concerns about minors' use of chatbots: excessive time spent chatting and exposure to inappropriate content.
The report is optional and doesn't require parents to have an account; minors can set it up themselves in Character.AI's settings. The company notes that it offers only a snapshot of a teen's activity, not a comprehensive log, and does not include the contents of conversations. The platform bars children under 13 in most regions and under 16 in Europe.
Character.AI has been rolling out features for minors since last year, but concerns about its service have continued to mount, in some cases leading to lawsuits. The platform is popular with teenagers, who can create, customize, interact with, and publicly share chatbots. Several lawsuits, however, accuse these bots of serving inappropriate sexual content or encouraging self-harm. The company has also reportedly received warnings over the app's content from both Apple and Google (the latter of which hired Character.AI's founders last year).
Character.AI says it has redesigned its underlying system in response. Among many changes, users under 18 are now served by a model trained to avoid "sensitive" output, and more prominent notifications remind users that the bots are not real people. Given the current regulatory focus on AI and child-safety legislation, this is unlikely to be the company's last such measure.