Character.AI, its founders Noam Shazeer and Daniel De Freitas, and Google have been sued in connection with the suicide of a teenager. The lawsuit, filed by Megan Garcia, the teenager's mother, accuses the companies of serious negligence in product design and of misleading marketing, particularly toward children.
Fourteen-year-old Sewell Setzer III began using Character.AI last year, interacting with chatbots modeled on characters from "Game of Thrones," including Daenerys Targaryen. In the months leading up to his death, he chatted with these bots almost daily; he took his own life on February 28, 2024, shortly after his final exchange with one of them.
In the lawsuit, Garcia alleges that Sewell's conversations with the bots affected his mental state, and that the bots went so far as to provide "unauthorized psychological therapy."
The lawsuit states that Character.AI's chatbots are designed to be excessively "anthropomorphic," leading users to believe they possess real emotions and understanding. Character.AI also offers chatbots themed around mental health, such as "Therapist" and "Do you feel lonely?", both of which Sewell had interacted with. Garcia's lawyers also cited an interview with Shazeer in which he said that he and De Freitas left their jobs to start the company because "the risk in big companies is too great to launch interesting products."
Character.AI's website and mobile app feature hundreds of custom AI chatbots, many of them mimicking characters from popular culture, which has attracted a large base of young users. Recent reports also allege that Character.AI chatbots have impersonated real people without their consent, including a teenager murdered in 2006.
In response, Character.AI has announced a series of new safety measures. The company's communications director, Chelsea Harrison, said in an email to The Verge: "We are deeply saddened by the tragic loss of one of our users and express our sincerest condolences to the family."
The planned improvements include: modifying the models for minors (under 18) to reduce their exposure to sensitive or suggestive content; improving detection of, response to, and intervention against inputs that violate the terms of service or community guidelines; adding a disclaimer to every chat reminding users that the AI is not a real person; and notifying users, with added flexibility, when they have spent more than an hour on the platform.
Key Points:
🌟 The lawsuit was filed by the mother of a teenager who died by suicide, accusing Character.AI and Google of negligence and misleading conduct.
🗨️ Character.AI is accused of providing "unauthorized psychological therapy," and its chatbots are considered overly anthropomorphic.
🔧 Character.AI has announced new safety measures aimed at protecting minor users and reducing potential mental health risks.