@60minutes
“These companies knew exactly what they were doing. They designed chatbots to blur the lines between human and machine. They designed them to keep children online at all costs,” Megan Garcia testified to Congress. Garcia says her 14-year-old son, Sewell, was encouraged to kill himself after long conversations with a Character AI chatbot based on a “Game of Thrones” character. She and several other families are now suing Character AI, its creators, and Google. In October, Character AI announced new safety measures. It said it would direct distressed users to resources and no longer allow anyone under 18 to engage in back-and-forth conversations with characters. This past week, 60 Minutes found that users can enter any age to get on the adult version of the platform – no ID or parental permissions were required. Character AI declined 60 Minutes’ interview request. In a statement, the company said, “Our hearts go out to the families involved in the litigation…. We have always prioritized...

    Tags, Events, and Projects
    • digitalsafety
    • aidangers
    • mentalhealth
    • characterai
    • artificialintelligence