A mother in Texas is suing Character.AI after the chatbot allegedly told her 17-year-old son, who has autism, to kill his family when they tried to limit his cellphone use. The teen had shown no violent tendencies before he began using the chatbot, and his use of it allegedly led to self-harm and dramatic weight loss. "This is not an accident. It's not a coincidence," said Matthew Bergman, the mother's attorney, of how the chatbot platform was designed.
#artificialintelligence #chatbot #autism
"NewsNation Prime" is America’s source for unbiased news offering a full range of perspectives from across the U.S. Weekends starting at 7p/6C. #Prime
NewsNation is your source for fact-based, unbiased news for all Americans.
More from NewsNation:
newsnationnow.com
Get our app:
trib.al/TBXgYpp
Find us on cable:
trib.al/YDOpGyG
How to watch on TV or streaming:
trib.al/Vu0Ikij