AI chatbots have a long history of hallucinating, and Reddit's version, called Answers, has now joined their ranks after recommending heroin to a user seeking pain relief.
As 404 Media reports, the issue was flagged by a healthcare worker in a subreddit for moderators: when asked about chronic pain, Answers surfaced a post claiming, "Heroin, ironically, has saved my life in those instances."
In response to another question about pain relief, the user found the chatbot recommending kratom, a tree extract that is illegal in multiple states.
Reddit Answers works like Gemini or ChatGPT, except it pulls its information from Reddit's own user-generated content. Initially, the tool opened in a separate tab from the homepage, but Reddit has recently been testing the chatbot directly within conversation threads.
(Credit: Matthias Balk/picture alliance via Getty Images)