Mom horrified by Character.AI chatbots posing as son who died by suicide
This is referring to a bot designed to help people struggling with mental health, and it's actually a big one. That number is way too low.
“hey, I know you feel like killing yourself, but if it happens then we’ll just replace you with a shitty bot” probably isn’t as helpful as they thought it would be. It’s violating and ghoulish.
I hate this attitude of "well, if you can't get a professional therapist, figure out how to get one anyway." There needs to be an option for people who either can't afford or can't access a therapist. I would have loved for AI to fill that gap. I understand it won't be as good, but in many regions the wait-list for therapy is far too long, and something is better than nothing.