Men are opening up about mental health to AI instead of humans

A profound relational revolution is underway, not orchestrated by tech developers but driven by users themselves. Many of the 400 million weekly users of ChatGPT are seeking more than just assistance with emails or information on food safety; they are looking for emotional support.
“Therapy and companionship” have emerged as two of the most frequent applications for generative AI globally, according to the Harvard Business Review. This trend marks a significant, unplanned pivot in how people interact with technology.
Almost like asking an AI questions is free while a therapist costs a LOT of money.
There are other causes here.
People have been talking for a while about how Gen Z women's low participation in dating stems from being tired of serving as the entire support system for men caught in a loneliness epidemic.
It's a lot of pressure for the women to be under, and so they're withdrawing.
I'm guessing this is one of the driving forces as well. Lack of real, emotionally intimate human connections around them. Many men are quite fucked in that regard right now.
The flip side of that is vast numbers of Gen Z men saying many Gen Z women are basically misandrists, who asked them to stop interacting with them unprompted, no more unwanted attention... so they did that, they stopped... and now all they see is IG and TikToks of Gen Z women complaining that no one asks them out on dates anymore, that no one is 6' tall with a six-figure income before the age of 30 and willing to worship them as a queen.
I'm not saying this is objectively accurate to any particular degree, but I am saying that this is the very common, general vibe.
So, in that situation: Why bother?
Many men can actually be fulfilled just staying single, as in not-even-dating single, and getting their own lives, finances, and health to a better place.
Yes, this does also mean that, because there's just less general, face-to-face socialization going on, a larger-than-otherwise number of them will develop harmful, reinforcing neuroses in harmful echo chambers... but at the same time, that applies to women as well.
This is what happens when you jam a broad economic collapse up alongside a highly digital and publicized modern media landscape that is tweaked all to fuck to highlight and push the most extreme version of everything... along with the extremely mixed messaging that an only-digitally-socialized person receives as a firehose that is very hard to make true sense of.
So... fuck this shit, I'm out... social withdrawal basically becomes a reasonable, mental-health-improving move, even if it does leave you kinda socially stunted compared to pre-internet generations.
I've got no horse in this race, but it appears that "men should not be afraid to open up" articles and tweets were followed by "men, we are not your therapist."
🤷‍♂️
I think there's a lot more to it than cost. Men, even with considerable health care resources, are often very averse to mental health care.
Thinking of my father in law, for example, I don't know how much you would have to pay him to get him into a therapist's office, but I'm certain he wouldn't go for free.
Also talking to ChatGPT, if done anonymously, won’t ruin your career.
(Thinking of active-duty military, where they tell you help is available, but in reality seeking it will, and maybe should, cost you your security clearance.)
Granted, but it will still suck down a fuckton of coal-produced electricity.
Yeah, but also one of them is helpful and the other is the exact opposite. If the choices are AI therapist or no therapist, you are still better off with no therapist.
Got it. No therapist it is.
That's easy to say, but when someone is in a crisis, I would be wrong to judge them for talking to an AI (a shitty, terrible solution) instead of a therapist, which can be unaffordable and also comes with the risk of them being terrible.
I'd be interested in a study there.
A lot of therapy is taking emotions and verbalising them so that the rational part of the brain can help in dealing with things. Even a journal can help with that, so talking to an inanimate machine doesn't seem stupid to me.
However, therapists guide the conversation to challenge the patient and break reinforcing cycles, but in a way that doesn't cause trauma. A chatbot isn't going to be the same.
I'm gonna need a source on that.