Isn’t it a good thing for pedophiles to have an outlet for their desires that doesn’t involve harming children? Am I not seeing an obvious downside?
Pedophilia is not a sexuality and CSAM, AI generated or not, is not a healthy outlet. Pedophilia should be treated as a disease, and pedophiles should receive treatment for that instead.
AFAIK you can't "cure" pedophilia, any more than you can cure homosexuality. The best you can do is teach people not to act on their desires.
> pedophiles should receive treatment for that instead
In a world where many people cannot afford basic healthcare or – if they can afford it – where healthcare isn’t available in the required quantity, does your argument still hold?
If I'm not mistaken, I remember reading that consuming CSAM increases the likelihood of offending, since it normalizes the act and makes the fantasies more vivid. It makes them want to act out what they see rather than removing the desire.
Based on this article, it seems that teens were using an app: https://www.msn.com/en-us/money/other/ai-generated-child-sexual-abuse-images-could-flood-the-internet-a-watchdog-is-calling-for-action/ar-AA1iMZj5
Is that your reference?
And what happens when they start making requests of real underage people?
That’s the whole point of my argument. They don’t need to make requests for real people if they can get fake ones of equal quality. Your argument reads like “We can’t let people have meat. What if they start eating live cows?”
It’s still fake. But if it looks like a real person, what difference does the distinction make?
I'm pretty sure there is quite a difference between an actual human being abused and a victimless depiction of such an act. Not unlike watching a violent movie. Such people obviously still need help and treatment, but to me it seems vastly better than the alternative.
It depends on whether you hold a world view where every person is valuable and needs help and understanding to become their best self, or one where there are good and bad people and the baddies need to be punished and locked away so everyone else can live their lives in peace.
Don’t AI models need to be trained on the material they are trying to emulate?
No, not at all.
That’s why people like them: you can say “make me a photo of a monkey riding a pickle in space” or “a dog made of cheese” and it’ll produce it, despite obviously having no reference for that combination.
It only needs to be trained to know what the individual things are; it can mix them freely.
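For illustration, here's a minimal sketch of that compositionality, assuming the Hugging Face diffusers library and a public Stable Diffusion checkpoint (neither is named in this thread; the checkpoint is just an example):

```python
# A minimal sketch, assuming the Hugging Face `diffusers` library and a
# public Stable Diffusion checkpoint. The point: the only input is a text
# prompt; no reference image of the requested scene is needed, because the
# model composes concepts it learned separately during training.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # example checkpoint, not specified in the thread
    torch_dtype=torch.float16,
).to("cuda")

# "A dog made of cheese" almost certainly never appears in the training
# set; "dog" and "cheese" do, and the model mixes the two concepts.
image = pipe("a photo of a dog made of cheese").images[0]
image.save("cheese_dog.png")
```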
On one hand, yes, but on the other hand, Stable Horde developed a model to detect CSAM thanks to Stable Diffusion, and it's now being used to combat pedos globally.
What’s interesting is that mammals from mice to dogs don’t draw a distinction between arbitrary ages before trying to copulate. On the other hand, they don’t try to fuck the equivalent of pre-pubescent members of their species either; there’s nothing natural about that.