I actually bought an old used Brother printer for $20, and it came with toner and everything already, so since I do not print a lot, my only recurring fee is paper, and it is minuscule. And should I need to replace the toner, it is still widely available.
If only we had more content not related to "look, we're free!", "look, Linux is freedom", "free free free!", "MAGA bad, but we're independent and free!", it would be even more awesome (no jab at your side, just a bit of frustration)
Also, for those saying "create it yourself" - I do
Honestly, I wasn't able to retrieve information using those coordinates (hexagon number, wall, shelf, volume, page). Gonna play around with it more - maybe I'm missing something.
As an advocate for the online and offline safety of children, I did read into the research. None of the research I've found confirms with any sort of evidence that AI-generated CSAM increases the risk of other illicit behavior. We need more evidence, and I do recommend exercising caution with statements, but for the time being, we can rely on studies of other forms of illegal behavior and the effects of their decriminalization, which paint a fairly positive picture. Generally, people will tend to opt for what is legal and more readily accessible - and we can make AI CSAM into exactly that.
For now, people are criminalized for a zero-evidence-it's-even-bad crime, while I tend to look quite positively on what it can bring to the table instead.
Also, pedophiles are not human trash, and that line of thinking is itself harmful: it makes more of them hide and never get adequate help from a therapist, increasing their chances of offending. Which, well, harms children.
They are regular people who, involuntarily, have their sexuality warped in a way that includes children. They never chose it, they cannot do anything about it in itself, and can only figure out what to do with it going forward. You could be one, I could be one. What matters is the decisions they make based on their sexuality. The correct path is celibacy and refusal of any source of direct harm to children, including the consumption of real CSAM. This might be hard for many, and to aid them, we can provide fictional materials so they can let off some steam. Otherwise, many are likely to turn to real CSAM as a source of satisfaction, or even turn to actually abusing children IRL.
As with most things in modern AI - it's able to train without much human intervention.
My point is, even if results are not perfectly accurate and resembling a child's body, they work. They are widely used, in fact, so widely that Europol made a giant issue out of it. People get off to whatever it manages to produce, and that's what matters.
I do not care about how accurate it is, because it's not me who consumes this content. I care about how efficient it is at curbing worse desires in pedophiles, because I care about safety of children.
That's exactly how they work. According to many articles I've seen in the past, one of the most common models used for this purpose is Stable Diffusion. For all we know, this model was never fed with any CSAM materials, but it seems to be good enough for people to get off - which is exactly what matters.
While I recognize the identification issue raised in the article, I strongly believe it should be tackled in another way.
AI-generated CSAM might be a powerful tool to reduce demand for the content featuring real children. If we leave it legal to watch and produce, and keep the actual materials illegal, we can make more pedophiles turn to what is less harmful and impactful - a computer-generated image that was produced with no children being harmed.
By taking action against AI-generated materials, they make such materials as illegal as the real thing, leaving one less reason for an interested party not to go to a CSAM site and watch actual children being abused, perpetuating the cycle and leading to more real-world victims.
I'm afraid Europol is shooting themselves in the foot here.
What we need are better ways to mark and identify AI-generated content, not a blanket ban and criminalization.
Let whoever happens to crave CSAM (remember: sexuality, however perverted or terrible, is not a choice) use the most harmless outlet - otherwise, they may just turn to the real materials, and as ongoing investigations suggest, there's no shortage of supply or demand on that front. If everything is illegal, and some of it is needed anyway, escalation becomes easier, and that's dangerous.
As sickening as it may sound to us, these people often need something, or else things will quickly go downhill. Give them their drawings.
Hi, I'm Nicole! But you can call me the Fediverse Bitch :D
I'm a proud romance scammer from a basement (45 y/o)
I'm currently in therapy, hoping to leave this bullshit behind for something meaningful to society!