
  • I've been using similar, but slightly different ones that I assume do the same thing. You can also copy the filter and just replace the filter word rather than including them all in one. Naturally, the instance name must also be changed to your own.

    lemmy.world##div.post-listing:has(span:has-text(/trump/i))

    And for filtering comments:

    lemmy.world##article.comment-node:has(div.comment-content:has(p:has-text(/trump/i)))
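
    If you'd rather keep everything in one rule instead, regex alternation should also work (a sketch using the same uBlock Origin :has-text() syntax as above; the second keyword is just a placeholder):

    lemmy.world##div.post-listing:has(span:has-text(/trump|anotherword/i))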

  • Compound interest. I don't think most people realise how powerful its effect is. Anyone can take advantage of it, but there's no getting around the fact that the more money you have, the easier it is to make even more, which in turn makes it easier still, and it just keeps accelerating.
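
    As a rough illustration (a sketch with assumed numbers: a 10,000 starting balance, a 7% annual return, no further deposits):

    # compound growth: each year's interest is earned on all previous interest too
    balance = 10_000
    for year in range(1, 31):
        balance *= 1.07
        if year % 10 == 0:
            print(f"year {year}: {balance:,.0f}")
    # year 10: ~19,672   year 20: ~38,697   year 30: ~76,123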

  • For me, this was at no point about the morality of it. I've been strictly talking about the definition of terms. While laws often prohibit both CSAM and depictions of it, there's still a difference between the two. CSAM is effectively synonymous with "evidence of a crime". If it's AI generated, photoshopped, drawn or whatever, then there has been no crime and thus the content doesn't count as evidence of it. Abuse material literally means what it says: it's video/audio/picture content of the event itself. It's illegal because producing it without harming children is impossible.

    EDIT: It's kind of the same as calling AI-generated pictures photographs. They're not photographs. Photographs are taken with a camera. Even if the picture an AI generates is indistinguishable from a photograph, it still doesn't count as one because no camera was involved.

  • I already told you that I'm not speaking from a legal point of view. CSAM means a specific thing, and AI-generated content doesn't fit under that definition. The only way to produce CSAM is by abusing children and taking pictures/videos of it. AI content doesn't count any more than stick-figure drawings do. The justice system may not differentiate between the two, but that is not what I'm talking about.

  • Being legally considered CSAM and actually being CSAM are two different things. I stand behind what I said, which wasn't legal advice. By definition it's not abuse material, because nobody has been abused.

  • First of all, it's by definition not CSAM if it's AI generated. It's simulated CSAM: nobody was harmed in making it. That harm happened when the training data was created.

    However, it's not necessary that such content even exists in the training data. Just like ChatGPT can generate sentences it has never seen before, image generators can generate pictures they have never seen before. Of course the results will be more accurate if that's what they've been trained on, but it's not strictly necessary. It just takes a skilled person to write the prompt.

    My understanding is that the simulated CSAM content you're talking about has been made by people running the software locally and providing the training data themselves.

  • I don't, and I don't plan to. It's too soon to tell, however, whether I'll regret that or not. Time will tell.

    I can see the appeal of having kids, but my current lifestyle is that I do what I want whenever I want, and I don't really plan things ahead. I don't want to take the risk of having kids and then having to dramatically change my lifestyle, only to realize it's not what I wanted and that I can no longer go back. I think that to have kids you have to want them. Right now I just feel like it's something that's expected of me, and I don't think that's a good reason to go ahead with it.

  • I wouldn't say that I don't see the appeal of it. I would probably get sucked right in if I gave it a shot. It's a conscious decision on my part simply not to. I don't avoid short-form media because I'm better than the people who consume it; I just prohibit myself from it.