Posts: 23 · Comments: 197 · Joined: 2 yr. ago

  • That's an arbitrary decision to make and doesn't really need to be debated

    The study is pretty transparent about what "CSAM" means under its definition, and it even provides pictures. From a science communication point of view, they're in the clear.

  • “treating them the same” => The threshold for being refused entry into mainstream instances is just already crossed at the lolicon level.

    From the perspective of the fediverse, pictures of child rape and lolicon should both just get you thrown out. That doesn’t mean you’re “treating them the same”. You’re just a social network. There's nothing you can do beyond defederating.

  • "treating them the same" => The threshold for being banned is just already crossed at the lolicon level.

    From the perspective of the park, pissing in a pond and fighting a dude both get you thrown out. That doesn't mean you're "treating them the same". You're just the park.

    Do you get it now?

  • Who places the bar for "exclusion from a social network" at felonies? Any kind of child porn has no place on the fediverse, simulated or otherwise. That doesn't mean they're equal offenses; you're just not responsible for anything beyond cleaning your own porch.

  • He invented the stupid take he's fighting against. Nobody equated "ink on paper" with "actual rape against children".

    The bar to cross to be filtered out of the federation isn't rape. Lolicon is already above the threshold, it's embarrassing that he doesn't realize that.

  • CSAM definitions absolutely include illustrated and simulated forms. Just check the sources on the Wikipedia link and climb your way up: you'll see "cartoons, paintings, sculptures, ..." in the wording of the PROTECT Act.

    They don't actually need a victim to be defined as such.

  • It's illegal in a lot of places including where I live.

    In the US you have the PROTECT Act of 2003:

    (a) In General.—Any person who, in a circumstance described in subsection (d), knowingly produces, distributes, receives, or possesses with intent to distribute, a visual depiction of any kind, including a drawing, cartoon, sculpture, or painting, that—
    (1) (A) depicts a minor engaging in sexually explicit conduct; and (B) is obscene; or
    (2) (A) depicts an image that is, or appears to be, of a minor engaging in graphic bestiality, sadistic or masochistic abuse, or sexual intercourse, including genital-genital, oral-genital, anal-genital, or oral-anal, whether between persons of the same or opposite sex; and (B) lacks serious literary, artistic, political, or scientific value;
    or attempts or conspires to do so, shall be subject to the penalties provided in section 2252A(b)(1), including the penalties provided for cases involving a prior conviction.

    It's linked to the obscenity doctrine:

    https://www.law.cornell.edu/uscode/text/18/1466A

  • Okay, the former then.

    Let's just think about it: how do you think it would turn out if you went outside and asked anyone about pornographic drawings of children? How long until you found someone who thinks like you, outside your internet bubble?

    "Nobody wants this stuff that whole servers..."

    There are also servers dedicated to real child porn with real children. Do you think that argument has any value with that tidbit of information tacked onto it?