
  • There are a number of blog posts covering different details of the how and why; I just followed the links in the article to the other parts of the series.

    I expect the use case is more prevalent than you think once you are spending a decent chunk on cloud infra. I have been convinced for some time now that the costs are high compared to our on-prem. I really like the idea of a "Deft"-style hardware management service: they look after the DCs, hardware, and connectivity, and we look after the software.

  • You mean the ML model?

    I don't think it is too bad, as it is more like looking for a description that has children and a sexual context. This can be trained without CSAM, because the model generalises from situations it has seen before - a pornographic picture (sexual context) and kids playing at a playground (children in the scene).

  • There is a tool that someone built specifically to scan images uploaded to Lemmy for CSAM.

    It is really quite clever. The image is put through an ML/AI model, which describes it (image-to-text), then the text is reviewed against a set of rules to see if it has the hallmarks of CSAM. If it does, the image is deleted.

    This is fully self-hosted.

    What I like is that it avoids the trauma of a person having to see that sort of thing.
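
    In rough terms, the "describe, then filter" pipeline looks something like the sketch below. This is a minimal illustration of the idea, not the actual tool: the captioning model and the keyword rule lists are my own assumptions.

    ```python
    # Sketch of the describe-then-filter approach: caption the image with an
    # image-to-text model, then apply text rules to the caption.
    # The model name and rule lists are illustrative assumptions.
    from transformers import pipeline

    # Image-to-text model: turns an uploaded image into a text description.
    captioner = pipeline("image-to-text", model="Salesforce/blip-image-captioning-base")

    # Flag only when the description combines BOTH signals: a minor present
    # AND a sexual context (hypothetical rule lists).
    MINOR_TERMS = {"child", "children", "kid", "kids", "boy", "girl", "toddler"}
    SEXUAL_TERMS = {"nude", "naked", "sexual", "explicit", "pornographic"}

    def should_delete(image_path: str) -> bool:
        """Return True if the image's caption matches both rule sets."""
        caption = captioner(image_path)[0]["generated_text"].lower()
        words = set(caption.split())
        return bool(words & MINOR_TERMS) and bool(words & SEXUAL_TERMS)
    ```

    The nice property of this split is that the image model only ever emits text, so moderators can review and tune the rules, and audit flagged captions, without anyone having to look at the images themselves.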