Posts 0 · Comments 294 · Joined 1 yr. ago

  • There is a very simple explanation for this specific case: nobody on hackertalks.com is subscribed to !goodoffmychest@lemmy.world!

    Most community-related activities on Lemmy will only be sent to instances that have at least one subscriber for the community.

  • other instances will need to have at least one subscriber to the community before votes and new content are sent to them (see the sketch below for one way a subscription can be triggered via the API).
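
    A minimal sketch of triggering that first subscription through the Lemmy HTTP API, assuming the standard v3 endpoints (/api/v3/user/login, /api/v3/resolve_object, /api/v3/community/follow) and 0.19-style bearer auth. The instance URL, credentials, and community name are placeholders; the flow is: log in, resolve the remote community so the local instance learns about it, then follow it so activities start being delivered.

    ```python
    import requests

    INSTANCE = "https://hackertalks.com"           # placeholder home instance
    COMMUNITY = "!goodoffmychest@lemmy.world"      # remote community to subscribe to

    # Log in to get a JWT (credentials are placeholders).
    jwt = requests.post(f"{INSTANCE}/api/v3/user/login", json={
        "username_or_email": "alice",
        "password": "correct horse battery staple",
    }).json()["jwt"]
    auth = {"Authorization": f"Bearer {jwt}"}

    # Resolve the remote community so the local instance fetches and stores it.
    # (Response shape assumed: a CommunityView under the "community" key.)
    resolved = requests.get(f"{INSTANCE}/api/v3/resolve_object",
                            params={"q": COMMUNITY}, headers=auth).json()
    community_id = resolved["community"]["community"]["id"]

    # Follow it; with at least one local subscriber, new posts, comments and
    # votes for the community should start federating to this instance.
    requests.post(f"{INSTANCE}/api/v3/community/follow", headers=auth,
                  json={"community_id": community_id, "follow": True})
    ```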

  • This is addressed in the upcoming Lemmy release 0.19.4, where deleted content will no longer be included in API responses. Until then it's up to clients to actually hide it. Content is kept for a few days to allow you to undo a deletion, but you can also edit your content before deleting it to remove the original text (see the sketch below). There is also a scheduled task, running once a week I believe, that replaces the contents of deleted comments with something like PERMANENTLY DELETED.
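
    A minimal sketch of the "edit before deleting" approach via the Lemmy HTTP API, assuming the standard v3 endpoints for editing and deleting a comment; the instance, token, and comment id are placeholders.

    ```python
    import requests

    INSTANCE = "https://lemmy.world"    # placeholder instance
    JWT = "your-login-token"            # placeholder login token
    COMMENT_ID = 12345                  # placeholder comment id
    auth = {"Authorization": f"Bearer {JWT}"}

    # 1. Overwrite the text first, so the stored content no longer holds the original.
    requests.put(f"{INSTANCE}/api/v3/comment", headers=auth,
                 json={"comment_id": COMMENT_ID, "content": "[removed by author]"})

    # 2. Then delete it; the copy retained for undoing is now the overwritten text.
    requests.post(f"{INSTANCE}/api/v3/comment/delete", headers=auth,
                  json={"comment_id": COMMENT_ID, "deleted": True})
    ```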

    Regardless, as Lemmy is a public platform, you should be aware that people on linked platforms may be storing this information and may not respect the edits/deletions at all.

  • ah, annoying that that seems to be happening accidentally so easily :/

  • Hi,

    is this consistently happening with the same posts?
    Do you see comments when you try the same post again later?
    You mentioned this happens with both the default front end and Alexandrite; does it happen with the same posts on both of them?
    Can you reproduce this in a private browser window?
    If you can, would you mind sharing a post that this is happening with so we can take a look?
    It would also be useful to see the associated error message.

  • the reporting endpoint is the same; it doesn't matter where you report from (a short example is below).
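
    For illustration, a minimal sketch of that single report endpoint, assuming the v3 HTTP API's /api/v3/post/report; the report is always created through your own instance, regardless of where the reported content lives. Instance, token, and post id are placeholders.

    ```python
    import requests

    INSTANCE = "https://lemmy.world"              # your home instance (placeholder)
    auth = {"Authorization": "Bearer your-login-token"}  # placeholder token

    # The same endpoint is used no matter which community or remote instance
    # the reported post belongs to.
    requests.post(f"{INSTANCE}/api/v3/post/report", headers=auth,
                  json={"post_id": 12345, "reason": "spam"})
    ```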

  • you can't report users yet.

    there's an email address for our ticket system in the sidebar of this community.
    it's either that or trying your luck by sending a private message to a random admin. tickets are often a better option.

  • if the missing NSFW mark is the only reason for removal, we're generally open to restoring content after that has been fixed.

    we will not actively search out content that was removed and then updated to include the NSFW mark, but restoring it would be doable on request.

  • Hello,

    the post was removed by a Lemmy.World admin.

    Since the post is not in a Lemmy.World community and your user is not on Lemmy.World either, this removal only affects Lemmy.World users.

    Our AutoMod only notifies you that this happened, but the wording should be improved to make it clearer what happened.

  • we've switched from using multiple federation sending containers (which are supposed to split receiving instances across workers) to just using a single one.
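
    A rough compose-style sketch of the two setups being compared, not Lemmy.World's actual configuration. The --federate-process-index/--federate-process-count and --disable-* flags are the 0.19 mechanism for splitting outgoing federation across processes as I understand it; treat the exact flag names and image tag as assumptions and verify them against the docs for your Lemmy version.

    ```yaml
    services:
      lemmy:            # main container: API, incoming federation, scheduled tasks
        image: dessalines/lemmy:0.19.3   # placeholder version tag
        command: lemmy_server --disable-activity-sending
      federation-1:     # sender 1 of 2: handles half of the receiving instances
        image: dessalines/lemmy:0.19.3
        command: lemmy_server --disable-http-server --disable-scheduled-tasks --federate-process-index 1 --federate-process-count 2
      federation-2:     # sender 2 of 2: handles the other half
        image: dessalines/lemmy:0.19.3
        command: lemmy_server --disable-http-server --disable-scheduled-tasks --federate-process-index 2 --federate-process-count 2
    ```

    Going back to a single container just means running one lemmy_server without these flags, so it sends to all receiving instances itself.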

  • so far this has been a single case with kbin.earth and lots and lots of cases with kbin.social.

    no other instances have been observed behaving like this yet.

  • see https://lemmy.world/comment/8961882 for now.

    we've already been spending a lot of time over the last few days getting a solution in place on our end that will allow us to selectively reject federated activities from kbin, such as allowing comments and posts while rejecting votes, which seem to be the main issue right now. we're still seeing some stability issues with it, though.

    we're planning to unban the affected users from the communities once we have this stabilized, as we currently have to pick between

    1. defederate from kbin.social (and other kbin instances when they are affected)
    2. reject all inbound activities from affected instances
    3. temporarily ban affected users in the communities associated with the issue
    4. drop all activities with certain characteristics, such as votes, when coming from a specific instance
    5. drop all activities with certain characteristics, such as votes, when coming from a specific instance and exceeding a rate limit

    1-3 are all options we can implement with existing tools; 4 and 5 require a custom implementation on our side. as 3 has the least overall impact of those, we decided to go with it for now, which seems to be working out rather well so far, except for the individual experience of the affected users.

    implementing 4 has been our primary focus, but it takes time to ensure it works as expected, as we're essentially building this from scratch (a rough sketch of what such a filter could look like is below). 5 may be implemented afterwards if we want to spend additional time on it.
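
    To make option 4 concrete, here's a minimal sketch of one way such a filter could work: a small reverse proxy in front of Lemmy's inbox that acknowledges vote activities (Like/Dislike and their Undos) from listed instances without passing them on, and forwards everything else unchanged. This is an illustration, not Lemmy.World's actual implementation; the backend address and instance list are placeholders.

    ```python
    import json
    from http.server import BaseHTTPRequestHandler, HTTPServer
    from urllib.error import HTTPError
    from urllib.parse import urlparse
    from urllib.request import Request, urlopen

    BACKEND = "http://127.0.0.1:8536"      # real Lemmy backend (placeholder)
    DROP_VOTES_FROM = {"kbin.social"}      # instances whose votes get dropped
    VOTE_TYPES = {"Like", "Dislike"}

    def is_dropped_vote(activity: dict) -> bool:
        """True if this is a vote (or an Undo of a vote) from a listed instance."""
        host = urlparse(activity.get("actor", "")).hostname or ""
        if host not in DROP_VOTES_FROM:
            return False
        if activity.get("type") in VOTE_TYPES:
            return True
        inner = activity.get("object")
        return (activity.get("type") == "Undo"
                and isinstance(inner, dict) and inner.get("type") in VOTE_TYPES)

    class InboxFilter(BaseHTTPRequestHandler):
        def do_POST(self):
            body = self.rfile.read(int(self.headers.get("Content-Length", 0)))
            try:
                activity = json.loads(body)
            except ValueError:
                activity = {}
            if is_dropped_vote(activity):
                # Answer with 200 so the sender considers the activity
                # delivered and does not keep retrying it.
                self.send_response(200)
                self.end_headers()
                return
            # Forward body and headers unmodified (including Host), so the
            # backend still sees the originally signed request.
            req = Request(BACKEND + self.path, data=body, method="POST",
                          headers=dict(self.headers.items()))
            try:
                with urlopen(req) as resp:
                    status, payload = resp.status, resp.read()
            except HTTPError as err:
                status, payload = err.code, err.read()
            self.send_response(status)
            self.end_headers()
            self.wfile.write(payload)

    if __name__ == "__main__":
        HTTPServer(("127.0.0.1", 8080), InboxFilter).serve_forever()
    ```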

  • maybe I misunderstood your comment; I read your Texas AG example as asking for information about users. did you mean the Texas AG asking for the removal of comments where people state they're trans?

  • as I'm very tired right now, I only want to comment on one of the arguments/questions you brought up.

    you're asking for the difference between taking down content and providing information about users.

    it's very simple actually. sharing non-public data is a very different story than removing access to otherwise public information, whether it's originally coming from Lemmy.World or elsewhere.

    when we take down content, even if it's more than legally strictly necessary, the harm of such a takedown is at most someone no longer being able to consume other content or interact with a community. there is no irreversible harm done to anyone. if we decided to reinstate the community, then everyone would still be able to do the same thing they were able to do in the beginning. the only thing people may be missing out on would be some time and convenience.

    if we were asked to provide information, such as in your Texas AG example, this would neither be reversible nor have a low impact on people's lives. in my opinion, these two cases, despite both having a legal context, couldn't be much further from each other.

  • We do question the validity of claims, but when it comes to takedowns of copyright-related content, we simply do not have the resources to throw money at lawyers to evaluate them in detail. We can apply common sense to determine if something appears to be a reasonable request, but we can't pay a lawyer to evaluate every single request. We also can't afford to go to court over every case, even if we were to win, because those processes take large amounts of personal time and carry a risk of significant penalties.

    Legal advocates on Lemmy, or any other platform for that matter, are not a substitute for legal counsel.

  • What would be the alternative?

    Moving the instance behind Tor and hoping to never get identified?

    As long as you're operating a service on the internet you'll be bound by laws in one place or another. The only thing you can do about that is try to avoid being identified and thereby evade prosecution. That is not a legal defense.

  • Lemmy.World is primarily legally bound by the countries listed here.

    If we get a request, of course we will evaluate that request.

    When it comes to taking down content, such as copyright infringing content, we may err on the side of caution to reduce the legal risk we're exposing ourselves to.

    When it comes to handing over data that is not already publicly accessible, such as (not-really-)private messages or IP addresses of users, we will not "err on the side of caution" and hand out data to everyone, but we must follow the laws that we're operating under. See also https://legal.lemmy.world/privacy-policy/#4-when-and-with-whom-do-we-share-your-personal-information.