Hello everyone,

We unfortunately have to close the !lemmyshitpost community for the time being. We have been fighting CSAM (Child Sexual Abuse Material) posts all day, but there is little we can do on our side: since we changed our registration policy, the people posting it simply switch to accounts on other instances.

We keep working on a solution; we have a few things in the works, but they won’t help us right now.

Thank you for your understanding, and apologies to our users, moderators, and admins of other instances who had to deal with this.

Edit: @Striker@lemmy.world, the moderator of the affected community, made a post apologizing for what happened. But this could not have been stopped even with 10 moderators. If it hadn’t been his community, it would have been another one, and it is clear this could happen on any instance.

But we will not give up. We are lucky to have a very dedicated team and we can hopefully make an announcement about what’s next very soon.

Edit 2: removed that bit about the moderator tools. That came out a bit harsher than we meant it. It’s been a long day, and having to deal with this kind of stuff got some of us a bit salty, to say the least. Remember, we also had to deal with people posting scat not too long ago, so this isn’t the first time we’ve felt helpless. Anyway, I hope we can announce something more positive soon.

  • ptrck@lemmy.world

    I’m afraid the fediverse will need a CrowdSec-like decentralized banning platform: get banned on one platform for this shit, get banned everywhere.

    I’m willing to participate in fleshing that out.

    Edit: it’s just an idea, I do not have all the answers, otherwise I’d be building it.

    • Katana314@lemmy.world

      What you’re basically talking about is centralization. And, as much as it has tremendous benefits of convenience, I think a lot of people here can cite their own feelings as to why that’s generally bad. It’s a hard call to make.

      • rbar@lemmy.world

        They didn’t say anything about implementation. Why couldn’t you build tooling that keeps it decentralized? Servers or even communities could choose to ban users from their own communities based on a heuristic over the moderation actions published by other communities (see the sketch below). At the end of the day, it is still individual communities making their own decisions.

        I just wouldn’t be so quick to shoot this down.
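
        A minimal sketch of that heuristic, with hypothetical names (this is not Lemmy’s actual API): each instance keeps its own trust weights for other instances and only mirrors a remote ban when the weighted evidence crosses a locally chosen threshold, so the final call still sits with the local admins.

        ```python
        from dataclasses import dataclass

        @dataclass
        class RemoteBan:
            actor: str             # federated handle, e.g. "troll@bad.example"
            source_instance: str   # instance that published the moderation action
            reason: str

        def should_ban_locally(actor: str, remote_bans: list[RemoteBan],
                               trust: dict[str, float], threshold: float = 1.0) -> bool:
            """Sum the locally configured trust of every instance that banned this actor.

            `trust` maps instance hostnames to weights chosen by the local admins
            (0 or absent = ignore that instance entirely).
            """
            score = sum(trust.get(b.source_instance, 0.0)
                        for b in remote_bans if b.actor == actor)
            return score >= threshold

        # Example: two moderately trusted instances both banned the same account.
        bans = [RemoteBan("troll@bad.example", "lemmy.world", "CSAM"),
                RemoteBan("troll@bad.example", "lemm.ee", "CSAM")]
        print(should_ban_locally("troll@bad.example", bans,
                                 trust={"lemmy.world": 0.6, "lemm.ee": 0.6}))  # True
        ```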

        • Rambi@lemm.ee

          There is something similar to that for Minecraft servers: a website/plugin that bans get added to, so other admins can check usernames there, see whether someone is a known troll, and ban them straight away before they cause issues. So it’s definitely possible to do in a decentralised way.

    • BradleyUffner@lemmy.world

      There is no way that could get abused… Like, say, by hosting your own instance and banning anyone you want.

      • Whitehat Hacker@lemmy.world

        Do people think someone has to reuse the same email address or the same username? If someone uses a different email, username, and IP address (don’t try to argue semantics: it can be done, always could be, and always has been), then whatever you put into the list can’t be applied to them.

        Even if you ask for IDs, people can fake those. Sure, that’s illegal, but so is what these assholes did, and it didn’t really stop them, now did it?

      • ptrck@lemmy.world

        You can have a local banlist supplemented by a shared banlist containing these CSAM offenders, for example.

        • thisisawayoflife@lemmy.world

          That ban list could be a set of rich objects: the user that was banned, the date of the action, the community it happened in, the reason, and the server it was issued from. Sysops could choose not to accept any bans from a particular site. Make things fairly granular so there’s flexibility to account for bad-actor sysops.
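
          One possible shape for that record, sketched as a plain data class (the field names here are illustrative, not an existing Lemmy or ActivityPub schema):

          ```python
          from dataclasses import dataclass
          from datetime import datetime

          @dataclass(frozen=True)
          class BanRecord:
              user: str            # federated handle of the banned account
              banned_at: datetime  # date of the moderation action
              community: str       # community the ban happened in
              reason: str          # free-text reason given by the moderators
              server: str          # instance that issued the ban

          def accept_ban(record: BanRecord, distrusted_servers: set[str]) -> bool:
              """Sysops can simply refuse records issued by instances they don't trust."""
              return record.server not in distrusted_servers
          ```

          Because every record carries the server that issued it, filtering out a bad-actor sysop stays a purely local, one-line decision.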

    • Draconic NEO@lemmy.world

      We already have that; it’s called prison. You can’t go on the internet from prison (at least I’d assume so; it wouldn’t make much sense if people could). That’s not 100% effective, since people have to be caught for it to work, but once they are, it certainly is.

      Other global ban solutions don’t really work well, though, because they require a level of compliance that criminals aren’t going to follow through with (i.e. not committing identity theft). They can also be abused by malicious actors to get people falsely banned (especially with the whole identity theft thing).

    • Hello Hotel@lemmy.world

      Maybe FIDO for identity purposes is a good idea. Maybe some process that takes a week to calculate an identity token, plus an approval and rejection system for known tokens.
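
      A rough sketch of the expensive-token part of that idea, read as a hash-based proof of work (the names and difficulty numbers are made up, and the approval/rejection system is not shown): minting a token is deliberately slow, while verifying one is instant, so re-registering after a ban carries a real cost.

      ```python
      import hashlib
      from itertools import count

      def mint_token(identity: str, difficulty_bits: int = 20) -> int:
          """Search for a nonce whose hash has `difficulty_bits` leading zero bits.

          Raising difficulty_bits makes minting exponentially slower; in principle
          it could be tuned so that a token takes days or a week to compute.
          """
          target = 1 << (256 - difficulty_bits)
          for nonce in count():
              digest = hashlib.sha256(f"{identity}:{nonce}".encode()).digest()
              if int.from_bytes(digest, "big") < target:
                  return nonce

      def verify_token(identity: str, nonce: int, difficulty_bits: int = 20) -> bool:
          """Verification is a single hash, so instances can check tokens cheaply."""
          digest = hashlib.sha256(f"{identity}:{nonce}".encode()).digest()
          return int.from_bytes(digest, "big") < (1 << (256 - difficulty_bits))

      nonce = mint_token("newuser@lemmy.world")          # the slow step
      print(verify_token("newuser@lemmy.world", nonce))  # True, and instant
      ```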