shared via https://feddit.de/post/2805371

Each of these reads like an extremely horny and angry man yelling their basest desires at Pornhub’s search function.

  • Jordan Lund@lemmy.one · 1 year ago

    You say that NOW, but if people start using your images to generate revenge porn or, you know, really anything you didn’t consent to, that’s a huge problem.

    Both for the people whose images were used to train the model and for the people whose images are generated using the models.

    Non-consent is non-consent.

    This is how you get the feds involved.

    • ram@lemmy.ca · 1 year ago

      Let’s not forget that these AI aren’t limited by age. Like fuck am I gonna be out here defending tech that would turn my kid into CSAM. Fucking disgusting.

      • PelicanPersuader@beehaw.org · 1 year ago

        Worse, people making AI CSAM will wind up causing police to waste resources investigating abuse that never happened, meaning those resources won’t be used to save real children in actual danger.

      • MaggiWuerze@feddit.de · 1 year ago

        On the other hand, this could be used to create material that doesn’t require new suffering to produce. So it might reduce the need for actual children to be abused in the production of it.

        • ram@lemmy.ca · 1 year ago

          Ya, no, those people need psychological help. Not to feed the beast. This is nonsense.

          • ichbinjasokreativ@beehaw.org · 1 year ago

            It’s (rightfully) illegal at the moment, but that doesn’t stop people. Keep it illegal, increase punishment drastically, and make AI-created material a grey area.

            • Rekorse@kbin.social · 1 year ago

              It’s already the worst crime around and people still do it. Maybe it’s not the punishment we need to focus on.

            • ram@lemmy.ca · 1 year ago

              I’m not sure increasing punishment is actually an effective way of combating this. The social consequences of being known as a child predator likely have a stronger deterrent effect than the penal system, imo (I don’t have data to back that up).

              I, personally, am an advocate for making treatment for pedophiles freely, easily, and safely accessible. I’d much rather help people be productive, non-violent members of society than lock them up, if given a choice.

          • MaggiWuerze@feddit.de · 1 year ago

            Sure they do, but if they’re going to consume it anyway, would you rather a real child suffered for it, or just an AI-generated one?

            • ram@lemmy.ca · 1 year ago

              Neither. I would have mental health supports that are accessible to them.

              • tweeks@feddit.nl · 1 year ago

                Of course we want neither, but it comes across as if you’re dismissing a possible direction toward a solution in favor of the one that is definitely worse (real-life suffering), out of a purely emotional knee-jerk reaction.

                Mental health support is available and real CSAM is still being generated. I’d suggest we look into both options: advancing the ways therapists can help, and at least having an open discussion about these sensitive solutions that might feel counter-intuitive at first.

        • tweeks@feddit.nl · 1 year ago

          That’s a fair point. And I believe AI would be able to combine legal material to create illegal material. Although this still feels wrong, if it keeps suffering out of the base material and reduces future (child) suffering, I’d say we should at least do research on it. Even if it’s controversial, we need to look at the rationale behind it.

    • Evergreen5970@beehaw.org · 1 year ago

      As someone who personally wouldn’t care at all if someone made AI porn of me and masturbated to it, I am incredibly uncomfortable with the idea that someone who doesn’t like me may have the option to generate AI porn of me having sex with a child. Now there’s fake “proof” I’m a pedophile, and I get my life ruined for sex I never had, for a violation of consent I never actually committed. Even if I’m vindicated in court, I might still be convicted in the court of public opinion.

      And people could post faked porn of me and send it to companies to say “Evergreen5970 is promiscuous, don’t hire them.” Not all of us have the luxury of picking and choosing between companies depending on whether they match our values; some of us have to take what we can get, and sometimes that would include companies that would judge you for taking nude photos of yourself. It would feel especially bad given I’m a virgin by choice who has never taken nudes, let alone sent them. Punished for something I didn’t do.

      Not everyone is going to restrict their use to their private wank sessions, to making a real image of the stuff they probably already envision in their imagination. Some will do their best to make its results public with the full intention of using it to do harm.

      And once faking abuse with AI porn becomes well-known, it might discredit actual photographic/video proof of CSAM happening. Humans can be fooled about whether an image was captured by a camera or generated by AI, and AI detectors don’t catch AI-generated images with perfect accuracy either. So the question becomes “how can we trust any image anymore?” Not to mention the ability to generate new CSAM with AI. Some more mainstream AI models might tweak their algorithms to prevent people from generating any porn involving minors, but there’ll probably always be some floating around with those guardrails turned off.

      I’m also very wary of dismissing other peoples’ discomfort just because I don’t share it. I’m still worried for people who would care about someone making AI porn of them even if it was just to masturbate with and kept private.