shared via https://feddit.de/post/2805371

Each of these reads like an extremely horny and angry man yelling their basest desires at Pornhub’s search function.

  • PelicanPersuader@beehaw.org · 1 year ago
    Worse, people making AI CSAM will wind up causing police to waste resources investigating abuse that never happened, meaning those resources won’t be used to save real children in actual danger.