GenAI tools ‘could not exist’ if firms are made to pay copyright

  • Eccitaze@yiffit.net · 10 months ago

    So because corps abuse copyright, that means I should be fine with AI companies taking whatever I write–all the journal entries, short stories, blog posts, tweets, comments, etc.–and putting it through their model without being asked, and with no ability to opt out? My artist friends should be fine with their art galleries being used to train the AI models that are actively being used to deprive them of their livelihood without any ability to say “I don’t want the fruits of my labor to be used in this way?”

    • BURN@lemmy.world · 10 months ago

This is the problem people have:

      They don’t see artists and creators as worth protecting. They’d rather screw over every small creator and take away control of their works, just because “it’d be hard to train without copyrighted data”

      Plenty of creators would opt in if given the option, but I’m going to guess a large portion will not.

      I don’t want my works training what will replace me, and right now copyright is the only way we can defend what was made.

      • Eccitaze@yiffit.net · 10 months ago

It’s like nobody here actually knows someone creative, or has bothered making anything creative themselves.

I don’t even have a financial interest in it, because there’s no way my job could be automated and I have no chance of making any kind of money off my trash. I still wouldn’t let LLMs train on my work, and I have a feeling the vast majority of people would do the same.

    • General_Effort@lemmy.world · 10 months ago

      I don’t know if your fears about your friends’ livelihood are justified, but cutting down on fair use will not help at all. In fact, it would make their situation worse. Think through what would actually happen.

      When you publish something you have to accept that people will make it their own to some degree. Think parody or R34. It may be hurtful, but the alternative is so much worse.

      • Eccitaze@yiffit.net · 10 months ago

        Huh? How does that follow at all? Judging that the specific use of training LLMs–which absolutely flunks the “amount and substantiality of the portion taken” (since it’s taking the whole damn work) and “the effect on the market” (fucking DUH) tests–isn’t fair use in no way impacts parody or R34. It’s the same kind of logic the GOP uses when they say “if the IRS cracks down on billionaires evading taxes then Blue Collar Joe is going to get audited!”

        Fuck outta here with that insane clown logic.

        • General_Effort@lemmy.world · 10 months ago

I think you would find it easier to help your friends if you approached the matter with reason rather than emotion. Your take on fair use is missing a lot, but that’s beside the point.

Assume you get what you are asking for. What then?

          • Eccitaze@yiffit.net · 10 months ago

            Yeah, no, stop with the goddamn tone policing. I have zero interest in vagueposting and high-horse riding.

As for what I want, I want generative AI banned entirely, or at minimum restricted to training on works that are either in the public domain, or for which the person creating the training model received explicit, opt-in consent. This is the supposed gold standard everyone demands when it comes to the widescale collection and processing of personal data that people generate just through their normal, everyday activities. Why should it be different for the widescale collection and processing of the stuff we actually put our effort into creating?

            • General_Effort@lemmy.world · 10 months ago

              > As for what I want, I want generative AI banned entirely,

              Well, you can see the moral (and political!) problem here. Maybe the people who crunched numbers before electric computers wanted them banned. Maybe people who make diesel engines want EVs banned. That’s asking the public to take a hit for the benefit of a small group. Morality aside, it’s politically unlikely.

              > or at minimum restricted to training on works that are either in the public domain, or that the person creating the training model received explicit, opt-in consent to use.

              This is somewhat more likely. But what then?

I’ll start. Opt-in means you have to obtain a license to train an AI on something. You have to pay the owner of the intellectual property. What does this mean in our economy? What happens?

              • Eccitaze@yiffit.net · 10 months ago

Ideally? It means AI companies have to throw away their entire training model, pay for licenses they may not be able to afford, and go out of business as a result, at which point everyone snaps out of the cult of AI, realizes it’s as overhyped as blockchain, and pretends it never happened. Pardon me while I find a flea to play the world’s tiniest violin. More realistically, open models will be restricted to FOSS works and the public domain, while commercial models pay for licenses from copyright holders.

Like, what, you think I haven’t thought through this exact issue before and reached the exact conclusion your leading questions are so transparently pushing: that open models will be restricted to public works only, while commercial models can obtain a license? Yeah, duh. And you know what? I. Don’t. Care. Commercial models can be (somewhat) more easily regulated, and even in the absolute worst case, at least creators will have a mechanism to opt out of the artist-crushing machine.

                • General_Effort@lemmy.world · 10 months ago

Ok, so you’re all in on some weird ideology and don’t actually give a fuck about the livelihood of your “artist friends”. You had me. Great job. Good propaganda.

                  Corporate control is not the only thing you get (and actually not what I was leading to). You also get free money for the wealthy.

Getty claims to have the biggest private photo archive, with 130 million images. How many does an artist own? The NYT owns its entire archive going back a hundred years, each daily edition containing about as many words as a novel. How many novels does an author own? Of course, that’s still small fry. Meta has trained its image generator on 1.1 billion images that were generously “opted in” by users of Instagram and Facebook.

                  So that’s how the licensing fees are going to be split. More money for the owning class, without any work required.

The money comes from subscriptions. Who pays subscriptions for image generators? The same people who pay Adobe for a subscription: people who have to make lots of high-quality images. Professional artists.

                  But you don’t care cause you don’t actually have artist friends. Fine. I have no idea what kind of crazy ideology you follow that you think a cyberpunk dystopia is the lesser evil.

                  • Eccitaze@yiffit.net · 10 months ago

Ah, yes, you don’t have an actual rebuttal, so everything is just “propaganda” and “cyberpunk dystopia”, as if snake oil salesmen hawking freaking AI-powered vibrators and vagueposting about the benefits of AI while downplaying or ignoring its very real, very measurable harms, while an entire cottage industry of people who make a living from their creative work is forced into wage-slave office jobs, isn’t even more of a dystopia.

Try actually talking to an artist sometime, bud. I don’t know of a single one who’s actually okay with AI, and if you weren’t either blind or an “ideas guy” salivating at the thought of having a personal slave to make (shitty, barely functional, vapid) shit without paying someone with the actual necessary skills, you’d agree too.

    • PsychedSy@sh.itjust.works · 10 months ago

      The concept of copyright is insane to begin with. Corps don’t make it bad - it starts out bad.

      It’s an invented right.