• gapbetweenus@feddit.de
    10 months ago

    The tool’s creators want to make it so that AI model developers must pay artists in order to train on uncorrupted data from them.

    That’s not something a technical solution will work for. We need copyright laws to be updated.

      • UnderpantsWeevil@lemmy.world
        10 months ago

        Truly a “Which Way White Man” moment.

        I’m old enough to remember people swearing left, right, and center that copyright and IP law being aggressively enforced against social media content has helped corner the market and destroy careers. I’m also well aware of how often images from DeviantArt and other public art venues have been scalped and misappropriated even outside the scope of modern generative AI. And how production houses have outsourced talent to digital sweatshops in the Pacific Rim, Sub-Saharan Africa, and Latin America, where you can pay pennies for professional reprints and adaptations.

        It seems like the problem is bigger than just “Does AI art exist?” and “Can copyright laws be changed?” because the real root of the problem is the exploitation of artists generally speaking. When exploitation generates an enormous profit motive, what are artists to do?

    • Marcbmann@lemmy.world
      10 months ago

      The issue is simply reproduction of original works.

      Plenty of people mimic the style of other artists. They do this by studying the style of the artist they intend to mimic. Why is it different when a machine does the same thing?

      • teichflamme@lemm.ee
        10 months ago

        It’s not. People are just afraid of being replaced, especially when they weren’t that original or creative in the first place.

        • Even_Adder@lemmy.dbzer0.com
          10 months ago

          They’re playing both sides. Who do you think wins when model training becomes prohibitively expensive for regular people? Mega corporations already own datasets, and have the money to buy more. And that’s before they make users sign predatory ToS granting them exclusive access to user data, effectively selling our own data back to us.

          Regular people, who could have had access to a competitive, corporate-independent tool for creativity, education, entertainment, and social mobility, would instead be left worse off and with less than where we started.

          • UnderpantsWeevil@lemmy.world
            10 months ago

            Who do you think wins when model training becomes prohibitively expensive for regular people?

            We passed that point at inception. It’s always been more efficient for Microsoft to do its training at a 10,000-petaflop giga-plant in Iowa than for me to run Stable Diffusion on my home computer.

            Regular people, who could have had access to a competitive, corporate-independent tool for creativity, education, entertainment, and social mobility

            Already have that. It’s called a $5 art kit from Michael’s.

            This isn’t about creation, it’s about trade and propagation of the finished product within the art market. And it’s here that things get fucked, because my beautiful watercolor that took me 20 hours to complete isn’t going to find a buyer who covers half a week’s worth of living expenses, so long as said marketplace is owned and operated by folks who want my labor for free.

            AI generation serves to mine the market at near-zero cost and redistribute the finished works for a profit.

            Copyright/IP serves to separate the creator of a work from its future generative profits.

            But all this ultimately happens within the context of the market itself. The legal and financial mechanics of the system are designed to profit publishers and distributors at the expense of creatives. That’s always been true and the latest permutation in how creatives get fucked is merely a variation on a theme.

            instead be left worse off and with less than where we started.

            AI art does this whether or not it’s illegal, because it exists to undercut human creators by threatening them with an inferior-but-vastly-cheaper alternative.

            The dynamic you’re describing has nothing to do with AI’s legality and everything to do with Disney’s ability to operate as a monopsony buyer of bulk artistic product. The only way around this is to break Disney up as a singular mass-buyer of artwork, and turn the component parts of the business over to the artists (and other employees of the firm) as an enterprise that answers to, and profits, the people generating the valuable media rather than some cartel of third-party shareholders.