• Potatos_are_not_friends@lemmy.world · +52/−16 · 10 months ago

A small team of 7 was able to create something of this magnitude, all thanks to the various tools of today, like Generative AI.

We talk about the bad stuff of AI. But here’s the good: small mom-and-pop shops being able to release top-tier products like the big companies.

    • circuitfarmer@lemmy.sdf.org · +40/−17 · 10 months ago

      It’s arguably not good that we’re normalizing the use of these tools when their training relied on other creators who were never compensated.

        • acutfjg@feddit.nl · +5 · 10 months ago

          Were they on public forums and sites like Stack Overflow and GitHub, where they wanted people to use and share their code?

          • Armok: God of Blood@lemmy.dbzer0.com · +3/−1 · edited · 10 months ago

            Stable Diffusion uses a dataset from Common Crawl, which pulled art from public websites that allowed them to do so. DeviantArt and ArtStation allowed this, without exception, until recently.

          • Echo Dot@feddit.uk · +2/−3 · 10 months ago

            Where did the AI companies get their code from? It’s scraped from the likes of Stack Overflow and GitHub.

            They don’t have the proprietary code that is used to run companies because it’s proprietary and it’s never been on a public forum available for download.

        • Ech@lemm.ee · +11/−20 · 10 months ago

          Humans using past work to improve, iterate, and further contribute themselves is not the same as a program throwing any and all art into the machine learning blender to regurgitate “art” whenever its button is pushed. Not only does it not add anything to the progress of art, it erases the identity of the past it consumed, all for the blind pursuit of profit.

          • Sethayy@sh.itjust.works · +16/−7 · 10 months ago

            Oh yeah, tell me who invented the word ‘regurgitate’ without googling it. Cause its historical identity is important, right?

            Or how about who first created the internet?

            It’s ok if you don’t know; this is how humans work, on the backs of giants.

            • Ech@lemm.ee · +4/−7 · 10 months ago

              Me not knowing everything doesn’t mean it isn’t known or knowable. Also, there’s a difference between things naturally falling into obscurity over time and context being removed forcefully.

              • Sethayy@sh.itjust.works · +5/−2 · 10 months ago

                And then there’s when it’s too difficult to keep them up, exactly like how you can’t know everything.

                We probably ain’t gonna stop innovation, so we might as well roll with it (especially when it’s doing a great job redistributing previously expensive assets)

                • Ech@lemm.ee · +2/−5 · 10 months ago

                  If it’s “too difficult” to manage, that may be a sign it shouldn’t just be let loose without critique. Also, innovation is not inherently good and “rolling with it” is just negligent.

      • moon_matter@kbin.social · +25/−1 · edited · 10 months ago

        Devil’s advocate. It means that only large companies will have AI, as they would be the only ones capable of paying such a large number of people. AI is going to come anyway except now the playing field is even more unfair since you’ve removed the ability for an individual to use the technology.

        Instituting these laws would just be the equivalent of companies pulling the ladder up behind them after taking the average artist’s work to use as training data.

        • Corkyskog@sh.itjust.works · +1 · 10 months ago

          How would you even go about determining what percentage belongs to the AI vs the training data? You could argue all of the royalties should go to the creators of the training data, meaning no one could afford to do it.

          • moon_matter@kbin.social · +1 · 10 months ago

            How would you identify text or images generated by AI after they have been edited by a human? Even after that, how would you know what was used as the source for training data? People would simply avoid revealing any information and even if you did pass a law and solved all of those issues, it would still only affect the country in question.

      • Lmaydev@programming.dev · +24/−12 · edited · 10 months ago

        Then we shouldn’t have artists because they looked at other art without paying.

      • kmkz_ninja@lemmy.world · +22/−11 · 10 months ago

        Oonga boonga wants his royalty checks for having first drawn a circle 25,000 years ago.

      • mindbleach@sh.itjust.works · +9/−3 · 10 months ago

        As distinct from human artists who pay dividends for every image they’ve seen, every idea they’ve heard, and every trend they’ve followed.

        The more this technology shovels into the big fat network of What Is Art, the less any single influence will show through.

      • Dkarma@lemmy.world · +6/−5 · 10 months ago

        Literally the definition of greed. They don’t deserve royalties for being an inspiration and moving a weight a fraction of a percentage in one direction…

      • Grumpy@sh.itjust.works · +15/−8 · 10 months ago

        If AI art is stolen data, then every artist on earth is a thief too.

        Do you think artists just spontaneously conjure up art? No. Through their entire lives of looking at other people’s works, they learned how to do stuff; they emulate and they improve. That’s how human artists come to be. Do you think artists go around asking permission from millions of past artists before they can learn from their art? Do artists track down whoever made the fediverse logo if they want to make a similar shaped art with it? Hell no. Consent in general is impossible too, because a whole lot of them are likely too dead to give consent, to be honest. It’s the exact same way AI is made.

        Your argument holds no consistent logic.

        Furthermore, you likely have a misunderstanding of how AI is trained and works. AI models do not store or copy the art they’re trained on. They study shapes, concepts, styles, etc., and put these concepts into matrices of vectors. Billions of images and words are turned into a mere 2 gigabytes in something like SD fp16. 2GB is virtually nothing; no compression comes anywhere near that. So unless you actually took very few images and made a 2GB model, it has no capability to store or copy another person’s art. It retains no knowledge of any specific copyrighted work. It only knows concepts, and concepts like a circle, a square, etc. are not copyrightable.
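        A rough sanity check of that size claim (a minimal sketch; the ~2.3 billion image figure for Stable Diffusion’s LAION-based training set and the ~2 GB fp16 checkpoint size are both approximations, not exact numbers):

```python
# Back-of-envelope: how many bytes of model weights exist per training image?
checkpoint_bytes = 2 * 1024**3    # ~2 GB fp16 checkpoint (approximate)
training_images = 2_300_000_000   # ~2.3 billion image-text pairs (approximate)

bytes_per_image = checkpoint_bytes / training_images
print(f"{bytes_per_image:.2f} bytes of weights per training image")
```

        Under one byte of weights per training image: nowhere near enough to store copies of the images themselves.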

        If you think I’m just being pro-AI for the sake of it, well, it doesn’t matter, because copyright offices all over the world have started releasing their views on AI art, and they’re unanimously in agreement that it’s not stolen. Furthermore, the resulting AI artworks can be copyrighted (there’s a lot more complexity there, but that’s for another day).

        • WeLoveCastingSpellz@lemmy.fmhy.net · +7/−11 · 10 months ago

          L take, AI is not a person and doesn’t have the right to learn like a person. It is a tool and it can be used to replicate others art.

          • Grumpy@sh.itjust.works · +8/−2 · 10 months ago

            What gives a human right to learn off of another person without credit? There is no such inherent right.

            Even if such a right existed, I, as a person, would then have the right to create a tool to assist me in learning, because I’m a person with the same rights as anyone else. If it’s just a tool, which it is, then it is not the AI which has the right to learn; I have the right to learn, which I used to make the tool.

            I can use Photoshop to replicate art a lot more easily than with AI. None of us are going around saying Photoshop is wrong (though we did say that before). The AI won’t know any specific art unless it’s an extremely repeated pattern like the Mona Lisa. It literally does not have the capacity to contain other people’s art, and therefore it cannot replicate others’ art. I have already proven that mathematically.

            • Dkarma@lemmy.world · +3/−2 · 10 months ago

              Yep, these people act like they get to choose who or what ingests their product when they make it available willingly on the internet… oftentimes for free.

              This whole argument falls on its face once you realize they don’t want AI to stop… they just want a cut.

          • Echo Dot@feddit.uk · +2/−2 · 10 months ago

            That doesn’t make it bad.

            It’s a tool that can be used to replicate other art, except it doesn’t replicate art, does it?

            It creates works based on other works, which is exactly what humans do; whether or not it’s sapient is irrelevant. My work isn’t valuable because it’s copyrightable. Only a sociopath thinks like that.