I just want to make funny pictures.

  • EldritchFeminity@lemmy.blahaj.zone · 1 month ago

    Except it isn’t copying a style. It’s taking the actual images, turning them into statistical arrays, and then combining them into an algorithmic output based on your prompt. It’s basically a pixel-by-pixel collage of thousands of pictures. Copying a style implies an understanding of the artistic intent behind that style: the why and how of what the artist does. Image generators can do that exactly as well as the Gaussian Blur tool can.

    The difference between the two is that you can understand why an artist made a line and copy that intent, but you’ll never make exactly the same line. You’re not copying and pasting that one line into your own work, whereas that’s exactly what the generator is doing. It just doesn’t look like it because it’s buried under hundreds of other lines taken from hundreds of other images (sometimes - sometimes it just gives you straight-up Darth Vader in the image).
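The "statistical array" description above can be sketched as a toy program. To be clear, this is a deliberately crude caricature of that claim, not how real diffusion models work, and all the names and data in it are made up for illustration:

```python
# Toy caricature of the "statistical array" description above.
# NOT how real image generators work - purely illustrative, with
# made-up data: each "image" is a short list of pixel values, and
# the "generator" blends the training images matching the prompt.
training_set = {
    "cat": [[0.9, 0.1, 0.2], [0.8, 0.2, 0.3]],
    "dog": [[0.1, 0.9, 0.4], [0.2, 0.8, 0.5]],
}

def generate(prompt):
    matches = training_set.get(prompt)
    if not matches:  # no cats in the data set -> no output at all
        return None
    n = len(matches)
    # every output pixel is a statistic (here: the mean) of the inputs
    return [sum(img[i] for img in matches) / n for i in range(len(matches[0]))]

print(generate("cat"))      # a blend of the two "cat" images
print(generate("axolotl"))  # None: concept absent from the data set
```

On this toy, both observations in the comment hold: an unseen concept yields nothing, and every output value traces directly back to training pixels. Whether that mechanism fairly describes real generators is exactly what the replies dispute.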

    • Honytawk@lemmy.zip · 1 month ago

      It’s taking the actual images and turning them into statistical arrays and then combining them into an algorithmic output based on your prompt.

      So looking at images to build a generalised understanding of them, and then reproducing based upon additional information: isn’t that exactly what our brain does to copy someone’s style?

      You are arguing against your own point here. You don’t need to “understand the artistic intent” to copy. Most artists don’t.

    • desktop_user@lemmy.blahaj.zone · 1 month ago

      And just about any artist can draw Darth Vader as well; almost all non-“ethics”- or intent-based arguments can be applied to artists or sufficiently convoluted machine models.

      • EldritchFeminity@lemmy.blahaj.zone · 1 month ago

        But just about any artist isn’t reproducing a still from The Mandalorian in the middle of a picture as if they’d right-clicked and hit “save as” on a picture from a Google search. Which these generators have done multiple times. A “sufficiently convoluted machine model” would be a sentient machine. At the level required for what you’re talking about, we’re getting into the philosophical territory of what it means to be a sentient being, which is so far removed from these generators as to be irrelevant to the point. And at that point, you’re not creating anything anyway. You’ve hired a machine to create for you.

        These models are tools that use an algorithm to collage pre-existing works into a derivative work. They cannot create. If you tell a generator to draw a cat but it has no pictures of cats in its data set, you won’t get anything. If you feed AI images back into these generators, they quickly degrade into garbage, because they don’t have a concept of anything. They don’t understand color theory or two-point perspective or anything. They are simply programmed to output their collection of vectorized arrays in an algorithmic format based upon certain keywords.
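The feedback-degradation claim can be illustrated with a minimal sketch (a hypothetical toy, not a real generator): retrain a "model" on its own sampled output each generation, and the variety in its training pool drains away.

```python
import random

# Toy sketch of the feedback-loop claim: a "model" that can only
# recombine what it was trained on. Each generation it is retrained
# on its own output (sampling with replacement), so rare "images"
# drop out of the pool and variety steadily shrinks.
random.seed(42)

real_images = list(range(100))  # 100 distinct "images"
pool = real_images[:]

for generation in range(30):
    # the model's "output" is just resampled training data
    output = [random.choice(pool) for _ in range(len(pool))]
    pool = output  # retrain on the model's own output

print(len(set(real_images)), "distinct images at the start")
print(len(set(pool)), "distinct images after 30 generations")
```

The distinct count can only fall, never rise, because every generation's pool is drawn entirely from the previous one; this is the same drift-to-uniformity effect that the "garbage" claim gestures at.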

        • Honytawk@lemmy.zip · 1 month ago

          Why are you attaching all these convoluted characteristics to art? Is it because you are otherwise unable to claim computer art isn’t art?

          Art does not need to have intent. It doesn’t need philosophy. It doesn’t need to be made by a sentient being. It doesn’t need to be 100% original, because no art ever is. So what if a computer created it?

          If you encounter an artist who has never seen a cat, they wouldn’t be able to paint one either. Just look at those medieval depictions of lions where it’s clear the artist never saw one.