• db0@lemmy.dbzer0.comOP
    10 months ago

    Making only big companies able to “rip off your work” (not an accurate representation, but whatever) is not the solution you think it is.

    The only solution is to force all models trained on public data to not be covered by copyrights by default. Any output from those models should also by default be in the commons. The solution is to avoid copyright cartels, not strengthen them.

    • Mahlzeit@feddit.de
      10 months ago

      IMO, we need to ask: what benefits the people? Or: what is in the public interest?

      That should be the only thing of importance. That’s probably controversial. Some will call it socialism. It is pretty much how the US Constitution sees it, though.

      Maybe you agree with this. But when you talk about “models trained on public data,” you are basically thinking in terms of property rights, not in terms of the public benefit.

    • Mahlzeit@feddit.de
      10 months ago

      The models (i.e. the weights specifically) may not be copyrightable anyway. There’s no copyright on the result of mere number crunching. Once a model is further fine-tuned, there might be copyright, but it’s still unlike anything covered by copyright in the past.

      One analogy I have is a 3D engine. The engineers design the look of the typical output by setting parameters, but that does not create a specific copyright on the parameters. There’s copyright on the design documents, the code, the UI (if any), and maybe other things. It’s not quite the same, though.

      Some jurisdictions have a separate IP right on databases; the US does not. I think that right would be what covers AI models. If I am right, then that means any license agreements that come with models are ineffective in the US.

      However, to copy these models, you first need to get your hands on them. They are still trade secrets, so don’t count on leaks.