They frame it as though it’s for user content; more likely it’s to train AI. In fact it gives them the right to do almost anything they want - up to (but not including) stealing the content outright.

  • moitoi@feddit.de · 9 months ago

    This will be an unpopular opinion here.

    I’m not against AI, but the rules have to be set in laws and regulations. First, AI can’t use copyrighted material without paying for it, and it can’t use material without asking each rights holder individually.

    Second, AI can’t create copyrighted material. Whatever an AI creates is free of copyright, and everyone can use it.

    Third, an AI can’t be a black box. It has to be comprehensible how it works and what the AI is doing. A solution would be source-available code.

    Fourth, AI can’t violate laws, create or push misinformation, or produce material used for misinforming people.

    And, of course, anything created using AI has to be identified as such.

    The money is in what the AI can do, the quality of the result, and the quality of the code. Nothing else is valuable.

  • MaggiWuerze@feddit.de · 9 months ago

    So they want to create AI-written and AI-narrated audiobooks that use the voices of well-known voice actors without paying them for the privilege? How is that supposed to hold up in court?

    • rhebucks-zh@incremental.social · 9 months ago

      So why didn’t they say “derivative works with content of equal sentence-level, character-level, name, and story-level meaning”? I think it’s going to be used for something more than that. They want to update content to fit the woke agenda, and people will frame it as good.