Skyrim VAs are speaking out about the spread of pornographic AI mods.

  • Rossel@sh.itjust.works · 1 year ago

    The legal grounds are that the AI is trained on voice lines, which can indeed be copyrighted material: not the voice itself, but the delivered lines.

    • Skull giver@popplesburger.hilciferous.nl · 1 year ago

      That’s a decent theoretical legal basis, but the voice lines are the property of the game company rather than the voice actors.

      If this interpretation of copyright law on AI models ends up being the outcome of the two (three?) big AI lawsuits related to Stable Diffusion, most AI companies will be completely fucked. Everything from Stable Diffusion to ChatGPT 4 will instantly be in trouble.

    • FaceDeer@kbin.social · 1 year ago

      The problem with that approach is that the resulting AI doesn’t contain any identifiable “copies” of the material that was used to train it. No copying, no copyright. The AI model is not a legally recognizable derivative work.

      If future output of the model that happens to sound very similar to the original voice actor counts as a copyright violation, then human sound-alikes and impersonators would also be in violation, and things would become a huge mess.

      • ChemicalRascal@kbin.social · 1 year ago

        > The problem with that approach is that the resulting AI doesn’t contain any identifiable “copies” of the material that was used to train it. No copying, no copyright. The AI model is not a legally recognizable derivative work.

        That’s a HUGE assumption you’ve made, and certainly not something that has been tested in court, let alone found to be true.

        In the context of existing legal precedent, there’s an argument to be made that the resulting model is itself a derivative work of the copyright-protected works, even if it doesn’t literally contain an identifiable copy, because it is derived from those works in the everyday sense of the term.

        > If future output of the model that happens to sound very similar to the original voice actor counts as a copyright violation, then human sound-alikes and impersonators would also be in violation, and things would become a huge mess.

        A key distinction here is that a human brain is not a work, and in that sense, a human brain learning things is not a derivative work.

        • FaceDeer@kbin.social · 1 year ago

          > That’s a HUGE assumption you’ve made

          No, I know how these neural nets are trained and how they’re structured. They really don’t contain any identifiable copies of the material used to train them (see the sketch at the end of this comment).

          > and certainly not something that has been tested in court

          Sure, this is brand-new tech. It takes time for the court cases to churn their way through the system. If that’s going to be the ultimate arbiter, though, then what’s to discuss in the meantime?
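
          For what it’s worth, here’s a minimal PyTorch sketch of that claim (the toy data and tiny model are hypothetical stand-ins, not anything from a real voice model): training nudges weight tensors with gradient descent, and the artifact you’d ship afterwards is a dictionary of those tensors, with no training samples inside it.

          ```python
          # Minimal sketch: what a trained neural net actually contains.
          # The "training data" here is random toy data standing in for audio features.
          import torch
          import torch.nn as nn

          features = torch.randn(64, 16)  # 64 toy "clips", 16 features each
          targets = torch.randn(64, 1)

          model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 1))
          optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
          loss_fn = nn.MSELoss()

          for _ in range(100):
              optimizer.zero_grad()
              loss = loss_fn(model(features), targets)  # compare predictions to targets
              loss.backward()                           # gradients w.r.t. the weights
              optimizer.step()                          # nudge the weights, nothing else

          # The saved artifact is just named weight tensors; the clips are not in it.
          for name, tensor in model.state_dict().items():
              print(name, tuple(tensor.shape))
          # 0.weight (32, 16)
          # 0.bias (32,)
          # 2.weight (1, 32)
          # 2.bias (1,)
          ```

          Whether a court nonetheless decides to treat those weight tensors as a derivative work of the training material is exactly the open question in this thread.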