The American Matthew Butterick has started a legal crusade against generative artificial intelligence (AI). In 2022, he filed the first lawsuit in the history of this field against Microsoft, one of the companies behind these types of tools (in this case, GitHub Copilot). Today, he’s coordinating four class action lawsuits that bring together complaints filed by programmers, artists and writers.

If successful, he could force the companies responsible for applications such as ChatGPT or Midjourney to compensate thousands of creators. They may even have to retire their models and retrain them on datasets that don’t infringe on intellectual property rights.

  • WebTheWitted@beehaw.org · 10 months ago

    Great power competition / military-industrial complex. AI is a pretty vague term, but practically it could be used to describe drone swarming technology, cyber warfare, etc.

      • anachronist@midwest.social · 10 months ago

      LLM-based chatbots and image generators are the types of “AI” that rely on stealing people’s intellectual property. I’m struggling to see how that applies to “drone swarming technology.” The only obvious use case is in the generation of propaganda.

        • S13Ni@lemmy.studio · 10 months ago

          You could use LLM-like AI to go through vast amounts of combat data to make sense of it in the field, and to analyze data from mass surveillance. I doubt they need many more excuses.

          A case could be made that tech bros have overhyped the importance of AI to the military-industrial complex, but it nevertheless has plenty of nasty uses.

        • maynarkh@feddit.nl · 10 months ago

          “The only obvious use case is in the generation of propaganda.”

          It is indeed. I would guess that’s the game, and it’s already happening.