News

Disney and NBCUniversal's lawsuit against AI company Midjourney is an effort to establish legal precedent around AI.
Disney and Universal sue AI image company Midjourney for unlicensed use of 'Star Wars,' 'The Simpsons' and more. Disney and Universal joined forces in a lawsuit against artificial intelligence image ...
In a move that could redefine the boundaries between generative AI (genAI) and intellectual property, Disney and Universal ...
Brandon Bauman, a top Hollywood dealmaker and chief strategy officer at Loti AI, examines the landmark lawsuit filed by ...
Disney and NBCU filed a federal lawsuit Tuesday against Midjourney, a generative AI start-up, alleging copyright infringement ...
Disney and Universal sued Midjourney on Wednesday for generating Shrek, Darth Vader, Buzz Lightyear, and a host of other copyrighted characters in the first major legal showdown between Hollywood and ...
"Piracy is piracy," says Disney's chief legal officer, as the studios aim to show that tools allowing personalized AI slop creations of characters like Darth Vader or Shrek run afoul of their IP.
The lawsuit, filed in U.S. District Court in Los Angeles, claims the AI company Midjourney generates images that “blatantly incorporate and copy” the movie studios’ famous characters.
Disney and Universal have together filed a lawsuit against Midjourney, the company behind one of the most popular AI image generators, over rip-offs of characters and art styles from the likes of ...
Why Disney’s AI Lawsuit Will Determine Whether Studios Survive. ... with other companies joining up to go after Midjourney, and perhaps Disney and Universal going after other companies.
There are now 18 AI copyright-infringement lawsuits in the U.S., down from an original 36 as cases have been consolidated. Disney v. Midjourney is the first lawsuit from Hollywood to reference ...
The lawsuit includes some convincing side-by-side comparisons of real movie stills and the images Midjourney has created. It's not as if those examples are merely a close facsimile that can be mistaken for ...