Gandalf_The_Grey
- Apr 24, 2016
What would you do if you found out that photos you'd taken were being used to train AI without your consent? You might assume you have the right to ask the organization using them to stop and delete them. Well, it's not that simple, as the case of Robert Kneschke, a German stock photographer, shows. Kneschke says he asked LAION, a non-profit that provides training materials for machine learning research, to remove his photos from its dataset. Not only did LAION refuse, it also said he must pay for making "unjustified" copyright claims.
His case raises two questions: Is copyright protection enforceable at all in the age of AI? And is AI training, as it is practiced today, inherently unethical? As we try to answer these questions, let's take a closer look at how Kneschke has challenged one of the pillars of the generative AI industry: its ability to train models on billions of images, many of them copyrighted, for free.
AI training site stole his photos, then sued when he complained: Robert Kneschke's story
Photographer Robert Kneschke is waging an uphill legal battle with LAION, an organization that hosts AI training materials, to have his photos removed. We take a look at the case, its implications for the AI industry, and ask Robert what he hopes to accomplish with his lawsuit.
adguard-vpn.com