Meanwhile, bot-protection companies like DataDome have been offering services to deter scraping for years and have recently seen a huge shift in response to the rise of generative AI. CEO Benjamin Fabre told WIRED that he has seen a surge in customers looking for protection against AI-related scrapers. “Seventy percent of our customers reach out to us asking to make sure DataDome is blocking ChatGPT” and other large language models, he says.
Although companies like DataDome are well-established, they cater to large corporations and charge accordingly; they’re usually not accessible to individuals. Kudurru’s arrival, then, is promising precisely because it offers a free tool aimed at regular people.
Still, Kudurru is far from a broad or permanent solution for artists who want to stop AI scraping; even its creators envision it as a stopgap measure as people wait for meaningful regulatory or legislative action to manage how AI is trained. Most artist advocates believe that these companies will not stop scraping for training data voluntarily.
Copyright activist Neil Turkewitz sees it as a “speed bump” for AI generators, not an industrywide fix. “I think they’re great. They should be developed, and people should use them,” Turkewitz says. “And it’s absolutely essential we don’t view these technical measures as the solution.”
“I applaud attempts to develop tools to help artists,” Crabapple says. “But they ultimately put the burden on us, and that’s not where it should be. We shouldn’t have to play whack-a-mole to keep our work from being stolen and regurgitated by multibillion-dollar companies. The only solution to this is a legislative one.”
A larger-scale, permanent change in how generators train will likely need to come from governments. Some companies are attempting to placate critics by creating opt-out features, where people who don’t want their work to be used can ask to be removed from future training sets. Many artists view these measures as half-baked at best; they want to see a world in which training takes place only if they’ve opted into participation.
To make matters worse, companies have started developing their own opt-out protocols one by one rather than settling on a common system, making it time-consuming for artists to withdraw their work from each individual generator. (Spawning previously worked on an early opt-out tool for Have I Been Trained? but sees the fragmentation as “disappointing,” according to Meyer.)
The European Union has come the furthest in developing legal frameworks for artistic consent to AI training. “It’s going incredibly well,” Toorenent says. She is optimistic that the AI Act could be the beginning of the end of the training free-for-all. Of course, the rest of the planet would have to catch up, and even the AI Act would help artists enforce choices to opt out rather than shift the model to opt-in. In other words, an opt-in training structure remains a long, long way from reality. In the meantime—well, there’s Kudurru.