Artificial intelligence | Artists accuse Meta of vampirizing their work

For years, painters, photographers and other artists have flocked to Instagram to publish their works and make themselves known. Now, many are parting ways to prevent Meta, which owns Instagram, from using their art to train its artificial intelligence (AI) models.



They denounce Meta on their accounts, where many announce that they are migrating to Cara, an online portfolio platform for artists that bans AI-generated works and AI training. In May, a Meta executive said the company considers any Instagram post usable for training its AI. Shortly afterwards, Meta informed European users that their content would be used for this purpose starting June 26. There is no way to opt out, although European law allows users to object to the use of their personal data.

According to AI companies, almost the entire public internet can be used to train AI, which could in turn replace the authors, musicians and visual artists who created that “training data”.

Tension is rising and artists are caught in a bind: they need Meta’s apps for visibility, but they cannot stop AI from vampirizing their works. Some say they are already close to losing their livelihood.

Instagram defectors

According to Cara founder Jingna Zhang, her platform grew from 40,000 to 650,000 users last week. At one point, it was the fifth most downloaded app, according to Apple. It is unclear if this exodus will have any effect on Meta.


“I’m losing sleep over it,” said Ms. Zhang, a photographer and advocate for artists’ rights. “We didn’t expect this.”

Many artists, including Ms. Zhang, are suing Google, Stability AI and other AI companies, accusing them of training their systems on online content, some of it copyrighted. Authors and publishers, including George R.R. Martin (Game of Thrones) and The New York Times, have filed similar suits. The defendants argue that this use is permitted by the “fair use” doctrine, which allows the remixing and reinterpretation of existing content.

PHOTO TAKEN FROM THE CARA SITE

Cara, a free platform, crashed several times this week, overwhelmed by mass signups from hundreds of thousands of Instagram defectors.

In the meantime, artists are scrambling to protect their future works, relying on unproven alternatives.

Cara, launched as a free platform in January 2023, is still under development and has crashed several times this week, overwhelmed by registrations, says Ms. Zhang. Available on iOS, Android and the web, its home page resembles Instagram’s, with “like”, “comment” and “repost” buttons.

Artist Eva Redamonti has looked at “four or five” alternatives to Instagram but finds it difficult to assess which one best protects her interests. According to Ben Zhao, a computer science professor at the University of Chicago, several apps have lured artists with false promises, quickly revealing themselves to be “AI farms” where their works are scraped. Mr. Zhao and his colleague Heather Zheng created the Glaze tool, integrated into Cara, which is meant to protect artists’ work against AI imitation.

Cara prohibits users from posting AI-generated works until “ethical and data privacy issues” have been resolved.

Cara uses an AI-detection tool from the company Hive to catch violators and tags each uploaded image with a “NoAI” label to discourage scraping. In practice, though, there is no way to stop AI companies from helping themselves.

Some artists say AI has already cost them income.

Ten years ago, Kelly McKernan, an illustrator from Nashville, joined Facebook and Instagram, which quickly became her best source of clients. But from 2022 to 2023, her revenue from that showcase fell by 30% as AI-generated images proliferated online. Last year, she typed her name into Google and the first result was an AI-generated image mimicking her style.

Ms. McKernan and two other artists are suing AI companies, including Midjourney and Stability AI.

Terms of use

Independent illustrator Allie Sullberg moved to Cara this week, following the lead of many artist friends who denounced Meta and deserted Instagram. She says she is outraged that Meta presents its AI products as tools for creators, even though creators do not benefit in any way from the use of their works to train the AI.

PHOTO LIONEL BONAVENTURE, AGENCE FRANCE-PRESSE

Artists criticize Meta for turning against content creators.

Meta’s terms of use specify that all users accept the company’s AI policy. But Ms. Sullberg points out that she joined Instagram in 2011, 10 years before OpenAI launched DALL-E, its first consumer image-generation model, in 2021.

According to Meta spokesperson Thomas Richards, the company does not offer an opt-out. “Depending on where they live […] and local privacy laws […] people can object to their personal information being used to build and train AI,” he says.

Jon Lam, a video game artist and creators’ rights activist, spent hours searching Instagram for a way to keep his works from being vampirized to feed AI. He found a form, but it applies only to European users, who are protected by privacy law. Mr. Lam says he feels “anger and fury” toward Meta and other AI companies.

These companies are turning against their customers. We were sold the false promise that social media was about staying in touch with family and friends by sharing what’s happening in our lives.

Jon Lam, video game artist

“Ten years later, it has become just a platform used to collect data” to feed their AI.

Ms. McKernan says she hopes the major lawsuits filed by creators will push AI companies to change their policies.

“It’s complacency that allows companies like Meta to continue to treat content creators, the very people who make them money, the way they do,” she says.

This article was originally published in the Washington Post.


