The Krita Foundation, an independent non-profit that develops free graphics software for artists, recently noted that Adobe has added a content analysis setting to Creative Cloud accounts. The setting permits the company to analyze users' content "using techniques such as machine learning (for example, for pattern recognition) to develop and improve our products and services."
Many apps can share data with their developers, sending telemetry such as how often the app or certain features are used, or the causes of crashes. Critics point to Microsoft's example: many people were angered to learn that telemetry in Windows 10 is enabled by default and cannot be fully disabled. Adobe's setting works the same way: your data is analyzed unless you opt out of the service.
Opting users in by default is bad enough, but worse still is switching to a new data-sharing scheme and enrolling existing users automatically. Adobe told PetaPixel that this content analysis is "not new and has been around for a decade." If the company had been using machine learning for this purpose, and saying so, for ten years, it is remarkable that nobody noticed in all that time; that seems implausible. More likely the policy existed in some form but was expanded quietly.
Okay, we know… We made fun of the Adobe cloud service when it went down. We mocked Corel Painter and Clip Studio. We joined the protest against AI-generated images. We have clarified our position on NFTs. But this is beyond mocking. This is EW! EW! EW! pic.twitter.com/40wBWYci7V
— @Krita@mastodon.art (@Krita_Painting) January 4, 2023
A closer look at the company's content analysis FAQ reveals that the policy was updated in August of last year and applies to images, audio, video, text, and documents stored on its cloud servers. The company makes clear that it is not interested in content processed or stored locally on users' devices.
Adobe's latest content analysis policy could allow the company to collect its own AI training data without scraping the web. Some artists are already frustrated that their work is being scraped and used to build generative AI models without their consent, allowing anyone using text-to-image tools like DALL-E, Midjourney, or Stable Diffusion to create content that mimics their style.
These tools can create almost any kind of image from just a few words, including images that clearly evoke the work of specific artists. Users can reference those artists by name, often combined with prompt phrases such as "in the style of" or "by." The actual uses of these tools range from personal entertainment to more commercial endeavors.
An even more fundamental concern arises when artists discover that their work has been used to train artificial intelligence: their own art is being used to develop software that could one day disrupt their livelihoods. Anyone who creates images with systems like Stable Diffusion or DALL-E can then sell them, subject to each service's terms regarding copyright and ownership. "I don't want to be involved in the machine in any way that would devalue what I'm doing," said illustrator and printmaker Daniel Danger, who learned that a number of his works had been used to train Stable Diffusion.
Some services, including OpenAI's DALL-E, do not disclose the datasets on which their AI systems are trained. But with Stable Diffusion, Stability AI is open about its origins. The underlying model is trained on image–text pairs selected for visual quality from a much larger cache of images and text scraped from the internet. The full dataset, known as LAION-5B, was created by the German AI non-profit LAION, short for "Large-scale Artificial Intelligence Open Network."
While Adobe has built a number of AI features that help users convert a 2D image into 3D or quickly search videos for specific clips, the company does not yet have its own text-to-image product. Instead, it sells AI-generated work on its stock image platform.
A company spokesperson said: "We give customers complete control over their choices and privacy settings. The policy discussed is not new and has been in place for ten years to help us improve our products for customers. For anyone who prefers not to have their content analyzed, we offer that option."
Tara McPherson, a Pittsburgh-based artist whose work has appeared on toys, clothing, and in films such as the Oscar-winning movie Juno, also worries about losing work to artificial intelligence. She feels "exploited" because her work was included in the original Stable Diffusion dataset without her knowledge. How easy will generating such art become, and how polished will it be? For now, she says, it sometimes falls short, but its progress is surprising.
With respect to generative AI, the spokesperson added that Adobe does not use any data stored in customers' Creative Cloud accounts to train its experimental generative AI features, and that the company is revising its policy to better define use cases for generative AI.
The practice of scraping images and other content from the internet to build datasets is not new, and it has traditionally been defended under a principle of US copyright law called "fair use," which authorizes certain uses of copyrighted works. Indeed, many of these copyrighted images have been used in very different ways, such as teaching computers to identify cats.
But the data sets are getting bigger, and they’re shaping increasingly powerful AI systems, including these generative systems that, more recently, anyone can use to create remarkable images in an instant.
Source: Krita Foundation
How do you feel about this situation? Did such an initiative surprise you?
Are you for or against such initiatives?