dbwalton opened this issue on Oct 23, 2020 · 238 posts
DarkElegance posted Thu, 08 August 2024 at 6:55 PM
I agree completely!

Torquinox posted at 7:40 PM Tue, 6 August 2024 - #4488114
So, there is a layer of reality that has been excluded from the conversation. Daz will say they "trained the AI with their own data," but that's only the top layer of data, the frosting. The underlying layer is made of stolen data, the same as every other AI generator out there. There are no ethical training data sets. They're all based on mass theft, because it takes billions of images, enormous volumes of text, and enormous computing power to make the AI "smart enough" to do anything with the layer of frosting on top, the actual Daz data.

That's the biggest frustration. There isn't a commitment to "delete it all and start over to placate the artists we screwed over," and anyone who thinks artists are "overpaid" is trying very hard to jump on this and make it work - hell, anyone who thinks people are overpaid for anything is trying very, very hard to jump on this and make it work.

It all goes back to the LAION data sets. It's well documented that those are made of stolen data. Of course, you're not allowed to say "stolen data" on the Daz site because it implies that some company did something wrong - which, of course, all the AI companies did when they scraped everything from everyone on the internet to train their AIs under the false pretenses of research and fair use. Now the AI companies are making money off it, and the AI industry has billions and billions of dollars being pumped into it. This apparently makes it all OK because we're all just little people dumb enough to share our work on the internet. So it's OK to steal from us. And it is theft: when a company takes our work and incorporates it into its AI training data sets without our consent or compensation, that's theft. It was also OK to steal from famous artists, and from Stephen King, too.
There are a few artists attempting lawsuits, but who knows how those will go? I do not expect much in the way of legal remedy. It is known that AIs have a habit of regurgitating parts of works, or entire works, in their output. It is also known that earlier models are used as the starting point for later models, and it is pretty much impossible to remove the stolen works from the data set. So once it's in there, it's in there.
It also shows how broken the entire copyright system is, since it pretty much expects artists to go after the companies in court for enforcement and is otherwise slanted in the direction of the biggest coffers. As long as the companies in question get to rake the internet, artists lose their biggest and best form of advertisement: it's a serious risk to put anything online, because it will get fed into one of these engines unless you do a lot of work to poison the input. And poisoning only really works if enough artists do it - but not so many that the engineers can simply add a special case to inoculate against the poisoning. Granted, theft was already a problem (just look at the art that gets downloaded and slapped on t-shirts!), but generative imagery engines make the problem so much bigger, because now an artist's style can be "emulated," which is much harder to point at in court as a direct copyright violation.
https://www.darkelegance.co.uk/