generative ai poison pill tools for protecting artists are coming along nicely.
benn jordan is a musician who is working with a university project to poison songs against ai sampling and to scramble the audio cues that voice recognition systems like alexa use to understand/spy on you. harmonycloak is the software.
glaze and nightshade do the same job for digital art. glaze cloaks an image so the perturbation throws off style mimicry without changing what a human sees, while nightshade goes further and actively corrupts any model trained on it.
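for a feel of the mechanics, here is a toy sketch of the general idea behind that kind of cloaking (not glaze's or nightshade's actual method): nudge the pixels within a tiny per-pixel budget so a feature extractor reads a completely different style while the image looks unchanged to a person. the encoder below is a random stand-in and the eps/steps numbers are made up.

import torch
import torch.nn as nn

# stand-in for a real style/feature encoder; the real tools target actual generator encoders
encoder = nn.Sequential(
    nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
    nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
    nn.Flatten(), nn.Linear(32 * 16 * 16, 128),
)
for p in encoder.parameters():
    p.requires_grad_(False)

def cloak(image, decoy_features, eps=4 / 255, steps=200, lr=1e-2):
    # push the image's features toward a decoy style while keeping the
    # per-pixel change under eps so a human can't see the difference
    delta = torch.zeros_like(image, requires_grad=True)
    opt = torch.optim.Adam([delta], lr=lr)
    for _ in range(steps):
        feats = encoder((image + delta).clamp(0, 1))
        loss = nn.functional.mse_loss(feats, decoy_features)
        opt.zero_grad()
        loss.backward()
        opt.step()
        delta.data.clamp_(-eps, eps)  # imperceptibility budget
    return (image + delta).detach().clamp(0, 1)

art = torch.rand(1, 3, 64, 64)                      # placeholder artwork
decoy = encoder(torch.rand(1, 3, 64, 64)).detach()  # features of an unrelated decoy image
protected = cloak(art, decoy)
print((protected - art).abs().max())                # change stays within eps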
there will be a fun little war as companies scrape data without permission and find their models getting corrupted, producing worse results the more they scrape. the harmonycloak example was done on a nv5080 over 2 weeks, but it is still at a very early experimental stage. as future cpus incorporate more tensor and graphics compute units it should become available at reasonable prices for any starting artist, and new companies can offer the poison injection processing as a service for those who don't want to deal with the hassle.
i imagine the model trainers will have to come up with a filtering process that compares known-good outputs before and after each batch of newly scraped data is added to the model, but even then the scope and variety of named styles they are trying to steal from will make the process time-consuming and expensive. something like the rough sketch below is what i have in mind.
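to be clear this is just a hypothetical canary check, not any real training api; train_on, generate and quality_score are placeholders for whatever a trainer actually uses:

# hypothetical "canary" regression check: generate from a fixed prompt set
# before and after fine-tuning on a scraped batch, and reject the batch if
# average quality drops. train_on, generate and quality_score are placeholders.
CANARY_PROMPTS = ["a watercolor fox", "a photo of a red bicycle", "a portrait in a named style"]
MAX_DROP = 0.02  # tolerated drop in mean quality score

def batch_is_clean(model, scraped_batch, train_on, generate, quality_score):
    baseline = sum(quality_score(generate(model, p)) for p in CANARY_PROMPTS) / len(CANARY_PROMPTS)
    candidate = train_on(model, scraped_batch)  # fine-tune a throwaway copy of the model
    after = sum(quality_score(generate(candidate, p)) for p in CANARY_PROMPTS) / len(CANARY_PROMPTS)
    return (baseline - after) <= MAX_DROP

# the catch: every named style needs its own canary prompts and its own
# throwaway fine-tune, which is exactly where the time and money goes.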
the startup tech bros have already started complaining about poisoned data; when it hits the big corporations, the fallout should be delicious.