@eccentrix said:
@thepanzini said:
@devoureroftime:
If the reference material used is copyrighted content, then procedural generation essentially wouldn't be any different; the same goes in reverse for AI.
This is something I don't understand about complaints about AI using copyrighted material to generate assets - humans do the same thing. No artist has gotten their education by only looking at public domain art. Musicians don't list Audio Network or Wikimedia Commons as sources of inspiration during interviews. I don't know why computers getting ideas from existing content is any worse than humans doing it.
It's an extremely complicated topic, but a lot of it comes down to licensing, attribution, and the potential for harm to living, working artists. A great example is Getty Images. One of the largest image generators was found to have scraped thousands upon thousands of photos from Getty's website to use as training data, without permission from Getty or any acknowledgement that it had done so (this came to light because the generator was putting broken, janky-looking versions of the Getty watermark on generated images). Getty paid photographers for all of those photos and licensed them in ways that should have precluded their use as training data. So we end up with a case where Getty had the very thing it sells stolen, and the work of thousands of photographers was used to build a tool that could put some of those same photographers out of work (why pay Getty for generic stock photos when you can generate photos that accomplish the same thing for articles?). It should be obvious that this is harmful to working photographers. It's no different for any other form of art.
I don't know why computers getting ideas from existing content is any worse than humans doing it.
Humans don't get a free pass on this, so why should computers? If an artist blatantly copies another's work without attribution and gets found out, they have an extremely short window of time to fix the problem before it often becomes a career-ending thing for them. A good recent example is Olivia Rodrigo. She had to retroactively give several other artists songwriting credits on her debut album because several of her songs were found to be suspiciously similar to existing ones (the most obvious example being "good 4 u", which took the entire song structure, various chord sequences, and other bits from Paramore's "Misery Business"; it was so blatant that you could play the two songs side by side and they matched up almost perfectly at times). She said these artists were huge inspirations for her and that she was paying homage to them, etc. etc., but the reality is that had she not given them credit, she would have been in heaps of trouble and probably had her career ended.
Taking inspiration from and copying are two entirely different things, and current generative AI mostly does the latter, not the former. Taking someone else's work and shuffling things around a tiny bit doesn't make what you're doing not plagiarism. For example, if I take the idea of Nile Rodgers-style jangly disco guitar and put it in my own distinct disco song, that's generally fine. If I completely rip off "We Are Family" but change the lyrics and shift a couple of chords, chances are Nile's not gonna be happy with me unless I get his permission and give him the credit that will allow him to be paid for the work I'm copying. If you asked a generative AI to write a song in the style of Nile Rodgers, it would probably do something closer to the latter example, because that's how these models work. They can't create original work, because everything they make is inherently derivative.
Remember, when thinking about this stuff, you have to give no benefit of the doubt to the people pushing it, because they are catering to the shittiest, most cynical business people, who would happily fire a bunch of artists to make a number go up slightly. They don't care about art or quality; all they care about is the number going up. We've already seen what's happened to creative jobs in the game industry over the last year (and that was without generative AI). We don't need to see that happen in every other creative industry too because some suits think they can save a bit of money by using a bunch of shitty, generic, generative AI assets.