Submitted by Sieventer t3_10n7gj7 in singularity
LittleTimmyTheFifth5 t1_j67cxnb wrote
Here's a thought: how would the music industry react to that? I sense there would be a lot of legal fights and claims. Besides, they could probably privately license it to companies to make some money off it or something.
Spire_Citron t1_j684p96 wrote
If there's one industry you don't want to get into a fight with over copyright, it's the music industry. I think you're right. If the music industry challenges them and wins, it could impact all their other AI ventures.
pressurepoint13 t1_j688n2n wrote
Everyone will eventually lose to AI. It's inevitable.
SurroundSwimming3494 t1_j69cz5c wrote
Yeah, let's have one industry choose our future on behalf of the entirety of humanity.
Not that this sub would have a problem with that, of course.
pressurepoint13 t1_j69gs6m wrote
Didn't say that's what I wanted.
SurroundSwimming3494 t1_j6a9ctp wrote
Sorry.
pressurepoint13 t1_j6amfyo wrote
My algorithm does not acknowledge that word.
SurroundSwimming3494 t1_j6as79b wrote
wut
IONaut t1_j6dmw2b wrote
The question of copyright infringement by AI training will probably be decided in the class action lawsuit brought by visual artists against Stability AI before the music industry even gets its chance.
Glittering-Neck-2505 t1_j6btcx9 wrote
I think the bigger factor here is that research prototypes aren't typically released to the public. Companies that have released public betas have done so after numerous iterations behind closed doors. These things are not public by default. I'm going to get pushback for saying this, but I don't know why people feel entitled to freely test out R&D prototypes.
ChatGPT has been the exception, not the rule.