
poo2thegeek t1_j6i59po wrote

So, while this is certainly true, for something to come under copyright it has to be pretty similar to whatever it's copying.

For example, if I want to write a book about wizards in the UK fighting some big bad guy, that doesn't mean I'm infringing on the copyright of Harry Potter.

Similarly, I can write a pop song that discusses, idk, how much I like girls with big asses, and that doesn't infringe on the copyright of the hundreds of songs on the same topic.

Now, I do think that if an AI model outputs something that's too similar to some of its training material, and the company that owns said AI goes ahead and publishes it, then yeah, the company should be sued for copyright infringement.

But it is certainly possible for AI to output completely new things. Just look at the AI art that has been generated in recent months - it's certainly making new images based off what it's learnt a good image should look like.


Also, on top of all this, it's perfectly possible to prevent (or at least massively decrease the probability of) the model outputting something too similar to its training inputs, by 'punishing' the model during training whenever it produces something too close to the training data - rough sketch of what I mean below.
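
(To make the 'punishing' part concrete, here's a toy illustration of my own - not how any real model is actually trained - where an extra penalty gets added to the training loss when the output shares too many n-grams with a training example. All the function names and thresholds are made up for the example.)

```python
# Toy sketch: penalise outputs that overlap too heavily with training text.
# Hypothetical names/thresholds, purely illustrative.

def ngrams(text, n=5):
    """Set of word n-grams in a piece of text."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap_penalty(generated, training_example, n=5, threshold=0.3, weight=10.0):
    """Penalty based on the fraction of the output's n-grams found in the training text."""
    gen = ngrams(generated, n)
    if not gen:
        return 0.0
    overlap = len(gen & ngrams(training_example, n)) / len(gen)
    # Only punish when the overlap is suspiciously high.
    return weight * max(0.0, overlap - threshold)

# During training you'd add this on top of the usual loss, something like:
# total_loss = base_loss + overlap_penalty(model_output, nearest_training_text)
```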


All this means that I don't think this issue is anywhere near as clear cut as a lot of the internet makes it out to be.
