gkaykck t1_j1v72ay wrote

I think if this is going to be implemented, it has to be at the model level, not as an extra layer on top. Just thinking out loud with my not-so-great ML knowledge: if we mark every image in the training data with some special, static "noise" that is unnoticeable to human eyes, then every image the model generates should carry that same "noise". That would hold even for open-source alternatives running on your own cluster, since the mark is baked into the model rather than bolted on afterwards. So if this kind of "watermarking" is going to happen, it needs to be done in the model itself.

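A minimal sketch of what that static-noise marking could look like, assuming a fixed pseudorandom pattern derived from a secret seed, added at low amplitude and later detected by correlation. The seed, amplitude, image size, and function names here are all illustrative, not how any real system actually does it:

```python
import numpy as np

SECRET_SEED = 42          # hypothetical secret known only to the model provider
AMPLITUDE = 3.0           # noise strength in 8-bit pixel units (barely visible)
SHAPE = (256, 256, 3)     # fixed image size assumed for simplicity

def watermark_pattern(shape, seed=SECRET_SEED):
    """Fixed zero-mean +/-1 pattern, reproducible from the secret seed."""
    rng = np.random.default_rng(seed)
    return rng.choice([-1.0, 1.0], size=shape)

def embed(image, amplitude=AMPLITUDE):
    """Add the imperceptible pattern to an image (uint8 array)."""
    pattern = watermark_pattern(image.shape)
    marked = image.astype(np.float64) + amplitude * pattern
    return np.clip(marked, 0, 255).astype(np.uint8)

def detect(image):
    """Normalized correlation between the image and the secret pattern.
    Scores near 0 suggest an unmarked image; clearly positive scores
    suggest the pattern is present."""
    pattern = watermark_pattern(image.shape)
    pixels = image.astype(np.float64)
    pixels -= pixels.mean()
    return float((pixels * pattern).mean() / (pixels.std() + 1e-9))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    clean = rng.integers(0, 256, size=SHAPE, dtype=np.uint8)
    marked = embed(clean)
    print("clean score: ", detect(clean))   # ~0
    print("marked score:", detect(marked))  # noticeably positive
```

In a real system the pattern would have to survive the training process and the model's own generation noise, which is a much harder problem than this toy embed/detect pair, but the basic idea of a fixed, secret, imperceptible signature is the same.
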
As for why OpenAI would do it: it would be nice for them to be able to track where their generated pictures/content end up, for investors etc. It could also help them "license" the images generated with their models instead of charging per run.

1