pilooch t1_iwybbh6 wrote
Hey there, this is a truly difficult problem. My colleagues and I train very precise GANs on a daily basis, but we gave up on inversion and latent control a couple of years ago, and we no longer need them.
My raw take is that the GAN latent space is too compressed/folded for low-level control. When fine-tuning image-to-image GANs, for instance, we do get a certain fine control over the generator, though we 'see' it snap to one 'mode' or another. In other words, we witness a lack of smoothness that may implicitly prevent granular control.
Haven't looked at the theoretical side of this in a while, though, so you may well know better...
bloc97 t1_iwyeh1x wrote
>the GAN latent space is too compressed/folded
I remember reading a paper showing that GANs often fold many dimensions of the "internal" latent space into singularities, with large swathes of flat space between them (this is related to the mode-collapse problem of GANs).
Back to the question: I guess that when OP tries to invert the GAN using gradient descent, they are probably getting stuck in a local minimum. Try a global-search metaheuristic on top of the gradient descent, like simulated annealing or a genetic algorithm?
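For a concrete (if untested) starting point, here's a rough basin-hopping sketch in PyTorch: simulated-annealing-style acceptance wrapped around local gradient descent on the latent. The generator `G` (mapping a latent `z` to an image), the `target` image, and every hyperparameter here are illustrative assumptions, not a vetted recipe.

```python
import math
import torch

# Hypothetical setup: G is a pretrained generator z -> image,
# target is the image we want to invert. Shapes must match G's output.

def local_descent(G, target, z, steps=200, lr=0.05):
    """Local search: gradient descent on z to minimize reconstruction loss."""
    z = z.clone().requires_grad_(True)
    opt = torch.optim.Adam([z], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        loss = torch.nn.functional.mse_loss(G(z), target)
        loss.backward()
        opt.step()
    return z.detach(), loss.item()

def basin_hopping_invert(G, target, latent_dim=512, hops=20, temp=1.0, cooling=0.9):
    """Global search: perturb z, re-descend, and accept Metropolis-style."""
    z, cur_loss = local_descent(G, target, torch.randn(1, latent_dim))
    best_z, best_loss = z, cur_loss
    for _ in range(hops):
        # Jump to a nearby basin, then descend locally from there.
        cand, cand_loss = local_descent(G, target, z + 0.5 * torch.randn_like(z))
        # Metropolis acceptance: always take improvements, occasionally take
        # worse candidates so the search can escape the current basin.
        if cand_loss < cur_loss or math.exp((cur_loss - cand_loss) / temp) > torch.rand(1).item():
            z, cur_loss = cand, cand_loss
            if cand_loss < best_loss:
                best_z, best_loss = cand, cand_loss
        temp *= cooling  # cooling schedule: accept fewer uphill moves over time
    return best_z, best_loss
```

Multiple random restarts (dropping the acceptance step and just keeping the best of several descents) would be a simpler variant of the same idea; a genetic algorithm would instead maintain a population of latents and recombine the best ones between descent rounds.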