
TemperatureStatus435 t1_j9gk2mn wrote

Regularization applies in a loose sense, but there are several different kinds, so you need to be more specific. For example, a plain autoencoder uses a bottleneck layer to learn information-dense representations of the input space, and it may also employ some mathematical regularization (e.g. a weight penalty) so that the raw numbers don’t explode to infinity.
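As a minimal sketch (the function names and values here are illustrative, not from any particular library), a plain autoencoder’s training loss is just reconstruction error, optionally plus an L2 weight penalty of the kind mentioned above:

```python
import numpy as np

def ae_loss(x, x_hat, weights, l2=1e-4):
    # Reconstruction error: how well the decoder rebuilt the input.
    recon = np.mean((x - x_hat) ** 2)
    # Optional L2 weight penalty: the "mathematical regularization"
    # that keeps raw parameter values from blowing up.
    penalty = l2 * sum(np.sum(w ** 2) for w in weights)
    return recon + penalty

x = np.array([1.0, 2.0, 3.0])
x_hat = np.array([1.1, 1.9, 3.2])       # hypothetical decoder output
w = [np.array([[0.5, -0.5], [0.3, 0.1]])]
print(ae_loss(x, x_hat, w))
```

Note that this penalty only constrains the weights; it says nothing about the shape of the bottleneck activations themselves.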

However, a Variational Autoencoder employs the above methods plus an additional type of regularization: a KL-divergence term that pushes the distribution of the bottleneck layer toward a standard Gaussian. This is extremely useful, but for entirely different reasons (it makes the latent space smooth and well-shaped enough to sample from).
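That extra VAE regularizer has a well-known closed form when the encoder outputs a diagonal Gaussian; a sketch (assuming the encoder produces `mu` and `log_var` vectors):

```python
import numpy as np

def kl_to_standard_normal(mu, log_var):
    # Closed-form KL divergence between the diagonal Gaussian
    # N(mu, exp(log_var)) and the standard normal N(0, I).
    # This is the extra term a VAE adds to its loss to pull the
    # bottleneck distribution toward a Gaussian shape.
    return 0.5 * np.sum(np.exp(log_var) + mu ** 2 - 1.0 - log_var)

# When the encoder already outputs a standard normal, the penalty vanishes.
print(kl_to_standard_normal(np.zeros(4), np.zeros(4)))  # 0.0
```

Unlike an L2 weight penalty, this term constrains the *activations’* distribution, not the parameters, which is why the two kinds of regularization serve such different purposes.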

Long story short, don’t just say “regularization” and think you understand what’s going on.
