
ultra_prescriptivist t1_je28wqt wrote

In most cases, normalization doesn't affect dynamic range at all; all it does is adjust the volume to a pre-defined level (measured in LUFS).
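As a rough sketch of what that adjustment amounts to (the -14 LUFS target and the measured loudness below are illustrative assumptions, not Spotify's exact pipeline):

```python
import numpy as np

def normalization_gain_db(measured_lufs: float, target_lufs: float) -> float:
    # The gain needed to move a track from its measured loudness to the target.
    return target_lufs - measured_lufs

def apply_gain(samples: np.ndarray, gain_db: float) -> np.ndarray:
    # Every sample is scaled by the same factor, so the waveform keeps its shape
    # and the gap between the loud and quiet parts stays exactly the same.
    return samples * (10.0 ** (gain_db / 20.0))

gain = normalization_gain_db(measured_lufs=-18.0, target_lufs=-14.0)
print(gain)  # 4.0 -> a quietly mastered track is simply turned up by 4 dB
```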

Spotify differs from other streaming services in that Premium users can choose between three normalization levels - Quiet, Normal, and Loud.

The funny thing is that "Loud" here doesn't always mean that normalization turns the perceived volume up. If a track was mastered fairly loud already, enabling normalization and setting it to Loud may even drop the volume. You can see this on Daft Punk's Give Life Back to Music. Notice also how the shape of the waveform stays the same, since no dynamic range compression is being applied.
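To put rough numbers on that (the LUFS figures here are illustrative guesses, not measurements of the track, and -11 LUFS is only the commonly cited value for the Loud target):

```python
measured_lufs = -8.0   # hypothetical loudness of an already-loud master
loud_target = -11.0    # commonly cited Loud target; treat it as an assumption
gain_db = loud_target - measured_lufs
print(gain_db)  # -3.0 -> normalization turns the track *down* by 3 dB
```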

However, the one situation where the normalization setting does affect dynamic range is when a track with high dynamic range was mastered relatively quietly and normalization is set to Loud. The loud parts might then be pushed above digital full scale and clip, so a limiter has to be applied to keep them from distorting. This applies to most classical music, such as this recording of Mahler's 5th Symphony.
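A crude way to picture why that limiter eats into the dynamic range (hard clipping is used here purely for illustration; a real limiter smooths the gain reduction with attack and release):

```python
import numpy as np

def loud_normalize_with_limiter(samples: np.ndarray, gain_db: float,
                                ceiling: float = 1.0) -> np.ndarray:
    # Apply the make-up gain needed to hit the loud target...
    gained = samples * (10.0 ** (gain_db / 20.0))
    # ...then pull any peaks that would exceed digital full scale back down.
    return np.clip(gained, -ceiling, ceiling)

rng = np.random.default_rng(0)
quiet_dynamic = 0.05 * rng.standard_normal(48_000)  # quiet "body" of the track
quiet_dynamic[::4_000] = 0.7                        # occasional loud peaks
loud = loud_normalize_with_limiter(quiet_dynamic, gain_db=9.0)
# The body rises by ~9 dB, but the peaks slam into the ceiling, so the gap
# between loud and quiet parts shrinks - that shrinkage is the lost dynamic range.
print(quiet_dynamic.max(), loud.max())
```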

Notice how the Normal setting looks the same as having normalization switched off, but the Loud setting has compressed the track significantly, taking it from a DR (dynamic range) value of 12 down to around 6.
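The DR number in a reading like that is essentially a measure of how far the peaks sit above the average level. A rough stand-in (not the official DR meter algorithm, which works on the loudest sections of the track) is the crest factor:

```python
import numpy as np

def crest_factor_db(samples: np.ndarray) -> float:
    # Peak level minus RMS level, in dB - only a rough proxy for a DR reading.
    peak = np.max(np.abs(samples))
    rms = np.sqrt(np.mean(samples ** 2))
    return 20.0 * np.log10(peak / rms)
```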

For Spotify users who want to avoid any dynamic range compression, leaving normalization enabled on the Quiet or Normal setting is fine - they just need to be careful with the Loud setting when listening to certain types of music.
