
AudiophileHeaven OP t1_jd8h1wy wrote

The AAC file codec and AAC over Bluetooth are two separate things, from what I can gather. In fact, AAC Bluetooth sound is extremely different between iPhones and Android phones, as I pointed out there.

The sonic assessment is based on double-blind testing, but it is tightly connected to the available bandwidth. Bluetooth compression has to happen in real time and has to be a low-power process, so by design it is less efficient than offline MP3 or AAC file encoding. We also have to take into account that the files are most likely already lossy in nature, so the quality of anything transmitted over Bluetooth is always lower than what was stored on the computer.
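To put a toy model on that last point (my own sketch, nothing from the article): treat each lossy stage as a brick-wall cut on the spectrum. The Bluetooth hop can never restore what the file encoder already threw away, so the cascade is bounded by the lossier stage.

```python
import numpy as np

rng = np.random.default_rng(0)
signal = rng.standard_normal(4096)  # stand-in for a block of PCM audio

def toy_lossy(x, keep_fraction):
    """Crude lossy 'codec': discard the top (1 - keep_fraction) of FFT bins."""
    spec = np.fft.rfft(x)
    spec[int(len(spec) * keep_fraction):] = 0.0
    return np.fft.irfft(spec, n=len(x))

def rms_error(x, y):
    return np.sqrt(np.mean((x - y) ** 2))

file_encode = lambda x: toy_lossy(x, 0.6)  # "MP3/AAC file": harsher cut
bt_encode   = lambda x: toy_lossy(x, 0.8)  # "Bluetooth link": milder cut

once    = bt_encode(signal)               # lossless source sent over Bluetooth
cascade = bt_encode(file_encode(signal))  # lossy file, then Bluetooth

# The cascade can never beat the single milder stage:
print(rms_error(signal, cascade) >= rms_error(signal, once))  # True
```

Real codecs are nothing like a brick-wall filter, of course, but the bound holds: the second encoder only sees what the first one left behind.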

1

ultra_prescriptivist t1_jd8kdg7 wrote

Fair enough, I suppose that makes sense. I can see how the real-time transcoding issue could account for a more noticeable degradation than locally stored lossy files.

I don't really do Bluetooth audio, so I often fail to appreciate that it really is quite a different ballgame.

For Android users, what would you say is the best codec to use for a good balance between performance and reliability? LDAC or one of the aptX variants?

3

AudiophileHeaven OP t1_jda4hrj wrote

aptX HD is theoretically the best sounding to date, while being the most stable. The idea that LDAC sounds better applies only when there is no network congestion.

LHDC can be a bit superior but it is not widely supported at the moment.

1

ultra_prescriptivist t1_jdalv4a wrote

Thanks for that, although I notice you're now being a bit more careful when making claims about what "theoretically" sounds better.

You really should have used such caution when writing this article, to be honest. In a technical piece about Bluetooth codecs, it seems to me that subjective impressions should be kept well out of it.

5

AudiophileHeaven OP t1_jdavh9y wrote

My take on audio is to include subjective impressions with everything, because if something should be good on paper but in practice isn't, I want to call it out. For example, I had the poor experience of not liking some of the high-end audiophile software, and I prefer Foobar2000 over most other programs for listening to music, not because it sounds better, but because I tend to find the others cumbersome. As for Bluetooth, I think there's a HUGE difference between what can sound best and what will sound best in most average situations, like an airport or a gym, where network congestion plays a huge role. That is why I tried to also cover how it sounds to the ear, in real-life scenarios, not just in theory.

1

giant3 t1_jdb74sg wrote

> AAC codec and AAC bluetooth are two separate things

You couldn't be more wrong. The AAC bit stream is very clearly documented, and two different decoders should produce the same audio output after decoding.

Any difference that exists is in the encoding. There were differences 15 years ago, when the iOS encoder was superior, but now Android and iOS are on par.

1

blorg t1_jdci6ny wrote

This Soundguys comparison is from 2018, and it shows major differences in AAC encoding between iOS and Android, and for that matter between different Android phones.

I don't think simply looking at the high-frequency cut-off tells you much about audio quality; it's not the largest factor, particularly when many of the cut-offs are high enough that I know I wouldn't hear the difference personally. But it does objectively indicate that there were at least differences between the encoders as recently as 5 years ago; you don't need to go back 15.

AAC on my phone running Android 12 is also still truncated, I think at around 17kHz. Subjectively, I think it sounds good, and I don't think that's a problem. Arguably, by cutting at 17kHz they don't have to waste bits on frequencies that most people can't hear. But it's lower than where iOS apparently cuts (Soundguys says it goes up to 18.9kHz).
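Incidentally, the kind of cut-off figure Soundguys reports is easy to estimate yourself from a captured recording. A rough sketch (the signal here is synthetic test data truncated near 17 kHz, just to exercise the function):

```python
import numpy as np

def estimate_cutoff(x, fs, floor_db=-60.0):
    """Return the highest frequency whose spectral level is within
    `floor_db` of the peak -- a crude stand-in for eyeballing a
    spectrogram. (A real capture should be windowed and averaged
    over many frames; that is skipped here for brevity.)"""
    spec = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    level_db = 20 * np.log10(spec / spec.max() + 1e-12)
    return freqs[np.nonzero(level_db > floor_db)[0][-1]]

# Synthetic "encoder output": broadband noise brick-walled near 17 kHz
fs, n = 44100, 1 << 16
rng = np.random.default_rng(1)
spec = np.fft.rfft(rng.standard_normal(n))
freqs = np.fft.rfftfreq(n, d=1.0 / fs)
spec[freqs > 17000] = 0.0
truncated = np.fft.irfft(spec, n=n)

print(estimate_cutoff(truncated, fs))  # just under 17 kHz
```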

2

AudiophileHeaven OP t1_jdcr6ef wrote

u/blorg - This is exactly how I documented it too, and it rings true with my findings. Two Android phones can show significant differences in how they handle AAC, and the same goes for iPhone vs Android. The iPhone always has superior AAC encoding regardless of the test.

I also agree that the high cutoff won't always matter, as it is likely that most people can't hear above 18kHz or even 17kHz, but it does show the difference, and if you place them side by side you can hear some differences; it is not just the high-end cutoff that's different.

2

giant3 t1_jdd0v1t wrote

I am very well aware of this article from Soundguys.

BTW, I was talking about encoders, while you are talking about differences in configuration. The Android encoder was developed by the Fraunhofer Institute, the very people who invented MP3 and AAC, but the manufacturer can set the bandwidth of the encoder. If the phone is rooted, we could change the bandwidth.

Anyways, the cutoff doesn't matter much. There is very little energy in music beyond 13kHz except for cymbal crashes.
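A back-of-the-envelope check on that claim, assuming music roughly follows a pink (1/f power) long-term spectrum, which is a common rough model but obviously varies track to track:

```python
import math

def pink_power_fraction(f_lo, f_hi, band=(20.0, 20000.0)):
    """Fraction of total power in [f_lo, f_hi] for a pink (1/f power
    density) spectrum integrated over the audible band. Since the
    integral of 1/f is log(f), the fraction reduces to a ratio of logs."""
    total = math.log(band[1] / band[0])
    return math.log(f_hi / f_lo) / total

print(f"{pink_power_fraction(13_000, 20_000):.1%}")  # roughly 6% of total power
```

So under that assumption, everything above 13 kHz carries only around 6% of the power, which is in the same spirit as the "very little energy" point, even if individual recordings (heavy on cymbals, for instance) deviate.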

1

blorg t1_jdd3q6h wrote

Right, but it doesn't matter whether it's the encoder or the parameters; the point is that the result is different.

If you increase the parameters on SBC (SBC XQ) it can sound great too, but that doesn't particularly help you if the bitpool is artificially capped at a low value, as is done on Samsung buds (bitpool 37 rather than 53) and, I believe, on Windows.
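For reference, the bitpool maps to a bitrate through the SBC frame-length formula in the A2DP specification. A quick sketch for the usual joint-stereo, 44.1 kHz, 8-subband, 16-block configuration (other channel modes use a slightly different formula):

```python
import math

def sbc_bitrate_kbps(bitpool, fs=44100, subbands=8, blocks=16, channels=2):
    """SBC bitrate in kbps for joint-stereo mode, using the A2DP
    spec's frame-length formula for that mode."""
    frame_bytes = (4                                  # fixed header
                   + (4 * subbands * channels) // 8   # scale factors
                   + math.ceil((subbands + blocks * bitpool) / 8))  # audio + join bits
    # One frame carries subbands * blocks samples per channel
    return 8 * frame_bytes * fs / (subbands * blocks) / 1000

print(round(sbc_bitrate_kbps(53)))  # 328 kbps, the standard "high quality" setting
print(round(sbc_bitrate_kbps(37)))  # 240 kbps, the capped Samsung figure
```

So the bitpool 53 → 37 cap costs you roughly 90 kbps, which is why the same codec can sound noticeably different across devices.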

The fact that you could theoretically get a different result if the parameters were different doesn't actually get you that different result; you get the result with the parameters chosen by the developers of the stuff you are using.

The Apple encoder is, I believe, different from the Fraunhofer one, and supposedly better. So there it's not just the parameters. But even if it were just the parameters, the average user isn't rooted and can't change them. So it doesn't matter whether it's the "encoder" or the "parameters".

Personally, I think AAC sounds fine on Android, so I'm with you there.

2