
ultra_prescriptivist t1_jd7x956 wrote

Good write-up, especially for people like me who know relatively little about Bluetooth codecs as a whole.

I do have a query about the section on AAC, though:

>The psychoacoustic compression algorithm of AAC is similar to MP3's; it cuts out a lot of data which the algorithm assumes you won't hear. For the average listener, the compression is easily audible with a decent pair of headphones.

Is the Bluetooth AAC codec somehow different and less efficient than the lossy compression AAC codec? Because, if not, this is a serious case of "citation needed".

Apple's lossy AAC encoder is extremely good, and blind tests have shown that the vast majority of people cannot tell high-bitrate AAC apart from lossless (Source 1 | Source 2); it may even be challenging to discern at bitrates as low as 128 kbps.

How were your assessments of which codec sounds better made, exactly? Did you conduct any controlled testing, or were you just basing them on the assumption that high bitrate = sounds better?

7

[deleted] t1_jd8x61n wrote

[deleted]

5

ultra_prescriptivist t1_jdakr2c wrote

Ah, thanks for that! That more or less confirms my suspicions.

It's very interesting to see how poorly aptX performs in blind tests, especially against SBC.

1

ve_ t1_jddqhau wrote

Cable reviews can be useful... until they start talking about the sound. Quality of construction and haptics of materials.. flexibility.. weight.. all small things that matter. To some people the colour matters.. i like those mint green china cables a lot. XD

1

AudiophileHeaven OP t1_jda47ei wrote

Yes, I also review cables. And as for MP3, if you feel it is the same and your ears tell you that they sound the same, better for you: you can save some money.

0

[deleted] t1_jdah431 wrote

[deleted]

4

WikiSummarizerBot t1_jdah5uy wrote

Transparency (data compression)

>In data compression and psychoacoustics, transparency is the result of lossy data compression accurate enough that the compressed result is perceptually indistinguishable from the uncompressed input, i.e. perceptually lossless. A transparency threshold is a given value at which transparency is reached.

Codec listening test

>A codec listening test is a scientific study designed to compare two or more lossy audio codecs, usually with respect to perceived fidelity or compression efficiency. Most tests take the form of a double-blind comparison. Commonly used methods are known as "ABX" or "ABC/HR" or "MUSHRA". There are various software packages available for individuals to perform this type of testing themselves with minimal assistance.
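
For anyone curious what an ABX trial actually looks like in practice, here is a minimal console sketch in Python. The file paths and the `play` stub are placeholders, not a real player API; wire it up to whatever audio player you use:

```python
import random

def play(label: str, path: str) -> None:
    # Hypothetical stand-in: hook this up to your audio player of choice.
    print(f"(now playing {label}: {path})")

def abx_test(path_a: str, path_b: str, trials: int = 16) -> int:
    """Run a console ABX test and return the number of correct answers."""
    correct = 0
    for i in range(1, trials + 1):
        x_is_a = random.choice([True, False])   # secretly pick X
        play("A", path_a)
        play("B", path_b)
        play("X", path_a if x_is_a else path_b)
        answer = input(f"Trial {i}: is X 'a' or 'b'? ").strip().lower()
        if (answer == "a") == x_is_a:
            correct += 1
    return correct

if __name__ == "__main__":
    score = abx_test("track_lossless.wav", "track_aac256.wav")
    print(f"{score}/16 correct")  # pure guessing averages ~8/16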


1

AudiophileHeaven OP t1_jdawox8 wrote

Well, if you feel this way, you can enjoy MP3s the way you want, same for the AAC or SBC codecs. At this point this argument is useless; most of the world is slowly adopting streaming, so a FLAC vs MP3 debate is useless for any practical outcome.

As for Bluetooth, buy an Android phone with multiple codecs and headphones that support as many of them as possible, then test; then you will know if there are differences.

−2

[deleted] t1_jdc3dsg wrote

[deleted]

2

AudiophileHeaven OP t1_jdctmnk wrote

If you are refuting something that didn't hold true in your own experience, you're literally disregarding yourself and what you heard yourself. If you, as an average user, can hear the difference, then why would you assume other average users wouldn't?

I test things with a sample of around 35 people before making a "most people" argument, and the success rate of a test must be over 70%, usually 80%. For MP3 vs FLAC, the rate at which they could tell the difference and identify which was superior was 75% with metal, 50% with classical (no bias; they couldn't really do it), and around 65% with pop. The MP3 compression algorithm clearly has a bias towards making changes that are not audible with classical but can show up in highly dynamically compressed music such as rock or pop.

The test we did was MP3 320 CBR vs FLAC, and it was done using both speakers and headphones; both setups were midrange, to reflect what a true average user would have. The listeners were mostly young, under 35.
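
As a sanity check on numbers like these: with a panel of 35 and a 75% success rate, you can ask how likely that outcome would be under pure guessing (50%). A quick sketch, assuming a simple one-sided binomial test and that scipy is available:

```python
from scipy.stats import binomtest

# 75% of a 35-person panel is ~26 correct identifications
result = binomtest(k=26, n=35, p=0.5, alternative="greater")
print(f"p-value under pure guessing: {result.pvalue:.4f}")  # ~0.003

# 50% with classical (18 of 35) is exactly what chance predicts
print(binomtest(k=18, n=35, p=0.5, alternative="greater").pvalue)  # 0.5
```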

Ogg Vorbis at q10 is truly audibly transparent; no user, including me, can tell OGG q10 and lossless apart, regardless of the music style.

I am not saying there is no audibly transparent lossy codec, but I know from experiment that MP3, especially at 256 VBR, is not audibly transparent if straining music is used, like dynamically compressed metal. I think that if I mainly listened to jazz or classical, I probably could not tell MP3 256 and FLAC apart well; even in my tests most users can't. But with rock and metal they can, and there's a strong bias towards those music styles in my articles too, since that's what composes most of my test playlist.

For the sake of experiments, try to experiment yourself before leaning too much on others' statements.

−1

AudiophileHeaven OP t1_jd8h1wy wrote

The AAC codec and Bluetooth AAC are two separate things, from what I can gather. In fact, Bluetooth AAC sounds extremely different between iPhones and Android phones, as I pointed out there.

The sonic assessment is based on double-blind testing, but it is tightly connected to the available bandwidth. Bluetooth compression has to happen in real time and has to be a low-power process, so the nature of how it is designed means it is less efficient than MP3 and AAC file encoding. We also have to take into account that the files are most likely already lossy, so the quality of anything transmitted over Bluetooth is always lower than what was stored on the computer.
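
You can hear this generational loss for yourself without any Bluetooth hardware: decode an already-lossy file and encode it again, which is roughly what a phone does before transmitting over Bluetooth. A rough sketch calling ffmpeg from Python; the file names are placeholders, and the second AAC pass merely stands in for the real-time Bluetooth encoder:

```python
import subprocess

def encode_aac(src: str, dst: str, bitrate: str = "256k") -> None:
    """Re-encode any input to AAC with ffmpeg's built-in encoder."""
    subprocess.run(
        ["ffmpeg", "-y", "-i", src, "-c:a", "aac", "-b:a", bitrate, dst],
        check=True,
    )

# First generation: lossless master -> AAC file (what you store or stream)
encode_aac("master.flac", "gen1.m4a")
# Second generation: the AAC file is decoded and encoded *again*,
# roughly what happens when a phone re-encodes for Bluetooth transmission
encode_aac("gen1.m4a", "gen2.m4a")
# ABX gen1.m4a against gen2.m4a: the loss compounds with each pass
```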

1

ultra_prescriptivist t1_jd8kdg7 wrote

Fair enough, I suppose that makes sense. I can see how the real-time transcoding issue could account for a more noticeable degradation than locally stored lossy files.

I don't really do Bluetooth audio, so I often fail to appreciate that it really is quite a different ballgame.

For Android users, what would you say is the best codec to use for a good balance between performance and reliability? LDAC or one of the aptX variants?

3

AudiophileHeaven OP t1_jda4hrj wrote

aptX HD is theoretically the best-sounding codec to date, while also being the most stable. The idea that LDAC sounds better applies only when there is no network congestion.

LHDC can be a bit superior but it is not widely supported at the moment.

1

ultra_prescriptivist t1_jdalv4a wrote

Thanks for that, although I notice you're now being a bit more careful when making claims about what "theoretically" sounds better.

You really should have used such caution when writing the article, to be honest. In a technical piece about Bluetooth codecs, it seems to me that subjective impressions should be kept well out of it.

5

AudiophileHeaven OP t1_jdavh9y wrote

My take on audio is to include subjective impressions with everything, because if something should be good on paper but in practice isn't, I want to call it out. For example, I had the poor experience of not liking some of the high-end audiophile software, and I prefer Foobar2000 over most other programs for listening to music, not because it sounds better, but because I tend to find the others cumbersome. As for Bluetooth, I think there's a HUGE difference between what can sound best and what will sound best in most average situations, like in an airport or a gym, where network congestion plays a huge role. That is why I tried to also cover how it sounds to the ear, in real-life scenarios, not just in theory.

1

giant3 t1_jdb74sg wrote

> The AAC codec and Bluetooth AAC are two separate things

You couldn't be more wrong. The AAC bit stream is very clearly documented, and two different conformant decoders should produce essentially the same PCM output when decoding it.

Any difference that exists is in encoding. There were differences 15 years ago, when the iOS encoder was superior, but now Android and iOS are on par.

1

blorg t1_jdci6ny wrote

This SoundGuys comparison is from 2018, and it shows major differences in AAC encoding between iOS and Android, and for that matter between Android phones.

I don't think simply looking at the high-frequency cut-off tells you much about audio quality; it's not the largest factor, particularly when many of the cut-offs are high enough that I know I wouldn't hear the difference personally. But it does show objectively that there were at least differences between the encoders as recently as five years ago; you don't need to go back 15.

AAC on my phone running Android 12 is also still truncated, I think at around 17 kHz. Subjectively, I think it sounds good; I don't think that's a problem. Arguably, by cutting at 17 kHz they don't have to waste bits on frequencies that most people can't hear. But it's lower than where iOS apparently cuts off (SoundGuys says it goes to 18.9 kHz).
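
If you want to check where your own phone's encoder truncates, you can capture the decoded Bluetooth output to a WAV file and look at where the spectrum collapses. A minimal sketch, assuming a capture file named `bt_capture.wav` and numpy/scipy being available:

```python
import numpy as np
from scipy.io import wavfile
from scipy.signal import welch

fs, samples = wavfile.read("bt_capture.wav")   # hypothetical capture file
if samples.ndim > 1:
    samples = samples.mean(axis=1)             # mix to mono

freqs, power = welch(samples, fs=fs, nperseg=8192)
peak = power.max()
# Highest frequency still within 60 dB of the spectral peak
audible = freqs[power > peak * 1e-6]
print(f"approximate encoder cutoff: {audible.max() / 1000:.1f} kHz")
```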

2

AudiophileHeaven OP t1_jdcr6ef wrote

u/blorg - This is exactly how I documented it too, and it rings true with my findings. Two Android phones can show significant differences in how they handle AAC, and the same goes for iPhone vs Android. The iPhone always has superior AAC encoding regardless of the test.

I also agree that the high cutoff won't always matter, as it is likely that most people can't hear above 18 kHz or even 17 kHz, but it shows that there is a difference. If you place them side by side you can hear some differences; it is not just the high-end cutoff that's different.

2

giant3 t1_jdd0v1t wrote

I am very well aware of this article from SoundGuys.

BTW, I was talking about encoders, while you are talking about differences in configuration. The Android encoder was developed by the Fraunhofer Institute, the very people who invented MP3 and AAC, but the manufacturer can set the bandwidth of the encoder. If the phone is rooted, you can change the bandwidth.

Anyway, the cutoff doesn't matter much. There is very little energy in music beyond 13 kHz except for cymbal crashes.

1

blorg t1_jdd3q6h wrote

Right, but it doesn't matter whether it's the encoder or the parameters, the point is the result is different.

If you increase the parameters on SBC (SBC XQ), it can sound great too, but that doesn't help you much if the bitpool is artificially limited to a low value, as is done on Samsung buds (bitpool 37 rather than 53) and, I believe, on Windows.
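
For reference, the A2DP spec gives a closed-form frame length for SBC, so you can work out what those bitpool numbers mean in bitrate terms. A quick sketch for the usual joint-stereo, 8-subband, 16-block configuration at 44.1 kHz:

```python
import math

def sbc_bitrate_kbps(bitpool: int, fs: int = 44100,
                     subbands: int = 8, blocks: int = 16) -> float:
    """Joint-stereo SBC bitrate per the A2DP frame-length formula."""
    channels = 2
    frame_bytes = (4                                   # header
                   + (4 * subbands * channels) // 8    # scale factors
                   + math.ceil((subbands + blocks * bitpool) / 8))
    return 8 * frame_bytes * fs / (subbands * blocks) / 1000

print(f"bitpool 53: {sbc_bitrate_kbps(53):.0f} kbps")  # ~328 kbps
print(f"bitpool 37: {sbc_bitrate_kbps(37):.0f} kbps")  # ~240 kbps
```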

That you could theoretically get a different result if the parameters were different doesn't get you the different result; you get the result with the parameters chosen by the developers of the stuff you are using.

The Apple encoder is, I believe, different from the Fraunhofer one, and supposedly better. So there it's not just the parameters. But even if it were just the parameters, the average user isn't rooted and can't change them. So it doesn't matter whether it's the "encoder" or the "parameters".

Personally, I think AAC sounds fine on Android, so I'm with you there.

2