Saturday 26 August 2023

Part II: Comparison of Bluetooth Fidelity - AAC encoder quality (Android 10 & 13, Windows 11, Apple iPhones & Mac)

Greetings everyone! Time to jump into Part II of our assessment of lossy Bluetooth music transmission.

In Part I, we examined the use of an Android 10 device (Huawei P30 Pro) as audio transmitter showing the differences between the codecs as played back with the AIYIMA A08 PRO amplifier and its Qualcomm QCC5125 Bluetooth SoC. Please refer to that article for details about the methodology and comparison with the output from a high resolution Topping desktop DAC.

For this Part II, let's focus on the Advanced Audio Coding (AAC) codec which has become a very popular option. Other than the universal default SBC, AAC is probably the most common codec for music transmission because the "elephant in the room" - Apple - uses it across its product lines as the standard codec running at 256kbps. Given that level of adoption, this is basically a practical standard for quality music transmission over Bluetooth.
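To put that 256kbps in perspective, here's a quick back-of-the-envelope sketch in Python (the CD parameters are just the standard 16-bit/44.1kHz stereo figures, not anything measured here):

```python
# Rough compression ratio of AAC 256 kbps vs. uncompressed CD audio.
# CD: 44,100 samples/s * 16 bits * 2 channels = 1,411,200 bits/s.
cd_bitrate_kbps = 44_100 * 16 * 2 / 1000   # 1411.2 kbps
aac_bitrate_kbps = 256

ratio = cd_bitrate_kbps / aac_bitrate_kbps
print(f"Compression ratio: {ratio:.1f}:1")  # roughly 5.5:1
```

So the encoder is throwing away about 80% of the raw data while aiming to keep the result perceptually transparent.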

Given the broad range of computers/tablets/phones used among family members here, when I'm looking for wireless headphones, I would want to make sure the device supports AAC; probably more so than aptX or LDAC.

Note that there is actually a "family" of AAC profiles from the early Low Complexity AAC (LC-AAC) originating in 1997 up to later versions like Extended High Efficiency AAC (xHE-AAC) released in 2012. As end users, we're generally not privy to such details so I'll just use the generic term "AAC" in this article.

Android 10 & 13:

As you may recall from Part I, the Huawei P30 Pro (2019) phone performed poorly using AAC. Here are the graphs again:

Clearly this level of performance is not impressive compared to even SBC in the tests! And as discussed with Mikhail in the comments, the encoding appears to be software-based rather than offloaded to specialized SoC hardware (the Huawei uses HiSilicon's KIRIN 980 SoC).

Using the same test parameters as last time, the question then is, can newer versions of Android perform better? Well, I have here a Samsung Galaxy Tab S6 Lite from 2020. It's running Android 13, based on the Exynos 9611 SoC:

Still not good. Unfortunately the tablet is locked, so I couldn't just use ADB to look at the config files without the extra work of rooting it. Regardless, given this level of performance, which is only a little better than the Huawei's, it's probably reasonable to say that this Samsung is using the same or a similar software AAC encoder as the Android 10 device.

I have an Android 11 Samsung Tab A10.1 (2019) here as well. Interestingly, that tablet refused to connect to the AIYIMA using AAC and would always default to aptX! As such, I was not able to get a measurement; maybe Android 11 figured that if aptX is available, it should just do the user a favor and skip AAC. ;-)

Apple's iPhone AAC:

Switching over to the Apple devices, let's have a look at a couple of iPhones, the iPhone 11 from 2019 and the latest iPhone 14 Pro from late 2022:

Nice result for both phones! That's what a good implementation of AAC should look like.

As you can see, both the iPhone 11 (2019) and iPhone 14 Pro (2022) performed really well on the tests. THD+N scored better than -70dB and TD+N better than -65dB. These numbers are similar to the best that LDAC 909kbps achieved in Android. But the great thing is that Bluetooth transmission using AAC runs at only 256kbps on the iPhones, which means improved reliability (no dropouts) and range.
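For readers who think in percentages rather than dB, here's a quick conversion sketch using the standard 20·log10 amplitude relationship (the -70dB and -65dB figures are the measurement results above):

```python
def db_to_percent(db: float) -> float:
    """Convert an amplitude level in dB (relative to the signal) to a percentage."""
    return 10 ** (db / 20) * 100

# THD+N better than -70dB means distortion+noise under ~0.032% of the signal;
# -65dB corresponds to roughly 0.056%.
print(f"-70 dB ≈ {db_to_percent(-70):.3f}%")
print(f"-65 dB ≈ {db_to_percent(-65):.3f}%")
```

That's well below anything one could plausibly hear through headphones or speakers.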

Beyond just the numbers, if we look at the FFT patterns, Apple's AAC implementation has clearly better-resolving high-frequency content than LDAC 909.

Computers - Windows 11 MiniPC and macOS laptop:

While Apple has done well with the iPhones/iOS, how about on the computers side?

Since Windows 11, AAC has become a native codec for Bluetooth use alongside SBC. This is a change from Windows 10 where aptX was also available with many Bluetooth transmitters. You can double-check the codec used with Bluetooth Tweaker.

Here are the measurements using the Beelink SER4 Ryzen 7 4700U miniPC I reviewed last year with its built-in Bluetooth 5, AAC codec as per the latest Windows 11 22H2:

Unfortunately, AAC even with a relatively fast Windows 11 machine leaves much to be desired.

Sure, it's a little better than the Android 10 and 13 devices tested above with marginally better THD+N and TD+N around -55dB. However, the same pattern of distortion can be seen with a fair amount of noise at the base of the test tones and this becomes additive with complex signals like the Multitone 32. Disappointing, but I guess I'm not too surprised since the native Windows audio architecture isn't known for excellent performance.

How about Apple's macOS then with a recent MacBook Air M1 (late 2020)? My wife's machine is running the current Ventura 13.5.1.

Notice that in macOS, the AAC encoding is not as good as on the iPhones. Numerically, it's a bit better than Windows 11 thanks to a lower noise floor, with THD+N and TD+N at -60dB or better. In the Multitone 32 graph, though, the frequency response rolls off a bit earlier than with Windows 11, missing some of the upper tones above ~15kHz. Let's look into this in more detail...

Frequency Response:

Time for frequency response curves comparing the devices, starting with the Android devices and iPhones. Note that I'll include the desktop Topping D10s DAC for comparison.

I separated the curves because there was too much overlap to see the nuances. No smoothing has been applied. With a narrow 13dB range on the Y-axis, we can see that the Android AAC encoding looks irregular compared to the iPhone, a sign of the reduced encoder precision. At least Android 13 (Samsung) seems more refined than the Android 10 (Huawei) device.

Let's compare Windows 11 22H2 and macOS Ventura 13.5.1:

Psychoacoustic encoders "know" humans don't hear anything beyond 20kHz, as such, they all typically start rolling off before then.

While much less than with the Android AAC encoders, we see that there's still a tiny bit of imprecision in the Windows 11 frequency response curve, resulting in a loss of smoothness. However, while the Windows 11 encoder has full frequency extension, notice that the macOS Ventura AAC encoder cuts off at 15.4kHz. Folks over 30 years old might not notice anything, but younger listeners could notice a loss of some sonic "sparkle". This is even more restrictive than LDAC 303kbps with its cut-off at 16.5kHz on the Android 10.


Bravo Apple for getting AAC encoding quality done right on the iPhones. My assumption is that the iPhone is using hardware-assisted encoding to achieve this.

The quality we see using the iPhones with AAC 256kbps is better than aptX-HD previously tested and at least equivalent to the higher bitrate LDAC >900kbps on Android. There's certainly something to be said about Apple focusing on a single standard and doing it well!

Unfortunately, AAC encoding is not as good on macOS and the frequency response isn't as extended - out to just shy of 15.5kHz, a disappointment for audiophiles, I'm sure. :-|

Looking around, I see that Fraunhofer's software is embedded in Android, Windows, and macOS (see here and here) so perhaps that explains the similar qualitative limitations among these devices.

While none of my Android devices performed well on the AAC codec tests, obviously there are many Android models based on all kinds of hardware out there. Maybe the Qualcomm Snapdragon SoC can handle AAC better than what I'm showing here. But then again, Qualcomm would likely be focusing their energies on optimizing aptX performance. Regardless, I hope these published results can raise the bar for Android audio subsystem developers.

The Apple iPhone has shown us that AAC's sound quality can be objectively excellent with a great compression ratio at 256kbps. I'm sure this comes with a higher computational load. AAC will also tend to impose longer latencies. I've seen sites like this stating that AAC latency is around 60ms (this is just the encoder latency; with Bluetooth transmission and wireless headphone variables, >150ms is common). With music-only playback, latency should not be a problem. Likewise, video players can apply latency compensation to reduce issues like poor lip sync. Real-time interactive gamers, however, might want to look at codecs like aptX-LL specifically built to minimize temporal lag. Nice to see the latency of the Apple AirPods already improving over the generations.
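Part of that encoder latency is baked into the format itself: AAC-LC works on frames of 1024 PCM samples, so at 44.1kHz each frame alone spans over 23ms before any encoder lookahead, buffering, or radio transmission is added. A quick sketch (the 1024-sample frame length is the standard AAC-LC figure, not a measurement from this article):

```python
# Duration of one AAC-LC frame at CD sample rate.
frame_samples = 1024      # AAC-LC frame length in PCM samples
sample_rate = 44_100      # Hz

frame_ms = frame_samples / sample_rate * 1000
print(f"One AAC frame ≈ {frame_ms:.1f} ms")  # ~23.2 ms
```

Codecs aiming for low latency (like aptX-LL) use much shorter processing blocks, which is part of how they keep the lag down.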

Looking beyond what we have today, maybe Android can do to AAC what they're already offering with LDAC - different bitrate settings in the "Developer options". For example, I wonder whether there are any compatibility issues if higher quality "AAC 320kbps" is offered to headphones and soundbars for those who just want to listen to music or watch videos (with latency compensation), knowing that this higher bitrate setting (above 256kbps) will use more processing and lengthen latency.

As I said last week, I think stuff like the Japan Audio Society's certification for "Hi-Res Audio Wireless" is silly and just there to create hype. IMO, Apple's iPhone AAC implementation sounds and performs just as well as much higher bitrate LDAC, but it'll never get that "Hi-Res Audio" certification since it's 16-bit and doesn't support higher sampling rates. Let's ignore these rather meaningless numerical specs for lossy encoding. There's no question that AAC can achieve "High Quality Wireless" sound when implemented well.

Wow... Almost September! Life's going to get busy here over the next few weeks.

Hope you're enjoying the music, audiophiles.


  1. You cannot jump over the Apple Eco Fence without repercussions.

    1. Indeed Stephen,
That's what my son is realizing now, having received an iPhone 14 from a family member as an upgrade from his Android. ;-)

      I just don't like that I can't plug in the USB cable and transfer music and images into a normal file system.

  2. Hi Arch,

Thanks for that review, good to know the constraints of Bluetooth codecs. On my Motorola phone under Android 11 (moto g power), the choice seems determined by the connected device: All options are there but grayed out in Developer mode when nothing is connected. When I connect my Sennheiser Momentum TW 2, the only choice I get is Qualcomm aptX 16/44.1 so 325 kbps, all others remain greyed out. Good enough for lossy I guess. I guess if I had an Apple earbud I would be stuck with a bad AAC implementation…

    1. Yeah Gilles,
      Probably true if you had Apple AirPods and forced to use AAC.

      Definitely room to improve with the BT codecs. How much consumers care and whether subgroups like audiophiles would push for better quality I suspect will need to be seen. I do hope Android at least can improve and achieve parity with the Apple iPhone though!

  3. Archimago, thanks again for doing all this work!

Regarding BT latency for gaming scenarios, I know that many people use apps that force their headset / mobile device to use communications-oriented BT profiles like HSP over the SCO link, which usually have lower latency than A2DP by design. The hope is that newer low-latency protocols will solve this problem without compromising audio quality.

    1. Cool thanks for the info Mikhail,
      Long time since I looked into this so had a quick peek at the HSP profile and SCO link.

Yikes. The 64kbps CVSD (8kHz sampling rate) or mSBC (16kHz sampling) codecs definitely aren't going to cut it for decent music playback. ;-) But yeah, they should have good latency...

For folks who want to have a listen to the quality of these codecs Mikhail is talking about, check this page out:

BTW, there is "yet another" independent evaluation of different implementations of AAC, also naming Apple's implementation as the most "transparent":

    1. Nice stuff, and that was only 128kbps AAC!

Archimago, and regarding the "lossy hi-res audio" thing from Part I. It is true that this combination of words does not make sense if we interpret "hi-res audio" as "more resolution than a 1644 CD format, suitable for further processing"—this is the understanding of "hi-res" by mastering engineers and such. I think, if we instead think of "lossy hi-res" codecs as ones that are able to work with higher dynamic range and then produce a better lossy result, this makes more sense. For example, remember the "Mastered for iTunes" program which required studios to submit 24-bit material for the lossy AAC distribution that iTunes used back in the day.

    For customers who just look at labels, this might make sense, too—if you play a lossless hi-res audio from your NAS or a streaming service, it seems logical when listening to it to use BT headphones that have "lossy hi-res" sticker on them.

    1. Yup, definitely important to not have lossy decode → lossy encode for that extra processing!

      While we can talk about and think about these meanings, I suspect that the whole point of the JAS logo is still a marketing scheme aimed at the general public who probably don't grasp the more nuanced background...

      Clearly the JAS isn't adding nuance to the discussion by using the same yellow/golden "Hi-Res Audio" logo with the only difference being "Wireless" underneath; seemingly implying that there's some direct correlation between these codecs and the truly "Hi-Res Audio" hardware like DACs that this sticker is also slapped on.

      Oh well. It is what it is, as usual, "buyer beware" of the psychology behind the marketing.


  8. Great to see some proper tests done on Bluetooth codecs. I've always been curious about them, but could never get such clarity anywhere else. Thanks!

    While you have cleared up the encoder side of the equation pretty much, with iOS+AAC coming out on top, what role does the decoder at the receiving end play in all this? Does AAC support on the receiving end guarantee the highest quality Bluetooth playback if you have an iOS device?

    I ask because on 2 of my Bluetooth speakers (Creative Sound Blaster Roar Pro, and Roar 2, both of which support AAC and aptX), Android+aptX sounds a lot better than iOS+AAC. On these 2 speakers, aptX just has more clarity, detail and well defined high-end. Since I can connect 2 Bluetooth devices at once to them, I could test quite easily - I played the same song on both Android and iOS, just paused one device, and played the other immediately after - and yes the difference was quite significant and noticeable. AAC sounds rather muffled compared to aptX. I guess this would imply that my Roar speakers haven't implemented AAC properly? And your Aiyima A08 Pro has? So would you say the decoder side of things can also mess things up?

    Aside from the above 2 speakers, I have several other Bluetooth speakers, some with AAC support, some just SBC, and for all those on the whole I'd say Bluetooth audio through iOS sounds better than Android, even when it's just doing SBC.