Saturday, 1 March 2025

HDMI Musings: high speed cables, data rates, YCbCr color subsampling, Dolby Vision MEL/FEL, optical cables and +5V injection.

Hey folks, for this post, I thought it would be good to dive a little more into the world of AV technologies after discussing the nVidia Shield TV Pro last week. While "classic" audiophile technology (i.e. standard hi-fi analog and digital 2-channel stereo without special DSP advancements) has matured nicely already, this isn't quite the case with modern digital video tech. While many (probably most) features have settled, the High-Definition Multimedia Interface (HDMI) standard continues to evolve in ways worth being mindful of - for example, the recent announcement of HDMI 2.2 at CES 2025, expanding capabilities well beyond the needs of today.

As usual, it'll take time (years) for this standard to be incorporated into TVs and source devices like GPUs (the latest nVidia RTX 5080/5090 are HDMI 2.1b) or VR devices at the forefront of potential generational gains. For more than 20 years, each significant revision of the HDMI standard has roughly doubled the data speed, with HDMI 2.2 now aiming at just shy of 100 gigabits-per-second (96Gbps), twice the 48Gbps bit rate of current HDMI 2.1 products.

This recent update makes HDMI the fastest of all currently-announced consumer Audio-Video connection standards, the one wire that basically does it all - hi-res video (with high dynamic range and frame rate), hi-res audio (up to 32 PCM channels at 24/192, and DSD up to 8 channels of DSD256), HDCP copy protection, audio return channel, even ethernet (100Mbps).
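
Just for a sense of scale, the audio portion barely dents those bitrates. Here's a rough back-of-envelope sketch in Python counting raw sample data only (it ignores HDMI packetization overhead, which adds a bit more):

# Rough audio payload arithmetic - raw sample data, ignoring HDMI packetization overhead.
pcm_bps = 32 * 192_000 * 24        # 32 channels of 24-bit/192kHz PCM
dsd_bps = 8 * 256 * 44_100 * 1     # 8 channels of 1-bit DSD256 (11.2896 MHz)

print(f"32ch 24/192 PCM: {pcm_bps / 1e6:.1f} Mbps")   # ~147.5 Mbps
print(f"8ch DSD256     : {dsd_bps / 1e6:.1f} Mbps")   # ~90.3 Mbps

Either way, that's well under half a percent of an HDMI 2.1 link's ~42Gbps of usable data rate - the video is what eats the bandwidth.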

This level of sophistication (and licensing costs) could make it difficult for small companies with limited R&D capabilities to get in the game with custom designs. This is probably in part why the cottage-industry audiophile companies rarely use HDMI other than to sell overpriced cables (like AudioQuest). Plus frankly, basic 2-channel audio doesn't need the higher technical capabilities anyway (which is basically what Paul McGowan says).

Let's run through a few thoughts about HDMI that might be good general knowledge for tech enthusiasts. I'll provide a few links of interest, touch on tech stuff like color spaces and subsampling, and suggest things you might want to try if you're running longer high-speed HDMI connections and potentially noticing issues.

I. High speed HDMI, the variants

HDMI cables have basically looked the same since their release in 2002. They all have 19 pins on the connectors:

Source.

However, they need to conform to specifications in order to accurately pass along the data at high gigabits/second speeds based on TMDS signaling (note the 4 differential TMDS signal pin pairs). It is in these HDMI cables where we see companies apply techniques to minimize interference (like crosstalk, EMI), optimize conductivity (high purity copper wiring), improve durability, and use stringent manufacturing standards to achieve verifiably reliable performance.

We can look at the official HDMI Cables Resource page to see the various cable classifications. As of this writing in early 2025, we have 3 cable speed grades: HDMI Standard (10.2 Gbps), HDMI High Speed (18 Gbps), and HDMI Ultra High Speed (48Gbps). Higher speed cables are backwards compatible with the slower ones. With the introduction of HDMI 2.2, we'll soon be seeing "Ultra96" cables certified for 96Gbps.

Certification is important! The bulk of these gigabits-per-second communications for audio and video are unidirectional and non-error-corrected, which is why one can see data errors show up in the image, or worse, disconnection if the data loss is severe (data errors show up in a form similar to what was discussed about poor USB transmission years ago).

One technical detail worth noting is that the Gbps numbers above signify "bit rate", which is higher than the actual "data rate" (often ~80% of the bit rate) due to line-coding and protocol overhead. Data rate is the actual information passed along from the transmitting to the receiving device to be processed.

So the actual data rates look like this for the various HDMI standards:

HDMI 1.0-1.2 (2002) 3.96Gbps
HDMI 1.3-1.4 (2006) 8.16Gbps
HDMI 2.0 (2013)    14.40Gbps
HDMI 2.1 (2017)    41.89Gbps
HDMI 2.2 (2025)    83.78Gbps (presumably HDMI 2.1 FRL-like)

For those who want to see the nitty-gritty of HDMI data rate and these calculations, make sure to head over to this phenomenal LTT online calculator.
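
To make the bit-rate-versus-data-rate point concrete, here's a little Python sketch of the arithmetic. The 8b/10b factor applies to the TMDS-based generations (I'm using the commonly quoted TMDS ceilings of 4.95, 10.2 and 18Gbps); for the FRL-based HDMI 2.1/2.2 I've simply noted that 16b/18b coding plus FEC/packetization overhead lands near the table's figures, so treat those as approximations:

# TMDS (HDMI 1.0 through 2.0) uses 8b/10b line coding: 10 bits on the wire per 8 bits of data.
def tmds_data_rate_gbps(bit_rate_gbps):
    return bit_rate_gbps * 8 / 10

for version, bit_rate in [("HDMI 1.0-1.2", 4.95), ("HDMI 1.3-1.4", 10.2), ("HDMI 2.0", 18.0)]:
    print(f"{version}: {tmds_data_rate_gbps(bit_rate):.2f} Gbps data rate")
# -> 3.96, 8.16 and 14.40 Gbps, matching the table above.

# FRL in HDMI 2.1/2.2 uses more efficient 16b/18b coding (48 * 16/18 = ~42.7 Gbps);
# additional FEC and packetization overhead trims that to roughly the ~41.9 and
# ~83.8 Gbps figures listed above.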

Based on the data rate, this is how we know what screen resolutions and framerates the source device (like your UHD BluRay player, computer, AppleTV, nVidia Shield TV) and display can handle. In general, a rule of thumb is that if you want modern 4K/60fps HDR10/Dolby Vision, HDMI 2.0 will be enough. Above that, use HDMI 2.1 for 4K/120fps and 8K/60Hz. And if one day we need 12K/60fps, there's support in HDMI 2.2.
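
If you want to sanity-check whether a given mode fits, the payload arithmetic is simple enough to do yourself. Below is a minimal Python sketch that counts only active pixels (real HDMI timings add blanking intervals, so actual on-the-wire rates run a few percent higher):

# Uncompressed video payload: width x height x fps x bits per pixel.
# Counts active pixels only; real HDMI timings add blanking intervals on top.
def payload_gbps(width, height, fps, bits_per_color=10, samples_per_pixel=3.0):
    # samples_per_pixel: 3.0 for RGB / 4:4:4, 2.0 for 4:2:2, 1.5 for 4:2:0
    return width * height * fps * bits_per_color * samples_per_pixel / 1e9

print(payload_gbps(3840, 2160, 60))                         # ~14.9 Gbps - 4K/60 10-bit 4:4:4
print(payload_gbps(3840, 2160, 60, samples_per_pixel=2.0))  # ~10.0 Gbps - 4:2:2 fits HDMI 2.0
print(payload_gbps(3840, 2160, 120))                        # ~29.9 Gbps - 4K/120 needs HDMI 2.1
print(payload_gbps(7680, 4320, 60, samples_per_pixel=1.5))  # ~29.9 Gbps - 8K/60 10-bit 4:2:0

Notice the first line: 4K/60 at 10-bit 4:4:4 already slightly exceeds HDMI 2.0's 14.4Gbps data rate, which is why 4K/60 HDR over HDMI 2.0 is normally carried as chroma-subsampled 4:2:2 or 4:2:0 YCbCr (more on subsampling in section III below).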

[For fun, let's think about 12K resolution. About 11,520 x 6,480 pixels for the typical 16:9 aspect ratio, scaled up from our usual 4K of 3840x2160. That's 74.65 megapixels per frame. If this is used in a home theater and we sit a generous 12' away from the screen, the "retina resolution" threshold works out to a screen about 550" diagonal! The TV would be 40' wide x 22.5' high - billboard-sized!

Alternatively, say we use a very large 150" diagonal screen (about 10.9' wide); we'd need to sit around 3.2' away before the pixels of a 12K image even reach the retinal resolution threshold. For reference, the largest flat panel TV I've seen these days would be the 115" TCL QM89 which currently goes for US$20k.

12K resolution? Gotta have it! 🤣 Have fun with the retinal screen calculator here, play around with the numbers.]
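
If you'd like to check my arithmetic, here's the little Python sketch I'd use, based on the usual "retina" criterion of about one pixel per arcminute of visual angle (a common assumption behind this kind of calculator):

import math

# "Retina" criterion: one pixel subtends no more than 1 arcminute at the viewing distance.
ARCMIN = math.radians(1 / 60)

def retina_diagonal_inches(h_pixels, v_pixels, distance_ft):
    pitch_in = distance_ft * 12 * math.tan(ARCMIN)   # largest "retina" pixel pitch at this distance
    return math.hypot(h_pixels * pitch_in, v_pixels * pitch_in)

def retina_distance_ft(h_pixels, diagonal_in, aspect=16 / 9):
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)
    pitch_in = width_in / h_pixels
    return pitch_in / math.tan(ARCMIN) / 12

print(retina_diagonal_inches(11520, 6480, 12))   # ~554" diagonal for 12K viewed from 12'
print(retina_distance_ft(11520, 150))            # ~3.2' viewing distance for 12K on a 150" screen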


II. High Speed HDMI cables...

Unlike "high end audiophile cables" where even with US$1000 asking price, there are no clear standards or certifications for quality (hence mostly a form of snake oil), HDMI cables do have a certification process. While companies will sell cables claiming to support standards like 48Gbps HDMI 2.1, look for evidence that the cable has been certified.

It's usually not difficult to achieve rated speed for short cables like 6' and below. But if you are going longer, like 10' or 15' runs, save yourself some headaches and buy the certified stuff.

Recently I needed 15' of HDMI Ultra High Speed wire (HDMI 2.1) for my nVidia RTX 4090 GPU to 65" TV set-up and I can vouch that the Monoprice 8K Certified Ultra High Speed HDMI 15' (US$27, CAD$33) worked out well:

How do we confirm the product is certified? Look for the label on the package with the hologram:

Scan the QR code with your phone and you should be able to confirm that the product has indeed been through the certification process via the HDMI.org site:

While I'm sure anything can be faked, it'd be pretty silly to fake a label like this for a $30 product! As usual, make sure to buy from a reputable company that's not peddling snake oil or "luxury" stuff unless that's what you're going for.

This cable worked well for me with stable 4K/120fps over 15' (3840x2160, 120fps, 10-bit color, lossless RGB = 32.3Gbps data rate). Being NEC CL3 rated for fire resistance, it can be run behind walls. One caveat about this cable is that the metal connector is very robust and the strain relief is stiff. Make sure you have about 3-4" of clearance behind the connector on your TV or source device. I see Monoprice also sells the 8K Certified Braided Ultra HDMI cable which has a shorter connector and strain relief that will fit better in confined spaces / smaller devices.

III. Video formats (RGB, YCbCr) and Compression

A bit like audio where we have PCM and DSD encoding formats, in the video world the data can be transferred from your source (eg. UHD BluRay player, computer, streamer) to the receiver/screen encoded in different ways. Ultimately, all video is converted to RGB (red, green, blue - the three additive primary colors) by the time the light is created on the screen. However, for historical reasons dating back to the days of black & white TV, video encoding has typically been in the "YCbCr" (also known as "YCC" or "YUV") consumer video format:

Y = Luma (the B&W brightness level)
CbCr = Chroma - blue and red "difference from neutral"

Since Cb and Cr refer to blue and red deviations from perceptual neutral (grayscale), the larger these values, the greater the color shift they represent. See here for the conversion formula between YCbCr and RGB.
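
As a rough illustration of what that conversion involves, here's a minimal Python sketch using the BT.709 coefficients with normalized, full-range values; real video pipelines also deal with limited-range quantization, bit depth and rounding, which I'm skipping here:

# BT.709 YCbCr -> R'G'B' with normalized full-range values: Y in [0,1], Cb/Cr in [-0.5, +0.5].
def ycbcr709_to_rgb(y, cb, cr):
    r = y + 1.5748 * cr
    g = y - 0.1873 * cb - 0.4681 * cr
    b = y + 1.8556 * cb
    return tuple(min(max(v, 0.0), 1.0) for v in (r, g, b))   # clamp to displayable range

print(ycbcr709_to_rgb(0.5, 0.0, 0.0))   # neutral mid-gray: (0.5, 0.5, 0.5)
print(ycbcr709_to_rgb(0.5, 0.0, 0.3))   # same luma, but pushed strongly toward red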

Since our vision is most sensitive to the brightness (luminance) signal, we can keep that resolution high but drop some of the color (chroma) information for "lossy" compression which can still look very good in natural scenes. Chroma compression is most noticeable on a computer when reading the fine details of colored static text. This color compression is represented by notations like 4:2:2 and 4:2:0 to indicate "subsampling" or "downsampling".

Here's what those 3 numbers are referring to in the chroma subsampling nomenclature:

4:4:4 is uncompressed - for every 4 horizontal B&W/luma pixels, we have the corresponding 4 chroma values on both the odd and even scanlines (you might recall the days we routinely used interlaced image formats where even and odd scanlines were displayed with each sweep of the CRT). We can "subsample" the color information down to 4:2:2, which means there's a 50% reduction (2 chroma for 4 luma samples) in the horizontal color values for both odd and even scanlines (1920x1080 screen resolution → 960x1080 color resolution).

It might be surprising to know that almost all video, including all your (UHD) BluRays and online streams, has been compressed to 4:2:0; a reduction in both horizontal and vertical color resolution (1920x1080 screen resolution → 960x540 color resolution). Notice that the even scanlines contain no color information; rather, it's interpolated from the odd scanlines. 4:2:0 contains only 25% of the color (chroma) information and requires ~50% of the overall data rate, resulting in substantial data savings.

[For an excellent detailed writeup to explore more of what those numbers represent with graphics, see Spears & Munsil's Choosing A Color Space, 2nd Edition.]
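
To put numbers on it, here's a quick Python sketch of the chroma plane sizes and relative data rates for the schemes above (raw, uncompressed planes, before any codec gets involved):

# Chroma plane size and relative data rate for a 1920x1080 frame at each subsampling scheme.
def chroma_plane(width, height, scheme):
    h_div, v_div = {"4:4:4": (1, 1), "4:2:2": (2, 1), "4:2:0": (2, 2)}[scheme]
    return width // h_div, height // v_div

for scheme in ("4:4:4", "4:2:2", "4:2:0"):
    cw, ch = chroma_plane(1920, 1080, scheme)
    chroma_kept = (cw * ch) / (1920 * 1080)       # fraction of chroma samples retained
    data_vs_444 = (1 + 2 * chroma_kept) / 3       # 1 luma plane + 2 chroma planes vs 4:4:4
    print(f"{scheme}: chroma {cw}x{ch}, {chroma_kept:.0%} of chroma, {data_vs_444:.0%} of 4:4:4 data")
# 4:4:4 -> 1920x1080 (100%, 100%); 4:2:2 -> 960x1080 (50%, 67%); 4:2:0 -> 960x540 (25%, 50%)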

Color values can be of different bit depth resolution - typically 8-bits in standard dynamic range, and 10-bits for HDR10 (high dynamic range). Dolby Vision is graded up to 12-bits with high dynamic range and dynamic metadata to allow greater tonal flexibility with changing scenes. Then there's also color gamut which is the description for the range of colors these numbers represent (we talked a little about this back in 2016 on 4K TVs):

Note other terms like gamma correction/tone mapping and white point which we won't discuss here but which you might want to be familiar with. (image source)

Standard HDTV uses the Rec.709/sRGB (gamma 2.2) gamut whereas HDR can encompass richer and deeper colors as per the larger Rec.2020 triangle above.

Taken together, these color variables describe the "color space". There's obviously quite a bit of technical complexity here. Visual data encompasses more variables than the comparatively simple 2-channel audio data that audiophiles are used to (even DSD is probably easier to appreciate).

IV. A little more about Dolby Vision (MEL/FEL)

While there has been competition with HDR10+ (HDR 10 Plus), announced by Samsung and Amazon in 2017, the premier high dynamic range color format remains Dolby Vision which simply has significantly more content. Both of these systems allow dynamic tone mapping to be applied scene-by-scene (even frame-by-frame) which provides more subtle shading/coloring as compared to the static map applied throughout a movie with the base HDR10 profile available for all HDR displays.

[An interesting aside, many displays such as projectors are not Dolby Vision capable. But you might be able to "fake it" using Low-Latency Dolby Vision (LLDV) using products like the HDFury Vertex2 - have a look at this article.]

As we have seen over the years with audio technology, there are clearly diminishing returns as technology evolves. For example, the color bit depth (which refers to the number of shades of red/green/blue) of our flat screen display panels remains at 10 bits/color currently (30 bits total adding up R, G and B). Like audio, where we can only appreciate very high dynamic range if we push the volume up to deafening levels (the limit of the best DACs is somewhere around 20-21 bits even with the best 24-bit or 32-bit data), so too with video panels. While our typical TV sets might average 1000 nits (an easier term to say than "candela/square meter") of peak brightness, professional monitors like the Flanders Scientific XM312U, capable of an advertised 5000 nits, are still 10-bit panels and cost over $20k.

So how bright do we want/need our displays to be? How many bits?

For me, even a 500-nit TV in a dark room is enough when viewing from about 10' away. As with hi-fi audio and the need for low ambient noise for the best listening experience, I likewise can't imagine seriously enjoying high quality video unless the lights are turned down.

[Great comparison article here between HDR10/HDR10+/Dolby Vision.]

If you're wondering, Dolby Vision is able to deliver 12-bit color to your display over HDMI as 12-bit 4:2:2 YCbCr encapsulated into 8-bit/color (24-bit total) RGB labeled as Rec.709. This data is then decoded in your DV-capable TV to its full 12-bit Rec.2020 glory. The data rate of 4K/60fps Dolby Vision therefore works out to 12.54Gbps, within the HDMI 2.0 maximum data rate of 14.4Gbps. 

True uncompressed 4K/60fps 12-bit RGB or 4:4:4 YCbCr needs over 18.5Gbps, and thus HDMI 2.1 bandwidth would be required.
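
A quick Python check of that arithmetic (active pixels only; HDMI blanking intervals add roughly another 5%, which accounts for the small gap to the quoted figures):

# Active-pixel payloads for 4K/60; blanking intervals add roughly another 5% on the wire.
def active_gbps(bits_per_pixel):
    return 3840 * 2160 * 60 * bits_per_pixel / 1e9

dv_tunnel  = active_gbps(24)   # DV: 12-bit 4:2:2 averages 24 bits/pixel, same as 8-bit RGB
true_12bit = active_gbps(36)   # true 12-bit RGB or 4:4:4 YCbCr = 36 bits/pixel

print(f"DV tunneled 4K/60  : {dv_tunnel:.1f} Gbps")    # ~11.9 active, ~12.5 with blanking
print(f"12-bit 4:4:4 4K/60 : {true_12bit:.1f} Gbps")   # ~17.9 active, ~18.8 with blanking

Note that 12-bit 4:2:2 averages 24 bits/pixel, exactly the same as 8-bit RGB, which is why the Dolby Vision data can be repacked into that "Rec.709" container in the first place.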

One more thing about Dolby Vision for UHD BluRay connoisseurs - keep an eye out for Full Enhancement Layer (FEL, 12-bit) encoded discs as compared to standard Minimal Enhancement Layer (MEL, 10-bit) discs. Check out this list. Whether we can subjectively see a difference or not, 12-bits can objectively encode smoother gradients.

FEL is available on favorites and interesting titles like Gladiator (2000), Forrest Gump (1994), Lord of War (2005), Saving Private Ryan (1998), Top Gun (1986), Back To The Future Trilogy (1985), Braveheart (1995), Watchmen: Ultimate Cut (2009), and Shutter Island (2010) if you're looking for examples.

[Current video players like the nVidia Shield TV Pro (2019) and AppleTV 4K (latest 2022 model) will not decode the secondary Dolby Vision 12-bit Full Enhancement Layer. The only TV box I've seen that decodes UHD BluRay rips of Dolby Vision profile 7 with FEL is the Ugoos AM6B Plus TV Box (based on the Amlogic S922X-J SoC, about US$190, CAD$270) using CoreELEC Kodi software. You might be able to find a better deal on AliExpress, typically around US$160.
A fun toy for you perfectionistic videophiles with ripped UHD BluRay libraries who don't mind a little DIY "hacking". 😉]

Even though our display panels might be limited to 10 bits, that extra bit-depth data might still have some benefit in the display device to reduce banding according to Dolby (some debate here). Plus this is future-proof I suppose, since the PQ transfer function can encode luminance up to 10,000 nits if ever we have displays capable of that kind of brightness - not sure we need such a thing!
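
For the curious, PQ (SMPTE ST 2084) is just a fixed formula, and the 10,000-nit ceiling falls straight out of its constants. Here's a minimal Python sketch of the EOTF (nonlinear signal value in, absolute nits out):

# SMPTE ST 2084 (PQ) EOTF: nonlinear signal value in [0,1] -> absolute luminance in nits.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_eotf(signal):
    e = signal ** (1 / M2)
    return 10000 * (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1 / M1)

print(pq_eotf(0.0))   # 0 nits
print(pq_eotf(0.5))   # ~92 nits - roughly half the code range sits below ~100 nits
print(pq_eotf(1.0))   # 10000 nits - the ceiling mentioned above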

V. Optical HDMI for longer lengths and +5V Injector for marginal connections

Even though the Monoprice 15' HDMI 2.1 cable discussed above works well for me, sometimes you might want even longer cables like say 20+ feet. For those circumstances, you should look into fiber optic and active cables/extenders. For my basement movie room, I've been using the inexpensive Ablink 30' Optical HDMI 2.1 cable (about CAD$50) which has worked well. I see many brands with similar designs like this, or this. Note that the "optical" connection refers to the high speed audio/video data transmission. Within these cables there are still copper wires for power and features like the ethernet line.

Beyond length extension, another nice thing about optical HDMI is that the cables are thinner, making them easier to route down the conduits I installed behind the wall to hide wires. One does have to be careful not to damage the cable with excessive bending. Note that optical cables are genuinely directional (unlike expensive analog audio cables 🤔) so make sure you have the ends for the source and display connected appropriately.

When using an active optical cable where the HDMI port has to supply power to the transceiver module, if you ever run into unexpected issues like the monitor losing sync and going intermittently blank, or you notice data errors (pixel corruption, color banding), it could be because the HDMI port isn't supplying an adequately stable 5V to the transceiver. In these cases, you could try a 5V USB injector (about $10), assuming you have a USB port nearby:

If you don't have a USB port easily available, this one has its own AC adaptor.

Plug that into the source HDMI side and use power from the USB output (eg. 5V/1A from my Integra DRX-8.4 USB port) instead of just drawing from HDMI pin 18. Unless you have very long cables, for the most part you should be OK, but it's a good tool to consider for troubleshooting.

+5V injector from USB port connected to optical 30' HDMI cable.

Beyond the optical HDMI cables, there are active non-optical cables like this Cable Matters Active 48Gbps HDMI (again, mind the direction), or coupling active signal booster (Cable Matters, StarTech) as alternate means of extending range.

One more thing about power. Large screen TVs can suck up quite a bit of energy (150-200W isn't unusual with 4K UHD displays) with HDR turned on and the screen showing bright content. I've had situations where I plugged the TV into a power bar connected to other devices, causing voltage droops that resulted in loss of the HDMI connection. The simple solution is to plug your TV into its own outlet. Don't share it with your AV receiver or subwoofer(s), which might also suck up a few hundred watts.


VI. Concluding thoughts...

Yeah, I know, HDMI technology isn't the usual stuff audiophiles think about; but I think we should. 🙂

As modern audiophiles of the 21st Century, I think we need to appreciate audio as just one member of the family of multimedia technologies with converging qualitative goals toward higher fidelity reproduction. Whether it's the evolution of a high-bandwidth interface, better recording capabilities, improved digital-to-analog conversion, or advances in storage technology, these things will have impacts across audio and visual digital media. Broader knowledge of current objective performance helps protect us from purely subjective, foolish claims.

Furthermore, if we seriously want to push audio reproduction forward, as I discussed a year ago in my article on Fidelity, Immersion, and Realism (FIR), we need to consider multichannel audio to better re-create realistic sound fields, and the most common interface this rides on is of course HDMI. HDMI is widely available, there's already a lot of content, and it carries multichannel audio at full resolution without lossy compromises thanks to these phenomenal bitrates. HDMI for home entertainment is positioned as the one cable/interface to use until maybe a day when wireless technologies might more conveniently take over.

As I suggested above, I suspect the latest and greatest HDMI 2.2 will take a bit of time to be adopted since many of the latest features are simply not needed for most applications today (even ultra-ultra-high-res 8K/60fps displays are fine with HDMI 2.1). It's good that the hardware standard is available to encourage creativity and opportunities for content creators though!

Some technologies will never need very high speed interfaces. For example, hi-res 2-channel stereo audio hasn't needed to go beyond USB 2.0 (480Mbps) DACs even though we're up to USB4 now (20-120Gbps!).
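
The arithmetic makes that obvious; here's a quick Python check with raw sample data (before any USB framing overhead):

# Raw stereo audio payloads compared to USB 2.0's 480 Mbps signaling rate.
rates = {
    "24/192 PCM": 2 * 24 * 192_000,        # 2 channels, 24-bit, 192 kHz
    "32/768 PCM": 2 * 32 * 768_000,        # about as extreme as PCM DACs go
    "DSD512":     2 * 1 * 512 * 44_100,    # 1-bit at 512 x 44.1 kHz
}

for name, bps in rates.items():
    print(f"{name}: {bps / 1e6:5.1f} Mbps ({bps / 480e6:.1%} of USB 2.0)")
# 24/192 -> ~9.2 Mbps (1.9%), 32/768 -> ~49.2 Mbps (10.2%), DSD512 -> ~45.2 Mbps (9.4%)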

Ultimately, the human sensory systems are finite and can be saturated quite easily by modern digital technology. Diminishing returns with higher data rates, whether with audio or video, are simply to be expected. I think it's great to be living at a time where the evolution is still happening quickly, literally before our eyes!

I think that's all I have to say about HDMI and related topics for now. There is something else which I'll save for its own post next time. 😉

Hope you're all doing well as we enter March and the Spring weather ahead in the Northern Hemisphere!

New music for the week: Panda Bear's "Praise" off Sinister Grift (2025, DR6 2-channel stereo, DR11 multichannel/Atmos). Appropriate album title given recent political developments!?

Alt. rock with a bit of the Beach Boys vibe to my ears:


And for some retro-remix pop, here's Roxette with Galantis' single "Fading Like A Flower" (again, very cool in multichannel); enjoy:

3 comments:

  1. Thanks for this excellent update on HDMI.
    The modern HDMI signal could support up to 32 channels, but only 8ch of LPCM.
    Considering the advent of Atmos, we need at least 12ch; 16ch would be even better.
    At my job, we used an HDMI to AoIP converter to bring the gaming console's 8ch (7.1) audio over IP to our monitoring system.

    Replies
    1. Interesting Blogue,
      I thought that since HDMI 2.0 there have been 32 potential PCM channels that could be supported by devices as part of the standard, but as far as I am aware, there are no current consumer-level devices that expose more than 7.1 LPCM input, and I don't think I've seen Windows support something like 7.1.4 to discretely address height channels.

      I might be mistaken and would love to know if there is such a thing as a receiver that opens up >7.1 on the HDMI and if computers can access those additional channels!

  2. Thanks amigo. This was interesting. It's convinced me even more that 4K UHD Blu-ray is a waste of money. I do think HDR is a nice upgrade, but it's not worth the investment needed for HDR. I'm happy with 1080p Blu-ray.

    In Britain, second-hand Blu-ray discs can be bought for as little as £1, up to around £4. One day, I'd like to upgrade my system to a 5.1.4 setup. I would call myself a film fan, but I only have a handful of favourites, one of which is Watchmen, which doesn't have a Dolby Atmos mix.

    The film industry has done absolutely everything possible with 4k UHD Blu-ray to make it as appealing as dog shit.
