Friday 23 December 2016

MUSINGS: End of 2016 - Video Encoding (HEVC 10-bits, the HDR "Trinity"), Multichannel Streaming, and Other Thoughts...


For those of you into video, I suspect you're already very excited about the "next generation" H.265/HEVC encoding format. About a year back, I mentioned the impressive results I was seeing with HEVC playback on the Skylake HTPC I was putting together. A year down the road, we're seeing the ongoing development of software harnessing the power of the new encoding technique - even lower bitrates for very high quality output.

As we say goodbye to 2016, I thought I'd just "shoot the breeze" a bit and meander down some related topics. Let's talk about video encoding: what I've been doing, what I've found useful and interesting, and some speculation about what I think lies in the not-too-distant future as it applies to high dynamic range (HDR) video...

Over the last number of months, I have been gradually transitioning my movie server (primarily my Blu-Ray movie rips) from 1080P AVC/H.264 to HEVC/H.265 with excellent results and substantial space savings. For more details, have a look at this analysis from ExtremeTech in 2013 (the year HEVC was formally ratified). Basically, we can expect around a 50% reduction in bitrate using HEVC compared to AVC at the same quality. With very clean digitally-sourced movies like animations, the compression ratio can be even better. This type of advancement is clearly meaningful as the world moves toward delivery of media over networks. Just think of the bandwidth saved for companies like Netflix, Hulu, Amazon Prime, and Vudu as video transitions from MPEG2, VC1, and AVC to HEVC. Furthermore, the drive toward higher resolutions like 4K will demand this kind of progress in ever-better lossy algorithms - it's much cheaper to upgrade the software than a network hardware infrastructure... One practical benefit for many home users, including myself, is that a lower bitrate also reduces the need for buffering and improves performance in fluctuating network conditions, especially when streaming wirelessly (I have a wireless Roku 4 upstairs that handles HEVC better because of this).

Notice that in the video world nobody realistically expects lossless video - there's just way too much data! Video degradation is really not noticeable once you hit a reasonable bitrate, even more so when the encoder is given the intelligence and leeway to vary the bitrate for demanding scenes. I think it's important to keep this in mind when faced with irrational beliefs around lossy audio - that somehow quality is badly compromised even at reasonably high bitrates. I trust that by now it's clear to readers here that lossy 320kbps MP3 or 256kbps AAC sound fantastic, and it would be odd for folks (like the extreme audiophiles) to claim they can "easily" hear a difference (remember the blind test we did here years ago). This is the great thing about digital encoding - the accuracy and flexibility of digital transmission allow us to apply powerful and intelligent algorithms that push the limits of what is achievable.

For the "earlier adopters" (we're not that early any more!), I thought I'd discuss what I've been doing with my movie server in the last year. First, as noted above, I've been encoding all my movies to HEVC (using the Windows ecosystem). The two programs I have been using are Handbrake and DivX. My default is without question Handbrake which is free and for the most part trouble free using the open-source x265 project as the software encoding engine. I noticed at times that the current official release of Handbrake (0.10.5, getting old since Feb 2016) had issues with some judder in transcoded videos. This is when I would reach for DivX. So far, the "nightly" beta builds of Handbrake with newer versions of the x265 library have been very good (discussed below).

Already, standard 8-bit HEVC encodes look great, but in many corners of the Internet, we're seeing more people using 10-bit encoding. This started years ago (around 2011) with the introduction of 10-bit AVC Hi10P encodes among anime lovers. Since hand-drawn and cartoon-type animation often consists of large portions of the frame filled by solid colors with little noise, fine gradients and slight color irregularities can result in "banding" from 8-bit quantization that might not otherwise be noticeable in natural image capture, where noise, film grain, and fine detail are expected. A 10-bit encode expands the quantization range by 4x (ie. a maximum of 256 shades of gray becomes 1024 shades), and with the enhanced color precision there's no need to "hard encode" extra dithering, so bitrate can actually decrease while still achieving improved quality. The problem with these 10-bit AVCs, though, is that the decoding profile ("High 10 Profile" AVC) is typically beyond the specifications of most consumer hardware. This meant that, at least back in 2011 or so, you needed a pretty decent computer running a software decoder - typically at least a Core 2 Duo or equivalent... But that was then, and computing power is not really an issue now unless we're looking at low-power Intel Atoms and ARM SoCs like the Raspberry Pi.
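
To make the banding point concrete, here's a toy Python sketch (my own illustration, not anything from an actual encoder) quantizing a subtle gradient at 8 and 10 bits - exactly the kind of slow luminance ramp you'd see in an animated sky:

    # Quantize a subtle luminance ramp (a 1920-pixel row) at 8 vs. 10 bits.
    import numpy as np

    gradient = np.linspace(0.40, 0.45, 1920)   # gentle ramp, e.g. a sky or wall

    q8  = np.round(gradient * 255) / 255       # 8-bit: 256 possible levels
    q10 = np.round(gradient * 1023) / 1023     # 10-bit: 1024 possible levels

    print("8-bit steps: ", len(np.unique(q8)))    # ~13 -> visible "bands"
    print("10-bit steps:", len(np.unique(q10)))   # ~52 -> much finer gradation

The same ramp that collapses into about a dozen visible bands at 8 bits retains four times as many steps at 10 bits - which is why the anime folks jumped on Hi10P early.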

These days, with the advancement of video hardware and computing speed, I think it's time to look ahead and I believe that 10-bit HEVC will be the next ubiquitous "standard" for high quality encoding for the foreseeable future (next 10 years).

Here is what I've been doing over the last couple months...

On the software encoding side, 10-bit HEVC in Handbrake is already out there ready to try. Grab a copy of the latest "Nightly" build of 64-bit Handbrake for Windows. Then go here to grab the x265/x264 .dll files (you have to register for forum access) and drop them in the "Handbrake Nightly" directory like so:

Now when you run Handbrake Nightly, you'll see options for H.265 (and H.264) that can handle 10-bit and 12-bit color depths. Here's the typical setting I use for a 4K encode:


Pretty straightforward selection of the "H.265 10-bit (x265)" encoder. "Constant quality" level 24 looks great for 4K encodes (I'll typically use 22 for 1080P - a lower number means higher quality - given the lower resolution and less leeway for noticeable quality degradation, especially on my 75" Vizio P75-C1 TV). For the "Encoder Preset", I use "Medium", which slows the encoding process down to give the lossy analysis more time to optimize the search for interframe changes and improve quality. Also, notice that I've selected "High Profile" on the right, which will not automatically downsize the image resolution when doing 4K/2160P.
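
For what it's worth, the same settings can be expressed through HandBrake's command-line interface. Here's a minimal Python sketch of how I imagine a batch job would look - the encoder name "x265_10bit" is what the nightly builds list, but do verify the exact names with "HandBrakeCLI --help" on your own build:

    # Hypothetical batch wrapper around HandBrakeCLI (nightly build).
    import subprocess

    def encode_hevc_10bit(src, dst, quality=24, preset="medium"):
        """Transcode to 10-bit HEVC at constant quality, passing AC3 through."""
        subprocess.run([
            "HandBrakeCLI",
            "-i", src,
            "-o", dst,
            "-e", "x265_10bit",          # 10-bit x265 encoder
            "-q", str(quality),          # constant quality; lower = better
            "--encoder-preset", preset,  # "medium" trades speed for efficiency
            "-E", "copy:ac3",            # pass the AC3 audio through untouched
        ], check=True)

    encode_hevc_10bit("movie_2160p.mkv", "movie_2160p_hevc10.mkv")        # 4K at CQ 24
    encode_hevc_10bit("movie_1080p.mkv", "movie_1080p_hevc10.mkv", 22)    # 1080P at CQ 22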

Note that there is value in using 10-bit encoding even with 8-bit source videos. The file size can even be smaller for 10-bit than 8-bit due to the extra precision available to represent each pixel in a "constant quality" encode. In my own tests, I typically find file sizes about the same (+/-10%), which is a really insignificant difference. However, knowing that 10-bit encoding can reduce banding, especially in situations with limited bitrate, makes it worthwhile I think.

As a quick example, I recently ripped Ratatouille (2007) for my video server (for the kids of course! :-). The original Blu-Ray rip is 19.5GB of H.264 with 640kbps AC3 and English subtitles. I decided to keep the AC3 and English subs, only re-encoding to HEVC 10-bit, constant quality 22, Encoder Preset at Medium. The resulting file size was a mere 1.40GB! That's almost 1/14 the size of the original - remember, clean animated images compress beautifully:
1:1 500x500 image crop comparing original H.264 vs. H.265 transcode. Handbrake Nightly, Constant Quality = 22, Encoder Preset = Medium. x265 10-bit encoder engine. Remember, the JPG file shown here is 8-bit color only and lossy compressed.
Blogger likely re-compressed the image above so you're probably not seeing the exact image when "pixel-peeping", but I think you get the "picture". Image quality is beautiful, and when running at 24fps, any degradation is subjectively unnoticeable.

Like many things in life, there is a price to pay for quality... Two main points to keep in mind:

1. HEVC software encoding is slow. Currently with software like Handbrake, expect to leave your computer running overnight or through the workday for 1080P movies (and maybe a day for 4K). Typically, with my Core i7-3770K @ stock 3.5GHz, it takes me about 5 hours for a 2-hour 1080P movie. Expect at least 4x that for 4K video. Note that for newer-generation processors with AVX2 instructions, the x265 encoder has optimizations to take advantage of them. My "Skylake" i5-6500 machine in the home theater room is actually anywhere from 30-100% faster than my aforementioned stock "Ivy Bridge" i7-3770K workstation despite the lack of Hyper-Threading! Not unexpectedly, 10-bit video encoding does take longer than 8-bit - about 10-20% slower for me...

These are very rough estimates since it really all depends on the video itself - how dynamic it is, with action shots taking longer to encode and more space to compress. Beware of very noisy video as well. For example, the grainy, dark look of Batman v Superman really bogged down the conversion rate compared to the "cleanliness" of Pixar movies, which can be converted more quickly and at lower bitrates while achieving similar image quality.

While fast hardware encoding is possible (like with an nVidia Pascal GPU) and is becoming more common, I suspect HEVC 10-bit hardware encoding will go "mainstream" when Intel releases Kaby Lake next year, making it available in mainstream CPUs. Since I recently got an ASUS nVidia GTX 1080 graphics card, I did a quick test using myFFmpeg encoding with nVidia's NVENC. The results were OK - almost real-time transcoding (20-30 fps) of high-bitrate 4K 10-bit HEVC to 8-bit Y'CbCr 4:2:0. At equivalent bitrates (say 4K video at ~10Mbps), the x265 software encode does look better. This has been the general criticism of fixed hardware encoding - it's fast but typically needs a bit more bitrate to achieve the same quality as a good software encoder.
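
For the curious, the underlying ffmpeg invocation for a test like this is simple. A sketch in Python (assuming an ffmpeg build compiled with NVENC support; the file names are just placeholders):

    # Hardware HEVC transcode: 4K 10-bit source down to 8-bit 4:2:0 at ~10 Mbps.
    import subprocess

    subprocess.run([
        "ffmpeg",
        "-i", "source_2160p_hevc10.mkv",
        "-c:v", "hevc_nvenc",      # NVIDIA's hardware HEVC encoder
        "-pix_fmt", "yuv420p",     # force 8-bit Y'CbCr 4:2:0 output
        "-b:v", "10M",             # ~10 Mbps, comparable to the x265 test
        "-c:a", "copy",            # leave the audio untouched
        "output_2160p_nvenc.mkv",
    ], check=True)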

2. Make sure you have good HEVC (especially 10-bit) hardware decoding. There's no point saving all this storage space while maintaining excellent quality with the latest high-tech encoding technique if you can't play it back smoothly! Unlike the AVC world, where 10-bit hardware support has remained limited, 10-bit support in HEVC is essential because "High Dynamic Range" (HDR) basically requires the extra bit depth to make it worthwhile. I mentioned a couple weeks back that I use an Amlogic S905X TV box for movie streaming. At this time, this is the most economical way I know of to decode HEVC "Main10" files in hardware. For ~$50, I bought an inexpensive Sunvell T95X box:
Notice the Logitech Unifying receiver on the right; the device is controlled with my K410 keyboard/trackpad.

If you look around, there are many other equivalent machines like this, this, and this... I have the 2GB RAM/16GB storage model, which works well with this excellent LibreElec firmware from kszaq (the box comes with Android installed, but I prefer the stability and functionality of the custom firmware). The S905X chipset is capable of HDR10 video output. Also, the ODROID-C2 I previously discussed runs the Amlogic S905 SoC, which works well for hardware HEVC 10-bit decoding using OpenPHT (with a Plex server) or LibreElec/Kodi, but without HDR support. As you might guess, the S905X is the successor to the S905, adding HDR10 ability and VP9 hardware decoding. I have had no difficulty bitstreaming AC3, DTS, TrueHD, and DTS-HD MA audio to the receiver using these devices.

With HDR being incorporated into essentially all new TVs, 10-bit hardware decoding will become standard whether the panel itself is actually 8-bit or not. Furthermore, Intel's Kaby Lake will include 10-bit HEVC support, so 10-bit HEVC is about as guaranteed to become mainstream and ubiquitous as any technology standard can be.

Lastly, not to be ignored, I think it's still worth keeping an eye on Google's royalty-free VP9 which they've outfitted with 10-bit/HDR capability through "Profile 2" and rolled out officially with the recent Chromecast Ultra and YouTube HDR:

I took the picture above about three weeks ago since I was curious to see HDR YouTube for myself... My Vizio P75-C1 can already do standard 4K casting but not 10-bit VP9, so likely no YouTube HDR support. I popped down to the Best Buy down the street to pick up the device. Strangely, I could not get this thing working in HDR10 mode, plus I couldn't even get Dolby Vision working on Netflix despite having no issues with the TV's built-in casting function. This is obviously very disappointing, especially since the Chromecast Ultra's advertising claims DV compatibility. Oh well, I didn't feel like fooling around with it further and got my refund. Who knows, maybe there'll be a firmware upgrade, as I hear others have been troubled by the lack of Dolby Vision success as well. I'm a little surprised I couldn't even get HDR10 working easily through my Yamaha RX-V781 receiver even though the cheap S905X TV box above worked!

Let's talk audio for a little bit...

For those of you with a library of multichannel files like 5.1 FLACs ripped from DVD-A/Blu-Ray/SACDs, I have found no better way to stream multichannel than with the ODROID-C2 or an Amlogic S905X TV box using the free Kodi software through the HDMI interface to my Yamaha receiver as multichannel PCM. I just point Kodi at the multichannel music directory on my NAS under the Music tab, and away it goes identifying albums and grabbing artwork. Phenomenal! Here's what my TV screen looks like when streaming the 5.1 DVD-A rip of Queen's A Night At The Opera in multichannel 24/96:

Kodi is easily controlled with my Logitech keyboard/touchpad, but when playing music with the TV off (with HDMI still connected to the receiver decoding the audio, of course), it all works nicely with the Kore Android control app for selecting the album and song. Here it is running on my Vizio P75-C1's Android-based tablet "remote controller":

Within the audiophile world, I think it's unfortunate that more hasn't been written about multichannel audio and that relatively few releases are available - understandable given the rise of mobile listening. I've enjoyed and continue to listen to and collect 5.1 presentations when available. Nonetheless, I am glad that inexpensive receivers, the HDMI interface, and Kodi streaming devices like the above are not difficult to put together for a stable, reliable, inexpensive, and altogether compelling surround music experience!
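
Incidentally, Kore talks to Kodi through its JSON-RPC API, so if you'd rather script playback than poke at a remote, you can. A minimal Python sketch (the box's address and the NAS path are placeholders, and you'll need "Allow remote control via HTTP" enabled in Kodi's settings):

    # Ask Kodi to play a multichannel FLAC over its JSON-RPC interface.
    import requests

    KODI = "http://192.168.1.50:8080/jsonrpc"   # hypothetical box address

    def play_file(path):
        """Tell Kodi to start playing the given file from the NAS share."""
        payload = {
            "jsonrpc": "2.0",
            "id": 1,
            "method": "Player.Open",
            "params": {"item": {"file": path}},
        }
        response = requests.post(KODI, json=payload)
        response.raise_for_status()
        return response.json()

    play_file("nfs://nas/music_mch/Queen - A Night At The Opera/01.flac")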

--------------------

As suggested a couple months back, we are looking at great progress this year with 4K video and a push towards better image quality beyond just spatial resolution. I suspect that for many (most) consumers, all the discussions about HDR (High Dynamic Range), the various competing HDR standards (HDR10, Dolby Vision), and the video encoding variants like AVC/HEVC/VP9 sound like technogeek gibberish. That's fine of course, since I don't believe the average consumer should really need to care once a technology matures! Over the next few years, as more folks experience the technology and we settle on the important features that resonate with consumers, it'll "just work".

From the vantage point of late 2016, let me offer a look at the crystal ball and suggest what I see ahead (remember... crystal balls are murky and potentially inaccurate :-).

First, I doubt there'll be much of a "war" as some suggest (even on USA Today!) when it comes to HDR standards. The move to 4K has not been as rapid as the move from CRTs to flat-panel 720P/1080P, so although the techies will have their excitement, I just don't think consumers at large will run to the stores to pick up the latest HDR tech, much less 4K screens. Furthermore, unlike the potential licensing windfall from having one's physical format adopted (say HD-DVD vs. Blu-Ray, Betamax vs. VHS, SACD vs. DVD-A), there is only one UHD Blu-Ray standard. An HDR standard like HDR10 is open and royalty-free; there's no real money behind it to implement other than satisfying consumer expectations. Money will likely be made on the production side (eg. Dolby selling its production tools for Dolby Vision and getting it into as many cinemas as possible...) rather than the consumer end. As such, I suspect the HDR technologies will co-exist, much as the audio encoding techniques have coexisted with essentially universal compatibility among today's AV receivers for DD/DTS/TrueHD/DTS-HD MA. I suspect a "trinity" of HDR varieties will coexist by 2018, with TVs typically supporting all three at the high end and maybe two of the three at the basic end (probably those that do not want to license Dolby Vision). A TV compatible with these 3 HDR mechanisms should be rather future-proof... at least for the next 5 years, I imagine...

1. HDR10 - If there's a minimum standard, this is it: UHD Blu-Ray HDR with HEVC 10-bit Y'CbCr 4:2:0 encoding. HDR YouTube also uses HDR10 coupled with the VP9 10-bit (Profile 2) CODEC.

2. HLG - Still early and not even available on many consumer sets yet, but I suspect that the "Hybrid Log-Gamma" transfer function, recently ratified in Rec.2100 and paired with 10-bit HEVC, is what we're going to see more and more of in the years ahead. HLG will likely form the basis of HDR broadcasts when they become available: it does not require embedded metadata that can be lost during processing, it's royalty-free to implement, and it allows a single signal to remain compatible with both SDR and HDR displays (in the Rec.2020 color space) - probably as "simple" as it gets in many ways. Here's a BBC article on this (and apparently BBC's botched HDR streaming experiment so far on Panasonic TVs). With Android 7.0 Nougat supporting HLG, it could become the HDR standard for mobile devices - imagine capturing HDR video on your phone's camera, for example. The bottom line is that HLG could become the de facto format for home videos, computer video files, and home video servers with mixed SDR and HDR content. Furthermore, I suspect HLG would not be difficult to add to existing HDR TVs through firmware updates. Sony has already announced it for their top-of-the-line but older BVM-X300 monitor, introduced in early 2015. HLG is also officially supported in the HDMI 2.0b specs.
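
The "hybrid" in Hybrid Log-Gamma is easy to see in the transfer function itself. Here's a sketch in Python using the published ARIB STD-B67 / Rec.2100 constants - a conventional square-root (gamma-like) curve for the SDR-compatible lower range, switching to a logarithmic curve for the HDR highlights:

    # HLG OETF per ITU-R BT.2100: scene light e in [0,1] -> signal in [0,1].
    import math

    A = 0.17883277
    B = 0.28466892               # = 1 - 4*A
    C = 0.55991073               # = 0.5 - A*ln(4*A)

    def hlg_oetf(e):
        if e <= 1.0 / 12.0:
            return math.sqrt(3.0 * e)          # gamma-like, SDR-compatible region
        return A * math.log(12.0 * e - B) + C  # log region for HDR highlights

The two branches meet smoothly at e = 1/12 (signal level 0.5), which is precisely what lets an HLG signal display acceptably on a plain SDR set.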

For an excellent technical look at HLG, ST2084 and other HDR goodies, have a look at this LightIllusion page. Also, if you want to check out an HLG test stream on YouTube right now, here ya go (LG's "Cymatic Jazz" HDR demo - see here for even more details). Without an HLG-compatible screen, the video will be missing the dynamic "pop" of course:


BTW: No surprise that Dolby is not impressed with HLG - see these PDFs here and here. Remember that both HDR10 and Dolby Vision use the Perceptual Quantizer (PQ) EOTF (SMPTE ST-2084) designed by Dolby Labs, so the HLG transfer function is a competitor to PQ.
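
For comparison, PQ is an absolute-luminance curve. Here's a sketch of the ST-2084 EOTF from its published constants, mapping a normalized 0-1 signal to display light in nits:

    # PQ (SMPTE ST-2084) EOTF: normalized signal in [0,1] -> luminance in nits.
    m1 = 2610 / 16384            # ~0.1593
    m2 = 2523 / 4096 * 128       # 78.84375
    c1 = 3424 / 4096             # 0.8359375
    c2 = 2413 / 4096 * 32        # 18.8515625
    c3 = 2392 / 4096 * 32        # 18.6875

    def pq_eotf(signal):
        p = signal ** (1.0 / m2)
        y = max(p - c1, 0.0) / (c2 - c3 * p)
        return 10000.0 * y ** (1.0 / m1)   # full scale = 10,000 nits

    print(round(pq_eotf(1.0)))   # 10000 - the headroom Dolby Vision is built around

Unlike HLG's relative, scene-referred approach, PQ encodes absolute display luminance - hence the need for metadata to tell the TV how to tone-map content mastered beyond its capabilities.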

3. Dolby Vision - Sticks around because of Dolby's prowess in the professional movie/production world and the high-end Dolby Cinema theaters showing blockbusters and A-level movies. DV is dynamic, scene-based HDR with "end-to-end" matching of capabilities from the production studio to your TV - the "high end" quality standard. The other specs are impressive as well - a 10,000-nit range and 12-bit precision; hard to imagine needing anything more! Netflix, Amazon, and Vudu are already streaming DV. We're starting to see this in the consumer disc playback world with the Oppo UDP-203 player promising to support DV UHD Blu-Rays, though there's still no content yet (and as of late 2016, the Oppo needs a firmware update). Because of Dolby's licensing costs, I think DV will remain the "premium" specification that the cheapest players and TVs will miss out on while still supporting HDR10 and HLG.

Pssssst... Vizio, how about giving us HLG to complete the DV/HDR10/HLG "trinity" for the 2016 M- and P-series once you fix a few of those firmware bugs, improve HDR input lag, and sort out Y'CbCr 4:4:4? That would impress your competitors and consumers alike... :-) It would really lead the field in 2017 IMO.

[For completeness, there's also dynamic-metadata HDR10... Still a ways off at this stage, probably until the HDMI 2.1 standard. From a qualitative perspective, however, it'll have to compete with Dolby Vision and whatever level of content DV has at that point. If DV is going strong, I'm not sure who's going to care about this unless it's an easily-implemented firmware update.]

Merry Christmas and a very happy New Year everyone! Stay safe, and I hope you and yours are enjoying some quality time together.

Chat in 2017...

Update: Handbrake 1.0.1 was released around the Christmas 2016 holidays. It's good!

9 comments:

  1. Hi,

    Long time lurker, first time commentor. :-)
    I have spent a good chunk of time digging into transcoding this year too, during my research I stumbled across Don Melton's handbrake automation scripts which do a couple of key things;

    - hyper focused on quality versus file size using handbrakeCLI
    - better automation and batching (again, because the CLI version of handbrake)

    Unfortunately the scripts don't yet support h265 because of a bug in x265, but that should be fixed soon I hope. Either way, thought it would be a good share in case you didn't stumble across it on your own.

    Scripts and a very excellent readme:
    https://github.com/donmelton/video_transcoding

    A little extra help getting it up and running on Windows:
    https://ryanchristensen.net/how-to-rip-and-transcode-blurays/

    Specific details about why no h265 support yet:
    https://github.com/donmelton/video_transcoding/issues/59

    Lastly, if you get a Ruby SSL error, here are the instructions to fix it (better than the method in the "windows install" link above):
    http://guides.rubygems.org/ssl-certificate-update/

    Have a great holiday, keep up the great writing!
    K

    1. That sounded a bit commercial - I probably shouldn't have jumped right into the meat of it. Ah well, still very cool work even if I sound like a spambot.

    2. Very cool klogg!
      I didn't know about the scripts, so I'm very happy you dropped the tip... It didn't seem commercial at all. Amazing how much work folks have put in, and fantastic that it's all freely available.

      Merry Christmas...

    3. Another update as I came back to this to reference something;

      The bug in x265 that prevented it from working with Don Melton's scripts has long been fixed, and it works now (assuming you are running a platform that works with the DLLs you reference above), although the bitrates the scripts target are still geared for x264. Rather than tinker looking for a lower bitrate that delivers the same quality, I stuck with the same file sizes, hoping to squeeze out improved IQ for "free."

      I also bought a Ryzen 1700X because I got the itch, and it encodes x265 10-bit at almost exactly the same speed that my old i7-3770 did x264 8-bit. It's a cross of disappointment, because it doesn't feel like I improved my situation for the money spent, and elation, because for the same encode time I am getting next-gen files.

      Hope things are well, would always love an update if you refine your process or tools.

      Kyle

  2. AFAIK, h264 High10 showed a gain in quality just because the standard didn't pay attention to rounding between encoding steps, which caused additional losses. Using a higher bit depth "fixed" that for h264.

    OTOH, h265 addressed this, and using a higher bit depth with an 8-bit source won't gain you anything - you could actually lose quality (and speed) because of the unnecessary bit depth the encoder has to process (and the false assumptions it has to make). It's similar in audio: encoding a 16-bit source converted to 24-bit wouldn't gain you anything compared to just encoding the 16-bit source.

    Another thing is that x265 is very good at retaining quality at lower bitrates than x264, but it preserves less detail at higher bitrates. So if you want to lose as little as possible compared to the source, then x264 might actually be better (and it is, for some people encoding for backup purposes).

    1. Hello!
      You could be right. Interesting comment about higher bitrate and retention of details in AVC vs. HEVC.

      In some simple testing I have done at low bitrates to bring out imperfections, I have noticed, however, that 10-bit x265 HEVC does seem to reduce quantization effects compared to 8-bit encoding, even with an 8-bit AVC source at a constant bitrate. I'll see if I can dig up a sample of this for a post in the next couple of days...

      Of course, if we let the encoder use higher bitrate, whatever potential benefit would likely be moot.

    2. Just posted a little comparison / discussion on this...
      http://archimago.blogspot.com/2016/12/quick-compare-avc-vs-hevc-8-bit-vs-10.html

  3. Hi!

    It's nice to see how video encoding is evolving. But I'll wait at least 5 more years before I jump on the 4K/HDR bandwagon (the system is too immature for me - the guys at Digital Foundry on YouTube made a video about the confusing HDR settings, even on good TVs).

    About the compression, it is really silly to say that MP3 at 320kbps sounds bad. It all depends on the source material and how well trained your ears are at identifying the artifacts. In video, 99% of people don't see any difference between a RAW image and a compressed one, but some video professionals and photographers do. Does that mean h.265 is bad? Of course not! But... some people are irrational when we talk about audio, so...

    Best Regards!

