For those of you into video, I suspect you're already very excited about the "next generation" H.265/HEVC encoding format. About a year back, I made mention of the impressive results I was seeing with HEVC playback on the Skylake HTPC I was putting together. A year down the road, we see the ongoing development of software harnessing the power of the new encoding technique - ever lower bitrates for very high quality output.
As we say goodbye to 2016, I thought I'd just "shoot the breeze" a bit and meander down some related topics. Let's talk about video encoding: what I've been doing, what I've found useful and interesting, and some speculation about what I think lies in the not-too-distant future as it applies to high dynamic range (HDR) video...
Over the last number of months, I have been gradually transitioning my movie server (primarily my Blu-Ray movie rips) from 1080P AVC/H.264 to HEVC/H.265 with excellent results and substantial space savings. For more details, have a look at this analysis from ExtremeTech in 2013 (the year HEVC was formally ratified). Basically, we can expect around a 50% reduction in bitrate using HEVC compared to AVC at the same quality. At times, with very clean digitally sourced movies like animations, the compression ratio can be even better. This kind of advancement is clearly meaningful as the world moves toward delivery of media over networks. Just think of the bandwidth saved by companies like Netflix, Hulu, Amazon Prime, and Vudu in transitioning video from MPEG2, VC1, and AVC to HEVC. Furthermore, the drive toward higher resolutions like 4K will demand this kind of progress in ever better lossy algorithms - it's much cheaper to upgrade the software than the network hardware infrastructure... One practical benefit for many home users, myself included, is that a lower bitrate also reduces the need for buffering and improves performance under fluctuating network conditions, especially when streaming wirelessly (I have a wireless Roku 4 upstairs that handles HEVC better because of this).
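To put that ~50% figure in perspective, here's a quick back-of-the-envelope calculation - the bitrates are illustrative round numbers, not measurements from any particular disc:

```python
# Back-of-the-envelope: what a ~50% bitrate reduction means in practice.
# The bitrates below are illustrative round numbers, not measured values.

GB = 1000 ** 3  # decimal gigabyte, as storage is usually marketed

def size_gb(bitrate_mbps: float, minutes: float) -> float:
    """File size in GB for a given average video bitrate and runtime."""
    return bitrate_mbps * 1e6 / 8 * minutes * 60 / GB

avc_mbps = 20.0             # a plausible 1080P Blu-Ray video bitrate
hevc_mbps = avc_mbps * 0.5  # ~50% reduction at similar quality

runtime = 120  # a 2-hour movie
print(f"AVC : {size_gb(avc_mbps, runtime):.1f} GB")
print(f"HEVC: {size_gb(hevc_mbps, runtime):.1f} GB")
```

For a 2-hour movie, that's roughly 18GB of video shrinking to about 9GB - multiply that across a whole server (or a streaming service's user base) and the savings add up quickly.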
Notice that in the video world, nobody realistically expects lossless video - there would just be way too much data! Video degradation is really not noticeable once you hit a reasonable bitrate, even more so when the encoder is given the intelligence and leeway to use variable bitrates for demanding scenes. I think it's important to keep this in mind when faced with irrational beliefs around lossy audio - that somehow quality is badly compromised even at reasonably high bitrates. I trust that by now it's clear to readers here that lossy 320kbps MP3 or 256kbps AAC sound fantastic, and it would be odd for folks (like the extreme audiophiles) to claim they can "easily" hear a difference (remember the blind test we did here years ago). This is the great thing about digital encoding - the accuracy and flexibility of digital transmission allow us to apply powerful and intelligent algorithms that push the limits of what is achievable.
For the "earlier adopters" (we're not that early any more!), I thought I'd discuss what I've been doing with my movie server over the last year. First, as noted above, I've been encoding all my movies to HEVC (using the Windows ecosystem). The two programs I have been using are Handbrake and DivX. My default is without question Handbrake, which is free and for the most part trouble-free, using the open-source x265 project as the software encoding engine. I noticed at times that the current official release of Handbrake (0.10.5, getting old since Feb 2016) had issues with judder in transcoded videos - this is when I would reach for DivX. So far, the "nightly" beta builds of Handbrake with newer versions of the x265 library have been very good (discussed below).
Already, standard 8-bit HEVC encodes look great, but in many corners of the Internet we're seeing more people use 10-bit encoding. This started years ago (around 2011) with the introduction of 10-bit AVC "Hi10P" encodes among anime lovers. Since hand-drawn and cartoon-type animation often consists of large portions of the frame filled with solid colors and little noise, fine gradients and slight color irregularities can result in "banding" from 8-bit quantization that might not otherwise be noticeable in natural image capture, where noise, film grain, and fine detail are expected. A 10-bit encode expands the quantization levels by 4x (ie. a maximum of 256 shades of gray increases to 1024 shades); with the enhanced color accuracy, there's no need to "hard encode" extra dithering, so bitrate can actually decrease while still achieving improved quality. The problem with these 10-bit AVCs, though, is that the decoding "profile" ("High 10 Profile" AVC) is typically beyond the specifications of most consumer hardware. This meant that, at least back in 2011 or so, you needed a pretty decent computer running a software decoder - typically at least a Core 2 Duo or equivalent... But that was then, and computing power is not really an issue now unless we're looking at low-power Intel Atoms and ARM SoCs like the Raspberry Pi.
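The bit-depth arithmetic is simple enough to check yourself - more levels per channel means smaller steps between adjacent shades, which is exactly why smooth gradients band less:

```python
# Why 10-bit helps with banding: more quantization levels means smaller
# steps between adjacent shades, so smooth gradients need less dithering
# to look continuous.

for bits in (8, 10, 12):
    levels = 2 ** bits
    step = 1 / (levels - 1)  # step size across a normalized 0..1 luma ramp
    print(f"{bits}-bit: {levels} levels per channel, step = {step:.6f}")

print("10-bit has", 2 ** 10 // 2 ** 8, "x the levels of 8-bit")
```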
These days, with the advancement of video hardware and computing speed, I think it's time to look ahead and I believe that 10-bit HEVC will be the next ubiquitous "standard" for high quality encoding for the foreseeable future (next 10 years).
Here is what I've been doing over the last couple months...
On the software encoding side, 10-bit HEVC for Handbrake is already out there ready to try. Grab a copy of the latest "Nightly" build of 64-bit Handbrake for Windows. Then go here to grab the x265/x264 .dll files (you have to register for forum access) and drop them into the "Handbrake Nightly" directory like so:
Now when you run Handbrake Nightly, you'll see options for H.265 (and H.264) that can handle 10-bit and 12-bit color depths. Here's the typical setting I use for a 4K encode:
Pretty straightforward - just select the "H.265 10-bit (x265)" encoder. "Constant quality" level 24 looks great for 4K encodes (I'll typically use 22 for 1080P given the lower resolution and less leeway for noticeable quality degradation, especially on my 75" Vizio P75-C1 TV - remember, a lower number means higher quality). For the "Encoder Preset", I use "Medium", which slows down the encoding process by allowing the lossy analysis more time to optimize the search for interframe changes and improve quality. Also, notice that I've selected "High Profile" on the right, which will not automatically downsize the image resolution when doing 4K/2160P.
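For those who prefer scripting their transcodes, the same settings can be expressed through Handbrake's command-line interface. Here's a little Python sketch that builds the equivalent HandBrakeCLI call - the "x265_10bit" encoder name assumes a recent nightly/1.0-era CLI build, and the file paths are placeholders:

```python
# Sketch: the GUI settings above expressed as a HandBrakeCLI invocation.
# Assumes a recent build whose CLI exposes the x265 10-bit encoder as
# "x265_10bit"; file names are placeholders.
import subprocess

cmd = [
    "HandBrakeCLI",
    "-i", "movie-4k.mkv",         # source rip (placeholder path)
    "-o", "movie-4k-hevc.mkv",    # output file (placeholder path)
    "-e", "x265_10bit",           # "H.265 10-bit (x265)" encoder
    "-q", "24",                   # constant quality 24 for 4K (I use 22 for 1080P)
    "--encoder-preset", "medium", # slower analysis, better compression
]
print(" ".join(cmd))
# subprocess.run(cmd, check=True)  # uncomment to actually run the transcode
```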
Note that there is value in using 10-bit encoding even with 8-bit source videos. The file size can actually come out smaller for 10-bit than 8-bit thanks to the extra precision available to represent each pixel in a "constant quality" encode. In my own tests, I typically find file sizes about the same (+/-10%), which is a really insignificant difference. However, knowing that 10-bit encoding can reduce banding, especially in situations of limited bitrate, makes it worthwhile I think.
As a quick example, I recently ripped Ratatouille (2007) for my video server (for the kids of course! :-). The original Blu-Ray rip is 19.5GB of H.264 with 640kbps AC3 and English subtitles. I decided to keep the AC3 and English subs, only re-encoding the video to HEVC 10-bit, constant quality 22, Encoder Preset at Medium. The resulting file size was a mere 1.40GB! That's almost 1/14 the size of the original - remember, clean animated images compress beautifully:
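Running the numbers on that Ratatouille example:

```python
# The Ratatouille example above: how big was the space saving?
original_gb = 19.5  # original Blu-Ray H.264 rip
hevc_gb = 1.40      # HEVC 10-bit, constant quality 22, Medium preset

ratio = original_gb / hevc_gb
saved_pct = (1 - hevc_gb / original_gb) * 100
print(f"Compression ratio: {ratio:.1f}:1 ({saved_pct:.0f}% saved)")
```

That's roughly 13.9:1, or about 93% of the space reclaimed - an extreme case thanks to the clean animated source, but it shows what HEVC can do.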
|1:1 500x500 image crop comparing original H.264 vs. H.265 transcode. Handbrake Nightly, Constant Quality = 22, Encoder Preset = Medium. x265 10-bit encoder engine. Remember, the JPG file shown here is 8-bit color only and lossy compressed.|
Like many things in life, there is a price to pay for quality... Two main points to keep in mind:
1. HEVC software encoding is slow. Currently, with software like Handbrake, expect to leave your computer running overnight or during the workday for a 1080P movie (and maybe a full day for 4K). Typically, with my Core i7-3770K @ stock 3.5GHz, it takes about 5 hours for a 2-hour 1080P movie. Expect at least 4x that for 4K video. Note that for newer-generation processors with AVX2 instructions, the x265 encoder has optimizations to take advantage of them. My "Skylake" i5-6500 machine in the home theater room is actually anywhere from 30-100% faster than my aforementioned stock "Ivy Bridge" i7-3770K workstation despite the lack of Hyper-Threading! Not unexpectedly, 10-bit video encoding does take longer than 8-bit - about 10-20% slower for me...
These are very rough estimates since it all depends on the video itself - how dynamic it is, for instance, with action scenes taking longer and more space to compress. Beware of very noisy video as well. For example, the grainy, dark look of Batman v Superman really bogged down the conversion rate compared to the "cleanliness" of Pixar movies, which convert quicker and achieve similar image quality at lower bitrates.
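Those timings from my i7-3770K work out to a throughput figure you can sanity-check yourself:

```python
# Sanity-checking the encode-speed numbers above: a 2-hour movie at
# 24 fps versus the ~5 hours my i7-3770K takes for a 1080P software
# x265 encode.
movie_minutes = 120
movie_fps = 24.0    # typical film frame rate
encode_hours = 5.0

total_frames = movie_minutes * 60 * movie_fps
encode_fps = total_frames / (encode_hours * 3600)
print(f"{total_frames:.0f} frames -> ~{encode_fps:.1f} fps encoding speed")
```

About 10 fps, or well under half real-time - which is why the overnight-encode habit makes sense.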
While fast hardware encoding is possible (such as with an nVidia Pascal GPU) and is becoming more common, I suspect HEVC 10-bit hardware encoding will go "mainstream" when Intel releases Kaby Lake next year, bringing it to commodity CPUs. Since I recently got an ASUS nVidia GTX 1080 graphics card, I did a quick test using myFFmpeg encoding with nVidia's NVENC. The results were OK - almost real-time transcoding (20-30 fps) of high-bitrate 4K 10-bit HEVC to 8-bit Y'CbCr 4:2:0. At equivalent bitrates (say 4K video at ~10Mbps), the x265 software encode does look better. This has been the general criticism of fixed hardware encoding - it's fast, but typically needs a bit more bitrate to achieve the same quality as a good software encoder.
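Since myFFmpeg is just a front-end for FFmpeg, the same NVENC test can be sketched as a raw command line. This Python sketch builds an invocation like the one I ran - it assumes an FFmpeg build with the "hevc_nvenc" encoder (and a Pascal GPU behind it); the filenames are placeholders:

```python
# Sketch of the NVENC transcode as a plain FFmpeg call. Assumes an FFmpeg
# build compiled with the "hevc_nvenc" encoder and a capable NVIDIA GPU;
# file names are placeholders.
import subprocess

cmd = [
    "ffmpeg",
    "-i", "uhd-10bit-source.mkv",  # high-bitrate 4K 10-bit HEVC source (placeholder)
    "-c:v", "hevc_nvenc",          # NVIDIA hardware HEVC encoder
    "-b:v", "10M",                 # ~10 Mbps target, as in the quality comparison
    "-pix_fmt", "yuv420p",         # force 8-bit Y'CbCr 4:2:0 output
    "-c:a", "copy",                # pass the audio through untouched
    "uhd-8bit-nvenc.mkv",
]
print(" ".join(cmd))
# subprocess.run(cmd, check=True)  # uncomment to run the transcode
```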
2. Make sure you have good HEVC (especially 10-bit) hardware decoding. There's no point saving all this storage space while maintaining excellent quality with the latest high-tech encoding technique if you can't play it back smoothly! Unlike in the world of AVC playback, where 10-bit support remained limited, 10-bit HEVC support is essential because "High Dynamic Range" (HDR) basically requires the extra bit depth to be worthwhile. I mentioned a couple weeks back that I use an Amlogic S905X TV box for movie streaming. At this time, this is the most economical way I know of to decode HEVC "Main10" files in hardware. For ~$50, I bought an inexpensive Sunvell T95X box:
|Notice the Logitech Unifying receiver on the right; the device is controlled with my K410 keyboard/trackpad.|
With HDR being incorporated into essentially all new TVs, 10-bit hardware decoding will become standard whether the panel itself is actually 8-bit or not. Furthermore, Intel's Kaby Lake will include 10-bit HEVC support, so 10-bit HEVC is about as guaranteed to become mainstream and ubiquitous as any technology standard can be.
Lastly, not to be ignored, I think it's still worth keeping an eye on Google's royalty-free VP9 which they've outfitted with 10-bit/HDR capability through "Profile 2" and rolled out officially with the recent Chromecast Ultra and YouTube HDR:
I took the picture above about three weeks ago since I was curious to see HDR YouTube for myself... My Vizio P75-C1 can already do standard 4K casting but not 10-bit VP9, so likely no YouTube HDR support. I popped down to the Best Buy down the street to pick up the device. Strangely, I could not get this thing working in HDR10 mode, plus I couldn't even get Dolby Vision working on Netflix despite having no issues with the TV's built-in casting function. This is obviously very disappointing, especially since the Chromecast Ultra's advertising claims DV compatibility. Oh well, I didn't feel like fooling around with it further and got my refund. Who knows, maybe a firmware update will fix things - I hear others have been troubled by the lack of Dolby Vision success as well. I'm a little surprised I couldn't even get HDR10 working easily through my Yamaha RX-V781 receiver even though the cheap S905X TV box above worked!
Let's talk audio for a little bit...
For those of you with a library of multichannel files like 5.1 FLACs ripped from DVD-A/Blu-Ray/SACDs, I have found no better way to stream multichannel audio than with the ODROID-C2 or an Amlogic S905X TV box running the free Kodi software, sending multichannel PCM over HDMI to my Yamaha receiver. I just point Kodi to the multichannel music directory on my NAS under the Music tab, and away it goes identifying albums and grabbing artwork. Phenomenal! Here's what my TV screen looks like when streaming the 5.1 DVD-A rip of Queen's A Night At The Opera in multichannel 24/96:
Kodi is easily controlled with my Logitech keyboard/touchpad, but when playing music with the TV off (but HDMI still connected to the receiver decoding audio of course), it all works nicely with the Kore Android control app for selecting the album and song. Here it is running on my Vizio P75-C1's Android-based pad "remote controller":
Within the audiophile world, I think it's unfortunate that more hasn't been written about multichannel audio and that so few releases are available in surround - understandable, I suppose, given the rise of mobile audio. I've enjoyed, and continue to listen to and collect, 5.1 presentations when available. Nonetheless, I am glad that inexpensive receivers, the HDMI interface, and Kodi streaming devices like the above are not difficult to put together for a stable, reliable, inexpensive, and altogether compelling surround music experience!
As suggested a couple months back, we are looking at great progress this year with 4K video and a push toward better image quality beyond just spatial resolution. I suspect that for many (most) consumers, all the discussions about HDR (High Dynamic Range), the various competing HDR standards (HDR10, Dolby Vision), and the video encoding variants like AVC/HEVC/VP9 sound like technogeek gibberish. That's fine, of course, since I don't believe the average consumer should really need to care once a technology matures! Over the next few years, as more folks experience the technology and we settle on the important features that resonate with consumers, it'll "just work".
From the vantage point of late 2016, let me offer a look at the crystal ball and suggest what I see ahead (remember... crystal balls are murky and potentially inaccurate :-).
First, I doubt there'll be much of a "war" as some suggest (even on USA Today!) when it comes to HDR standards. The move to 4K has not been as rapid as the move from CRTs to flat-panel 720P/1080P, so although the techies will have their excitement, I just don't think consumers at large will run to the stores to pick up the latest 4K screens, much less HDR tech. Furthermore, unlike the potential licensing windfall from having one's physical format adopted (like say HD-DVD vs. Blu-Ray, Betamax vs. VHS, SACD vs. DVD-A), there is only one UHD Blu-Ray standard, and an HDR standard like HDR10 is open and royalty-free - no real money behind implementing it other than satisfying consumer expectations. Money will likely be made on the production side (eg. Dolby selling its production tools for Dolby Vision and getting it into as many cinemas as possible...) rather than the consumer end. As such, I suspect the HDR technologies will co-exist, much like the audio encoding techniques have coexisted with essentially universal compatibility among today's AV receivers for DD/DTS/TrueHD/DTS-HD MA. I suspect a "trinity" of HDR varieties will coexist by 2018, with TVs typically supporting all three at the high end and maybe two of the three at the basic end (probably those that do not want to license Dolby Vision). A TV compatible with all three HDR mechanisms should be rather future-proof... at least for the next 5 years, I imagine...
1. HDR10 - The minimum standard; this is what UHD Blu-Ray uses for HDR, with HEVC 10-bit Y'CbCr 4:2:0 encoding. HDR YouTube also uses HDR10 coupled with the VP9 10-bit (Profile 2) CODEC.
2. HLG - Still early and not even available on many consumer sets yet, but I have a suspicion that the "Hybrid Log-Gamma" transfer function recently ratified in Rec.2100, paired with 10-bit HEVC, is what we're going to end up seeing more and more of in the years ahead. HLG will likely form the basis of HDR broadcasts when they arrive: it does not require embedded metadata that can be lost during processing, is royalty-free to implement, and allows a single signal to remain compatible with both SDR and HDR displays (in Rec.2020 color space) - probably as "simple" as HDR gets in many ways. Here's a BBC article on this (and apparently BBC's botched HDR streaming experiment so far on Panasonic TVs). With Android 7.0 Nougat supporting HLG, it could become the HDR standard for mobile devices - imagine capturing HDR video on your phone's camera, for example. Bottom line: HLG could be the de facto format for home videos, computer video files, and home video servers with mixed SDR and HDR content. Furthermore, I suspect HLG would not be difficult to add to existing HDR TVs through firmware updates. Sony has already announced it for their top-of-the-line (but older) BVM-X300 monitor, introduced in early 2015. HLG is also officially supported in the HDMI 2.0b specs.
For an excellent technical look at HLG, ST2084 and other HDR goodies, have a look at this LightIllusion page. Also, if you want to check out an HLG test stream on YouTube right now, here ya go (LG's "Cymatic Jazz" HDR demo - see here for even more details). Without an HLG-compatible screen, the video will be missing the dynamic "pop" of course:
BTW: No surprise that Dolby is not impressed with HLG - see these PDFs here and here. Remember that both HDR10 and Dolby Vision use the Perceptual Quantizer (PQ) EOTF (SMPTE ST-2084) designed by Dolby Labs, so the HLG transfer function is a direct competitor to PQ.
3. Dolby Vision - Sticks around because of Dolby's prowess in the professional movie/production world and the high-end Dolby Cinema theaters showing blockbusters and A-level movies. DV is dynamic, scene-based HDR with "end-to-end" matching of capabilities from the production studio to your TV - the "high end" quality standard. The other specs are impressive as well - a 10,000-nit range and 12-bit precision; hard to imagine needing anything more! Netflix, Amazon, and Vudu are already streaming DV. We're starting to see it in the consumer disc playback world with the Oppo UDP-203 player promising to support DV UHD Blu-Rays, though there's still no content yet (and as of late 2016, the Oppo needs a firmware update). Because of Dolby's licensing costs, I think DV will remain the "premium" specification that the cheapest players and TVs will miss out on while still supporting HDR10 and HLG.
Pssssst... Vizio, how about giving us HLG to complete the DV/HDR10/HLG "trinity" for the 2016 M and P-series once you fix a few of those firmware bugs, improve input HDR lag, and Y'CbCr4:4:4? That would impress your competitors and consumers alike... :-) This would really lead the field in 2017 IMO.
[For completeness, there's also dynamic-metadata HDR10... still a ways off at this stage, probably until the HDMI 2.1 standard. From a qualitative perspective, however, it will have to compete with Dolby Vision and whatever level of content DV has at that point. If DV is going strong, I'm not sure who's going to care about this unless it's just an easily-implemented firmware update.]
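For the mathematically curious, the two transfer functions at the heart of this HDR discussion are simple enough to compute yourself. Here's a little Python sketch of the PQ (SMPTE ST-2084) EOTF and the HLG OETF, using the constants from the published specs:

```python
import math

# PQ (SMPTE ST-2084) EOTF: normalized signal V in [0,1] -> luminance in nits.
def pq_eotf(v: float) -> float:
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    p = v ** (1 / m2)
    return 10000 * (max(p - c1, 0) / (c2 - c3 * p)) ** (1 / m1)

# HLG OETF (Rec.2100): scene light E in [0,1] -> signal in [0,1].
# Square-root ("gamma-like") below the knee, logarithmic above it.
def hlg_oetf(e: float) -> float:
    a, b, c = 0.17883277, 0.28466892, 0.55991073
    return math.sqrt(3 * e) if e <= 1 / 12 else a * math.log(12 * e - b) + c

print(f"PQ full-scale signal -> {pq_eotf(1.0):.0f} nits")  # the 10,000-nit ceiling
print(f"HLG knee at E = 1/12 -> signal {hlg_oetf(1 / 12):.2f}")
```

Note how PQ maps the full signal range to an absolute 10,000-nit luminance scale (hence the need for metadata about mastering levels), while HLG stays relative like traditional gamma - which is exactly why it can remain backward compatible with SDR displays.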
Merry Christmas and a very happy New Year everyone! Stay safe, and I hope you're enjoying quality time with your loved ones.
Chat in 2017...
Update: Handbrake 1.0.1 was released around the Christmas 2016 holidays. It's good!