Since returning from my Asia trip, I see that new toys for Christmas have now been released, including the new Xbox One X (which I mentioned a few months back).
Although it seems there may be issues that need to be fixed, one nice feature of the Xbox One S and X machines is the ability to play the new UHD Blu-Ray disks which have been available since March 2016. Though there are a number of disks out there now, I suspect the growth has not been spectacular... Not unexpected really, since the jump from 2K (1080P) to 4K (2160P) resolution does require the right conditions as laid out previously to be truly appreciated. Plus, like music, movies have gone "virtual", with streaming being the primary mechanism of consumption.
For this post, I thought I'd put together a few discussion items about video IMO worth thinking about as we are very much in the era of the 4K videophile!
I. Let's talk Spatial Resolution

As a start, if you're reading this on a 4K monitor or TV, for an excellent demonstration of video resolution, have a look at Steve Yedlin's resolution demo to "see" the difference (or lack thereof).
The reason I bring this up is because most movies these days are still produced in 2K in the final "mastering" - the Digital Intermediate is typically still done in 2K (1080P). For fun, have a look at the site Real or Fake 4K to search for your favorite movie and see whether it likely has retained "real" 4K resolution throughout the production chain. You might also notice that most CGI and visual effects elements are still rendered in 2K on most current flicks, including high-budget movies from Pixar's Cars 3 to Marvel/Disney's Spider-man: Homecoming, DC's Wonder Woman and even the very recent Justice League. Despite each of these releases being available as 4K UHD Blu-Rays for purchase (or becoming available in the near future), it's rather clear that any improvements in image quality would not be because of native spatial resolution since they're upscaled from 2K to 4K.
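To see why upscaling can't be the source of any improvement, here's a quick sketch in Python (using NumPy, with a naive pixel-doubling upscale rather than the fancier interpolation real scalers use) showing that a 2K frame blown up to 4K contains no information that wasn't already there:

```python
import numpy as np

# Stand-in for the luma plane of a 2K (1920x1080) frame.
rng = np.random.default_rng(0)
frame_2k = rng.random((1080, 1920))

# Naive 2x "upscale" to 3840x2160 by duplicating each pixel into a 2x2 block.
frame_4k = np.kron(frame_2k, np.ones((2, 2)))

# Throwing away the duplicated pixels recovers the original exactly -
# the "4K" frame added zero new detail.
recovered = frame_4k[::2, ::2]
print(np.array_equal(frame_2k, recovered))  # True
```

Real upscalers use smarter interpolation and sharpening which can look subjectively nicer, but the same principle holds: true detail beyond the 2K master cannot be created after the fact.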
Notice that on Real or Fake 4K, they include scanned 35mm film as "real" 4K (eg. The Fifth Element, Blade Runner). Like claims of analogue tapes digitized as 24/96 being "high resolution", subjectively, IMO 35mm-sourced video looks inferior. I personally would not put an old 35mm film encoded into 4K high on my list of "must have" UHD Blu-Rays especially if I already have the 1080P version. The likelihood is that one will start seeing the blemishes and limitations of the film source more than meaningful detail.
As suggested above, while analogies are not direct, there are similarities worth keeping in mind between high-fidelity audio and video formats. When "standard resolution" 16/44 CD-quality PCM already provides >90dB dynamic range and >20kHz frequency response, how much noticeable difference do we expect from 24 bits and 88kHz-or-higher sampling? Likewise, depending on the size of one's TV screen and sitting distance, many will likely be quite happy with the 1080P standard Blu-Ray compared to 4K/2160P UHD Blu-Ray. We've been through this before of course in the audio world with tests like our "16-bit vs. 24-bit blind test" a few years back. The point is that human perceptual limitations are rather well understood after decades of research, whether for audio or even video. While there is more to be done in understanding the underlying biological processing of our brains and the psychological mechanisms of perception, we do already understand how technology can saturate our perceptual limits within the constraints of a given medium (ie. flat-screen 2D video or 2-channel stereo sound). Sure, just like speakers can still be further improved for better fidelity, monitors and TVs can likewise still be improved greatly, but the parameters of the digital media format itself - resolution, bit-depth, color gamut - can already contain a huge amount of information beyond what the next generations of display devices can show.
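For those curious about the "screen size and sitting distance" point, here's a rough back-of-the-envelope sketch in Python. It assumes the commonly cited ~1 arcminute figure for normal 20/20 visual acuity (an assumption - individual eyes vary), and estimates the distance beyond which individual pixels can no longer be resolved:

```python
import math

def max_useful_distance_m(diagonal_in, horiz_px, aspect=16/9, acuity_arcmin=1.0):
    """Rough distance (metres) beyond which single pixels of a display can no
    longer be resolved, assuming ~1 arcminute of visual acuity."""
    # Screen width from the diagonal and aspect ratio.
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)
    pixel_in = width_in / horiz_px                 # pixel pitch in inches
    theta = math.radians(acuity_arcmin / 60.0)     # acuity angle in radians
    distance_in = pixel_in / (2 * math.tan(theta / 2))
    return distance_in * 0.0254                    # inches -> metres

for res, px in [("1080P", 1920), ("4K", 3840)]:
    print(f'65" {res}: pixels blend beyond ~{max_useful_distance_m(65, px):.1f} m')
```

By this estimate, a 65" 1080P panel already "saturates" the eye beyond roughly 2.5m, and 4K only pays off inside about half that distance - consistent with the usual viewing-distance charts floating around.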
II. Remember to consider Dynamic Range (color & contrast)

Just like with music albums, it is typically not the audio bit-depth resolution that people will notice when the media is "remastered" (although with some old albums, re-transfers with modern higher quality ADCs will go a long way to restore resolution). Unless there is actual re-mixing of the music, it is typically the change in dynamic range and EQ'ing applied that makes the sound different...
In the audio world, sadly we have seen attenuation of this dynamic range, as demonstrated by a decrease in the DR value of albums over the years. I know, the DR tool is not the be-all and end-all of measuring dynamic range, but it's a reasonable proxy. Thankfully, with video, they are going in the right direction, expanding contrast and color range to achieve "HDR". In the world of video, they adjust the "color grading" of the image to improve contrast and color fidelity. My hope has always been that in time, the audio world will appreciate the importance of higher dynamic range and embrace more realistic fidelity again instead of just making things loud all the time.
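For readers unfamiliar with what the DR value roughly captures, here's a simplified Python sketch of the underlying idea: the ratio between peak level and average (RMS) level. Note this is just a crest-factor approximation, not the actual TT DR Meter algorithm (which uses a more involved windowed measurement):

```python
import numpy as np

def crest_factor_db(samples):
    """Simplified dynamic-range proxy: peak-to-RMS ratio in dB.
    (Not the official DR algorithm - just the basic idea.)"""
    peak = np.max(np.abs(samples))
    rms = np.sqrt(np.mean(samples ** 2))
    return 20 * np.log10(peak / rms)

t = np.linspace(0, 1, 48000, endpoint=False)
sine = np.sin(2 * np.pi * 440 * t)        # pure tone: crest factor ~3 dB
squashed = np.clip(3 * sine, -1, 1)       # "loudness war" style hard clipping
print(f"sine:    {crest_factor_db(sine):.1f} dB")
print(f"clipped: {crest_factor_db(squashed):.1f} dB")  # lower = less dynamic
```

The clipped signal is "louder" on average but its peak-to-RMS ratio collapses - the same effect the DR database has documented across remasters over the years.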
III. Keep an eye out for "post-truth" subjectivism as fidelity satisfies the desires of the vast majority of consumers

As humans, we keep seeking the "next big thing", and as (mostly) guys, we seek the newest and best technologies to expand our opportunities to reproduce reality and allow for maximal artistic expression. That's good, but what if technological gains plateau?
Consider for a moment the question: what happens when engineering hits those limits of human perception; when "high fidelity" enters the realm of being easily "good enough" for the vast majority of consumers near the limits of the current technology? For years we have seen one type of outcome in the audiophile world. While one can appreciate subjective preferences, something happened over the decades when large groups of "audiophiles" started embracing subjective claims as the only measure of truth. Some people start embracing a bizarre form of "faith-based" thought while knowing full well that the products are the result of science and engineering!
In 2016, the term "post-truth" as defined in the Oxford Dictionary was given the distinction as the "word of the year":
- Relating to or denoting circumstances in which objective facts are less influential in shaping public opinion than appeals to emotion and personal belief.
‘in this era of post-truth politics, it's easy to cherry-pick data and come to whatever conclusion you desire’
‘some commentators have observed that we are living in a post-truth age’
Analogue (typically vinyl, reel-to-reel) vs. digital represents the prime example of senseless "format folly" which has raged on in audiophilia for decades. Objectively we know that modern digital "wins" when it comes to ultimate potential fidelity, convenience, robustness, ease of distribution, and flexibility as a whole compared to the types of analogue media out there. Of course nobody is forced to choose digital or analogue. And there is plenty of subjective leeway to claim that one prefers this over that. No need to fight vigorously I think because the proof is not hard to find (though there are other variables to consider as discussed previously); the simple truth being that potential fidelity of analogue media is limited.
What I find interesting is that this "analogue vs. digital" debate is starting to be perpetuated in the world of hi-res video as well. For example, recently the public is being given the idea that shooting actual film makes some kind of huge difference in terms of quality. For the most part, this idea is perpetuated by a couple of auteurs... Quentin Tarantino and Christopher Nolan.
IV. IMAX 70mm film as an example of subjectivist hype?

Much news has been made of Interstellar being projected on 50 IMAX 70mm screens worldwide, Tarantino's Hateful Eight shown on about 100 70mm screens, and recently Dunkirk with its 125 projectors; paltry numbers considering the thousands of theatre screens out there.
Christopher Nolan warned in 2015 of "a very real danger in watering down the theatrical experience" with the demise of film - watch the first few minutes of this:
Yet, when you view the interview in its entirety, it's hard to understand the need for such a dire warning. Sure, grabbing an image on analogue film is more difficult and involves all kinds of skills for the director and production crew, and film can change the video quality due to the type of grain, distortion and noise added to the image (like the limits of vinyl). But beyond that, I fail to see how doing the job digitally can actually "water down" the consumer's experience. Is there any reason that a movie shot digitally cannot be just as good or the experience just as enjoyable for the viewing public or videophile? (Let us of course not forget that even in Nolan's movies, digital effects are obviously needed!)
He seems to be conflating two things here (similar to the vinyl apologists who lump all kinds of variables together). First, there is the experience of the consumer when watching a movie in a theatre (sights and sounds, social elements), and second, there is the question of the putative value of working with film as the artist (the skillful processes involved).
The first point is similar to having a preference for going to a concert hall to listen to music. There is the "physicality" of being in a movie theatre. Comfort of the seats, the smell of buttered popcorn, the chatter of neighbours, the social elements of going out to a movie with friends...
But I would argue that going to a concert hall for music is a superior experience because concerts are live events where artists are actually performing for you! Not so with movies - we're simply watching a reproduction with different audio-visual equipment in a larger space with other people! Other than maybe having a social space ("the communal experience" 23:00 in Nolan's 2015 interview) to invite a bunch of friends to watch together along with seeing the reactions of strangers, I'm not sure what other elements of the "theatrical experience" would be of such great importance (the seats? the buttered pop-corn?).
The contentious argument for me comes with the second point about the joys of working with film when linked to some subjective notion of film somehow being "better". Notice in the 2015 interview at about 1:10, the interviewer asks Nolan in a blind test whether he would be able to pick out the film version. He claims he would be "extremely confident" - yeah, sure... And notice immediately afterwards comes the old explanation like in audio as to how "sophisticated" the human perception system is at 1:30 (in this case, the eye rather than the ear of the audiophile - are there such things as "golden-eye" videophiles?). Perhaps like the vinyl lover who doesn't mind the clicks and pops, Nolan describes film as having "that organic, larger than life quality that great film print has" - presumably including the blemishes he can easily ignore and take some comfort in. He wants his kids to have that experience; how generous of him :-).
(The rest of the interview IMO is excellent as Nolan speaks about his movies like Insomnia, Dark Knight Trilogy, etc...)
Then there's the insistence on using the larger format 70mm film for Dunkirk. There's no reason to really be impressed by some kind of resolution superiority (the Yedlin demo above includes IMAX 70mm/15-perf) compared to say a 4K dual-laser IMAX projection. Remember that even if 70mm film captures higher resolution initially, once the film goes through the various processing steps and then is copied back to film for final projection, resolution is easily lost just like in the old days of generational loss with audio tapes going through the mixing process. In fact, laser IMAX provides not just excellent 4K resolution but also the greater color gamut and improved contrast. If you're wondering, yes, I did check out the "IMAX Experience in 70mm" at Colossus Langley Cinemas locally about a week after Dunkirk opened. I was not exactly overwhelmed with the image quality and the limitations of the film were evident (remember, each time the movie is projected, little scuffs and other film damage can occur just like with each vinyl playback). Sure, the 1.43:1 aspect ratio is great for the large screen, but that would have been the same with laser IMAX at a location that supports the aspect ratio. I suspect the question of film vs. 4K laser IMAX could be easily determined with a side-by-side comparison. While I have not seen a full report on such a test (I've read rumors of such demonstrations in the past with audiences preferring the 4K laser projection), here's an interesting comment from Gizmodo in 2015:
As immersive as seeing Christopher Nolan's Interstellar in IMAX was last year, seeing the film remastered for these new laser projectors is an entirely different experience. You see details that were simply washed out by the old projector technology, and combined with the crispness and brightness of the image it's probably as close as you'll ever get to seeing what an astronaut sees while they're in orbit. IMAX has even gone so far as to upgrade the lighting in its theaters, redirecting bulbs away from the screen to help maximize and coax every last bit of contrast from its new projectors.

That actually sounds much more reasonable to me than harping on anachronistic "2001: A Space Odyssey" era 70mm film with "standard dynamic range" technology at the usual flickery 24fps. Arguing for 70mm film is more about harboring subjective romantic and nostalgic feelings toward the old technology than arguing for true resolution benefits.
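The generational loss mentioned above can be illustrated numerically. Sharpness (MTF) at a given spatial frequency roughly multiplies through each optical copying step, so even a very sharp camera negative loses detail by the time a release print hits the projector. The per-step factors below are made-up illustrative values, not measured data for any real film chain:

```python
# Illustrative only: MTF (contrast retained at some spatial frequency)
# multiplies through each optical copying step of a photochemical chain.
# The 0.xx factors are hypothetical example values, not measurements.
steps = [
    ("camera negative",  0.90),
    ("interpositive",    0.85),
    ("internegative",    0.85),
    ("release print",    0.80),
    ("projector optics", 0.85),
]

mtf = 1.0
for name, factor in steps:
    mtf *= factor
    print(f"after {name:16s}: {mtf:.2f}")
```

Even with each step retaining 80-90%, the chain as a whole passes less than half the original contrast at that frequency in this example; a digital chain, by contrast, copies losslessly between steps.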
Ultimately...

I'm glad that high definition and "ultra" resolution video is making its way into living rooms and theaters. I simply hope that in these technological matters, we discuss issues of "fidelity" and perceptual limitations in a reasonable fashion when comparing the different formats. That we maintain objectivity and rationality, in the hope that the video hobby doesn't degenerate into the bizarre faith-based delusion of even some prominent members of the audiophile world (or sadly the "post-truth" backbone philosophy of some mainstream audiophile media!).
Yes, analogue formats, whether vinyl or 70mm film, can look and sound great. But there's no reason these qualities cannot be digitized to 24/96 or 4K digital while retaining everything necessary, simply because in truth, analogue media does not contain "infinite" resolution. There's no need to amplify beliefs bordering on dogma like those from Neil Young years ago, and we should certainly be careful of subjective beliefs (eg. "70mm film is better!") posing as "truths" ("post-truth creep") in the world of the videophile as technology matures in the years ahead.
Ultimately, my suspicion is that the video world, being more "seeing is believing", will actually remain much more rational and will not get all worked up about stuff like high-priced video cables and nonsensical tweaks in any event.
Well, even if you prefer 70mm film in the theaters, it's good to see that for us home videophiles, 4K Nolan movies are coming in December - almost all of them! Check out the "Christopher Nolan 4K Collection". Nice. With HDR remastering for all these movies, I'm certainly looking forward to viewing classics like The Dark Knight Trilogy and Inception again with better resolution and dynamic range. Unlike audio remastering, it certainly looks like video remastering is going in the right direction :-).
So, what do you guys think? Anyone feel strongly about the limited return of 70mm film projection technology to your local movie-plex? (Personally, I'm hoping for an IMAX 4K laser installation in Vancouver at some point.)
Have a great week ahead and enjoy the video and audio folks!