Saturday 2 December 2017

MUSINGS: Thoughts on fake 4K, post-truth and film "format folly" (70mm film & Christopher Nolan).


Since returning from my Asia trip, I see that new toys for Christmas have now been released, including the new Xbox One X (which I mentioned a few months back).

Although it seems like there may be issues that need to be fixed, one nice feature of the Xbox One S and X machines is the ability to play the new Blu-Ray UHD disks which have been available since March 2016. Though there are a number of disks out there now, I suspect the growth has not been spectacular... Not unexpected, really, since the jump from 2K (1080P) to 4K (2160P) resolution requires the right conditions, as laid out previously, to be truly appreciated. Plus, like music, movies have gone "virtual", with streaming being the primary mechanism of consumption.

For this post, I thought I'd put together a few discussion items about video that are IMO worth thinking about as we are very much in the era of the 4K videophile!

I. Let's talk Spatial Resolution

As a start, if you're reading this on a 4K monitor or TV, for an excellent demonstration of video resolution, have a look at Steve Yedlin's resolution demo to "see" the difference (or lack thereof).

The reason I bring this up is that most movies these days are still finished in 2K - the Digital Intermediate used for final "mastering" is typically still done at 2K (1080P). For fun, have a look at the site Real or Fake 4K to search for your favorite movie and see whether it likely has retained "real" 4K resolution throughout the production chain. You might also notice that most CGI and visual effects elements are still rendered in 2K on current flicks, including high-budget movies from Pixar's Cars 3 to Marvel/Disney's Spider-man: Homecoming, DC's Wonder Woman and even the very recent Justice League. Although each of these releases is available (or soon will be) as a 4K UHD Blu-Ray for purchase, it's rather clear that any improvement in image quality would not come from native spatial resolution, since they're upscaled from 2K to 4K.
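If you're curious to put a number on this, here's a minimal Python sketch (my own illustration using Pillow/NumPy/SciPy; the image filename is just a placeholder) that simulates a "fake 4K" master by bouncing a native-4K frame down to 2K and back up, then measures how much fine detail survives:

```python
# A minimal sketch (not a studio workflow): simulate a "fake 4K" master by
# round-tripping a native-4K frame through 2K, then measure surviving detail.
# "native_4k_frame.png" is a placeholder filename.
import numpy as np
from PIL import Image
from scipy.ndimage import gaussian_filter

frame = Image.open("native_4k_frame.png").convert("L")   # grayscale for simplicity
w, h = frame.size                                        # e.g. 3840 x 2160

fake = frame.resize((w // 2, h // 2), Image.LANCZOS)     # down to the 2K "DI"
fake = fake.resize((w, h), Image.LANCZOS)                # upscaled back to "4K"

a = np.asarray(frame, dtype=np.float64)
b = np.asarray(fake, dtype=np.float64)

# High-frequency detail = what remains after subtracting a low-pass (blurred) copy.
hf_native = a - gaussian_filter(a, sigma=2)
hf_fake = b - gaussian_filter(b, sigma=2)

print("High-frequency energy retained: %.0f%%" % (100 * hf_fake.std() / hf_native.std()))
print("RMS error vs. native 4K: %.2f (8-bit levels)" % np.sqrt(((a - b) ** 2).mean()))
```

However clever the resizing filter, detail above the 2K Nyquist limit is gone for good; the best an upscaler can do is avoid adding ringing or aliasing of its own.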

Notice that on Real or Fake 4K, they count scanned 35mm film as "real" 4K (eg. The Fifth Element, Blade Runner). Like claims that analogue tapes digitized at 24/96 are "high resolution", subjectively, IMO 35mm-sourced video looks inferior. I personally would not put an old 35mm film encoded into 4K high on my list of "must have" UHD Blu-Rays, especially if I already have the 1080P version. The likelihood is that one will start seeing the blemishes and limitations of the film source more than meaningful extra detail.

As suggested above, while analogies are not direct, there are similarities worth keeping in mind between high-fidelity audio and video formats. When "standard resolution" 16/44 CD-quality PCM already provides >90dB dynamic range and >20kHz frequency response, how much noticeable difference should we expect from 24 bits and 88kHz-or-higher sampling? Likewise, depending on the size of one's TV screen and sitting distance, many will likely be quite happy with standard 1080P Blu-Ray compared to 4K/2160P UHD Blu-Ray. We've been through this before of course in the audio world with tests like our "16-bit vs. 24-bit blind test" a few years back.

The point is that human perceptual limitations are rather well understood after decades of research, whether for audio or video. While there is more to be done in understanding the underlying biological processing of our brains and the psychological mechanisms of perception, this doesn't mean we don't already know how technology can, for the most part, saturate our perceptual limits within a given medium (ie. flat-screen 2D video or 2-channel stereo sound). Sure, just as speakers can still be improved for better fidelity, monitors and TVs can likewise still be improved greatly, but the parameters of the digital media format itself - resolution, bit-depth, color gamut - can already carry a huge amount of information, beyond what the next generations of display devices can fully show.
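To put rough numbers on this "perceptual saturation" idea, here's a back-of-envelope Python sketch. It assumes the usual 20/20 benchmark of ~1 arcminute visual acuity, a 16:9 panel, and ideal PCM quantization (real vision and real content are of course more complicated):

```python
import math

def pcm_dynamic_range_db(bits):
    """Theoretical SNR of ideal N-bit PCM quantization: ~6.02*N + 1.76 dB."""
    return 6.02 * bits + 1.76

def max_useful_distance_m(diagonal_in, h_pixels, aspect=(16, 9), acuity_arcmin=1.0):
    """Distance at which one pixel subtends `acuity_arcmin` of visual angle;
    beyond this, a 20/20 viewer can no longer resolve individual pixels."""
    diag_m = diagonal_in * 0.0254
    w, h = aspect
    width_m = diag_m * w / math.hypot(w, h)
    pixel_m = width_m / h_pixels                  # horizontal pixel pitch
    theta = math.radians(acuity_arcmin / 60.0)    # 1 arcminute in radians
    return pixel_m / (2 * math.tan(theta / 2))

print("16-bit PCM: ~%.0f dB; 24-bit PCM: ~%.0f dB"
      % (pcm_dynamic_range_db(16), pcm_dynamic_range_db(24)))
for label, px in [("1080P", 1920), ("4K", 3840)]:
    print('65" %s: pixels blur together beyond ~%.1f m'
          % (label, max_useful_distance_m(65, px)))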

II. Remember to consider Dynamic Range (color & contrast)

Just like with music albums, it is typically not the audio bit-depth resolution that people notice when the media is "remastered" (although with some old albums, re-transfers with modern, higher-quality ADCs will go a long way to restore resolution). Unless there is actual re-mixing of the music, it is typically the change in dynamic range and the EQ applied that makes the sound different...

In the audio world, sadly, we have seen attenuation of this dynamic range, as demonstrated by the decrease in the DR values of albums over the years. I know, the DR tool is not the be-all and end-all of measuring dynamic range, but it's a reasonable proxy. Thankfully, video is going in the right direction, expanding contrast and color range to achieve "HDR". In the world of video, the "color grading" is adjusted to improve contrast and color fidelity. My hope has always been that in time, the audio world will appreciate the importance of higher dynamic range and embrace more realistic fidelity again instead of just making things loud all the time.
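For those who want to play along at home, a crude DR-style number is easy to compute. Note that this is just a simplified peak-to-RMS (crest factor) proxy, NOT the official TT DR Meter algorithm (which uses rules like the RMS of the loudest 20% and the second-highest peak); 'track.flac' is a placeholder:

```python
# Simplified crest-factor proxy for dynamic range, in dB.
import numpy as np
import soundfile as sf  # assumes the pysoundfile package is installed

def crest_factor_db(path):
    data, rate = sf.read(path)
    if data.ndim > 1:
        data = data.mean(axis=1)   # crude mono mixdown for a single figure
    peak = np.abs(data).max()
    rms = np.sqrt(np.mean(data ** 2))
    return 20 * np.log10(peak / rms)

# Heavily limited "loudness war" masters tend to land well under ~10 dB;
# dynamic masters can exceed 15-20 dB.
print("%.1f dB" % crest_factor_db("track.flac"))
```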

III. Keep an eye out for "post-truth" subjectivism as fidelity satisfies the desires of the vast majority of consumers

As humans, we keep seeking the "next big thing", and as (mostly) guys, we seek the newest and best technologies to expand our opportunities to reproduce reality and allow for maximal artistic expression. That's good, but what if technological gains plateau?

Consider for a moment the question: what happens when engineering hits those limits of human perception; when "high fidelity" easily becomes "good enough" for the vast majority of consumers near the limits of the current technology? For years we have seen one type of outcome in the audiophile world. While one can appreciate subjective preferences, something happened over the decades when large groups of "audiophiles" started embracing subjective claims as the only measure of truth. Some people start embracing a bizarre form of "faith-based" thought while knowing full well that the products are the result of science and engineering!

In 2016, the term "post-truth" was given the distinction of "word of the year" by Oxford Dictionaries, with the definition:

post-truth

ADJECTIVE

  • Relating to or denoting circumstances in which objective facts are less influential in shaping public opinion than appeals to emotion and personal belief.
    ‘in this era of post-truth politics, it's easy to cherry-pick data and come to whatever conclusion you desire’
    ‘some commentators have observed that we are living in a post-truth age’
(As referred to in the definition, this goes beyond consumer electronics into all kinds of political domains and views of self and others... But let's just restrict ourselves to tech products here.)

When scientific progress reaches the point where objectivity no longer suggests a need for further advancement, there is indeed little room left but for humans to dig into their imaginations to explore for subjective differences. Companies with little ability to advance actual research and development will also start to promote products that encourage this subjective introspection. In fact, the science of advertising has already shown us a multitude of ways to apply psychological biases to change beliefs and behaviours. In general, the tweaks and high-priced cables of "audiophilia" are fascinating examples of this "post-truth", "post-science" hobby, where significant numbers hang on to unvalidated and at times bizarre claims, allowing themselves to align with individuals and companies promoting products based on "emotion and personal belief".

One interesting belief, nearly at the point of mainstream acceptance, is that vinyl represents "high resolution". Clearly this is wrong in every way, with no empirical basis in objective measures of fidelity.

Analogue (typically vinyl, reel-to-reel) vs. digital represents the prime example of the senseless "format folly" which has raged on in audiophilia for decades. Objectively, we know that modern digital "wins" when it comes to ultimate potential fidelity, convenience, robustness, ease of distribution, and flexibility as a whole compared to the analogue media out there. Of course nobody is forced to choose digital or analogue, and there is plenty of subjective leeway to claim that one prefers this over that. No need to fight vigorously, I think, because the proof is not hard to find (though there are other variables to consider, as discussed previously); the simple truth is that the potential fidelity of analogue media is limited.

What I find interesting is that this "analogue vs. digital" debate is starting to be perpetuated in the world of hi-res video as well. For example, the public is lately being given the idea that shooting on actual film makes some kind of huge difference in quality. For the most part, this idea is promoted by a couple of auteurs... Quentin Tarantino and Christopher Nolan.

IV. IMAX 70mm film as an example of subjectivist hype?

Much news has been made of Interstellar being projected on 50 IMAX 70mm screens worldwide, Tarantino's Hateful Eight shown on about 100 70mm screens, and recently Dunkirk with its 125 projectors; paltry numbers considering the thousands of theatre screens out there.

Christopher Nolan claimed in 2015 that the demise of film brings "a very real danger in watering down the theatrical experience" - watch the first few minutes of this:


Yet, when you view the interview in its entirety, it's hard to understand the need for such a dire warning. Sure, capturing an image on analogue film is more difficult and involves all kinds of skills for the director and production crew, and film changes the character of the image through its particular grain, distortion and noise (like the limits of vinyl). But beyond that, I fail to see how doing the job digitally can actually "water down" the consumer's experience. Is there any reason that a movie shot digitally cannot be just as good, or the experience just as enjoyable, for the viewing public or videophile? (Let us of course not forget that even in Nolan's movies, digital effects are obviously needed!)

He seems to be conflating two things here (similar to the vinyl apologists who lump all kinds of variables together). First, there is the experience of the consumer when watching a movie in a theatre (sights and sounds, social elements); second, there is the question of the putative value of working with film as the artist (the skillful processes involved).

The first point is similar to having a preference for going to a concert hall to listen to music. There is the "physicality" of being in a movie theatre. Comfort of the seats, the smell of buttered popcorn, the chatter of neighbours, the social elements of going out to a movie with friends...

But I would argue that going to a concert hall for music is a superior experience because concerts are live events where artists are actually performing for you! Not so with movies - we're simply watching a reproduction on different audio-visual equipment in a larger space with other people! Other than maybe having a social space ("the communal experience", 23:00 in Nolan's 2015 interview) to invite a bunch of friends to watch together, along with seeing the reactions of strangers, I'm not sure what other elements of the "theatrical experience" would be of such great importance (the seats? the buttered popcorn?).

The contentious argument for me comes with the second point, about the joys of working with film, when linked to some subjective notion of film somehow being "better". Notice in the 2015 interview at about 1:10, the interviewer asks Nolan whether, in a blind test, he would be able to pick out the film version. He claims he would be "extremely confident" - yeah, sure... And notice that immediately afterwards (at 1:30) comes the old explanation, just like in audio, of how "sophisticated" the human perceptual system is (in this case the eye rather than the ear of the audiophile - are there such things as "golden-eye" videophiles?). Perhaps like the vinyl lover who doesn't mind the clicks and pops, Nolan describes film as having "that organic, larger than life quality that great film print has" - presumably including the blemishes he can easily ignore and take some comfort in. He wants his kids to have that experience; how generous of him :-).

(The rest of the interview IMO is excellent as Nolan speaks about his movies like Insomnia, Dark Knight Trilogy, etc...)

Then there's the insistence on using the larger-format 70mm film for Dunkirk. There's no reason to really be impressed by some kind of resolution superiority (the Yedlin demo above includes IMAX 70mm/15-perf) compared to, say, a 4K dual-laser IMAX projection. Remember that even if 70mm film captures higher resolution initially, once the film goes through the various processing steps and is then copied back to film for final projection, resolution is easily lost - just like in the old days of generational loss with audio tapes going through the mixing process (see the toy model sketch below). In fact, laser IMAX provides not just excellent 4K resolution but also greater color gamut and improved contrast.

If you're wondering, yes, I did check out the "IMAX Experience in 70mm" at Colossus Langley Cinemas locally about a week after Dunkirk opened. I was not exactly overwhelmed with the image quality, and the limitations of the film were evident (remember, each time the movie is projected, little scuffs and other film damage can occur, just as with each vinyl playback). Sure, the 1.43:1 aspect ratio is great for the large screen, but that would have been the same with laser IMAX at a location that supports the aspect ratio. I suspect the question of film vs. 4K laser IMAX could be easily settled with a side-by-side comparison. While I have not seen a full report on such a test (I've read rumors of such demonstrations in the past, with audiences preferring the 4K laser projection), here's an interesting comment from Gizmodo in 2015:
As immersive as seeing Christopher Nolan's Interstellar in IMAX was last year, seeing the film remastered for these new laser projectors is an entirely different experience. You see details that were simply washed out by the old projector technology, and combined with the crispness and brightness of the image it's probably as close as you'll ever get to seeing what an astronaut sees while they're in orbit. IMAX has even gone so far as to upgrade the lighting in its theaters, redirecting bulbs away from the screen to help maximize and coax every last bit of contrast from its new projectors.
That actually sounds much more reasonable to me than harping on anachronistic "2001: A Space Odyssey"-era 70mm film with "standard dynamic range" technology at the usual flickery 24fps. Arguing for 70mm film is more about harboring subjective, romantic, nostalgic feelings toward the old technology than about true resolution benefits.
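To make the generational-loss point above concrete, here's a toy Python model (my own illustrative numbers, not measured film data): treat each photochemical generation as a Gaussian low-pass filter; multiplying the stage MTFs means the system MTF50 shrinks as 1/sqrt(n):

```python
import math

# Toy model: each analogue generation acts like a Gaussian low-pass with
# MTF(f) = exp(-(f/f0)^2). Cascading n generations multiplies the MTFs,
# so the system MTF50 (frequency where response falls to 50%) shrinks as 1/sqrt(n).
def mtf50_after_generations(f0_lp_mm, n):
    # single stage: exp(-(f/f0)^2) = 0.5  ->  f = f0 * sqrt(ln 2); n stages divide by sqrt(n)
    return f0_lp_mm * math.sqrt(math.log(2) / n)

f0 = 80.0  # hypothetical single-stage scale parameter, line pairs/mm
for n, stage in enumerate(["camera negative", "interpositive",
                           "internegative", "release print"], start=1):
    print("after %-15s MTF50 ~ %.0f lp/mm" % (stage + ":", mtf50_after_generations(f0, n)))
```

In this toy model, four generations roughly halve the resolvable detail of the first; whatever the 70mm negative captured, that is not what reaches the release print.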

Ultimately...

I'm glad that high definition and "ultra" resolution video is making its way into living rooms and theaters. I simply hope that in these technological matters, we discuss issues of "fidelity" and perceptual limitations in a reasonable fashion when comparing the different formats; that we maintain objectivity and rationality, in the hope that the video hobby doesn't degenerate into the bizarre faith-based delusion of some (even prominent) members of the audiophile world (or, sadly, the "post-truth" backbone philosophy of some mainstream audiophile media!).

Yes, analogue formats, whether vinyl or 70mm film, can look and sound great. But there's no reason these qualities cannot be captured in 24/96 or 4K digital while retaining everything necessary, simply because, in truth, analogue media does not contain "infinite" resolution. There is no need to amplify beliefs bordering on dogma, like those from Neil Young years ago, and we should certainly be careful of subjective beliefs (eg. "70mm film is better!") posing as "truths" ("post-truth creep") in the world of the videophile as technology matures in the years ahead.

Ultimately, my suspicion is that the video world, being more "seeing is believing", will remain much more rational and will not get all worked up about things like high-priced video cables and nonsensical tweaks in any event.

---------------------------------------

Well, even if you prefer 70mm film in the theaters, it's good to see that for us home videophiles, 4K Nolan movies are coming in December - almost all of them! Check out the "Christopher Nolan 4K Collection". Nice. With HDR remastering for all these movies, I'm certainly looking forward to viewing classics like The Dark Knight Trilogy and Inception again with better resolution and dynamic range. Unlike audio remastering, video remastering certainly does look like it's going in the right direction :-).

So, what do you guys think? Anyone feel strongly about the limited return of 70mm film projection technology to your local movie-plex? (Personally, I'm hoping for an IMAX 4K laser installation in Vancouver at some point.)

Have a great week ahead and enjoy the video and audio folks!

32 comments:

  1. Film resolution is a different business than audio... the film look and texture are very different from a digital sensor.
    The way film stock reacts is very different from a digital sensor.
    There is always improvement happening on the digital side, but film has handled wide contrast ratios for a long time, and big productions could even ask for special film stock (remember the bleach look of Minority Report). On the other hand, you must work digitally now if you want to handle the post-production workflow flawlessly.
    I can understand why some cinematographers prefer shooting in 70mm, but you must understand that shooting film is way more costly than digital.

    1. Absolutely Blogue,
      I totally agree that film "colors" the image in a way different from what digital does. Of course by "colors" I mean it like how, in audio, vinyl has a "coloration" imparted on the sound. That coloration adds to the subjective experience we may or may not like depending on life experience and elements like how it may evoke an emotional reaction.

      My sense is that using film purposely is great and in many ways is like a "special effect". Artists and cinematographers are of course free to use it for what they desire to accomplish. But I don't think we need to attach objective parameters to it, like how amazingly "high fidelity" it is, or try to portray it as somehow technically "better" compared to a good digital capture.

      I'm also totally for upgrading the local cinema to play back the movie's video and audio in very high quality. I'm just not impressed with the idea that retrofitting theaters with refurbished 70mm projectors and training staff to manage & fix problems with this unwieldy format is the way to go :-).

      Yup, 70mm must be remarkably costly these days! Again, I'm totally for choice and the customer being able to seek out 70mm film projection if they want (just not sure how practical it is or how many people really will pay for the "genuine experience"). But again, I'm not sure this needs to be "over-hyped" with claims about some vague but special "theatrical experience".

      My suspicion is that we're not far from a digital projection system which can be "dumbed down" to project an image that achieves the same effect as 70mm :-).

    2. I agree with shooting on film... but film projection is a pain in the a%$ss! There are so many occasions to screw up the projection. Now the projectionist is the same person selling the tickets, cleaning the toilets and checking security. JPEG2000 individual files, a file server, a 2K or 4K projector and the appropriate audio chain... and that's it... no more jitter, splices, dust, misaligned optical readers, badly assigned reels... just cinema...

  2. Many years ago in the days of analogue TV, I heard from a colleague who worked in broadcast engineering that the BBC used a process called TARIF (Technical Apparatus for the Rectification of Indifferent Film) along with its telecine machines. This enabled a vision operator working with a joystick to make corrections to a film's lift, colour balance and gamma, either for future broadcast (in which case the operator's corrections were recorded while the film was being previewed) or live on air. Even back in those days, film was regarded as a difficult medium for broadcasters to handle.

    1. Yeah...

      And it really is just a good reminder of how far we have come technologically! Advancement should bring with it ease, flexibility, and the power to simplify basic tasks so those who use the tech can focus on more creative work... With the power of digital these days, we've seen great creativity in VFX and CGI, for example.

      Having said this, I suspect many would like to see more creativity out of Hollywood. Maybe it's time we moved on from all these sequels, superhero movies, remakes, and translations of graphic novels :-). I trust there are still stories "new under the sun."

    2. This comment has been removed by the author.

    3. I used to have a job, back in the early days of home video tapes, doing exactly that. It mostly involved adjusting brightness and contrast to emphasize what was important in a shot done live (I never had time, nor was I provided with, a cue sheet) during the transfer to a 1" analog video master. Levels were judged by looking at a scope of the video signal as well as a calibrated broadcast monitor (Conrac was the brand used by the company I worked for). Making a film transfer look good for analog video is quite difficult. Thankfully, digital has fewer limitations than analog video.

      That said, there is one thing about film projection vs. digital projection where film rules. Blacks. It is not a problem with digital recording technology, it is a problem with projection technology. Now, for your local cinema, digital technology just kills film. I stopped going to local movie theaters years ago because of incompetent projectionists who seemed to relish putting razor blades in the film gate, couldn't focus the projector, never changed the bulbs and a multitude of other sins which made going to the movies suck compared to watching them on a well calibrated home screen. But blacks. I grew up in LA and loved seeing films in 70mm at the first run theaters in Westwood and Hollywood. 70mm allows you to throw a ridiculously large image up on the screen and it looks glorious when the original was filmed in 70mm (the Hateful Eight) or in IMAX (Dunkirk). I went to my local digital cinema this summer to watch Dunkirk (BTW, a great film, it is not often that you see a movie that could have been a silent these days). And it looked great, except for one thing. The blacks. Black was dark, dark gray. Blacks were squashed, never reaching black and losing detail in shadows. I doubt it would have looked that way using traditional projection methods. I assume that projection technology will get better as time goes on, but it is still not up to par with film, when film is done right. Unfortunately for most people film presentations are done poorly.

  3. As a side note on pushing video formats further than the physical limits, I'd like to comment on the other "loudness war" - not the dynamic range one, but the sheer brute volume of sound in some movie theaters that push 100dB+ soundtracks.

    Just went to see Blade Runner 2049, and although I’m a fan of Denis Villeneuve’s cinematography, I could only stand the soundtrack by stuffing foam ear plugs in my ears. I remembered my past experience with Nolan’s Interstellar where I just left the theater after a couple of minutes of his noise, so I brought some.

    Villeneuve's previous film, Arrival, was nowhere near as loud, so I guess, contrary to Nolan's declared "realism" wish, that this soundtrack volume was imposed on him by the production company wanting to cash in on the action film crowd.

    With ear plugs filtering out some 20-30dB (not linearly of course; low bass gets through but screechy mids and highs are tempered), I could tolerate it. I wonder what people find realistic in a normal door that resonates with a low 20Hz thud when closed, or handgun shots that sound like cannons.

    The strange thing is that back home I listened to the soundtrack that was posted on YouTube, and only then could I find similarities to the Vangelis soundtrack of the first film! In the theater, the louder-than-natural, aggressive sound distorted the perception for me, triggering a danger reaction rather than a musical one.

    Maybe others are more tolerant of loud soundtracks in theaters, or simply got accustomed to it, but I think I just went to my last picture show !

    1. It's a new trend to extend the low frequencies in the mix.
      It started with Nolan... the lows were so high that the screen was moving at least two inches... putting the picture out of focus with the vibration.

    2. Oi, sad to hear that Gilles! Sounds tinnitus-inducing :-(.

      I quite enjoyed Arrival and I thought the sound design was quite good. When I saw the trailer for Blade Runner 2049, it did occur to me that the soundscape reminded me of something Nolan/Zimmer would do (accentuated bass, foghorns, perhaps a Shepard tone).

      While I'm picking on Nolan's "push" for 70mm here, we could just as well pick on his soundtrack decisions. I agree, way too loud to the point of incomprehensible dialogue in Interstellar. Dunkirk was the same.

      If moviemakers keep doing this, I wouldn't be surprised if it drives people - those sensitive to noise and older customers - from the theaters. Audiophiles also should not be subjected to potential hearing loss :-).

  4. Interesting to draw some parallels between video and audio format wars (and arms race). There are some similarities, but also some differences:

    - Based on my experience with a 5K computer display (Dell UP2715K), ultra-high resolution (4K or above) can make a visible difference, even ignoring gamut and bit-depth differences - looking at good quality pictures shows that one doesn't get the full benefit of a high-end camera when viewing its pictures (especially RAW images) on a Full HD display (after calibration, of course), even at 'normal viewing distances'.

    - Video formats are compressed in a lossy way, especially if distributed through streaming. While 4K or UHD may sound attractive, I'm wondering if bandwidth (bits per second) would not be a better indicator of quality.

    - With video, there are some upscaling technologies which do make a visible difference (I'm thinking of, for instance, the algorithms used by software such as MadVR, which reduce significantly the gap between DVDs and Full HD images). So far, commercial upscaling to 4K of 2K sources doesn't seem to be taking advantage of these solutions.

    On a personal note, I'm tempted by 4K video with a good projector (to replace my Full HD JVC D-ILA), but it seems that there is not enough native content to justify this move (upscaling Full HD with MadVR is great, but I'd like to see more native 4K media, especially at high bit rates... and beyond Hollywood blockbusters ;-) ). I've been quite involved with the move from analog TV to Full HD (from the late 1990s to the last decade), and hope the transition to 4K will not take as long!

    1. Hi Eiffel,
      Great points.

      Yes, I agree. I'm happy to have 4K monitors for both work and home use these days. Given that we sit quite close to computer monitors, the extra resolution and the smoothness of lines, text and graphics is certainly noticeable. That's a somewhat different context compared to sitting in a home theater, typically >6 feet away. Of course it depends on the size of the projection as well...

      Yup, there is a lossy element to video. Notice that nobody in their right mind demands lossless video frames be delivered to the consumer :-). Good point about "measuring" the quality of video using "bits per second" (more typically Mbps). Certainly when we start streaming 4K, we had better see good bitrates!

      For context, folks, remember that UHD Blu-ray video can peak above 100Mbps (the format maxes out around 128Mbps), with typical averages in the 50-80Mbps range... Netflix 4K streams at only around 16Mbps.
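      If you want a quick way to compare numbers like these across formats, bits-per-pixel is a handy normalization. A back-of-envelope Python sketch (using the approximate figures above, not official spec values):

      ```python
      def bits_per_pixel(mbps, width, height, fps):
          """Average compressed bits spent per pixel per frame."""
          return (mbps * 1e6) / (width * height * fps)

      for name, mbps in [("UHD Blu-ray (~100 Mbps)", 100),
                         ("4K streaming (~16 Mbps)", 16)]:
          print("%s: %.2f bits/pixel" % (name, bits_per_pixel(mbps, 3840, 2160, 24)))
      ```

      That works out to roughly 0.50 bits/pixel for the disc vs. 0.08 for the stream - the disc has about 6x more bits to spend on every frame, even with similar codecs.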

      The upscaling algorithm is certainly important. I have no idea what quality is being used by producers going from 2K --> 4K. One would certainly hope high-quality methods are being used if even the home user has access to techniques like MadVR.

      Let us know if you jump to 4K at home :-). Yeah, more native 4K material will come.

  5. I touched on the film thing once:
    https://therationalaudiophile.wordpress.com/2015/10/28/bring-back-the-projectionist/

    My main point was that it has a negative effect on people's perception of everything else they experience. Once you implant the idea that there is something special about analogue and that digital is therefore inferior, both creators and 'consumers' spoil their own experiences of digital from then on.

    Maybe it is inevitable. Provide people with a 'perfect' technology and they will almost masochistically crave something to struggle against.

    1. Great article at Rational Audiophile in 2015!

      I think you've touched on a very important element of the human psychological experience, one that I also noted above. The fact is that no matter how great something is, we can get bored easily... And this leads us to feel subjectively dissatisfied. As a social group, we then end up with these "revivals" and "pendulum swings". As far as I am aware, despite these swings, technological progress still goes ahead; it's not like most people will go back to vinyl, demand the return of laserdisc, or drag out their CRT TVs again :-)...

      Speaking of human psychology, I heard that some cinemas were giving away commemorative T-shirts when one bought tickets for the 70mm IMAX showing. Certainly one way to create interest and sell more expensive seats.

      I still have some very old 4-minute 8mm reels of family films from back in the very old days. I'm certainly glad to have digitized those and not have to bring them out again.

      As for still images, yup, glad to be done with wasteful and environmentally unfriendly plastic film. A few years back, I finally scanned my medium format wedding negatives. It was a pain in the butt, and given the imperfections, specks of dust, etc., I still went into Photoshop to fix color, sharpness and anomalies.

  6. There might be a fairly obvious amount of difference shown by the 4K Blade Runner?

    http://forum.blu-ray.com/showpost.php?p=14059684&postcount=1740

    as an example (close-up comparison of Tyrell's pyramid)

    1. and the grain... so organic, natural and beautiful...

      Film grain has nothing to do with natural and organic...

    2. Hi Stalepie,
      Thanks for that link!

      Great little snip of Blade Runner in 1080P and UHD. I watched a bit of the "Final Cut" UHD the other night and it does look good. Apart from pixel-peeping though, the most noticeable difference is actually from the HDR10, IMO. They've expanded the dynamic range, improved contrast, and made those building lights "pop". Unfortunately, there's no fixing the occasional focus issues, and the film grain can be quite prominent in some scenes. It is a "classic" in any event, and whatever improvements can be made are appreciated...

      I've always enjoyed the soundtrack and the Dolby Atmos remix/master sounds great BTW.

  7. Like audio, isn't it the case that most people have their own threshold or sensitivity to 'quality'? I have a 65" Samsung 4K display and the Oppo 203 player, and have bought a few 4K discs, but truthfully, for me the quality of HD Blu-Ray is more than adequate to allow me to enjoy whatever I watch.
    It is very clear to me that 4K is 'better', but I simply don't feel inclined to 'pixel peep' my way through a movie when I'd rather enjoy its artistic merits at a very acceptable level of resolution. (BTW, I bought the Oppo as a music disc player first and foremost.)
    Others may feel dissatisfied with HD, believing there is something better to be had, but of course the truth is that, as in music reproduction, those better qualities are severely limited by the upstream production.

    I'm absolutely fine with Blu-Ray. The biggest dissatisfaction I have is that it's hard to find good movies which are not overladen with computer-generated effects and massive booms and explosions - loudness wars of another kind!

  8. OMG, I can't even... Dude, just stay in your lane. At some point later in life you're going to look back at these posts and realize just how much YOU sound like a proselytizing, dogmatic, and worst of all, subjective, person. You want to play around in the audio arena where basically .0001% want to argue and .00001% actual studies/research exist to back any of your stuff up, fine. But now you're an expert on film?

    1. A really fascinating comment! It seems to me that you are mistaking Archimago's piece as a claim to "expertise" when it is, in fact, something different: playing around with ideas.

      "Expertise" is the result of vocational training, qualifications, on-the-job experience. Taxi drivers are experts on driving. Heating engineers are experts on fitting boilers. Experts are people who have become experienced in doing a job. They don't necessarily have particularly interesting insights or ideas.

      But you don't need to be an "expert" to understand how something works; in fact it can get in the way of true understanding. Audio is about signals: wiggly lines, how to store them, how to reproduce them. It involves some mathematics, some physics, some electronics. It is also about culture, human senses, perception, psychology. As such, the only way to understand it is to be a bit of a generalist.

      Film is also about signals, storing them, reproducing them, etc. There are many parallels with audio. As such, a person with sufficiently general knowledge, and the in-built curiosity that goes with an interest in ideas (not just doing a job) can make perfectly valid, meaningful, insightful observations on both.

    2. I concur, Thanks Archimago for some much needed subjective commentary. Doctorrazz

    3. Hey there Wushuliu,

      No, I certainly would not claim to be an "expert" in film. I am first and foremost a "consumer" whether of audio or video products; seeing things through that lens of value as a potential customer. Of course this is "subjective" although I try to make a few points which I hope are "reality based" and translate to the world around us and the people who partake of these things. It's a blog post and I am certainly happy to look back in a few years and recognize that maybe my ideas have changed... Or I might not!

      Like I said in the article, I just wanted to throw out some ideas which may or may not be different from what's published out there. I suspect some of these points will resonate with some and not others as we enter the "Ultra High Definition" world of movies and consumer video. No different from any of my other "MUSINGS" posts I think.

  9. Thanks for the great commentary/clarity/analogy, video-audio, analogue-to-digital. As an old-school photographer with 35mm and 2 1/4" x 2 3/4" cameras back in the day, I had to find an East European slide projector to enlarge the images from the 2 1/4 to show the difference in resolution, and then the defects and blemishes were painful. Through the mid-80's evolution into digital formats, audio and visual, I can remember the talking heads saying that digital had a long way to go to catch up to film?? Ya, 2-3 years and it was over. I have a 3-year-old 70" LCD 240Hz 1080 TV and am not making an upgrade anytime soon, until there is more content and, as with sound, until I, a 62-year-old, can hear or see the difference, night and day. I loved the visual links. Thanks again for the clarity.

  10. Thanks for the comment!

    Great to hear from "old school" practitioners like yourself. I too remember those comments about how it would take a "long time" for things like resolution, contrast, color quality, etc. to overtake the supremacy of film cameras. I certainly found it tough to believe those claims even back then, given how fast the products were developing!

    Though I never got into intricacies like developing my own negatives (I remember growing up and hanging out with my dad in his darkroom), I did have my Nikon 35mm body and a collection of lenses. As a consumer, it was quite clear that the end was near by the early 2000s with the release of even those early DSLRs (I got my Nikon D70 6MPix in 2004, just in time for the arrival of my first child).

    I still have a few rolls of unopened film in the cool storage space :-). I suspect they must be 15 years past expiration by now! Anyone know what happens to "expired" film after all these years?

    Enjoy the 70" :-).

  11. Those unexposed films can still be developed, as long as they were kept in cool, dry storage. I just developed some old ones, a lot older than that: Kodachrome 25 and Ektachrome 64. They came out fine. Cheers.

  12. Not strictly on topic, but I've been thinking about a different form of audio format "folly" of late.

    I totally get how, with the latest DACs, jitter is a problem of the past. You've measured and proven this time and time again - Bravo!

    My DAC has 4 digital inputs: AES, Optical, Coax, and USB.
    I run my Raspberry Pi3 Volumio player into the USB input.
    My LG UHD Dolby Vision TV's optical out goes straight into my DAC.
    My AppleTV4 goes into my LG TV, with audio going via the LG TV's optical out to the DAC.

    When I play any content from the Apple TV to the LG TV, I get about a 1/4-second audio dropout (silence) every 3-5 seconds or so. However, when I stream the audio from the AppleTV4 via AirPlay to the Volumio server connected via USB to my DAC, there are NO dropouts and it seems fine. There may be a small amount of audio lag doing it this way, but to be honest I'd prefer that over the regular audio dropouts.

    Anyway, I'm now considering getting an HDMI audio splitter to feed the audio directly from the AppleTV to my DAC. I ordered one before checking whether it had a coax RCA SPDIF out; it doesn't. It's optical only.

    So, should I get an SPDIF Optical to Coax converter as I'm out of Optical inputs on my DAC?
    Or do I get a different HDMI splitter, one with a Coax / RCA SPDIF digital output?

    So back to the whole "Audio Format Folly" thought...

    What impact does taking audio from a less-than-ideal source have on audio quality, jitter, etc.?
    If we feed audio from a Mac/PC/RPi etc into a modern DAC - wonderful repeatable results!

    But what about AppleTV's, TV Optical outs, TV HDMI ARC outs, game console (optical) outs, set-top-box digital outs?

    Is there anything we can do to minimise the damage of consumer convenience where most 'home' stuff offers Optical SPDIF out and we run out of Optical inputs on our nice audiophile DACs?

    1. Hey Unknown... Wow. That's some digital audio hell :-).

      Yeah, a little latency in audio is much better than those dropouts you're describing going from the AppleTV to the LG TV (HDMI) and then to the DAC (optical SPDIF). Ouch.

      Personally, I'd see if I could find an inexpensive HDMI --> coaxial audio splitter. I know there's one here on the North American Amazon that's advertised for the Apple TV (1080P model) and it's really cheap. As usual, I'd make sure there's a good return policy just in case. Adding an optical --> coaxial converter on top of the splitter will likely just add to your jitter, and possible frustration if it doesn't work. Remember, HDMI isn't known for great jitter suppression to begin with.

      Alas, I'm not sure there's much to be done once we run out of inputs. My solution is to use a receiver for all the home theater / video stuff, while the stereo audiophile DAC only plays music...

  13. Back when Interstellar came out, I came across a really nice study by Dr. Hans Kiening from ARRI R&D about the resolution of film cameras and the optimal digital conversion ( Link ); the article of course shows what resolution is reached in the best-case scenario.
    The analogy with HD audio and larger video formats is not so simple; looking at the numbers it's clear that 70mm IMAX has the advantage over 4K, but like someone else said, other things come into play for the final quality.
    Moving to a home environment, even if more resolution is always better, the size of the TV and the watching distance will make it less beneficial and, like you said, the greater dynamic range of the 4K UHD disc more noticeable than the little details.

    1. Thanks Davide for the comment and link!

      Great stuff :-). Yeah, so many factors are involved, and no matter what, it's never as simple as assuming that something being 35mm or 70mm at the start automatically implies some level of quality.

      Figure 22 in that article is a fascinating look at generational losses with the pure analogue process.

      Also really like Figure 29 with a 25m wide screen showing seating for benefit of 4K and 8K material.

    2. Sorry to enter late; I usually read this blog regularly but somehow missed this paper from ARRI. It does look very film-biased (especially when it uses judgmental words like "coarse details" instead of low-frequency details, as accepted in the literature). It reminds me of the film-versus-digital discussions eons ago in photography.

      Interestingly, all discussions pretty much ceased with the appearance of 8MP cameras (24x16mm sensors, Bayer arrangement), as it didn't make sense anymore to claim that one could get a better picture with 35mm film (36x24mm - double the area of movie 35mm, btw). And the 11MP sensor of the Canon 1Ds was already being compared to medium format (i.e., well above 36x24mm) as it was deemed to be well above regular 35mm.

      Now, it was acknowledged that 35mm film had details beyond 8MP, but because they were so weak, the overall visual impact [the MTF integral if you will] of the digital 8MP cameras was greater than 35mm film.

      That would put movie 35mm (24x16mm frame, half-size) at most at the 4MP Bayer sensor level [or perhaps at 2.5MP RGB (3-color sensor), as in the Foveon sensors of Sigma]. And don't forget almost all movies today are finished in a 2K DI (2MP at most), and we are very fine with it indeed.

      Those discussions were raging 12-15 years ago in photography, but we are still seeing them nowadays regarding movies :)

  14. While the paper does seem very 'film-biased', it's still a great window into how the industry works in spite of users' concerns and/or tastes. The issue of where to use 4K comes to mind, but I guess at the end of the day it's all down to personal taste and budget. Here's a good page to test out your 4K monitor with: http://www.playstation4magazine.com/destiny-2-wallpaper/

  15. Hello, really enjoying your blog and the detailed appraisals in it. With that spirit in mind, I must say regarding this post: this does not read as the take of a cinephile. To reduce Nolan's use of film to preserving spots and imperfections for his children is a gross misdirection of what film lovers see. Foremost is grain, and yes, you can replicate it digitally, but that is always ersatz. Secondly, as an analogue medium, the resolution can be incredibly high, and resolved at a fine level not as uniform pixels but as an almost organic pattern. The combination of these two things means that when you see an actual *film* in a cinema, and you're up close, there is a tactile quality to the image. It makes for the feeling of 'cinema' in that you want to touch and hold the thing on screen. Yes, digital has its own different qualities, and you don't get the super-high-res sort of look with film. But they are definitely *different*. It's not about one being superior - that is a straw man. It's about whether film is *worth preserving* as a medium. And by God, yes it is!
