Tuesday, 20 September 2016

MUSINGS: 4K UHD TV, HFR, 3D, HDR, etc... Thoughts on video technology and the consumer.


As you saw a couple of weeks back, when in China I visited a number of electronics stores, especially the "big box" places in Beijing and Guangzhou. Many of these stores had a whole floor of 4K TVs on display from all kinds of brands (many local Chinese brands like TCL, Hisense, Skyworth, and Changhong as well as the usual Samsung, LG, Sharp, Sony, etc...) and one could easily walk around and compare image quality. Of course, this is ultimately a limited comparison because the TVs are all displaying the manufacturer default settings, which usually means extremely bright, color-saturated, extreme sharpness, and maxed-out contrast settings playing typically pristine demo videos so as to catch the consumer's gaze.

Even though this is primarily an audio blog, I see no need to confine ourselves to just one sensory modality... I've been wanting to upgrade to a 4K TV for a while, certainly since my 4K computer monitor upgrade early last year. For today, I thought it would be good to think about the world of TVs and where home visual technologies are leading us. Let's talk about some of the technology around visual quality and the parameters we should be keeping in mind, and perhaps even consider the future and what's down the pipeline. Perhaps most importantly, I like to think about the big picture as a consumer and even speculate a little about what might happen down the road.

By the nature of our innate sensors (i.e. eyes) and how we perceive (i.e. vision), a TV is obviously a very different experience from the audio I've primarily been posting about for years. But just like in audio, there are standard objective parameters we can use to explore the quality and engineering "accuracy" of the device. For a flat-screen display, it basically goes something like this:

1. Spatial resolution:

Although perceived detail is complex and dependent on visual patterns and colors, let's simplify this for our discussion as primarily determined by the angular resolution of the 2D flat screen, accounting for viewing distance. How much pixel density is needed then depends on the total size of the screen and the distance we're viewing it from. The human eye does have a limit, which I already discussed last year. The website Is This Retina? is a wonderful resource for making calculations and estimates of what's "needed" in one's home.

We are currently in the era of 4K (3840 x 2160, UHD/"Ultra HD") TVs, a time when a typical person (without "eagle eyes", perhaps the visual analogue of "golden ears"!) can no longer make out pixels on a 65" TV unless sitting <4.5' away! I think this says something about the state of our technology... We have achieved a level of quality in this dimension of the visual experience such that it would be very hard to argue that more is needed in a typical home environment. I have already used the analogy of 4K being like "CD resolution" in the vast majority of situations, beyond which there are diminishing returns or no perceivable improvements at all.
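If you want to sanity-check that claim yourself, here's a minimal sketch of the arithmetic, assuming a 16:9 panel and roughly 1 arcminute of visual acuity (the usual 20/20 figure); the function name is mine and the numbers are illustrative only:

```python
import math

def pixel_visibility_distance(diagonal_in, h_pixels=3840, v_pixels=2160,
                              acuity_arcmin=1.0):
    """Distance (feet) beyond which one pixel subtends less than the
    assumed visual acuity (default ~1 arcminute, i.e. 20/20 vision)."""
    aspect = h_pixels / v_pixels                         # 16:9 for UHD
    width_in = diagonal_in * aspect / math.hypot(aspect, 1.0)
    pixel_pitch_in = width_in / h_pixels                 # size of one pixel
    dist_in = pixel_pitch_in / math.tan(math.radians(acuity_arcmin / 60.0))
    return dist_in / 12.0

print(f'65" UHD (4K): {pixel_visibility_distance(65):.1f} ft')             # ~4.2 ft
print(f'65" 1080p:    {pixel_visibility_distance(65, 1920, 1080):.1f} ft')  # ~8.5 ft
```

In other words, a 65" 4K panel "goes retina" at roughly 4 feet while the same size 1080p panel needs more than 8 feet, which is the essence of the claim above.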


2. Temporal resolution:

This refers primarily to the framerate of the video. Basically, does motion look smooth across time, or does the framerate dip below the visual threshold such that the mind is no longer fooled and detects stuttering (various forms of temporal discontinuity)? Modulated light typically appears stable above 50Hz (the "flicker fusion threshold") and some suggest this could be as high as 90Hz. Therefore, if we want to ensure a framerate that's smooth, maybe we should ultimately be aiming for 90-100 frames per second... Interestingly, even Thomas Edison back in the day recommended 46fps! Gamers these days love very high framerates like 120Hz (consider technology like NVIDIA's G-SYNC) - great to get the edge, I'm sure, especially in competitive shooters when combined with low control latency.

The fact is though that movies have been shot at around 24Hz since the beginning of the era (since at least the 1920's). NTSC video has been at 30Hz since the early 1940's, and the later PAL standard at 25Hz arrived in the 1960's with the advent of color video. (There is of course SECAM, along with other differences among these video standards like resolution, interlacing, and color transmission, which we'll ignore for this discussion.)

This is interesting, isn't it? Clearly, 24fps is not adequate as a "high fidelity" standard if we are to aim for smooth video approaching the physiological threshold. But attempts at speeding up this framerate have resulted in complaints of the "soap opera effect": psychologically, if a movie is this smooth, it gets associated with looking like a "home video" (camcorder) or a "cheap" soap opera on TV rather than a "cinematic" experience. Most recently in 2012, Peter Jackson attempted the jump to 48fps "High Frame Rate" (HFR) with The Hobbit: An Unexpected Journey... then in 2013 with The Desolation of Smaug, and of course you have to finish the trilogy with The Battle of the Five Armies in 2014. Apparently it didn't work out so well. Supposedly James Cameron's upcoming Avatar sequels will use high framerates as well.

The truth is, since the days of high definition (720P, 1080P) and now into 4K resolution, 60Hz is no big deal and already here for the taking. All current-model 4K TVs are capable of HDMI 2.0 and 4K/60Hz so long as the source device can do it. Already we can stream off YouTube, for example, and experience smooth 4K/60 videos (like this one). It's the content that's dictating low framerates like 24fps rather than the hardware.

Remember that most TVs these days provide the ability to interpolate frames, hence smoothing out motion using digital processing. I don't mind some of this processing for reducing the "judder" when a TV is faced with a challenging framerate like 24Hz displayed at 60Hz. Done too smoothly, though, and the dreaded "soap opera effect" can be a problem. Furthermore, there are many instances where we can make out distortions and especially motion artifacts. Probably fair to say that in most situations, it's best to just turn this feature off.
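For those wondering where that 24-at-60Hz judder comes from, here's a toy sketch of the classic 3:2 pulldown cadence (purely illustrative; real TVs do this in hardware, and motion interpolation synthesizes in-between frames rather than merely repeating source frames as done here):

```python
# Toy illustration of 3:2 pulldown: fitting 24 source frames into 60
# display refreshes. Each source frame is shown for 3 refreshes, then 2,
# alternating -- so on-screen motion advances unevenly. That uneven
# cadence is the "judder" that motion interpolation tries to smooth out.

def pulldown_32(num_source_frames):
    schedule = []
    for i in range(num_source_frames):
        repeats = 3 if i % 2 == 0 else 2      # 3, 2, 3, 2, ...
        schedule.extend([i] * repeats)
    return schedule

refreshes = pulldown_32(24)        # one second of 24 fps content
print(len(refreshes))              # 60 -> fits a 60 Hz panel exactly
print(refreshes[:12])              # [0, 0, 0, 1, 1, 2, 2, 2, 3, 3, 4, 4]
```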

3. 3D:

We have 2 eyes. The mind processes the world as 3D spatial representations, just like experiencing the illusion of a soundstage with 2-channel audio. I've been watching 3D Blu-rays since 2011 and it's obvious to anyone that the technique does add to the experience (whether one likes it or ends up getting nauseated is idiosyncratic of course!). The analogy here, I think, is multi-channel music. I have multi-channel albums as well and do love them since they open up the spatial immersion, but I still listen mostly to 2-channel stereo.

When 3D Blu-rays first came out in 2010, as expected, the manufacturers wanted to hype up the technology. Remember 3D video cameras and 3D cell phones with 2 lenses back around 2012? But I think it's not surprising that this never really took off... What did they think, people were going to replace all their fine 2D TV sets with 3D-capable ones!? The inconvenience of 3D glasses is an issue, the 3D effect isn't necessary for most scenes in movies, plus in many instances the effect was achieved through reprocessing rather than native 3D capture.

Note that I am not calling 3D a "gimmick". IMO, it is a real feature and one I do like to experience once in a while. In terms of "fidelity", it does get us closer to the limits of what the eye/mind can process, so when used well, it adds to the potential level of enjoyment just like multi-channel audio can. It's just another tool with which video artists can convey their vision.

Clearly the need for 3D glasses is an inconvenience which I'm sure is a major issue for adoption since it does involve putting something intimately close to one's eyes (I suspect there is a strong psychological component, perhaps at the level of subconscious aversion). Likewise, multichannel audio, with its need for proper speaker placement, is an inconvenience of space that limits acceptance for many households.

For those keeping track, notice that the 2016 models of Samsung TVs have dropped 3D capabilities altogether. There's also waning support from LG and Sony. Some companies like Vizio never even really tried. The new UHD Blu-ray format doesn't support 3D. The future doesn't look bright for this feature in the home theater world (not sure how well the 3D feature is filling seats in the theaters).

4. "Pixel Quality":

          a. Color gamut
          b. Color quantization
          c. Contrast

Although these are 3 separate parameters, we might as well discuss them together here as most discussions of this nature tend to do. In the last 2 years or so, 4K TV manufacturers have been aiming to deliver "not just more pixels, but better pixels" (to quote this title). They aim to do this by expanding the range of colors displayable by the TV sets and also improving the contrast between the darkest and brightest parts of an image with finer gradation.

Color gamut has varied over the years, especially in the world of computers (for example, graphic artists and photographers are familiar with sRGB vs. AdobeRGB vs. Kodak's ProPhotoRGB). In the world of consumer video over the last decade (let's not worry about the old NTSC and PAL color spaces here), we've been watching our movies and TV based on the sRGB color space (which shares its primaries and white point with ITU Rec. 709). As a basic priority for content today, a TV set should be able to reproduce the Rec. 709 colorspace well as the foundation from which to build. As you can see, sRGB/Rec. 709 comprises only a portion of the visible "horseshoe-shaped" colors the eyes can see (typically drawn using the CIE 1931 color space diagram as above, or the CIE 1976 chromaticity diagram). The larger the color gamut available, the better the TV set is at showing the range of saturated colors that the eye/mind can appreciate.

With the 4K HDMI2.0a standard released in early 2015, HDMI devices allow HDR metadata as specified in CTA-861.3 to be embedded in the digital stream. Realize though that since HDMI1.4, the interface could already encode and expand beyond sRGB into AdobeRGB and "x.v.Color" (IEC 61966-2-4 xvYCC). More recently with HDMI2.0, the Digital Cinema Initiatives' DCI-P3 and the even larger Rec. 2020 color spaces are available. DCI-P3 and Rec. 2020 are the most interesting for 4K today and into the foreseeable future:
See the source of this diagram here. D65/6500K white point. Some text is blurred to focus on what's relevant here.
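To put rough numbers on "larger gamut", here's a quick sketch comparing the triangle areas of these gamuts in CIE 1931 xy space. The primary coordinates are the commonly published ones; note that xy area is not perceptually uniform, so treat the ratios as a ballpark illustration only:

```python
# Rough comparison of gamut triangle areas in CIE 1931 xy space using
# the published primaries. xy area is not perceptually uniform, so the
# ratios below are only an illustration of "how much bigger" each gamut is.

PRIMARIES = {
    "Rec. 709 / sRGB": [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
    "DCI-P3":          [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
    "Rec. 2020":       [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
}

def triangle_area(pts):
    (x1, y1), (x2, y2), (x3, y3) = pts
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

areas = {name: triangle_area(p) for name, p in PRIMARIES.items()}
base = areas["Rec. 709 / sRGB"]
for name, a in areas.items():
    print(f"{name:16s} area = {a:.4f}  ({a / base:.2f}x Rec. 709)")
# DCI-P3 comes out around 1.35x the xy area of Rec. 709; Rec. 2020 around 1.9x.
```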
Related to the color quality is the color quantization capability of the video system. Since HDMI1.3, the "bit depth" of each pixel's color representation can be expanded from the base level of 8 bits up to 16 bits, using more bandwidth of course. The marketing term for this is "Deep Color" and can be traced back to the 2007 time frame. Remember though that, like in audio processing, just because the data can be 32-bits, it doesn't mean the final output (eg. the DAC or speakers) is capable of that amount of range, much less that this amount of resolution is necessary for human perception. So too with TVs. An 8-bit LCD panel (that's 8 bits for red, another 8 bits for green, and 8 bits for blue = 24 bits = ~16.7M colors) can use dithering algorithms to improve color gradation, although of course a 10-bit panel would be awesome and do a better job with ~1 billion potential colors (this is easily demonstrable with test content).
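As a toy illustration of why an 8-bit panel with dithering can still look reasonably smooth, consider quantizing a 10-bit ramp down to 8 bits with and without a little noise added first. This is a deliberately simplified 1-D sketch (simple rectangular dither); real panels use more sophisticated spatial and temporal dithering:

```python
import random

# Quantize a smooth 10-bit ramp (0..1023) down to 8 bits (0..255).
# Plain truncation produces visible "bands" of identical output codes;
# adding a little noise before quantizing breaks the bands up, which the
# eye averages into a smoother gradient -- the idea an "8-bit + dithering"
# panel relies on.

ramp_10bit = [i / 4.0 for i in range(0, 1024)]      # ideal values 0.0 .. 255.75

truncated = [int(v) for v in ramp_10bit]                            # banding
dithered  = [min(255, int(v + random.random())) for v in ramp_10bit]  # dithered

print(truncated[100:110])   # e.g. [25, 25, 25, 25, 26, 26, 26, 26, 27, 27]
print(dithered[100:110])    # values jitter between neighbouring codes
```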

As for contrast, this is essentially the "dynamic range" of the technology: how much difference there is between full brightness and the darkest black. Realize that this again can be complex to characterize depending on the lighting technology used, and you'll see descriptions like multi-zone dimming, edge-lit vs. full array, etc. The ideal would be something like OLED where each pixel is an independent light source instead of group illumination. Remember that there are always compromises, whether to the thickness of the screen, overall achievable quality, or of course the cost of it all.

Clearly, the technology is complicated and no surprise then that the whole "pixel quality" discussion is getting rolled together at the industry level. Recently, the Ultra HD Premium certification from the UHD Alliance was released, which requires the following in order to be compliant with this "High Dynamic Range" (HDR) initiative:
1. Color gamut: At least 90% DCI-P3 color coverage.
2. Color quantization: 10-bit color depth.
3. Contrast:
     >1000 nits peak brightness, <0.05 nits black level [aimed at LED/LCD panels]
     OR
     >540 nits peak brightness, <0.0005 nits black level [aimed at OLED panels]
Remember that OLEDs achieve great dark levels but they're not as bright, hence the 2 standards. A nit can be converted to other measures of luminance (brightness). So if we look at the peak levels, 1000 nits (SMPTE tends to use this unit) is also 1000 candela/m2 or 291.9 fL (foot-Lambert). We'll discuss brightness a little more later.
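Just to spell out the arithmetic behind those two tiers and the unit conversion (assuming the usual relationship of 1 fL ≈ 3.426 cd/m²; the helper name is mine):

```python
# Quick arithmetic behind the two Ultra HD Premium tiers and the nit/fL
# conversion quoted above. 1 foot-Lambert ~= 3.426 cd/m^2 (nits).

def nits_to_foot_lamberts(nits):
    return nits / 3.4262591

lcd_contrast  = 1000 / 0.05       # LED/LCD tier  -> 20,000:1
oled_contrast = 540 / 0.0005      # OLED tier     -> 1,080,000:1

print(f"1000 nits = {nits_to_foot_lamberts(1000):.1f} fL")   # ~291.9 fL
print(f"LCD tier contrast:  {lcd_contrast:,.0f}:1")
print(f"OLED tier contrast: {oled_contrast:,.0f}:1")
```

Notice that although the OLED tier demands less peak brightness, its required black level is so much lower that the implied contrast ratio ends up far higher than the LED/LCD tier.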

Standardization of what HDR means is, I believe, a good thing, and it's really a shame we didn't see this kind of clear objective standard in the audio world when it embarked on "high resolution audio" hardware, not to mention the paucity of actual high quality music "software" demanding such resolution! The question of course is whether, even with these definitions, companies will use them and ultimately whether they catch on with the general public. There is certainly no guarantee that the general public will care. Furthermore, it doesn't help if companies don't sign on. For example, Vizio is in disagreement with the standard and Sony is declining use of the labeling.

To make matters worse for the consumer, there is yet another "format war" of sorts over how HDR is to be implemented through the metadata - the "open" HDR10 vs. the licensed Dolby Vision. Other techniques reportedly from BBC/NHK and Philips/Technicolor are also in the works. Uncertainty, especially for something like this, is not going to be good for the consumer and will no doubt confuse and likely delay unified support. At least at this time HDR10 and DV can coexist. What is worth knowing is that even though DV may be technically better and can provide an excellent "end-to-end" solution, you also can't discount HDR10's "free to implement" nature and its mandated support in UHD Blu-ray! (Hmmmm, we've heard the phrase "end-to-end solution" already in audio haven't we? DV includes the final display device which makes sense, unlike MQA which cannot account for the essential amplifier and speaker end points.)

Thoughts as a consumer...

Now that we have covered much of the relevant technical stuff, let's talk about what the consumer might see or want. As with my perspective on audio, I feel it's good to be pragmatic about these technologies and to differentiate between the hype and what's important for oneself. As consumers, the beauty of technology and what makes it exciting includes the power of deflationary cost pressure inherent in a very competitive sector; every year that goes by, we can bet that goods will get cheaper (inflation adjusted) and functionally better. This of course doesn't mean we wait forever to get the toys we like, but this fact is one more parameter to consider in our own cost-benefit analysis beyond technical specs. Like audio, at some point, because of the limits of human vision, the curve will "flatten" to the extent that all that improves is lower cost rather than significant qualitative/quantitative improvements between generations of products.

We have seen already in the audio world that "better" or "newer" doesn't always lead to marketplace acceptance or support. Successful technologies tend to be clearly different (this is probably why hi-res audio fails - it's more expensive and just not all that different in sound in the vast majority of situations) and convenient (3D TVs and multichannel audio are examples of inconveniences that muddle along despite obviously appearing/sounding different). For example, the jump in resolution between VHS and DVD in the late 90's was clearly worthwhile (the difference between interlaced analogue and progressive digital was huge), plus random-access discs were so much more convenient. 480P DVD to 1080P Blu-ray was clearly an improvement as well for most people - the resolution jump very obvious, and though still disc-based, the Blu-ray material is more robust. Likewise on the hardware front, there's no comparison between the old CRTs and a nice flat-panel screen which at comparable sizes would be better in quality, more compact, and much lighter.

But what of the differences between Blu-Ray and the new UHD Blu-Ray (UHD BD)? Or the jump from 1080P to 4K/UHD/2160P? If we look at resolution benefits, a commonly quoted recommendation is to multiply the screen diagonal measurement by 1.6263 to achieve a 30-degree viewing angle as suggested by SMPTE. This means that for a 65" TV, the recommendation is to sit around 105" from the TV or almost 9'. This is probably a comfortable distance for most living rooms or home theater rooms. At this distance, we're right at the edge of "retina" angular resolution (again, we can explore using this web calculator). Sitting any closer or upgrading to a larger screen at that same sitting distance, we'll start noticing the pixels. Perhaps analogous to CD audio where 44.1kHz is good but a Nyquist frequency of 22.05kHz is close to the theoretical threshold of hearing around 20kHz and we may notice effects from the digital filtering. One can say that 1080P resolution is very close to the threshold of pixel detection at the 30-degree view angle recommendation.

Realize though that 30-degrees is just one recommendation. In fact, THX suggests something closer to 40-degree viewing angle as being optimal (36-degrees for the furthest seat in the house) where you sit just 1.2 times the diagonal TV size. At this distance for a 65" TV, one would be sitting just 78", or 6.5' away, close enough that with good visual acuity, pixels will be evident with a 1080P screen. Clearly, if we want the screen to fill more of our visual field, then a higher resolution like 4K would be of noticeable benefit.
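Here's a small sketch of that viewing-angle arithmetic for a 65" 16:9 screen, covering the SMPTE ~30° and THX ~36-40° figures quoted above (function name mine, numbers rounded):

```python
import math

def viewing_distance(diagonal_in, angle_deg):
    """Distance (inches) at which a 16:9 screen of the given diagonal
    fills the stated horizontal viewing angle."""
    width_in = diagonal_in * 16 / math.hypot(16, 9)
    return (width_in / 2) / math.tan(math.radians(angle_deg / 2))

for angle in (30, 36, 40):
    d = viewing_distance(65, angle)
    print(f"{angle} deg -> {d:.0f} in ({d / 12:.1f} ft, {d / 65:.2f} x diagonal)")
# 30 deg -> ~106 in (8.8 ft, ~1.63x diagonal); 40 deg -> ~78 in (6.5 ft, ~1.20x).
```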

The calculations above are what argue for a higher resolution TV capable of 4K. If you have good eyes and want a TV that fills your visual field closer to the 40-degree view angle spec, or simply want the freedom to sit closer and still maintain pixellation-free imaging, then 4K is for you. The obvious question though is how many people beyond home theater enthusiasts will notice enough to be enticed to upgrade for resolution itself? I think this is where the TV industry ran into a slowing of growth in recent years. Other reasons of course include the lack of actual software to go along with 4K hardware, slow migration to HDMI2.0, and concerns about compatibility with HDCP2.2 copy protection. The HDMI2.0 spec for 60Hz and HDCP2.2 necessitated an upgrade to video switches and home theater receivers for consumers. It doesn't help that early adopters without HDCP2.2 (basically those who bought 4K TVs prior to late 2013) will not be able to directly connect to protected devices like UHD BD players or perhaps other upcoming devices.

Despite many of the concerns above having been addressed over the last couple of years, and despite screen sizes on average getting larger, I suspect only a minority of consumers are purchasing 65"+ screens. What capability, then, does the industry have to promote its units beyond resolution, size, and other physical attributes like thickness, weight, and "features" probably nobody asked for like curved screens?

Enter HDR.

Unlike the 4K resolution bump, which affects mostly larger-screen adopters, the thought here is that since "pixel quality" is better, people will want this because it affects everyone from the small-screen cellphone user to the 40" owner to the 80"+ cohort. The contrast, color gamut, and better quantization would argue for an upgrade and literally sell themselves, right!? Well, I'm not so sure about this. In fact, I suspect John Dvorak is right that in the big picture (pun intended), the notion of HDR, while a nice technological advancement, isn't going to lead to any major bump in hardware sales other than initial early adopters and then the gradual replacement rate as people upgrade their pre-existing sets. Realize that there is clearly some hype when we see comparison images like this suggesting such a poor "standard" image:
Comparison from this site.
Remember that just because a TV advertises 1000 nits of potential peak brightness, that peak level only applies to extremes like specular highlights which would clip earlier without the extra headroom (sun off chrome, glistening water); likewise, the extra bit depth can be useful in unusually dark/underexposed scenes or a full-on mega-stop exposure like the camera pointed at the sun above! Sure, it's great that a TV can reproduce an image like that, but I think most of us will have very different opinions on how important this is. After spending quite a bit of time recently with a calibrated OLED screen checking out all the HDR demos I can get my hands on, it's quite clear that HDR can certainly be eye-catching! But just as oversaturated, oversharpened settings can be eye-catching, after a while fatigue is going to set in. The HDR effect IMO should be used sparingly in actual use. Let's also not forget that in brighter ambient light settings, HDR might not be beneficial.

By the way, if you have ever downloaded one of the HDR demo videos like this Life of Pi footage, you'll realize the color looks off, very much like the left image above when played on a regular non-HDR set. This is because HDR decoding requires the use of an appropriate "electro-optical transfer function", typically SMPTE ST 2084, essentially an extension of the more familiar non-linear gamma function into 10 and 12-bit video (more reading here). Without the appropriate non-linear mapping of color and luminance values, it will look washed out and pasty.
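For the curious, the ST 2084 "PQ" EOTF can be written down in a few lines. Here's a minimal sketch using the published PQ constants (a real display would additionally tone-map values above its own peak brightness):

```python
# Minimal sketch of the SMPTE ST 2084 "PQ" EOTF: mapping a normalized
# 10/12-bit code value (0.0 - 1.0) to absolute luminance in nits.

m1 = 2610 / 16384           # 0.1593017578125
m2 = 2523 / 4096 * 128      # 78.84375
c1 = 3424 / 4096            # 0.8359375
c2 = 2413 / 4096 * 32       # 18.8515625
c3 = 2392 / 4096 * 32       # 18.6875

def pq_eotf(signal):
    """Non-linear PQ signal (0..1) -> luminance in cd/m^2 (nits)."""
    p = max(signal, 0.0) ** (1 / m2)
    num = max(p - c1, 0.0)
    den = c2 - c3 * p
    return 10000.0 * (num / den) ** (1 / m1)

for code in (0.25, 0.5, 0.75, 1.0):
    print(f"PQ {code:.2f} -> {pq_eotf(code):8.1f} nits")
# A 50% code maps to roughly 92 nits and 100% to 10,000 nits -- very unlike a
# conventional gamma curve, which is why HDR files look washed out when a
# display treats them as ordinary SDR video.
```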

As we close, I think it's worth considering for a moment what it is that a TV screen is trying to accomplish. For audiophiles, there is this notion that with good equipment and room, given the appropriate media, we can accomplish a reproduction of "live" sound from our systems (that idea of the "absolute sound" I suppose). We have a reference for this - just go to a live jazz club or the local concert hall... But what is the reference for a big screen TV? Is it the ubiquitous nature scenes, night sky, timelapse photography, model shots, "maraschino cherries dropping into martini glass in slo-mo" demo videos we see at the local big box electronics stores? (Funny to think that these demo reels are "standards" like Diana Krall albums at hi-fi shows. :-)

No, of course not... As much as those videos can be impressive (you can download many of those here by the way), I suspect, if we want big sound and a big screen, we look to our local cinemas and compare our home theater setups with what's in the real theaters playing movies that cost millions to produce. And in this respect, that "real movie theater" benchmark these days is far from impressive on the whole! The other day, I took my kids to see Kubo And The Two Strings at the local digital projector multiplex. Although I enjoyed the movie, clearly the technical "fidelity" of the video I saw could easily be surpassed by essentially any decent big-screen TV at home (without potentially sitting near coughing neighbours, stuck in bad seats, stepping on popcorn, and enduring the whispers of strangers). For example, most theaters today project images with peak illumination level ~50nits in a darkish room (exit signs and walkway illumination are needed so there's no way it can be too dark) - this is the general SMPTE recommendation. A modern TV set can already achieve >100nits rather easily. Likewise, a typical projector screen can't really achieve very deep blacks and it ends up that most theaters have maybe around 500:1 contrast ratio if that... Again, easily surpassed by a decent LED/LCD TV. While probably most of the newer digital projection theaters are capable of 4K resolution, standard 2K is still common.

It's good that Dolby is trying to improve this state of affairs with their Dolby Cinemas. Peak illumination is now up to 106 nits, and the contrast ratio is said to be 1,000,000:1 and good for HDR. This is accomplished with a much darker theater and twin laser projectors. So hopefully as more movie lovers experience this, the movie image quality will trickle down, at least from the content perspective. (Unfortunately, I missed the opportunity to check out the Jackie Chan Cinema theater in Beijing recently to watch in one of these theaters. I don't believe any of the Canadian theaters are up to this standard.)

Irrespective of HDR, currently in late 2016, the amount of "standard" 4K content remains weak. Of course, we're only in the first year of UHD BD's life. The least expensive UHD Blu-ray player you can buy today is ~US$300 (Xbox One S) and UHD BD discs start at US$25. Paid streaming sites like Netflix, Amazon Instant Video, and Ultraflix have few series/documentaries/movies at 4K (as of August 2016, something like <50 titles for each of these streaming sites). Nice that there are free 4K videos on YouTube and hopefully in the near future, HDR content will arrive. It will take some time before streaming gets up to speed to deliver quality like a UHD BD rather than bit-starved, highly compressed images though. The road is long and we're only in the starting stretch despite reasonable availability of 4K TVs/projectors since 2012. I think even at this early phase, it's clear that the growth rate is not going to be like with previous generations of video technology.

Something worth keeping in mind, I think, is that video could easily follow the path of audio in terms of the effects of mobile technology. While I think a big-screen TV will remain desirable for quite a while, more so than an impressive large speaker system for most people, video consumption on cell phones has clearly grown significantly and become the "TV" of choice for many. Concomitantly, ticket sales at the local theater have declined in number over the years since around 2003, though not by a huge amount, and revenues still seem good before inflation adjustment at least; this clearly says something about the cost per ticket rising over the years. Could we eventually see a situation where large, visual-field-filling video ends up being more like mobile VR technology rather than large home/movie theaters? This is quite possibly inevitable...

Ultimately, other than when watching news, sports, or maybe other live events, the likelihood is that most of the time what we're experiencing is a story... Video technology is a tool to suspend disbelief and immerse us in this art of storytelling, whether a gripping documentary, the interpersonal dynamics of drama, or fantasy and sci-fi. Sure, a summer blockbuster movie might be even more impressive in 4K/3D/HDR with the best CGI, technical wizardry, and all that for spectacle. There is a point though where it's "good enough" to enjoy all that one needs for any specific program. Technical quality will never overcome a poor plot line, unlikable characters, poor acting, or the lack of character development. Spending >$250M on Batman v Superman didn't create a cinema classic for me, yet to this day the 1954 B&W celluloid Seven Samurai remains a treat.

-------------------------------

I hope everyone is enjoying a good September! Busy times ahead for me but with the cooler weather and rains also ahead here in Vancouver, possibly more time to do indoor stuff like listen to some music and watch a few movies :-).

Enjoy the sights and sounds!

10 comments:

  1. I would have liked if the UHD format had also come with a wider movie format image, just like we went from 4:3 to 16:9 when we moved to HD.

    1. Interesting point Willem. Considering our wider horizontal field of view, a wider aspect ratio might make sense like 1.85:1. It's all a compromise given the varied programming out there.

      At least we got away from 4:3 :-).

    2. A few years ago Philips tried just this with a few ultra-wide TVs, but it was a commercial failure. I guess indeed because broadcast TV did not have any program material. On the other hand, the 'ultrawide' movie material is just out there.

  2. No content to expect for home cinema? Then go for 4K gaming!
    That, although mostly synthetic but sometimes photoreal worlds, is truly amazing. I use a 49 inch Philips 4K television as a computer monitor; really amazing to "fly" in a 4K Digital Combat Simulator world of Las Vegas and surroundings in Nevada. It keeps me away from the TV, and more or less, the audio set :-)

    1. Good idea "ik ja". I'll probably investigate that now that single-card graphics cards are starting to achieve good quality and framerates in 4K. The new nVidia 1080 and easier-on-the-wallet 1070 look really good!

      I'm sure a 49" monitor would look fantastic on a gaming rig :-)!

  3. Having gone from IPS to an Eizo MVA monitor to the 15" LG OLED, I've sort of walked away. OLED is nice but my unit has some inherent banding, and I'm now comfortable watching with this hardware-calibrated 4K NEC monitor, even though it renders at 820:1 CR, 60 Hz.

    Western culture is near its nadir and there isn't a whole lot you can expect from it. Sophisticated standards aren't worth much unless you happen to be insatiable for Hollywood and SM entertainment. In fact, I probably spend most of my watching time on 90s Japanese DVD rips.

    Not to mention streaming services are now favored by labels, and 2160p YouTube renders textures so poorly that the humble DVD compares favorably.

  4. Thank you, great to know all this about TVs. I was thinking of buying a new 4K TV and came across these https://10edges.com/best-4k-tv/, it would be nice of you to tell me which one is the best cause I don't have much knowledge on TVs

    1. Hi Max,
      That's a pretty good list for TVs in 2017. The basic parameters still remain highly important this year: brightness, contrast, and compatibility with the latest standards (HDR10, DV) if important to you.

      I'm very happy with what I have - the Vizio P75. FALD tech provides excellent blacks and contrast. A friend who is a big OLED guy but had never experienced FALD commented that he was quite surprised by the quality... Like everything, there are pros and cons; important for you to figure out your priorities, TV size, and the price you're willing to pay of course!

      Good luck, lots of great options out there and I bet uniformly the products will improve with each generation!

  5. Great article as always. Brilliant stuff.
    Maybe a silly question: are TVs as good as monitors for wedding video editing? Can you use a 40″ 4K 60Hz TV as a computer monitor? I have a choice of a 43 inch vs. a 40 inch 4K TV as a monitor. Or do you have any other suggestion to find the best 4K monitors?

    1. Hi Ricky,
      Yes, you can use a 40"+ TV as a monitor. Make sure it does 4K / 60Hz / chroma 4:4:4 or uncompressed RGB so you don't lose any color resolution. Also, check the contrast and black levels to make sure it's up to your standards.
