A view from behind the commander's chair :-). BenQ BL3201PH on the table.
Please allow me to reminisce a little on "ancient" technology history... Back in 1989, during my university undergrad, I worked for a summer doing computer science research and saw for the first time a Sun SPARCstation 1 "pizza box" with a 20MHz processor, 16MB RAM, and a 256-color "megapixel" (1152 x 900) display. I was blown away! This was a "dream machine" compared to my Commodore Amiga 1000 with its 7MHz Motorola 68000 and 512KB RAM, 32 colors on screen (the 4096-color HAM mode was cool but limited in application, this being before the 64-color EHB mode), and a maximum resolution of 640x400 interlaced (which could be pushed a bit further into overscan). Back in those days, even a relatively expensive Macintosh was only capable of 640x480 with 8-bit (256) color.
The closest thing to "true color" I saw in the '80s was an old Motorola VME 68020 machine I worked on, developing a rudimentary GUI for image-recognition software driving an ancient 16-bit color Matrox frame-buffer video card. Although limited to 640x480 interlaced, it was impressive to see an actual digital picture on a computer screen that looked like something out of a video!
[Even back then, although the sound quality was nothing to write home about, the first PC Sound Blaster card was introduced in 1989. By then, we had been living with CD audio for a number of years already, and even this first-generation card was capable of 8/22 mono. It was just a matter of time before 16/44 stereo sampling became an option given enough memory and storage space; the Sound Blaster 16 with 16/44 stereo came just a few years later in 1992. Clearly, imaging/video technology has always lagged behind audio in capability and relative fidelity due to complexity and storage requirements (this of course also speaks to the neurological sophistication of our visual architecture compared to the auditory system in our brains).]
At some point in the early 1990's I saw a TI "true color" 24-bit graphics machine at the university (remember the TARGA graphics file format, anyone?). By 1994, I bought myself a Picasso II graphics card for the Amiga capable of 800x600, 24-bit color (sweet!). By 1997, I had my first PC graphics card capable of >1024x768, 24-bit color: the Matrox Mystique. From then on, each generation of graphics card became more about 3D performance rather than 2D speed or resolution... My computer display also got upgraded through the years, from NEC MultiSync CRTs to a 1280x1024 LCD, to Dell's UltraSharp 24" series (1920x1200), and last year I got the excellent 27" BenQ BL2710PT (2560x1440).
But one goal remained elusive on my desktop machine: a large monitor (in the 30" range) with spatial "high fidelity" - smooth and detailed, with enough resolution that my eyes/mind could no longer distinguish the digital pixels - in essence, something close to the limit of our 2D visual spatial apparatus (perhaps like how CD is close to our auditory limits within the stereo domain). Although in the visual sphere there's still room for improvement in terms of color accuracy, contrast (dynamics), and black levels, it finally looks like we're "there" with pixel resolution (and at minimum flicker-free 60Hz refresh rates with decent response time).
This goal of pixel resolution meeting biological limits is obvious, and technology companies have been building up to it for years. Apple's "marketing speak" captured it nicely; they called it the "Retina Display" - a screen with pixels packed tightly enough that individual pixels would not be visible to the user. The first product they released with this designation was the iPhone 4, with a screen resolution of 960x640 (3.5", 326ppi) in June 2010 (of course other phone companies use high-resolution screens and have since surpassed Apple's; though I must credit Apple with their superb marketing prowess!). Steve Jobs back then made a presentation about the resolution of the human eye being around 300 dpi for cellphone use:
Realize that this number is only relevant in relation to the distance from which the screen is viewed. When we test eyesight, the "target" of 20/20 vision is the ability to discriminate two visual contours separated by 1 "arc minute" of angular resolution (1/60th of 1 degree). As I mentioned in the post a couple of weeks ago, just like hearing acuity, there will be phenotypic variation to this in the population, and some folks will achieve better than 20/20 vision just as some people have better hearing than others ("golden ears"). For those interested in the physics and calculated limits of vision, check out this page.
Coming back to technology then... As per Steve Jobs, we generally view a cell phone at a closer distance than, say, a laptop or desktop monitor. Normally we'll view a smallish phone screen (say <6" diagonal) at about 10-12 inches. In that context, the 300 pixels-per-inch specification is about right... Just as in audio, where we can argue about "Is CD Resolution Enough?", the visual-resolution folks also argue whether more is needed - witness the passion of the Cult Of Mac and their plea for "True Retina" (something like 900 ppi for the iPhone 4, and 9K for a 27" computer screen)!
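For the curious, the "300 ppi is about right" claim is easy to check with a bit of trigonometry. Here's a small Python sketch (my own arithmetic, not Apple's published formula) computing the pixel density at which one pixel subtends exactly 1 arc minute at a given viewing distance:

```python
import math

def retina_ppi(viewing_distance_in: float) -> float:
    """Pixel density (ppi) at which one pixel subtends 1 arc minute
    (the 20/20 acuity threshold) at the given viewing distance in inches."""
    one_arcmin_rad = math.radians(1 / 60)  # 1 arc minute = 1/60th of a degree
    pixel_pitch_in = viewing_distance_in * math.tan(one_arcmin_rad)
    return 1 / pixel_pitch_in

# Typical phone viewing distances:
print(f'10": {retina_ppi(10):.0f} ppi')  # ~344 ppi
print(f'12": {retina_ppi(12):.0f} ppi')  # ~286 ppi
```

So at 10-12 inches the 20/20 threshold lands between roughly 286 and 344 ppi - the iPhone 4's 326 ppi sits comfortably in that range, which is presumably where the ~300 dpi figure comes from.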
Until that day when we can see for ourselves if 9K is needed though (the UHD definition offers 8K for those truly on the bleeding edge of technology), check out this helpful web site for calculating what viewing distance a screen becomes "retina" grade:
Enter the horizontal and vertical resolution, then screen size, and press "CALCULATE". It'll tell you the PPI resolution, aspect ratio, and most importantly in this discussion at what distance the angular resolution of the pixels reach the 20/20 threshold. Using this calculator, my BenQ BL3201PH, 32" 4K/UHD (3840 x 2160, 137 dpi) monitor reaches "retina" resolution at a viewing distance of 25".
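The calculator's math is the same 1-arc-minute criterion run in reverse: compute the ppi from the resolution and diagonal, then find the distance at which one pixel shrinks to 1 arc minute. A quick Python sketch (my own reconstruction of what the site presumably computes), using my monitor's numbers:

```python
import math

def screen_ppi(h_px: int, v_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal pixel count divided by diagonal size."""
    return math.hypot(h_px, v_px) / diagonal_in

def retina_distance_in(ppi: float) -> float:
    """Viewing distance (inches) at which one pixel subtends 1 arc minute."""
    one_arcmin_rad = math.radians(1 / 60)
    return (1 / ppi) / math.tan(one_arcmin_rad)

# BenQ BL3201PH: 3840 x 2160 at 32"
ppi = screen_ppi(3840, 2160, 32)
print(f'{ppi:.1f} ppi, "retina" beyond {retina_distance_in(ppi):.1f}"')
# → 137.7 ppi, "retina" beyond 25.0"
```

This reproduces the ~137 ppi and 25" figures quoted above.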
Considering that I generally sit >25" away from the monitor, it looks like I've achieved that "magic" resolution I've been hoping for all these years :-). With a 32" monitor, you actually wouldn't want to sit too close anyway; otherwise you'd be moving your head too much to scan the screen. Subjectively, the image looks gorgeous, and it really is wonderful not to notice any pixels or aliasing imperfections in text. I think I can live with this for a few years!
There's something special about achieving high fidelity (whether audio or visual). For a machine to match (and these days surpass) biological sensory limitations is a milestone. And to do it at price points within reach of most consumers is further evidence of technological maturation. In just a few years, we've witnessed the transformation of high resolution screen technology with "retina" resolution starting in handheld devices, to laptops, and now to the desktop monitor...
In the Archimago household, there remains one large screen screaming for these high resolutions: my TV in the sound & home theater room. If I plug the numbers into the website, it looks like I'll need an 80" 4K TV :-). Well, I'll be keeping an eye on those prices then! Although I'm willing to jump into the 4K computer-monitor waters at this time, I think I'll wait when it comes to the TV. HDMI 2.0, DisplayPort 1.3, and HDCP 2.2 all need to be hashed out and widely supported before I jump in with a big purchase. Also, OLED 4K could be spectacular... Maybe next year?
I want to say a few words about the usability of 4K monitors. I was actually a little apprehensive at first about buying one, due to some reviewers complaining that text size was too small and difficult to use with Windows 8.1. I suspect this would be the case with smaller 4K screens like 27" models (Huh!? What's with that 5K iMac at 27"?). At 32", I can actually use it even at 100% (1:1) scaling, although 125% made things easier on the eyes. Note that many (most?) Windows programs are still not "scaling aware", which is why having the screen usable at 100% from a standard viewing distance is beneficial at this time.
Use the "scaling"!
For digital photography, 32" 4K was made for Lightroom / Photoshop! The ability to see your photos on a large screen with 8 full megapixels is stunning. The bad news is that my quad-core Intel i7 CPU is feeling the strain of processing all those megapixels from a RAW file - though not quite enough to make me feel I need a CPU upgrade just yet.
There are some 90+Mbps AVC 4K demo videos floating around providing a tantalizing taste of what 4K Blu-Ray could look like in the home theater. Panasonic showed off their 4K "ULTRA HD Blu-Ray" at CES 2015 recently, and I suspect that will be the best image quality we're going to get for a while, simply because of the large capacity Blu-Ray discs have to offer. It looks like the new H.265/HEVC encoding standard will be used for these future releases, providing better compression efficiency (supposedly similar image quality at 50% of the data rate compared to H.264/AVC). This could end up being the last copy of Shawshank Redemption I ever buy... Hopefully :-). [Even here, we can get into a debate about analogue vs. digital... Arguably, unless the movie was filmed in 70mm, 4K should be more than adequate to capture the full image quality of any 35mm production.]
For the time being, 4K YouTube streaming does look better than 1080P but it's clear that Internet bitrates impose significant compression penalties (noticeable macroblock distortions with busy scenes). Netflix has some material but will not currently stream 4K to the computer (only 4K TVs so far - probably due to copyright protection). I have watched 4K shows like House Of Cards and Breaking Bad off Netflix, but like 4K YouTube, the quality isn't really that impressive at this point.
Finally, remember the hardware needed to run a 4K/UHD monitor. I decided to get the screen at this point because we now have second-generation, reasonably-priced screens (~$1000) at 60Hz with IPS-type technology. The BenQ uses DisplayPort to achieve a 60Hz refresh rate and is SST (Single Stream Transport) rather than MST (Multi-Stream Transport), which splits the screen into 2 x 2K "tiles". SST should be hassle-free; I have heard of folks experiencing driver issues with tiled screens not being handled properly (imagine only half the screen displaying when the software fails the tiling process). Note that for a bit more money, the Samsung U32D970Q has received some excellent reviews for image quality and color accuracy.
I'm currently using an AMD/ATI Radeon R9 270X graphics card I bought last year. Not expensive, and it has been trouble-free for 60Hz SST operation. Just remember to buy some quality DisplayPort 1.2 cables (the BenQ has both full-sized and micro DisplayPort inputs). This is an example of a very high-speed digital interface: about 12Gbps of data must be transferred to achieve 3840x2160, 24 bits/pixel at 60Hz. The 6' DP-to-miniDP cable that comes with the monitor does the job fine, but so far I have had no luck with 10' generic cables to give my setup some extra flexibility (anyone know of a reliable 10' 4K/60Hz cable, maybe 26AWG conductors?). Even at data rates 25x that of high-speed USB 2.0 (and over 2x USB 3.0 speed), there's no need to spend >$20 for a good 6' cable.
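The ~12Gbps figure is easy to verify with back-of-envelope arithmetic. A quick sketch (payload pixel data only - the actual DisplayPort link runs faster still, due to 8b/10b line coding and blanking overhead):

```python
# Raw (uncompressed) pixel data rate for 4K/UHD at 60Hz, 24 bits/pixel.
width, height = 3840, 2160
bits_per_pixel = 24
refresh_hz = 60

gbps = width * height * bits_per_pixel * refresh_hz / 1e9
print(f"{gbps:.1f} Gbps")             # → 11.9 Gbps

# For comparison against USB signaling rates:
print(f"{gbps / 0.48:.0f}x USB 2.0")  # USB 2.0 high-speed = 480 Mbps
print(f"{gbps / 5:.1f}x USB 3.0")     # USB 3.0 SuperSpeed = 5 Gbps
```

That works out to roughly 25x USB 2.0 and about 2.4x USB 3.0, matching the comparison above.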
Modern high-performance gaming at 4K would really demand a more powerful graphics processor, so I haven't tried it on this machine. I suspect less demanding titles would run just fine.
As noted earlier, remember that pixel resolution is only one factor in overall image quality. The ability to display good contrast (like dynamic range in music) and color accuracy are also very important; clearly it's in these areas that computer and TV displays can further improve. Note also that UHD defines an enlarged color space as well (ITU-R BT.2020 vs. the previous Rec. 709 for standard HDTV - see here), so the improvement in this regard is another tangible benefit.
I hope you enjoyed this foray outside the usual audio technical discussions... Enjoy the music and whatever visual set-up you're running!
PS: Happy Dynamic Range Day (March 27, 2015)! Great to see that a recent purchase, Mark Knopfler's Tracker, was mastered at a decent DR11... Keep 'em coming - "rescuing the art form" is about preserving qualities like full dynamic range and releasing music meant for listening on systems superior to boomboxes and earbuds!