Thursday, 26 March 2015

MUSINGS: Gone 4K / UHD - A "Look" At Ultra-High-Definition...

This week, I thought I'd take a break from just the audio stuff and discuss a new "toy" I got 5 weeks ago. That's right, as the title suggests, a 4K / UHD screen; it's a computer monitor to be exact:

A view from behind the commander's chair :-). BenQ BL3201PH on the table.
Remember that there's a separate "body" defining 4K movies at the local movieplex - Digital Cinema Initiatives (DCI). They have "true" 4000-pixel-class horizontal resolutions like 4096x1716 (2.39:1 aspect), or the very close 3996x2160 (1.85:1). For smaller screens like computer monitors and TVs, we have the UHD "Ultra High Definition" standard defined as 3840x2160 (16:9 / 1.78:1). So although it's not exactly "4K" horizontally, it's close, and I guess "4K" is a better advertising catch-phrase than "2160P". Needless to say, 3840x2160 is 2x the linear resolution of 1080P, or 4x the actual number of pixels.
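As a quick sanity check on the arithmetic above, a few lines of Python confirm the "2x linear / 4x pixels" relationship between UHD and 1080P:

```python
# Compare UHD (3840x2160) against Full HD (1920x1080).
uhd_pixels = 3840 * 2160
fhd_pixels = 1920 * 1080

print(uhd_pixels)                # 8294400 - about 8.3 megapixels
print(uhd_pixels / fhd_pixels)   # 4.0 - four times the pixel count
print(3840 / 1920, 2160 / 1080)  # 2.0 2.0 - twice the linear resolution each way
```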

Please allow me to reminisce a little on "ancient" technology history... Back in 1989, in my university undergrad, I worked for a summer doing computer science research and saw for the first time a SUN SPARCstation 1 "pizza box" with 20MHz processor, 16MB RAM, and a 256-color "megapixel" (1152 x 900) display. I was blown away! This was a "dream machine" compared to my 7MHz Motorola 68000, 512KB Commodore Amiga 1000 with 32 colors (4096-color HAM mode was cool but limited in application, before the 64-color EHB mode) and a maximum resolution of 640x400 interlaced (can be pushed a bit into overscan). Back in those days, even a relatively expensive Macintosh was only capable of 640x480 8-bit (256) color.

The closest to "true-color" I saw in the 80's was an old Motorola VME 68020 machine I worked on to develop a rudimentary GUI for image recognition software running an ancient 16-bit color Matrox frame buffer video card. Although limited to 640x480 interlaced, it was impressive to see an actual digital picture on a computer screen that looked like something out of a video!

[Even back then, although the sound quality was nothing to write home about, in 1989, the first PC Sound Blaster card was introduced. By then, we had been living with CD audio for a number of years already, and even this first generation card was capable of 8/22 mono. It was just a matter of time before 16/44 stereo sampling was an option given enough memory and storage space. The Sound Blaster 16 with 16/44 stereo came just a few years later in 1992. Clearly, technology for imaging / video has always been behind audio in capability and relative fidelity due to complexity and storage requirements (this of course also speaks to the neurological sophistication of the visual architecture compared to audio in our brains).]

At some point in the early 1990's I saw a TI "true color" 24-bit graphics card machine at the university (remember the TARGA graphics file format anyone?). By 1994, I bought myself a Picasso II graphics card for the Amiga capable of 800x600, 24-bit color (sweet!). By 1997, my first PC graphics card capable of >1024x768, 24-bit color was the Matrox Mystique. From then on, each generation of graphics card became more about 3D performance rather than 2D speed or resolution... My computer display also got upgraded through the years, from NEC MultiSync CRTs to 1280x1024 LCD, to Dell's UltraSharp 24" series (1920x1200), and last year I got the excellent 27" BenQ BL2710PT (2560x1440).

But one goal remained elusive on my desktop machine. A large screen monitor (in the 30" range) with at least spatial "high fidelity"; looking smooth, detailed, with clearly enough fidelity that my eyes/mind no longer would be able to distinguish those digital pixels anymore - in essence, something close to the limit of our visual spatial apparatus in 2D (perhaps like how CD is close to our auditory limits within the stereo domain). Although in the visual sphere there's still room for improvement in terms of color accuracy, contrast (dynamics), and black levels, finally it looks like we're "there" with pixel resolution (and at minimum flickerless 60Hz refresh rates with decent response time).

This goal of achieving pixel resolution meeting biological limits is obvious and technology companies have been building up towards it for years. Apple's "marketing speak" captured it nicely; they called it "Retina Display" - a screen resolution packed tightly enough that individual pixels would not be visible to the user. The first product they released to the public with this resolution designation was the iPhone 4 with a screen resolution of 960x640 (3.5", 326ppi) in June 2010 (of course other phone companies use high resolution screens and have surpassed Apple's screens; though I must credit Apple with their superb marketing prowess!). Steve Jobs back then made a presentation about the resolution of the human eye being around 300 dpi for cellphone use:

Realize that this number is only relevant in relation to the distance from which the screen is viewed. When we test eyesight, the "target" of 20/20 vision is the ability to discriminate two visual contours separated by 1 "arc minute" of angular resolution (1/60th of 1 degree). As I mentioned in the post a couple of weeks ago on hearing acuity, there will be phenotypic variation in the population, and some folks will achieve better than 20/20 vision just like some people will have better hearing than others ("golden ears"). For those interested in the physics and calculated limits of vision, check out this page.

Coming back to technology then... As per Steve Jobs, when we use a cell phone, we generally view it at a distance closer than say a laptop or desktop monitor. Normally we'll view a smallish screen phone (say <6" diagonal) at about 10-12 inches. In that context, the 300 pixel per inch specification is about right... Just like in audio where we can argue about "Is CD Resolution Enough?", the visual resolution guys also argue if more is needed - witness the passion of the Cult Of Mac and their plea for "True Retina" (something like 900 ppi for the iPhone 4, and 9K for a 27" computer screen)!

Until that day when we can see for ourselves if 9K is needed though (the UHD definition offers 8K for those truly on the bleeding edge of technology), check out this helpful web site for calculating what viewing distance a screen becomes "retina" grade:

http://isthisretina.com/


Enter the horizontal and vertical resolution, then screen size, and press "CALCULATE". It'll tell you the PPI resolution, aspect ratio, and most importantly in this discussion at what distance the angular resolution of the pixels reach the 20/20 threshold. Using this calculator, my BenQ BL3201PH, 32" 4K/UHD (3840 x 2160, 137 dpi) monitor reaches "retina" resolution at a viewing distance of 25".
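For the curious, here's a sketch of the trigonometry a calculator like this presumably uses: a display becomes "retina" grade at the distance where one pixel subtends 1 arc minute (the 20/20 threshold discussed above). The function name is just my own illustration:

```python
import math

def retina_distance(h_px, v_px, diagonal_in):
    """Distance (inches) at which one pixel subtends 1 arc minute (20/20 limit)."""
    ppi = math.hypot(h_px, v_px) / diagonal_in  # pixels per inch along the diagonal
    pixel_pitch = 1.0 / ppi                     # inches per pixel
    one_arcmin = math.radians(1.0 / 60.0)       # 1/60th of a degree
    return pixel_pitch / math.tan(one_arcmin)

# 32" 4K/UHD monitor (like the BenQ BL3201PH): ~137 ppi
print(retina_distance(3840, 2160, 32.0))  # ~25 inches - matches the calculator
# iPhone 4: 960x640 on a 3.5" screen is ~326 ppi
print(retina_distance(960, 640, 3.5))     # ~10.4 inches - the typical phone viewing distance
```

This lines up with both the 25" figure for my monitor and Steve Jobs' ~300 ppi claim at 10-12" phone viewing distances.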


Considering that I generally sit >25" away from the monitor, it looks like I've achieved that "magic" resolution I've been hoping for all these years :-). With a 32" monitor, you actually wouldn't want to sit too close anyway, otherwise you'd be moving your head all the time to scan the screen. Subjectively, the monitor image looks gorgeous and it really is wonderful not noticing any pixels or easily making out any aliasing imperfections in text. I think I can live with this for a few years!

There's something special about achieving high fidelity (whether audio or visual). For a machine to match (and these days surpass) biological sensory limitations is a milestone. And to do it at price points within reach of most consumers is further evidence of technological maturation. In just a few years, we've witnessed the transformation of high resolution screen technology with "retina" resolution starting in handheld devices, to laptops, and now to the desktop monitor...

In the Archimago household, there remains one large screen screaming for these high resolutions. My TV in the sound & home theater room. If I plug in the numbers into the website, it looks like I'll need an 80" 4K TV :-). Well, I'll be keeping an eye on those prices then! Although I'm willing to jump into the 4K computer monitor waters at this time, I think I'll wait when it comes to the TV. HDMI 2.0, DisplayPort 1.3, HDCP 2.2 all need to be hashed out and widely supported before I jump in with a big purchase. Also, OLED 4K could be spectacular... Maybe next year?

------

I want to say a few words about the usability of 4K monitors. I was actually a little apprehensive at first about buying one due to some reviewers complaining that text size was too small and it was too difficult to use with Windows 8.1. I suspect this would be the case with smaller 4K screens like 27" models (Huh!? What's with that 5K iMac at 27"?). At 32", I can actually use it even at 100% (1:1) although a 125% scaling made things easier on the eyes. Note that many/most Windows programs are still not "scaling aware", which is why having the screen usable at 100% from a standard viewing distance is beneficial at this time.

Use the "scaling"!
Firefox runs great with 125% scaling and you can go into Chrome's options to set the default scaling to 125% as well. Internet Explorer looks excellent out-of-the-box.

For digital photography, 32" 4K was made for Lightroom / Photoshop! The ability to see your photos on a large screen with 8 full megapixels is stunning. The bad news is that my quad-core Intel i7 CPU feels slower processing all those megapixels from a RAW file; though not quite enough to make me feel I need a CPU upgrade just yet.

There are some 90+Mbps AVC 4K demo videos floating around providing a tantalizing taste of what 4K Blu-Ray could look like in the home theater. Panasonic showed off their 4K "ULTRA HD Blu-Ray" at CES2015 recently and I suspect that will be the best image quality we're going to get for a while simply because of the large capacity Blu-Ray discs have to offer. It looks like the new encoding standard H.265/HEVC will be used for these future videos and this will provide even better compression efficiency - better image quality at the same bitrate (or supposedly similar image quality at 50% of the data rate compared to H.264/AVC). This could end up being the last copy of The Shawshank Redemption I ever buy... Hopefully :-). [Even here, we can get into a debate about analogue vs. digital... Arguably, unless the movie was filmed in 70mm, 4K should be more than adequate to capture the full image quality of any 35mm production.]
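A rough back-of-the-envelope sketch shows why disc capacity and HEVC's claimed ~50% savings matter here (the 90Mbps figure is from the demo clips mentioned above; a standard dual-layer Blu-Ray holds 50GB, triple-layer BDXL 100GB):

```python
# Approximate file size of a movie at a given video bitrate.
def movie_size_gb(bitrate_mbps, hours=2.0):
    """Size in gigabytes (1 GB = 1e9 bytes) for a given average bitrate."""
    seconds = hours * 3600
    return bitrate_mbps * 1e6 * seconds / 8 / 1e9  # bits -> bytes -> GB

print(movie_size_gb(90))  # 81.0 GB - a 2hr movie at demo-clip AVC bitrates needs BDXL-class capacity
print(movie_size_gb(45))  # 40.5 GB - if HEVC halves the bitrate, it fits a dual-layer 50GB disc
```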

For the time being, 4K YouTube streaming does look better than 1080P but it's clear that Internet bitrates impose significant compression penalties (noticeable macroblock distortions with busy scenes). Netflix has some material but will not currently stream 4K to the computer (only 4K TVs so far - probably due to copyright protection). I have watched 4K shows like House Of Cards and Breaking Bad off Netflix, but like 4K YouTube, the quality isn't really that impressive at this point.

Finally, remember the hardware needed to run a 4K/UHD monitor. I decided at this point to get the screen because we now have 2nd generation, reasonably-priced screens (~$1000) at 60Hz with IPS-type technology. The BenQ uses DisplayPort to achieve the 60Hz refresh rate and is SST (Single Stream Transport) instead of MST (Multi-Stream Transport), which splits the screen into 2 x 2K "tiles". SST should be hassle-free; I have heard of folks experiencing driver issues with tiled MST screens not being handled properly (imagine only half the screen displaying if the software fails the tiling process). Note that for a bit more money, the Samsung U32D970Q has received some excellent reviews for image quality and color accuracy.

I'm currently using an AMD/ATI Radeon R9 270X graphics card I got last year. Not expensive and it has been trouble-free for 60Hz SST operation. Just remember to buy some quality DisplayPort 1.2 cables (the BenQ has both full-sized and mini DisplayPort inputs). This is an example of a very high speed digital interface, requiring about 12Gbps of data transfer to achieve 3840x2160, 24-bits/pixel at 60Hz. The 6' DP-to-miniDP cable that comes with the monitor does the job fine, but so far I have had no luck with 10' generic cables to give some extra flexibility to my setup (anyone know of a reliable 10' 4K/60Hz cable, maybe 26AWG conductors?). Even at data rates 25x that of high-speed USB 2.0 (and about 2x USB 3.0 speed), there's no need to spend >$20 for a good 6' cable.
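The ~12Gbps figure is simple multiplication; here's the raw pixel-data calculation (note this ignores DisplayPort's line coding and blanking overhead, so the actual link rate needed is somewhat higher):

```python
# Uncompressed video bandwidth for 4K/UHD at 60Hz, 24-bit color.
width, height = 3840, 2160
bits_per_pixel = 24  # 8 bits each for R, G, B
refresh_hz = 60

bits_per_second = width * height * bits_per_pixel * refresh_hz
gbps = bits_per_second / 1e9
print(round(gbps, 2))  # 11.94 - about 12Gbps of raw pixel data

# Compare with common USB signaling rates:
print(round(bits_per_second / 480e6))    # 25 - ~25x high-speed USB 2.0 (480 Mbps)
print(round(bits_per_second / 5e9, 1))   # 2.4 - ~2.4x USB 3.0 (5 Gbps)
```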

Modern high-performance gaming at 4K would really demand a more powerful graphics processor, so I haven't tried it on this machine. I suspect less demanding games would run just fine.

------

As noted earlier, remember that pixel resolution is only one factor in overall image quality. The ability to display good contrast (like dynamic range in music) and also color accuracy are very important. Clearly it's in these other areas that computer and TV displays can further improve. Note also that UHD defines an enlarged color space as well (ITU-R BT 2020 vs. the previous Rec. 709 for standard HDTV - see here) so the improvement in this regard is another tangible benefit.

I hope you enjoyed this foray outside the usual audio technical discussions... Enjoy the music and whatever visual set-up you're running!

PS: Happy Dynamic Range Day (March 27, 2015)! Great to see a recent purchase, Mark Knopfler's Tracker, was mastered at a decent DR11... Keep 'em coming - "rescuing the art form" is about preserving qualities like full dynamic range and releasing music meant for listening with systems superior to boomboxes and earbuds!

Tuesday, 17 March 2015

MUSINGS: Audiophiles "Us vs. Them" (Objectivists vs. Subjectivists) Attitudes and Envy!?


Since I'm stuck at LAX on my way home from a wonderful Spring Break with the wife and kids down in Texas as well as a Caribbean cruise, I thought I'd polish my response to Hal Espen's comment in the last post... BTW, I enjoyed visiting Bjorn's in San Antonio just to see the audio and home theater gear they had on display. Some really nice stuff and it looks like they're upgrading their main demo room to Atmos soon. I appreciated the knowledgeable staff and friendly attitude; taking time to demo the gear even though they knew I didn't even live in the USA.

So Hal, nice comment:
Pure confirmation bias from beginning to end. None of this really exists. : )
You can't have it both ways. Either your blog is about providing the little bursts of brain chemicals that us vs. them skeptics receive when scientism is seen to be crushing audiophilia, or you're going to go wafting into the subjective realm of the subtle and imaginative "classy" pleasures of reproducing music electronically, as you do with evident misgivings here.


Which side are you on, boys? Which side are you on?
Gets right to the heart of some of the heated debates and arguments I suppose... I guess I "swing both ways" in some regards. :-)

Remember though, I am "more objective" in my leanings in terms of intolerance for some of the true BS (like certain cables). However, I have no issues with enjoying the finer things in life. If a $50,000 pair of speakers made with premium materials looked fantastic with my decor, sounded great, and I really wanted them, I would happily buy them! But as an objectivist, I would just like to make sure that they are built to scientific principles around the ability to convey accurate sound (decent frequency response, minimal enclosure resonance, good time alignment, rational crossovers, adequate power handling...). The philosophical issue I have with pure subjectivism is that, as an informed consumer, I see these parameters as pre-requisites to my purchase and essential to any complete review; psychological limitations (biases) and the limits of human hearing acuity are real - I know this from my own failings and from the limits of various "golden ears" I have come across in my travels. I don't think it's unreasonable to state that some folks lack insight into their own abilities and limitations - this is not just a comment about audiophiles but applies to many other situations as well.

There are many examples in the Stereophile reviews where IMO it's quite clear that certain "recommended" components should be viewed with suspicion in the eyes of those interested in objective criteria of accuracy and "high fidelity". An example is something like the DeVore Orangutans - they don't "measure up" as can be seen in the Stereophile measurements, and there's even a comment by JA about audible coloration with solo piano. Many interesting comments in that post. For the asking price of $12,000/pair, I think that's unreasonable performance for the expense given the myriad of other options at that price point and below. BTW, I have heard them, and although they sound OK especially with low power amps, I am just not interested in gear that "colors" the sound in a significant way. No matter what some subjective folks "think" or "hear" or "feel". (Esthetically these speakers do nothing for me either.)

This principle is all the more relevant with stuff like cables (especially for digital signals) where there's literally nothing to measure nor any difference to be heard once any kind of controlled protocol is put in place - nothing other than subjective esthetic preferences and the psychological "feel good" of owning expensive copper snakes. I really don't care enough about the "bling" of cables to feel it's worthy of the expense, since that is all they offer.

Ultimately, I think it's OK to embrace the various "shades of grey" in audiophile philosophical leanings and I hope I don't come off too intolerant of anyone's freedom to believe what they want. However I don't have to believe everything I hear/read and I choose to take a stand on buying gear based on what appears to be reasonably "sound" science. Some folks seem to think it's about expense or "envy" about the cost of audiophile gear; and that's the reason why some people criticize the equipment or company. While this may be the case at times, personally I do not believe this is my concern at all (nor have I met many objectivists where I thought they might be projecting envy as a major reason for their disdain of nonsense). Over the years I've easily spent >$50,000 on audio gear and much more than that to buy a house meeting my criteria for a decent enough sound room (yet another pre-requisite - something I wish all audio reviewers would talk more about and show us pictures of the soundroom rather than listing likely insignificant accessories like cables used).

I truly find it bizarre that recently folks like Michael Lavorgna at AudioStream keep talking about "envy" (like here and here)... Folks, when objectively some things don't make sense, like $1000 ethernet cables, what is there to be envious about - unless one is honestly willing to admit to being in this audio hobby not for sound quality but because "bling" is worth coveting (like that $17,000 Apple Watch)?

Gents (and ladies). Enjoy the music!

No need to get upset in flame wars since it's just a hobby... One of, I hope, a number of hobbies, since there is so much in this world to enjoy. Figure out what's important to you and your stance. Most of all, for the love of the community, stay cool when it comes to debates out there :-). IMO, the objective perspective has so much to offer in terms of reality testing, tools to help adjudicate qualitative differences, and a way to tease out collective facts from individual beliefs... For something as obvious as ethernet cables, put the facts forward and wait for reasonable responses or evidence to show otherwise. Hopefully folks will think about their beliefs and engage in reasonable conversation about what is important and how we can all benefit from improved sound quality.

And it's always good to realize there's more to this than a simplistic and childish "us vs. them" attitude of course...

BTW: I just couldn't help but run into this article on the "JCAT Reference USB" cables. Can someone tell me the definition of a "true believer in the audiophile experience" or the "true hobbyist"? So what does that make "us" or are we "them"? :-) Also, shouldn't we be reserving phrases like "true believer" for religion and faith rather than engineered products based on applied physics!?

Saturday, 7 March 2015

2015-02-27: HiFi Centre Grand Reopening...

About a year ago, I reported on the B&W Nautilus demo at the HiFi Centre here in Vancouver. That was at their old location... As of late February, they're now in the new place near Vancouver Chinatown and to celebrate, they had a nice (re)opening event (February 27, 2015). It caught some publicity from the Stereophile web site as well.

I decided to go check it out after work on the Friday and see the new space. For fun, here are a few cell phone snapshots of some of the gear on display. Note that it was well attended and I purposely tried not to take shots with people in them to protect the innocent :-).

Upon entering the main building, we see a Nautilus "shell" display; HiFi Centre has always had a good partnership with B&W:

Of course what's a party for audiophiles without some live jazzy music? And there were free drinks on the house as well... Thanks, guys!


There's a nice "headphone bar" to demo. Just plug in the appropriate headphone to the Bryston BHA-1 amp and use the iPad to play a tune. Good to see balanced cabling used for the higher model Audeze and Sennheisers (not that it'd make much difference in a store but at least being demoed with best potential quality). I'm only showing the Audeze and Sennheiser selections here. They also have a full line of Audio-Technica headphones and Grados on the other side out of the picture to the left.


I already have the Sennheiser HD800's so I was eyeing that Audeze LCD-3 - maybe I'll add a planar magnetic headphone to my collection at some point. Beside the LCD-3 on the top rack was the LCD-XC, a closed-back unit which sounded excellent with great noise isolation in the room. It feels heavy in the hand and I can see the weight becoming tiresome over a long session, but the comfy ear pads really felt great (at least over the few minutes I was listening).


Across the way was an Auralic stack with the Aries streamer, Vega DAC and Taurus headphone amp. It was connected to the yet-to-be-released AudioQuest NightHawk headphones. It's a "semi-open" design, so even in a loud area with folks chatting and music playing, it wasn't too difficult to hear what was coming through the headphones; not as good as a fully closed design for noise isolation of course. Keep an eye on the Head-Fi posts to see when it's officially released; the AudioQuest rep says it's coming in April. It felt comfortable and, for what it's worth in a noisy environment, I didn't notice any sonic issues. As for the Auralic devices, I think the Vega is a great DAC and I've always liked that amber LCD front panel design. Esthetically, I still think the Aries looks too much like a Cylon Basestar on the old Battlestar Galactica! I don't see the point of the vertical visual obstruction from the "flares" on top and bottom. I'm sure it works fine as a digital transport, but it's not all that visually pleasing to me and doesn't help functionally IMO. And for this price, I'm just not impressed by the plastic facade.

Moving along, we see turntables on the wall along with various speakers below them. Most of the tables were Rega and Music Hall. These were not connected; for that you'll have to enter the 5 listening rooms. The classic Linn LP12 was on display as well.



As for the music rooms (5 in total), I am impressed by the sensible and pleasing room treatments... Each room has a wall-mounted iPad running control software along with ethernet wired network streaming. Plenty of tracks available to play and presumably if all the rooms are connected to the same central server, one could play the same music and assess sonic differences originating from the same mastering. Computer-based music servers are obviously here to stay. Again, the noise level was a bit too high that night to appreciate the music playback but overall no complaints!

Here is the Bryston / B&W room. A few jazz and rock tracks were played. I was there when Lorde's "Royals" (from Pure Heroine of course) was playing... Good rendition. Although those B&W 802 Diamonds are capable of reproducing the low end quite well, I noticed that it didn't sound as "precise" as I have heard this track with a good sub; seemed just a little bloated to me even listening at what I thought would have been the sweet spot (too close to wall?). Something else I need to double check with my next visit. Listeners were suitably impressed nonetheless and I suspect many audiophiles have not heard this track with a system capable of "flat" response to 20Hz. I don't remember what was being used to render the music stream but there was a dCS Puccini CD/SACD player in there if anyone is interested in expensive disk spinners these days.



Another room featured the Vienna Acoustics Liszt speakers (~$15k/pair). Again, very nice room and sound quality... Forgot to take note of what amp they were using. They were playing an LP at the time, interestingly enough. There was also a room featuring Totem speakers; in this case the Element Earth (~$9k/pair) on the right, I believe driven by Naim electronics. I was quite impressed by the tonal balance on acoustic tracks and the bass response from percussion out of those little guys! They're smaller in person than my impression from looking at the picture. Demoing the Totem was none other than the founder Vince Bruzzese... We had a nice chat about their speakers and the design of the Element line. Personable guy, enthusiastic, unostentatious - I like that! I think most folks came out of that room impressed by these Canadian speakers.


The "highest end" room in terms of cost was the Sonus Faber Lilium (>$65k/pair - sorry for poor focus, you know, alcohol and all...) driven by dual McIntosh MC2301 300W tube monoblocks (>$20k/pair). They were playing light classical at the time; nothing that I thought really challenged the amp/speakers. For anyone wondering about the "grill" in front of the Lilium, they're just soft string-like material so IMO there's no real protection for the drivers behind... Not something you'd want to put in the family room when friends with kids come over for a visit! :-)

The Mac Rack...
Speaking of McIntosh, here's one for the "high-end" computer audiophile:
Looks like we have an MXA70 (50Wpc) integrated amp on the left, I think with XR50 bookshelf speakers. I didn't get to hear this setup as it was in the main hallway with many folks hanging around and chatting.


One of the rooms featured NAD and Bluesound (the server "Vault" was to the right just outside the picture). Both NAD and Bluesound are divisions of Lenbrook Industries so it made sense they were paired. If I'm not mistaken, that would be the B&W CM6 S2 to the right (~$2k/pair). I'll have to come back again to check out the NAD streaming devices especially that Masters M12. It looks like it has the BluOS module for streaming installed. The NAD rep was a pleasure to chat with as well. I'm certainly not about to change my Logitech Media Server (Squeezebox) system soon since it's working so well over the years (currently using one of the LMS 7.9 builds, >100 days uptime on the Windows server), but the BluOS control system seemed well thought out.

So, apparently this new store is built around the "sensory" retail Bang & Olufsen concept... Not sure what the specific elements are in the design here or how it compares to the New York or Copenhagen stores but it is well laid out. Of course we have B&O lifestyle products on display. They look good...

Those BeoLab 5's (active, Class-D, ~$16k/pair?) on either side of Paul McCartney are positively futuristic looking and filled the room nicely with sound; not sure how good they are with soundstaging or accuracy however. They were just playing some B&O promo material. I've seen mixed reviews.

Finally, what is an audiophile store without gadgets like cartridges and of course cables?


There you go. The AudioQuest USB and ethernet Diamonds. Yeah...

It was funny seeing the wives and girlfriends hanging outside the showroom entrance while the men mingled amidst the audio gear. Partly makes up for all the times you see guys just sitting on mall benches when the girls go clothes/shoe shopping I guess.

Dear readers, see, even an objectivist can have fun in this hobby :-). It's also great to see that there are currently plans for the first Vancouver Audio Show this May 8-10 - I might check that out if I'm in town around then. It would be great to see Vancouver grow in audiophile prominence, situated as it is with a huge influx from Asia in general and China specifically. Nice to have a store such as this to visit once in a while; especially not far from home...

A classy party for a classy new store... Bravo HiFi Centre!

----------

Off for some R&R over Spring Break with the family in Texas. I just hope it warms up down there! If anyone has recommendations for a good hi-fi or music store (used vinyl!?) to check out in San Antonio - let me know.

As always... Hope you're all enjoying your music.

Sunday, 1 March 2015

MUSINGS: Audio Quality, The Various Formats, and Diminishing Returns - In Pictures!

Let me be the first to say that graphs and charts where audio formats are plotted out in terms of unidimensional sound quality ratings are ridiculously oversimplified and can be very misleading! However, they can be fun to look at and could be used as bite-sized "memes" for discussion when meeting up with audio friends or for illustration when people ask about audio quality.

Since they're out there already, let us spend some time this week to look at these visual analogies as a way to "think" about what the authors of these works want us to consider/believe. I'm going to screen capture without permission a couple of these images to explore. As usual, I do this out of a desire to discuss, critique, and hopefully educate which I consider "fair use" of copyrighted material; as a reminder to readers, other than a tiny bit of ad revenue on this blog (hey, why not?), I do not expect any other gain from writing a post like this.

First, here's PONO's "Underwater Listening" diagram released around the time of the 2014 SXSW (March 2014):
PONO: Underwater Listening
Others have already commented on this of course (here also). I don't know what ad "genius" came out with this diagram, but it is cute, I suppose. I remember being taken aback by this picture initially as it's so far out of "left field" (creative?) that I felt disoriented when I first saw this thing...

How audio formats would evoke a desire to compare underwater depths remains a mystery to me. Obviously, there's a desire to impress upon the recipient two main messages - a direct correlation between sampling rate (from CD up) and quality, and to make sure the MP3 format gets deprecated as much as possible (1000 ft?!). On both those counts, this image gets it so wrong, it's almost comedic. Clearly, one cannot directly equate samplerate and bitrate with audio quality because the relationship is nowhere near linear. Why would CD quality be "200 ft" and 96kHz "20 ft"? Surely nobody in their right mind would say that 96kHz is 10 times perceptually "better". Sure, there is a correlation in that a low bitrate file like 64kbps MP3 will sound quite compromised with poor resolution, but without any qualification around this important bitrate parameter, how can anyone honestly say that all MP3s sound bad? I might as well say that Neil Young's a poor-sounding recording artist because the Le Noise (2010) and A Letter Home (2014) albums are low fidelity.

I suspect that the PONO camp must be a bit ashamed of this diagram since I don't see it around anymore and I don't find it on their website (might have missed it). I don't think the "underwater" diagram made many friends nor sold many machines in any case...

Here's a more recent chart from Meridian circa late 2014:
Meridian: History of audio quality & convenience?!
From this, we "learn" that "downloads" have poorer quality than CDs (always?!). Also, I "learn" that LPs sound significantly better than "DVD-A/SACD" (and by extension high-resolution audio). But the most important point is that current streaming audio supposedly sounds worse than cassette tape. Does that make sense to anyone? Is this saying that customers of streaming services like Spotify, Tidal, and Qobuz are so hung up on convenience that they're willing to accept sound quality worse than an 80's Walkman?

Of course this is the myth that they primarily want to perpetuate because guess what... Buy this "revolutionary" Meridian MQA and that'll make streaming sound awesome!

While in some cases, sure, a very poorly encoded 192kbps MP3 download (like something done in 1999 with XING MP3) could sound significantly worse than CD, and a 64kbps stream can be worse than an excellent cassette copy, like the PONO "artwork" above, there are some truly gross generalizations here! Many LPs sound poor due to low quality pressings, many downloads are qualitatively superb, and clearly any reasonable music streaming service sounds better than a cassette tape - who's kidding whom?! Furthermore, a high resolution digital master (as with high-res downloads or those encoded on DVD-A/SACD) has the capability to be more accurate than reel-to-reel tape; of course, subjectively, analogue tape can add its own unique signature/color/distortion that some prefer... (Being able to mix in the digital domain without the generational fidelity loss of analogue tape is obviously a big plus.)

Of course, it's easy for me to just criticize without putting something forward... Therefore, please allow me to add for your consideration my submission to the "overgeneralized sound quality vs. audio format graph":

It's a graph of the law of diminishing returns in terms of audio technology and sound quality. I think it's important to take into account the fact that hearing ability is obviously NOT infinite. Due to biological phenotypic variation, there's probably a bell-shaped curve to hearing ability, as well as moment-to-moment fluctuations in acuity; this is represented by the "Zone of max. auditory acuity" gradient [See comments: probably more of an asymmetrical, negatively skewed distribution]. Depending on a person's maximum hearing ability, the 100% point will shift up or down relative to another's, but let's keep this graph simple and say that any individual can only hear up to 100% based on how we're endowed. Day to day, our hearing acuity changes - everything from current stress level affecting the ability to attend to the sound, to ear wax, allergies, sinus/ear infections, noise-induced hearing loss, tinnitus, and age will result in a decline in maximum acuity (some of this sadly irreversible). Obviously, mental training can help improve how well we attend to and pick up subtle cues.

The Y-axis therefore represents "Perceived Fidelity" up to 100%. Exactly how fidelity is measured is not important in this simple diagram, but it would obviously consist of frequency response, dynamic resolution, low noise floor, and low distortion (including timing anomalies), using the same mastering of a recording of superb quality for all formats. On the X-axis, we have "Effective Uncompressed PCM Bitrate" as a measure of approximately how much data is used to encode the audio. This is a proxy for how much "effort" is needed to achieve that level of fidelity. Note that the scale is logarithmic, not linear, corresponding to the logarithmic perception of frequencies and dynamic range. More data, more storage, more "effort" is needed to achieve any improvement in perceived quality as we move toward the top of the plateau at the right of the graph.
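As a sanity check, the bitrates quoted along the X-axis can be computed directly; uncompressed PCM is simply sample rate × bit depth × number of channels. A quick sketch:

```python
def pcm_bitrate_mbps(sample_rate_hz, bit_depth, channels=2):
    """Uncompressed PCM bitrate in megabits per second."""
    return sample_rate_hz * bit_depth * channels / 1e6

print(f"CD 16/44: {pcm_bitrate_mbps(44_100, 16):.2f} Mbps")   # ~1.41
print(f"24/96:    {pcm_bitrate_mbps(96_000, 24):.2f} Mbps")   # ~4.61
print(f"24/192:   {pcm_bitrate_mbps(192_000, 24):.2f} Mbps")  # ~9.22
```

These are the ~1.5, ~5, and ~10 Mbps figures used below.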

As you can see, the curve plateaus since we obviously cannot hear beyond around "100%". At some point, it really does not matter how much data we use to encode the sound; there just will not be any significant perceivable difference and all we've done is waste storage. The big question, of course, is at what point along this curve we place the capabilities of the various audio formats.

Starting with good old CD: controlled scientific research has shown little evidence that higher resolution sounds much better (see discussion here). Therefore, I think it's reasonable to put it at point (1), which is quite far along the curve already - this corresponds to the 16/44 stereo PCM bitrate of ~1.5Mbps. It's very close to the 100% point - I don't think it's unreasonable to say around 95%, so there is the possibility of some improvement. Where exactly this lies is not that important; it could be 90%, for example. The main idea is that qualitative gains beyond the CD format are not going to be massive. As we go higher to 24/96 (~5Mbps, point 3) and 24/192 (~10Mbps, point 5), we achieve essentially 100% perceived quality, and for all the effort in terms of bitrate/file size, relatively little is gained. Although mathematically these high-resolution formats can capture more frequencies and greater dynamic range, the actual auditory benefits are limited.
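To illustrate - and only to illustrate, since the shape and the "knee" parameter here are my guesses, not measurements - the plateau can be sketched as a simple saturating function of bitrate:

```python
import math

def perceived_fidelity(bitrate_mbps, knee=0.5):
    """Illustrative saturating curve: fidelity approaches 100% as
    bitrate grows. 'knee' is a made-up shape parameter, not data."""
    return 100 * (1 - math.exp(-bitrate_mbps / knee))

for fmt, mbps in [("CD 16/44", 1.41), ("24/96", 4.61), ("24/192", 9.22)]:
    print(f"{fmt}: {perceived_fidelity(mbps):.1f}%")
# CD lands in the mid-90s; the hi-res formats saturate near 100%
```

The point of the exercise: going from ~1.5Mbps to ~10Mbps buys only the last few percent, exactly the diminishing-returns shape of the graph.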

Where does DSD sit in all this? Realize that 1-bit DSD isn't as efficient as PCM (a description I've seen calls each bit of DSD an "additive" refinement to the sound, versus a "geometric" refinement with multibit PCM). Furthermore, noise shaping shifts the quantization noise into the higher frequencies, resulting in non-uniform dynamic range across the spectrum; this is generally not a problem because hearing sensitivity also drops at higher frequencies. From what I have heard and from examining DSD rips, I think that DSD64 is better (more accurate) than CD but not by much (I personally think 21-bit/50kHz PCM, about ~2Mbps, is good enough for DSD64 conversion and avoids encoding all that excess noise), whereas DSD128 falls just short of 24/96 but is very close. Note that this inefficiency in DSD encoding screams for the use of compression, which, as I argued a couple of years back, should really be implemented in DSD file formats.
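For those curious about the raw numbers, DSD bitrates are easy to work out since it's 1 bit per sample at a multiple of 44.1kHz:

```python
def dsd_bitrate_mbps(rate_multiple, channels=2):
    """1-bit DSD bitrate: a multiple of 44.1 kHz, 1 bit per sample."""
    return rate_multiple * 44_100 * channels / 1e6

print(dsd_bitrate_mbps(64))    # DSD64:  ~5.64 Mbps
print(dsd_bitrate_mbps(128))   # DSD128: ~11.29 Mbps
print(21 * 50_000 * 2 / 1e6)   # the 21-bit/50kHz PCM mentioned above: 2.1 Mbps
```

So DSD64 spends about the raw bitrate of 24/96 PCM for what I'd judge to be only modestly-better-than-CD accuracy - hence the inefficiency argument.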

So what about lossy compression in terms of perceived fidelity? Considering that there has not been good data to demonstrate that many people can differentiate high bitrate MP3 from lossless PCM, I have no issues placing it just shy of CD quality. To keep the graph clutter-free, I used a single line to denote MP3 320kbps quality even though I recognize that there could be a wide range of fidelity depending on the quality of the encoder and the demands of the music. There are special cases, usually containing high frequency content, that can demonstrate limitations with high bitrate MP3, but these are rare and generally will not be evident in actual music. You might ask "why is 320kbps MP3 equivalent to ~1.5Mbps uncompressed PCM!?" The answer lies in the psychoacoustic techniques employed. Sure, there is significant data reduction, and yes, taken out of the context of the rest of the audio, you can hear the difference (as in "The Ghost in the MP3" project). However, the data removal was done with sophisticated algorithms informed by models of human hearing. As encoding algorithms have improved over the years, so too has the sonic quality of the resulting MP3. This is a good example of how you cannot compare bitrates directly; the way the data is encoded is obviously very different! And sadly, PONO advertising doesn't seem to understand this when they keep using diagrams like this:

Just because a lot of data is used doesn't mean there's much benefit, even if the recording were done in true high resolution. By the time we get to 24/192, we're way into the zone of diminishing returns and may in fact, as some have suggested, have entered a point where the sound quality suffers because of potential intermodulation distortion from ultrasonic frequencies; some DACs may no longer function in an optimal fashion. The fact that technologically we can get this far into the curve is also a reflection of the maturity of audio science. Personally, I remain partial to 24/96 as a "container" for the highest resolution humans will ever need; one which is already standard on both recording and playback equipment.

Finally, as I indicated in a previous post, vinyl has limitations. Yes, it can of course sound great but there are limitations to accuracy (including differences for outer grooves vs. inner grooves), higher overall distortion, and material imperfections. As a result, there will be a wide range to the sound of LP playback as identified in the graph. Perceived fidelity compared to the original source would be lower but also remember that just like the reel-to-reel tape discussion above, some of the distortion and coloration could be "euphonic" as well - hence preferred by some (many?).

I'm sure a graphics artist could produce a much more pleasing image than what I kludged together above :-). Like the PONO and Meridian pictures, it's simplistic but I think compared to the others, a more realistic representation.

Notice that the Meridian graph above tries to suggest that there has been deterioration of potential sound quality over time (especially when they suggest streaming quality is like cassette tape!). I've seen a number of people parrot this same idea in magazines and forums. I think this is nonsense. Consider that even free Spotify is streaming with Ogg Vorbis 160kbps on the desktop (still very good!). With a premium account, you get 320kbps. And sites like Tidal already do lossless 16/44 FLAC. We're looking at quality either reasonably close or identical to CD quality. Here's my version of the chart:


As you can see, I don't believe there has really been any inverse correlation between sound quality and convenience over time. Note the drop in convenience from CD to DVD-A/SACD, which I don't think is a big deal since many DVD-As play in regular DVD players and are easy to rip now (a dead format anyway), plus SACDs are often hybrids that play in standard CD players (and can also be ripped these days with some inconvenience). The shift from physical media to "virtual" digital data storage has been tremendously convenient, although it brings with it a new skill set - file management, proper tagging, and of course managing backups. Now the shift towards streaming has become even more convenient and "mobile" through wireless data networks (but there's limited ability to customize and tag one's collection, and less sense of "ownership" of the music - a problem if one is a "collector"). As far as I'm concerned, the only real qualitative decline was from LP to cassette tape, where convenience improved in terms of portability (listening in cars and on Walkmen, less need for cleaning, but no random access song selection - which is why I gave LP a 50 and cassette only an increase to 60 overall). I believe streaming just needs a little more bandwidth; if we can reliably get 24/48 FLAC streaming, we will achieve a quality and convenience beyond what most music lovers and audiophiles would feel they "need" (we'll see if MQA really offers much more). Of course, there's always the desire for physical artwork and booklets to thumb through while listening to the music - vinyl remains the "king" of album art in that regard.

One final comment to those who feel that, just because folks like myself do not believe high bitrate MP3 sounds substantially different from lossless 16/44, I'm somehow "advocating" for lossy audio. That's not exactly true, since I don't think anyone would deny that lossless formats are superior for the best accuracy/fidelity. I still prefer FLAC as my archive format because I can then convert to whatever other format I want without multigenerational lossy degradation. However, I do believe MP3 is the way to go for cars and portable audio, even if they support lossless and high resolution. High bitrate MP3s are quick to transfer, take up less space, and there's just no way I will be able to hear a difference in my car or walking down the street. I personally find high-resolution lossless files (or God forbid uncompressed DSD) on a phone or portable device extremely wasteful even if storage size were not an issue. MP3 (and similar formats like AAC, WMA, Vorbis...) has its place as a tool for high quality compression, and there are many applications where it's all one ever needs to get the job done. Plus MP3s are universally supported.

Bottom Line: Remember the principle of diminishing returns as we're dealing with mature audio technology and limitations of the hearing apparatus. It's important to keep this in mind when assessing the promise of "new technology" and manufacturer claims such as the diagrams above.

(Did anyone see any critical comments from the audiophile press about PONO or Meridian's ad material above? How about Sony's 64GB "Premium Sound" SD card recently? There sadly seems to be a lack of critical thinking in much of the audiophile reporting these days, which only serves to isolate this hobby and solidifies the concept of the pejorative "audiophool".)

----------

Regretfully, I missed a live performance by Cécile McLorin Salvant here in Vancouver last weekend. A friend went and thought the performance was amazing! She seems to be channeling a young Ella...

Check out her albums Cecile (2010) and the Grammy-nominated Womanchild (2013) if you like jazz vocals.

Enjoy the music...

Saturday, 21 February 2015

MEASUREMENTS: The Intercontinental Internet Audio Streaming Test...

Time to go intercontinental. :-) [Scene from the old movie War Games.]

After the ethernet cable results last week and the absence of any difference, there was discussion about an extreme internet music server test. What if instead of maybe 100 feet of ethernet cable from server to player, we had thousands of miles of cables in between?

With help from Mnyb from the Squeezebox Forum, we were able to orchestrate a test to demonstrate the extremes of measured performance, with the server system basically on the other side of the world. He lives out in Västerås, Sweden, approximately 100 km from Stockholm. A direct distance of:

More than 7300 km away from my home in Vancouver, Canada. This test will require the data to be transferred across the Transatlantic internet "backbone" and across North America to get to the west coast of Canada. Considering that the internet infrastructure cables are not straight lines, I suspect the data probably is traveling a substantially longer distance to reach my home than 7300 km.
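For perspective, we can work out a physical lower bound on the round-trip time over that distance: light in optical fiber travels at roughly c/1.5, or about 200,000 km/s:

```python
# Lower bound on round-trip latency over ~7300 km of fiber.
# Light in glass travels at roughly c/1.5, i.e. ~200,000 km/s.
distance_km = 7300
fiber_speed_km_s = 300_000 / 1.5
min_rtt_ms = 2 * distance_km / fiber_speed_km_s * 1000
print(f"{min_rtt_ms:.0f} ms")  # ~73 ms at the absolute physical minimum
```

So even a perfectly straight fiber could not do better than ~73ms round-trip; the longer measured latency below reflects the indirect routing and per-hop equipment delays.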

Undersea Internet backbone map circa 2011.
I can tell that we're traveling a substantial distance based on the measured data latency. Using the paping (port ping) program, I can 'ping' the TCP port of Mnyb's Logitech Music Server in Sweden to see how long the 2-way data trip takes between the two locales. First, within my home, here's how it looks:

As you can see, my server is located internally at 192.168.1.80, and it takes on average <1 millisecond to go from the media room to my music server and back. Port 9000 is the LMS server control port if you're wondering. But when we reach out to Sweden:
IP address removed of course...

It now shows an average of 213ms data latency. Note that this amount of latency is problematic for real-time interactive data transfer like playing a first-person video game off the server... I'd get slaughtered if it took more than a fifth of a second to tell the server what I'm doing in a high speed game of Call of Duty! But remember, music streaming is about bulk data transfers... Let's see if I can find any measurable issue.
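For anyone wanting to replicate this without the paping tool, here's a minimal Python equivalent - it simply times the TCP handshake to a given host and port (the address in the comment is just my local example):

```python
import socket
import time

def tcp_ping(host, port, timeout=5.0):
    """Time a TCP handshake to (host, port), in milliseconds --
    roughly what the 'paping' tool measures."""
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=timeout):
        pass  # connected; close immediately
    return (time.perf_counter() - start) * 1000

# e.g. tcp_ping("192.168.1.80", 9000) against a local LMS control port
```

Unlike ICMP ping, this exercises the same TCP port the music data actually flows through.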

I. Set-up

My local system is set up exactly as described last week. I'll use the green 20' standard CAT 6 UTP cable I measured last week to connect my local switch with the Transporter player.

Mnyb's Server in Sweden <--> generic ethernet patch <--> Netgear GS108 gigabit switch <--> 20' flat ethernet cable <--> Netgear GS105 gigabit switch <--> generic patch <--> Linksys WRT1900AC <--> patch cable to wall socket <--> THE INTERNET >7000 km <--> 3' Cat 6 UTP patch cable <--> NETGEAR Nighthawk R7000 router <--> 30' Cat-6A STP Gigabit cable (in wall) <--> 6' Cat 6A STP patch cable <--> TP-LINK TL-SG1008D gigabit switch <--> 20' Cat 6 UTP cable <--> Logitech Transporter streamer


Whew... As for Mnyb's server computer:
HP ProLiant MicroServer N36L - released in 2010, dual-core 1.3GHz AMD Athlon II NEO. 1GB DDR3 RAM, gigabit port, 250GB OS hard drive, Western Digital "Green" 2TB drive.
Running ClearOS 6 (Linux), Logitech Media Server 7.9.

Note that his server machine is less powerful than what I used in last week's tests with those different ethernet cables (AMD A10-5800K, 16GB RAM). Not a worry though, since serving music is a rather straightforward task not needing much CPU speed, especially with an efficient OS like Linux.

Mnyb's ISP is rated 100Mbps down/10Mbps up. Mine is a 50Mbps down/5Mbps up.

II. RightMark Audio Analyzer (24/96)

Summary of the calculated values. The first 3 rows consist of measured results from last week using various ethernet cables, the fourth row is the result of the "Intercontinental Test":

As you can see, there's really not much difference! Remember that these tests are very sensitive, so small variations can happen just because of cables being moved, for example. Furthermore, the "Intercontinental Test" was done a week later, after I had put everything away from last week's test. Of interest and importance is that distortion results (THD, IMD) were essentially the same and certainly no worse than having a server in the next room. Again, these results are for a 24/96 "high resolution" audio test.

Some graphs then - same set as last week's:
Frequency Response: Essentially a perfect overlay representative of the Transporter device.
Noise level: There is a small spike at 120Hz that I didn't see last week. Small spurious noises like this can appear given the sensitivity of the system. In any case, there's the 60Hz power line noise, and nothing else measures greater than -110dB.

IMD: Nothing unusual to write home about!

Stereo crosstalk: Very minor differences with tests run 1 week apart.

III. J-Test (24-bit)

I took the J-Test composite from last week and overlaid the result from today's "Intercontinental Test":

You can make out the result today as slightly brighter than last week's plots. Again, this is a pretty close overlay of the 24/48 Dunn J-Test spectrum. No evidence of any significant jitter being added to the signal during the >7000 km journey.

IV. Conclusion / Discussion

As you can see, objectively, there is no evidence that locating the server >7000 km away did anything to the output quality of the Transporter. Neither the RightMark results nor the Dunn J-Test suggests any change beyond what I normally expect from inter-test variation.

Despite the tests completing without issue, I did have trouble streaming the 24/88 FLAC-encoded Shelby Lynne song "Pretend" from the Just A Little Lovin' SACD rip, which I had included in the directory for Mnyb to put up on his server... Because the test signals were short and consisted of quite a bit of silence, they compressed well with FLAC and transferred at an average data rate of <1Mbps, with no issues keeping the Transporter's buffer adequately full. However, a full-length 24/88 song has a much higher average bitrate (>2.5Mbps), and this resulted in rebuffering about every 15 seconds when streamed across from Europe. Other than the unfortunate buffer under-runs, the song sounded excellent subjectively during playback.
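The rebuffering interval is easy to reason about: the buffer drains at the rate by which the playback bitrate exceeds the network throughput. Here's a sketch with hypothetical numbers - I don't know the Transporter's actual buffer size; the 3 MB figure is just an assumption that happens to land near the observed interval:

```python
def seconds_to_underrun(buffer_mbytes, play_mbps, net_mbps):
    """How long until a playback buffer runs dry when the network
    can't keep up with the stream's bitrate."""
    deficit_mbps = play_mbps - net_mbps
    if deficit_mbps <= 0:
        return float("inf")  # network keeps up: no rebuffering
    return buffer_mbytes * 8 / deficit_mbps

# Hypothetical: a 3 MB buffer, the ~2.5 Mbps 24/88 stream, and an
# effective ~1 Mbps transatlantic throughput at that moment:
print(seconds_to_underrun(3, 2.5, 1.0))  # 16.0 seconds between rebuffers
```

Any sustained throughput above the stream's bitrate makes the interval infinite - which is why the short test signals at <1Mbps streamed without a hiccup.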

Is this surprising? Of course not! Like I said last week, computer networking protocols are robust in terms of data error correction. When it comes to TCP/IP, internet, and ethernet data transmission, everything happens asynchronously and in a bit-perfect way. This is what digital is good at - getting data transmitted perfectly with no degradation. And for asynchronous interfaces like ethernet and asynchronous USB where there is bidirectional flow control, there is furthermore no "clock recovery" process as in older digital interfaces like S/PDIF, where jitter can be introduced from the source clock (remember, jitter is a property of the digital source and DAC themselves, not the cable, assuming reasonable cable lengths as far as I can tell). This is why the objective results look fine even though the server is a continent away. Even though the latency was high, as demonstrated by the "paping" data above, it doesn't matter so long as the Transporter's data buffer does not "run dry" - as it did when I tried playing a full song in high resolution, resulting in annoying pauses. Considering that both his and my ISP are usually able to manage their specified speeds (100 Mbps down/10 Mbps up, and 50 down/5 up), it's unfortunate that I wasn't seeing such throughput during the test (maybe because it was ~9:30PM local time and about 6:30AM in Sweden on the first day of the Lunar New Year?!).

There are at least a couple of important implications. First, this test again reiterates the idea (fact) that ethernet cables make no difference. Why bother with expensive wires for the last few feet leading to the player device if there is no evidence that data transfer over >7000 km using standard cables makes any difference during playback? Note that along the way, there have likely been a number of conversions between electrical and optical lines. And even the most expensive ethernet cable will of course not speed up transfers and avoid buffer under-runs.

Second, there's no issue with jitter - "bits are bits" as it applies to asynchronous interfaces, and so long as bit-perfection is achieved and the data transfer rate is good enough to keep the local buffer filled, there's nothing to worry about. This is good as audio consumption gradually transitions to streaming services for many music lovers/audiophiles. The worst that will happen is pauses from connection speed issues, rather than any qualitative difference to the sound during playback - assuming the software isn't somehow messing things up!

(BTW - ever wonder why some audiophiles and reviewers obsess over a few feet of digital cable - especially blaming jitter despite really no evidence to show that it makes a significant difference? Yet these days, with the advent of digital streaming and advertising dollars from places like Tidal, nobody talks about the "potential" for jitter when data is being streamed from miles and miles away? Instead there's just isolated talk about ethernet cables this and that to sell expensive stuff yet no consideration for the truly big picture! Obviously there's something wrong with this whole cable "industry".)

A final thanks to Mnyb for "opening up" his LMS server for me to tap into and helping out with this test! Nice music collection you got there, buddy :-). I managed to stream a couple 16/44 FLAC songs off there with very minimal buffer under-runs (3 rebuffers/song rather than every 15 seconds with the 24/88 tune), and no rebuffering issues at all with 256kbps MP3 which still sounded great. (Don't forget the result of the MP3 test in 2013! High bitrate MP3 sounds excellent and still has a role to play when data speed is restricted or unreliable.)

Enjoy the music as we end off February 2015... Happy lunar new year for all celebrating!

Saturday, 14 February 2015

MEASUREMENTS: Ethernet Cables and Audio...





Remember folks, this is what an "ethernet frame" looks like. It's data. There is no "music" in there until it is decoded and prepared for the DAC! Notice the CRC bytes for error detection. (The numbers represent how many bytes make up each field.)

0. Preamble

Hey folks, over the years, I have been critical of high-end audio cables... Previously, I have shown that RCA analog interconnects can result in measurable differences, with channel crosstalk changes at long lengths. But the digital interconnects themselves do not result in measurable differences, even in terms of jitter (TosLink SPDIF, coaxial SPDIF, or USB). Although my HDMI receiver's DAC isn't as accurate or jitter-free, different HDMI cables don't seem to make any measurable difference either. The only caveat is that a digital cable can just plain fail, in which case the distortion to the audio signal has a particular (annoying) character which is clearly audible and not a subtle change (e.g. the sound of a poor USB cable).

So far, I have not seen any further measurements to suggest my conclusions are inaccurate. I have seen audiophile reviewers and forum posters still claim digital cables make an audible difference, and when questioned they provide lots of words but no actual empirical evidence. It has been a while since I've seen any articles claiming objective evidence from cable measurements - I haven't come across new ads or audiophile articles, although of course I may have missed some.

However, as computer audio expands, there will be opportunities to "brand" more hardware as somehow "audiophile approved" and companies that make audio cables likewise will naturally capitalize on new lines of interconnects / cables... And as expected, cost of these things will be commensurate with "premium" products.

Which brings us to the concept of "audiophile ethernet cables" (see here also, and recent mainstream press exposure of the "madness"). Let me be clear. If I have issues with USB cables or SPDIF cables making any significant contribution to audible sound quality (assuming, again, essentially error-free transmission of data), there is no rational explanation whatsoever that ethernet cables should make any difference. The TCP/IP protocol has error correction mechanisms that allow for worldwide transmission integrity (otherwise Internet financial transactions would have to be banned!), and it is asynchronous, so there is no dependence on exact timing mechanisms (jitter is not an issue given an adequate buffer to reclock and feed the DAC). So long as the "protocol stack" is functioning as it should between the devices, there will not be any issue. Systematic errors causing audible distortion mean either hardware failure or poorly implemented communication software. Therefore, the expectation if we were to test or "listen to" different ethernet cables is that there would be no difference.
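The error-detection layer is easy to demonstrate: an Ethernet frame's check sequence happens to use the same CRC-32 polynomial as Python's zlib.crc32, so we can sketch what the receiving side does with a frame (the payload here is, of course, made up):

```python
import zlib

payload = b"...a frame's worth of audio data on its way to the streamer..."
fcs = zlib.crc32(payload)  # Ethernet's FCS uses this same CRC-32 polynomial

# The receiver recomputes the checksum; a single flipped bit is caught:
corrupted = bytes([payload[0] ^ 0x01]) + payload[1:]
print(zlib.crc32(payload) == fcs)    # True  -- frame accepted
print(zlib.crc32(corrupted) == fcs)  # False -- frame dropped; TCP retransmits
```

A corrupted frame is simply discarded and the data resent - it never reaches the DAC as "degraded music".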

Since I like to make sure objectively, let us at least run a few tests to see if indeed evidence can be found to support the hypothesis.

I. Test Setup

First, we must decide where to place the ethernet cables under test... You see, in any server/streamer system, we expect a few network cables in the data path. For ease of measurement - and borrowing from audiophile beliefs about power cables - let us place the test cables at the last leg of the data path, between the last ethernet switch and the streamer (this guy also thinks the last leg is important). Here then is my setup:

Server PC <--> 6' Cat 6 UTP patch cable <--> 20' Cat-6 Gigabit generic cable (in wall) <--> NETGEAR Nighthawk R7000 router <--> 30' Cat-6A STP Gigabit cable (in wall) <--> 6' Cat 6A STP patch cable <--> Test switch <--> Test cable <--> Logitech Transporter streamer

As you can see above, if we trace the route the data takes between server and streaming device, we're usually looking at quite a bit of cable! In a typical "wired" house, much of the cable runs inside walls and would not be amenable to easy rewiring. Since I did some renovations last year, I made sure to run high quality Cat 6A STP from the router to the sound/media room. I am going to test not just a few cables, but also a different ethernet switch! Here are some details:

Server PC: AMD A10-5800K quad core PC, stock 3.8GHz speed, running Windows Server 2012 R2, Logitech Media Server 7.9.0 build 1420794485 [Jan 12, 2015], 16GB DDR3 RAM, built-in Realtek PCIe ASUS motherboard gigabit ethernet interface.

NETGEAR Nighthawk R7000 router: running dd-wrt.com custom firmware "kongac build 24345". Very stable with >100 days uptime currently; underclocked to 800MHz just because I never needed the 1GHz speed.

Streamer/DAC device is the venerable Logitech Transporter. Remember that the Transporter only runs at 100Mbps whereas the rest of the system is capable of gigabit (1000Mbps) speeds.

The "Test switches": for the most part, I will use the inexpensive gigabit TP-LINK TL-SG1008D which I bought at a local computer store slightly more than a year ago (<$30). It's got 8 ports and fast enough for 100MB/sec (that's 100 megabytes/sec) file transfer through it from server to my HTPC:

The white thing underneath is just my LP carbon fibre brush to lift it a little to photograph the front easier. Ahem... Pardon the dust... :-)

In comparison, for a couple of the tests I will use this little guy:



A LinkPro SOHOHub 4-port 10/100Mbps switch which I believe is about 10 years old (the TC3097-8 interface controller inside came out around 1998). I found it in the attic of the house I bought, powered by a Canon AC adaptor which provided adequate juice.

For both these switches, I will keep my HTPC computer connected to one of the other ports.

The "Test cables":



So, I rummaged around my pile of computer parts and found these cables to test. Note that I was shopping at NCIX.com when I was doing some renovations and getting my network system up. I "standardized" on some rather inexpensive Cat 6A cables on sale there - hence the nGear brand which they carried.

The top picture, from the left: we have a 1-foot length of Cat 6A STP (<$3.00) - presumably the "best" cable given the short length and excellent shielding. Note that Shielded Twisted Pair (STP) cables are not necessarily better than UTP (Unshielded...); one must make sure the shield is properly connected at each end. Next we have presumably the "worst" cable of the bunch - a generic "freebie" 3-foot length of Cat 5E UTP patch cable that has been sitting in my pile of parts for the last 5 years. The blue plastic jacket is loose and the build so flimsy that I could probably pull it apart without much effort. Then we have a 10-foot length of Cat 6A (<$6.00), and finally a much longer 50-foot length of Cat 6A STP (~$15.00). Cables of the same brand will allow us to see if length makes a difference.

The green cable in the lower picture was one I found in my office. It's a 20-foot generic Cat 6 UTP cable that has been in daily use for the last 12 years... I guess you can call it "burned in"!

Sorry folks, I don't have any Cat 7 cables here. At this point, I don't see any reason to use them since I'm only running a 1 Gbps network. Anyone out there running a 10 Gbps network at home requiring Cat 7 cables? Realize that even Cat 6 is potentially capable of 10 Gbps up to about 55m (~180 feet) under favorable conditions.

I will measure with RightMark (newest 6.4.1) to look at the usual dynamic range, noise floor, and distortion, along with the Dunn J-Test signal to see if there's any evidence of jitter anomaly in the Transporter's RCA DAC output (rather than the XLR, for the sake of convenience). Some well-shielded 6' RadioShack interconnects were used (Cable C here). As usual, my E-MU 0404USB device was used for the measurements. All measurements were done in 24/96 (high resolution), or 24/48 for the jitter test.

Let the measurements begin...

II. RightMark Audio Analyzer (24/96)

Here's the summary table of all results with 5 cables with the TP-LINK gigabit switch and 2 other measurements with the old 100Mbps LinkPro switch:

As you can see, there are no significant differences in the audio output at all. Analogue output was measured all the way to 48kHz - well beyond the audible spectrum. It didn't matter whether the cable was 1 foot or 50 feet long. Likewise, Cat 5E, Cat 6, Cat 6A, UTP or STP made no difference whatsoever. In 2 of the tests (50' Cat 6A & 3' Cat 5E + LinkPro), I was concurrently playing a 20Mbps 1080P MKV video on the HTPC connected to the switch to increase the data rate coming from the server - no difference in background noise or anything else.

A few graphs from which the calculated data were derived:
Frequency Response: Exact overlay.
Noise level: slight 60Hz hum measured down at -115dB, everything else even further below this.

IMD: Again, essentially perfect overlay with the different cables.
Stereo Crosstalk: Would be very bizarre to see any anomaly here!

III. J-Test (24-bit)

Instead of showing 7 individual J-Test graphs, I decided to overlay each one to create a composite image:


As you can see, there is some normal variability in the noise floor around the 12kHz primary frequency but otherwise, nothing sticks out. There's some low-level jitter around 12kHz, some of which I'm sure is related to the E-MU device itself rather than just the Transporter.

No evidence that any of the cable / switch changes resulted in any anomaly using the 24-bit Dunn jitter test. None of the sidebands exceeded -110dB relative to the primary frequency peak at 12kHz. Note that the peak itself is at -3dBFS, but I measured it a bit lower to avoid the use of the E-MU's input amplifier, which would add some noise. Again, no change was observed (i.e. no worsening of the noise floor or stimulated jitter sidebands) even when the HTPC was concurrently streaming a 20Mbps movie from the server.
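
For readers curious what the test stimulus actually looks like, here's a minimal sketch of the 24-bit Dunn J-Test signal described above (assumptions on my part: a 48 kHz sample rate as used for the 24/48 measurements, with samples scaled to ±1.0 full scale). It's a sine at Fs/4 (12 kHz) at -3 dBFS plus a square wave at Fs/192 (250 Hz) toggling only the least significant bit; sampling-clock jitter in the DAC then shows up as sidebands around the 12 kHz primary tone in an FFT of the analogue output:

```python
import numpy as np

# Dunn J-Test stimulus (24-bit variant), assuming Fs = 48 kHz:
#   - primary tone: sine at Fs/4 = 12 kHz, -3 dBFS
#   - jitter-stimulating component: square wave at Fs/192 = 250 Hz,
#     with an amplitude of exactly one least significant bit

FS = 48000                      # sample rate in Hz
BITS = 24                       # bit depth
N = FS                          # generate one second of signal

n = np.arange(N)
primary = 10 ** (-3 / 20) * np.sin(2 * np.pi * (FS / 4) * n / FS)  # -3 dBFS sine
lsb = 1.0 / 2 ** (BITS - 1)                                        # one LSB in full-scale units
square = lsb * np.where((n // 96) % 2 == 0, 1.0, -1.0)             # 250 Hz square (period = 192 samples)

jtest = primary + square        # quantize to 24-bit integers before playback
```

The LSB-level square wave deliberately exercises the data-dependent jitter mechanisms of an S/PDIF-style interface; a clean DAC output shows only the 12 kHz spike above the noise floor.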

IV. Summary / Conclusion

I believe that if there indeed is an ethernet audio device that "sounds different" because of different cables being used, then that device should be returned because it is obviously defective. Remember folks, it is like accepting that the earth is spherical or that 2+2=4 - because that's just the way it is. Ethernet communication is an engineered system; its parameters and capabilities are not only understood but designed to be the way they are by humans! You really cannot claim to have "discovered" some combination of dielectric or conductor or geometry that works "better" within an already errorless digital system unless you're claiming improved performance outside the technical recommendations (for Cat 6 in gigabit networks, that's runs up to 100m / 328 feet within a reasonable ambient electrical noise environment).
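
In fact, the "errorless" claim is directly verifiable at home: hash a music file on the server, then hash it again on the client after it has crossed the cable and switch in question. Matching digests mean every single bit arrived intact, leaving nothing for a "better" cable to improve. A sketch (the file paths are hypothetical examples):

```python
import hashlib

def sha256_of(path, chunk_size=1 << 20):
    """Return the SHA-256 hex digest of a file, read in 1 MB chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical paths: the original on the server vs. a copy pulled
# across the network cable under test.
# if sha256_of("server/track.flac") == sha256_of("client/track.flac"):
#     print("Bit-perfect transfer - the cable delivered every bit intact.")
```

This is essentially what the TCP checksum and ethernet frame CRC already do continuously, packet by packet; the hash comparison just makes the end-to-end result visible.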

It's also worth remembering that audio data bitrates are quite low. Today, I hope nobody is running anything slower than 100Mbps "fast ethernet". Although my music is generally streamed out as compressed FLAC, even if you stream uncompressed WAV files, standard stereo 16/44 CD-quality audio requires <1.5Mbps, 24/96 requires ~4.6Mbps, and stereo 24/192 ~9.2Mbps. Even if we went uncompressed multichannel, 5.1 24/96 would only use up <14Mbps. Considering how cheap gigabit (1000Mbps) networks are, there's no reason not to build upon the gigabit standard these days. There's generally no reason to complain about decent Cat 5E cabling, but splurging a little on Cat 6+ isn't a big deal. The Transporter device used in these tests is almost 10 years old at this point and limited to 100Mbps. I would certainly be surprised and disappointed if a modern audio streaming device measured differently with various cables these days, given even faster ethernet interface hardware!
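
The arithmetic behind those figures is simple enough to check yourself - raw PCM bitrate is just sample rate × bit depth × number of channels (ignoring any container or network overhead):

```python
# Raw uncompressed PCM bitrate = sample rate x bit depth x channels.
# Quick sanity check of the Mbps figures quoted above.

def pcm_mbps(sample_rate, bits, channels):
    """Raw PCM bitrate in megabits per second (1 Mbps = 1,000,000 bits/s)."""
    return sample_rate * bits * channels / 1_000_000

print(pcm_mbps(44100, 16, 2))   # CD-quality stereo -> 1.4112 Mbps
print(pcm_mbps(96000, 24, 2))   # 24/96 stereo      -> 4.608 Mbps
print(pcm_mbps(192000, 24, 2))  # 24/192 stereo     -> 9.216 Mbps
print(pcm_mbps(96000, 24, 6))   # 5.1 at 24/96      -> 13.824 Mbps
```

Even the heaviest case here uses under 1.5% of a gigabit link - and FLAC compression cuts the real-world numbers further still.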

Ultimately, I'm not suggesting anyone use the cheapest ethernet cable he/she can find. If you like the esthetics and build construction, go for it! Just realize that it's essentially impossible to argue that a functioning (free of data transmission error) ethernet cable will "sound" any different, or is worth a significant cost premium based on sonic quality. The idea of specialized "audiophile" ethernet cables (or "ethernet switches" for that matter) is plain nonsense.

For the record, subjectively, I have never heard a difference between ethernet cables on my system. For fun I did have a listen to Shelby Lynne's Just A Little Lovin' (2012 Analogue Productions SACD ripped to 24/88) - sounded great to me even with the Cat 5E freebie cable and cheap LinkPro switch while a 20Mbps movie was playing off my HTPC. I have never tried those expensive cables from AudioQuest or Chord, but seriously, why bother when there's no logical rationale based on our understanding of an engineered product, and no empirical evidence? Must a person try out or demo every claim made or testimonial uttered when some things are self-evident? Must I double check when someone comes up to me and tells me the world is flat or the sun rises in the west? Should I also try Snake Oil if someone in a crowd around the traveling salesman yelled out that it "Works for me!" without any other evidence?

Well, it looks like Chord got their hands slapped in November 2014 for claims about sound quality in their ethernet cable ads, which were determined to be "misleading advertising", lacking "substantiation", and guilty of "exaggeration". Bravo to the UK's Advertising Standards Authority. Truth is important.

Bottom line: There's no evidence that any of the digital cables make an audible difference be it TosLink, coaxial, USB, or now ethernet within an error-free system.**

As usual, if anyone feels that I am in error, please demonstrate and leave a link to the evidence.

Okay... Now I really have to go do some real work :-). Enjoy the music!

-----------

** I was reminded of this post I made a while back using the Squeezebox Touch and EDO plug-in. In it, I was able to demonstrate measurable differences using a cheap, unshielded 3' "zip-cord" RCA cable instead of a proper coaxial cable (I'm sure it's nothing close to the 75-ohm impedance spec). It's a reminder that we should of course be using *proper* cabling, and that extreme situations like the one in that post can demonstrate noise phenomena that otherwise would be highly unlikely. Notice also how this poor RCA cable degraded sound quality when pushed to 24/192, which is outside the usual Squeezebox Touch specification but available thanks to the Triode plug-in.