Saturday, 18 February 2017

MEASUREMENTS: Roon 1.2 (with Intel NUC 6i5SYH)

NOTE: I know that just a couple of weeks ago, version 1.3 of Roon was released, and I'll perhaps look into that a bit later once some of the initial bugs are stamped out and the system matures a little. Clearly 1.3 has a few interesting new features, but I trust the basic bit-perfect playback function, which is what I'm aiming to explore in this article, remains the same.

Over the last few years, I have gradually made my way across the different audio ecosystems that most audiophiles find themselves interested in. Starting with the venerable Squeezebox / Logitech Media Server system (eg. Touch, Transporter), to standard Windows PC and Mac OS X playback, to questionable software (like JPLAY and the JPLAY update), to equally questionable OS tweaks (eg. Fidelizer), to more recently looking at DLNA streaming using low-power devices like the Raspberry Pi 3 and ODROID-C2.

For today and the next few weeks (let's see how this goes), let's spend some time on Roon, another computer audio system much lauded in the audiophile press, and see if we can make a few measurements and comment on some observations and thoughts...

Saturday, 11 February 2017

MUSINGS: Discussion on the MQA filter (and filters in general)... [Update: Including a look at the classic "Apodizing" filter.]

Here's an interesting comment from the post last week...

Excellent article but I have one query. On Sound on Sound they say "MQA claim that the total impulse-response duration is reduced to about 50µs (from around 500µs for a standard 24/192 system), and that the leading-edge uncertainty of transients comes down to just 4µs (from roughly 250µs in a 24/192 system)." In that case wouldn't you need an ADC with higher resolution than the RME Fireface 802 in order to see any real differences between the Reference and Hardware MQA decode?
As I said... Dammit CBee! Now you've made me post another blog entry on MQA :-).

Saturday, 4 February 2017

COMPARISON: Hardware-Decoded MQA (using Mytek Brooklyn DAC)

As promised in my last blog post about software MQA decoding, I have been wanting to have a peek at and listen to hardware decoding. Due to the proprietary nature of the MQA firmware, as well as the fact that we don't have access to MQA encoder software, there is only so much we can do to explore the effect on an audio signal. Ideally, encoding a test signal and then decoding it would be the best way to explore the effects and limits of how this all works...

I don't have an MQA-capable DAC myself (and honestly owning one is not high on my list of priorities), but a friend does happen to have the Mytek Brooklyn, which is fully MQA-native and can decode all the way to 24/384. Furthermore, he has the use of a professional ADC of fantastic quality, the RME Fireface, to make some recordings of the output from the DAC.

Image from Mytek. Obviously very capable DAC!
With the combination of the excellent DAC and ADC, we should be able to examine the output and make some comparisons. The main questions being:

1. Can we show that hardware-decoded MQA gets closer to the original signal than the 88/96kHz decoding already done in software?

2. Can we compare the hardware decoder with the output from the software decoder? How much difference is there between the two?

Saturday, 28 January 2017

QUICKIE POST: Yes, cables matter... (for HDMI 2.0/4K/60fps/HDR...)

Well, of course cables matter! Without them there would be no sound or image! The key, of course, is to be wise enough to understand what is actually needed, and how much difference the cables/wires can make.

Just a reminder, if you have not seen it yet, here is my summary of tests and thoughts on audio cables over the years if you're wondering about that.

As for today, given that time is limited this week, I just wanted to put up a "quickie" post mainly about some 4K HDMI cables I tried over the last couple months in setting up my 4K TV (a Vizio P75-C1).

Saturday, 21 January 2017

MUSINGS: On the ongoing push for high resolution audio... and the virtualization of media.

Thanks, Sony, for this BS diagram... Except for the most primitive non-oversampling (NOS) DACs out there, modern DACs use reconstruction filters and of course do not output stair-stepped analogue waves like these. Sadly, these kinds of diagrams often become the marketing material used to push hi-res! Can good outcomes result from marketing with dubious claims? (Didn't seem to help Pono, did it?)
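To illustrate why the stair-step picture is misleading, here's a small sketch (pure Python, arbitrary numbers chosen by me for illustration) of Whittaker-Shannon sinc interpolation, the idealized form of what a DAC's reconstruction filter does. Evaluated between samples, the reconstruction lands on the smooth sine wave, while a zero-order "stair-step" hold does not:

```python
import math

fs = 8000.0   # sample rate in Hz (arbitrary for illustration)
f  = 440.0    # test tone, well below Nyquist
N  = 200      # number of samples

samples = [math.sin(2 * math.pi * f * n / fs) for n in range(N)]

def sinc(x):
    return 1.0 if x == 0 else math.sin(math.pi * x) / (math.pi * x)

def reconstruct(t):
    # Whittaker-Shannon interpolation: sum of sinc kernels weighted by samples
    return sum(s * sinc(t * fs - n) for n, s in enumerate(samples))

# Evaluate halfway between samples 100 and 101, far from the truncation edges
t_mid = 100.5 / fs
ideal = math.sin(2 * math.pi * f * t_mid)  # the true continuous waveform
stair = samples[100]                       # what a stair-step hold would output
recon = reconstruct(t_mid)

print(abs(recon - ideal), abs(stair - ideal))
```

The sinc-reconstructed value sits within a fraction of a percent of the true waveform (limited only by truncating the sum to 200 samples), whereas simply holding the previous sample misses it badly. Real DAC filters are finite approximations of this, but the point stands: no stairs.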
I didn't have much time this week for any experiments/measurements. Alas, will be busy for a little while still. Which means it's a good time to put up a "MUSINGS" post based on comments and questions. Always fun to take some time to think and hopefully flesh out ideas a little more. In the post last week, I found these interesting comments worth spending some time on:

Saturday, 14 January 2017

COMPARISON: TIDAL / MQA stream & high-resolution downloads; impressions & thoughts...

As I mentioned last week, and I'm sure you've seen all over the audiophile news, TIDAL has started streaming MQA audio and has embedded a software decoder into the Windows/Mac desktop player. It will basically take a 24-bit 44kHz or 48kHz stream that's encoded by MQA and spit out an 88kHz or 96kHz data stream to send to your DAC, whether the internal one in your laptop or a fancy external DAC, with options for "Exclusive" mode which allows switching to the appropriate samplerate (I know this works well on the PC; I have not tried the Mac).

If you're not using "Exclusive" mode, you can tick "Force volume" to set playback to 100% volume so the internal mixer/dither routine hopefully doesn't mess with the data. "Passthrough MQA" should be ticked only if you have an MQA-enabled DAC or want to purposely hear MQA undecoded (I'll say it now that this is not recommended). My assumption is that if you do have one of these MQA DACs and passthrough is on, you should either make sure "Exclusive" mode is ticked or, if not, manually make sure the OS samplerate is correct (ie. 44kHz or 48kHz at 24-bit depth) and that the volume is 100% (either with "Force volume" or the computer volume slider). Otherwise, the stream will not be "bit-perfect" and the DAC will not recognize the MQA encoding. I suspect this could be confusing for some.
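The reason bit-perfection matters here is that MQA's signaling is buried in the least-significant bits of the stream, so any digital volume scaling other than 100% rewrites those bits. A quick sketch (simulated 24-bit samples, not actual MQA data) of why even a 99% volume setting breaks things:

```python
# Simulate what a digital volume control does to 24-bit PCM words:
# scale and re-quantize. Any gain other than 1.0 alters the low-order
# bits, which is where MQA's embedded signaling would live.
import random

random.seed(1)
FULL_SCALE = 2**23  # 24-bit signed PCM range

samples = [random.randint(-FULL_SCALE, FULL_SCALE - 1) for _ in range(1000)]

def apply_volume(pcm, gain):
    # scale, round back to integer, clamp to the 24-bit range
    return [max(-FULL_SCALE, min(FULL_SCALE - 1, round(s * gain)))
            for s in pcm]

unity   = apply_volume(samples, 1.0)    # 100% volume: untouched
reduced = apply_volume(samples, 0.99)   # 99% volume: re-quantized

lsb_changed = sum((a & 1) != (b & 1) for a, b in zip(samples, reduced))
print(unity == samples, reduced == samples, lsb_changed)
```

At unity gain every word survives intact; at 99% nearly every word (and a large fraction of the LSBs) changes, so the DAC would see ordinary 24/44 or 24/48 PCM rather than an MQA stream.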

Saturday, 7 January 2017

MEASUREMENTS: Raspberry Pi 3 as USB Audio Streamer (with recommended CRAAP config & TIDAL/MQA arrives)

A few weeks ago, I got this question from Josh Xaborus in my previous post on the Raspberry Pi 3 + HiFiBerry DAC+ Pro measurements:
Have you measured the USB output from the RPI3 to a USB DAC to see if it's "clean" like the ODROID?
Good question Josh, and perceptive as well. I had not posted anything on the Raspberry Pi 3 specifically on whether there was any difference to be found streaming to the same USB DAC as compared to, say, the ODROID-C2. Let's have a good look at this and see if we can arrive at some facts and come to some conclusions...

Sunday, 1 January 2017

MUSINGS: On the Digital Music Collection, Metadata Tagging, and Hygiene...

Happy New Year everyone! I hope the holidays went well and you're ready to take on 2017.

Today, I thought I'd spend some time talking about something extremely important and one which I'm surprised I don't hear more about looking around forums and audiophile watering holes... It's the topic of how one creates a collection of music. The difference between a collection and plain hoarding is of course the discipline of organization involved in the collector's hobby. The collector knows what he/she has. The collector has mastery over the collection. Though responses may vary, I believe a friend or even a complete stranger would appreciate the time and dedication a collector has put into achieving this mastery, as opposed to the sense of revulsion one feels when faced with the hoarder (honestly the feeling I get when looking at this "collection").

Over the years, I have seen a handful of articles like this one which also introduces one to Picard, the free MusicBrainz software that will do the job in an automated fashion. If I were to start a new collection fresh today, I'd probably do something like this and grow from there, adding customizations and checking accuracy along the way. However, I have been collecting CDs since the 1980's, and over the years, especially after 2004, I have migrated all the "physical" music over to my music server. Although I have gone through multiple hardware servers, the data from the music collection really has not changed: it has essentially been "rip once" into a lossless format, with the CDs packed up in storage thereafter. As the years go on, I suppose every collector develops a unique way to archive the albums, manage the directory structure, and tag the files in a fashion that "works" for oneself.

Let's start the new year with a look at one way to manage the music collection (my idiosyncratic way :-). It has served me well and maybe some of what I do will resonate with you as well...

Monday, 26 December 2016

QUICK COMPARE: AVC vs. HEVC, 8-bit vs. 10-bit Video Encoding

As I mentioned in the last blog post on HEVC encoding in response to "Unknown" in the comments, I do believe there are potential subtle benefits to the use of the 10-bit x265 encoder in Handbrake even with an 8-bit video source. I figured I'd run a very quick test to show what I've seen...
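The intuition behind the benefit, greatly simplified and not a description of x265 internals: once the encoder's internal processing produces values that fall between 8-bit steps, a 10-bit pipeline can preserve them instead of rounding them away, which shows up as less banding on shallow gradients. A toy sketch of that quantization effect:

```python
# Quantize a very shallow brightness ramp (normalized 0..1) at 8 vs 10 bits.
# Purely illustrative of bit-depth rounding, not of x265's actual pipeline.
def quantize(x, bits):
    levels = (1 << bits) - 1
    return round(x * levels) / levels

ramp = [0.5 + 0.02 * i / 999 for i in range(1000)]  # a subtle 2% gradient

bands_8  = len({quantize(v, 8)  for v in ramp})  # distinct output levels (bands)
bands_10 = len({quantize(v, 10) for v in ramp})

err_8  = max(abs(quantize(v, 8)  - v) for v in ramp)  # worst rounding error
err_10 = max(abs(quantize(v, 10) - v) for v in ramp)

print(bands_8, bands_10, err_8, err_10)
```

The 8-bit version collapses the gradient into a handful of visible bands with roughly 4x the rounding error of the 10-bit version; the same arithmetic is why a 10-bit encode of an 8-bit source can still look smoother after the codec's internal transforms.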

Friday, 23 December 2016

MUSINGS: End of 2016 - Video Encoding (HEVC 10-bits, the HDR "Trinity"), Multichannel Streaming, and Other Thoughts...

For those of you into video, I suspect you're already very excited about the "next generation" H.265/HEVC encoding format. About a year back, I made mention of the impressive results I was seeing with playback of HEVC on the Skylake HTPC I was putting together. A year down the road, we see the ongoing development of software harnessing the power of the new encoding technique - even lower bitrates for very high-quality output.

As we say goodbye to 2016, I thought I'd just "shoot the breeze" a bit and meander down some related topics. Let's talk about video encoding, what I've been doing, what I've found useful/interesting, and some speculation of what I think would be in the not too distant future as it applies to high dynamic range (HDR) video...

Monday, 12 December 2016

MEASUREMENTS: Yamaha RX-V781 Receiver (a look at the pre-out quality)

With the upgrade recently to a 4K TV, it was alas also time to upgrade the surround receiver system I was using. A number of years ago, I bought a used Onkyo TX-NR1009 which I wrote about. It has served me reasonably well over the last few years but not without some issues. The most bothersome was the fact that the HDMI board died last year and it had to be sent back to Onkyo for a board replacement. Thankfully, despite the machine being released around 2011, Onkyo still honored the repair as apparently this is a common problem acknowledged by the company. Not good that the product was defective due to an engineering oversight (soldering & overheating issues), but at least the company "manned up" to the problem.

These days, to make full use of the surround sound system connected to a full-featured 4K TV, the best way is to upgrade the receiver to be compatible with the latest HDMI 2.0 specification with allowance for 4K/UHD @ 60Hz, HDCP 2.2 copy protection compatibility, as well as passing through the full video signal - including full color information (Rec. 2020/BT.2020 and HDR). (For those new to this kind of AV talk, you might want to review the 4K tech article from a few months back.)
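Some back-of-envelope arithmetic shows why HDMI 1.4 doesn't cut it. Assuming the standard CTA-861 4K/60 timing (4400x2250 total pixels including blanking, ie. a 594 MHz pixel clock) and HDMI 2.0's 18 Gbps TMDS rate (about 14.4 Gbps of actual video data after 8b/10b encoding):

```python
# Bandwidth needed for 4K/60 video at various pixel formats over HDMI 2.0.
# Timing figures assume the standard CTA-861 4K/60 mode with blanking.
total_h, total_v, fps = 4400, 2250, 60
pixel_clock = total_h * total_v * fps        # 594,000,000 Hz

def gbps(bits_per_pixel):
    return pixel_clock * bits_per_pixel / 1e9

rgb_8bit  = gbps(24)   # 8-bit RGB / 4:4:4
rgb_10bit = gbps(30)   # 10-bit 4:4:4 (HDR at full chroma)
yuv420_10 = gbps(15)   # 10-bit 4:2:0 (how HDR10 fits at 60Hz)

HDMI20_LIMIT = 14.4    # Gbps of video data (18 Gbps TMDS less 8b/10b overhead)
print(rgb_8bit, rgb_10bit, yuv420_10)
```

8-bit 4K/60 at about 14.3 Gbps just squeaks under the HDMI 2.0 limit, 10-bit 4:4:4 at about 17.8 Gbps does not fit at all, and 10-bit 4:2:0 at about 8.9 Gbps is how HDR gets delivered at 60Hz; HDMI 1.4's roughly 8.2 Gbps of video data can't even manage the first case, hence the upgrade.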

Unfortunately, the Onkyo was only good to HDMI 1.4. It was time to upgrade to one of the new receivers... And this is what I found on sale locally:

It's a new model year 2016 Yamaha RX-V781 (current price ~US$700).