Friday 7 March 2014

MEASUREMENTS: Does "Burn-In" / "Break-In" Happen for Audio DACs?

Jazz at Lincoln Center Orchestra with Wynton Marsalis did a great job last Saturday night at the Chan Centre here in Vancouver. A friend once told me "Jazz should be seen as much as heard!" I think he's right. It's always great to watch artistry in the making and correlate the sounds heard with how they were made. It's also a good opportunity to check out the acoustics of a moderate-sized venue with minimal amplification of a 15-piece jazz band. It puts into perspective the dynamics and detail of what one hears on the home system.

Some writers seem to idealize live performances, but IMO, more often than not, what I hear on the home system is clearly better - cleaner, more defined, and often much more enjoyable, assuming the recording was a good one. The best performance and the best seat in the house every night! Of course, a studio recording could never (and is not supposed to) capture the live ambiance or variation that a live performance can provide. That too is often a good thing as I reminisce about some concerts I've spent sitting beside rather annoying concert-goers. :-)

-----



I realized something the other night after measuring the Belkin PureAV PF60: I have measured my TEAC UD-501 DAC numerous times throughout its lifetime with me. I bought it back in early May 2013, and it has been in use since then for measurements, for headphone listening in the evenings, and as part of my media room. I've run all manner of signals through it, from standard PCM to high-resolution PCM to DSD64/128. It has been on for days playing "background" tunes as well as for much more attentive "serious" listening through numerous albums spanning genres from jazz, to pop, to blues, to hard rock, to classical...

As of this writing in early March, I've had this DAC for 10+ months. I'll very conservatively estimate that I've put >300 hours of actual audio through it (not just time turned on). When I first bought it, I made sure to measure it within the first couple hours of use so that one day (now), I could go back and do a comparison to see if any kind of significant "break-in" can be demonstrated.

Note that the test set-up isn't exactly the same... components like the playback computer and the RCA cable are different, and I'm in a different house as well. But the setup is comparable:

Win 8 PC --> shielded USB --> TEAC UD-501 --> shielded RCA --> E-MU 0404USB --> shielded USB --> Win7/8 measurement PC

Results:

So, without further ado, here are the measurements at 3 time points - within 2 hours of use, around 200 hours last year around the time I moved house, and just about 2 weeks ago with about 300 hours of use...
[Graphs: measurement Summary, Frequency Response, Noise Level, THD, and IMD overlaid at the three time points.]

I figure there's no point measuring at lower resolution than 24/96 for something like this. You will note that the stereo crosstalk has a 4dB spread from highest to lowest (remember, we're talking down at -90dB here). The reason is simple: these are different RCA cables. At "<2 hours" and "~200 hours", I was using a 3' length of RCA cable, whereas the ">300 hours" measurement was done with a 6' cable due to the inconvenience of a short cable in the current set-up. As I showed in the RCA analogue interconnect test, cable length makes a significant difference in stereo crosstalk measurements (shorter is better) for the standard zip-cord type I'm using. This could also be the reason for the slightly lower noise floor, especially notable on the THD and IMD graphs above (again, remember we're looking at the -120dB level here!). Otherwise, I see no evidence here of a significant change that would be audible.
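
For readers curious how a number like THD gets pulled out of a loopback capture like the ones above, here's a minimal sketch of the idea in Python (my own illustration, not the actual measurement software used for these graphs; it assumes a mono float recording of a 1kHz test tone):

    import numpy as np

    def thd_db(x, fs, f0=1000.0, n_harmonics=5):
        # Estimate THD in dB from a captured sine: compare the power in the
        # harmonics (2*f0, 3*f0, ...) against the power at the fundamental.
        w = np.hanning(len(x))                   # window to reduce spectral leakage
        spec = np.abs(np.fft.rfft(x * w)) ** 2   # power spectrum
        freqs = np.fft.rfftfreq(len(x), 1.0 / fs)

        def peak_power(f):
            # Sum a few bins around the target frequency to capture the whole peak
            i = int(np.argmin(np.abs(freqs - f)))
            return float(np.sum(spec[max(i - 3, 0):i + 4]))

        fundamental = peak_power(f0)
        harmonics = sum(peak_power(k * f0) for k in range(2, n_harmonics + 2))
        return 10.0 * np.log10(harmonics / fundamental)

    # Sanity check: a 1kHz tone with a 3rd harmonic at -90dB should read ~ -90dB
    fs = 96000
    t = np.arange(2 * fs) / fs
    x = np.sin(2 * np.pi * 1000 * t) + 10 ** (-90 / 20) * np.sin(2 * np.pi * 3000 * t)
    print(round(thd_db(x, fs), 1))

Real measurement suites of course do much more (noise weighting, twin-tone IMD, crosstalk, etc.), but the principle is the same - everything is read off an FFT, which is why differences way down at -120dB show up in the graphs long before they could ever be audible.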

Conclusion:

These measurements suggest that there really is no such thing as audible "break-in" for purely electronic devices like DACs within a reasonable period of time. People hearing the "effect" on a regular basis are more likely changing their psychological expectations over time than hearing the device actually change sonic character. I suppose change could be happening within the first 2 hours, but one almost never hears people claim this; for the most part, people recommend something like 100+ hours. Logically, if anything, electronic devices deteriorate over time as components (like capacitors) age and connectors oxidize.

Sonic change for mechanical devices like speakers would make much more sense... InnerFidelity had an article about the change in the sound of the AKG Q701 headphones over time (65 hours). The measured differences in those graphs are much larger than anything I'm demonstrating here.

In summary... I wouldn't worry about "burn-in" with purely electronic devices. If you like the sound at the start, great. If not, maybe give it some time for your ears/brain to adjust and see if you like it then. Sure, keep the device on, blast some hard rock, or play a burn-in CD if you feel this helps.

If there's no difference with a complex electronic device like the DAC, it'd be quite unreasonable to expect to hear a difference with totally passive "components" (eg. wire/cable burn-in). I find it suspicious that some companies like this one claim cables need 400-500 hours (17+ days straight!) to break in! The more cynical side of me wonders if companies benefit from this because it gives them a "grace period" to tell customers to wait. Furthermore, the message itself promotes expectation bias towards improvement over time... "No worries! Give it some time to really sound its best, Mr. Audiophile!"

Subjectively, I cannot say I've ever thought I could hear burn-in. The TEAC sounded good to me from the start, and I'd be foolish to claim with certainty any difference at this point, almost a year down the road, unless of course there were some kind of night-and-day change (which there obviously hasn't been).

An observation - why is it that "burn-in" essentially always results in a reportedly better / smoother / less harsh sound? How does the component "know" that it should go in the right direction? It's not as if an electronic component functions like the cells of the body, where a homeostatic mechanism directs the 'healing' towards optimal functioning... Unless, of course, we are dealing with a biological mechanism - the ears & brain. ;-)

Perhaps the standard measurements I present here are unable to capture whatever change there's supposed to be. As usual, I invite those who are certain that burn-in happens to present links to information or data that support this belief.

---------

Listening tonight:
Sonny Rollins' Way Out West - just got reacquainted with this old 1957 jazz recording. A fantastic vintage recording from the golden age of analogue, done with the tube Ampex 350 tape recorder. This was chosen as the first CD release by Mobile Fidelity back in the mid-1980's (I think 1984). There have been many reissues of this over the years, and I think the highest-resolution one would be the Analogue Productions SACD from 2002.

Enjoy the music everyone...

10 comments:

  1. The only conclusion that I can draw based on the 3 overlaid plots is that over time either the power supply (regulation?) got slightly worse OR there is more 'leakage' of mains frequencies into the system.
    The lack of more measurements over time also isn't helping here to draw real conclusions about this.
    The mains components (60Hz and 120Hz) seem to get 'worse' as time progressed.

    Also, the general noise floor seems to have increased ever so slightly (5dB!) after 200hrs, but I reckon that is because of different circumstances around the measurement setup rather than components breaking down (not improving, as the myth wants us to believe).

    I agree 100% about the burn-in issue... IMO it is more of a myth than reality.
    Components that do alter over time are tubes (valves), rechargeable batteries, and capacitors, to name a few obvious ones.

    The degradation of capacitors depends on a LOT of factors such as nominal voltage, currents, frequency, size, the materials/electrolytes used, and temperature, to name a few of the bigger ones.

    Most argue that it is these parts (capacitors) that 'add' the most to the 'break-in' effect, but there are those who claim all parts do.

    To me it's a myth more than anything, with some small truth in it.
    The audibility of it is what I would question most.
    As is the repeatability of measurements at these levels, which will show very small differences.
    Those advertising 'burn-in' will gladly use the presented evidence as something that changed and that THEY can easily hear, rather than recognizing that these small differences are due to errors at the 'bottom' of the gear's capabilities (noise etc.).
    They will also gladly point to the 'numbers' (we non-subjectivists love so much) and say... you can see the frequency response got better over time and THIS is what I can hear!

    Good to see many of the myths are 'tackled' here though.

    1. Yup, indeed it does look like there's a trend with the mains leakage using these 3 measurements... I'll have to see if I have other intermediate comparable measurements along the way. In any event, the effect is really small :-).

  2. http://www.psaudio.com/products/audio/media-players/perfectwave-directstream-dac/

    There is a video with the lead designer's presentation; he mentioned jitter, the importance of the clock module/crystals, the PCB... and the audience kept asking about jitter.

    Even after the video, I don't think I understand why upsampling to 10x DSD makes CDs sound better.

    But I suppose this is the state-of-the-art DAC for 2014.

    1. I think there's something about it upsampling to 10x DSD, which then gets downsampled back to DSD128 (2x) before the actual analogue conversion...
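
      Just to put numbers on those rates (my own arithmetic, assuming "10x DSD" means ten times the base DSD64 rate - not a claim about the actual design):

          # Base 1-bit DSD64 rate is 64 x the CD sampling rate
          DSD64_HZ = 64 * 44100      # 2,822,400 Hz
          print(10 * DSD64_HZ)       # "10x DSD" internal rate: 28,224,000 Hz
          print(2 * DSD64_HZ)        # DSD128 (2x) output rate: 5,644,800 Hz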

      Haven't heard it myself so can't comment on how or why this should sound better/different.

    2. Your postings have been very educational about (measuring) jitter, which can give us an objective opinion about one parameter of a piece of electronic equipment.

      The good news is that the current generation of DACs, say under $1000, performs very, very well.

      All of us are in pursuit of good sound, high fidelity, to varying degrees. It is sort of funny that when I read the older reviews (easy to do on the Internet, no more paper magazines), the excellent piece of hardware became not so excellent 2 or 3 years later.

      Given that the DAC chip is a pretty advanced piece of technology, different system implementations (power supply, input stage, output stage, PCB...) yield different sound. Like your TEAC DAC, the Japanese stuff tends to be more system oriented, without claiming miracles from the DAC chip alone.

      Currently I am researching pre-amps. Some opinions say that a digital volume control, like the one on the ESS Sabre, limits the quality of the sound: better to flush out the full signal from the digital stage and do the gain or attenuation with a pre-amp. With this argument, we are back to analogue electronics. And can we really measure a pre-amp (easily)?

      And now, with your postings on the media files, who knows what they did in the re-mastering - or perhaps nothing at all, or they just re-sampled it and called it HD?

    3. Hi KK. Yes, DAC quality has improved tremendously over the years. I would say that since about 2005 or so, dynamic range >16 bits has been routinely achieved, and with the asynchronous USB DACs these days, jitter has really become much less of an issue. I suspect the days of companies blaming jitter for bad sound need to come to an end. The whole movement to "femto clocks" appears overblown to me.
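
      To put a rough number on why I say "overblown", here's a back-of-envelope sketch (my own illustration using the standard narrow-band FM approximation for sinusoidal jitter - not a measurement):

          import math

          def jitter_sideband_db(f_signal_hz, jitter_peak_s):
              # For sinusoidal jitter of peak amplitude dt on a tone at frequency f,
              # the sideband level relative to the carrier is ~ 20*log10(pi * f * dt)
              return 20.0 * math.log10(math.pi * f_signal_hz * jitter_peak_s)

          # A 10kHz tone is close to the worst case for audio:
          print(jitter_sideband_db(10e3, 1e-9))     # 1ns jitter   -> ~ -90dB
          print(jitter_sideband_db(10e3, 100e-15))  # 100fs jitter -> ~ -170dB

      Even a whole nanosecond of sinusoidal jitter puts the artifact down around -90dB on a 10kHz tone; femtosecond-class numbers land absurdly far below any DAC's noise floor.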

      As for digital volume control, I do not believe it'll be an issue so long as you're not reducing the volume very substantially. If, for the sake of argument, one feels that 16 bits is enough for good sound (which I believe it is), then when you pad it into a 24-bit signal, you've gained 8 extra bits at the end which are just zeroes. Those 8 bits represent 48dB of potential attenuation before losing any of the original data... Now, of course, in practice DACs are only accurate down to about 21 bits at best. Nonetheless, that -48dB of attenuation is still usually achievable with minimal loss of perceived quality.
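
      Here's that arithmetic spelled out (a tiny sketch of my own, using the usual ~6.02dB-per-bit rule):

          import math

          DB_PER_BIT = 20 * math.log10(2)   # each bit is worth ~6.02dB of dynamic range

          source_bits = 16      # assume the source is 16-bit PCM
          container_bits = 24   # padded into a 24-bit signal with trailing zeroes
          headroom_bits = container_bits - source_bits

          # 8 bits of headroom -> ~48dB of digital attenuation before
          # any of the original 16-bit data gets truncated
          print(headroom_bits * DB_PER_BIT)   # -> ~48.2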

      As you said, a good analogue volume control could be better if implemented without added distortion or loss in quality.

  3. There is also this excellent video on digital signals:

    http://xiph.org/video/vid2.shtml

    http://people.xiph.org/~xiphmont/demo/neil-young.html

    In the 2nd link, "24/192 Music Downloads...and why they make no sense"

    Do not flame - he did not discuss the quality of the electronic equipment doing the job, just the theoretical perspective.

    1. Monty does a good job with the video presentations... Well worth a view!

  4. Just a few days before you posted this, I had a discussion about "burn in" in general.

    Digital burn-in? The only thing you should do with your digital equipment is not shut it down. I know that's not in sync with Greenpeace, but...

    I think in most cases our head/brain is "burned in" after a few hours/days/months of critical listening.

    I really appreciate your "common sense" posts. After some "audiophile" discussions, I just turn my head, read some of your posts, or call/mail my - http://drzlab.com/ - friend (that guy is a genius) and say to myself:

    "It's OK! You're fine!" :))

  5. Absolutely spot on - burn-in is illogical, irrational and totally bogus! The placebo effect is, however, completely real!
