This cable is the thinnest, most flexible, and likely most poorly shielded USB cable I have; in other words, about as "bad" as a cable gets while still functioning at all when connected to a quality USB DAC that expects to operate in high-speed USB 2.0 mode... Behold the "Bad Cable":
Plugging this cable into my desktop ASUS Essence One provided the opportunity to demonstrate just what a poor USB cable does to the sound... I'm sure this is "old hat" to those who have experience with digital audio, but for those who haven't, have a listen...
I recorded 1 minute of a freely available track by Jason Shaw called "Pioneers" (from here) off the Essence One's output, fed into my EMU 0404USB and captured on the usual Win8 laptop, first using a good quality cable and then the flimsy one above.
Good USB 2.0 cable - well shielded, 12' long, with ferrite cores on both ends of this specific cable:
"Bad" USB cable as pictured above - poorly shielded against interference and incapable of transmitting at bit-perfect high-speed data rate to the ASUS Essence One:
Even though SoundCloud recompresses the uploaded FLAC audio, I'm sure you can appreciate the obvious errors in the "Bad Cable" sample. (You can press play back & forth between the two samples to A-B them if you want.)
What you're hearing is what happens with digital errors (i.e. not bit-perfect data), similar to watching digital TV when the occasional data error leads to macroblocking and bad pixellation, as in this sample found off Google (notice the blue stripe due to digital error):
It's worth noting a few characteristics of this poor cable as it pertains to sound:
1. Poor digital cables leading to digital errors sound like brief pops or occasional static (assuming they do not malfunction completely). They're similar to the errors you get when ripping a CD without something like EAC or an equivalent secure ripper. Sometimes you'll hear very brief dropouts. Depending on which data packet is disrupted, an error may occasionally land in only one channel, but it's not possible for this to happen consistently in a single channel. Remember that although asynchronous DACs have the capability to buffer, and hence improve timing and lower jitter, they do not necessarily error-correct or ask for a packet resend (at least not in the case of the Essence One with the CM6631 USB interface as far as I can tell). The more data errors there are, the less "normal sounding" music you'll hear. Obviously, if a data error occurs only every few minutes it might be difficult to detect, but if it happens frequently, it's not subtle. (If you want to simulate this for yourself, see the sketch right after this list.)
2. A poor digital cable does not result in overall level changes in the song... This is not like analogue distortion that can consistently alter the volume level or change the dynamic range uniformly or periodically.
3. Similar to the above point, poor digital cables are not capable of changing the overall tone of the sound. There is no such thing as a digital cable capable of acting as a "tone control", making certain sounds "brighter" or "warmer". A passive digital cable is not capable of acting with some kind of frequency filtering mechanism.
4. Poor digital cables do not consistently do anything to the soundstage. A poor digital cable cannot make a voice or instrument sound "distant" or move it "forward", or pan the soundstage to the left or right as a whole or in relation to other components of the music.
5. Bad cables cannot cause the data transfer to speed up or slow down; playback timing is governed by the DAC's own clock. Poor digital cables therefore cannot cause sporadic or consistent timing issues like warble (speed up/slow down pitch changes), "pacing", or rhythm problems.
6. The concept of cable "break-in" makes no sense with digital audio cables. If a cable carries data accurately when first plugged in, the only thing that can change over time is degradation such as corrosion or oxidation of the metal at the contact points. That can only lead to transmission errors as demonstrated above, not some magical improvement due to "break-in".
7. I was reminded here the other day about the measurements with a poor RCA cable I used as coaxial SPDIF last year. Indeed, if you use a very poor, unshielded RCA cable, paying no attention to the expected 75-ohm impedance specification, with an SPDIF digital interface that's not galvanically isolated (e.g. the coaxial SPDIF of the ASUS Essence One in that case), noise can be introduced into the system. However, it does not take extravagantly priced cables to make things right (an inexpensive 6', <$20, decently shielded cable from a reputable company will do). As always, noise can be introduced into the analogue domain with any electrical connection (or just by being careless, like putting your DAC right on top of a noisy desktop computer), so it's not really an issue with the digital system itself.
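If you'd like to hear this kind of degradation without hunting down a flaky cable, it's easy to simulate in software. Here's a minimal Python sketch (file names, error rate, and random seed are just illustrative, not anything used for the recordings above) that flips random bits in a 16-bit PCM WAV file:

```python
# Sketch: inject random bit errors into a 16-bit PCM WAV to mimic a failing
# digital link. File names, error rate, and seed are illustrative only.
import random
import wave

def corrupt_wav(src_path, dst_path, error_rate=1e-6, seed=1):
    rng = random.Random(seed)
    with wave.open(src_path, "rb") as w:
        params = w.getparams()
        frames = bytearray(w.readframes(w.getnframes()))

    n_bits = len(frames) * 8
    for _ in range(int(n_bits * error_rate)):
        pos = rng.randrange(n_bits)
        frames[pos // 8] ^= 1 << (pos % 8)    # flip one randomly chosen bit

    with wave.open(dst_path, "wb") as w:
        w.setparams(params)
        w.writeframes(bytes(frames))

# corrupt_wav("pioneers_good.wav", "pioneers_simulated_bad_cable.wav")
```

A flipped high-order bit produces a loud click while a flipped low-order bit is essentially inaudible, which is exactly why sparse errors come across as occasional pops and dropouts rather than any change in tone, level, or soundstage, as per points 1-4 above.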
You might be curious how the 2 USB cables measure in terms of jitter...
Surprised? As you can see - not much difference at all! If you monitor the realtime FFT for the J-Test, you will see errors "popping up" with the bad cable due to bad data transfer, but in between, the jitter plots are essentially indistinguishable! This is expected... For an asynchronous DAC like the ASUS Essence One, jitter rejection is handled very well by design and there's nothing the passive cable can do about that.
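For reference, the J-Test stimulus itself is trivial to generate. Below is a minimal sketch of a Dunn-style 16-bit/44.1kHz J-Test signal: a sine at Fs/4 (11.025kHz) with the least significant bit toggled by a square wave at Fs/192 (~229.7Hz). Exact levels and length vary between implementations; this is illustrative only, not the exact file used for the measurements above.

```python
# Minimal sketch of a Dunn-style J-Test signal for 16-bit/44.1kHz:
# a sine at Fs/4 plus an LSB-level square wave at Fs/192.
import wave

import numpy as np

fs = 44100
seconds = 10
t = np.arange(fs * seconds) / fs

tone = np.round(0.5 * 32767 * np.sin(2 * np.pi * (fs / 4) * t)).astype(np.int16)
lsb = (np.floor(2 * (fs / 192) * t) % 2).astype(np.int16)   # toggles the LSB
samples = tone + lsb

with wave.open("jtest_44k16.wav", "wb") as w:
    w.setnchannels(1)
    w.setsampwidth(2)          # 16-bit
    w.setframerate(fs)
    w.writeframes(samples.astype("<i2").tobytes())
```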
Now I'm sure there will be a number of folks who disagree and hear various effects in the list above (see here, here and here for some interesting perceptual accounts and/or creative writing). The thing is, where is the decent evidence to show that passive digital cables (and I'm talking here not just about USB but also the SPDIF variants like coaxial or TosLink) sound different if they're error free (a.k.a. bit-perfect) and built to specifications (assuming no issue with analogue noise as in item 7 above)? I've never seen manufacturers come up with anything of substance... Nor have I seen hobbyists/DIYers demonstrate verifiable claims... As usual, I'm happy to change my mind if some kind of objective evidence exists, since I personally have not subjectively heard a problem except as demonstrated above with digital errors.
(Digital cable summary from a number of months back for those who might have missed it. Recent post on EETimes blog on this.)
Recommended album:
- Have a listen to Babatunde Olatunji's posthumous 2005 album Circle Of Drums. This is one of the best Chesky albums I have heard. The drums sound fantastic with wonderful tonality and a sense of 'space' around the instruments; a lovely exploration of African drums and rhythm. Unless you believe you can hear the difference between a 16-bit noise floor and that provided by SACD, IMO there's no need to buy the SACD because it appears to be a 44kHz PCM upsample (here's the Master List). There is a multichannel mix on the SACD which sounds OK but is derived from post-processing. An impressive sounding and quite enjoyable record for those interested in world music nonetheless!
Relax and enjoy the music!
----
Addendum (January 20, 2014):
For the sake of completeness in answering Frans' comment below, here is the J-Test result with the TEAC UD-501 using the poor USB cable vs. good one:
In any case, using a different DAC, the jitter test remains unchanged; two examples now of how an obviously poor USB cable does not appear to affect the jitter from asynchronous DACs in terms of the analogue output (which IMO is the only important measure since that's what we hear!).
I have a feeling that when you look at the eye pattern of the better USB cable you might find (much) less measurable jitter and most likely a 'nicer looking', better-defined eye pattern.
Whether or not this results in measurable jitter in the analog output signal will totally depend on the abilities of the DAC to eliminate (ignore) this jitter in the USB signal.
A DAC with less well executed jitter reduction may therefore show higher amounts of jitter in the J-test using the 'bad' cable.
In this case we are just looking at the intrinsic jitter of the ASUS Essence One itself, not what the cable introduces.
Of course, as mentioned, the bad cable does NOT result in worse measuring (= sounding?) results with this particular DAC, nor most likely with many other modern DACs, even though the drop-outs clearly indicate the digital signal is mangled considerably, to the point where data is even misinterpreted on occasion (probably dependent on the data pattern), resulting in drop-outs/ticks.
People with other DACs (older ones with very poor jitter rejection, if any) thus may well perceive deteriorated sound using 'bad' cables.
IMO the myth is still not 'busted' as this test should also be done with a 'poor' DAC. If that DAC also doesn't show any 'alterations' to the analog signal the cable lovers would be hard pressed to provide 'proof' on their behalf.
True. The results are specific to the Essence One in a case like this. I guess I could try measuring the TEAC as well to see whether that DAC's output gets affected by the poor USB cable... Ultimately, there's no way to really prove or disprove these things universally since there's no way for anyone to measure every single DAC out there! What I would like to see is an example of one asynchronous DAC where the cable makes a difference - AND AVOID IT!
In any case, I think the CM6631 USB chip in the Essence One is about as "commodity" as most audiophiles would use, so if these are the results from an inexpensive part, I sure hope the higher end stuff doesn't measure worse!
I think it could be a good idea to test good and poor USB cables on the TEAC. A French reviewer found a 7-8dB difference in dynamic range. This was between a standard Belkin cable and a Nordost Blue Heaven...
http://hdfever.fr/forum/viewtopic.php?p=38703#p38703
The thing is that you really gotta look for such a broken cable.
With the USB 3.0 standard finished years ago, and the USB 2.0 standard being outdated (hey, it's over a decade old), even very cheap cables will conform to at least the 2.0 specification nowadays.
As such, there shouldn't be these kinds of transmission errors. If there are, then it's either a horribly old and broken cable, or an audiophile one with shrink-wrap instead of the standard USB connector with the logo as required in the specifications.
Having checked cheap or free USB cables myself I have never seen such a bad cable. I guess your cable may work for something like a printer but would not achieve the transfer rates of USB 2.0 and probably fail many other requirements of the 2.0 spec as well.
Yup, good point. That's the reason I mentioned the lineage of the cable I used.
Bad cable 2001 baby! :-)
Yeah and I *love* that you have such a cable and do these tests.
I've never found one that does this; they either work or they don't. I've been looking, because it's that rare to find one so borderline that the errors are audible but the connection doesn't just halt. Great, I can put this in my bookmarks... like I said, it needs to be just the right kind of wrong to ride on the threshold like that.
Funny story... Reminds me of the Commodore 64 days, when the tape drive's data cable connected directly to the PCB traces and just looking at it wrong induced errors as it slowly but inevitably corroded. The end solution: only one kid could be at the computer at a time, with the lights out, no radio, and who knows what else... I can't remember anymore. Anyway, it turned into a total ritual with no actual rationality; if the game had loaded while someone was wearing a feather hat, we would have credited the hat. Combine that with the corrosion slowly making things worse, us moving the cable, it getting better, then worse again: totally random. When I finally learned what was causing it, the solution was of course to clean the contacts periodically, and then it always worked. I was 12. It took a LOT of the magic away, since we had basically been praying for the C64 to load and now it just worked. In the end it was great that this happened at a young age, so I never fell for audio myths later (plus it kind of started a whole passion for engineering: knowing precisely what and why, instead of relying on strange rituals that don't even actually work...)
Why not make an audio difference file BAD vs. OK USB Cable and publish that? (E.g. with Audio DiffMaker) Could be convincing...
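For anyone who wants to try something like that without Audio DiffMaker, here's a rough sketch of a naive null test. Note that Audio DiffMaker also time-aligns, level-matches, and drift-corrects the two captures; this bare-bones version assumes the two 16-bit recordings are already sample-aligned, and the file names are just placeholders:

```python
# Rough null-test sketch: subtract two captures of the same passage to expose
# any difference. Assumes the 16-bit WAV recordings are already sample-aligned
# and level-matched (which Audio DiffMaker handles automatically).
import wave

import numpy as np

def read_wav(path):
    with wave.open(path, "rb") as w:
        data = np.frombuffer(w.readframes(w.getnframes()), dtype="<i2")
        return data.astype(np.float64) / 32768.0, w.getframerate()

good, fs = read_wav("capture_good_cable.wav")
bad, _ = read_wav("capture_bad_cable.wav")

n = min(len(good), len(bad))
diff = good[:n] - bad[:n]
print(f"Peak residual: {20 * np.log10(np.max(np.abs(diff)) + 1e-12):.1f} dBFS")
```

With bit-identical captures the residual nulls to the noise floor; with the "bad" cable the flipped samples would stick out of the difference file as obvious spikes.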
Make sure you keep the bad cable as a reference. I've tried a few cables from the PC recycle bins at work and none of them are bad. It's hard to find one (might have to sabotage one, but then it would probably just not work at all).
ReplyDeleteMy understanding was that asynch USB protocol has re-transmit (will dig up the RFC and see if i can confirm).
Also, I read somewhere that the USB 2.0 interface is not galvanically isolated. USB 1.1 is. I am not an electronics expert, so don't quite understand the implication.
A late comment, but great article! I was a bit bewildered by the people who would say "this USB cable was too bright" or "this USB cable made the music sound fuller and richer". Apparently, it was all in their minds - which I had suspected/known for a while, actually.
ReplyDeleteHi.
One comment: Unlike S/PDIF, which has a biphase-mark encoded clock signal embedded in the data stream, (synchronous) USB audio synchronizes the clock via the start-of-frame (SOF) packet. When the clock is encoded in the data stream, bandwidth limitations may cause data-dependent jitter distortion, but this is not the case with USB audio. The SOF is not data dependent and thus the cable cannot introduce any data-dependent jitter distortion. Asynchronous USB audio is a handshake-type protocol, so the clock is totally independent of the transmitter, but a synchronous USB transmission is also immune to data-dependent jitter distortion. If any jitter were to be caused by the transmitter or link in synchronous USB, it would be random jitter caused by circuit noise or interface noise.
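For readers who haven't run into biphase-mark coding before, here's a tiny illustrative encoder (a sketch, not production code) showing why the S/PDIF transition pattern depends on the data: every bit cell starts with a level transition, and a '1' adds a second transition mid-cell, so a bandwidth-limited link can smear edges in a data-dependent way.

```python
# Illustrative biphase-mark encoder (the line code used for the S/PDIF bitstream).
def biphase_mark(bits, start_level=0):
    """Return two half-cell levels per input bit."""
    level = start_level
    out = []
    for bit in bits:
        level ^= 1            # transition at the start of every bit cell
        out.append(level)
        if bit:
            level ^= 1        # extra mid-cell transition encodes a '1'
        out.append(level)
    return out

print(biphase_mark([1, 0, 1, 1, 0]))
# -> [1, 0, 1, 1, 0, 1, 0, 1, 0, 0]  (two half-cell levels per data bit)
```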
Unfortunately I don't know English, so I asked Google Translate for help. It may not convey well what I want to communicate.
And now, here is my writing:
I don't have a system costing millions, unfortunately; presumably if someone pays 100,000 for a 1m USB cable, their DAC, amp, and speakers cost at least a few million.
I also agree that cables play a big role, especially for an analogue signal; but for a digital signal I think there are exaggerations, and now I'll tell you what I think.
I LOOK FORWARD to someone's convincing response arguing the opposite of my statements. If that doesn't happen, then above a certain price, digital cables will remain hanky-panky to me.
And then my text: if I transfer a file over a cable, WiFi, or Bluetooth from one source for one purpose, it cannot happen that the digital data arrives even 1 bit different. That's obvious! Otherwise no IT system would be reliable; this text, for example, would not be reliable. It wouldn't work well if a single flipped bit changed this "1 bit" text to read "0 bit", or turned the space into an exclamation mark so it reads "1!bit"; with two flipped bits it could even read "2 bit". (Check out the ASCII bit patterns for the characters here if you don't know them: https://en.wikipedia.org/wiki/ASCII)
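To make the bit-flip examples concrete, here's a tiny, purely illustrative Python check of how many bits separate those ASCII characters:

```python
# Toy illustration: count how many bits differ between two ASCII characters.
def bit_distance(a, b):
    return bin(ord(a) ^ ord(b)).count("1")

print(bit_distance("1", "0"))   # 1 -> one flipped bit turns "1 bit" into "0 bit"
print(bit_distance(" ", "!"))   # 1 -> one flipped bit turns "1 bit" into "1!bit"
print(bit_distance("1", "2"))   # 2 -> "2 bit" needs two flipped bits
```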
I continue...
ReplyDeleteThis would be even more critical for executable files, as it could result in bad code that would not even run the program / application. But it is also not permissible, albeit a little exaggerated, for me not to be in a printed image, but, say, a cat; oh well it's maybe rough, let's refine it: don't let a black hole open in the place of my eyes. Well, that's rough, too, let's refine: let's just have a few pixel errors.
So in the case of sound, some kind of audio defect corresponds to the pixel error of the image. But there are no pixel errors from the printer either!!! And if there are, it's not because of the USB cable, because that wouldn't be allowed, as we've already established above. A cartridge may run low on ink, the paper may jam, the cartridge may clog, or there may be other mechanical defects, but that has nothing to do with the data content sent over USB, since otherwise the signal would have gone to hell!
What do they do when transmitting data? I'm not an expert, but they use data packets and checksums that ensure the integrity of each data packet; in other words, they make the received data match the source data.
And now I have reached the point of what I have to say. WHY IS IT NOT POSSIBLE TO ENSURE THE INTEGRITY OF AN AUDIO DATA STREAM JUST LIKE ANY OTHER DATA STREAM?! It doesn't matter whether the devices and cables are transmitting an image sequence, an email, a PDF, or some audio format: they don't know what it is until we tell them at the endpoint when it needs to be decompressed and converted.
So for AUDIO DATA, why not use a similar checksum solution?! That is, don't accept the signal until it has arrived unscathed, just as in all the other cases. That would solve this problem.
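As a toy sketch of the "verify before accepting" idea being described here (not how USB Audio Class streaming actually works; bulk USB transfers do use CRCs and retries, but isochronous audio packets are not retransmitted):

```python
# Toy sketch of checksum-verified transfer: send audio in blocks, each tagged
# with a CRC32, and have the receiver ask again for any block that arrives
# damaged. Purely illustrative of the generic checksum-plus-buffer scheme.
import zlib

BLOCK_SIZE = 4096

def blocks_with_checksum(data):
    """Sender side: split data into blocks, each paired with its CRC32."""
    for i in range(0, len(data), BLOCK_SIZE):
        block = data[i:i + BLOCK_SIZE]
        yield block, zlib.crc32(block)

def receive_verified(stream, resend):
    """Receiver side: only pass verified blocks on to the playback buffer.
    'resend' is a hypothetical callback that re-requests the current block."""
    playback_buffer = []
    for block, crc in stream:
        while zlib.crc32(block) != crc:     # damaged in transit? ask again
            block, crc = resend()
        playback_buffer.append(block)
    return b"".join(playback_buffer)
```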
I understand that timing plays a role, but buffering was invented long ago and is already used by audio player software. So if the signal buffered in the DAC were guaranteed to match what is stored on the original source (CD or file, it's all the same), the expensive digital cables would be finished! After all, we can send an exact signal to a printer with a cable costing a few hundred forints. There may be a lot of error-correcting communication, which takes time, and perhaps with a cable costing a few thousand forints it could go faster.
The question here is no longer whether the data comes through correctly, because bad data cannot come through! The question is only whether the cable is in a technical condition suitable for data transmission at all. If it is suitable, it should be able to do the job. If the cable meets the minimum requirements (an engineering cliché of a minimum), there can be no bad data on the receiving side, because the only remaining question is whether the cable works or not. Better quality may mean fewer error-correcting retries, so less buffering may be needed, so there is no risk that the timing of the music becomes "choppy". With a sufficiently large buffer, in my opinion, the same result can be achieved with a lower-quality cable, since: THERE CAN BE NO SIGNAL ON THE RECEIVING SIDE OTHER THAN THE ONE ON THE SOURCE SIDE!
That's my opinion! I can be convinced otherwise, but in this matter it will be very difficult! 🙂