Time for a change?



For decades, domestic audio systems have used SPDIF connections to carry digital audio from one box to another. This relies on a serial data format originally specified by Sony and Philips when CD Audio was launched: each bit is sent one after the other in series. The format has some specific properties that make SPDIF vulnerable to various physical imperfections, such as the connection having a limited bandwidth or being ‘mismatched’ in impedance terms. As a result it has become common to examine the ‘data induced jitter’ caused by imperfect transfer of SPDIF signals. This is done using specific patterns of sample values and is called the ‘J-Test’ – named after Julian Dunn of the AES, who was a leader in investigating these problems. Timing ‘jitter’ can matter in practice because it may affect the receiving DAC, the result being an unwanted distortion that alters the analogue output.
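(For readers who haven’t met it: the J-Test waveform is built from a deliberately ‘awkward’ pattern of sample values. The Python sketch below is only my loose illustration of the commonly quoted recipe of a tone at a quarter of the sample rate with the bottom bit toggled at 1/192 of the sample rate. The amplitude used here is arbitrary rather than the exact AES-specified level.)

    import numpy as np

    def jtest_like(fs=48000, seconds=1.0):
        """Rough J-Test-style pattern (illustrative only, not the AES spec):
        a tone at fs/4 plus a square wave at fs/192 that flips the least
        significant bit of the 16 bit samples."""
        n = np.arange(int(fs * seconds))
        tone = 0.5 * np.sin(2 * np.pi * (fs / 4) * n / fs)   # fs/4 tone, arbitrary level
        lsb = 1.0 / 32768.0                                  # one 16 bit LSB
        square = lsb * (1 - 2 * ((n // 96) % 2))             # fs/192 square wave (+/- 1 LSB)
        return np.round((tone + square) * 32767).astype(np.int16)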

During the last decade or so, people have begun to use other types of digital data transfer system. The most common are HDMI (for AV or home cinema) and USB (for computer audio). These are rather different to SPDIF. However, imperfections can still cause timing problems. As a result reviewers have tended to check the details of their behaviour using the established ‘J-Test’ jitter measurements.

The most recent development has been the arrival of ‘asynchronous USB’ transfer for audio. This is important for two reasons. Firstly, devices using this method can already carry stereo at rates/resolutions up to 96k/24bit – a distinct improvement over the limit of 48k/16bit that has been normal for earlier USB sound devices. Secondly, they take control of the timing of the data transfer to obtain a smoother and more regular flow of audio data. This is aimed at dealing with the problem that a computer is generally busy spinning many plates at the same time, so may be otherwise occupied when it should be noticing that more data is urgently needed for the audio output.

I recently started experimenting with a Halide Design USB-SPDIF Bridge, connecting it between my computer and a DACMagic. The DACMagic had already been giving me good results, but its USB input is limited to 16 bits per sample, so I wanted to see if the Halide Bridge would be better. Simple listening experiments were encouraging, although they did involve some days of tweaking the audio settings of my computer! Using the Bridge I was able to get 96k/24bit material to play via USB, and the results did sound excellent. But being an engineer I wanted to make some measurements to establish if the data transfer really was smoother. That set me wondering: was there a better way to do this than to apply the traditional ‘J-Test’?...

The J-Test waveform probes for problems of a kind that can be expected with SPDIF. But USB works differently. And my general impression with computer-based audio is that the timing problems tend to occur over a wide range of timescales, and are not always dominated by the physical properties of the connection. So I started trying to think up a more general way to measure timing problems across that whole range of timescales. After a few weeks of head-scratching and experiments I was pleased to come up with some interesting results. I decided to call the approach the IQ-Test. Never could resist a play on words...


Halide Bridge versus Direct USB into the DACMagic

[Fig1.gif]
Figure 1 shows a diagram of the setup I used for my test measurements. The source was a Shuttle computer running Ubuntu Linux. I used the simple Linux ALSA ‘aplay’ command to play test soundfiles, all of which were LPCM Wave files. I made two USB connections from the Shuttle: one directly to the USB input of the DACMagic, the other to the USB input of the Halide Bridge. The SPDIF output of the Bridge was then connected to one of the coaxial SPDIF inputs of the DACMagic. I recorded the analogue output from the DACMagic using a Tascam HDP2 recorder running at 192k/24bit. Once the recordings were made I transferred them to another computer for analysis. For the comparisons I used LPCM files at 44·1k and 48k with 16 bit data. Almost all of my listening has been to source material at these rates and 16 bit, so this comparison covers what I listen to.
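(For anyone wanting to repeat this sort of comparison, the playback side is simply a matter of pointing ‘aplay’ at the relevant ALSA device. The Python sketch below is one way to script that. The device names are placeholders I have invented for illustration; the real ones on a given machine can be found with ‘aplay -l’.)

    import subprocess

    # Placeholder ALSA device names - use `aplay -l` to find the real ones.
    DEVICES = {
        "dacmagic_usb": "hw:1,0",     # direct USB input of the DACMagic (assumed)
        "halide_bridge": "hw:2,0",    # USB input of the Halide Bridge (assumed)
    }

    def play(wav_path, target):
        """Play an LPCM Wave file to the chosen USB audio device via ALSA's aplay."""
        subprocess.run(["aplay", "-D", DEVICES[target], wav_path], check=True)

    # Example: send a 44.1k/16bit test file via the Halide Bridge.
    # play("iqtest_44k1_16bit.wav", "halide_bridge")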

The analysis I carried out is akin to the old-fashioned concept of ‘Wow and Flutter’. People are used to that being a measure of the replay speed stability of mechanical systems like LP turntables or analogue tape decks. For example, if a turntable was playing an off-centre LP the output would ‘wow’ up and down in frequency once per revolution (i.e. with a wow rate of about 0·55 Hz). If the offset changed the stylus-to-centre distance by, say, 1%, then the output frequency would also vary by about 1%. We can measure this by playing an LP track of a sinewave test tone whose frequency is already known. Similarly, if the motor was juddering or varying its rate of rotation, that would also affect the output frequencies and the ‘rate of play’ of the music.

For example, if we play an LP track of a 3kHz tone and we find the output frequency actually varies up and down from 3,003 Hz to 2,997 Hz during each rotation we can say this +/-0·1% variation in output frequency is due to the speed of replay varying by +/-0·1%. Similar measurements (usually done by spectral analysis) show quicker flutters or judders in the playing speed. In much the same way we can apply this notion to digital sources whose rate of replay varies with time. Any ‘jitter’ or timing problems will modulate the output frequencies of the test signal (or music!). We can then measure the results and use them to determine the details of the timing imperfections that affect replay. The main problem is to be able to do this with a high enough precision to detect the relatively small timing problems of an entirely electronic digital system!
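(The article doesn’t spell out the analysis code, but the general principle can be sketched as follows: multiply the recorded test tone by in-phase and quadrature references at its nominal frequency, low-pass filter to keep the baseband I/Q pair, then unwrap and differentiate the phase to obtain the fractional rate error in parts per million. The Python sketch below is only an outline along those lines, assuming a nominal 3 kHz tone and the 192k capture rate; the exact processing used for the IQ-Test results shown here differs in its details.)

    import numpy as np
    from scipy.signal import butter, sosfiltfilt

    def rate_error_ppm(recording, fs=192000, f_tone=3000.0, cutoff_hz=20.0):
        """Estimate the fractional replay-rate error (in ppm) of a recorded test tone.

        Multiplies the recording by a complex (I + jQ) reference at the nominal
        tone frequency, low-pass filters to keep only the baseband component,
        then uses the slope of the unwrapped phase to find how far the actual
        tone frequency sits from its nominal value at each instant."""
        n = np.arange(len(recording))
        baseband = recording * np.exp(-2j * np.pi * f_tone * n / fs)

        sos = butter(4, cutoff_hz, btype="low", fs=fs, output="sos")
        iq = sosfiltfilt(sos, baseband.real) + 1j * sosfiltfilt(sos, baseband.imag)

        phase = np.unwrap(np.angle(iq))
        freq_offset = np.gradient(phase) * fs / (2 * np.pi)   # Hz away from f_tone
        return 1e6 * freq_offset / f_tone                     # fractional error, ppm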

[Fig2.gif]
Figure 2 shows an example of this, applied to the setup in Figure 1 and using an IQ-Test waveform. The blue line shows the results for USB via the Halide Bridge. The red line shows the results for a direct USB connection into the DACMagic. In this case the test was using ‘CD standard’ LPCM data – i.e. a 44·1k sample rate using 16 bit values. By comparing the two replay methods you can clearly see that the direct USB connection produces periodic ‘jumps’ in the replay speed. These don’t occur when sending the data via the Halide Design USB-SPDIF Bridge. The Bridge produces a much smoother and more regular replay of the data.

[Fig3.gif]
Figure 3 shows the same results, but this time in the form of a spectrum of the ‘flutter’ in the replay rates. Again the blue line is when the Halide Bridge is used, and the red is for a direct USB connection. The forest of red spikes below about 1·5Hz is the spectrum of the periodic ‘jumps’ shown in Figure 2.
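(Getting from a rate-against-time trace to a flutter spectrum of this kind is then just a matter of taking an amplitude spectrum of that series. A minimal Python sketch is shown below, assuming a rate series like the one produced by the earlier rate_error_ppm() outline, decimated to some modest sample rate.)

    import numpy as np

    def flutter_spectrum(rate_ppm, fs_rate):
        """Amplitude spectrum of a replay-rate error series.

        rate_ppm : rate error versus time, in ppm.
        fs_rate  : sample rate of that series, in Hz.
        Returns (frequency in Hz, per-component amplitude in ppb). Because the
        variation is shared out over many spectral lines, the individual
        components come out in parts per billion rather than per million."""
        x = rate_ppm - np.mean(rate_ppm)          # remove the static rate offset
        w = np.hanning(len(x))
        spec = np.fft.rfft(x * w)
        amp_ppm = 2.0 * np.abs(spec) / np.sum(w)  # peak amplitude per component
        freqs = np.fft.rfftfreq(len(x), d=1.0 / fs_rate)
        return freqs, amp_ppm * 1000.0            # ppm -> ppb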

[Fig4.gif]
Figure 4 illustrates how the output replay rate varies with time when the input signal comes from a 48k/16bit file. As in Figure 2 you can see a regular series of ‘jumps’ in the replay rate when a direct USB connection is used. Once again the Halide Bridge avoids this problem. Note that in this case I had to change the vertical scale of the graph as the jumps are considerably larger than when using 44·1k.

Looking at the vertical scales of Figures 2 and 4 you may also have noticed that the units are ppm (Parts Per Million). The periodic changes in rate are quite small. Only about 1 ppm for 44·1k and 8 ppm for 48k. This form of analysis isn’t directly comparable with the conventional J-Test, but to get some idea of the possible relative significance we can consider Figure 4 as an example. Here the rate jumps down about 8 ppm for around 2 seconds at a time. Now an 8 ppm change in rate accumulates to a timing error of 16 microseconds over two seconds. i.e. a ‘jitter’ over this period of 16 million picoseconds! This is many orders of magnitude greater than the kinds of values reported for J-Test measurements on shorter timescales!

Looking at this another way, at 48k samples per second an 8 ppm error corresponds to about 166 picoseconds between successive samples. These figures indicate that the results shown may be as significant as the J-Test values usually reported. But in this case the IQ-Test employs a waveform that makes none of the assumptions originally targeted at the SPDIF format. The IQ-Test is a new approach, so it isn’t yet possible to really decide on its merits in terms of audible effects. But if the effects shown by the J-Test are audible, it seems plausible that at least some errors and variations over such a range of durations may also be audible. The test also clearly shows when timing imperfections arise over a range of timescales.
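(The arithmetic behind those figures is easy to confirm. The Python snippet below simply restates the numbers given in the text.)

    rate_error = 8e-6      # 8 ppm rate error
    duration = 2.0         # sustained for roughly 2 seconds
    fs = 48000             # samples per second

    accumulated = rate_error * duration     # 1.6e-05 s  = 16 microseconds
    per_sample = rate_error / fs            # ~1.67e-10 s = ~166 picoseconds

    print(accumulated * 1e6, "microseconds accumulated over", duration, "seconds")
    print(per_sample * 1e12, "picoseconds between successive samples")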

As before we can also display the results for the 48k/16 comparison in the form of the spectra of the wow and flutter.

[Fig5a.gif]
Figure 5 shows the 48k spectra using the same graph scales as Figure 3. You can see that in this case the 48k flutter is higher; indeed, the peaks actually extend off-scale. To be able to see the peaks we can change the scales.

[Fig5b.gif]
Figure 5b replots the same spectrum as Figure 5, but using scales that make the results clearer. This shows that for 48k/16 the peak components are far bigger than for 44·1k/16. Note that the spectral component amplitudes are in ppb (Parts Per Billion, not Million) since the variations are represented by a combination of many frequencies which add together.

Results for High Resolution audio
The limitations of the normal direct USB input meant that no direct comparison with/without the Halide Bridge was possible for ‘high rez’ material of various types. However I did carry out measurements on the Halide Bridge + DACMagic combination with higher resolution audio files. The results do show up one or two curious features. Firstly I compared 24bit with 16bit at 44·1 and 48k. These all gave broadly similar results when using the Halide Bridge. However the situation changed when I extended this to include 96k and 88·2k.

[Fig6.gif]
Figure 6 shows some of the results. In this case I’ve plotted six different rate/bit-depth combinations. To make the outcome clearer I’ve zoomed in on the ‘wow’ section of the spectrum, plotting the 96k results with solid lines and the others with broken lines. You can see that the replay rate is more stable (i.e. lower flutter amplitudes) when using 96k data than when using any of the other rates. This remains true regardless of whether the streams are 16 or 24 bit, and regardless of the use (or not) of sample dithering. For some reason the system gives more regular and uniform data transfer for 96k than for any other rate. Note that the results for 88·2k are similar to those for 44·1 and 48k, and are not as good as for 96k. So this isn’t simply a case of “the higher the sample rate, the better”: 96k is being handled better than any other rate. Note also that the vertical scale in Figure 6 is logarithmic, which perhaps hides the size of the improvement. In fact, in the region of ‘wow’ below about 0·2 Hz the 96k signals have a wow and flutter that is about five times lower than the other rates! This is quite a marked improvement in replay rate stability.

At present the reason for this is unclear. One possibility is that the DACMagic upsamples to 192k, which is exactly double 96k, so there is a simple integer ratio between the rates. But I’d expect that to mean that 48k should also work better than 44·1k – which isn’t the case. So perhaps it is due to the design of the control systems inside the Halide Bridge. Whatever the reason, it does chime with a comment I’ve seen on the web recommending that you upsample what you send to the Halide Bridge to 96k for optimum results. However, all my results simply stream LPCM data at the rate (and number of bits per sample) appropriate for the individual file being played. Checks show that this is then received by the DACMagic as a sample-by-sample, bit-perfect copy of the contents of the source file.
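(For anyone wanting to make a similar bit-perfect check, one approach is to capture the stream digitally, e.g. via a recorder with an SPDIF input set to the same rate and bit depth, and then look for a sample-aligned exact copy of the source data in that capture. The Python sketch below assumes both files can be read as integer samples at the same resolution, using the soundfile module; it is not the exact method used for the checks described above.)

    import numpy as np
    import soundfile as sf

    def find_bit_perfect_copy(source_path, capture_path):
        """Look for a sample-aligned, bit-exact copy of the source file inside a
        (usually longer) digital capture. Assumes both files are stored at the
        same bit depth so the integer sample values line up directly."""
        src, _ = sf.read(source_path, dtype="int32", always_2d=True)
        cap, _ = sf.read(capture_path, dtype="int32", always_2d=True)
        if len(cap) < len(src):
            return None

        # Candidate alignments: places where the capture matches the first source frame.
        span = len(cap) - len(src) + 1
        candidates = np.where((cap[:span] == src[0]).all(axis=1))[0]
        for offset in candidates:
            if np.array_equal(cap[offset:offset + len(src)], src):
                return offset        # bit-perfect copy found at this sample offset
        return None                  # no exact copy found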

Conclusions from the measured results
The most obvious conclusion we can draw from the above results is that using the Halide Design USB-SPDIF Bridge clearly improves the performance of the system I used. The output becomes much more uniformly timed. i.e. wow and flutter (jitter) is reduced by using the Halide Bridge rather than relying on a direct USB connection from computer to DACMagic. Another advantage of the Halide Bridge is that it allows higher rate and bit depth material to be played via USB. (In fact I found that 192k/24 and 176·4k/24 LPCM files also played via the Halide Bridge, but were downsampled to a ‘mere’ 96k/24 or 88·2k/24.) Overall, therefore, using the Halide Bridge instead of a direct USB connection gave a clear improvement in both capability and performance.

The above said, we should take care not to put the blame on the DACMagic for the higher replay rate flutter when using a direct USB connection. The root of the problem here is that a normal domestic computer tends to keep being ‘distracted’ by a number of under-the-hood activities. It also probably has a clock whose rate isn’t ideal as a basis for 44·1k or 48k operations. Audio is very demanding in that the final output needs to be clocked in a very uniform and regular manner. So the unwanted ‘jumps’ here are likely to be due to things going on inside the computer system that get in the way of maintaining a carefully regulated flow of audio data to the DAC. For this reason the details of the flutter can be expected to vary from one domestic computer setup to another. They may also change if you alter what software is running, or do something as innocuous as connecting or disconnecting another device from the USB ports. What is clear is that the Halide Bridge can take control of the data flow and overcome these problems.

2500 Words
Jim Lesurf
2nd Mar 2011


