The problem with jitter is in the SPDIF format. You are transmitting
two things down the wire. One is the data, the other is the clock.
For SPDIF, this is done with Manchester encoding (to the best of my
knowledge -- strictly, SPDIF uses biphase-mark code, a close cousin of
Manchester, and the argument below applies to both). In Manchester
encoding, at the center of every bit time,
there is a transition. If that transition is low to high, that bit is
a 1. If that transition is high to low, that bit is a 0. At the ends
of the bit cells, there may be transitions to reach the right state
for the next mid-bit transition.
Here's the example from the DEC/Intel/Xerox Ethernet spec:
| 1 | 0 | 0 | 1 | 0 | 1 | 1 | 1 | 0 |
  +---+ +-+   +---+   +-+ +-+ +---+
  |   | | |   |   |   | | | | |   |
--+   +-+ +---+   +---+ +-+ +-+   +--
(In the top line, the numbers are at the center of each bit time, the
|'s are the edges of the bit times.)
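The scheme above is easy to sketch in code (a toy model of my own, not
anything from a spec; the function names are invented):

```python
def manchester_encode(bits):
    """Each bit becomes two half-bit levels: low-high encodes a 1
    (low-to-high mid-bit transition), high-low encodes a 0."""
    halves = []
    for b in bits:
        halves += [0, 1] if b == 1 else [1, 0]
    return halves

def manchester_decode(halves):
    """Read the direction of each mid-bit transition."""
    return [1 if halves[i] < halves[i + 1] else 0
            for i in range(0, len(halves), 2)]

data = [1, 0, 0, 1, 0, 1, 1, 1, 0]   # the example stream above
assert manchester_decode(manchester_encode(data)) == data
```

Wherever two equal half-bit levels meet at a cell boundary, the real
waveform needs the extra end-of-cell transition shown in the drawing.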
The data decoder sorts out the Manchester data, and figures out which
transitions are the mid-bit transitions. These are used to steer a
PLL whose center frequency is the nominal bit rate. The PLL will lock
onto the mid-bit transitions. This is the clock that gets recovered
from the data, and it is fed back into the data decoder to do the data
decoding. (Yes, it's circular.) This is also the clock that feeds
the D/A chip (after division down to the sample rate).
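The steering loop described above can be caricatured numerically.  This
is my own first-order toy (a real SPDIF receiver PLL is an analog
circuit, and the gain and jitter figures here are arbitrary): free-run
at the nominal rate and nudge the phase toward each observed
transition.

```python
import random

def recover_clock(transition_times, nominal_period, gain=0.1):
    """First-order tracking loop: free-run at the nominal bit rate,
    nudging the phase slightly toward each observed transition."""
    phase = transition_times[0]
    edges = []
    for t in transition_times[1:]:
        phase += nominal_period        # free-running oscillator
        phase += gain * (t - phase)    # phase detector + loop gain
        edges.append(phase)
    return edges

# Mid-bit transitions with +/-0.1 unit of jitter around a 1.0 period:
random.seed(0)
obs = [i + random.uniform(-0.1, 0.1) for i in range(20)]
edges = recover_clock(obs, 1.0)
# A small gain averages many transitions (slow, smooth clock); a large
# gain follows the incoming jitter -- the tradeoff discussed below.
```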
There are inevitable tradeoffs between how slowly the PLL will change
output frequency, and how much incoming jitter or frequency error is
acceptable. Some D/A boxes are now very picky about the accuracy of
the SPDIF clock frequency. This lets them have a more stable PLL. I
suspect that most of the SPDIF PLL's are not very tight, and may not
even be stable. PLL's have stability criteria that are quite
comparable to feedback amplifiers.
The clock in a Manchester data stream is very subject to systematic
jitter errors due to typical transmission phenomena. For instance,
typical fiber optic components are notorious for dramatically unequal
rise and fall times. Since the polarity of the mid-bit clock
transition is data dependent, one will see the clock drift one way
(phase-wise) on continuous 1's, and the other way on continuous 0's.
These rise and fall time phenomena are probably much more severe on
the TOSLINK plastic fiber components than the AT&T ST glass ones.
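The mechanism is easy to put numbers on.  A sketch (my own
illustration; the time constants are hypothetical, not measured TOSLINK
figures): a first-order edge crosses the 50% threshold tau*ln(2) after
it starts, so unequal rise and fall time constants delay rising and
falling crossings by different amounts.

```python
import math

def crossing_delay(tau):
    # time for 1 - exp(-t / tau) to reach the 50% threshold
    return tau * math.log(2)

tau_rise, tau_fall = 5e-9, 12e-9   # hypothetical optical rise/fall
skew = crossing_delay(tau_fall) - crossing_delay(tau_rise)
# Runs of 1's clock on rising mid-bit edges and runs of 0's on falling
# ones, so the recovered clock phase shifts by `skew` (about 4.9 ns
# here) depending on the data -- systematic, signal-correlated jitter.
```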
Multimode fiber links (which both TOSLINK and ST are) also suffer from
dispersion, due to multiple paths in the multimode fiber. Some
photons go through the middle of the core, others bounce from side to
side and take longer. However, this is probably truly irrelevant for
the typical one meter interconnect, especially glass ones. (Plastic
fiber has huge core diameters.)
In coaxial cables, the rise times are typically more equal. However,
one is still subject to intersymbol interference. When the rise time
gets to be a real fraction of the bit time, the waveforms aren't as
pretty, and you get a baseline shift. When a period of high or low
voltage on the cable only lasts half a bit time, you don't get quite
as asymptotically close to the target voltage as you would in a full
bit time. This means that the following transition does not start
from a consistent voltage. Thus, the voltage of the next transition
will cross the centerline at a relative time that depends on how long
the signal had been at the previous voltage. (Draw it.)
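Instead of drawing it, one can compute it.  A sketch under stated
assumptions (levels normalized to +/-1, a purely first-order channel,
a made-up time constant; the ~354 ns bit cell follows from 64 bits per
frame at 44.1 kHz):

```python
import math

tau = 60e-9    # hypothetical channel time constant
T = 354e-9     # one SPDIF bit cell at 44.1 kHz is roughly 354 ns

def crossing_time(v0, tau):
    # time for v(t) = -1 + (v0 + 1) * exp(-t / tau) to cross zero
    return tau * math.log(1.0 + v0)

v_full = 1 - 2 * math.exp(-T / tau)         # settled for a full bit
v_half = 1 - 2 * math.exp(-(T / 2) / tau)   # settled for half a bit
jitter = crossing_time(v_full, tau) - crossing_time(v_half, tau)
# After only a half-bit dwell the line starts short of the rail, so the
# next zero crossing comes earlier -- a few ns of data-dependent shift
# with these (illustrative) numbers.
```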
Both of these are scenarios that cause data dependent errors in the
timing of the mid-bit transitions that feed the PLL. Since the SPDIF
data is pretty close to raw audio data (very little subcode or error
correction left), there is little scrambling of the data, and there is
more correlation of the timing errors to the signal.
Note, however, that the signal distortion phenomena are not the same
on fiber and coax.
Things are further abused by stupid design in the digital domain. I
was looking at a back issue of HFN&RR, a review of an Arcam CD player.
They had applied some weird inductive peaking to the SPDIF signal to
get a better risetime. Sure, this makes for nice risetimes, but there
was all sorts of overshoot, and recovery problems during the level
period. This "improvement" would add all sorts of baseline shift.
(The 'scope photo was of a constant 1's or 0's data stream, so one
didn't see the baseline shift problem.)
However, more typically, the coaxial SPDIF data is driven with weak
line drivers, through less than wideband transformers. Moreover, it
may be deliberately filtered to eliminate RFI. All of these will
contribute to a poor risetime, baseline shift, and jitter.
For designing SPDIF transmitters and transmission facilities, what
should be measured is the uniformity of the timing of the mid-bit
transitions. This can be done with (expensive) instruments like the
Tektronix CSA 404.
(Manchester encoding is designed for reliably transmitting data in a
self-clocking manner. It is not designed to preserve the timing
information reliably, only the data.)
The modulation scheme used in CD players is 'EFM', or
eight-to-fourteen modulation. This is more efficient than MFM; the
density ratio (bits/transition) is typically 1.25, a very high
figure. A good
description of this appears in "Principles of Digital Audio", a
Howard M. Sams book.
>
>(Manchester encoding is designed for reliably transmitting data in a
>self-clocking manner. It is not designed to preserve the timing
>information reliably, only the data.)
>
What has me puzzled about all this 'jitter' talk, is what audible
effect does it have ? I don't have any reliable or trustworthy articles
that show any clear audible effects of jitter. I can understand that
if the PLL is not quick enough, data errors will occur. In the absence
of hard errors, the digital data stream is placed in a local memory
(at least in the block diagram of a Yamaha YM3616) that buffers 2k bytes
of data. The error correction is done as the data comes out of this
buffer, and the corrected data is crystal clocked out to the over-
sampling filter (Yamaha 3404b). The oversampled, filtered output
runs into the d/a, also crystal clocked.
So apart from distinct data errors, what will jitter cause? I
think I'm missing something.....
The hi-fi rags blame damn near every audio evil there is on jitter.
I have heard a number of CD players that reduce jitter (LINN
NUMERIK, Meitner C-lock, etc) and I'm impressed with the sound. But
is the superior sound due to other design considerations? Changing
the analog circuitry on my own CD player (NAD 5100) as per Audio
Amateur's latest issues made an enormous difference to the sound.
Putting in ridiculously fast amplifiers MADE A DIFFERENCE! I
measured a 30-fold improvement in IM distortion going from a 20v/us
amplifier to the 2500v/us AD811. I used to be a non-believer in audio
madness, but now I listen more carefully.
The problem with the jitter issue, is that it is very difficult to
measure, and I don't know what to listen for, assuming that I can
cure or change the jitter of a cd player.
The values of timing jitter that are talked about are a few
nanoseconds for el-cheapo players, and less than a few hundred
picoseconds for the super-duper players. I find it difficult to
understand how such small timings can affect the sound.
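For what it's worth, a standard back-of-envelope (not from any of the
posts here) bounds the damage: a full-scale sine of frequency f has
slope at most 2*pi*f, so sampling or reconstructing it delta_t off the
ideal instant perturbs the value by at most 2*pi*f*delta_t.

```python
import math

def jitter_error_db(f, delta_t):
    """Worst-case sample error, in dB re full scale, from a timing
    error delta_t on a full-scale sine of frequency f."""
    return 20 * math.log10(2 * math.pi * f * delta_t)

err_cheap = jitter_error_db(20e3, 1e-9)     # "few ns" class player
err_good  = jitter_error_db(20e3, 100e-12)  # "100 ps" class player
# err_cheap comes out around -78 dBFS at 20 kHz, well above the ~-98 dB
# quantization floor of 16-bit audio; err_good lands right at the floor.
```

By this estimate, nanosecond-class jitter at the DAC is at least
measurable against the 16-bit floor; whether it is audible is exactly
the question being asked.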
In desperation, I may set up a circuit on my bench to make a
'precision' jitter so that I can listen to its effects, but
hope that some kind person can clue me in as to what it should sound
like, and spare me the agony of breadboarding some whacky circuits.
(The circuit would consist of two RC circuits, fed by audio signals
180 degrees out of phase, and combined differentially. The
capacitor in the RC network would be a DC isolated varicap diode,
modulated by an external signal. As the varicaps change, the group
delay of the RC filter will shift. The differential circuitry will
cancel feedthrough of the modulating signal. And there are many more
potential bugs to be solved...)
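Short of breadboarding the varicap rig, the same experiment can be
faked numerically (a sketch; the sample rate, tone, and jitter figures
below are arbitrary): evaluate a test tone at deliberately wobbled
sample instants.

```python
import math

fs = 48000               # sample rate (arbitrary choice)
f, fm = 1000.0, 40.0     # test tone and jitter-modulation frequency
dt_pk = 10e-9            # 10 ns peak sinusoidal timing wobble

samples = []
for n in range(fs):      # one second of "audio"
    # ideal instant n/fs, displaced by a slow sinusoidal timing error
    t = n / fs + dt_pk * math.sin(2 * math.pi * fm * n / fs)
    samples.append(math.sin(2 * math.pi * f * t))
# The wobble phase-modulates the tone: the output carries sidebands at
# f +/- fm whose level grows with f and dt_pk -- FM-like birdies rather
# than ordinary harmonic distortion.
```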
So tell me.... (flames if necessary) what does jitter sound like ?
-Paul
--
-----------------------------------------------------------------------------
Paul J Guy work phone:519-885-1211 ext 6371
pa...@gaitlab1.uwaterloo.ca home/FAX/message:519-576-3090
pg...@healthy.uwaterloo.ca ..remember...bullshit baffles brains...
> In desperation, I may set up a circuit on my bench to make a
>'precision' jitter so that I can listen to its effects, but
>hope that some kind person can clue me in as to what it should sound
>like, and spare me the agony of breadboarding some whacky circuits.
>(The circuit would consist of two RC circuits, fed by audio signals
>180 degrees out of phase, and combined differentially. The
>capacitor in the RC network would be a DC isolated varicap diode,
>modulated by an external signal. As the varicaps change, the group
>delay of the RC filter will shift. The differential circuitry will
>cancel feedthrough of the modulating signal. And there are many more
>potential bugs to be solved...)
> So tell me.... (flames if necessary) what does jitter sound like ?
No need to go to all the trouble. Head down to a shop that carries
Audio Alchemy and listen to the DTI. It's an anti-jitter device (well
done PLL circ.) that is cheap enough for a grad student such as myself
to afford. I listened with both an NAD CD player and the Alchemy transport
running through the Alchemy v 2.0 D/A and the difference was
phenomenal.
[OK, fine. I'm bracing myself for flames on the Audio Alchemy
equipment. Sure, it's not Levinson and such, but not all of us can
afford that price range and IMHO the AA equipment does damn well for
the price. Kicked the pants off the D/A's in my Denon DCD-1700.]
Craig
--PSW
Even 100 ps of DAC jitter should be measurable: see ``Ask the
Applications Engineer'' in the current issue of Analog Dialogue.
Also in this issue is an excellent discussion of fast line drivers by
Walter Jung, who will discuss audio line drivers in the next issue.
Analog Dialogue should be available for the asking from your nearest
Analog Devices sales office.
(No affiliation with AD -- just a happy customer.)