
Jitter in S/PDIF signal and Dolby Digital/DTS playback


Gav

Mar 27, 2002, 4:25:44 AM
Can anyone give the definitive information on whether or not jitter in
the input signal to the decoder over an S/PDIF connection could make
it through to the DACs in a Dolby Digital or DTS system?

Basically, is the output signal completely reclocked off of an
internal reference clock (thus making any jitter on the interface
irrelevant), or is the timing for the reconstruction/replay process
derived directly from the timing of the input datastream on the S/PDIF
interface, as I've seen some claim?

I've seen assertions from both 'camps' that the output PCM streams
are/are not reclocked. What I haven't seen is definitive tech
documentation that specifies exactly what happens clock-wise.

Please note I'm not asking 'does jitter make a bitstream sound
worse' - just 'does the jitter in the input signal somehow propagate
to the bitstream going into the DACs from a surround sound decoder.'


thanks,

Gav

Alan Rutlidge

Mar 27, 2002, 5:46:52 AM


Gav,

It is unlikely that the digital signal is "re-clocked" as such; rather,
the input signal is "sniffed off" and the signal regenerated and re-timed
based on the incoming bitstream frequency. This simple arrangement allows
bitstreams of differing bitrates to be handled by a single input,
e.g. CD at 1411 kbit/s, DD at 448 kbit/s, and so on.
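
A rough sketch of that idea in Python (the frame rate, per-edge jitter
figure, and window length are invented for illustration, not taken from
any real receiver): estimate the incoming frame rate by averaging edge
timing over a long window, then regenerate the local clock at that rate,
so the bitrate is tracked while edge-to-edge jitter averages out.

    import random

    NOMINAL_FS = 48_000.0   # nominal S/PDIF frame rate, Hz (assumed)
    JITTER_RMS = 5e-9       # assumed RMS timing error per edge, seconds
    FRAMES = 10_000         # averaging window, frames (illustrative)

    ideal_period = 1.0 / NOMINAL_FS
    # Simulated arrival times of frame-sync edges, each displaced by jitter.
    arrivals = [i * ideal_period + random.gauss(0.0, JITTER_RMS)
                for i in range(FRAMES + 1)]

    # Receiver-style estimate: average the period over the whole window.
    est_period = (arrivals[-1] - arrivals[0]) / FRAMES
    print(f"estimated frame rate: {1.0 / est_period:.3f} Hz")
    # The estimate's error shrinks with the window length, so a clock
    # regenerated at this rate follows the incoming bitrate without
    # copying the stream's edge-to-edge jitter.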

Cheers,
Alan


Arny Krueger

Mar 27, 2002, 6:51:18 AM

"Gav" <gavs_...@swissonline.ch> wrote in message
news:3bf1057d.0203...@posting.google.com...

> Can anyone give the definitive information on whether or not jitter
> in the input signal to the decoder over an S/PDIF connection could
> make it through to the DACs in a Dolby Digital or DTS system?

I've found that the DD/DTS decoders I've tested had remarkable
abilities to reject jitter. I think this is inherent in their design,
because they must heavily buffer the audio signal, since it is
decoded at a variable rate in DD/DTS mode.
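
A toy model of that buffering argument in Python (the FIFO depth and
tick counts are made up): samples arrive in jittery bursts but leave on
a steady local tick, so the output timing depends only on the input's
long-term average rate, not its short-term timing errors.

    from collections import deque
    import random

    fifo = deque()
    DEPTH_TARGET = 512            # hypothetical FIFO depth, samples

    def read_tick():
        """DAC side: the steady local clock pulls one sample per tick."""
        return fifo.popleft() if fifo else None   # None = underrun

    fifo.extend([0.0] * DEPTH_TARGET)   # pre-fill to the target depth
    underruns = 0
    for tick in range(10_000):
        # Input side: one sample per tick on average, in jittery bursts.
        if random.random() < 0.5:
            fifo.extend([0.0, 0.0])
        if read_tick() is None:
            underruns += 1

    print("underruns:", underruns, "| final depth:", len(fifo))
    # While the buffer neither underruns nor overflows, every output
    # sample leaves on the steady local tick; input timing matters only
    # as a long-term average rate, which is why jitter is rejected.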

> Basically, is the output signal completely reclocked off of an
> internal reference clock (thus making any jitter on the interface
> irrelevant), or is the timing for the reconstruction/replay process
> derived directly from the timing of the input datastream on the
> S/PDIF interface, as I've seen some claim?

The timing is obtained from the rate of the input data, but in good
DACs, this is a very indirect process.
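
One common way this indirection is achieved is a PLL whose loop acts as
a low-pass filter on input jitter. A back-of-envelope sketch in Python,
assuming a purely hypothetical 1 kHz jitter corner (real receiver chips
specify their own loop bandwidths):

    import math

    F_CORNER = 1_000.0   # assumed PLL jitter bandwidth, Hz (illustrative)

    def jitter_attenuation_db(f_jitter_hz):
        """First-order low-pass model: |H| = 1 / sqrt(1 + (f/fc)^2)."""
        h = 1.0 / math.sqrt(1.0 + (f_jitter_hz / F_CORNER) ** 2)
        return 20.0 * math.log10(h)

    for f in (100.0, 1_000.0, 10_000.0, 100_000.0):
        print(f"{f:>9.0f} Hz jitter -> {jitter_attenuation_db(f):6.1f} dB")
    # Jitter well above the loop bandwidth is attenuated before it reaches
    # the DAC clock; only slow wander tracks through, which is the sense
    # in which the output timing follows the input only indirectly.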

> I've seen assertions from both 'camps' that the output PCM streams
> are/are not reclocked.

Part of the problem here is that the word "reclocked" is not used in
a standard way.

> What I haven't seen is definitive tech documentation that specifies
> exactly what happens clock-wise.

Forget about how it is done; look at the actual results.

But if you want to see some of the basics, look at the spec sheets
for digital receivers at
http://www.crystal.com/design/products/index.cfm?c=12 .

> Please note I'm not asking 'does jitter make a bitstream sound
> worse' - just 'does the jitter in the input signal somehow propagate
> to the bitstream going into the DACs from a surround sound decoder.'

It can, but it obviously doesn't have to. The decoders I tested were
a Technics SHAC-300 and SHAC-500, which are modestly-priced pieces of
equipment. I presume that the same basic technology they use is also
used in comparable receivers.

I built a test rig that added jitter in controlled amounts.
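
A software analogue of such a rig, for illustration only (Python; the
tone frequency, jitter rate, and jitter amplitude are invented, and
deliberately gross): sinusoidal jitter on the sample instants
frequency-modulates the reconstructed tone, which is the 'vibrato'
effect described below.

    import math

    F_TONE   = 1_000.0    # test-tone frequency, Hz (invented)
    FS       = 48_000.0   # sample rate, Hz
    F_JITTER = 4.0        # jitter (vibrato) rate, Hz (invented)
    A_JITTER = 100e-6     # peak timing error, seconds -- deliberately gross

    def jittered_sample(n):
        """Sample the tone at an instant displaced by sinusoidal jitter."""
        t = n / FS
        t_actual = t + A_JITTER * math.sin(2 * math.pi * F_JITTER * t)
        return math.sin(2 * math.pi * F_TONE * t_actual)

    samples = [jittered_sample(n) for n in range(int(FS))]  # 1 s of audio

    # Played back on a clean clock, the tone is frequency-modulated.
    # Peak deviation = 2*pi * f_tone * f_jitter * A_jitter (derivative
    # of the phase error):
    df = 2 * math.pi * F_TONE * F_JITTER * A_JITTER
    print(f"peak deviation: +/-{df:.2f} Hz at a {F_JITTER} Hz vibrato rate")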

I found that these relatively inexpensive decoders recovered analog
with very low (inaudible to me) amounts of jitter, as shown at
http://www.pcavtech.com/adc-dac/shac300/index.htm#JIT_DA . This was
true no matter how much jitter I artificially added, at least up to
the point where the jitter was so great that:

(a) the jitter was visible in an oscilloscope view of the SP/DIF wave
(this is gross!)

(b) the decoders finally lost synch with the input wave because the
jitter was so high.

(c) Just as I increased the jitter to the point where synch was lost
and the decoder muted, I listened to the output of the Denon DAC. The
test tone had clearly audible and unmistakable vibrato. I listened to
the output of the SHAC-500, and it was a pure tone.


Richard D Pierce

Mar 27, 2002, 9:47:48 AM
In article <3bf1057d.0203...@posting.google.com>,
Gav <gavs_...@swissonline.ch> wrote:
>Can anyone give the definitive information on whether or not jitter in
>the input signal to the decoder over an S/PDIF connection could make
>it through to the DACs in a Dolby Digital or DTS system?
>
> Basically, is the output signal completely reclocked off of an
>internal reference clock (thus making any jitter on the interface
>irrelevant) or is the timing for the reconstruction/replay process
>derived directly from the timing of the input datastream on the S/PDIF
>interface as I've seen some claim?
>
> I've seen assertions from both 'camps' that the output PCM streams
>are/are not reclocked. What I haven't seen is definitive tech
>documentation that specifies exactly what happens clock-wise.

It depends strictly upon the competence, or lack thereof, of the
designer of the DAC. Some DACs are poorly designed and use the
incoming bitstream almost as a reference clock; others successfully
isolate the output clock from the input stream.

--
| Dick Pierce |
| Professional Audio Development |
| 1-781/826-4953 Voice and FAX |
| DPi...@world.std.com |
