I have seen a number of references to using a 1004 Hz test tone
instead of 1 kHz because of problems obtaining accurate SNR results
when the test signal is an exact submultiple of the sampling frequency.
Can anyone give me a good explanation of the problem?
Thanks
Here is an explanation of why it was changed when PCM/T-1 systems came
into use.
From http://www.dcbnet.com/notes/9611t1.html
"The D1 type channel bank (D1A,B,C) placed alternate 1's and 0's in
the 193rd bit position. It was assumed that random data would not
contain this pattern, in bits spaced exactly 193 bits apart, for any
significant length of time. The receiving device would find the 193rd
bit by using a simple search technique. This algorithm had the
advantages of circuit simplicity and speed. In the early 1960's, there
were few commercially available ICs for building complex logic
functions, and elementary designs cost less. The disadvantages of this
technique were rapidly uncovered when equipment was installed in
actual customer sites. Certain standard analog tones, such as the 1000
Hz test tone, applied to one or more voice channels and digitized by a
Channel Bank, created an alternating one and zero pattern every 193
bits in one or more voice channels. It was possible for the terminal
to lock up on the incorrect pattern. This condition, affecting all 24
channels, could last until the test tone was removed. The 1000 Hz tone
has been changed to a 1004 Hz test tone."
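The frame-lock hazard described in the quote comes down to simple
arithmetic: a tone whose frequency exactly divides the 8000 Hz sampling
rate repeats after only a handful of samples, so every bit position in
that channel's codeword carries a short periodic pattern at exact
multiples of the 193-bit frame spacing, which a naive frame search can
mistake for the real framing bit. Here is a minimal sketch (plain
Python, integer tone frequencies assumed):

```python
import math

FS = 8000  # PCM channel sampling rate, Hz

def sample_period(tone_hz, fs=FS):
    """Smallest N such that a tone sampled at fs repeats every N samples.
    For integer frequencies this is fs / gcd(tone_hz, fs)."""
    return fs // math.gcd(tone_hz, fs)

# A 1000 Hz tone cycles through only 8 distinct sample values, so the
# same 8 codewords recur forever; 1004 Hz takes 2000 samples (250 ms)
# to repeat, exercising far more codewords.
print(sample_period(1000))  # -> 8
print(sample_period(1004))  # -> 2000
```

With only 8 recurring codewords per channel, any single bit of the
codeword forms a periodic stream at the frame rate, exactly the kind of
pattern the D1 banks assumed random traffic would never sustain.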
There may also be an explanation involving "harmonics": the old 1000 Hz
tone is an exact submultiple of the 8000 Hz sampling rate of a PCM
channel, so the quantization error repeats in lockstep with the tone
instead of averaging out, and SNR measurements become unreliable.
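The SNR effect can be demonstrated numerically. The sketch below uses a
plain uniform 8-bit quantizer rather than the mu-law companding a real
channel bank applies, and the full-scale amplitude and one-second
duration are assumptions for illustration only; the point is that a
1000 Hz tone revisits the same few phase points, so its quantization
error is a periodic signal correlated with the tone, while 1004 Hz
sweeps through 2000 phases and averages the error out:

```python
import math

FS = 8000                  # PCM sampling rate, Hz
N = 8000                   # one second of samples
BITS = 8
STEP = 2.0 / (2 ** BITS)   # uniform quantizer step over [-1, 1]

def measure(tone_hz, phase=0.0):
    """Quantize a full-scale tone and return (SNR in dB, number of
    distinct quantized sample values seen)."""
    sig = err = 0.0
    codes = set()
    for n in range(N):
        x = math.sin(2 * math.pi * tone_hz * n / FS + phase)
        code = round(x / STEP)     # nearest quantizer level
        q = code * STEP
        codes.add(code)
        sig += x * x
        err += (x - q) ** 2
    return 10 * math.log10(sig / err), len(codes)

for f in (1000, 1004):
    snr, distinct = measure(f)
    print(f"{f} Hz: SNR {snr:.1f} dB, {distinct} distinct codes")
```

The 1000 Hz measurement depends on where its few phase points happen to
fall relative to the quantizer steps (shift `phase` and the reading
moves), whereas the 1004 Hz reading stays close to the theoretical
value for the quantizer, which is why it makes the better test tone.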
--reed
Superb answer! Thank you very much.