I'd appreciate your help digging into a peculiarity I've run into with LabVIEW and sample clocks. I'm simulating an NI PCI-6250 with an SCXI chassis, and haven't had any problems with that as far as the usual DAQmx task manipulation and execution goes.
When we are ready to deploy our DAQ software, we will be running it for hours at a time, collecting data that will need to be reasonably synchronized (within a few milliseconds would be acceptable) with other instruments (not NI hardware, but they are integrated with the LabVIEW DAQ via serial, Ethernet, etc.). For now I'm testing the software on a simulated SCXI chassis, since we know the hardware works and use it often with other DAQ software.
Where I'm a little stumped is how to deal with sample clock drift. I have looked at the following knowledge base article on DAQmx device data, and understand that the sample clock is referenced to the system clock at the start of data acquisition; no further reference to the system clock is made while the hardware relies on its own sample clock to acquire data.
Since there could be significant clock drift over the span of several hours of data acquisition, I need to account for it. I'm trying out adjusting the t0's and dt's of the waveforms using the system clock. It works, but I hope I'm not just hiding an underlying problem that could be solved correctly! The following VI is called immediately after reading the sample buffer. This is a simplified version of the VI; the full version also tracks and saves the total drift (fudge factor) over time, so we can review the drift in the data we have already acquired.
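For readers without the VI attachment, here is a minimal sketch of that kind of correction in Python. This is my own illustration, not the original VI: the function name, arguments, and the 1 kHz example rate are all assumptions.

```python
def corrected_timing(t_start, elapsed, total_samples, n_new, nominal_rate):
    """Recompute t0/dt for the newest chunk of samples from wall-clock
    (system) time instead of trusting the nominal sample rate.

    t_start       -- system-clock time stamp taken when the task started
    elapsed       -- wall-clock seconds since t_start, read just after
                     this chunk came out of the sample buffer
    total_samples -- samples acquired so far, including this chunk
    n_new         -- samples in this chunk
    nominal_rate  -- sample rate the task was configured for (Hz)
    """
    dt = elapsed / total_samples                   # effective sample period
    t0 = t_start + (total_samples - n_new) * dt    # start time of this chunk
    # "Fudge factor": accumulated drift versus the nominal rate
    drift = elapsed - total_samples / nominal_rate
    return t0, dt, drift
```

For example, at a nominal 1 kHz, if 60,000 samples have taken 61.0 wall-clock seconds, dt stretches to about 1.0167 ms and the accumulated drift is 1.0 s.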
It was pretty surprising that the simulated hardware shows about a second of drift for each minute of acquisition. I don't know how the simulated device times its samples, and I haven't tested this with our hardware yet, but I would be disappointed if it were no better. I would imagine that NI uses fairly accurate sample clocks, much better than the 1.7% error I am getting with the 'simulated' sample clock. Of course the computer's system clock can't be trusted to be very accurate (Windows isn't a real-time operating system, and NTP on Windows isn't as good as on other systems), but it should be good enough for our purposes. We don't generally look at phase relationships between data from different sets of hardware, so being a few milliseconds off is acceptable. Seconds or minutes at the end of a long session, however, are not.
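That 1.7% figure is just the ratio of observed drift to elapsed time (a quick sanity check, not from the original post):

```python
# Observed: roughly 1 s of drift per 60 s of acquisition
drift_rate = 1.0 / 60.0               # ~0.0167, i.e. about 1.7%
drift_per_hour = drift_rate * 3600.0  # ~60 s of error per hour of acquisition
```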
In any case, the raw data looks good with my hacked timestamping. If anyone has experience they can share with respect to tracking sample clock drift, or anything I could try to make the strip chart display the data without the artifacts, I would appreciate it!
I have zero experience with NI SCXI, but might it be possible to get serial/Ethernet interface modules for this SCXI chassis? I imagine that if a single piece of hardware serves all your interfaces, the timing might be better. But all this is just a wild guess. Have you contacted NI for advice yet? Usually they are very helpful.
When the acquisition task is started, a system clock time stamp (Windows system clock) is grabbed and used for the t0 of the very first waveform. All t0's of subsequent waveforms are computed from the sample rate and the number of samples since that first system clock timestamp. So if the hardware sample clock (not a time-of-day clock, just a frequency that triggers sampling) is not accurate (or if the computer's system clock is not accurate), there will end up being compounding drift between your samples and their time stamps.
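To illustrate how that error compounds, here is a hypothetical sketch; the 1 kHz rate and the 1.7% slow clock are assumptions chosen for the example, not measurements of real NI hardware:

```python
NOMINAL_RATE = 1000.0      # Hz, assumed for illustration
SAMPLES_PER_CHUNK = 1000   # one nominal second of data per read

def driver_t0(t_first, chunk_index):
    """t0 the way the driver assigns it: the first system-clock stamp
    plus elapsed time computed from the *nominal* sample rate."""
    return t_first + chunk_index * SAMPLES_PER_CHUNK / NOMINAL_RATE

# If the hardware clock actually runs 1.7% slow, each 1000-sample
# chunk really takes 1.017 wall-clock seconds, so the time-stamp
# error grows linearly with every chunk read:
true_chunk_seconds = 1.017
error_after_1h = 3600 * (true_chunk_seconds - 1.0)  # roughly a minute of lag
```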
I'm a bit confused about what timing info you get in the acquired waveform. As another poster wrote above, SCXI does not provide timing info along with the data. OK, so as you wrote, the DAQmx driver "creates" the t0 and so on based on the PC clock:
"Where I'm a little stumped is how to deal with sample clock drift. I have looked at the following knowledge base article on DAQmx device data, and understand that the sample clock is referenced to the system clock at the start of data acquisition; no further reference to the system clock is made while the hardware relies on its own sample clock to acquire data."
My reasoning for doing the above procedure: with external devices using serial/Ethernet/etc. protocols, you simply have no info about when exactly those data were generated! Do you see the problem here? You would like to sync the SCXI DAQmx data to those other data sources (serial, Ethernet, etc.), but you have no guarantee that those devices deliver data to your PC at exact intervals (within a few msec deviation); a typical serial unit, for example, will not provide data with msec sync tightness...
Unless your serial/Ethernet external units send you msec-resolution time stamps too? Could you share what kind of other signals you need to sync with the DAQmx data? What are those external units with serial/Ethernet interfaces?
OK, sounds like you're doing more or less the same thing as I am, just applying your own timestamps instead of fudging the waveform timestamps. In the end we get the same thing. I didn't know if there were other more accurate ways to interrogate the sample clock.