"Hans-Peter Diettrich" wrote in message
news:e1o5ms...@mid.individual.net...
Much of this code dates back to around 1996-1999 when I first tried making
an application for Windows using Borland C++, and my first use of Borland
Delphi 4 seems to have been around 2003, when a consultant on another project
showed me how to use it. I think I purchased Delphi and BC++ around 1998.
I'm not sure how multiple buffers would help. The serial port component has
its own inBuffer which I have set to 48,000 bytes, or enough for 10 seconds
of data. There is probably another small buffer at a lower level (there is
an FTempInBuffer, which is a pointer to a data area of FInBufferSize bytes).
The actual reading of data is accomplished with:
  nRead := 0;
  nToRead := comStat.cbInQue;
  if (nToRead > 0) and
     ReadFile(FHandle, FTempInBuffer^, nToRead, nRead, nil) then
    if (nRead <> 0) and Assigned(FOnReceiveData) then
      FOnReceiveData(Self, FTempInBuffer, nRead);
So it does not appear that FTempInBuffer is a circular queue, and thus I
need to implement one in my own code using my CommBuffer[]. I could probably
use much smaller buffer sizes, but that's not really an issue these days,
when most computers have at least 2 GB of memory. The RxDataCount displayed
in my status dialog is usually 280-390 bytes, which likely depends on the
PollingDelay, which I have set at 50 mSec. At 4800 char/sec that is 240
characters; the extra characters likely accumulate during the buffer
processing procedures.
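Since FTempInBuffer is just a flat scratch area, the circular queue has to
live in application code. Here is a minimal sketch of the idea in Python
(the class name and the 48,000-byte default are illustrative, not taken from
the actual CommBuffer[] code):

```python
# Minimal ring (circular) buffer sketch, analogous to CommBuffer[].
class RingBuffer:
    def __init__(self, size=48000):
        self.buf = bytearray(size)
        self.size = size
        self.head = 0        # next write position
        self.tail = 0        # next read position
        self.count = 0       # bytes currently stored

    def put(self, data):
        """Producer side: called with each chunk of received serial data."""
        for b in data:
            if self.count == self.size:
                raise OverflowError("ring buffer full; data would be lost")
            self.buf[self.head] = b
            self.head = (self.head + 1) % self.size
            self.count += 1

    def get(self, n):
        """Consumer side: remove up to n bytes in arrival order."""
        n = min(n, self.count)
        out = bytearray(n)
        for i in range(n):
            out[i] = self.buf[self.tail]
            self.tail = (self.tail + 1) % self.size
            self.count -= 1
        return bytes(out)
```

The receive-event handler would call put(), and the timer-driven processing
procedure would call get(); as long as the consumer keeps up within the
buffer depth, no data is lost.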
I am doing additional processing in a procedure called by a timer set at
50-250 mSec. So that may account for more delays and accumulation of data in
the buffers. My MaxRxdCount after 10 minutes is now 1758. I have found that
other Windows processes can increase this number, but so far it does not
seem to lose data.
Quite a lot of real time processing occurs during these update intervals.
Character pairs are read from the CommBuffer circular queue, and are parsed
to verify their validity according to the extra two bits in each character,
while the remaining 6 bits in each are concatenated to form a 12 bit
integer. The ADC is only 10 bits but I designed the system to use 12 bits if
needed. The value is offset by 2048 (2048 is subtracted) to convert it to a
signed integer.
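As a rough sketch of that decoding step: each character carries 2 framing
bits and 6 data bits, and a valid pair yields a 12-bit offset-binary value
centered at 2048. The particular framing-bit values and byte order below are
my assumptions for illustration, not the real protocol:

```python
# Hypothetical character-pair decoder. The framing bits (01 for the high
# character, 10 for the low one) are assumed values, not the actual protocol.
def decode_pair(hi, lo):
    if (hi >> 6) != 0b01 or (lo >> 6) != 0b10:
        return None                             # framing invalid; resynchronize
    raw = ((hi & 0x3F) << 6) | (lo & 0x3F)      # 12-bit unsigned, 0..4095
    return raw - 2048                           # offset binary -> signed
```

Returning None on a framing mismatch lets the caller skip one character and
re-pair, which is how the two extra bits make resynchronization possible.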
The absolute value is compared to a threshold over a number of samples to
determine whether the current is high enough to indicate that a test is in
progress. If so, the contents of a small circular queue of pre-trigger data
are copied into a Waveform data buffer (like a digital storage scope). When the
value drops below threshold for a number of samples, the actual start and
stop points of the waveform are calculated, and a true RMS computation is
performed to give the value of current in the pulse. This may be repeated
for up to five pulses of 60 Hz current in a single test, with off times of
as long as 20 seconds (although usually 1 or 2).
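A hedged sketch of that trigger logic, capturing one pulse with its
pre-trigger history (the threshold, pre-trigger depth, and dropout count are
illustrative values, not the program's):

```python
from collections import deque

def capture_pulse(samples, threshold=100, pretrig=8, holdoff=4):
    """Return the first captured waveform (pre-trigger + pulse), or None."""
    pre = deque(maxlen=pretrig)     # small circular queue of pre-trigger data
    wave, triggered, below = [], False, 0
    for s in samples:
        if not triggered:
            pre.append(s)
            if abs(s) >= threshold:
                triggered = True
                wave = list(pre)    # copy pre-trigger history into waveform
        else:
            wave.append(s)
            below = below + 1 if abs(s) < threshold else 0
            if below >= holdoff:    # below threshold long enough: pulse done
                return wave
    return wave if triggered else None
```

In the real program this would run incrementally per update interval rather
than over a complete list, but the state machine is the same: idle until the
threshold is crossed, then record until the signal drops out.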
Meanwhile, the true RMS value of the samples in each update interval is
computed and displayed on the screen, giving a continuous RMS value at all
times. The sampling intervals of 50, 100, 150, 200, and 250 mSec are chosen
to span an integral number of half-cycles (zero crossings) at either 50 or
60 Hz. At 2400 samples per second, 50 mSec is 120 samples, corresponding to
3 cycles at 60 Hz or 2.5 cycles at 50 Hz. This measuring technique is used in DMMs to
cancel the effects of line voltage noise. It also produces consistent true
RMS readings when the start and end of the sampling period are not
synchronized to the zero crossings.
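The windowing argument can be checked numerically: when the RMS window spans
an integral number of half-cycles, the result is independent of where the
window starts. A small sketch (the amplitude and phase values are
illustrative):

```python
import math

# True RMS over one update interval: square, average, square root.
def true_rms(samples):
    return math.sqrt(sum(s * s for s in samples) / len(samples))

# At 2400 samples per second, a 50 mSec window is 120 samples: exactly
# 3 cycles at 60 Hz, so a sine of amplitude 2047 reads 2047/sqrt(2)
# regardless of the starting phase.
N, FS, F = 120, 2400.0, 60.0
rms0 = true_rms([2047 * math.sin(2 * math.pi * F * n / FS) for n in range(N)])
rms1 = true_rms([2047 * math.sin(2 * math.pi * F * n / FS + 1.234)
                 for n in range(N)])
```

Both computations agree with 2047/sqrt(2), about 1447.4, to floating-point
precision, which is the phase independence described above; a window that
cut a cycle partway through would instead wander with the starting phase.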
You may have already "left the building" at this point, unless you find this
sort of discussion interesting. I have been involved in the measurement of
power line frequency current and voltage signals for a LONG time, close to
40 years. Technology has changed since then, from mostly analog methods,
then ADCs with early microprocessors like the 8085 and Z80, then signal
processing boards with MSDOS, and now PICs and DSPs with Windows 10 or other
OS.
Thanks for your valued suggestions and time.
Paul