Hi,
Firstly, all my data is here [0].
The new power board's output drivers have a current sense facility, the
aim of which is to allow a software current limit that protects the
output driver. Being able to identify an over-drawn output during
debugging is a nice feature too. AFAIK there's no intention for the
current limit to be resettable: it's for protection and debugging.
The output driver doesn't produce a calibrated current measurement, so
we only have the ADC samples to go by. Thus, we have to do our own
calibration to work out how the ADC samples correspond to actual
current, so that we know where the current limit should be set in terms
of the raw ADC values. I've made some measurements to resolve this,
explained below.
To load the outputs I've got five 50W lamps tied together for the
high-current outputs, and three of them for the low-current outputs. At
12V these draw approximately 20A and 12A respectively. I've connected
these to a motor board, and to generate data I ramp the output from 0%
to 100% in single percentage point increments. I take measurements with
two sensors: the on-board current sense IC that the power board has,
and the ADC readings from the output drivers. I don't have any
independent hardware for sensing more than 10A of current, so I assume
the power board hardware is sufficient.
The readings were produced with the drive_output.py script at [0]; I
cooked the code in brain/sr-robot.git to work on my laptop and added
code to read the current ADC samples over USB (already implemented in
the firmware, but not in the software). The ADC samples read over USB
are pre-scaled by 7.5 [1]. The outputs, as CSV, are in the .txt files
at [0].
I've also plotted the actual current readings against the ADC readings
in the png files at [0]. The first thing to note is that the ADC
readings are pretty noisy, but not by a huge amount. (They appear to be
on the same scale as the current measurements because of the scaling
done by the firmware.)
The important thing to take from this is that (surprisingly) the actual
current and ADC readings correlate linearly (despite me previously
reporting that they wouldn't). As a result we can just keep the scaling
factor mentioned above, and encode a current limit in milliamps.
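To make the calibration step concrete, here's roughly how one could check the linearity and derive a milliamp conversion from the CSVs at [0]. The column layout and exact slope are assumptions for illustration; with the firmware's 7.5 pre-scaling the fitted slope should come out close to 1:

```python
import numpy as np

def fit_adc_to_current(adc, current_a):
    """Least-squares fit current = m*adc + c; returns (m, c).

    If the readings really are linear, the residuals will just be the
    ADC noise visible in the plots.
    """
    m, c = np.polyfit(np.asarray(adc, float), np.asarray(current_a, float), 1)
    return m, c

def adc_to_milliamps(sample, m, c):
    """Convert one (pre-scaled) ADC sample to milliamps using the fit."""
    return 1000.0 * (m * sample + c)
```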
Feedback on this approach would be appreciated. Without objection I'll
bake in a 10A / 20A current limit for low / high current outputs.
There'll be an IIR filter smoothing these readings to allow for
transients.
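The IIR-smoothed limit would look something like the Python sketch below. The firmware version would of course be fixed-point C, and the alpha value here is a placeholder, not a tuned choice; the latch reflects the non-resettable behaviour described above:

```python
class CurrentLimiter:
    """First-order IIR (exponential) smoothing with a latched trip."""

    def __init__(self, limit_ma, alpha=0.1):
        self.limit_ma = limit_ma  # e.g. 10000 or 20000 for low/high outputs
        self.alpha = alpha        # smoothing factor; placeholder value
        self.smoothed = 0.0
        self.tripped = False

    def sample(self, current_ma):
        # y[n] = a*x[n] + (1-a)*y[n-1]: brief spikes barely move the
        # average, sustained overcurrent converges towards it.
        self.smoothed = (self.alpha * current_ma
                         + (1.0 - self.alpha) * self.smoothed)
        if self.smoothed > self.limit_ma:
            self.tripped = True  # latched: the limit is not resettable
        return self.tripped
```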
[0]
https://jmorse.net/~jmorse/sr/output_readings/
[1] I only noticed this after looking at the plots and being confused
that the ADC readings were already at the correct scale. It turns out
that in some sleep-deprived state I introduced this scaling and then
completely forgot about it; I only rediscovered it while writing this
email.
--
Thanks,
Jeremy