Scheduling Latency of a Blocking UART Read.

Kenny Koller

Jan 22, 2018, 4:58:42 PM
to BeagleBoard

I have a Python script that is running as a systemd service. It uses a UART and SPI to communicate with an ARM processor. The process performs a blocking read on the serial port, waiting for 4 bytes. Using an oscilloscope I measure the difference between when the serial port packet is transmitted and when the read call returns. Most often this value is around 2 ms, which is fine. Sometimes (within 20 seconds to a minute) this value is 500 ms, and I've seen it be larger than 1 second. I attempted to use `chrt` to change the static priority and scheduling policy. It reports that the values are set correctly, but the large latencies remain.
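Roughly, the read path looks like this (a sketch rather than the actual service code; the device path, baud rate, and priority are placeholders, and the in-process scheduling call is just the Python 3 equivalent of what I tried with `chrt`):

```python
import os
import serial  # pyserial

# Placeholder device and baud rate; the real service may use different settings.
port = serial.Serial("/dev/ttyO1", baudrate=115200, timeout=None)  # timeout=None -> blocking read

# Set SCHED_FIFO with a static priority, the in-process equivalent of running
# chrt against the service's PID (needs root or CAP_SYS_NICE, Python 3 on Linux).
os.sched_setscheduler(0, os.SCHED_FIFO, os.sched_param(50))

while True:
    packet = port.read(4)   # blocks until all 4 bytes have arrived
    print(packet.hex())     # placeholder for the real packet handling
```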

This is with the 3.8.13-bone62 kernel, which I rebuilt to increase the SPI buffer size. That was the only change.

Any thoughts on how to improve this? I can probably tolerate up to 100 ms.

Przemek Klosowski

Jan 22, 2018, 11:15:24 PM
to beagl...@googlegroups.com
On Mon, Jan 22, 2018 at 4:58 PM, Kenny Koller <ke...@understoryweather.com> wrote:

 the difference between when the serial port packet is transmitted and when the read call returns
Are you measuring the time to return to Python, or return from the read() syscall? 
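One way to narrow that down, assuming Python 3 and access to the pyserial object's file descriptor, would be to time a bare select()/read() on the underlying fd and compare it against your end-to-end number; a large gap would point at the interpreter side rather than the kernel. A rough sketch:

```python
import os
import select
import time
import serial

port = serial.Serial("/dev/ttyO1", baudrate=115200, timeout=None)  # placeholder settings
fd = port.fileno()            # raw file descriptor behind the pyserial object

t0 = time.monotonic()
select.select([fd], [], [])   # wait in the kernel until the port is readable
data = os.read(fd, 8)         # raw read(2); may return fewer than 8 bytes
dt = time.monotonic() - t0    # time spent blocked in the kernel, wrapper excluded
print("waited", dt, "s, got", len(data), "bytes")
```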

Kenny Koller

Jan 22, 2018, 11:48:27 PM
to BeagleBoard
I toggle an output pin on the BeagleBone in Python after the call to self._serial.read(8). I also measure the time in Python using time.time(), and it jibes with what I see on the scope. The call to the SPI driver seems to have little latency, and I have the SPI lines on the scope as well, so I can see the packet on the UART RX of the BBB and I can see the output pin go high at about the same time as the SPI clock burst and chip select.
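The measurement code is roughly this (a sketch, not the exact service code; the sysfs GPIO number is a placeholder and assumes the pin was already exported as an output):

```python
import time

GPIO_VALUE = "/sys/class/gpio/gpio60/value"   # placeholder pin, exported beforehand

def timed_read(ser, size=8):
    """Read one packet, raise the scope pin as soon as read() returns, report elapsed time."""
    t0 = time.time()
    packet = ser.read(size)                   # blocking read of the 8-byte packet
    with open(GPIO_VALUE, "w") as f:          # drive the output pin high for the oscilloscope
        f.write("1")
    return packet, time.time() - t0
```

time.monotonic() would be safer against wall-clock adjustments, but for these sub-second deltas time.time() agrees with the scope.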

Kenny Koller

Jan 22, 2018, 11:53:34 PM
to BeagleBoard
Another detail is that I have had serial port code running for more than a year on several hundred units that do not exhibit this issue. At first I thought it was because in this case I am using a blocking read. I rewrote the code to use a read with a timeout, as in the earlier code, and the problem persists.
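The timeout version is roughly this (a sketch; the port settings are placeholders and the 100 ms figure is just the tolerance I mentioned above):

```python
import serial

port = serial.Serial("/dev/ttyO1", baudrate=115200, timeout=0.1)  # 100 ms per read() call

def read_packet(size=8):
    """Accumulate a full packet, retrying the short reads that a timeout can produce."""
    buf = b""
    while len(buf) < size:
        buf += port.read(size - len(buf))     # returns early (possibly empty) on timeout
    return buf
```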

I misstated the packet size above. It is 8 bytes. But that is also a difference: my packets before were around 1 kB minimum. I'm wondering if there is some latency getting those bytes out of the hardware buffer or something like that.