Streaming PT2 + FLIR Lepton 3.5 using Raspberry Pi


Magnús Pétur Bjarnason Obinah

Aug 1, 2018, 5:34:27 PM
to Flir Lepton
Hi Guys,

I just got my PT2 + Lepton 3.5, and for my use case I need to stream video from the Raspberry Pi to another device, while still being able to control the thermal settings of the video.

Using the GetThermal wiki, I installed GetThermal. Running it on the Raspberry Pi provides some controls and a good, stable image with a decent refresh rate (low lag).
To stream just the video (not the whole Raspberry Pi desktop or the GetThermal window), I've looked for guides online but can't find anything that works.

1. I tried Lepton Webstream, which got me as far as starting web_fork.py, but it outputs:
Warning, failed read. sh: 1: ./frame.exe: Permission denied.
This might be fixable, although I can't find any reference to it using Google, and the project description is for a Lepton connected to a Raspberry Pi using a breakout board (not a PT1 or PT2).

2. I tried the Raspberry Webcam software. This works somewhat, as video is captured and sent; however, the video lags quite a lot in the web browser, and more importantly the output is somewhat different from what I want. See image below.

3. I looked into the FLIR Lepton Hookup guide, which references the Lepton Module GitHub page, but I couldn't get raspberry_video to compile, and these also seem specific to the breakout board.

-

My questions; if you have input on even just a single one, feel free to reply here:

1. Is there a way to stream the GetThermal "video" out, ideally while still using the available controls to change the settings for the stream?

On the Raspberry Wiki page they write:

"The best way to run the application fullscreen is to use the eglfs QPA plugin" with this command "GetThermal -platform eglfs".

I thought that maybe I could just stream the whole desktop to my other unit, but the command above makes the whole GetThermal program window fullscreen, not just the video area.
Maybe there are other platform plugins or command-line options, but I can't seem to find references to either.

2. Is there another program/app/approach to control the image settings (e.g. color palette) that can also stream the output somehow (I would use VLC or similar to read the stream)?

I've found a forum post from someone who was attempting something similar: https://groups.google.com/forum/#!topic/flir-lepton/oUG8CjNxDlw. Initially he was using the breakout board but was told to get the PT USB dev board instead to make things easier. He then got a PT1 but still couldn't get it to work: https://groups.google.com/d/msg/flir-lepton/oUG8CjNxDlw/jDzK-7_WAgAJ. It seems this is at least possible using the Raspberry Webcam software that I used in example 2, but there's still no way to control the settings, and the stream's coloring palette seems to be random.

3. If the answer to these two questions is no, what's the point of the PT2 board making the Lepton 3.5 into a UVC device if you can't control the image being streamed (temperature range, palette, etc.)?

I'm thinking that surely someone must have made a simple program similar to GetThermal, or a fork thereof, that allows control of the camera (fixing the temperature range, choosing the color palette, etc.) while streaming it. Choosing the PT2 board is supposed to make streaming easier by making the Lepton a UVC device, but without a way of controlling the camera settings, it's just a really expensive low-res webcam that streams a random representation of thermal data.

4. Where does the actual processing of the thermal data take place: on the PT2 board, or on the unit running the software (e.g. a Raspberry Pi running GetThermal)?

Now, I know that if each pixel has a temperature value, it should be easy to assign a palette of your choice, but I haven't been able to get an answer from PureThermal (emailed) on exactly where this processing is done. Does the software talk to the PT2 board and tell it how to color the data and what to stream out via UVC (temperature range, palette, etc.)? Or does the PT2 board stream everything, including all the thermal data, to the unit it's plugged into, which the software on that unit must then sort and visualize according to the user settings?
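For what it's worth, if the raw 16-bit radiometric data does reach the host, the per-pixel colorization described above is simple to do in software. A toy sketch, assuming the Lepton 3.5's radiometric TLinear output (each pixel is a temperature in centikelvin); the two-segment palette and the 20-40 C range are made up for illustration:

```python
# Sketch of host-side colorization, assuming raw TLinear radiometric
# pixels (centikelvin). The palette is a made-up two-segment ramp,
# not a real FLIR LUT.

def centikelvin_to_celsius(raw):
    """Convert a raw TLinear pixel value (centikelvin) to degrees Celsius."""
    return raw / 100.0 - 273.15

def colorize(raw, t_min=20.0, t_max=40.0):
    """Map one raw pixel to an (r, g, b) tuple over a fixed temperature range."""
    t = centikelvin_to_celsius(raw)
    # Clamp into the user-chosen range, then normalize to 0..1.
    frac = min(max((t - t_min) / (t_max - t_min), 0.0), 1.0)
    if frac < 0.5:
        return (int(510 * frac), 0, 0)        # black -> red
    return (255, int(510 * (frac - 0.5)), 0)  # red -> yellow
```

With a fixed t_min/t_max the coloring stays stable frame to frame, which is exactly what a hardware-chosen auto-ranging palette does not give you.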





Kurt Kiefer

Aug 3, 2018, 2:56:22 PM
to Flir Lepton
Hello Magnus,

I will point you towards the readme for the UVC capture examples repo, where you can see some example gstreamer pipelines, which I think may be of interest to you: https://github.com/groupgets/purethermal1-uvc-capture

This is where using UVC as the transfer mechanism really shines -- you can use standard tools for any job you could do with a webcam. You'll have to look elsewhere for examples of how to set up a gstreamer pipeline exactly the way you want, but because this board acts like any other webcam, there's nothing special about it.

Particularly on Linux, support for USB cameras via the V4L2 APIs and tooling is very flexible. You can run Lepton CCI commands from the command line, so you don't necessarily have to develop or use any custom software; and if you do, you'll be using a well-established API. You can use these CCI functions to choose the color palette if you're acquiring data in a non-radiometric mode (the Lepton handles the colorization itself in this mode). Now that I think about it, you may in fact HAVE to select a palette on a radiometric Lepton in these modes, since I'm not sure what they select by default. Besides this, you could also change the firmware to set a default at startup.
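To make the "no custom software needed" point concrete, here is a rough Python sketch that drives the palette control by shelling out to v4l2-ctl. The control name lep_cid_vid_lut_select comes from the uvcdynctrl mappings in the purethermal1-uvc-capture repo; the device path and LUT index are assumptions for illustration:

```python
# Sketch: select a Lepton color LUT from Python via v4l2-ctl, after the
# uvcdynctrl mappings from purethermal1-uvc-capture have been loaded.
# Control name, device path, and LUT index are assumptions.
import subprocess

def lut_select_cmd(device="/dev/video0", lut=1):
    """Build the v4l2-ctl argument list for selecting a color LUT."""
    return ["v4l2-ctl", "-d", device, "-c", f"lep_cid_vid_lut_select={lut}"]

def apply_lut(device="/dev/video0", lut=1):
    """Run the command; raises CalledProcessError if the control is unknown."""
    subprocess.run(lut_select_cmd(device, lut), check=True)
```

The same pattern works for any other mapped CCI control, since v4l2-ctl only needs a control=value pair.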

Kurt

Kurt Kiefer

Aug 3, 2018, 4:13:40 PM
to Flir Lepton
OK, I just tried the gstreamer pipelines against a radiometric Lepton, and it does appear that you must first select a LUT so that the hardware-colorized modes look normal (and not just random). Here's what I did to select FLIR's "fusion" palette and show it on the display:

cd purethermal1-uvc-capture/v4l2/uvcdynctrl
./load.sh /dev/video0
v4l2-ctl -c lep_cid_vid_lut_select=1
gst-launch-1.0 v4l2src device=/dev/video0 ! video/x-raw,format=BGR ! videoconvert ! xvimagesink

Magnús Pétur Bjarnason Obinah

Aug 3, 2018, 8:25:23 PM
to Flir Lepton
Hello Kurt,

Thank you very much for the feedback - this was just what I needed to get it working!

I could see that ./load.sh actually already had the device and configuration file defined, and I also had to modify the gstreamer pipeline a little.
I'm now running these three commands after a fresh boot (put together in a little lepton.sh file), which gets the stream with the iron palette running on the Raspberry desktop (I use VNC):

purethermal1-uvc-capture/v4l2/uvcdynctrl/load.sh
v4l2-ctl -c lep_cid_vid_lut_select=1
gst-launch-1.0 v4l2src device=/dev/video0 ! video/x-raw,format=UYVY ! videoscale ! video/x-raw,width=640,height=480 ! videoconvert ! ximagesink


My new battle is with gstreamer, trying to get the feed off the Raspberry and onto a VLC player on my laptop. I've included my pipelines below; the top group (STREAMING) is the one I can't get to work, while the others work like a charm. All of the streaming pipelines below run just fine, but I cannot open the stream with VLC on the Raspberry Pi (the desktop crashes, maybe memory/CPU related?) nor with VLC on Windows (VLC crashes). Below I'm outputting to 10.0.0.100, but I've also tried sending to 127.0.0.1 instead and opening on the Raspberry with VLC, with the same result.

Have you had any luck streaming the feed to another unit (vlc, possibly android)?

STREAMING:TESTING
gst-launch-1.0 -v videotestsrc ! videoconvert ! x264enc ! h264parse ! mpegtsmux ! rtpmp2tpay ! udpsink host=10.0.0.100 port=5000
gst-launch-1.0 -v videotestsrc ! videoconvert ! x264enc ! h264parse ! rtph264pay ! gdppay ! tcpserversink host=10.0.0.100 port=5000
gst-launch-1.0 v4l2src device=/dev/video0 ! video/x-raw,format=UYVY ! videoscale ! video/x-raw,width=640,height=480,framerate=9/1 ! videoconvert ! queue ! x264enc ! h264parse ! mpegtsmux ! rtpmp2tpay ! udpsink host=10.0.0.100 port=5000
gst-launch-1.0 -v v4l2src device=/dev/video0 ! video/x-raw,format=UYVY ! videoconvert ! x264enc ! h264parse ! udpsink host=10.0.0.100 port=9000

SAVING:WORKS
gst-launch-1.0 videotestsrc ! videoconvert ! x264enc ! flvmux ! filesink location=xyz.flv
gst-launch v4l2src ! ffmpegcolorspace ! jpegenc ! avimux ! filesink location=output.avi
gst-launch-1.0 v4l2src device=/dev/video0 ! video/x-raw,format=UYVY ! videoscale ! video/x-raw,width=640,height=480 ! videoconvert ! x264enc ! flvmux ! filesink location=xyz.flv

VIDEO:WORKS
gst-launch-1.0 v4l2src ! videoconvert ! videoscale ! ximagesink
gst-launch-1.0 v4l2src device=/dev/video0 ! video/x-raw,format=UYVY ! videoconvert ! ximagesink
gst-launch-1.0 v4l2src device=/dev/video0 ! video/x-raw,format=UYVY ! videoscale ! videoconvert ! ximagesink
gst-launch-1.0 v4l2src device=/dev/video0 ! video/x-raw,format=UYVY ! videoscale ! video/x-raw,width=640,height=480 ! videoconvert ! ximagesink

I've also tried outputting via xvimagesink using a string that works with ximagesink, but that crashes the desktop, I'm guessing due to memory/cpu usage.

VIDEO:CRASHES
gst-launch-1.0 v4l2src device=/dev/video0 ! video/x-raw,format=UYVY ! videoscale ! videoconvert ! xvimagesink

Kurt Kiefer

Aug 3, 2018, 11:58:26 PM
to Flir Lepton
Try to keep the work you're doing on the pi to a minimum. If you have a good connection, you can transmit raw data in an rtp payload.

On the pi:
gst-launch-1.0 -v v4l2src device=/dev/video0 ! video/x-raw,format=UYVY ! rtpvrawpay ! udpsink host=192.168.0.169

On the host:
gst-launch-1.0 udpsrc caps="application/x-rtp, sampling=YCbCr-4:2:2, depth=(string)8, width=(string)160, height=(string)120" ! rtpvrawdepay ! videoconvert ! xvimagesink

Or use whatever else you want instead of the xvimagesink: you might try glimagesink or kmssink, for example. You can run "gst-inspect-1.0 | grep sink" to see what's available.

Side note: one weird thing I'm noticing on the Pi is that gstreamer chokes on the V4L2 BGR3 type (BGR in gst). I'm not sure what's up with this, as BGR works fine on my Debian box.

If you don't have the bandwidth to send the uncompressed stream, make sure you don't use x264enc on a Pi: encoding happens on the CPU, which is underpowered for this. Try omxh264enc instead.
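As a sanity check on whether raw transport fits the link, the payload rate is easy to estimate. UYVY is 2 bytes per pixel, and the 160x120 size and 9 fps rate match the caps used in the pipelines above; RTP/UDP/IP overhead is ignored:

```python
# Back-of-envelope payload bit rate for the uncompressed UYVY stream.
# Protocol overhead (RTP/UDP/IP headers) is not counted.

def raw_stream_mbps(width, height, fps, bytes_per_pixel=2):
    """Approximate raw video payload rate in megabits per second."""
    return width * height * bytes_per_pixel * fps * 8 / 1e6

# 160x120 UYVY at 9 fps comes to under 3 Mbit/s, which a decent
# local WiFi link should handle easily.
rate = raw_stream_mbps(160, 120, 9)
```

This is why sending raw frames off the Pi is viable here where it would not be for a full-resolution webcam.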

Magnús Pétur Bjarnason Obinah

Aug 4, 2018, 5:18:13 AM
to Flir Lepton
You are awesome, thank you so much!

I'm working over a local WiFi connection and your pipeline worked perfectly; I just had to switch xvimagesink to ximagesink (not sure why the former doesn't work on my Pi).
Now I can stream the video to my Linux and Windows machines, as well as to the Raspberry itself (when streaming to 127.0.0.1), using UDP.

My next and final goal is to get the video stream onto an Android device. I had thought to use VLC for this, and since it's easier to work and test on a full machine, I wanted to get the stream working in VLC on my Linux/Windows/Raspberry boxes first. I've therefore been trying to see whether the stream I can now read with gstreamer can be read by VLC, but so far I haven't figured out how. I think I may need to provide an SDP file for VLC to understand it, but I haven't found a resource describing the required content of the SDP file, or how to derive the correct parameters for it from a given (g)stream.

- Is it possible to get VLC to read a raw stream, or does it first need to be encoded with x264enc or similar? Have you tried this? Maybe there are other Android apps that are more suitable?

I've also experimented with TCP and multicast to make testing easier (so I can connect the different boxes to the stream instead of directing the stream to one at a time). Using the pipelines listed below I've managed to get an image, but over TCP it's crossed with green lines and very slow, and while multicast to 224.0.0.1 worked for a couple of seconds, it also promptly crashed my router. I'm guessing that TCP hosting will require a lot more resources from the Raspberry (playing server to multiple connections, providing multiple streams), so multicast is probably the way to go, but that will require more work on my end.

- Any chance you've got this working on your end?


MULTICAST-STREAMING:BUGGY (CRASHED ROUTER)
Streaming: gst-launch-1.0 -v v4l2src device=/dev/video0 ! video/x-raw,format=UYVY ! rtpvrawpay ! udpsink host=224.0.0.1 auto-multicast=true
Receiving: gst-launch-1.0.exe udpsrc multicast-group=224.0.0.1 auto-multicast=true caps="application/x-rtp, sampling=YCbCr-4:2:2, depth=(string)8, width=(string)160, height=(string)120" ! rtpvrawdepay ! videoconvert ! videoscale ! autovideosink
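One likely culprit for the router crash: 224.0.0.1 is the reserved all-hosts group, which every device on the segment (including the router) must process, so it is a poor choice for a video stream. An administratively scoped address in 239.0.0.0/8 is generally safer for LAN streaming. A small sketch for sanity-checking a candidate group address before pointing udpsink at it:

```python
# Sketch: classify a candidate multicast group address. 224.0.0.0/24 is
# reserved for link-local control traffic (224.0.0.1 = all-hosts group);
# 239.0.0.0/8 is the administratively scoped range for private use.
import ipaddress

def check_group(addr):
    """Return a short verdict on a candidate multicast group address."""
    ip = ipaddress.ip_address(addr)
    if not ip.is_multicast:
        return "not multicast"
    if ip in ipaddress.ip_network("224.0.0.0/24"):
        return "reserved link-local control block"
    if ip in ipaddress.ip_network("239.0.0.0/8"):
        return "administratively scoped"
    return "multicast, other scope"
```

Something like 239.1.1.1 should behave much better than 224.0.0.1 for this experiment.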

TCP-STREAMING:BUGGY
Streaming: gst-launch-1.0 -v v4l2src device=/dev/video0 ! video/x-raw,format=UYVY ! rtpvrawpay ! tcpserversink host=127.0.0.1 port=1234
Recieving: gst-launch-1.0 tcpclientsrc port=1234 host=127.0.0.1 ! capsfilter caps="application/x-rtp, sampling=YCbCr-4:2:2, depth=(string)8, width=(string)160, height=(string)120" ! rtpvrawdepay ! videoconvert ! videoscale ! ximagesink

UDP-STREAMING:WORKS
Raspberry: gst-launch-1.0 -v v4l2src device=/dev/video0 ! video/x-raw,format=UYVY ! rtpvrawpay ! udpsink host=10.0.0.250 (<- IP of Receiver)

UDP-RECEIVING:WORKS
Raspberry: gst-launch-1.0 udpsrc caps="application/x-rtp, sampling=YCbCr-4:2:2, depth=(string)8, width=(string)160, height=(string)120" ! rtpvrawdepay ! videoconvert ! videoscale ! ximagesink
Windows: gst-launch-1.0.exe udpsrc caps="application/x-rtp, sampling=YCbCr-4:2:2, depth=(string)8, width=(string)160, height=(string)120" ! rtpvrawdepay ! videoconvert ! videoscale ! glimagesink

Note: Windows receiving also works just fine with: d3dvideosink / autovideosink

Kurt Kiefer

Aug 4, 2018, 1:36:32 PM
to Flir Lepton
You can broadcast the stream by setting udpsink host=192.168.0.255 on the sender and udpsrc address=192.168.0.255 on each receiver (or whatever the broadcast address of your subnet is).
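If you're unsure of your subnet's broadcast address, it can be derived from any host IP and the prefix length; this sketch uses Python's stdlib ipaddress module (the /24 default is an assumption matching a typical home LAN):

```python
# Sketch: compute the directed broadcast address for the subnet that
# contains a given host IP. Assumes you know the prefix length (/24 by
# default, as on most home networks).
import ipaddress

def broadcast_for(ip, prefix=24):
    """Return the broadcast address of the subnet containing `ip`."""
    net = ipaddress.ip_network(f"{ip}/{prefix}", strict=False)
    return str(net.broadcast_address)
```

For a Pi at 10.0.0.12 on a /24, this gives 10.0.0.255, which is what would go into udpsink host= and udpsrc address=.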

As for the SDP file, I wasn't able to get it to work with my VLC installation, which I think lacks support for SDP. It does work with gstreamer (gst-launch-1.0 -v filesrc location=raw.sdp ! sdpdemux ! rtpvrawdepay ! videoconvert ! autovideosink):

v=0
o=- 1 1 IN IP4 127.0.0.1
s=Raw-stream
c=IN IP4 192.168.0.255
t=0 0
m=video 5004 RTP/AVP 96
a=rtpmap:96 RAW/90000
a=fmtp:96 media=video; sampling=YCbCr-4:2:2; depth=8; width=160; height=120; colorimetry=BT601-5
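Since the SDP has to agree exactly with the caps on the udpsink/udpsrc side, it can help to generate it from the same stream parameters rather than editing it by hand. A sketch mirroring the raw-video SDP above (payload type 96, BT601-5, port 5004 as in that example):

```python
# Sketch: generate a minimal SDP description for an RTP raw-video stream,
# mirroring the hand-written example's fields so the file always matches
# the gstreamer caps.

def make_raw_sdp(dest_ip, port=5004, width=160, height=120):
    """Build an SDP file body for an RTP raw-video (RAW/90000) stream."""
    return "\r\n".join([
        "v=0",
        "o=- 1 1 IN IP4 127.0.0.1",
        "s=Raw-stream",
        f"c=IN IP4 {dest_ip}",
        "t=0 0",
        f"m=video {port} RTP/AVP 96",
        "a=rtpmap:96 RAW/90000",
        ("a=fmtp:96 media=video; sampling=YCbCr-4:2:2; depth=8; "
         f"width={width}; height={height}; colorimetry=BT601-5"),
    ]) + "\r\n"
```

Changing the resolution or destination then only needs to happen in one place.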


Magnús Pétur Bjarnason Obinah

Aug 4, 2018, 4:19:48 PM
to Flir Lepton
Hi Kurt and thank you once again!

The SDP file you provided works just as it should when I open it with the gstreamer pipe you wrote, but as you also experienced, it doesn't work with VLC.
It does, however, bring me closer to understanding what a working SDP file for your gstreamer output should look like, and I'm very grateful for that.

I've tried sinking the Lepton stream using the broadcast IP (10.0.0.255 in my case), but this doesn't seem to work for some reason, though I may not have the right syntax:

Output
gst-launch-1.0 -v v4l2src device=/dev/video0 ! video/x-raw,format=UYVY ! rtpvrawpay ! udpsink host=10.0.0.255

Viewer
gst-launch-1.0.exe udpsrc address="10.0.0.255" caps="application/x-rtp, sampling=YCbCr-4:2:2, depth=(string)8, width=(string)160, height=(string)120" ! rtpvrawdepay ! videoconvert ! videoscale ! glimagesink

The syntax seems to be right: if I point the output pipeline at the viewer IP (10.0.0.249) instead, then entering anything but that address in the viewer pipeline results in no stream being displayed.

-

I did find another code example that works with gstreamer broadcasting (to 10.0.0.255) and can be opened via SDP by VLC in a Linux VM and on Windows.

VLC-STREAM+RECEIVE:WORKS

gst-launch-1.0 videotestsrc ! openh264enc ! rtph264pay config-interval=10 pt=96 ! udpsink host=10.0.0.255 port=5000

I ran this on my Windows host box, and could then view the videotest-stream by opening the SDP file below in VLC in a LinuxVM and on the Windows Host.

v=0
m=video 5000 RTP/AVP 96
c=IN IP4 10.0.0.255
a=rtpmap:96 H264/90000

The "IN IP4" line seems not to matter when broadcasting to 10.0.0.255, or if I point the stream directly at the viewer IP (I can change it to anything, or delete it from the .SDP, and it makes no difference).

-

I'm not sure why one stream can be opened by VLC and the other cannot; if it were a general problem with SDP support in VLC, it should also affect the code example I found.
Could you try running that example on your end and see if you can get the videotestsrc stream to show up in VLC using the provided .SDP?

- Magnús

Kurt Kiefer

Aug 4, 2018, 5:18:08 PM
to Flir Lepton
This is really a question for the VLC developers, not me. I will point out one big difference: in your working example, the RTP stream is H264 rather than RAW, so perhaps VLC's RTP demuxer (live555) doesn't know how to handle raw payloads. You could try feeding the Lepton stream into the H264 encoder on the Pi, and that ought to work the same way.

Or maybe it's an issue with the SDP file. You could try gst-rtsp-server (you'll have to build it for the Pi); RTSP basically does the job of automatically transmitting the SDP data to the client so it can receive the (otherwise identical) RTP stream.

Or better yet, use a more capable program to receive streams. There are lots of remote-webcam receivers for Android (think baby monitors and security cameras). Some of them may even be open source, and could already be based on gstreamer.

William David Aguilar Cortes

Feb 13, 2019, 9:40:50 AM
to Flir Lepton
Hi Magnús, great work. I need your help: I have a PureThermal 2, and I need to show the image at a URL in my web browser, so the thermal image needs an IP address and a port. I see that you used Lepton Webstream; when you run web_fork.py you get an image at a URL. I need the same, but for this camera (PureThermal 2). Thanks for your help, and sorry for my bad English. Greetings from Colombia. My e-mail is david....@ittingenieria.com if you want to contact me personally.