Hi Guys,
I just got my PT2 + Lepton 3.5, and for my use case I need to stream video from the Raspberry Pi to another device, but I also need to be able to control the thermal settings of the video.
Using the GetThermal wiki, I installed GetThermal. Running it on the Raspberry Pi provides some controls and a good, stable image with a decent refresh rate (low lag).
To stream the video only (not the whole Raspberry Pi desktop or the GetThermal window), I've tried to find guides online, but can't seem to find anything that works.
1. I tried using Lepton Webstream, which got me as far as starting web_fork.py, but it outputs:
"Warning, failed read. sh: 1: ./frame.exe: Permission Denied."
This might be fixable, although I can't find any reference to it using Google, and the project description is for a Lepton connected to a Raspberry Pi using a breakout board (not a PT1 or PT2).
This works somewhat, in that video is captured and sent; however, the video lags quite a lot in the web browser, and more importantly the output is somewhat different from what I want. See image below.
2. I looked into this guide for FLIR Lepton Hookup, which references the Lepton Module GitHub page, but I couldn't get raspberry_video to compile, and these also seem specific to the breakout board.
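Going back to the "Permission Denied" in attempt 1: my best guess is that frame.exe simply lost its execute bit somewhere (e.g. when the repo was downloaded as a zip), so this is what I'm going to try next (paths are just from my checkout):

```shell
# Assumption: sh reports "Permission Denied" because frame.exe
# is not marked executable; restore the execute bit and check.
chmod +x frame.exe
ls -l frame.exe   # should now show an "x" in the permission bits
```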
-
My questions (and if you have input on even just one of them, please feel free to reply here):
1. Is there a way to stream the GetThermal "video" out, maybe also using the available controls to change the settings for the stream?
On the Raspberry Wiki page they write:
"The best way to run the application fullscreen is to use the eglfs QPA plugin" with this command "GetThermal -platform eglfs".
I thought that maybe I could just stream the whole desktop to my other unit, but the command above makes the whole GetThermal program window fullscreen, not just the video area.
Maybe there are other "plugins" or command-line options, but I can't seem to find any references to either.
2. Is there another program/app/way to control the image settings (i.e. color palette), that can stream the output somehow (would use VLC or similar to read stream)?
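For reference, the direction I've been considering here is to treat the PT2 purely as a V4L2/UVC device and push its feed out with ffmpeg. This is only a sketch; /dev/video0 and the receiver address 192.168.1.50:5000 are placeholders for my setup:

```shell
# Find the PT2 and see which pixel formats its UVC interface offers
# (a raw 16-bit mode would need palette mapping before it looks right):
v4l2-ctl --list-devices
v4l2-ctl -d /dev/video0 --list-formats-ext

# Encode the feed and send it as MPEG-TS; on the receiver, open it
# with e.g. "vlc udp://@:5000".
ffmpeg -f v4l2 -i /dev/video0 \
       -vcodec libx264 -preset ultrafast -tune zerolatency \
       -f mpegts udp://192.168.1.50:5000
```

But even if this works, it still doesn't solve the settings-control part, which is exactly what I'm asking about.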
3. If the answer to both of these questions is no, what's the point of the PT2 board turning the Lepton 3.5 into a UVC device if you can't control the image being streamed (temp. range, palette, etc.)?
I'm thinking that surely someone must have made a simple program similar to GetThermal (or a fork of it) that allows controlling the camera, fixing the temp. range, choosing a color palette, etc., and streaming the result. Choosing the PT2 board is supposed to make streaming easier by turning the Lepton into a UVC device, but without a way of controlling the camera settings, it's just a really expensive low-res webcam streaming an arbitrary representation of the thermal data.
4. Where does the actual processing of the thermal data take place: on the PT2 board, or on the unit running the software (i.e. the Raspberry Pi running GetThermal)?
Now, I know that if each pixel has a temp. value, it should be easy to assign a palette of your choice, but I haven't been able to get an answer from PureThermal (I emailed them) on exactly where this processing is done. Does the software talk to the PT2 board and tell it how to color the data and what to stream out via UVC (temp. range, palette, etc.)? Or does the PT2 board stream all the thermal data to the unit it's plugged into, leaving the software on that unit to sort/visualize it according to the user settings (temp. range, palette, etc.)?
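To illustrate what I mean by "easy to assign a palette": if the raw pixels really are absolute temperatures on the host side (I'm assuming the Lepton's TLinear-style centikelvin scale here, which I haven't confirmed for the PT2), the colorization step would be roughly this:

```python
# Sketch of host-side colorization: raw 16-bit radiometric counts in,
# palette colors out. The centikelvin scale is my assumption (Lepton
# TLinear mode); adjust if the PT2 delivers something else.

def centikelvin_to_celsius(ck):
    return ck / 100.0 - 273.15

def colorize(raw_pixels, t_min, t_max, palette):
    """Clamp each pixel's temperature into [t_min, t_max] and pick a color."""
    out = []
    for ck in raw_pixels:
        t = centikelvin_to_celsius(ck)
        frac = min(max((t - t_min) / (t_max - t_min), 0.0), 1.0)
        idx = min(int(frac * len(palette)), len(palette) - 1)
        out.append(palette[idx])
    return out

# Hypothetical 4-step grayscale "palette"; a real one would have 256 RGB entries.
GRAY4 = [(0, 0, 0), (85, 85, 85), (170, 170, 170), (255, 255, 255)]

# Three pixels at 25, 30 and 40 degrees C (in centikelvin), fixed range 20-40 C:
print(colorize([29815, 30315, 31315], 20.0, 40.0, GRAY4))
# -> [(85, 85, 85), (170, 170, 170), (255, 255, 255)]
```

If the PT2 instead does all of this on-board and only streams the finished colors, then none of it is possible on the host side, which is exactly why question 4 matters to me.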