Display an RTSP camera stream.


Davide D'Angelo

Mar 19, 2019, 11:23:49 AM
to PraxisLIVE software discussion group
Hi, I have a new goal: to read and display (and then manipulate) an RTSP camera stream.

It works, meaning that I can see the stream, using the address suggested in the manufacturer's manual as the input to a video:player component.
The problem is there are 3 seconds of delay in the displayed stream. I'm on Linux Mint 19.1, Nvidia GTX 850M with Prime (same lag using the built-in Intel graphics).
If I use the same address inside Processing, the delay is the same.
If I use the same address in VLC, the delay goes down to 2-2.5 seconds.
If I read the camera through the manufacturer's suggested software (CMS) inside Windows (virtualized on the same host), there is only 100-200 ms of delay, which would be fine for me.

Do you know if the problem is in the GStreamer decoding? Maybe I have the wrong H.264 decoder?
Any hint is welcome, since I don't know what to look for at this point. (I've spent the last two weeks trying to do this in Processing, until I found PraxisLIVE and decided to give it a try.)

Thanks, Davide.

Neil C Smith

Mar 19, 2019, 3:36:28 PM
to Praxis LIVE software discussion group
On Tue, 19 Mar 2019 at 15:23, Davide D'Angelo <77d...@gmail.com> wrote:
> Do you know if the problem is inside gstreamer decoding? maybe I have wrong h264 decoder?

A bit of googling suggests there is a 2-second default latency for RTSP
in GStreamer. If you can build up the pipeline yourself, you can set
the latency property on the source element. The video:capture
component can use the same syntax as you use on the command line with
gst-launch. However, there's an issue with using decodebin in it,
which is the solution you're most likely to find mentioned, by the
look of it. I've been getting annoyed with the inability to use a
decodebin in a video:capture component, so this might get rewritten
quite soon!

Best wishes,

Neil

--
Neil C Smith
Artist & Technologist
www.neilcsmith.net

PraxisLIVE - hybrid visual live programming
for creatives, for programmers, for students, for tinkerers
www.praxislive.org

Davide D'Angelo

Mar 19, 2019, 4:00:31 PM
to PraxisLIVE software discussion group
Thanks for the info.
I already tried to read the stream through gst-launch-1.0 in the terminal, but I had no luck making it work; that's surely my fault, though.
I'm going to study a way to do it, and of course if you (or others) are going to work on this topic, I'll be happy to help in any way I can.

Thanks, Davide

Neil C Smith

Mar 20, 2019, 12:36:50 PM
to Praxis LIVE software discussion group
Hi,

On Tue, 19 Mar 2019 at 20:00, Davide D'Angelo <77d...@gmail.com> wrote:
> I already tried to read the stream thru gst-launch-1.0 in the terminal, but I had no luck in making it work.. but for sure that's my fault.

Does this work from the CLI?

gst-launch-1.0 rtspsrc latency=10 \
    location=rtsp://184.72.239.149/vod/mp4:BigBuckBunny_115k.mov ! \
    rtph264depay ! decodebin ! autovideosink

Then try changing the location, assuming your camera is H.264;
otherwise you might need a different depay element.
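For illustration, swapping the depay element for an H.265 camera might look like the sketch below. This is not from the thread: the RTSP URL is a placeholder, and rtph265depay may require the gst-plugins-bad package to be installed.

```shell
# Hypothetical variant for an H.265 camera: rtph265depay replaces
# rtph264depay; the rest of the pipeline stays the same.
# The RTSP URL below is a placeholder - substitute your camera's address.
gst-launch-1.0 rtspsrc latency=10 \
    location="rtsp://192.168.1.10:554/stream" ! \
    rtph265depay ! decodebin ! autovideosink
```

You can check which codec a camera actually sends by running gst-discoverer-1.0 against the RTSP URL.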

This won't work, yet, in a video:capture, but I'm just looking at
updating the code so it will.

Davide D'Angelo

Mar 20, 2019, 1:43:34 PM
to PraxisLIVE software discussion group
YESS!! I've been studying the GStreamer tutorials for almost 5 hours and couldn't find a way to use rtspsrc. What I was missing was the rtph264depay element, which isn't listed HERE.
My terminal line is:
gst-launch-1.0 rtspsrc latency=10 location="rtsp://192.168.1.10:554/user=admin&password=&channel=1&stream=0.sdp?real_stream" ! rtph264depay ! decodebin ! autovideosink
and it gives me the stream with just 200-300 ms of delay, which is quite OK for my purposes. I need this camera for stage monitoring while playing automations.

Thank you for this code; otherwise I think I would have needed a week to reach the same result.

Is there something I can do to help you make it work in video:capture?

Neil C Smith

Mar 20, 2019, 3:04:58 PM
to Praxis LIVE software discussion group


On Wed, 20 Mar 2019, 17:43 Davide D'Angelo, <77d...@gmail.com> wrote:
> gives me the stream with just some 200-300 ms of delay, which for my intent is quite ok. I need this camera for stage monitoring while playing automations.

Great! If you check rtspsrc with gst-inspect you'll also see a property for dropping data to keep latency, although it didn't work so well with H.264 here.
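A sketch of what that might look like: the property in question is presumably rtspsrc's drop-on-latency, and the URL below is a placeholder.

```shell
# Show rtspsrc's properties, including latency and drop-on-latency:
gst-inspect-1.0 rtspsrc

# Sketch: ask rtspsrc to drop late buffers to hold the target latency.
# Placeholder URL; as noted above, this may not behave well with H.264,
# since dropping mid-GOP data produces artifacts until the next keyframe.
gst-launch-1.0 rtspsrc latency=10 drop-on-latency=true \
    location="rtsp://192.168.1.10:554/stream" ! \
    rtph264depay ! decodebin ! autovideosink
```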

Your project sounds interesting, would love to hear more about it. 

> Is there something I can do to help you to make it work in video:capture?

Thanks, but I don't think so - I seem to have it working here already, although it needs more testing. The current video:capture code creates user-defined pipelines in a bin (container), and because decodebin doesn't have an output at that stage, no output is created on the bin. I've changed the code to create everything at the top level, which should allow this to work and provide more flexibility.

This should be in the next release in a week or so. But if you need something sooner I can make a dev build for you to test with? 

Best wishes, 

Neil

Spiderdab

Mar 20, 2019, 4:27:43 PM
to praxi...@googlegroups.com
I would be very interested in trying that as soon as possible.
Thank you very much!

--
You received this message because you are subscribed to the Google Groups "PraxisLIVE software discussion group" group.
To unsubscribe from this group and stop receiving emails from it, send an email to praxis-live...@googlegroups.com.
To post to this group, send email to praxi...@googlegroups.com.
Visit this group at https://groups.google.com/group/praxis-live.
For more options, visit https://groups.google.com/d/optout.

Davide D'Angelo

Mar 20, 2019, 11:41:40 PM
to PraxisLIVE software discussion group
> Your project sounds interesting, would love to hear more about it.
I'm working on helper software for my job. I work as an automation operator/programmer for live shows (theater, musicals, concerts...) and a couple of things are lacking:
- a video system to monitor the stage, but programmable, i.e. I need to draw on the video, zoom with the mouse wheel, and maybe also track IR LED markers in the dark and draw something between those points.
- cue-call software to help when I have hundreds of cues (automation movements) in a show; the software stays in sync with the music and gives a countdown for every cue.
- a way to keep it in sync, i.e. SMPTE; that's why I was asking about that, because until now the sync has been a manual start at the beginning of the song.
- a message interchange system with sensors or other machines (I was thinking of OSC, or Art-Net...)

I have been using Processing, which is wonderful for its versatility, but I have to admit I very much like the way PraxisLIVE works and how it is structured. I love the component structure and how you can link everything as in an audio mixer. You've done a really good job, Neil.

Do you think it is possible (or easy) to add GUI appearance customization? I mean some simple tasks like adding colors to the item properties, or choosing a rounded look for the buttons.
In Processing I used to draw my GUI using a series of classes I wrote for the various controls, so I want to ask you: do you think it is better to stick with the PraxisLIVE GUI for whatever reason, or is it irrelevant and one can draw one's own GUI in p2d as usual?

Thanks, Davide.

Neil C Smith

Mar 21, 2019, 7:12:14 AM
to Praxis LIVE software discussion group
On Thu, 21 Mar 2019 at 03:41, Davide D'Angelo <77d...@gmail.com> wrote:
>>
>> Your project sounds interesting, would love to hear more about it.
>
> I'm working on a helper software for my job. I work as an automation operator/programmer for the live shows (theater, musicals, concerts..) and there is lack of a couple of things:
> - a video system to monitor the stage, but programmable. i.e. I need to draw on the video, zoom with the mouse wheel, maybe also keep track of IR LED markers in the dark and draw something between these points.
> - a cue call software to help when I have hundreds of cues (automation movements) on a show. the software is in sync with the music and give countdown for every cue.
> - a way to make it sync is SMPTE, that's why i was asking about that, because till now the sync was a manual start on the beginning of the song.
> - a message interchange system with sensors or other machines (I was thinking at OSC, or artNet..)

OK, really interesting! Let me know if you need any help. It would
make a great case study of PraxisLIVE in action! Zoom on mouse wheel
might be awkward - some of the events aren't exposed at the moment,
although it is possible to access them.

I've put the zip with updated GStreamer code up at
https://github.com/praxis-live/praxis-live/releases/download/v4.1.1/PraxisLIVE.v4.1.1.updated.GStreamer.zip
for now.

You'll have to extract and run manually from ./bin/praxis_live

> I have been using Processing, which is wonderful for its versatility, but I have to admit I like very much the way PraxisLive works and how it is structured. I love the components structure and how you can link everything as in a audio mixer. You've done a really good job Neil.

Thanks!

> Do you think it is possible (or easy) to add the possibility to make Gui aspect customization? I mean, some simple task like add colors into the items property, or choose a rounded aspect for the buttons.

With the existing GUI code, no.

> On processing I used to draw my GUI using a series of classes I've done for the various controls, so I want to ask you: Do you think is better to stick on PL Gui for whatever reason, or it is irrilevant and one can draw his own Gui in p2d as usual?

Theoretically if you're using the one window you can use Processing
code as normal. You can run multiple windows if you use the
distributed hubs to put them in separate VMs - Processing has a
tendency to crash running more than one OpenGL window in the same JVM!

As touched on above, you have access to Processing's mouseX, mouseY,
mousePressed variables, but not the event methods - they are difficult
to map because they fire at the wrong time! I'm looking into that.

The built-in GUI has the benefit that everything automatically syncs
up, but it is limited.

The other option is building your UI via Swing, JavaFX, HTML5, etc.
using libraries. The GUI side of PraxisLIVE will get an overhaul at
some point soon. Or you could look at a secondary piece of software
that can build UIs for OSC and bind that - Ossia Score, IanniX,
Rhizome, etc.?

Spiderdab

Mar 21, 2019, 7:54:21 AM
to praxi...@googlegroups.com


On Thu 21 Mar 2019 at 12:12, Neil C Smith <ne...@neilcsmith.net> wrote:
> On Thu, 21 Mar 2019 at 03:41, Davide D'Angelo <77d...@gmail.com> wrote:
> >>
> >> Your project sounds interesting, would love to hear more about it.
> >
> > I'm working on a helper software for my job. I work as an automation operator/programmer for the live shows (theater, musicals, concerts..) and there is lack of a couple of things:
> > - a video system to monitor the stage, but programmable. i.e. I need to draw on the video, zoom with the mouse wheel, maybe also keep track of IR LED markers in the dark and draw something between these points.
> > - a cue call software to help when I have hundreds of cues (automation movements) on a show. the software is in sync with the music and give countdown for every cue.
> > - a way to make it sync is SMPTE, that's why i was asking about that, because till now the sync was a manual start on the beginning of the song.
> > - a message interchange system with sensors or other machines (I was thinking at OSC, or artNet..)
>
> OK, really interesting!  Let me know if you need any help.  It would
> make a great case study of PraxisLIVE in action!

Thanks.

> Zoom on mouse wheel
> might be awkward - some of the events aren't exposed at the moment,
> although it is possible to access.

OK. Are key events a better option?

> I've put the zip with updated GStreamer code up at
> https://github.com/praxis-live/praxis-live/releases/download/v4.1.1/PraxisLIVE.v4.1.1.updated.GStreamer.zip
> for now.
>
> You'll have to extract and run manually from ./bin/praxis_live

Wonderful, I'm going to give it a try this evening.

> > I have been using Processing, which is wonderful for its versatility, but I have to admit I like very much the way PraxisLive works and how it is structured. I love the components structure and how you can link everything as in a audio mixer. You've done a really good job Neil.
>
> Thanks!
>
> > Do you think it is possible (or easy) to add the possibility to make Gui aspect customization? I mean, some simple task like add colors into the items property, or choose a rounded aspect for the buttons.
>
> With the existing GUI code, no.
>
> > On processing I used to draw my GUI using a series of classes I've done for the various controls, so I want to ask you: Do you think is better to stick on PL Gui for whatever reason, or it is irrilevant and one can draw his own Gui in p2d as usual?
>
> Theoretically if you're using the one window you can use Processing
> code as normal.  You can run multiple windows if you use the
> distributed hubs to put them in separate VMs - Processing has a
> tendency to crash running more than one OpenGL window in the same JVM!
>
> As touched on above, you have access to Processing's mouseX, mouseY,
> mousePressed variables, but not the event methods - they are difficult
> to map because they fire at the wrong time!  I'm looking into that.
>
> The built-in GUI has the benefit that everything automatically syncs
> up, but it is limited.
>
> The other option is building your UI via Swing, JavaFX, HTML5, etc.
> using libraries.  The GUI side of PraxisLIVE will get an overhaul at
> some point soon.  Or you could look at a secondary piece of software
> that can build UIs for OSC and bind that - Ossia Score, IanniX,
> Rhizome, etc.?

I like this idea. It would allow a modular GUI too, and also the possibility of linking a tablet, for example.

Thank you for your availability.
Davide.

Spiderdab

Mar 21, 2019, 9:39:53 AM
to praxi...@googlegroups.com

> The other option is building your UI via Swing, JavaFX, HTML5, etc.
> using libraries.  The GUI side of PraxisLIVE will get an overhaul at
> some point soon.  Or you could look at a secondary piece of software
> that can build UIs for OSC and bind that - Ossia Score, IanniX,
> Rhizome, etc.?

I'm looking at those projects; they look great!
I've also found this project, which I want to look at: https://osc.ammd.net/

Spiderdab

Mar 21, 2019, 7:49:29 PM
to praxi...@googlegroups.com
> I've put the zip with updated GStreamer code up at
> https://github.com/praxis-live/praxis-live/releases/download/v4.1.1/PraxisLIVE.v4.1.1.updated.GStreamer.zip
> for now.
>
> You'll have to extract and run manually from ./bin/praxis_live

Hi, I could finally try your mods: I created a video graph, added a video:capture, then changed its device to my RTSP address:
gst-launch-1.0 rtspsrc latency=10 location="rtsp://192.168.1.10:554/user=admin&password=&channel=1&stream=0.sdp?real_stream" ! rtph264depay ! decodebin ! autovideosink

and Run -> Play.
After that, a snapshot of the camera is shown in another window (other than the first black one), PraxisLIVE blocks, and I need to kill the process manually...
Am I doing something wrong?

Neil C Smith

Mar 21, 2019, 7:54:15 PM
to Praxis LIVE software discussion group


On Thu, 21 Mar 2019, 23:49 Spiderdab, <77d...@gmail.com> wrote:
> gst-launch-1.0 rtspsrc latency=10 location="rtsp://192.168.1.10:554/user=admin&password=&channel=1&stream=0.sdp?real_stream" ! rtph264depay ! decodebin ! autovideosink
>
> and Run -> Play.
> after that a snapshot of the camera is shown in another window (other than the first black one) and Praxis Live blocks, and I need to kill the process manually...
> Am I doing something wrong?

Yes. Remove "! autovideosink", or you're overriding the sink that PraxisLIVE is trying to add for you. And remove the leading gst-launch-1.0 too, if you really have that in there.

Interesting it blocks to that extent though! That shouldn't happen. 

Best wishes, 

Neil

Spiderdab

Mar 21, 2019, 7:59:57 PM
to praxi...@googlegroups.com
...oops... yes, now it works. I should have figured that out...
This is my device property now:
rtspsrc latency=10 location="rtsp://192.168.1.10:554/user=admin&password=&channel=1&stream=0.sdp?real_stream" ! rtph264depay ! decodebin
and that works...
Thanks again...


Spiderdab

Mar 22, 2019, 5:56:12 PM
to praxi...@googlegroups.com
Hi, can you tell me if perspective() works inside a p3d component?
I can confirm camera() works as expected, but if I use perspective(), nothing is drawn.
I need it because I need to change the camera angle.

Neil C Smith

Mar 22, 2019, 6:19:39 PM
to Praxis LIVE software discussion group
Hi,


On Fri, 22 Mar 2019, 21:56 Spiderdab, <77d...@gmail.com> wrote:
> Hi, can you tell me if perspective() works into a p3d object?
> I can confirm camera() works as expected, but if I use perspective() nothing is drawn.
> I need it because I need to change the camera angle.

To be honest, I'm not sure - it maps straight through to the underlying Processing PGraphics, but I've not really used it myself. Do you need it to achieve this, though? I've used the PeasyCam library quite a bit recently, and that just uses camera(), I think.

Best wishes, 

Neil

Davide D'Angelo

Mar 23, 2019, 11:13:36 AM
to PraxisLIVE software discussion group
PeasyCam cam;

    @Override
    public void setup() {
        processing.core.PGraphics pg = find(processing.core.PGraphics.class).get();
        cam = new PeasyCam(pg.parent, pg, 200);
    }

    @Override
    public void draw() {
        perspective(0.5, 1.77, 0.0, 1000.0);
        lights();
        
        cam.feed();
        //perspective(2.0, 1.77, 0.0, 1000.0);
        
        background(24);
        stroke(0);
        strokeWeight(0.5);
        
        pushMatrix();
        translate(-100, 0, 0);
        fill(0,96,0);
        box(100);
        popMatrix();
        
        pushMatrix();
        translate(100, 0, 0);
        rotateX(PI/2);
        fill(255, 60, 120);
        sphere(80);
        popMatrix();

    }

I've tried to just add the perspective() line to working PeasyCam code borrowed from you, and it doesn't show anything anymore. Is this the right way to add it?
I need it because I have to align what I'm drawing with the things shown by the camera.

Neil C Smith

Mar 23, 2019, 3:51:10 PM
to Praxis LIVE software discussion group
On Sat, 23 Mar 2019 at 15:13, Davide D'Angelo <77d...@gmail.com> wrote:
> I've tried to just add the perspective() line to a working peasy code stolen from you, and it doesn't show anything anymore. is it the right way to add it?
> I need that because I have to collimate what I'm drawing with the things shown in the camera.

I've slightly revised the PeasyCam code for an example I'm working on,
but I don't have it to hand to reference right now.

Does the perspective code you have work correctly in Processing itself?

Spiderdab

Mar 23, 2019, 5:15:38 PM
to praxi...@googlegroups.com
Yes, it works. I have been looking for the perspective() function's source code, but I couldn't find it.


Spiderdab

Mar 23, 2019, 8:24:10 PM
to praxi...@googlegroups.com
I'm really sorry: I can confirm that perspective() works inside PraxisLIVE.
It was my fault; all the values I tried before were simply not visible...

I apologize.

Davide D'Angelo

Mar 26, 2019, 10:27:37 AM
to PraxisLIVE software discussion group
Hi, another question for you about RTSP camera streaming.

Everything works excellently in PraxisLIVE, but if I run my project from the terminal using the command:
praxis ./project.pxp
everything works except the two cameras I've set up.

The RTSP address is set in both video:capture device properties like this:
rtspsrc latency=0 buffer-mode=0 udp-buffer-size=400000 location="rtsp://192.168.1.10:554/user=admin&password=&channel=1&stream=0.sdp?real_stream" ! rtph264depay ! decodebin ! videocrop top=285 bottom=285

changing only the IP between the two cameras.

Do you know why?

Davide D'Angelo

Mar 26, 2019, 10:47:44 AM
to PraxisLIVE software discussion group
I forgot to say that I added a core:start-trigger, which passes through a fixed 4-second core:delay and then into the start input of the two video:capture components (the cameras).
[Attached screenshot: Schermata del 2019-03-26 15-45-18.png]

Neil C Smith

Mar 26, 2019, 1:58:41 PM
to Praxis LIVE software discussion group
On Tue, 26 Mar 2019 at 14:27, Davide D'Angelo <77d...@gmail.com> wrote:
> Every is working excellent on PraxisLive, but if I run my project from the terminal using the command:
> praxis ./project.pxp
> everything works except the two cameras I've setup.

That's a bit weird! There shouldn't be any difference in behaviour
there. The only thing that seems slightly unusual is running praxis
with the project.pxp file - you would normally use the project folder
- although that shouldn't cause an issue.

Can you replicate with a simple single video:capture component
project? start-trigger -> capture -> output

Although, the obvious question: are you using the updated PraxisLIVE
zip, but with a system-installed praxis command? You'll need to use the
./bin/praxis command in the zip you downloaded until 4.2 is released.
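To make the distinction concrete, the two invocations might look like the sketch below (the paths are illustrative; the point is to use the launcher inside the extracted dev zip, and to pass the project folder rather than the .pxp file):

```shell
# System-installed launcher from an older release - won't have the fix:
# praxis ./project.pxp

# Launcher inside the extracted dev zip (until 4.2 is released),
# pointed at the project folder:
./PraxisLIVE/bin/praxis ./myproject
```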

Spiderdab

Mar 26, 2019, 2:28:57 PM
to praxi...@googlegroups.com
It was obvious, wasn't it?
Using the correct (updated) binary, it works...
Thank you... again.
