Troubleshooting a working pipeline with gst1-java


Abu Abdullah

Dec 2, 2016, 10:59:44 PM
to gstreamer-java
Hi,

I have the following pipeline working from the command line (Windows):
gst-launch-1.0 udpsrc port=5000 ! application/x-rtp,payload=96 ! rtpjitterbuffer ! rtph264depay ! tee name=t \
 t. ! queue max-size-buffers=0 max-size-time=0 max-size-bytes=0 ! decodebin ! x264enc ! mpegtsmux ! filesink location=snapshot.mp4 \
 t. ! queue max-size-buffers=0 max-size-time=0 max-size-bytes=0 ! avdec_h264 ! autovideosink sync=false

but I get the following warning when it is used with gst1-java:
WARNING: DecodeBin: [decodebin0] code: 7 message: Delayed linking failed.

The pipeline in the code is the same, except that autovideosink is replaced with videoconvert:

udpsrc port=5000 ! application/x-rtp,payload=96 ! rtpjitterbuffer ! rtph264depay ! tee name=t \
 t. ! queue max-size-buffers=0 max-size-time=0 max-size-bytes=0 ! decodebin ! x264enc ! mpegtsmux ! filesink location=snapshot.mp4 \
 t. ! queue max-size-buffers=0 max-size-time=0 max-size-bytes=0 ! avdec_h264 ! videoconvert

I'm using the following code:

import org.freedesktop.gstreamer.*;
import org.freedesktop.gstreamer.elements.AppSink;

import java.nio.ByteOrder;

import javafx.application.Platform;
import javafx.beans.value.ChangeListener;
import javafx.beans.value.ObservableValue;
import javafx.scene.image.Image;
import javafx.scene.image.ImageView;
import javafx.scene.layout.BorderPane;

public class GstreamerFX extends BorderPane
{
    String cameraPipeline = "udpsrc port=5000 ! application/x-rtp,payload=96 ! rtpjitterbuffer ! rtph264depay ! tee name=t t. ! queue max-size-buffers=0 max-size-time=0 max-size-bytes=0 ! decodebin ! x264enc ! mpegtsmux ! filesink location=snapshot.mp4 t. ! queue max-size-buffers=0 max-size-time=0 max-size-bytes=0 ! avdec_h264 ! videoconvert";

    GstreamerFX()
    {
        Gst.init();

        AppSink videosink = new AppSink("GstVideoComponent");
        videosink.set("emit-signals", true);

        AppSinkListener gstListener = new AppSinkListener();
        videosink.connect(gstListener);

        StringBuilder caps = new StringBuilder("video/x-raw,pixel-aspect-ratio=1/1,");

        // JNA creates ByteBuffers using the native byte order; set the masks accordingly.
        if (ByteOrder.nativeOrder() == ByteOrder.LITTLE_ENDIAN)
            caps.append("format=BGRx");
        else
            caps.append("format=xRGB");

        videosink.setCaps(new Caps(caps.toString()));
        videosink.set("max-buffers", 5000);
        videosink.set("drop", true);

        final Bin bin = Bin.launch(cameraPipeline, true);
        final Pipeline pipe = new Pipeline();
        pipe.addMany(bin, videosink);
        Pipeline.linkMany(bin, videosink);

        final ImageView imageView = new ImageView();

        final ImageContainer imageContainer = gstListener.getImageContainer();
        imageContainer.addListener(new ChangeListener<Image>()
        {
            @Override
            public void changed(ObservableValue<? extends Image> observable, Image oldValue, Image newValue)
            {
                Platform.runLater(new Runnable()
                {
                    @Override
                    public void run()
                    {
                        imageView.setImage(newValue);
                    }
                });
            }
        });

        // Listen for the EOS signal (to shut the pipeline down smoothly) and for error messages.
        // Without connecting the bus, the pipeline gets stuck and does not work!
        final Bus bus = pipe.getBus();
        bus.connect(new Bus.EOS()
        {
            public void endOfStream(GstObject source)
            {
                System.out.println("EOS: " + source);
                pipe.setState(State.NULL);
                Gst.quit();
            }
        });

        bus.connect(new Bus.ERROR()
        {
            public void errorMessage(GstObject source, int code, String message)
            {
                System.out.println("ERROR: " + source + " code: " + code + " message: " + message);
                Gst.quit();
            }
        });

        bus.connect(new Bus.WARNING()
        {
            @Override
            public void warningMessage(GstObject source, int code, String message)
            {
                System.out.println("WARNING: " + source + " code: " + code + " message: " + message);
            }
        });

        setCenter(imageView);

        Thread t = new Thread(() -> {
            pipe.play();
        });
        t.start();
    }
}

Any support is appreciated.

Abu Abdullah

Dec 2, 2016, 11:03:46 PM
to gstreamer-java
I also tried removing "decodebin ! x264enc" and managed to get the display, but the resulting file was not recorded correctly.

LukasEeeE

Dec 3, 2016, 7:09:59 AM
to gstreamer-java
So when you remove the decodebin and the x264enc, the display starts to work?
Although they are in different branches?
That sounds odd.

So what exactly have you done since your first post?

The first thing I would recommend is to connect a bus handler for Bus.MESSAGE.
The messages you get there have a function called .getStructure(), or something like that. Your IDE will help you. ;)
This will give you a little more information for a start.
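[Editor's note: a minimal sketch of such a catch-all handler with gst1-java-core might look like the following. The Bus.MESSAGE callback and Message.getStructure() names are my reading of the API hinted at above, and "videotestsrc ! fakesink" is just a stand-in pipeline for illustration.]

```java
import org.freedesktop.gstreamer.*;

public class BusMessageExample
{
    public static void main(String[] args)
    {
        Gst.init();

        // Stand-in pipeline; replace with the real udpsrc description.
        Pipeline pipe = Pipeline.launch("videotestsrc ! fakesink");

        // Catch-all handler: prints every bus message with its structure,
        // which is far more informative than the warning text alone.
        pipe.getBus().connect(new Bus.MESSAGE()
        {
            @Override
            public void busMessage(Bus bus, Message message)
            {
                System.out.println("Message: " + message.getStructure() + " Bus: " + bus);
            }
        });

        pipe.play();
        Gst.main();
    }
}
```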

The warning:

WARNING: DecodeBin: [decodebin0] code: 7 message: Delayed linking failed.

could come from the fact that the source pad of the decodebin is a sometimes pad.
If you use decodebin on the command line, it will automatically connect the pad correctly at the time it becomes available.
If you are writing your own application, no matter whether it's in C, Python, or Java, you have to take care of that yourself.
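[Editor's note: for reference, taking care of a sometimes pad in gst1-java-core looks roughly like this. This is a sketch under assumptions: the element names are illustrative, and the Element.PAD_ADDED callback is my best guess at the library's signal interface.]

```java
import org.freedesktop.gstreamer.*;

public class PadAddedExample
{
    public static void main(String[] args)
    {
        Gst.init();

        final Element decodebin = ElementFactory.make("decodebin", "dec");
        final Element encoder = ElementFactory.make("x264enc", "enc");

        // decodebin exposes its source pad only once the stream type is
        // known, so link it in the pad-added callback rather than up front.
        decodebin.connect(new Element.PAD_ADDED()
        {
            @Override
            public void padAdded(Element element, Pad pad)
            {
                Pad sink = encoder.getStaticPad("sink");
                if (!sink.isLinked())
                {
                    pad.link(sink);
                }
            }
        });
    }
}
```

This is what gst-launch-1.0 (and, as noted below, Bin.launch()) does for you behind the scenes.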

Abu Abdullah

Dec 3, 2016, 7:44:17 AM
to gstreamer-java
On Saturday, December 3, 2016 at 4:09:59 PM UTC+4, LukasEeeE wrote:
So when you remove the decodebin and the x264enc, the display starts to work?
Although they are in different branches?
That sounds odd.


decodebin and x264enc are used for the filesink, not for the videosink. So when I removed them, the other tee branch (the videosink) started working.

udpsrc port=5000 ! application/x-rtp,payload=96 ! rtpjitterbuffer ! rtph264depay ! tee name=t \
 t. ! queue max-size-buffers=0 max-size-time=0 max-size-bytes=0 ! decodebin ! x264enc ! mpegtsmux ! filesink location=snapshot.mp4 \
 t. ! queue max-size-buffers=0 max-size-time=0 max-size-bytes=0 ! avdec_h264 ! videoconvert

So what exactly have you done since your first post?


I just tried this:
udpsrc port=5000 ! application/x-rtp,payload=96 ! rtpjitterbuffer ! rtph264depay ! tee name=t \
 t. ! queue max-size-buffers=0 max-size-time=0 max-size-bytes=0 ! mpegtsmux ! filesink location=snapshot.mp4 \
 t. ! queue max-size-buffers=0 max-size-time=0 max-size-bytes=0 ! avdec_h264 ! videoconvert

Only the second tee branch is working now. Initially, neither of them was working, and the error was "Delayed linking failed".
 
The first thing I would recommend is to connect a bus handler for Bus.MESSAGE.
The messages you get there have a function called .getStructure(), or something like that. Your IDE will help you. ;)
This will give you a little more information for a start.


Thanks for the hint. These are the messages that come out now:


Message: GstMessageStateChanged, old-state=(GstState)GST_STATE_NULL, new-state=(GstState)GST_STATE_READY, pending-state=(GstState)GST_STATE_VOID_PENDING; Bus: Bus: [bus3]
Message: GstMessageStateChanged, old-state=(GstState)GST_STATE_NULL, new-state=(GstState)GST_STATE_READY, pending-state=(GstState)GST_STATE_PLAYING; Bus: Bus: [bus3]
Message: GstMessageStateChanged, old-state=(GstState)GST_STATE_READY, new-state=(GstState)GST_STATE_PAUSED, pending-state=(GstState)GST_STATE_VOID_PENDING; Bus: Bus: [bus3]
Message: GstMessageStreamStatus, type=(GstStreamStatusType)GST_STREAM_STATUS_TYPE_CREATE, owner=(GstElement)"\(GstQueue\)\ queue1", object=(GstTask)"\(GstTask\)\ queue1:src"; Bus: Bus: [bus3]
Message: GstMessageStateChanged, old-state=(GstState)GST_STATE_READY, new-state=(GstState)GST_STATE_PAUSED, pending-state=(GstState)GST_STATE_VOID_PENDING; Bus: Bus: [bus3]
Message: GstMessageStateChanged, old-state=(GstState)GST_STATE_READY, new-state=(GstState)GST_STATE_PAUSED, pending-state=(GstState)GST_STATE_VOID_PENDING; Bus: Bus: [bus3]
Message: GstMessageStreamStatus, type=(GstStreamStatusType)GST_STREAM_STATUS_TYPE_CREATE, owner=(GstElement)"\(GstQueue\)\ queue0", object=(GstTask)"\(GstTask\)\ queue0:src"; Bus: Bus: [bus3]
Message: GstMessageStateChanged, old-state=(GstState)GST_STATE_READY, new-state=(GstState)GST_STATE_PAUSED, pending-state=(GstState)GST_STATE_VOID_PENDING; Bus: Bus: [bus3]
Message: GstMessageStreamStatus, type=(GstStreamStatusType)GST_STREAM_STATUS_TYPE_ENTER, owner=(GstElement)"\(GstQueue\)\ queue1", object=(GstTask)"\(GstTask\)\ queue1:src"; Bus: Bus: [bus3]
Message: GstMessageStreamStatus, type=(GstStreamStatusType)GST_STREAM_STATUS_TYPE_ENTER, owner=(GstElement)"\(GstQueue\)\ queue0", object=(GstTask)"\(GstTask\)\ queue0:src"; Bus: Bus: [bus3]
Message: GstMessageStateChanged, old-state=(GstState)GST_STATE_READY, new-state=(GstState)GST_STATE_PAUSED, pending-state=(GstState)GST_STATE_VOID_PENDING; Bus: Bus: [bus3]
Message: GstMessageStateChanged, old-state=(GstState)GST_STATE_READY, new-state=(GstState)GST_STATE_PAUSED, pending-state=(GstState)GST_STATE_VOID_PENDING; Bus: Bus: [bus3]
Message: GstMessageStreamStatus, type=(GstStreamStatusType)GST_STREAM_STATUS_TYPE_CREATE, owner=(GstElement)"\(GstRtpJitterBuffer\)\ rtpjitterbuffer0", object=(GstTask)"\(GstTask\)\ rtpjitterbuffer0:src"; Bus: Bus: [bus3]
Message: GstMessageStateChanged, old-state=(GstState)GST_STATE_READY, new-state=(GstState)GST_STATE_PAUSED, pending-state=(GstState)GST_STATE_VOID_PENDING; Bus: Bus: [bus3]
Message: GstMessageStateChanged, old-state=(GstState)GST_STATE_READY, new-state=(GstState)GST_STATE_PAUSED, pending-state=(GstState)GST_STATE_VOID_PENDING; Bus: Bus: [bus3]
Message: GstMessageStreamStatus, type=(GstStreamStatusType)GST_STREAM_STATUS_TYPE_ENTER, owner=(GstElement)"\(GstRtpJitterBuffer\)\ rtpjitterbuffer0", object=(GstTask)"\(GstTask\)\ rtpjitterbuffer0:src"; Bus: Bus: [bus3]
Message: GstMessageStreamStatus, type=(GstStreamStatusType)GST_STREAM_STATUS_TYPE_CREATE, owner=(GstElement)"\(GstUDPSrc\)\ udpsrc0", object=(GstTask)"\(GstTask\)\ udpsrc0:src"; Bus: Bus: [bus3]
Message: GstMessageStateChanged, old-state=(GstState)GST_STATE_READY, new-state=(GstState)GST_STATE_PAUSED, pending-state=(GstState)GST_STATE_VOID_PENDING; Bus: Bus: [bus3]
Message: GstMessageStateChanged, old-state=(GstState)GST_STATE_READY, new-state=(GstState)GST_STATE_PAUSED, pending-state=(GstState)GST_STATE_VOID_PENDING; Bus: Bus: [bus3]
Message: GstMessageStreamStatus, type=(GstStreamStatusType)GST_STREAM_STATUS_TYPE_ENTER, owner=(GstElement)"\(GstUDPSrc\)\ udpsrc0", object=(GstTask)"\(GstTask\)\ udpsrc0:src"; Bus: Bus: [bus3]
Message: GstMessageStateChanged, old-state=(GstState)GST_STATE_READY, new-state=(GstState)GST_STATE_PAUSED, pending-state=(GstState)GST_STATE_PLAYING; Bus: Bus: [bus3]
Message: GstMessageNewClock, clock=(GstClock)"\(GstSystemClock\)\ GstSystemClock"; Bus: Bus: [bus3]
Message: GstMessageStateChanged, old-state=(GstState)GST_STATE_PAUSED, new-state=(GstState)GST_STATE_PLAYING, pending-state=(GstState)GST_STATE_VOID_PENDING; Bus: Bus: [bus3]
Message: GstMessageStateChanged, old-state=(GstState)GST_STATE_PAUSED, new-state=(GstState)GST_STATE_PLAYING, pending-state=(GstState)GST_STATE_VOID_PENDING; Bus: Bus: [bus3]
Message: GstMessageStateChanged, old-state=(GstState)GST_STATE_NULL, new-state=(GstState)GST_STATE_READY, pending-state=(GstState)GST_STATE_VOID_PENDING; Bus: Bus: [bus3]
Message: null Bus: Bus: [bus3]
Message: GstMessageStateChanged, old-state=(GstState)GST_STATE_READY, new-state=(GstState)GST_STATE_PAUSED, pending-state=(GstState)GST_STATE_VOID_PENDING; Bus: Bus: [bus3]
Message: GstMessageStateChanged, old-state=(GstState)GST_STATE_NULL, new-state=(GstState)GST_STATE_READY, pending-state=(GstState)GST_STATE_VOID_PENDING; Bus: Bus: [bus3]
Message: GstMessageStateChanged, old-state=(GstState)GST_STATE_READY, new-state=(GstState)GST_STATE_PAUSED, pending-state=(GstState)GST_STATE_VOID_PENDING; Bus: Bus: [bus3]
Message: GstMessageStateChanged, old-state=(GstState)GST_STATE_READY, new-state=(GstState)GST_STATE_PAUSED, pending-state=(GstState)GST_STATE_VOID_PENDING; Bus: Bus: [bus3]
Message: GstMessageWarning, gerror=(GError)NULL, debug=(string)"./grammar.y\(506\):\ gst_parse_no_more_pads\ \(\):\ /GstPipeline:pipeline0/GstBin:bin0/GstDecodeBin:decodebin0:\012failed\ delayed\ linking\ some\ pad\ of\ GstDecodeBin\ named\ decodebin0\ to\ some\ pad\ of\ GstX264Enc\ named\ x264enc0"; Bus: Bus: [bus3]

WARNING: DecodeBin: [decodebin0] code: 7 message: Delayed linking failed.
Message: GstMessageStateChanged, old-state=(GstState)GST_STATE_READY, new-state=(GstState)GST_STATE_PAUSED, pending-state=(GstState)GST_STATE_VOID_PENDING; Bus: Bus: [bus3]



The warning:
WARNING: DecodeBin: [decodebin0] code: 7 message: Delayed linking failed.

could come from the fact that the source pad of the decodebin is a sometimes pad.
If you use decodebin on the command line, it will automatically connect the pad correctly at the time it becomes available.
If you are writing your own application, no matter whether it's in C, Python, or Java, you have to take care of that yourself.

Thanks for the info. I searched the forum for decodebin pads and got some hints about using Pipeline.launch(). I will try that and come back.

Neil C Smith

Dec 3, 2016, 8:29:45 AM
to gstream...@googlegroups.com
Hi,
Firstly, one of the reasons I like using Bin.launch() is that you
shouldn't have to take care of sometimes pad linking yourself.

Try launching your original pipeline in Java using Pipeline.launch()
to check it still works. Does it work if you replace autovideosink
with appsink? Does it work if you add the caps in front of the
appsink?

Note that your original working pipeline is not the same as the one
you're trying to run.

Incidentally, there's an updated version of the Swing video component
that takes an AppSink in its constructor (see the multisink example
that's just been added). Try adding the same feature to the JavaFX
component. You can then specify the appsink and its connections in
your pipeline description.

Best wishes,

Neil



--
Neil C Smith
Artist & Technologist
www.neilcsmith.net

Praxis LIVE - hybrid visual IDE for creative coding - www.praxislive.org

Abu Abdullah

Dec 3, 2016, 9:22:01 AM
to gstreamer-java
Thanks Neil/LukasEeeE,

It is now working with only Pipeline.launch().

The changes in the code were minimal. I replaced

AppSink videosink = new AppSink("GstVideoComponent");

with

Pipeline pipe = Pipeline.launch(cameraPipeline);
AppSink videosink = (AppSink) pipe.getElementByName("appsink");

and the cameraPipeline:
udpsrc port=5000 ! application/x-rtp,payload=96 ! rtpjitterbuffer ! rtph264depay ! tee name=t \
 t. ! queue max-size-buffers=0 max-size-time=0 max-size-bytes=0 ! decodebin ! x264enc tune=zerolatency ! mpegtsmux ! filesink location=snapshot.mp4 \
 t. ! queue max-size-buffers=0 max-size-time=0 max-size-bytes=0 ! avdec_h264 ! videoconvert ! appsink name=appsink
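[Editor's note: put together, this approach might look roughly like the sketch below. It is not the poster's exact code: videotestsrc stands in for the real UDP pipeline, and the emit-signals setting follows the first post.]

```java
import org.freedesktop.gstreamer.*;
import org.freedesktop.gstreamer.elements.AppSink;

public class LaunchExample
{
    public static void main(String[] args)
    {
        Gst.init();

        // Stand-in description; the real one is the camera pipeline above.
        String cameraPipeline = "videotestsrc ! videoconvert ! appsink name=appsink";

        // Pipeline.launch() parses the whole description, including the
        // named appsink, and wires up sometimes pads automatically.
        Pipeline pipe = Pipeline.launch(cameraPipeline);

        // Retrieve the appsink by the name given in the description.
        AppSink videosink = (AppSink) pipe.getElementByName("appsink");
        videosink.set("emit-signals", true);

        pipe.play();
        Gst.main();
    }
}
```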

LukasEeeE

Dec 4, 2016, 1:09:17 PM
to gstreamer-java
Thanks for posting your solution.
It makes the code a little nicer to read, since you can hand one function the whole pipeline.
I like that. :)