gst-launch-1.0 udpsrc port=5000 ! application/x-rtp,payload=96 ! rtpjitterbuffer ! rtph264depay ! tee name=t \
t. ! queue max-size-buffers=0 max-size-time=0 max-size-bytes=0 ! decodebin ! x264enc ! mpegtsmux ! filesink location=snapshot.mp4 \
t. ! queue max-size-buffers=0 max-size-time=0 max-size-bytes=0 ! avdec_h264 ! autovideosink sync=false

WARNING: DecodeBin: [decodebin0] code: 7 message: Delayed linking failed.

The same pipeline with autovideosink replaced by videoconvert:

udpsrc port=5000 ! application/x-rtp,payload=96 ! rtpjitterbuffer ! rtph264depay ! tee name=t \
t. ! queue max-size-buffers=0 max-size-time=0 max-size-bytes=0 ! decodebin ! x264enc ! mpegtsmux ! filesink location=snapshot.mp4 \
t. ! queue max-size-buffers=0 max-size-time=0 max-size-bytes=0 ! avdec_h264 ! videoconvert

import org.freedesktop.gstreamer.*;
import org.freedesktop.gstreamer.elements.AppSink;
import java.nio.ByteOrder;
import javafx.application.Platform;
import javafx.beans.value.ChangeListener;
import javafx.beans.value.ObservableValue;
import javafx.scene.image.Image;
import javafx.scene.image.ImageView;
import javafx.scene.layout.BorderPane;
public class GstreamerFX extends BorderPane
{
String cameraPipeline = "udpsrc port=5000 ! application/x-rtp,payload=96 ! rtpjitterbuffer ! rtph264depay ! tee name=t t. ! queue max-size-buffers=0 max-size-time=0 max-size-bytes=0 ! decodebin ! x264enc ! mpegtsmux ! filesink location=snapshot.mp4 t. ! queue max-size-buffers=0 max-size-time=0 max-size-bytes=0 ! avdec_h264 ! videoconvert";
GstreamerFX()
{
Gst.init();
AppSink videosink = new AppSink("GstVideoComponent");
videosink.set("emit-signals", true);
AppSinkListener GstListener = new AppSinkListener();
videosink.connect(GstListener);
StringBuilder caps = new StringBuilder("video/x-raw,pixel-aspect-ratio=1/1,");
// JNA creates ByteBuffer using native byte order, set masks according to that.
if (ByteOrder.nativeOrder() == ByteOrder.LITTLE_ENDIAN)
caps.append("format=BGRx");
else
caps.append("format=xRGB");
videosink.setCaps(new Caps(caps.toString()));
videosink.set("max-buffers", 5000);
videosink.set("drop", true);
final Bin bin = Bin.launch(cameraPipeline, true);
final Pipeline pipe = new Pipeline();
pipe.addMany(bin, videosink);
Pipeline.linkMany(bin, videosink);
final ImageView imageView = new ImageView();
final ImageContainer imageContainer = GstListener.getImageContainer();
imageContainer.addListener(new ChangeListener<Image>()
{
@Override
public void changed(ObservableValue<? extends Image> observable, Image oldValue, Image newValue)
{
Platform.runLater(new Runnable()
{
@Override
public void run()
{
imageView.setImage(newValue);
}
});
}
});
// listen for EOS signal (to shut pipeline down smoothly) and error messages.
// Without connecting the bus, it will get stuck and not work!
final Bus bus = pipe.getBus();
bus.connect(new Bus.EOS()
{
public void endOfStream(GstObject source)
{
System.out.println("EOS: " + source);
pipe.setState(State.NULL);
Gst.quit();
}
});
bus.connect(new Bus.ERROR()
{
public void errorMessage(GstObject source, int code, String message)
{
System.out.println("ERROR: " + source + " code: " + code + " message: " + message);
Gst.quit();
}
});
bus.connect(new Bus.WARNING()
{
@Override
public void warningMessage(GstObject source, int code, String message){
System.out.println("WARNING: " + source + " code: " + code + " message: " + message);
}
});
setCenter(imageView);
Thread t = new Thread(() -> {
pipe.play();
});
t.start();
}
}

I removed "decodebin ! x264enc" and I managed to get the display, but the resulting file was not recorded correctly.

WARNING: DecodeBin: [decodebin0] code: 7 message: Delayed linking failed.

So when you remove the decodebin and the x264enc, the display starts to work?
Although they are in different bins?
That sounds odd.
udpsrc port=5000 ! application/x-rtp,payload=96 ! rtpjitterbuffer ! rtph264depay ! tee name=t \
t. ! queue max-size-buffers=0 max-size-time=0 max-size-bytes=0 ! decodebin ! x264enc ! mpegtsmux ! filesink location=snapshot.mp4 \
t. ! queue max-size-buffers=0 max-size-time=0 max-size-bytes=0 ! avdec_h264 ! videoconvert

So what exactly have you done after your first post?
udpsrc port=5000 ! application/x-rtp,payload=96 ! rtpjitterbuffer ! rtph264depay ! tee name=t \
t. ! queue max-size-buffers=0 max-size-time=0 max-size-bytes=0 ! mpegtsmux ! filesink location=snapshot.mp4 \
t. ! queue max-size-buffers=0 max-size-time=0 max-size-bytes=0 ! avdec_h264 ! videoconvert

"Delayed linking failed"

The first thing I would recommend is to connect a bus handler for Bus.MESSAGE.
The messages you get there have a getStructure() method. Your IDE will help you. ;)
This will give you a little more information for a start.
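A minimal sketch of such a catch-all handler with gst1-java-core could look like the following (the BusDebug class and the dumpMessages method name are just placeholders; hook it up next to the EOS/ERROR/WARNING handlers in the code above):

import org.freedesktop.gstreamer.Bus;
import org.freedesktop.gstreamer.Message;
import org.freedesktop.gstreamer.Pipeline;
import org.freedesktop.gstreamer.Structure;

public final class BusDebug
{
// Call this right after creating the pipeline; it prints every bus message
// together with its structure (if it has one).
public static void dumpMessages(Pipeline pipe)
{
Bus bus = pipe.getBus();
bus.connect(new Bus.MESSAGE()
{
@Override
public void busMessage(Bus bus, Message message)
{
Structure s = message.getStructure(); // may be null for some message types
System.out.println("Message: " + (s != null ? s : "null") + " Bus: " + bus);
}
});
}
}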
Message: GstMessageStateChanged, old-state=(GstState)GST_STATE_NULL, new-state=(GstState)GST_STATE_READY, pending-state=(GstState)GST_STATE_VOID_PENDING; Bus: Bus: [bus3]
Message: GstMessageStateChanged, old-state=(GstState)GST_STATE_NULL, new-state=(GstState)GST_STATE_READY, pending-state=(GstState)GST_STATE_PLAYING; Bus: Bus: [bus3]
Message: GstMessageStateChanged, old-state=(GstState)GST_STATE_READY, new-state=(GstState)GST_STATE_PAUSED, pending-state=(GstState)GST_STATE_VOID_PENDING; Bus: Bus: [bus3]
Message: GstMessageStreamStatus, type=(GstStreamStatusType)GST_STREAM_STATUS_TYPE_CREATE, owner=(GstElement)"\(GstQueue\)\ queue1", object=(GstTask)"\(GstTask\)\ queue1:src"; Bus: Bus: [bus3]
Message: GstMessageStateChanged, old-state=(GstState)GST_STATE_READY, new-state=(GstState)GST_STATE_PAUSED, pending-state=(GstState)GST_STATE_VOID_PENDING; Bus: Bus: [bus3]
Message: GstMessageStateChanged, old-state=(GstState)GST_STATE_READY, new-state=(GstState)GST_STATE_PAUSED, pending-state=(GstState)GST_STATE_VOID_PENDING; Bus: Bus: [bus3]
Message: GstMessageStreamStatus, type=(GstStreamStatusType)GST_STREAM_STATUS_TYPE_CREATE, owner=(GstElement)"\(GstQueue\)\ queue0", object=(GstTask)"\(GstTask\)\ queue0:src"; Bus: Bus: [bus3]
Message: GstMessageStateChanged, old-state=(GstState)GST_STATE_READY, new-state=(GstState)GST_STATE_PAUSED, pending-state=(GstState)GST_STATE_VOID_PENDING; Bus: Bus: [bus3]
Message: GstMessageStreamStatus, type=(GstStreamStatusType)GST_STREAM_STATUS_TYPE_ENTER, owner=(GstElement)"\(GstQueue\)\ queue1", object=(GstTask)"\(GstTask\)\ queue1:src"; Bus: Bus: [bus3]
Message: GstMessageStreamStatus, type=(GstStreamStatusType)GST_STREAM_STATUS_TYPE_ENTER, owner=(GstElement)"\(GstQueue\)\ queue0", object=(GstTask)"\(GstTask\)\ queue0:src"; Bus: Bus: [bus3]
Message: GstMessageStateChanged, old-state=(GstState)GST_STATE_READY, new-state=(GstState)GST_STATE_PAUSED, pending-state=(GstState)GST_STATE_VOID_PENDING; Bus: Bus: [bus3]
Message: GstMessageStateChanged, old-state=(GstState)GST_STATE_READY, new-state=(GstState)GST_STATE_PAUSED, pending-state=(GstState)GST_STATE_VOID_PENDING; Bus: Bus: [bus3]
Message: GstMessageStreamStatus, type=(GstStreamStatusType)GST_STREAM_STATUS_TYPE_CREATE, owner=(GstElement)"\(GstRtpJitterBuffer\)\ rtpjitterbuffer0", object=(GstTask)"\(GstTask\)\ rtpjitterbuffer0:src"; Bus: Bus: [bus3]
Message: GstMessageStateChanged, old-state=(GstState)GST_STATE_READY, new-state=(GstState)GST_STATE_PAUSED, pending-state=(GstState)GST_STATE_VOID_PENDING; Bus: Bus: [bus3]
Message: GstMessageStateChanged, old-state=(GstState)GST_STATE_READY, new-state=(GstState)GST_STATE_PAUSED, pending-state=(GstState)GST_STATE_VOID_PENDING; Bus: Bus: [bus3]
Message: GstMessageStreamStatus, type=(GstStreamStatusType)GST_STREAM_STATUS_TYPE_ENTER, owner=(GstElement)"\(GstRtpJitterBuffer\)\ rtpjitterbuffer0", object=(GstTask)"\(GstTask\)\ rtpjitterbuffer0:src"; Bus: Bus: [bus3]
Message: GstMessageStreamStatus, type=(GstStreamStatusType)GST_STREAM_STATUS_TYPE_CREATE, owner=(GstElement)"\(GstUDPSrc\)\ udpsrc0", object=(GstTask)"\(GstTask\)\ udpsrc0:src"; Bus: Bus: [bus3]
Message: GstMessageStateChanged, old-state=(GstState)GST_STATE_READY, new-state=(GstState)GST_STATE_PAUSED, pending-state=(GstState)GST_STATE_VOID_PENDING; Bus: Bus: [bus3]
Message: GstMessageStateChanged, old-state=(GstState)GST_STATE_READY, new-state=(GstState)GST_STATE_PAUSED, pending-state=(GstState)GST_STATE_VOID_PENDING; Bus: Bus: [bus3]
Message: GstMessageStreamStatus, type=(GstStreamStatusType)GST_STREAM_STATUS_TYPE_ENTER, owner=(GstElement)"\(GstUDPSrc\)\ udpsrc0", object=(GstTask)"\(GstTask\)\ udpsrc0:src"; Bus: Bus: [bus3]
Message: GstMessageStateChanged, old-state=(GstState)GST_STATE_READY, new-state=(GstState)GST_STATE_PAUSED, pending-state=(GstState)GST_STATE_PLAYING; Bus: Bus: [bus3]
Message: GstMessageNewClock, clock=(GstClock)"\(GstSystemClock\)\ GstSystemClock"; Bus: Bus: [bus3]
Message: GstMessageStateChanged, old-state=(GstState)GST_STATE_PAUSED, new-state=(GstState)GST_STATE_PLAYING, pending-state=(GstState)GST_STATE_VOID_PENDING; Bus: Bus: [bus3]
Message: GstMessageStateChanged, old-state=(GstState)GST_STATE_PAUSED, new-state=(GstState)GST_STATE_PLAYING, pending-state=(GstState)GST_STATE_VOID_PENDING; Bus: Bus: [bus3]
Message: GstMessageStateChanged, old-state=(GstState)GST_STATE_NULL, new-state=(GstState)GST_STATE_READY, pending-state=(GstState)GST_STATE_VOID_PENDING; Bus: Bus: [bus3]
Message: null Bus: Bus: [bus3]
Message: GstMessageStateChanged, old-state=(GstState)GST_STATE_READY, new-state=(GstState)GST_STATE_PAUSED, pending-state=(GstState)GST_STATE_VOID_PENDING; Bus: Bus: [bus3]
Message: GstMessageStateChanged, old-state=(GstState)GST_STATE_NULL, new-state=(GstState)GST_STATE_READY, pending-state=(GstState)GST_STATE_VOID_PENDING; Bus: Bus: [bus3]
Message: GstMessageStateChanged, old-state=(GstState)GST_STATE_READY, new-state=(GstState)GST_STATE_PAUSED, pending-state=(GstState)GST_STATE_VOID_PENDING; Bus: Bus: [bus3]
Message: GstMessageStateChanged, old-state=(GstState)GST_STATE_READY, new-state=(GstState)GST_STATE_PAUSED, pending-state=(GstState)GST_STATE_VOID_PENDING; Bus: Bus: [bus3]
Message: GstMessageWarning, gerror=(GError)NULL, debug=(string)"./grammar.y\(506\):\ gst_parse_no_more_pads\ \(\):\ /GstPipeline:pipeline0/GstBin:bin0/GstDecodeBin:decodebin0:\012failed\ delayed\ linking\ some\ pad\ of\ GstDecodeBin\ named\ decodebin0\ to\ some\ pad\ of\ GstX264Enc\ named\ x264enc0"; Bus: Bus: [bus3]
WARNING: DecodeBin: [decodebin0] code: 7 message: Delayed linking failed.
Message: GstMessageStateChanged, old-state=(GstState)GST_STATE_READY, new-state=(GstState)GST_STATE_PAUSED, pending-state=(GstState)GST_STATE_VOID_PENDING; Bus: Bus: [bus3]
The error

WARNING: DecodeBin: [decodebin0] code: 7 message: Delayed linking failed.

could come from the fact that the source pad of the decodebin is a sometimes pad.
If you use decodebin on the command line, it will automatically connect the pad correctly at the time it becomes available.
If you are writing your own application, no matter whether it's in C, Python or Java, you have to take care of that yourself.
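As an illustration, here is a minimal gst1-java-core sketch of handling such a sometimes pad (the filesrc input, the file names and the element names are made up; the point is that the decodebin-to-encoder link is made in the pad-added callback rather than up front):

import org.freedesktop.gstreamer.*;

public class PadAddedSketch
{
public static void main(String[] args)
{
Gst.init();

Pipeline pipe = new Pipeline("record");
Element src = ElementFactory.make("filesrc", "src");
src.set("location", "input.ts"); // hypothetical input file
Element dec = ElementFactory.make("decodebin", "dec");
final Element enc = ElementFactory.make("x264enc", "enc");
Element mux = ElementFactory.make("mpegtsmux", "mux");
Element sink = ElementFactory.make("filesink", "sink");
sink.set("location", "snapshot.ts");

pipe.addMany(src, dec, enc, mux, sink);
src.link(dec);
enc.link(mux);
mux.link(sink);

// decodebin's src pad only appears once the stream type is known,
// so the dec -> enc link has to be made here instead of at construction time.
dec.connect(new Element.PAD_ADDED()
{
@Override
public void padAdded(Element source, Pad newPad)
{
if (!source.link(enc))
{
System.out.println("Could not link decodebin to x264enc");
}
}
});

pipe.play();
Gst.main();
}
}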
Instead of

AppSink videosink = new AppSink("GstVideoComponent");

parse the whole description and fetch the appsink by name:

Pipeline pipe = Pipeline.launch(cameraPipeline);
AppSink videosink = (AppSink) pipe.getElementByName("appsink");

cameraPipeline:

udpsrc port=5000 ! application/x-rtp,payload=96 ! rtpjitterbuffer ! rtph264depay ! tee name=t \
t. ! queue max-size-buffers=0 max-size-time=0 max-size-bytes=0 ! decodebin ! x264enc tune=zerolatency ! mpegtsmux ! filesink location=snapshot.mp4 \
t. ! queue max-size-buffers=0 max-size-time=0 max-size-bytes=0 ! avdec_h264 ! videoconvert ! appsink name=appsink
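Folded back into the constructor above, the setup could then look roughly like this sketch (it assumes the description ends in the appsink name=appsink branch and reuses the existing AppSinkListener and byte-order caps logic; the ImageView and bus wiring stay as before):

Gst.init();

// Parse the whole description, including the named appsink branch, in one go.
final Pipeline pipe = Pipeline.launch(cameraPipeline);

// Fetch the appsink the parser created instead of constructing and linking one by hand.
final AppSink videosink = (AppSink) pipe.getElementByName("appsink");
videosink.set("emit-signals", true);
videosink.set("drop", true);
videosink.setCaps(new Caps("video/x-raw,pixel-aspect-ratio=1/1,format="
        + (ByteOrder.nativeOrder() == ByteOrder.LITTLE_ENDIAN ? "BGRx" : "xRGB")));

AppSinkListener listener = new AppSinkListener();
videosink.connect(listener);

// ImageContainer / ImageView wiring and the bus handlers stay exactly as in the original constructor.
pipe.play();

The Bin.launch / addMany / linkMany calls are then no longer needed, since the parser builds and links the whole pipeline, appsink included.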