AppSrc in gstreamer-java: fixing the memory leak means failure; providing the asked-for size means garbage


Jim Carroll

Mar 4, 2018, 9:24:51 AM
to gstreamer-java

I have two pairs of seemingly intractable problems; fixing each one results in a different error. At the bottom is code for a pared-down version of my overall program to illustrate the problem.

The code uses an AppSrc, and in its NEED_DATA callback it pushes blank frames. The caps are set explicitly to 1920x1080 BGR frames. I feed it zeroed-out data buffers and expect xvimagesink to just show a black screen.

PROBLEM PAIR #1 (marked in a comment in the code):
 - If I don't unmap the buffer I push from NEED_DATA, I get a memory leak and the program quickly crashes.
 - If I do unmap the buffer I get the following repeatedly:
JNA: Callback org.freedesktop.gstreamer.elements.AppSrc$3@60f82f98 threw the following exception:
java.lang.IllegalStateException: Native object has been disposed
at org.freedesktop.gstreamer.lowlevel.NativeObject.handle(NativeObject.java:137)
at org.freedesktop.gstreamer.lowlevel.GTypeMapper$3.toNative(GTypeMapper.java:87)
at com.sun.jna.Function.convertArgument(Function.java:514)
at com.sun.jna.Function.invoke(Function.java:338)
at com.sun.jna.Library$Handler.invoke(Library.java:244)
at com.sun.proxy.$Proxy20.gst_buffer_unmap(Unknown Source)
at org.freedesktop.gstreamer.Buffer.unmap(Buffer.java:158)
at com.kognition.gstreamer.ReproduceLeak.lambda$0(ReproduceLeak.java:44)
at org.freedesktop.gstreamer.elements.AppSrc$3.callback(AppSrc.java:187)
at sun.reflect.GeneratedMethodAccessor3.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.sun.jna.CallbackReference$DefaultCallbackProxy.invokeCallback(CallbackReference.java:520)
at com.sun.jna.CallbackReference$DefaultCallbackProxy.callback(CallbackReference.java:551)



PROBLEM PAIR #2
  - If I pay attention to the size parameter and only feed it the amount of data it asks for, I repeatedly get:
0:00:00.179531512  5782 0x7f95e8522cf0 ERROR                default video-frame.c:175:gst_video_frame_map_id: invalid buffer size 4096 < 6220800
0:00:00.179567841  5782 0x7f95e8522cf0 WARN            videofilter gstvideofilter.c:292:gst_video_filter_transform:<videoscale> warning: invalid video buffer received

... and the xvimagesink displays garbage.
  - Alternatively, if I just feed it an entire frame (1 byte x 3 channels x 1080 rows x 1920 cols = 6220800 bytes), the error goes away, but the memory leak is exacerbated, and this just seems wrong according to all of the examples I've seen.

This problem is unchanged whether I feed the exact same empty frame in every time or create the frame-data ByteBuffer inside the NEED_DATA callback.
 
Any help would be greatly appreciated.

For reference the code is here:

    public static void main(final String[] args) throws Exception {
        Gst.init(ReproduceLeak.class.getSimpleName(), args);

        // Setup the AppSrc
        final AppSrc appsrc = (AppSrc) ElementFactory.make("appsrc", "appsrc");
        appsrc.setLive(true);
        appsrc.setTimestamp(true);
        appsrc.setFormat(Format.TIME);
        appsrc.set("emit-signals", true);
        appsrc.setStreamType(AppSrc.Type.STREAM);
        appsrc.setCaps(new Caps("video/x-raw,width=1920,height=1080,interlace-mode=progressive,pixel-aspect-ratio=1/1,framerate=30/1,format=BGR"));

        // NEED_DATA callback
        appsrc.connect((AppSrc.NEED_DATA) (elem, size) -> {
            System.out.println("Needs data: " + size);

            // NEED_DATA provides empty zeroed buffers

            // PROBLEM PAIR #2
            // Can feed full frames of data OR ...
            final ByteBuffer destBb = ByteBuffer.allocate(6220800 /* size */);
            // ... can feed what it asks for.
            // final ByteBuffer destBb = ByteBuffer.allocate(size);

            final Buffer destBuffer = new Buffer(destBb.remaining());
            destBuffer.map(true).put(destBb);
            elem.pushBuffer(destBuffer);

            destBb.rewind();

            // PROBLEM PAIR #1 : To unmap or not to unmap
            destBuffer.unmap();
        });

        // Prepare the video of blank images to display
        final Element e1 = ElementFactory.make("videoscale", "videoscale");
        final Element e2 = ElementFactory.make("videoconvert", "videoconvert");
        final Element e3 = ElementFactory.make("capsfilter", "capsfilter");
        e3.setCaps(new Caps("video/x-raw,width=300,height=300"));
        final Element e4 = ElementFactory.make("xvimagesink", "xvimagesink");

        // Prepare the video of blank images to display
        final Pipeline pipe = new Pipeline("pipe");

        // add the elements to the pipeline
        pipe.addMany(appsrc, e1, e2, e3, e4);

        // link the elements
        appsrc.link(e1);
        e1.link(e2);
        e2.link(e3);
        e3.link(e4);

        // ================================================
        // connect EOS and ERROR detection to the pipeline
        final Bus bus = pipe.getBus();
        bus.connect((Bus.EOS) (object) -> {
            System.out.println("end-of-stream: " + object);
            Gst.quit();
        });

        bus.connect((Bus.ERROR) (object, code, msg) -> {
            System.out.println("error:" + object + " code:" + code + " " + msg);
            Gst.quit();
        });
        // ================================================

        //
        pipe.play();

        // appsrc.setState(State.PLAYING);
        Gst.main();
    }



Jim Carroll

Mar 4, 2018, 9:29:12 AM
to gstreamer-java
A quick note: after struggling with the memory leak for a day, I found an example where someone had the same problem and called unmap BEFORE pushing the buffer, and that seems to have fixed at least that for me.

I'd still like to know what the right way to handle the NEED_DATA size parameter is. Also, I've found examples where people call pushBuffer from WITHIN the NEED_DATA callback and also examples where people do it outside of NEED_DATA. Is one of these approaches wrong? Personally I'd prefer not to write the code required to get data buffers into the NEED_DATA callback if I can just gate another thread's pushing of buffers to the AppSrc, but I'm not sure that's the way it's supposed to be done.

Neil C Smith

Mar 4, 2018, 10:03:33 AM
to gstream...@googlegroups.com
Hi,

On Sun, 4 Mar 2018 at 14:29 Jim Carroll <jimfc...@gmail.com> wrote:
A quick note. After struggling with the memory leak for a day I found an example where someone had the same problem and called the unmap BEFORE pushing the buffer and that seems to have fixed at least that for me.

Haha, I just spent 10 minutes reading to tell you to try exactly that! Pushing the buffer takes ownership, so the buffer reference is presumably invalid after that; reference management with MiniObject subclasses is less automated than it is for the GObject ones.
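Sketched against the gst1-java-core 0.9.x API visible in the stack traces above (the helper name `pushFrame` is mine, not from the bindings), the safe ordering Neil describes looks like this: fill the buffer while mapped, unmap, then push, and never touch the buffer again after pushBuffer:

```java
import java.nio.ByteBuffer;

import org.freedesktop.gstreamer.Buffer;
import org.freedesktop.gstreamer.elements.AppSrc;

public class SafePush {
    // Hypothetical helper illustrating the ownership rule: map/unmap
    // bracket the CPU access, and pushBuffer transfers ownership.
    static void pushFrame(AppSrc src, byte[] frame) {
        Buffer buf = new Buffer(frame.length);
        ByteBuffer bb = buf.map(true);   // map for writing
        bb.put(frame);                   // fill while mapped
        buf.unmap();                     // unmap BEFORE pushing
        src.pushBuffer(buf);             // ownership moves to the appsrc;
                                         // do not touch 'buf' after this line
    }
}
```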
 

I'd still like to know what the right way to handle the NEED_DATA size parameter. Also I've found examples where people have called pushBuffer from WITHIN the NEED_DATA and also examples where people did it outside of NEED_DATA. Is one of these approaches wrong? Personally I'd prefer not to write the code required to get data buffers into the NEED_DATA callback if I can just gate another threads pushing the buffer to the AppSrc but I'm not sure the way it's supposed to be done.


I would have thought using ENOUGH_DATA might be a better approach in that case?

I've not made any use of AppSrc, and it doesn't look like there have been any commits to it since I repackaged everything. Hopefully what's required is mapped correctly for 1.x. I'll need it myself shortly, so I will take more of a look soon.

Best wishes,

Neil
--
Neil C Smith
Artist & Technologist

Praxis LIVE - hybrid visual IDE for creative coding - www.praxislive.org

Jim Carroll

Mar 4, 2018, 10:23:54 AM
to gstreamer-java

Thanks. I kept looking at map/unmap as if it were reference-counting the underlying memory. I assumed unmap would have deleted the underlying memory, and therefore I would be passing an invalid buffer to pushBuffer; but then I didn't want a leak, so I unmapped it when I was done (after pushing it, as you would when thinking in terms of reference counting).

The way I'm currently gating the frame source is: NEED_DATA sets an AtomicBoolean and ENOUGH_DATA unsets it.
In the thread that's sourcing the frames, I simply push an entire frame (since not ignoring the size parameter causes problems) as long as the AtomicBoolean is in the NEED_DATA state; otherwise I spin until it is. This seems to be working.
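The gating pattern above can be modeled without any GStreamer dependency. Here is a minimal, hypothetical pure-Java stand-in (class and method names are mine) where `maybePush` plays the role of the frame-sourcing thread's pushBuffer call:

```java
import java.util.concurrent.atomic.AtomicBoolean;
import java.util.concurrent.atomic.AtomicInteger;

public class NeedDataGate {
    private final AtomicBoolean needsData = new AtomicBoolean(false);
    private final AtomicInteger pushed = new AtomicInteger();

    // wired to AppSrc.NEED_DATA in the real code
    public void onNeedData() { needsData.set(true); }

    // wired to AppSrc.ENOUGH_DATA in the real code
    public void onEnoughData() { needsData.set(false); }

    // called from the frame-sourcing thread; stands in for pushBuffer
    public boolean maybePush(byte[] frame) {
        if (!needsData.get()) {
            return false; // spin/back off until NEED_DATA fires again
        }
        pushed.incrementAndGet(); // real code: appsrc.pushBuffer(...)
        return true;
    }

    public int pushedCount() { return pushed.get(); }
}
```

The real frame-source thread would loop on `maybePush` and sleep or spin while it returns false.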

If anyone is interested I have a "Breakout" of video frames into java.


                                    ------------------------
                                    |        Breakout      |
                                    ------------------------
Source -> Element ... -> Element -> |  AppSink      AppSrc | -> Element ... -> Element -> Sink
                                    |      \          /    |
                                    ------------------------
                                           Java Callback
                                         to process frames

From there you can do what you want with the raw frame data in java.

Neil C Smith

Mar 4, 2018, 10:38:21 AM
to gstream...@googlegroups.com
On Sun, 4 Mar 2018 at 15:23 Jim Carroll <jimfc...@gmail.com> wrote:

Thanks. I kept looking at the map/unmap as if it was reference counting the underlying memory. I assumed unmap would have deleted the underlying memory and therefore I would be passing an invalid buffer to pushBuffer. but then I didn't want a leak so I unmapped it when I was done (after pushing it, as you would thinking in terms of reference counting).

Yes, we need to improve the documentation on our side about this too. I guess it's kind of ref'ing and unref'ing the memory info rather than the memory itself. But presumably having it still mapped forces a copy when pushed, hence the memory leak?


The way I'm doing the gating of the frame source currently is: NEED_DATA sets and AtomicBoolean and ENOUGH_DATA unsets it.

Ah, yes, I missed that you probably need NEED_DATA to counteract ENOUGH_DATA!
 
If anyone is interested I have a "Breakout" of video frames into java.

If you'd be up for doing a PR for the examples repository that shows a simple example of this, that would be great!

Best wishes,

Neil 

Bob F

Mar 8, 2018, 11:48:01 AM
to gstream...@googlegroups.com
My NEED_DATA callback
1)creates a Buffer
2) map()s it to a ByteBuffer
3) put()s an entire access unit into the ByteBuffer
4) unmap()s it
5) sets things like timestamps (which I had to add to the source code because the APIs haven't been mapped in the stock source)
6) pushBuffer() it into the AppSrc

All this happens inside the callback

I don't care what quantity is specified by the NEED_DATA callback. I always feed it a complete access unit. I tried feeding it 4K at a time as requested, and performance was terrible.
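Bob's six steps, sketched against the gst1-java-core API used elsewhere in this thread (the helper name `push` is mine; as he notes, per-buffer timestamp setters were missing from the stock bindings at the time, so that step is shown only as a comment):

```java
import java.nio.ByteBuffer;

import org.freedesktop.gstreamer.Buffer;
import org.freedesktop.gstreamer.ClockTime;
import org.freedesktop.gstreamer.elements.AppSrc;

public class AccessUnitPusher {
    // One complete access unit per push, regardless of the size hint,
    // following the six steps above.
    static void push(AppSrc src, byte[] accessUnit, int fps) {
        Buffer buf = new Buffer(accessUnit.length);            // 1) create
        ByteBuffer bb = buf.map(true);                         // 2) map
        bb.put(accessUnit);                                    // 3) put the whole unit
        buf.unmap();                                           // 4) unmap
        buf.setDuration(ClockTime.fromMicros(1000000L / fps)); // 5) timing info
        // (timestamp setters added locally by Bob would go here too)
        src.pushBuffer(buf);                                   // 6) push; the buffer
                                                               // is no longer ours
    }
}
```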

I can't say this workflow is perfect, because I have a 2-pad pipeline that the gst-rtsp-server framework totally fails to clock correctly, and the result is an insane audio pull that somehow results in looping timestamps in RTP. But that might just be a bug in the RTSP framework. And the 1-pad pipeline works fine (I left it running overnight and it was still playing when I got to work the next morning).

Neil C Smith

Mar 8, 2018, 11:58:13 AM
to gstream...@googlegroups.com
On Thu, 8 Mar 2018 at 16:48 Bob F <thot...@gmail.com> wrote:
5) sets things like time stamps (which I had to add to the source code because the APIs haven't been mapped in stock source code)

Please consider a PR for this, thanks!
 
All this happens inside the callback

I meant more that if you were pushing until you received an ENOUGH_DATA, then all the NEED_DATA callback *needs* to do is unset your AtomicBoolean flag. Nothing wrong with doing more in that callback, obviously. I'm just surprised you need to worry about ENOUGH_DATA at all if you're only pushing when you get the NEED_DATA callback?!

I may have misunderstood the upstream API on this one - not worked with it yet! :-)

Best wishes,

Neil

Bob F

Mar 8, 2018, 12:11:12 PM
to gstream...@googlegroups.com
On Thu, Mar 8, 2018 at 11:58 AM, Neil C Smith <ne...@neilcsmith.net> wrote:

I more meant that if you were pushing until you were receiving an ENOUGH_DATA then all the NEED_DATA callback *needs* to do is unset your AtomicBoolean flag.  Nothing wrong with doing more in that callback obviously.  Just surprised you need to worry about ENOUGH_DATA at all if you're only pushing when you get the NEED_DATA callback?!

Jim is the one watching ENOUGH_DATA.  My application has different needs than Jim's, so I only push buffers when I get NEED_DATA.  Pushing more than that would increase my latency which I am trying to squeeze down to the tiniest possible value.

Jim Carroll

Mar 8, 2018, 12:24:41 PM
to gstreamer-java
Hi Bob,

I now have the push mode switchable. See this class: https://github.com/jimfcarroll/utilities/blob/master/lib-gstreamer/src/main/java/com/jiminger/gstreamer/Breakout.java#L27 (note, it's currently a WIP)

I can manage the output AppSrc with NEED_DATA/ENOUGH_DATA here:

needsData is an AtomicBoolean that acts as a switch gating the pushBuffer call here:

Or it can be set to "BLOCKING" mode, where emit-signals is false on the AppSrc, so there are no NEED_DATA/ENOUGH_DATA callbacks and output.pushBuffer blocks. This is now my default, but I have both working.

Neil C Smith

Mar 8, 2018, 12:48:01 PM
to gstream...@googlegroups.com
On Thu, 8 Mar 2018 at 17:11 Bob F <thot...@gmail.com> wrote:
Jim is the one watching ENOUGH_DATA. 

Oops, sorry, not paying enough attention! :-)

Best wishes,

Neil

Manoj Nirala

Jun 16, 2018, 12:36:31 AM
to gstreamer-java
Hi Jim,
           I am also struggling with an issue similar to the one you described above. The following code segment generates the error given further below; I am calling it from my implementation of NEED_DATA. Putting buf.unmap() after pushBuffer(buf) streams the video despite the exception below, but when I put buf.unmap() before pushBuffer(buf), the exception goes away and the streaming stops. Could you please point me to a proper solution?
////////////////////////////////////////////////////
Buffer buf = new Buffer(imgBytes.length);
ByteBuffer byteBuffer = ByteBuffer.wrap(imgBytes);
buf.map(true).put(byteBuffer);
buf.setDuration(ClockTime.fromMicros(1000000 / fps));
appSrc.pushBuffer(buf);
buf.unmap();
///////////////////////////////////////////////////
JNA: Callback org.freedesktop.gstreamer.elements.AppSrc$3@3abbfa04 threw the following exception:
java.lang.IllegalStateException: Native object has been disposed
at org.freedesktop.gstreamer.lowlevel.NativeObject.handle(NativeObject.java:137)
at org.freedesktop.gstreamer.lowlevel.GTypeMapper$3.toNative(GTypeMapper.java:87)
at com.sun.jna.Function.convertArgument(Function.java:514)
at com.sun.jna.Function.invoke(Function.java:338)
at com.sun.jna.Library$Handler.invoke(Library.java:244)
at com.sun.proxy.$Proxy21.gst_buffer_unmap(Unknown Source)
at org.freedesktop.gstreamer.Buffer.unmap(Buffer.java:158)

Thanks,
Manoj

Jim Carroll

Jun 16, 2018, 8:41:44 AM
to gstreamer-java

Hi Manoj,

I think the problem is that calling pushBuffer needs to be considered a "move" operation. Once appSrc.pushBuffer is called, 'buf' is no longer valid to do anything with. IOW, don't call unmap at all. Once you've passed 'buf' to pushBuffer, it's not yours anymore.

Let me know how that goes.

FWIW, I found the whole AppSrc/AppSink approach to be problematic and wrote my own plugin called 'Breakout' which takes a callback and gets passed frames. You can find an example here: https://github.com/jimfcarroll/utilities/blob/master/lib-gstreamer/src/test/java/com/jiminger/gstreamer/TestBreakoutPassthrough.java The caveat with looking at this is that I also added a 'Builder' pattern and a bunch of 'guard' classes that manage gstreamer resources, so if you have any questions, feel free to ask.

Jim

Neil C Smith

Jun 16, 2018, 10:23:52 AM
to gstream...@googlegroups.com


On Sat, 16 Jun 2018, 13:41 Jim Carroll, <jimfc...@gmail.com> wrote:
FWIW I found the whole AppSrc/AppSink approach to be problematic and wrote my own plugin called 'Breakout' which takes a callback and gets passed frames.

Out of interest did you look at using an Identity element for this? 

Jim Carroll

Jun 16, 2018, 11:08:28 AM
to gstreamer-java
Doh!

Manoj Nirala

Jun 16, 2018, 10:09:14 PM
to gstream...@googlegroups.com
Hi Jim,

Thanks a lot for your response. It turns out that not using buf.unmap() stops the streaming of the video as well. Of course, the error goes away. On the other hand, using buf.unmap() allows the streaming to happen. Please see the full code below to better understand the problem.

Best Regards,
Manoj

/*
 * Used jars: jna-4.4.0.jar; opencv-330.jar; gst1-java-core-0.9.3.jar
 * Gstreamer version - 1.12.4
 * 
 */

import java.nio.ByteBuffer;
import java.util.LinkedList;

import org.freedesktop.gstreamer.Buffer;
import org.freedesktop.gstreamer.Bus;
import org.freedesktop.gstreamer.Caps;
import org.freedesktop.gstreamer.ClockTime;
import org.freedesktop.gstreamer.Element;
import org.freedesktop.gstreamer.ElementFactory;
import org.freedesktop.gstreamer.Format;
import org.freedesktop.gstreamer.Gst;
import org.freedesktop.gstreamer.Message;
import org.freedesktop.gstreamer.Pipeline;
import org.freedesktop.gstreamer.elements.AppSrc;
import org.opencv.core.Core;
import org.opencv.core.Mat;
import org.opencv.imgproc.Imgproc;
import org.opencv.videoio.VideoCapture;
import org.opencv.videoio.Videoio;

import javafx.application.Application;
import javafx.stage.Stage;

public class AppSrcTest3 extends Application {
    static {
        System.loadLibrary(Core.NATIVE_LIBRARY_NAME);
    }

    private Element convert;
    private Element videoRate;
    private Element sink;
    private Element videoFilter;
    private Bus bus;
    private Pipeline pipe;
    ////////// Add AppSrc to replace autoSrc of autovideosrc type
    private AppSrc appSrc;
    private Element queue;
    private Mat matImage;
    private VideoCapture capture;
    private LinkedList<byte[]> preQueue;
    private int sourceWidth;
    private int sourceHeight;
    private int fps = 30;
    private boolean isInited = false;

    private Caps videoCaps;
    private Caps videoCaps2;
    private int SRC_QUEUE_SIZE = 30;

    public AppSrcTest3() {
        initIfNeeded();
    }

    public void initIfNeeded() {
        if (isInited) {
            return;
        }
        // we are using openCV VideoCapture to provide input frames for appSrc.
        // new VideoCapture(0) launches the default webcam in this case.
        capture = new VideoCapture(0);
        capture.set(Videoio.CAP_PROP_FRAME_WIDTH, 640);
        capture.set(Videoio.CAP_PROP_FRAME_HEIGHT, 480);
        capture.set(Videoio.CAP_PROP_FRAME_COUNT, fps);
        capture.set(Videoio.CAP_PROP_FORMAT, Videoio.CAP_MODE_BGR);
        matImage = new Mat();
        ///////////////////////////////
        preQueue = new LinkedList<byte[]>();
        sourceWidth = 640;
        sourceHeight = 480;
        videoCaps = Caps.fromString("video/x-raw,format=BGRA,width=" + 640 + ", height=" + 480);
        appSrc = (AppSrc) ElementFactory.make("appsrc", "appSrc");
        appSrc.setLive(true);
        appSrc.setFormat(Format.BUFFERS);
        // appSrc.setLatency(-1, 0);
        appSrc.setSize(-1);
        appSrc.setMaxBytes(SRC_QUEUE_SIZE * sourceWidth * sourceHeight * 4);
        appSrc.setCaps(videoCaps);

        /////////////////////////////
        queue = ElementFactory.make("queue", "queue");
        videoFilter = ElementFactory.make("capsfilter", "filter");
        videoRate = ElementFactory.make("videorate", "videoRate");
        convert = ElementFactory.make("autovideoconvert", "convert");
        sink = ElementFactory.make("autovideosink", "sink");
        videoCaps2 = Caps.fromString("video/x-raw,format=BGRA,width=" + 640 + ", height=" + 480);
        videoFilter.setCaps(videoCaps2);
        videoRate.setCaps(videoCaps);
        convert.setCaps(videoCaps);

        pipe = new Pipeline();
        pipe.addMany(appSrc, queue, convert, sink);
        Pipeline.linkMany(appSrc, queue, convert, sink);
        /////////////////////////////
        bus = pipe.getBus();
        bus.connect(new Bus.MESSAGE() {
            @Override
            public void busMessage(Bus arg0, Message arg1) {
                System.out.println(arg1.getStructure());
            }
        });
        appSrc.connect(new AppSrc.NEED_DATA() {
            @Override
            public void needData(AppSrc appSrc1, int size) {
                while (true) {
                    input(getImageBytes());
                }
            }
        });
        isInited = true;
    }

    public void input(byte[] imageBytes) {
        preQueue.add(imageBytes);
        byte[] imgBytes = preQueue.removeFirst();
        Buffer buf = new Buffer(imgBytes.length);
        ByteBuffer byteBuffer = ByteBuffer.wrap(imgBytes);
        buf.map(true).put(byteBuffer);
        buf.setDuration(ClockTime.fromMicros(1000000 / fps));
        appSrc.pushBuffer(buf);
        buf.unmap();
        /////////////////////
    }

    public byte[] getImageBytes() {
        byte[] imageBytes = null;
        if (capture.isOpened()) {
            capture.read(matImage);
            Imgproc.cvtColor(matImage, matImage, Imgproc.COLOR_BGR2BGRA);
            if (!matImage.empty()) {
                System.out.println("matImage.channels(): " + matImage.channels());
                imageBytes = new byte[matImage.channels() * matImage.cols() * matImage.rows()];
                matImage.get(0, 0, imageBytes);
                System.out.println(imageBytes.length);
            }
        }
        return imageBytes;
    }

    @Override
    public void start(Stage primaryStage) throws Exception {

    }

    public static void main(String[] args) {
        Gst.init("appsrc", args);
        new AppSrcTest3();
    }
}


--
You received this message because you are subscribed to the Google Groups "gstreamer-java" group.
To unsubscribe from this group and stop receiving emails from it, send an email to gstreamer-java+unsubscribe@googlegroups.com.
To post to this group, send email to gstreamer-java@googlegroups.com.
Visit this group at https://groups.google.com/group/gstreamer-java.
For more options, visit https://groups.google.com/d/optout.

Jim Carroll

Jun 17, 2018, 11:41:16 AM
to gstreamer-java

First, I'm almost positive you can't unmap the Buffer without causing problems.

Second, it looks like you're using the AppSrc in "pull mode" and expecting gstreamer to repeatedly call NEED_DATA. I suspect this is your problem. Here is a link to my implementation (though it's tough to follow).


It optionally works in "pull mode" (I call it "managed") or "push mode" (I call it "blocking"). Notice that in "pull mode" I don't do any actual pushing of data from within the NEED_DATA callback. I just set a flag that tells another area of the code to start pushing:


 ... which it does repeatedly until an ENOUGH_DATA signal is sent, at which point I unset the flag:


... and stop the pushing until another NEED_DATA signal is hit. 

Here is the place I push the buffer in "managed" mode:


The method this line is in is called when I have data to send. In this case it's being "sourced" from an AppSink.

One more note according to the docs if you really want NEED_DATA to gate when you push:

"The pull mode, in which the need-data signal triggers the next push-buffer call. [...] In this mode, a buffer of exactly the amount of bytes given by the need-data signal should be pushed into appsrc."

It doesn't appear you're even looking at the size parameter of the callback.
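The docs' pull-mode contract can be modeled without GStreamer at all. Below is a hypothetical, pure-Java stand-in (class and method names are mine, not from the bindings) that answers each need-data(size) with exactly `size` bytes, spanning frame boundaries as needed; the real code would wrap the returned bytes in a Buffer and call pushBuffer:

```java
import java.io.ByteArrayOutputStream;
import java.util.ArrayDeque;

public class PullModeChunker {
    // Queued frames waiting to be sliced into exactly-sized chunks.
    private final ArrayDeque<byte[]> pending = new ArrayDeque<>();
    private byte[] current = new byte[0];
    private int offset = 0;

    public void enqueueFrame(byte[] frame) { pending.add(frame); }

    // Returns exactly 'size' bytes, or fewer only when data runs out.
    public byte[] onNeedData(int size) {
        ByteArrayOutputStream out = new ByteArrayOutputStream(size);
        while (out.size() < size) {
            if (offset == current.length) {      // current frame exhausted
                if (pending.isEmpty()) break;    // no more data available
                current = pending.poll();
                offset = 0;
            }
            int n = Math.min(size - out.size(), current.length - offset);
            out.write(current, offset, n);
            offset += n;
        }
        return out.toByteArray();
    }
}
```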


Manoj Nirala

Jun 21, 2018, 11:56:36 PM
to gstream...@googlegroups.com
Hi Jim,

Thanks for your code (it was really not too straightforward, as you pointed out). Based on your advice, I took the pushBuffer part out of the NEED_DATA callback, and the exception went away. The display is now happening correctly but with the following error (repeated multiple times):

(javaw.exe:6584): GStreamer-CRITICAL **: gst_segment_to_stream_time: assertion 'segment->format == format' failed

It appears that I still need to put the input() function in some kind of callback. Not only am I getting the above error, but I am also not able to write a video file using filesink. The video file gets created, but nothing gets written.

Also, your code uses flags in NEED_DATA and ENOUGH_DATA to push buffers only in Managed mode and not in Blocking mode, hence we did not incorporate that. There are a few unsolved pieces still remaining in this puzzle:

1) I see that gst1-java-core has a listener named PushBuffer(). What is it used for? There is no documentation available.

2) How can we set up the appsrc to write a file? The pipeline I implemented was appsrc ! queue2 ! videorate ! videoconvert ! jpegenc ! avimux ! filesink.

3) I found some implementations in C that use MainLoop(). Is that a viable approach? How do we then tell the MainLoop object in Java to associate itself with the input function?

Attached below is our updated code that throws the above error.


Thanks and Regards,
Manoj 

//////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////


/*
 * Used jars: jna-4.4.0.jar; opencv-330.jar; gst1-java-core-0.9.3.jar, Gstreamer version - 1.12.4
 * 
 */


public class WorkingAppSrc extends Application {
    static {
        System.loadLibrary(Core.NATIVE_LIBRARY_NAME);
    }

    private Element convert;
    private Element videoRate;
    private Element sink;
    private Element videoFilter;
    private Bus bus;
    private Pipeline pipe;
    ////////// Add AppSrc to replace autoSrc of autovideosrc type
    private AppSrc appSrc;
    private Element queue;
    private Mat matImage;
    private VideoCapture capture;
    private LinkedList<byte[]> preQueue;
    private int sourceWidth;
    private int sourceHeight;
    private int fps = 30;
    private boolean isInited = false;
    private boolean quit = false;

    private Caps videoCaps;
    private Caps videoCaps2;
    private int SRC_QUEUE_SIZE = 30;

    private static final AppAPI gst() { return AppAPI.APP_API; }

    public WorkingAppSrc() {
        initIfNeeded();
    }

    public void initIfNeeded() {
        if (isInited) {
            return;
        }
        Gst.init();
        ///////////////////////////////
        preQueue = new LinkedList<byte[]>();
        sourceWidth = 640;
        sourceHeight = 480;
        videoCaps = Caps.fromString("video/x-raw,format=BGR,width=" + 640 + ", height=" + 480);
        appSrc = (AppSrc) ElementFactory.make("appsrc", "appSrc");
        appSrc.setLive(true);
        appSrc.setFormat(Format.BUFFERS);
        appSrc.setLatency(-1, 0);
        appSrc.setSize(-1);
        appSrc.setMaxBytes(SRC_QUEUE_SIZE * sourceWidth * sourceHeight * 3);
        appSrc.setCaps(videoCaps);
        appSrc.setStreamType(AppSrc.Type.STREAM);
        /////////////////////////////
        queue = ElementFactory.make("queue", "queue");
        videoFilter = ElementFactory.make("capsfilter", "filter");
        videoRate = ElementFactory.make("videorate", "videoRate");
        convert = ElementFactory.make("autovideoconvert", "convert");
        sink = ElementFactory.make("glimagesink", "sink");
        videoCaps2 = Caps.fromString("video/x-raw,format=BGR,width=" + 640 + ", height=" + 480);
        videoFilter.setCaps(videoCaps2);
        pipe = new Pipeline();
        pipe.addMany(appSrc, queue, convert, sink);
        Pipeline.linkMany(appSrc, queue, convert, sink);
        /////////////////////////////
        bus = pipe.getBus();
        bus.connect(new Bus.MESSAGE() {
            @Override
            public void busMessage(Bus arg0, Message arg1) {
                System.out.println("Bus Message: " + arg1.getStructure());
            }
        });
        bus.connect(new Bus.EOS() {
            @Override
            public void endOfStream(GstObject source) {
                System.out.println("Reached end of stream");
                quit = true;
            }
        });

        bus.connect(new Bus.ERROR() {
            @Override
            public void errorMessage(GstObject source, int code, String message) {
                System.out.println("Error detected");
                System.out.println("Error source: " + source.getName());
                System.out.println("Error code: " + code);
                System.out.println("Error Message: " + message);
                quit = true;
            }
        });
        // we are using openCV VideoCapture to provide input frames for appSrc.
        // new VideoCapture(0) launches the default webcam in this case.
        capture = new VideoCapture(0);
        capture.set(Videoio.CAP_PROP_FRAME_WIDTH, 640);
        capture.set(Videoio.CAP_PROP_FRAME_HEIGHT, 480);
        capture.set(Videoio.CAP_PROP_FRAME_COUNT, fps);
        capture.set(Videoio.CAP_PROP_FORMAT, Videoio.CAP_MODE_BGR);
        matImage = new Mat();
        appSrc.connect(new AppSrc.NEED_DATA() {
            @Override
            public void needData(AppSrc appSrc1, int size) {
            }
        });
        while (!quit) {
            input(getImageBytes());
        }
        pipe.stop();
        Gst.deinit();
        Gst.quit();
    }

    public void input(byte[] imageBytes) {
        preQueue.add(imageBytes);
        byte[] imgBytes = preQueue.removeFirst();
        Buffer buf = new Buffer(imgBytes.length);
        ByteBuffer byteBuffer = ByteBuffer.wrap(imgBytes);
        buf.map(true).put(byteBuffer);
        // buf.setDuration(ClockTime.fromMicros(1000000 / fps));
        appSrc.pushBuffer(buf);
        // buf.unmap();
        /////////////////////
    }

    public byte[] getImageBytes() {
        byte[] imageBytes = null;
        if (capture.isOpened()) {
            capture.read(matImage);
            // Imgproc.cvtColor(matImage, matImage, Imgproc.COLOR_BGR2BGRA);
            if (!matImage.empty()) {
                System.out.println("matImage.channels(): " + matImage.channels());
                imageBytes = new byte[matImage.channels() * matImage.cols() * matImage.rows()];
                matImage.get(0, 0, imageBytes);
                System.out.println(imageBytes.length);
            }
        }
        return imageBytes;
    }

    @Override
    public void start(Stage primaryStage) throws Exception {

    }

    public static void main(String[] args) {
        new WorkingAppSrc();
    }
}


Manoj Nirala

Jun 22, 2018, 12:17:04 AM
to gstream...@googlegroups.com
Error correction:

**** 1) I see that gst1-java-core has a listener named PushBuffer(). What is it used for? There is no documentation available.
Please read PUSH_BUFFER in place of PushBuffer().

Thanks & Regards,
Manoj