Gstreamer-java with opencv


Noux Ha

May 11, 2023, 5:25:48 PM5/11/23
to gstreamer-java
Hello,
I am handling the AppSink.NEW_SAMPLE signal to get images from an RTSP stream using pullSample(), and I am using sample.getBuffer() to build a BufferedImage.
I have a model that uses OpenCV to process and analyze the images, so I convert the BufferedImage to an opencv.core.Mat. What I would like is to get a Mat directly from the appsink, to avoid the memory problems caused by creating a BufferedImage for every frame.
How should I do that?

Benjamin Telford

May 11, 2023, 5:38:14 PM5/11/23
to gstream...@googlegroups.com
Hello, Noux Ha

You can avoid creating a BufferedImage by converting the Gst.Buffer data to a numpy array and using that directly as a cv2 image in your OpenCV processing. Here is a code snippet that can help you achieve this:

import numpy as np
import cv2
from gi.repository import Gst

def on_new_sample(sink):
    sample = sink.emit("pull-sample")
    buf = sample.get_buffer()
    if buf:
        # Width and height come from the sample caps, not from the buffer itself
        structure = sample.get_caps().get_structure(0)
        rows = structure.get_value("height")
        cols = structure.get_value("width")
        # Copy the raw bytes out of the buffer and view them as an image array
        data = buf.extract_dup(0, buf.get_size())
        np_array = np.ndarray(
            (rows, cols, 3),          # assumes 3-channel RGB caps on the appsink
            buffer=data,
            dtype=np.uint8
        )
        img = cv2.cvtColor(np_array, cv2.COLOR_RGB2BGR)
        # Process img with OpenCV here
    return Gst.FlowReturn.OK

In this snippet, we read the width and height from the sample caps, extract the raw data from the Gst.Buffer with extract_dup(), and wrap it in a numpy array. In the Python bindings that numpy array is the OpenCV image (there is no separate Mat class), so it can be passed straight into your OpenCV processing.
I hope this helps!

Benjamin Telford


Noux Ha

May 11, 2023, 6:12:12 PM5/11/23
to gstreamer-java
Hello Benjamin,
Thanks for replying...
I am using the Java wrappers for GStreamer, not Python code :(

Neil C Smith

May 11, 2023, 6:22:55 PM5/11/23
to gstream...@googlegroups.com


On Thu, 11 May 2023, 23:12 Noux Ha, <nou...@gmail.com> wrote:
Hello Benjamin,
Thanks for replying...
I am using the Java wrappers for GStreamer, not Python code :(

Well, as this is the email list for the Java bindings, one would hope that was obvious! :-)

What does the opencv code require?

You might take a look at how the JavaFX sink works. This passes the native buffer pointer (via direct bytebuffer) straight to JavaFX. You need to make sure you don't release the buffer and sample back to GStreamer while the memory might be accessed elsewhere.
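A minimal sketch of that pattern with gst1-java-core (processFrame() here is a hypothetical consumer; the point is that the mapped ByteBuffer is only valid until unmap/dispose):

appSink.connect((AppSink.NEW_SAMPLE) elem -> {
    Sample sample = elem.pullSample();
    Buffer buffer = sample.getBuffer();
    java.nio.ByteBuffer bb = buffer.map(false); // direct view over the native GStreamer memory
    try {
        processFrame(bb);        // must complete before the buffer is released
    } finally {
        buffer.unmap();          // only hand the memory back once nothing else reads it
        sample.dispose();
    }
    return FlowReturn.OK;
});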


Best wishes,

Neil

Noux Ha

May 11, 2023, 7:00:16 PM5/11/23
to gstreamer-java
Hello Neil,
Here is my code:

AppSink appSink = (AppSink) pipeline.getElementByName("sink");
appSink.set("emit-signals", true);

StringBuilder capsBuilder = new StringBuilder("video/x-raw,pixel-aspect-ratio=1/1,");
if (ByteOrder.nativeOrder() == ByteOrder.LITTLE_ENDIAN)
    capsBuilder.append("format=BGRx");
else
    capsBuilder.append("format=xRGB");
appSink.setCaps(new Caps(capsBuilder.toString()));

appSink.connect((AppSink.NEW_SAMPLE) s -> {
    Sample sample = s.pullSample();
    Caps caps = sample.getCaps();
    Structure struct = caps.getStructure(0);
    int width = struct.getInteger("width");
    int height = struct.getInteger("height");
    Buffer buffer = sample.getBuffer();
    java.nio.ByteBuffer byteBuffer = buffer.map(false);
    BufferedImage image = new BufferedImage(width, height, BufferedImage.TYPE_INT_RGB);
    int[] pixels = ((DataBufferInt) image.getRaster().getDataBuffer()).getData();
    byteBuffer.asIntBuffer().get(pixels, 0, width * height);
    Mat frame = bufferedImageToMat(image);

    // process frame

    frame.release();
    buffer.unmap();
    sample.dispose();

    return FlowReturn.OK;
});

public Mat bufferedImageToMat(BufferedImage image) {
    BufferedImage imgout = new BufferedImage(image.getWidth(), image.getHeight(), BufferedImage.TYPE_3BYTE_BGR);
    imgout.getGraphics().drawImage(image, 0, 0, null);
    Mat mat = new Mat(imgout.getHeight(), imgout.getWidth(), CvType.CV_8UC3);
    DataBuffer dataBuffer = imgout.getRaster().getDataBuffer();
    byte[] data = ((DataBufferByte) dataBuffer).getData();
    mat.put(0, 0, data);
    return mat;
}



My goal is to get Mat directly without using BufferedImage.
Thanks

Neil C Smith

May 12, 2023, 3:57:57 AM5/12/23
to gstream...@googlegroups.com
On Fri, 12 May 2023 at 00:00, Noux Ha <nou...@gmail.com> wrote:
> Here is my code:
...

Without going into depth: firstly, look to get the GStreamer caps to
give you the data in the format you need - e.g. RGB rather than BGRx.
Be careful with endianness - you're swapping the byte order, but I'm
not sure why.

https://gstreamer.freedesktop.org/documentation/additional/design/mediatype-video-raw.html?gi-language=c
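For example (a sketch, assuming packed BGR is what the OpenCV side wants, so the bytes already line up with CV_8UC3 and no channel swap is needed):

appSink.setCaps(Caps.fromString("video/x-raw,format=BGR,pixel-aspect-ratio=1/1"));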

Secondly, look for a constructor, or other code, in the opencv
bindings that can accept a ByteBuffer directly. Or failing that a
native address, which you can extract from the ByteBuffer using JNA's
Native class.
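
A hedged sketch of both options - recent OpenCV 4.x Java builds have a
Mat(int rows, int cols, int type, ByteBuffer data) constructor that wraps
the memory without copying (check your version), and JNA can give you the
raw address of a direct ByteBuffer. Variable names follow the earlier code:

// Wraps the mapped GStreamer memory directly; no copy, so the Mat is only
// valid until buffer.unmap() / sample.dispose() is called.
Mat frame = new Mat(height, width, CvType.CV_8UC3, byteBuffer);

// Fallback if that constructor isn't available: extract the native address
// with JNA and pass it to whatever native-side code accepts a pointer.
long address = com.sun.jna.Pointer.nativeValue(
        com.sun.jna.Native.getDirectBufferPointer(byteBuffer));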

If this is for a commercial project, you can contact me off list for
paid support. Unfortunately I don't have the free time to delve
further into this at the moment.

Best wishes,

Neil


--
Neil C Smith
Codelerity Ltd.
www.codelerity.com

Codelerity Ltd. is a company registered in England and Wales
Registered company number : 12063669
Registered office address : Office 4 219 Kensington High Street,
Kensington, London, England, W8 6BD