Creating movie file...

acolubri

Feb 17, 2008, 2:47:30 PM
to gstreamer-java
Ok, a new question...

Now I want to reimplement the MovieMaker class found in the video
library of Processing (http://processing.org/reference/libraries/video/MovieMaker.html).

So, what I need to do with gstreamer is to create a pipeline that has
a file writer sink, right? But the source of the frames should be what
the user grabs from the screen. In other words, the API of the
MovieMaker class includes a function addFrame(), which should send a
new frame, obtained from the screen pixels, down the pipeline.

How can I define a gstreamer source that creates frames from image
buffers I send manually to it?
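
Just to make the goal concrete, the class I have in mind would look
roughly like this (a hypothetical skeleton; addFrame() and finish()
come from the Processing MovieMaker API, the other names are my own):

public class GSMovieMaker {
    public GSMovieMaker(int width, int height, String filename) {
        // build a gstreamer pipeline ending in a filesink here
    }

    public void addFrame(int[] pixels) {
        // push one frame of screen pixels (ARGB ints) into the pipeline
    }

    public void finish() {
        // stop the pipeline and close the output file
    }
}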

Andres

Wayne Meissner

Feb 19, 2008, 8:51:23 PM
to gstream...@googlegroups.com
Have a look at using a "fakesrc" or an "identity" element. There's an
example in FakeSrcTest.java, which just fills the buffer with a solid
colour before it's sent down the pipeline.

Alternatively, have a look at the CustomSrc class - it allows you to
create real Element subclasses that can be plugged into a pipeline.
But it's highly experimental, and might not work on Windows (there was
some difference in the structure sizes).

You probably want to use a BlockingQueue or something to shuffle data
from addFrame() to the fakesrc element. It should be safe to have the
handoff listener block, waiting until the queue has a frame in it.
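
Something along these lines, roughly (untested sketch - the queue and
variable names are just placeholders):

import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

// addFrame() puts frames on the queue; the fakesrc handoff callback
// blocks until one is available.
final BlockingQueue<byte[]> frames = new LinkedBlockingQueue<byte[]>();

fakesrc.connect(new Element.HANDOFF() {
    public void handoff(Element element, Buffer buffer, Pad pad) {
        try {
            byte[] frame = frames.take(); // blocks until addFrame() supplies one
            buffer.getByteBuffer().put(frame, 0, frame.length);
        } catch (InterruptedException ex) {
            Thread.currentThread().interrupt();
        }
    }
});

// and addFrame() on the other side just does: frames.put(frameBytes);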

acolubri

Feb 20, 2008, 3:53:56 PM
to gstreamer-java
Great, thanks for the tips. I'll start working on this very soon.

acolubri

Feb 22, 2008, 1:19:30 AM
to gstreamer-java
Ok, I've been using the FakeSrcTest example (I also found this article
in the GStreamer documentation:
http://gstreamer.freedesktop.org/data/doc/gstreamer/head/manual/html/section-data-spoof.html).

The code I have is the following:

final int w = 320, h = 240;
gpipe = new Pipeline("GSMovieMaker");

fakesrc = ElementFactory.make("fakesrc", "source");
Element flt = ElementFactory.make("capsfilter", "flt");
Element conv = ElementFactory.make("ffmpegcolorspace", "conv");
Caps fltcaps = new Caps("video/x-raw-rgb, width=" + w + ", height=" + h
        + ", bpp=16, depth=16, framerate=25/1");
flt.setCaps(fltcaps);

... (declaration of videosink here, I don't think it matters, just an
RGBDataSink)

fakesrc.link(flt, conv, videosink);
gpipe.addMany(fakesrc, flt, conv, videosink);

fakesrc.set("signal-handoffs", true);
fakesrc.set("sizemax", w * h * 2);
fakesrc.set("sizetype", 2);
fakesrc.connect(new Element.HANDOFF() {
    byte color = 0;
    byte[] data = new byte[w * h * 2];

    public void handoff(Element element, Buffer buffer, Pad pad) {
        printMsg(color);
        Arrays.fill(data, color++);
        buffer.getByteBuffer().put(data, 0, data.length);
    }
});

gpipe.setState(State.PLAYING);


The method printMsg(color) just prints the value of color.

From what I understand, the handoff(Element element, Buffer buffer,
Pad pad) method should be called 25 times per second since
"framerate=25/1" is set up in the Caps. Am I right?

However, it is called just once (during initialization). Any comments
on this?

Thanks again,
Andres

Wayne Meissner

Feb 23, 2008, 3:16:36 AM
to gstream...@googlegroups.com
Hmm. I don't know. I know fakesrc has been temperamental for me at
times, but your code looks the same as the FakeSrcTest code, so I
don't understand why it doesn't work.

Make sure you're keeping a reference to gpipe around, so it's not
getting disposed. Sounds dumb, but I've forgotten this myself.

There's also a lot of cruft in FakeSrcTest - it was a pretty
straightforward conversion from that example, left over from when I
knew less about gstreamer.

The 'conv' element is not needed - RGBDataSink includes its own colour
converter.

Do the gpipe.addMany() before linking the elements together. That's
the proper order, and I think FakeSrcTest had it the wrong way around.

Have a look at the updated FakeSrcTest code (in mercurial) - I now do
rate limiting with a videorate element. Besides putting it and a
capsfilter for it in the pipeline before the video sink, you need to
add:

fakesrc.set("sync", true);
fakesrc.set("is-live", true);
fakesrc.set("filltype", 1); // Don't fill the buffer
before handoff

(the last line is there for efficiency)

The videorate stuff is:
final Element videorate = ElementFactory.make("videorate", "videorate");
final Element ratefilter = ElementFactory.make("capsfilter", "RateFilter");
// limit frame rate to 2 frames per second
ratefilter.setCaps(Caps.fromString("video/x-raw-rgb, framerate=2/1"));

And you link the whole thing together with:

pipeline.addMany(fakesrc, srcfilter, videorate, ratefilter, videosink);
Element.linkMany(fakesrc, srcfilter, videorate, ratefilter, videosink);
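
So, putting the pieces together, the source end of the pipeline should
look roughly like this (an untested consolidation of the above, so
treat it as a starting point):

final Element fakesrc = ElementFactory.make("fakesrc", "source");
fakesrc.set("signal-handoffs", true);
fakesrc.set("sizemax", w * h * 2);
fakesrc.set("sizetype", 2);
fakesrc.set("sync", true);
fakesrc.set("is-live", true);
fakesrc.set("filltype", 1); // don't fill the buffer before handoff

final Element srcfilter = ElementFactory.make("capsfilter", "srcfilter");
srcfilter.setCaps(Caps.fromString("video/x-raw-rgb, width=" + w
        + ", height=" + h + ", bpp=16, depth=16, framerate=25/1"));

final Element videorate = ElementFactory.make("videorate", "videorate");
final Element ratefilter = ElementFactory.make("capsfilter", "RateFilter");
ratefilter.setCaps(Caps.fromString("video/x-raw-rgb, framerate=2/1"));

pipeline.addMany(fakesrc, srcfilter, videorate, ratefilter, videosink);
Element.linkMany(fakesrc, srcfilter, videorate, ratefilter, videosink);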

acolubri

Feb 23, 2008, 3:48:46 PM
to gstreamer-java
Ok, now it seems to be working. The rate set by
ratefilter.setCaps(Caps.fromString("video/x-raw-rgb, framerate=2/1"));
actually affects the frequency with which new frames are sent out by
the RGBDataSink element through the rgbFrame method.

However, it seems that the handoff function in the videosrc gets called
much more often - something like 600 times per second. Is this ok?
Even though there is this line:
Caps fltcaps = new Caps("video/x-raw-rgb, width=" + w + ", height=" + h + ", bpp=16, depth=16, framerate=25/1");
what is the effect of this framerate value?

Wayne Meissner

Feb 24, 2008, 1:02:01 AM
to gstream...@googlegroups.com
On 24/02/2008, acolubri <andres....@gmail.com> wrote:

> However, it seems that the handoff function in the videosrc gets called
> much more often - something like 600 times per second. Is this ok?

Make sure you're setting the "sync" property on the fakesrc to true.
I think you can also fiddle with properties on the videorate element
(not caps on it), to change how many buffers (video frames) it will
queue up internally.

> Even though there is this line:
>
> Caps fltcaps = new Caps("video/x-raw-rgb, width=" + w + ", height=" + h + ", bpp=16, depth=16, framerate=25/1");
>
> What is the effect of this framerate value?

I'm not sure. Other than that example program, I haven't used the
fakesrc element for anything, so you know as much as I do. I vaguely
remember that if you don't put it in, it doesn't work at all.

I kinda understand now why the gstreamer guys say you're on your own
if you use fakesrc or fakesink for injecting/extracting data - they
just don't seem to behave quite right.

You could try creating a src with CustomSrc - but as I said, that
stuff is still pretty experimental, and I was hoping not to solidify
it until post-1.0.

acolubri

Mar 1, 2008, 3:24:19 PM
to gstreamer-java
The fakesrc thing is working. I can direct the buffers I'm injecting
with the handoff method back to the screen with RGBDataSink, and I see
the blinking colors. So this is good.

The other part, which is writing to a file, is also working, using
videotestsrc as the video source and the following pipeline:

final Element fakesrc = ElementFactory.make("videotestsrc", "source");
final Element srcfilter = ElementFactory.make("capsfilter", "srcfilter");
Caps fltcaps = new Caps("video/x-raw-yuv, width=" + movieWidth
        + ", height=" + movieHeight + ", bpp=16, depth=16, framerate=25/1");
srcfilter.setCaps(fltcaps);
final Element videorate = ElementFactory.make("videorate", "videorate");
final Element ratefilter = ElementFactory.make("capsfilter", "RateFilter");
ratefilter.setCaps(Caps.fromString("video/x-raw-yuv, framerate=25/1"));

Element videoenc = ElementFactory.make("theoraenc", "theoraenc");
Element muxer = ElementFactory.make("oggmux", "oggmux");
Element videosink = ElementFactory.make("filesink", "sink");
videosink.set("location", filename);

gpipe.addMany(fakesrc, srcfilter, videorate, ratefilter, videoenc, muxer, videosink);
Element.linkMany(fakesrc, srcfilter, videorate, ratefilter, videoenc, muxer, videosink);

I'm using video/x-raw-yuv instead of video/x-raw-rgb because I found
out online that the Theora codec requires the stream to be in YUV
format.

So, buffer injection and output to file are both working on Linux and
Windows (unfortunately, I haven't had time so far to run any tests on
OS X). However, when I try to combine these two things, i.e. writing
the injected buffers to a video file, the resulting file is always
empty (again, on both Windows and Linux, so it doesn't seem to be a
platform-dependent problem). What I've done is to use:

final Element fakesrc = ElementFactory.make("fakesrc", "source");
fakesrc.set("signal-handoffs", true);
fakesrc.set("sizemax", movieWidth * movieHeight * 2);
fakesrc.set("sizetype", 2);
fakesrc.set("sync", true);
fakesrc.set("is-live", true);
fakesrc.set("filltype", 1); // Don't fill the buffer before handoff
fakesrc.connect(new Element.HANDOFF() {
    byte color = 0;
    byte[] data = new byte[movieWidth * movieHeight * 2];

    public void handoff(Element element, Buffer buffer, Pad pad) {
        Arrays.fill(data, color++);
        buffer.getByteBuffer().put(data, 0, data.length);
    }
});

as the video source. From the tests, it seems that for some reason the
handoff method gets called only once, but I have no idea why.
Interestingly enough, when I change video/x-raw-yuv back to
video/x-raw-rgb, handoff is called twice.
Any suggestions?

Wayne Meissner

Mar 1, 2008, 8:42:18 PM
to gstream...@googlegroups.com
I don't know what the problem is, but there are a couple of things to
try for debugging:

1) Have a look at the various --gst-debug= flags and see if they
produce some useful output.

2) Insert an 'identity' element just before the theoraenc, the oggmux
and the filesink, with a handoff callback that just prints some Buffer
details (offset, etc.) and which sink the buffer is about to hit (see
the sketch after this list).

3) Try inserting a 'ffmpegcolorspace' element before the theoraenc. I
_think_ that should convert between the output of the fakesrc and what
theoraenc wants automagically (no need to put a capsfilter after the
ffmpegcolorspace).

4) Try dropping the 'depth=16, bpp=16' off the fakesrc caps. This
would likely confuse the way you're injecting data though, so leave it
until last.
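
For point 2, the probe could look something like this (untested
sketch; the names are mine, and I'm just printing the buffer size
since that's the easiest thing to get at from the Java side):

final Element probe = ElementFactory.make("identity", "probe");
probe.set("signal-handoffs", true); // make sure the handoff signal fires
probe.connect(new Element.HANDOFF() {
    public void handoff(Element element, Buffer buffer, Pad pad) {
        // report each buffer that is about to head into theoraenc
        System.out.println("identity handoff: "
                + buffer.getByteBuffer().capacity() + " bytes");
    }
});
// then add it to the pipeline and link it in just before theoraenc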

acolubri

Mar 2, 2008, 3:07:44 PM
to gstreamer-java
Yes, ffmpegcolorspace did the trick. It is (almost) working now!

I also had to use a SynchronousQueue to do the buffer passing between
Processing and the fakesrc. Works fine too.

There is only one puzzling aspect. The pixels from Processing come in
an integer array where each int represents a color in ARGB format. So
I changed the maximum size of the fakesrc element to
fakesrc.set("sizemax", width * height * 4);
I do the conversion from the original int array into a byte array,
which is fed into the gstreamer buffer. I also changed the bpp and
depth in the source filter to 32 and 24, respectively.
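
Concretely, the conversion is roughly this (simplified from my actual
code; pixels is the ARGB int array coming from Processing, and
tmpBuffer is a java.nio.ByteBuffer of width * height * 4 bytes):

tmpBuffer.rewind();
// each int becomes 4 bytes in A, R, G, B order (ByteBuffer default is big-endian)
tmpBuffer.asIntBuffer().put(pixels);
byte[] data = new byte[width * height * 4];
tmpBuffer.rewind();
tmpBuffer.get(data);
// data is what gets written into the gstreamer buffer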

However, the pipeline refuses to work with bpp=32 and depth=24 (the
queue hangs when polling for the incoming buffer). I have to keep
using bpp=16 and depth=16. The video file gets created and plays fine,
but the colors and resolution are wrong: clearly the incoming byte
buffer is interpreted as having size width * height * 2, while in
reality it has width * height * 4 elements.

I don't see any "logic" bug right now, except that I have to use
bpp=16 and depth=16, which seems wrong to me. Do you have any
recommendations about this?


Wayne Meissner

Mar 2, 2008, 4:57:48 PM
to gstream...@googlegroups.com
That one has me beat. I tried changing to bpp=32, depth=24 in
FakeSrcTest and it didn't work at all either.

I wonder if dropping the bpp=xx, depth=xx bit from the srcfilter after
fakesrc and adding another ffmpegcolorspace + capsfilter with those
parameters after it would work?

i.e. add it after srcfilter, so the pipeline is:
fakesrc, srcfilter, bppcolorspace, bppfilter, ... rest of the pipeline as is

On bppfilter, just set the media type, width, height, bpp, and depth.
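
In code, that experiment would be something like (untested; the
element names are just what I'd call them):

final Element bppcolorspace = ElementFactory.make("ffmpegcolorspace", "bppcolorspace");
final Element bppfilter = ElementFactory.make("capsfilter", "bppfilter");
// only the media type, size, bpp and depth - no framerate here
bppfilter.setCaps(Caps.fromString("video/x-raw-rgb, width=" + w
        + ", height=" + h + ", bpp=32, depth=24"));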

I have no idea if that will make any difference though.

acolubri

Mar 2, 2008, 6:20:27 PM
to gstreamer-java
mmmh, I've just tried what you suggest, and it still doesn't work...

Well, perhaps it is a bug in fakesrc? I'm kind of clueless right now...

I think I'll post a question on the gstreamer-devel mailing list.

Wayne Meissner

Mar 2, 2008, 10:42:10 PM
to gstream...@googlegroups.com
I think the real problem is that neither of us understands the way
data flows in gstreamer - it is apparently not as simple as it appears
from the gstreamer fakesrc example.

Fakesrc itself doesn't care about bpp, depth, or any of those things -
those properties are just set on the buffer that is sent out.

From digging around in fakesrc, basesrc and videotestsrc, for video
frames, we might need to change the format from 'bytes' to 'time', so
the segment-start events get sent out correctly. I'll probably have
to expose some api to get that done. Why 16bit bpp/depth works
without this is a bit of a mystery.

acolubri

Mar 3, 2008, 12:02:14 PM
to gstreamer-java
> I think the real problem is that neither of us understands the way
> data flows in gstreamer

Me particularly :-) But I'm starting to get a superficial idea of
how gstreamer works by constructing all these pipelines.

> From digging around in fakesrc, basesrc and videotestsrc, for video
> frames, we might need to change the format from 'bytes' to 'time', so
> the segment-start events get sent out correctly. I'll probably have
> to expose some api to get that done.

Cool, keep me posted about this.

Another option would be to ditch fakesrc altogether and start using
this new GstAppSrc element, which is being mentioned as much better
suited for injecting buffers into the pipeline.
You also mentioned a CustomSrc class, which is part of gstreamer-java
already, right? Is it the same thing as GstAppSrc?

However, the code I wrote based on fakesrc is so close to working ok...
all that is needed is to fix these issues with the buffers.

Wayne Meissner

Mar 3, 2008, 6:47:26 PM
to gstream...@googlegroups.com
On 04/03/2008, acolubri <andres....@gmail.com> wrote:

> > From digging around in fakesrc, basesrc and videotestsrc, for video
> > frames, we might need to change the format from 'bytes' to 'time', so
> > the segment-start events get sent out correctly. I'll probably have
> > to expose some api to get that done.
>
>
> Cool, keep me posted about this.

The only bit I think that is needed is BaseSrc.setFormat().
I added a FakeSrc class that is returned when you get a "fakesrc"
element from a factory,
so you can do something like:

FakeSrc fakesrc = (FakeSrc) ElementFactory.make("fakesrc", "my_src");
fakesrc.setFormat(Format.TIME);

>
> Another option would be to ditch fakesrc altogether and start using
> this new element GstAppSrc that is being mentioned as much better
> suited for injecting buffers into the pipeline.
> You also mentioned a CustomSrc class which is part of gstreamer-java
> already, right? Is it the same thing as GstAppSrc?

CustomSrc isn't quite the same as GstAppSrc. GstAppSrc allows you to
shove buffers at it, which it then queues up and sends out on the
pipeline.

CustomSrc doesn't handle any of that - it's just a thin layer to enable
you to write gstreamer source elements in Java, where you want more
control than fakesrc gives you.

The downside of CustomSrc is that it's pretty experimental, and it has
problems on Windows (different struct sizes somewhere). I haven't
bothered to fix it, because it hasn't been a high priority ... until
now.

>
> However, the code I wrote based on fakesrc is so close to working ok...
> all that is needed is to fix these issues with the buffers.

It might be an idea to dig around in the gstvideotestsrc.c source, and
see if you can make sense of what it does.

Maybe a port of gstvideotestsrc to java using CustomSrc and/or FakeSrc
would be an idea. At least it would help to figure out how to write a
video source correctly.

acolubri

Mar 7, 2008, 12:20:59 PM
to gstreamer-java


> The only bit I think that is needed is BaseSrc.setFormat().
> I added a FakeSrc class that is returned when you get a "fakesrc"
> element from a factory,
> so you can do something like:
>
> FakeSrc fakesrc = (FakeSrc) ElementFactory.make("fakesrc", "my_src");
> fakesrc.setFormat(Format.TIME);

This functionality is available in the CVS version, right?

Wayne Meissner

Mar 7, 2008, 1:24:17 PM
to gstream...@googlegroups.com

Yeh, it's in hg. Make sure you do hg pull -u if you're doing it on the
command line.

The parameters we set on the Buffer seem to affect things. I just got
ReadableByteStreamSrc working by doing:

Buffer.setTimestamp(ClockTime.NONE);

The only way I found that was by reading the fdsrc source.

acolubri

Mar 9, 2008, 4:35:48 PM
to gstreamer-java
I tried a few things with fakesrc, but with no luck: setting the
format to Format.TIME, then playing with the parameters of Buffer. I
started with setting the timestamp to ClockTime.NONE. Then, after
looking at the source of videotestsrc, I did:

buffer.setTimestamp(running_time);
buffer.setOffset(n_frames);
n_frames++;
buffer.setLastOffset(n_frames);
Double nsecs = n_frames.doubleValue() / frameRate;
Long nanos = (long) (nsecs * NANOS_PER_SECOND); // scale to nanos before truncating
ClockTime next_time = ClockTime.fromNanos(nanos);
ClockTime diff = ClockTime.fromNanos(next_time.toNanos() - running_time.toNanos());
buffer.setDuration(diff);
running_time = next_time;

inside the handoff method. I think these should correspond to the
appropriate timing parameters...

I also tried different combinations for the caps in the pipeline, but
that didn't work.

It seems that irrespective of anything I do with the parameters, the
handoff method in fakesrc stops being called when I set bpp=32 and
depth=24 in the caps.

I'm giving up on fakesrc for now. I think I'll start with CustomSrc.
Or perhaps writing my own input source element :-)
Do you have any simple example of how to build a pipeline using
CustomSrc?


Wayne Meissner

Mar 9, 2008, 7:05:56 PM
to gstream...@googlegroups.com
On 10/03/2008, acolubri <andres....@gmail.com> wrote:

> I'm giving up on fakesrc for now. I think I'll start with CustomSrc.
> Or perhaps writing my own input source element :-)

Yeh, that's fun ;-)

CustomSrc is an effort to avoid having to muck with Pointers and
Structures full of callbacks.

In theory, all you have to do is override the appropriate src*()
method of CustomSrc in a subclass. Have a look at how it's done in
ReadableByteChannelSrc.java - be aware that it's still prototype code,
so it's not commented much/at all.

I'd prefer to improve CustomSrc, rather than having code out there
that directly fiddles with the structures/callbacks, since that stuff
is just nasty.


In either case, you want to go read the gstreamer plugin writer's
guide, and/or some more src elements from gstreamer. There seem to
be a few things which aren't documented, but are the sort of
knowledge you pick up via osmosis.


> Do you have any simple example on how to build a pipeline using
> CustomSrc?

You need to look at two things:

1) The class that defines the new source. Unlike using fakesrc, where
you just set properties on it, with CustomSrc, you need to extend
CustomSrc and over-ride methods. This hooks up your CustomSrc
subclass as a proper gstreamer source element.

You get a lot more control over what happens with your src element -
caps negotiation, setting caps, getting caps, creating buffers, etc.
There might still be a few methods I haven't exposed yet.

2) You just add it to a pipeline like any other src element. Look at
InputStreamSrcTest.java.
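
i.e. something like this (a sketch - MySrc stands in for whatever
CustomSrc subclass you write):

MySrc src = new MySrc("mysrc");
Pipeline pipe = new Pipeline("custom-src-pipeline");
pipe.addMany(src, videosink);
Element.linkMany(src, videosink);
pipe.setState(State.PLAYING);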

acolubri

Mar 26, 2008, 11:56:31 PM
to gstreamer-java
Hi,

From the code examples I came up with this source element descending
from CustomSrc:

public class GSBufferSrc extends CustomSrc {
    private byte[] bufferPixels;
    private int waitTime;
    private SynchronousQueue<ByteBuffer> synchQueue;

    public GSBufferSrc(SynchronousQueue<ByteBuffer> queue, int t, int w, int h, String name) {
        super(GSBufferSrc.class, name);

        synchQueue = queue;
        waitTime = t;
        bufferPixels = new byte[w * h * 4];
    }

    // Fill the gstreamer buffer from the next frame polled off the queue.
    private void readFully(long offset, Buffer buffer) throws IOException {
        try {
            ByteBuffer poll = synchQueue.poll(waitTime, TimeUnit.MILLISECONDS);

            poll.rewind();
            poll.get(bufferPixels);

            buffer.getByteBuffer().put(bufferPixels, 0, bufferPixels.length);

            buffer.setOffset(offset);
            buffer.setLastOffset(bufferPixels.length);
            buffer.setTimestamp(ClockTime.NONE);
        } catch (InterruptedException ie) {
            ie.printStackTrace();
        } catch (NullPointerException ne) {
            // poll() returns null if no frame arrives within waitTime
            ne.printStackTrace();
        }
    }

    @Override
    protected FlowReturn srcFillBuffer(long offset, int size, Buffer buffer) {
        try {
            readFully(offset, buffer);
            return FlowReturn.OK;
        } catch (IOException ex) {
            return FlowReturn.UNEXPECTED;
        }
    }

    @Override
    public boolean srcIsSeekable() {
        return false;
    }

    @Override
    protected boolean srcSeek(GstSegmentStruct segment) {
        return true;
    }
}

Basically, I'm polling the synchronized queue in the readFully method.
On the other side of the queue, I do:

// Converting the int[] pixels array into a byte array.
tmpBuffer.rewind();
tmpBuffer.asIntBuffer().put(parent.pixels);

try {
    synchQueue.offer(tmpBuffer, waitTime, TimeUnit.MILLISECONDS);
} catch (InterruptedException ie) {
    ie.printStackTrace();
}

This runs every time a new buffer is generated from the screen.

The new code compiles and runs, but synchQueue.poll gets called just
once, the first time a buffer is offered.

Is this the right way to use the CustomSrc class...?
