Alternatively, have a look at the CustomSrc class - it allows you to
create real Element subclasses that can be plugged into a pipeline.
But it's highly experimental, and might not work on Windows (there was
some difference in the structure sizes).
You probably want to use a BlockingQueue or something to shuffle data
from addFrame() to the fakesrc element. It should be safe to have the
handoff listener block, waiting until the queue has a frame in it.
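Something like this, maybe (just a sketch - I'm assuming your frames
arrive as ByteBuffers, and that Buffer's getByteBuffer() is how you're
filling buffers in the handoff; adjust to taste):
import java.nio.ByteBuffer;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;
import org.gstreamer.Buffer;

private final BlockingQueue<ByteBuffer> frames =
        new LinkedBlockingQueue<ByteBuffer>();

// Called from your application thread.
public void addFrame(ByteBuffer frame) {
    frames.add(frame);
}

// Called from the fakesrc handoff listener - blocks until a frame arrives.
private void fillBuffer(Buffer buffer) {
    try {
        buffer.getByteBuffer().put(frames.take());
    } catch (InterruptedException ex) {
        Thread.currentThread().interrupt();
    }
}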
Make sure you're keeping a reference to gpipe around, so it's not
getting disposed. Sounds dumb, but I've forgotten this myself.
There's also a lot of cruft in FakeSrcTest - leftover from when I knew
less about gstreamer, and it was a pretty straightforward conversion
from that example.
The 'conv' element is not needed - RGBDataSink includes its own colour
converter.
Do the gpipe.addMany() before linking the elements together. That's
the proper order, and I think FakeSrcTest had it the wrong way around.
Have a look at the updated FakeSrcTest code (in mercurial) - I now do
rate limiting with a videorate element. Besides putting it and a
capsfilter for it in the pipeline (before the video sink), you need to
add:
fakesrc.set("sync", true);
fakesrc.set("is-live", true);
fakesrc.set("filltype", 1); // Don't fill the buffer before handoff
(the last line is there for efficiency)
The videorate stuff is:
final Element videorate = ElementFactory.make("videorate", "videorate");
final Element ratefilter = ElementFactory.make("capsfilter", "RateFilter");
// limit frame rate to 2 frames per second
ratefilter.setCaps(Caps.fromString("video/x-raw-rgb, framerate=2/1"));
And you link the whole thing together with:
pipeline.addMany(fakesrc, srcfilter, videorate, ratefilter, videosink);
Element.linkMany(fakesrc, srcfilter, videorate, ratefilter, videosink);
> However, it seems that the handoff function in the videsrc gets called
> much more often. Something like 600 times per second. Is this ok?
Make sure you're setting the "sync" property on the fakesrc to true.
I think you can also fiddle with properties on the videorate element
(properties, not caps) to change how many buffers (video frames) it
will queue up internally.
> Even though there is this line:
>
> Caps fltcaps = new Caps("video/x-raw-rgb, width=" + w + ", height=" +
> h + ", bpp=16, depth=16, framerate=25/1");
>
> What is the effect of this framerate value?
I'm not sure. Other than that example program, I haven't used the
fakesrc element for anything, so you know as much as I do. I vaguely
remember that if you don't put it in, it doesn't work at all.
I kinda understand now why the gstreamer guys say you're on your own
if you use fakesrc or fakesink for injecting/extracting data - they
just don't seem to behave quite right.
You could try creating a src with CustomSrc - but as I said, that
stuff is pretty experimental still, and I was hoping not to solidify
it until post-1.0.
1) Have a look at the various --gst-debug= flags and see if they
produce some useful output.
2) Insert an 'identity' element just before the theoraenc, the oggmux
and the filesink with a handoff callback that just prints some Buffer
details (offset, etc) and which sink the buffer is about to hit - see
the sketch after this list.
3) Try inserting a 'ffmpegcolorspace' element before the theoraenc. I
_think_ that should convert between the output of the fakesrc and what
theoraenc wants automagically (no need to put a capsfilter after the
ffmpegcolorspace).
4) Try dropping the 'depth=16, bpp=16' off the fakesrc caps. This
would likely confuse the way you're injecting data though, so leave it
until last.
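For 2), roughly this (a sketch - connect the handoff signal the same
way FakeSrcTest wires up its handoff listener, 'spy' is just a name I
picked, and check the Buffer class for the exact getters):
final Element spy = ElementFactory.make("identity", "spy");
// Add and link it just before the element you want to watch, e.g.
// before theoraenc, and repeat for oggmux and the filesink.
// Then, in the handoff callback (match whatever listener signature
// FakeSrcTest uses):
System.out.println(element.getName() + " handing off " + buffer
        + " timestamp=" + buffer.getTimestamp());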
I wonder if dropping the bpp=xx, depth=xx bit from the srcfilter after
fakesrc and adding another ffmpegcolorspace + capsfilter with those
parameters after it would work?
i.e. add it after srcfilter, so the pipeline is:
fakesrc, srcfilter, bppcolorspace, bppfilter, ... rest of the pipeline as is
On bppfilter, just set the media type, width, height, bpp and depth.
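Untested sketch of what I mean (bppcolorspace and bppfilter are just
names I made up, and I'm reusing the w/h variables from your caps
above):
final Element bppcolorspace =
        ElementFactory.make("ffmpegcolorspace", "bppcolorspace");
final Element bppfilter = ElementFactory.make("capsfilter", "bppfilter");
// media type, width, height, bpp, depth - but no framerate
bppfilter.setCaps(Caps.fromString("video/x-raw-rgb, width=" + w
        + ", height=" + h + ", bpp=16, depth=16"));
pipeline.addMany(fakesrc, srcfilter, bppcolorspace, bppfilter,
        theoraenc, oggmux, filesink);
Element.linkMany(fakesrc, srcfilter, bppcolorspace, bppfilter,
        theoraenc, oggmux, filesink);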
I have no idea if that will make any difference though.
Fakesrc itself doesn't care about bpp, depth, or any of those things -
those properties are just set on the buffer that is sent out.
From digging around in fakesrc, basesrc and videotestsrc, for video
frames, we might need to change the format from 'bytes' to 'time', so
the segment-start events get sent out correctly. I'll probably have
to expose some api to get that done. Why 16bit bpp/depth works
without this is a bit of a mystery.
On 03/03/2008, acolubri <andres....@gmail.com> wrote:
>
> > From digging around in fakesrc, basesrc and videotestsrc, for video
> > frames, we might need to change the format from 'bytes' to 'time', so
> > the segment-start events get sent out correctly. I'll probably have
> > to expose some api to get that done.
>
>
> Cool, keep me posted about this.
The only bit I think that is needed is BaseSrc.setFormat().
I added a FakeSrc class that is returned when you get a "fakesrc"
element from a factory,
so you can do something like:
FakeSrc fakesrc = (FakeSrc) ElementFactory.make("fakesrc", "my_src");
fakesrc.setFormat(Format.TIME);
>
> Another option would be to ditch fakesrc altogether and start using
> this new element GstAppSrc that is being mentioned as much better
> suited for injecting buffers into the pipeline.
> You also mentioned a CustomSrc class which is part of gstreamer-java
> already, right? Is it the same thing as GstAppSrc?
CustomSrc isn't quite the same as GstAppSrc. GstAppSrc allows you to
shove buffers at it, which it then queues up and sends out on the
pipeline.
CustomSrc doesn't handle any of that - it's just a thin layer to enable
you to write gstreamer source elements in Java, where you want more
control than fakesrc gives you.
The downside of CustomSrc is that it's pretty experimental, and it has
problems on Windows (different struct sizes somewhere). I haven't
bothered to fix it, because it hasn't been a high priority ... until
now.
>
> However, the code I wrote based on fakesrc is so close to working ok...
> all that is needed is to fix these issues with the buffers
It might be an idea to dig around in the gstvideotestsrc.c source, and
see if you can make sense of what it does.
Maybe a port of gstvideotestsrc to java using CustomSrc and/or FakeSrc
would be an idea. At least it would help to figure out how to write a
video source correctly.
Yeh, it's in hg. Make sure you do hg pull -u if you're doing it on the
command line.
The parameters we set on the Buffer seem to affect things. I just got
ReadableByteStreamSrc working by doing:
buffer.setTimestamp(ClockTime.NONE); // on each outgoing buffer
The only way I found that was by reading the fdsrc source.
> I'm giving up on fakesrc for now. I think I'll start with CustomSrc.
> Or perhaps writing my own input source element :-)
Yeh, that's fun ;-)
CustomSrc is an effort to avoid having to muck with Pointers and
Structures full of callbacks.
In theory, all you have to do is over-ride the appropriate src*()
method of CustomSrc in a subclass. Have a look at how it's done in
ReadableByteChannelSrc.java - be aware that it's still prototype code,
so it's not commented much/at all.
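If it helps, the shape of it is something like this - but the
constructor args and the method name here are guesses from memory, so
copy the real override from ReadableByteChannelSrc.java:
public class MyVideoSrc extends CustomSrc {
    public MyVideoSrc(String name) {
        super(MyVideoSrc.class, name); // guessed - check ReadableByteChannelSrc
    }

    // Made-up name/signature: override whichever src*() method
    // ReadableByteChannelSrc actually overrides to fill buffers.
    @Override
    protected void srcFillBuffer(Buffer buffer, long offset, int size) {
        buffer.getByteBuffer().put(nextFrame()); // nextFrame() is your code
    }
}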
I'd prefer to improve CustomSrc, rather than having code out there
that directly fiddles with the structures/callbacks, since that stuff
is just nasty.
In either case, you want to go read the gstreamer plugin writers
guide, and/or some more src elements from gstreamer. There seem to
be a few things which aren't documented, but are some sort of
knowledge you pick up via osmosis.
> Do you have any simple example on how to build a pipeline using
> CustomSrc?
You need to look at two things:
1) The class that defines the new source. Unlike using fakesrc, where
you just set properties on it, with CustomSrc, you need to extend
CustomSrc and over-ride methods. This hooks up your CustomSrc
subclass as a proper gstreamer source element.
You get a lot more control over what happens with your src element -
caps negotiation, setting caps, getting caps, creating buffers, etc.
There might still be a few methods I haven't exposed yet.
2) You just add it to a pipeline like any other src element. Look at
InputStreamSrcTest.java.
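i.e. something like this (using the hypothetical MyVideoSrc from the
earlier sketch, and xvimagesink just as an example sink):
Gst.init("CustomSrcExample", args);
Pipeline pipeline = new Pipeline("custom-src-example");
Element src = new MyVideoSrc("my_src"); // your CustomSrc subclass
Element videosink = ElementFactory.make("xvimagesink", "videosink");
pipeline.addMany(src, videosink);
Element.linkMany(src, videosink);
pipeline.setState(State.PLAYING);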