gst-launch-1.0 filesrc location=$src ! \
decodebin name=dec ! \
audioconvert ! faac ! queue ! mux. \
dec. ! videoconvert ! x264enc pass=5 quantizer=21 ! queue ! mux. \
mpegtsmux name=mux ! filesink location=/tmp/test.ts
import java.awt.EventQueue;

import org.freedesktop.gstreamer.Bin;
import org.freedesktop.gstreamer.Bus;
import org.freedesktop.gstreamer.Element;
import org.freedesktop.gstreamer.ElementFactory;
import org.freedesktop.gstreamer.Gst;
import org.freedesktop.gstreamer.Pipeline;

public class HandsOff
{
    // assumed field: path to the input file ($src in the gst-launch line above)
    private static String srcVideo = "/path/to/input.mp4";

    public static void main(String[] argv)
    {
        Gst.init();
        EventQueue.invokeLater(HandsOff::mission1);
    }

    public static void mission1()
    {
        Pipeline pipe = new Pipeline();
        if (true) {
            buildPipeline1(pipe);
        } else {
            buildPipeline2(pipe);
        }
        Bus bus = pipe.getBus();
        addDebuggingBusListeners(bus);
        if (true) {
            pipe.play();
        } else {
            new Thread(pipe::play).start();
        }
    }

    public static void buildPipeline1(Pipeline pipe)
    {
        Element fs = ElementFactory.make("filesrc", "src");
        fs.set("location", srcVideo);
        //Element dec = ElementFactory.make("decodebin", "dec");
        Bin bin = Bin.launch("decodebin name=dec ! " +
                "audioconvert ! faac ! queue ! mux. " +
                "dec. ! videoconvert ! x264enc pass=5 quantizer=21 ! queue ! mux. " +
                "mpegtsmux name=mux ! filesink location=/tmp/test.ts", true);
        pipe.addMany(fs,
                // dec,
                bin);
        Element.linkMany(fs,
                // dec,
                bin);
    }

    public static void buildPipeline2(Pipeline pipe)
    {
        Bin bin = Bin.launch("filesrc name=src ! decodebin name=dec ! " +
                "audioconvert ! faac ! queue ! mux. " +
                "dec. ! videoconvert ! x264enc pass=5 quantizer=21 ! queue ! mux. " +
                "mpegtsmux name=mux ! filesink location=/tmp/test.ts", true);
        Element fs = bin.getElementByName("src");
        fs.set("location", srcVideo);
        pipe.add(bin);
        //pipe.link(bin);
    }

    public static void addDebuggingBusListeners(Bus bus)
    {
        bus.connect((Bus.EOS) gstObject -> System.out.println("EOS"));
        bus.connect((Bus.ERROR) (gstObject, i, s) -> System.out.println("ERROR " + i + " " + s + " " + gstObject));
        bus.connect((Bus.WARNING) (gstObject, i, s) -> System.out.println("WARN " + i + " " + s + " " + gstObject));
    }
}
ERROR 12 Your GStreamer installation is missing a plug-in. DecodeBin: [dec]

If I use buildPipeline1(), which separates the filesrc from the rest of the pipeline, I get:

ERROR 1 Internal data stream error. Element: [typefind]
ERROR 1 Internal data stream error. BaseSrc: [src]
$ java -version
java version "1.8.0_152"
Java(TM) SE Runtime Environment (build 1.8.0_152-b16)
Java HotSpot(TM) 64-Bit Server VM (build 25.152-b16, mixed mode)
I was able to use the PlayBinVideoPlayer to watch the video in question.
I am running gentoo. The 1.12.3 gst-plugins are installed.
Switching from Bin.launch() to Pipeline.launch() did prevent the errors, and it appears the transcode is actually generating /tmp/test.ts.
I suspect that the final form of my app will not allow me to use Pipeline.launch(), because I need to drop frames that are outside a target time range (is there a GStreamer element that does time trimming?), and that means I will probably be using Java to pull samples and drop or forward them from pipeline to pipeline.
So, what extra steps are necessary when filling a Pipeline with Bin-s to make the pipeline work as well as Pipeline.launch()-created pipelines work?
If we can get this basic transcoding example working I am going to recommend it be added to the examples git repo so the next person can learn from it. The existing examples seem to lack any that run as a finite process to completion.
Putting a decodebin in a Bin using Bin.launch(..., true); is going to be a problem. Passing true as the second parameter creates ghost pads on the bin for any unconnected pads.

On Mon, 5 Feb 2018 at 20:21 Bob F <thot...@gmail.com> wrote:
> So, what extra steps are necessary when filling a Pipeline with Bin-s to make the pipeline work as well as Pipeline.launch()-created pipelines work?

Nothing! But you're going to have to understand why the Bin approach fails and how to fix it, which means understanding sometimes pads and ghost pads. The upstream documentation is worth a read: https://gstreamer.freedesktop.org/documentation/application-development/basics/pads.html

The likely problem you have is that decodebin has a sometimes src pad (check gst-inspect decodebin). At the point the bin is built, decodebin and the audioconvert / videoconvert elements are not yet connected, so a ghost pad gets created for one of the convert elements, which means decodebin can't connect later.
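To make that failure mode concrete, here is a tiny self-contained model of what ghostUnlinkedPads = true does. The classes below are illustrative stand-ins, not the gst1-java-core API: any pad still unlinked when the bin is built gets ghosted, and since decodebin's sometimes src pad doesn't exist yet, it's the convert elements' sink pads that end up exposed on the bin boundary.

```java
import java.util.ArrayList;
import java.util.List;

// Toy model of Bin.launch(description, true): every pad that is still
// unlinked at construction time gets a ghost pad on the bin boundary.
class ToyPad {
    final String name;
    final boolean linked;
    ToyPad(String name, boolean linked) { this.name = name; this.linked = linked; }
}

public class GhostPadModel {
    // Mirrors the ghostUnlinkedPads = true behaviour.
    static List<String> ghostUnlinked(List<ToyPad> pads) {
        List<String> ghosted = new ArrayList<>();
        for (ToyPad p : pads) {
            if (!p.linked) ghosted.add(p.name);
        }
        return ghosted;
    }

    public static void main(String[] args) {
        // Pads that exist inside the bin at Bin.launch() time. decodebin's
        // src pad is absent: sometimes pads only appear once the stream is typefound.
        List<ToyPad> pads = new ArrayList<>();
        pads.add(new ToyPad("decodebin.sink", false));    // fed by filesrc, which lives outside the bin
        pads.add(new ToyPad("audioconvert.sink", false)); // meant for decodebin, later
        pads.add(new ToyPad("videoconvert.sink", false)); // meant for decodebin, later
        pads.add(new ToyPad("filesink.sink", true));      // already linked from mpegtsmux

        // The convert sink pads get ghosted onto the bin boundary, so when
        // decodebin finally grows its sometimes src pad there is no free
        // internal sink pad left for it to link to.
        System.out.println("ghosted: " + ghostUnlinked(pads));
    }
}
```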
Pull requests for new examples would be very welcome. There are a lot of examples in the 0.10 bindings that have not been ported over yet too.
It took me a long time to figure out how to link the various pads (hierarchy is a major issue).
The fact that pad.link() returns error codes instead of throwing exceptions is actually quite a major pain in the butt. Maybe I should create a new method (linkOrThrow()?) and pull request that.
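For what it's worth, a linkOrThrow() helper could look roughly like this. The PadLinkReturn enum here is a local stand-in mirroring GStreamer's result codes so the sketch compiles on its own; a real version would wrap pad.link(...) from gst1-java-core instead of taking the result as a parameter.

```java
// Local stand-in for GStreamer's pad-link result codes; a real helper
// would use org.freedesktop.gstreamer.PadLinkReturn and Pad instead.
enum PadLinkReturn { OK, WRONG_HIERARCHY, WAS_LINKED, NOFORMAT, REFUSED }

final class Pads {
    private Pads() { }

    // Turn the error-code style of pad.link() into an exception so a
    // failed link cannot be silently ignored.
    static void linkOrThrow(PadLinkReturn result, String src, String sink) {
        if (result != PadLinkReturn.OK) {
            throw new IllegalStateException(
                    "Failed to link " + src + " -> " + sink + ": " + result);
        }
    }
}

public class LinkOrThrowDemo {
    public static void main(String[] args) {
        Pads.linkOrThrow(PadLinkReturn.OK, "dec.src_0", "venc.sink"); // fine, no exception
        try {
            // WRONG_HIERARCHY is the classic "pads live in different bins" error.
            Pads.linkOrThrow(PadLinkReturn.WRONG_HIERARCHY, "dec.src_0", "mux.sink_0");
        } catch (IllegalStateException e) {
            System.out.println("caught: " + e.getMessage());
        }
    }
}
```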
Even after I got the filter graph looking correct in graphviz, the pipeline does not actually run.
What steps are missing to get the pipeline moving again after the sometimes pads have been linked?
I just tried your code and couldn't get it to work. But moving the venc bin into mission1 and doing this does:

pipe.addMany(venc, mux, fileSink);
Pipeline.linkMany(venc, mux, fileSink);
dec.connect((Element.PAD_ADDED) (d, p) -> {
    d.link(venc);
});

Connecting venc to mux in the callback works, but not adding it to pipe - I think it's something to do with the play state not being picked up - can't recall off the top of my head.
I think there needs to be a way to add bins to a pipeline dynamically. ...
Maybe I'll experiment with setting the state on the newly-linked transcode pipeline.
Actually, I forgot that bit. You need to manage the state of dynamically added elements yourself. So, by commenting out the whole fakesink / element trash block (there's no need to link a pad you don't need), and calling venc.setState(State.PLAYING) at the end of connectVideoReencode(), your code works here.
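The state issue can be sketched with a toy model (simplified stand-ins, not the real bindings): elements added to an already-PLAYING pipeline start in NULL and stay there until you sync their state yourself, which is what the venc.setState(State.PLAYING) call (or syncStateWithParent() in the real API) fixes.

```java
// Toy model of GStreamer element state; not the gst1-java-core classes.
enum State { NULL, READY, PAUSED, PLAYING }

class ToyElement {
    final String name;
    State state = State.NULL;
    ToyElement(String name) { this.name = name; }
}

class ToyPipeline {
    State state = State.PLAYING;
    // Adding an element does NOT change its state -- this mirrors GStreamer,
    // where a dynamically added element must have its state managed by you.
    ToyElement add(ToyElement e) { return e; }
}

public class DynamicStateDemo {
    public static void main(String[] args) {
        ToyPipeline pipe = new ToyPipeline();
        ToyElement venc = pipe.add(new ToyElement("venc"));
        System.out.println("after add: " + venc.state);  // NULL -- no data will flow

        // The missing step: bring the new element up to the parent's state,
        // the equivalent of venc.setState(State.PLAYING).
        venc.state = pipe.state;
        System.out.println("after sync: " + venc.state); // PLAYING
    }
}
```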
Well, "works" is not exactly the verb I would use. While it does cause the pipeline to actually run and generate a file, that transport stream ends up being a steaming pile of unplayable garbage.
The resulting transport stream seems to put the video payload into PID 0 which is supposed to be the PAT. The PMT even says "I'm putting the video in PID 0" which is as wrong as a football bat.
I suspect that the MPEG TS mux wasn't coded well enough to support pads being added after its initialization phase. I suspect that I'll have to hold off on creating the mux until the decodebin is done telling me about sometimes pads and I'm done linking transcoding pipelines to those sometimes pads.
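That "hold off on the mux" idea can be sketched with toy stand-ins (again, not the real bindings): buffer the sometimes pads as pad-added fires, and only build and link mpegtsmux once no-more-pads arrives, so the muxer sees all of its sink pads before it starts negotiating.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Toy decodebin that emits pad-added events followed by no-more-pads,
// mimicking the real element's sometimes-pad lifecycle.
class ToyDecodebin {
    private Consumer<String> padAdded;
    private Runnable noMorePads;
    void onPadAdded(Consumer<String> cb) { padAdded = cb; }
    void onNoMorePads(Runnable cb) { noMorePads = cb; }
    // Simulate typefind completing: two sometimes pads, then no-more-pads.
    void simulateStream() {
        padAdded.accept("src_0 (video)");
        padAdded.accept("src_1 (audio)");
        noMorePads.run();
    }
}

public class DeferMuxDemo {
    public static void main(String[] args) {
        ToyDecodebin dec = new ToyDecodebin();
        List<String> pads = new ArrayList<>();
        dec.onPadAdded(pads::add);
        dec.onNoMorePads(() -> {
            // Only now create mpegtsmux and request one sink pad per branch,
            // so no pad is added to the mux after it has initialized.
            System.out.println("creating mux with " + pads.size() + " inputs: " + pads);
        });
        dec.simulateStream();
    }
}
```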