The ImageWriter node can pipe raw binary image data into a named pipe: when the path starts with the special prefix "\\.\pipe", it is treated as a named pipe, following the MSDN convention for named pipes on Windows.
For this to work you need to launch ffmpeg with the appropriate command-line arguments before starting Bonsai. Here's an example command line to encode a 1280x960 BGR video stream at 108 Hz to MPEG-4 with a 20 Mbit/s target bitrate:
ffmpeg -y -f rawvideo -vcodec rawvideo -s 1280x960 -r 108 -pix_fmt bgr24 -i \\.\pipe\videotest -vb 20M -vcodec mpeg4 out.avi
Parameter explanation:
-y: answers "yes" to any prompt ffmpeg might show (for example, overwriting an existing output file)
-f rawvideo -vcodec rawvideo -s 1280x960 -r 108 -pix_fmt bgr24 -i \\.\pipe\videotest
This part specifies the format of the input source. Details:
-f rawvideo: this indicates the format of the video is raw (uncompressed) video
-vcodec rawvideo: likewise declares the input codec as raw (uncompressed) video
-s 1280x960: size of video frames
-r 108: frame rate in FPS
-pix_fmt bgr24: the format of each image pixel; in our case a pixel is BGR (blue-green-red) with 8 bits per channel, so 8+8+8 = 24 bits, hence bgr24
-i \\.\pipe\videotest: this is the "named pipe" where the input is being streamed from (this is created by Bonsai using the ImageWriter node)
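As a sanity check on these input settings, here is a quick sketch (Python; the numbers are taken directly from the command above) of the raw data rate the named pipe has to sustain:

```python
# Raw video stream parameters from the ffmpeg command above.
width, height = 1280, 960
bytes_per_pixel = 3          # bgr24: 8 bits each for blue, green, red
fps = 108

frame_size = width * height * bytes_per_pixel   # bytes per frame
data_rate = frame_size * fps                    # bytes per second

print(frame_size)            # 3686400 -> about 3.5 MiB per frame
print(data_rate / 2**20)     # 379.6875 -> roughly 380 MiB/s through the pipe
```

At almost 400 MB/s of uncompressed data, the encoder (and disk) needs to keep up with the pipe, which is why a fast codec matters here.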
-vb 20M -vcodec mpeg4 out.avi
This part specifies the format for the output encoded video. Details:
-vb 20M: sets the target video bitrate to 20 Mbit/s (-vb is an alias for -b:v); higher values give better quality at the cost of larger files
-vcodec mpeg4: the codec used to compress the output images (in informal tests, mpeg4 seems to be the fastest)
out.avi: the name of the output file
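If you want to test these encoder settings without Bonsai, a rough sketch of what ImageWriter does is to feed raw bgr24 frames into ffmpeg; here the hypothetical stand-in writes to ffmpeg's stdin ("-i -") instead of a Windows named pipe, but the encoder arguments are the same as in the command above:

```python
import shutil
import subprocess

# Hypothetical stand-in for Bonsai's ImageWriter: pipe raw bgr24 frames to
# ffmpeg over stdin ("-i -") instead of a named pipe. Same encoder settings
# as the command above, just a different transport.
width, height, fps, n_frames = 1280, 960, 108, 10
cmd = ["ffmpeg", "-y",
       "-f", "rawvideo", "-vcodec", "rawvideo",
       "-s", f"{width}x{height}", "-r", str(fps),
       "-pix_fmt", "bgr24", "-i", "-",
       "-vb", "20M", "-vcodec", "mpeg4", "out.avi"]

if shutil.which("ffmpeg"):  # only run if ffmpeg is on the PATH
    proc = subprocess.Popen(cmd, stdin=subprocess.PIPE)
    frame = bytes(width * height * 3)        # one all-black bgr24 frame
    for _ in range(n_frames):
        proc.stdin.write(frame)
    proc.stdin.close()
    proc.wait()
```

This is just a sketch for checking that the command line encodes correctly; in the Bonsai setup the frames come from the ImageWriter node through the named pipe instead.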
Hopefully this is enough to get you started. You can run multiple such pipes in parallel by launching several instances of ffmpeg, each reading from its own named pipe.
Regarding the synthetic video streams... It's too bad you cannot use image visualizers in your system for now. They are implemented using OpenGL so they can render quite fast. You can build a synthetic matrix using Python nodes or other operators and then simply drag the visualizer window to a secondary monitor and maximize it to fill the screen (you can hit F11 to hide the borders if needed).
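As a minimal sketch of what such a synthetic matrix might look like at the byte level (this is illustrative, not the Bonsai Python node API): a horizontal blue-to-red gradient packed as a raw bgr24 frame, matching the pixel format expected by the ffmpeg command above:

```python
# Build one synthetic 1280x960 bgr24 frame as raw bytes:
# a horizontal gradient from pure blue (left) to pure red (right).
width, height = 1280, 960

row = bytearray()
for x in range(width):
    v = (x * 255) // (width - 1)     # 0 at the left edge, 255 at the right
    row += bytes((255 - v, 0, v))    # B, G, R for this column

frame = bytes(row) * height          # repeat the row for every scan line
assert len(frame) == width * height * 3   # 3,686,400 bytes, as expected
```

A frame like this could be written to the pipe directly, since rawvideo input is just the pixel bytes concatenated frame after frame with no header.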
Otherwise, you can use ImageWriter to pipe data to arbitrary processes, so it should in principle work with ffplay or ffserver exactly the same way.
On Thursday, 16 April 2015 15:27:44 UTC+1, Johannes Larsch wrote: