ImageWriter and ffmpeg


Johannes Larsch

Apr 16, 2015, 10:27:44 AM4/16/15
to bonsai...@googlegroups.com
Hi again,

You mentioned that ImageWriter can pipe frames in real time to ffmpeg. This sounds fantastic; can you provide any guidance on it?

Somewhat related: what I am really curious about is whether I can also generate a synthetic video stream in real time and pipe it to a display, to provide visual stimulation in a closed-loop fashion.
It seems this should work by assembling a matrix using the transform functions and piping it to ffplay or ffserver via ImageWriter?

Any hints or code snippets would be great!

Johannes

Johannes Larsch

Apr 17, 2015, 11:40:57 AM4/17/15
to bonsai...@googlegroups.com
Cool, this is super useful for better control over video encoding.

Now, with the image visualizers working, I ended up building a synthetic image using OpenCV in a Python node and showing it on a separate monitor.



On Friday, April 17, 2015 at 1:56:36 AM UTC+2, goncaloclopes wrote:


goncaloclopes

Apr 16, 2015, 7:56:36 PM4/16/15
to bonsai...@googlegroups.com

The ImageWriter node provides a way to pipe raw binary image data into a named pipe when the path contains the special prefix "\\.\pipe", which follows the MSDN convention for named pipes.

Now, for this to work you need to launch ffmpeg with the appropriate command-line arguments before starting Bonsai. Here's an example command line to encode a 1280x960 BGR video stream at 108 Hz using variable bitrate with mpeg4:

ffmpeg -y -f rawvideo -vcodec rawvideo -s 1280x960 -r 108 -pix_fmt bgr24 -i \\.\pipe\videotest -vb 20M -vcodec mpeg4 out.avi

Parameter explanation:
-y: answers "yes" to any question ffmpeg might ask (such as whether to overwrite an existing file)

-f rawvideo -vcodec rawvideo -s 1280x960 -r 108 -pix_fmt bgr24 -i \\.\pipe\videotest
This part is where you specify the format of the input source. Details:
-f rawvideo: this indicates the format of the video is raw (uncompressed) video
-vcodec rawvideo: same
-s 1280x960: size of video frames
-r 108: frame rate in FPS
-pix_fmt bgr24: the format of each image pixel (in our case each pixel is BGR (blue-green-red) with 8 bits per color, so 8+8+8 = 24 bits, hence bgr24)
-i \\.\pipe\videotest: this is the "named pipe" where the input is being streamed from (this is created by Bonsai using the ImageWriter node)
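As a quick sanity check (my own arithmetic, not from the original post), these input parameters imply a substantial uncompressed data rate through the pipe:

```python
# Uncompressed data rate implied by the input parameters above:
# 1280x960 frames, bgr24 (3 bytes per pixel), 108 frames per second.
width, height = 1280, 960
bytes_per_pixel = 3            # bgr24: one byte each for blue, green, red
fps = 108

frame_bytes = width * height * bytes_per_pixel
rate_bytes_per_s = frame_bytes * fps

print(frame_bytes)               # 3686400 bytes (~3.5 MiB) per frame
print(rate_bytes_per_s / 2**20)  # 379.6875, i.e. ~380 MiB/s through the pipe
```

So while the disk only sees the ~20 Mbit/s encoded output, the pipe itself has to sustain hundreds of MiB/s of raw frames.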

-vb 20M -vcodec mpeg4 out.avi
This part specifies the format for the output encoded video. Details:
-vb 20M: encoding proceeds at variable bitrate, dynamically adjusted as a function of image complexity (more complex frames receive more bits, simpler frames fewer)
-vcodec mpeg4: the codec used to compress images for output (mpeg4 seems to be the fastest)
out.avi: the name of the output file

Hopefully this will be enough to get you started. You can create multiple such pipes in parallel by launching many instances of ffmpeg.
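One way to manage several such instances is to build the argument list programmatically and launch one ffmpeg per pipe. A sketch in Python (the pipe names "cam0"/"cam1" and output file names are hypothetical; the arguments mirror the command line above):

```python
import subprocess  # used when you uncomment the launch line below

def ffmpeg_args(pipe_name, width=1280, height=960, fps=108, out="out.avi"):
    """Argument list mirroring the command line above; pipe_name is the
    path suffix you configure in the ImageWriter node."""
    return [
        "ffmpeg", "-y",
        "-f", "rawvideo", "-vcodec", "rawvideo",
        "-s", f"{width}x{height}", "-r", str(fps),
        "-pix_fmt", "bgr24",
        "-i", "\\\\.\\pipe\\" + pipe_name,
        "-vb", "20M", "-vcodec", "mpeg4", out,
    ]

# One ffmpeg instance per pipe, all launched before Bonsai starts:
commands = [ffmpeg_args("cam0", out="cam0.avi"),
            ffmpeg_args("cam1", out="cam1.avi")]
# procs = [subprocess.Popen(cmd) for cmd in commands]  # uncomment to launch
```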


Regarding the synthetic video streams... It's too bad you cannot use image visualizers in your system for now. They are implemented using OpenGL so they can render quite fast. You can build a synthetic matrix using Python nodes or other operators and then simply drag the visualizer window to a secondary monitor and maximize it to fill the screen (you can hit F11 to hide the borders if needed).

Otherwise, you can use ImageWriter to pipe data to arbitrary processes, so it should in principle work with ffplay or ffserver exactly the same way.
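To experiment with the ffplay route outside Bonsai first, here is a pure-Python sketch that generates a synthetic bgr24 stream (a sweeping white bar) and can pipe it to ffplay's stdin. The frame size, frame rate, and bar stimulus are hypothetical stand-ins for whatever your workflow produces; with ImageWriter you would write the same raw bytes into the named pipe and point ffplay's -i at it instead:

```python
import subprocess  # used by the commented-out streaming lines below

W, H = 320, 240  # hypothetical stimulus size

def frame(t):
    """One synthetic bgr24 frame: a 20-pixel white bar that sweeps
    left to right, one step per call. A pure-Python stand-in for
    whatever transform builds the stimulus matrix."""
    bar_x = t % W
    row = bytearray(W * 3)                      # one row, all black
    for x in range(bar_x, min(bar_x + 20, W)):
        row[3 * x:3 * x + 3] = b"\xff\xff\xff"  # white pixel (B, G, R)
    return bytes(row) * H                       # same row repeated H times

# ffplay accepts the same rawvideo format on stdin ("-i -"):
cmd = ["ffplay", "-f", "rawvideo", "-pixel_format", "bgr24",
       "-video_size", f"{W}x{H}", "-framerate", "60", "-i", "-"]
# player = subprocess.Popen(cmd, stdin=subprocess.PIPE)
# for t in range(600):                # uncomment to stream for ~10 s
#     player.stdin.write(frame(t))
```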



On Thursday, 16 April 2015 15:27:44 UTC+1, Johannes Larsch wrote: