Hi there.
This request has a little backstory.
I am the proud owner of a Samsung Gear 360 camera, and also a big advocate of open source software.
What I did not know prior to buying the camera is that it does not stitch the video in-camera.
The raw output from this camera is two 190 degree 1920 x 1920 fisheye images on one 3840 x 1920 canvas.
To use this footage, one must stitch it with a Windows-only application, or with an app on a phone.
Neither approach allows any control over the stitching process, and their closed-source nature sticks like a thorn in my side.
There is some very expensive software from Kolor that will also do the stitching, but shelling out 680 euros for what is, for me, a hobby is not an option either.
There is a project on GitHub by Ultramango that uses a combination of FFmpeg, a Hugin save file, and enblend/enfuse to deconstruct the video into frames, do the circular-fisheye-to-equirectangular conversion, and then stitch it all back together. I have helped a little in fine-tuning the templates used, and a little in streamlining the process. It's a wonderful use of open source, and the quality of, and control over, the video is really good.
Unfortunately, since the pixel positions for each image are re-calculated from the lens information in the .pto file, the process is quite slow.
I have been playing around with ffmpeg's remap filter, which moves pixels around in each frame based on a map consisting of two PGM images, encoding the x and y source coordinates of each destination pixel respectively.
With this I am able to get a decent 5 fps conversion. The problem that I am having with the remap filter in ffmpeg is that the example program that generates the maps assumes a perfect fisheye lens.
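To make the idea concrete, here is a rough Python sketch of the kind of maps that example program produces, assuming an ideal equidistant fisheye and hypothetical sizes (one 1920x1920 circle, one hemisphere of output per lens); the names and parameters here are my own, not from the example program:

```python
import math

def build_maps(out_w, out_h, src_size, fov):
    """Build remap source-coordinate maps that turn one circular fisheye
    (src_size x src_size, fov in radians) into an equirectangular image
    covering one hemisphere (out_w x out_h)."""
    xmap, ymap = [], []
    for j in range(out_h):
        xrow, yrow = [], []
        phi = (j / out_h - 0.5) * math.pi        # latitude of this output row
        for i in range(out_w):
            theta = (i / out_w - 0.5) * math.pi  # longitude, one hemisphere
            # view direction, camera looking along +z
            px = math.cos(phi) * math.sin(theta)
            py = math.sin(phi)
            pz = math.cos(phi) * math.cos(theta)
            rho = math.atan2(math.hypot(px, py), pz)  # angle off optical axis
            # ideal equidistant fisheye: image radius proportional to angle
            r = rho / (fov / 2) * (src_size / 2)
            hyp = math.hypot(px, py) or 1.0
            sx = src_size / 2 + r * px / hyp
            sy = src_size / 2 + r * py / hyp
            # remap paints out-of-range source coordinates black
            xrow.append(int(round(sx)) if 0 <= sx < src_size else 65535)
            yrow.append(int(round(sy)) if 0 <= sy < src_size else 65535)
        xmap.append(xrow)
        ymap.append(yrow)
    return xmap, ymap

def write_pgm(path, rows):
    """Write a map as an ASCII PGM (P2) with a 16-bit maxval,
    since the source coordinates exceed 255."""
    with open(path, "w") as f:
        f.write("P2\n%d %d\n65535\n" % (len(rows[0]), len(rows)))
        for row in rows:
            f.write(" ".join(map(str, row)) + "\n")
```

The resulting files are then used as documented on the RemapFilter wiki page, along the lines of ffmpeg -i in.mp4 -i xmap.pgm -i ymap.pgm -lavfi remap out.mp4.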
What I am hoping to do here is to make a translation map based on a Hugin .pto file rather than a perfect lens.
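I don't have .pto-driven code yet, but as I understand it the heart of Hugin's lens model is the radial distortion polynomial (the a, b, c parameters stored in the .pto), with d fixed so the coefficients sum to 1 and radii normalized to half the smaller image dimension. A hedged sketch of just that correction, which could replace the ideal-lens radius when generating the maps:

```python
def radial_correct(r, a, b, c):
    """Panotools/Hugin radial polynomial, as I understand it:
    r_corrected = (a*r^3 + b*r^2 + c*r + d) * r, with d = 1 - a - b - c
    so that r = 1 (the normalization radius) maps to itself.
    r is normalized to half the smaller image dimension."""
    d = 1.0 - (a + b + c)
    return (((a * r + b) * r + c) * r + d) * r
```

With a = b = c = 0 this is the identity, i.e. exactly the perfect lens that the current example program assumes.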
Is there anybody in the Hugin community that can point me to the right person who might be interested to help me?
Kind regards,
Evert Vorster
For the curious, here are a few links:
The project which uses the saved Hugin template:
https://github.com/ultramango/gear360pano

The description of the remap filter and the program that makes example maps for use in ffmpeg:
https://trac.ffmpeg.org/wiki/RemapFilter

Example footage from before the fine-tuning in Ultramango:
https://youtu.be/GWAwfO16Q3s

Example footage after the fine-tuning (this video also uses multiblend instead of enblend):
https://youtu.be/TfPBc6LcgPQ

Example footage done with ffmpeg's remap filter (there is still more work to do: vignetting and lens corrections):
https://youtu.be/6kTC9IURlTw

The original footage:
https://youtu.be/1udaziKqkFA