Hi,
This pull request exposes a branch that uses scipy.optimize on top of the regular full_calibration from the optv library.
In some cases this has been shown to give a better result, especially when the initial guess is good. The option is turned on automatically, but only when full_calibration fails.
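For reviewers who want a feel for the mechanism, here is a minimal, self-contained sketch of the idea, not the actual branch code: the regular full calibration is attempted first, and only if it fails does a scipy.optimize refinement of the current guess kick in. All names and the toy projection model below are placeholders, not the optv API.

```python
# Minimal sketch (placeholder names, toy projection model; NOT the optv API):
# try the regular full calibration first, and only if it fails, refine the
# current parameter guess by minimizing the reprojection residual with scipy.
import numpy as np
from scipy.optimize import minimize


def toy_project(params, points_3d):
    # Toy pinhole-like projection: params = [focal, tx, ty] (placeholder model).
    focal, tx, ty = params
    return focal * points_3d[:, :2] / points_3d[:, 2:3] + np.array([tx, ty])


def residual(params, points_3d, targets_2d):
    # Sum of squared reprojection errors -- the quantity scipy minimizes.
    return np.sum((toy_project(params, points_3d) - targets_2d) ** 2)


def calibrate(points_3d, targets_2d, initial_guess, full_calibration=None):
    """Try the regular full calibration first; fall back to scipy refinement."""
    if full_calibration is not None:
        try:
            return full_calibration(points_3d, targets_2d, initial_guess)
        except Exception:
            pass  # full calibration failed -> fall through to the refinement
    result = minimize(residual, np.asarray(initial_guess, dtype=float),
                      args=(points_3d, targets_2d), method="Nelder-Mead")
    return result.x


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    pts_3d = rng.uniform(1.0, 2.0, size=(20, 3))
    true_params = np.array([100.0, 5.0, -3.0])
    obs_2d = toy_project(true_params, pts_3d) + rng.normal(0.0, 0.01, size=(20, 2))
    # A decent initial guess, as mentioned above, helps the refinement a lot.
    print(calibrate(pts_3d, obs_2d, initial_guess=[90.0, 0.0, 0.0]))
```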
What we need now is for you to test it:
1. install conda
2. create a conda environment so you don't mess up your existing one
3. clone the repo from git and check out this branch
4. run your experiment and check whether the calibration has improved (one way to quantify this is sketched below).
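If it helps, here is one hedged way to put a number on "improved": compare the mean reprojection error of the detected calibration targets under the old and the new calibration. The coordinates below are placeholders for your own detected/reprojected points.

```python
# Hypothetical check (the coordinates below are placeholders): compare mean
# reprojection error of the detected targets under the old and new calibration.
import numpy as np

detected   = np.array([[101.2, 55.0], [200.5, 80.1], [310.7, 42.3]])  # detected target centers (px)
reproj_old = np.array([[103.0, 54.1], [198.9, 82.0], [312.5, 40.9]])  # reprojected, old calibration
reproj_new = np.array([[101.4, 55.1], [200.3, 80.0], [310.9, 42.5]])  # reprojected, new calibration

err_old = np.linalg.norm(reproj_old - detected, axis=1).mean()
err_new = np.linalg.norm(reproj_new - detected, axis=1).mean()
print(f"mean reprojection error: {err_old:.2f} px -> {err_new:.2f} px")
```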
Please use GitHub issues to report any bugs.