Hi Roger,
On 1/16/26 12:28, Roger Oberholtzer wrote:
> 1. In the paid PTGui Pro, is it the case that the .pts file is a JSON
> file that can be edited? Although we will be using a template file,
> there are some things we may wish to set in that template. Like the
> initial image orientation. For example, in one configuration the back
> camera image will have a yaw of -180 degrees relative to the front image.
Yes, the project files are in JSON(*) format; you can modify them quite
easily. I have attached a sample project file.
(*) with one quirk: PTGui expects a certain ordering of some objects.
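For scripted editing, a minimal sketch in Python: it uses an illustrative stand-in for the template (the real .pts schema differs, so inspect an actual project file for the key names). Python's json module preserves object key order on round-trip, which helps with the ordering quirk mentioned above.

```python
import json

# An illustrative stand-in for a PTGui .pts template; the real schema
# differs, so inspect an actual project file for the key names.
template_text = """{
 "project": "illustrative",
 "images": [
  {"name": "front.jpg", "yaw": 0.0, "pitch": 0.0, "roll": 0.0},
  {"name": "back.jpg",  "yaw": 0.0, "pitch": 0.0, "roll": 0.0}
 ]
}"""

# json.loads keeps object keys in file order, which matters because
# PTGui expects a certain ordering of some objects.
project = json.loads(template_text)

# Set the back-camera image 180 degrees of yaw away from the front image.
project["images"][1]["yaw"] = -180.0

# Serialize the edited project; key order is preserved on round-trip.
edited = json.dumps(project, indent=1)
print(edited)
```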
>
> 2. Are there guidelines on how best to prepare the images when looking
> for control points? For example, is it better to rotate the images to
> get close first, and then let PTGui do control points? Or is it better
> to leave the images alone and tell how they are rotated in the project
> file? I don't need specific answers here/now. I'm just curious if there
> is useful information to help us do the best thing.
If you know the alignment of the images, pre-aligning can be helpful in
two ways: your alignment can be used as a fallback in case PTGui fails
to find control points. And PTGui can limit searching for control points
to pairs of images that it knows to be overlapping, which will speed up
the alignment process. But the control point generator will work just as
well with the source images in any random orientation.
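If you script the pre-alignment yourself, one simple way to decide which image pairs overlap is to compare their yaw separation against their combined field of view. This is just a sketch of the idea, not PTGui's own logic:

```python
def overlaps(yaw_a, yaw_b, hfov_a, hfov_b):
    """Return True if two images' horizontal fields of view overlap.

    Yaws and FOVs are in degrees; the yaw difference is wrapped to
    the range [-180, 180] before comparing.
    """
    diff = (yaw_b - yaw_a + 180.0) % 360.0 - 180.0
    return abs(diff) < (hfov_a + hfov_b) / 2.0

# Two back-to-back fisheye images with ~190 degree FOV each:
# 180 degrees apart, but their fields of view still overlap.
print(overlaps(0.0, -180.0, 190.0, 190.0))
```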
>
> 3. We think the main issue we have is parallax differences between the
> two images. We are never at the exact same location when the images are
> taken. And the LadyBug itself has optimized the joins of its cameras at
> some fixed distance. I would think that control points are used to
> address parallax. So even if we can align the images very close, control
> points will still be needed?
The issue is that images with parallax can never be stitched perfectly.
Keep in mind that even a single Ladybug will suffer from parallax, so I
think stitching errors will be unavoidable for your project.
Adding more control points in problem areas can somewhat improve the
result, but usually this will pull other parts of the panorama out of
alignment. My usual advice is to have lots of overlap, so the automatic
seam finder is able to hide any stitching errors. Place control points
on distant objects only for consistent alignment.
> 4. Is it possible to set it so that one of the two cameras does not move
> (as in the whole image is rotated). The reason for this is that we need
> the image orientation so we can then use the images to colorize high
> resolution LiDAR data (using TerraPhoto).
Yes, you can fix an image by excluding its yaw, pitch and roll from the
optimization.
> 5. Is the mask applied so that there is a sharp break at the join? We
> have had improvements in the results by using a mask with a gradient
> that selects how much to take from each image. This fuzziness is not
> obvious but makes some otherwise sharp disconnects less obvious. Is
> there some flexibility related to this in PTGui?
PTGui has two blenders: the Zero Overlap blender performs blending at
the seams, without using any overlap. This creates a somewhat sharp seam
at discontinuities, but it does not suffer from ghosting artifacts.
And there's the Multiband blender, which creates smoother overlaps but
it will show ghosts of displaced objects.
> 6. If PTGui does not find adequate/useful control points, are there
> alternate options? Can this be detected in the created project file when
> run in command line mode?
There's a setting ("but only if enough control points found") in the
Project Settings. If this is enabled, PTGui will not stitch that
particular panorama after a failed alignment attempt. You could detect
this in your software because the panoramic image is not created.
If this checkbox is not checked, PTGui will continue stitching based on
the settings given in the template.
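So in an automated pipeline, detecting a failed alignment comes down to checking whether the output image exists after the batch run. A minimal sketch; the -stitchnogui flag is taken from PTGui's command-line support, but verify it against your version's documentation:

```python
import os

def panorama_created(output_file):
    """A stitched panorama exists iff PTGui wrote the output image.

    With "but only if enough control points found" enabled in the
    template, a failed alignment means no output image is written,
    so a simple existence check detects the failure.
    """
    return os.path.exists(output_file)

# Hypothetical batch run (uncomment and adapt paths for your setup):
# import subprocess
# subprocess.run(["PTGui.exe", "-stitchnogui", "project.pts"])
# if not panorama_created("project.jpg"):
#     print("alignment failed: no panorama was written")
```

Remember to delete any stale output image from a previous run before stitching, so the check cannot see an old result.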
Hope this helps!
Kind regards,
Joost Nieuwenhuijse
www.ptgui.com