Dual LadyBugs


Roger Oberholtzer

Jan 16, 2026, 11:13:10 AM
to PTGui Support
Hi all,

I have some questions about control of camera alignment in PTGui. Let me start with a description of the images. 

I have a number of vehicle-based road measurement systems that have dual LadyBug 360 degree image cameras. The reason for having more than one is that they are on a large van that has lots of other transducers mounted. A customer has the requirement that no equipment should be seen in the images. In order to do this with one camera, it would have to be mounted like 10 meters above the vehicle. Obviously not possible. So we have two cameras placed so that each can collect a part of the 360 image. We then stitch these two images together. The resulting image has no equipment visible.

We do this by mounting one camera at the front of the vehicle and one at the back. The way they are pointed allows us to get the required images. The images from each camera are taken when the two cameras are at 'the same place'. To do this we use a wheel pulse transducer to track vehicle movement. This signal controls when the front camera should take an image, and when the back camera has moved the appropriate distance and should take its image. The goal is that the two images are taken at the same place, or as close to the same place as we can get. It's never going to be perfect.
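
A minimal sketch of that distance-based triggering, in Python (all calibration constants below are invented for illustration; the real encoder resolution, wheel circumference, and camera spacing come from the vehicle):

```python
# Distance-based trigger sketch. All constants are hypothetical.
PULSES_PER_REV = 1000         # wheel encoder pulses per revolution (assumed)
WHEEL_CIRCUMFERENCE_M = 1.95  # wheel circumference in metres (assumed)
CAMERA_SPACING_M = 6.0        # front-to-back camera spacing in metres (assumed)

def metres_travelled(pulse_count: int) -> float:
    """Convert a cumulative wheel-pulse count into metres travelled."""
    return pulse_count / PULSES_PER_REV * WHEEL_CIRCUMFERENCE_M

def back_camera_due(front_trigger_pulses: int, current_pulses: int) -> bool:
    """True once the vehicle has covered the camera spacing since the front shot."""
    moved = metres_travelled(current_pulses - front_trigger_pulses)
    return moved >= CAMERA_SPACING_M
```

With these made-up numbers the back camera would fire about 3077 pulses after the front one (6.0 / 1.95 revolutions).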

In addition to the images, we use a high quality INS (with post-processed data) to determine the orientation of each camera when the image was taken. We then anchor the front image and, using the difference between the orientations at the moments the images were taken, rotate the back image into place.
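
That "rotate by the orientation difference" step amounts to composing two rotations. A plain-Python sketch (the ZYX yaw-pitch-roll convention here is an assumption; the INS may define its angles differently):

```python
import math

def rot_zyx(yaw_deg, pitch_deg, roll_deg):
    """3x3 rotation matrix (nested lists) for ZYX yaw/pitch/roll in degrees."""
    y, p, r = (math.radians(a) for a in (yaw_deg, pitch_deg, roll_deg))
    cy, sy = math.cos(y), math.sin(y)
    cp, sp = math.cos(p), math.sin(p)
    cr, sr = math.cos(r), math.sin(r)
    return [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp,     cp * sr,                cp * cr],
    ]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def transpose(m):
    return [list(row) for row in zip(*m)]

def relative_rotation(front_ypr, back_ypr):
    """Rotation taking the back image into the (anchored) front image's frame:
    R_rel = R_front^T * R_back."""
    return matmul(transpose(rot_zyx(*front_ypr)), rot_zyx(*back_ypr))
```

When both cameras report the same orientation the relative rotation is the identity, as expected.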

This solution generally works very well. In fact it works better than I had expected. However, the stitched images can have small errors. For example, when a road marking crosses the place where the images are joined, that marking is sometimes not connected. Unfortunately, our customer feels that this is not acceptable, so we have been looking at methods to improve the joins. The solution also requires that the orientation of the cameras relative to the INS be precisely known, which is awkward to do, so we are looking at ways to improve that as well.

One solution is to use PTGui (in command line mode) to process pairs of images. We have had some success with this: there are improvements to the alignment, so we would like to pursue this approach. I have some questions:

1. In the paid PTGui Pro, is it the case that the .pts file is a JSON file that can be edited? Although we will be using a template file, there are some things we may wish to set in that template. Like the initial image orientation. For example, in one configuration the back camera image will have a yaw of -180 degrees relative to the front image.

2. Are there guidelines on how best to prepare the images when looking for control points? For example, is it better to rotate the images to get close first, and then let PTGui do control points? Or is it better to leave the images alone and tell PTGui how they are rotated in the project file? I don't need specific answers here/now. I'm just curious if there is useful information to help us do the best thing.

3. We think the main issue we have is parallax differences between the two images. We are never at the exact same location when the images are taken. And the LadyBug itself has optimized the joins of its cameras at some fixed distance. I would think that control points are used to address parallax. So even if we can align the images very closely, control points will still be needed?

4. Is it possible to set things so that one of the two images does not move (as in, the whole image is not rotated)? The reason for this is that we need the image orientation so we can then use the images to colorize high resolution LiDAR data (using TerraPhoto).

5. Is the mask applied so that there is a sharp break at the join? We have had improvements in the results by using a mask with a gradient that selects how much to take from each image. This fuzziness is subtle, but it makes some otherwise sharp disconnects less noticeable. Is there some flexibility related to this in PTGui?

6. If PTGui does not find adequate/useful control points, are there alternate options? Can this be detected in the created project file when run in command line mode?
 
Sorry if that's lots of questions! But I thought it was best to describe what we are trying to accomplish.

Thanks for any information, pointers or suggestions!

-- Roger Oberholtzer

PTGui Support

Jan 16, 2026, 1:15:20 PM
to pt...@googlegroups.com
Hi Roger,

On 1/16/26 12:28, Roger Oberholtzer wrote:
> 1. In the paid PTGui Pro, is it the case that the .pts file is a JSON
> file that can be edited? Although we will be using a template file,
> there are some things we may wish to set in that template. Like the
> initial image orientation. For example, in one configuration the back
> camera image will have a yaw of -180 degrees relative to the front image.

Yes, the project files are in JSON(*) format; you can modify them quite
easily. I have attached a sample project file.

(*) with one quirk: PTGui expects a certain ordering of some objects.
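
Editing such a project file programmatically could look roughly like this sketch. The key names "images" and "yaw" are placeholders, not the actual .pts schema; inspect a real project file for the true structure:

```python
import json

# Sketch: set an initial yaw on one image in a .pts project. The keys
# "images" and "yaw" are hypothetical -- check a real .pts file for the
# actual structure. Python dicts preserve insertion order, so a plain
# load/dump round trip keeps the object ordering PTGui expects.
def set_image_yaw(path_in, path_out, image_index, yaw_degrees):
    with open(path_in, "r", encoding="utf-8") as f:
        project = json.load(f)
    project["images"][image_index]["yaw"] = yaw_degrees  # placeholder keys
    with open(path_out, "w", encoding="utf-8") as f:
        json.dump(project, f, indent=2)
```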

>
> 2. Are there guidelines on how best to prepare the images when looking
> for control points? For example, is it better to rotate the images to
> get close first, and then let PTGui do control points? Or is it better
> to leave the images alone and tell PTGui how they are rotated in the
> project file? I don't need specific answers here/now. I'm just curious
> if there is useful information to help us do the best thing.

If you know the alignment of the images, pre-aligning can be helpful in
two ways: your alignment can be used as a fallback in case PTGui fails
to find control points. And PTGui can limit searching for control points
to pairs of images that it knows to be overlapping, which will speed up
the alignment process. But the control point generator will work just as
well with the source images in any random orientation.

>
> 3. We think the main issue we have is parallax differences between the
> two images. We are never at the exact same location when the images are
> taken. And the LadyBug itself has optimized the joins of its cameras at
> some fixed distance. I would think that control points are used to
> address parallax. So even if we can align the images very closely,
> control points will still be needed?

The issue is that images with parallax can never be stitched perfectly.
Keep in mind that even a single Ladybug will suffer from parallax, so I
think stitching errors will be unavoidable for your project.

Adding more control points in problem areas can somewhat improve the
result, but usually this will pull other parts of the panorama out of
alignment. My usual advice is to have lots of overlap, so the automatic
seam finder is able to hide any stitching errors. Place control points
on distant objects only for consistent alignment.
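
The "distant objects" advice follows from a quick back-of-the-envelope: two viewpoints a baseline apart see a point at some distance from directions differing by roughly atan(baseline / distance). A sketch:

```python
import math

def parallax_degrees(baseline_m: float, distance_m: float) -> float:
    """Angular disagreement between two viewpoints baseline_m apart
    when observing a point distance_m away."""
    return math.degrees(math.atan2(baseline_m, distance_m))
```

With an illustrative 10 cm residual baseline, a kerb 3 m away disagrees by about 1.9 degrees, while a building 100 m away disagrees by under 0.06 degrees, which is why distant control points give consistent alignment.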

> 4. Is it possible to set things so that one of the two images does not
> move (as in, the whole image is not rotated)? The reason for this is
> that we need the image orientation so we can then use the images to
> colorize high resolution LiDAR data (using TerraPhoto).

Yes, you can fix an image by not optimizing the yaw, pitch and roll of
that image.

> 5. Is the mask applied so that there is a sharp break at the join? We
> have had improvements in the results by using a mask with a gradient
> that selects how much to take from each image. This fuzziness is
> subtle, but it makes some otherwise sharp disconnects less noticeable.
> Is there some flexibility related to this in PTGui?

PTGui has two blenders. The Zero Overlap blender performs blending at
the seams, without using any overlap; this creates a somewhat sharp seam
at discontinuities, but it does not suffer from ghosting artifacts.
And there's the Multiband blender, which creates smoother overlaps but
will show ghosts of displaced objects.
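
For comparison, the gradient mask described in the question amounts to a linear alpha ramp across the join. A toy sketch on a single scanline of grey values (purely illustrative, not PTGui's implementation):

```python
def gradient_blend(front_row, back_row, feather_px):
    """Blend two overlapping scanlines with a linear alpha ramp of
    feather_px pixels centred on the middle of the row."""
    assert len(front_row) == len(back_row)
    width = len(front_row)
    out = []
    for x in range(width):
        # alpha ramps from 0 (all front) to 1 (all back) across the feather
        t = (x - (width - feather_px) / 2) / feather_px
        alpha = min(1.0, max(0.0, t))
        out.append((1 - alpha) * front_row[x] + alpha * back_row[x])
    return out
```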

> 6. If PTGui does not find adequate/useful control points, are there
> alternate options? Can this be detected in the created project file
> when run in command line mode?

There's a setting ("but only if enough control points found") in the
Project Settings. If this is enabled, PTGui will not stitch that
particular panorama after a failed alignment attempt. You could detect
this in your software because the panoramic image is not created.
If this checkbox is not checked, PTGui will continue stitching based on
the settings given in the template.
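
In a batch pipeline, that failure mode can be detected by checking for the output file after the run. A sketch (the executable name and invocation are placeholders; consult the PTGui documentation for the actual command-line syntax on your platform):

```python
import os
import subprocess

def stitch_and_check(ptgui_cmd, project_file, expected_output):
    """Run PTGui on a project file, then report whether the panorama
    appeared. With "but only if enough control points found" enabled,
    a missing output signals a failed alignment, and the caller can
    fall back to e.g. the INS-derived orientation.
    The invocation below is a placeholder, not PTGui's real syntax."""
    subprocess.run([ptgui_cmd, project_file], check=False)
    return os.path.exists(expected_output)
```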

Hope this helps!

Kind regards,

Joost Nieuwenhuijse
www.ptgui.com
Attachment: DSC00701 Panorama.pts