Application of u/w/v_direction parameters


Daniel Evans

May 15, 2024, 4:45:30 AM
to Ames Stereo Pipeline Support
Hi all,

I have been attempting to use the pipeline with some imagery data that are rotated and flipped compared to the pipeline's definitions.

The camera definition files allow the image orientation to be defined via the u/v/w_direction parameters, and the following configuration would work for my data:

u_direction = 0 -1 0
v_direction = -1 0 0
w_direction = 0 0 1

However, I have found that not all of the pipeline tools respect these parameters; bundle_adjust, for example, ignores them. I have had to either massage data when moving from one tool to another, depending on whether the u/v/w_direction parameters are respected, or pre-process the images and/or attitude data to apply the changes myself.

1. Are all tools expected to respect these parameters? If so, I'm happy to raise a bug report (although I don't have a comprehensive list of affected tools).
2. If not, is there anywhere in the documentation that lists which tools respect them and which don't? I've not found much discussion of these parameters, especially for non-default values.

Cheers,
Daniel

Oleg Alexandrov

May 15, 2024, 11:49:12 AM
to Daniel Evans, Ames Stereo Pipeline Support
These parameters turned out not to be that useful, and I plan to remove them from the format at some point. Sorry you ran into an inconsistency with them.

You should not need them, as what you need here is to modify the camera rotation to absorb the extra in-plane rotation. I think if you take your camera-to-world transform, which is what R in the camera file stores, and multiply it on the right by the above u/v/w matrix, it should have the same effect.
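A rough sketch of that multiplication (an illustration only, assuming NumPy; R below is just a placeholder for the camera-to-world rotation from your .tsai file, and whether the u/v/w vectors should form the rows or the columns of the matrix is worth double-checking against the file format):

```python
import numpy as np

# Directions from the camera file quoted above.
u = np.array([0, -1, 0])
v = np.array([-1, 0, 0])
w = np.array([0, 0, 1])
M = np.vstack([u, v, w])   # candidate u/v/w matrix (row convention assumed)

R = np.eye(3)              # placeholder for the R stored in the .tsai file

R_new = R @ M              # multiply on the right, then write R_new back as R
```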

Oleg Alexandrov

May 16, 2024, 12:35:44 PM
to Daniel Evans, Ames Stereo Pipeline Support
Yeah, the flip will be a problem.

I looked at our code, and it appears that there are many places where we manipulate or convert cameras, and ensuring that u_direction, etc., is respected would be quite some work. I think in the computer vision community it is always assumed there is no flip, and that a rotation matrix is always enough. So, wiping out u_direction is easier than making it respected everywhere.
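As a quick check (an illustration, not something from the camera file itself), the u/v/w directions quoted above form a matrix with determinant -1, i.e. a reflection, and no proper rotation (determinant +1) can reproduce that:

```python
import numpy as np

M = np.array([[ 0, -1, 0],   # u_direction
              [-1,  0, 0],   # v_direction
              [ 0,  0, 1]])  # w_direction

print(np.linalg.det(M))      # ~ -1.0: a reflection, so it cannot be folded into a rotation
```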

Are your images acquired with a mirror reflection? If so, modifying the images themselves could work.
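For example, a left-right flip of the pixels could be done along these lines (a sketch assuming rasterio; the file names are made up, and any matching changes to the georeferencing or the camera file are a separate step):

```python
import rasterio

# Read the image, mirror the columns of every band, and write a new file.
with rasterio.open("image.tiff") as src:
    data = src.read()                 # shape: (bands, rows, cols)
    profile = src.profile

with rasterio.open("image_fliplr.tiff", "w", **profile) as dst:
    dst.write(data[:, :, ::-1])       # flip left-right
```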

It also looks like the simplest bundle adjustment, without --inline-intrinsics or applying any transforms, should have no reason to mess with u_direction and would use it as is. So, if you validate with mapproject that the inputs to bundle_adjust are good, and later use your original cameras with the .adjust files produced by bundle_adjust, then hopefully it will still work.

Note that ASP supports the CSM Frame model, which can handle in-sensor transforms (flip, affine transform), but that would be its own learning curve.

Let me know if nothing works.


On Thu, May 16, 2024 at 8:48 AM Daniel Evans <daniel.f...@gmail.com> wrote:
Hi Oleg,

Thanks for the information that the parameters aren't expected to be consistently used.

The transformation I need to apply to my camera is a 90-degree rotation around the Z axis, and then a flip across the Y axis (scale X by -1). Performing the rotation in advance works fine, but including the flip causes tools such as mapproject to fail.

The output I get from mapproject feels a lot like it tried to propagate the camera to the ground to get the DEM intersection, but it instead ended up shooting rays out into space:
```
mapproject_single --query-projection ./example/20231130T085712000_visual_30_hotsat1_dem.tiff ./example/20231130T085712000_visual_30_hotsat1_fliplr.tiff ./example/20231130T085712000_visual_30_hotsat1_camera.tsai ./result/20231130T085712000_visual_30_hotsat1_initial_orthoutm.tiff --t_srs EPSG:32642 -t pinhole
        --> Setting number of processing threads to: 4
Using session: pinhole
Loading camera model: ./example/20231130T085712000_visual_30_hotsat1_fliplr.tiff ./example/20231130T085712000_visual_30_hotsat1_camera.tsai
Computed image to DEM bbox: Origin: (1.79769e+308, 1.79769e+308) width: 0 height: 0
Could not sample correctly the image.
Check your inputs. Or try specifying --t_projwin and --tr values.
```

Cheers,
Daniel

Alexandrov, Oleg (ARC-TI)[KBR Wyle Services, LLC]

May 16, 2024, 1:09:51 PM
to Oleg Alexandrov, Daniel Evans, Ames Stereo Pipeline Support
To add, I found a location in bundle_adjust where, if invoked with --inline-adjustments, the initial cameras were ignored and new cameras were made by copying the rotation. I put in a fix for that, and now I see u_direction being respected when set to a non-default value. You can try tomorrow's build (2024-05-16) and see if that fixes it for you. I could not find other such instances, but we also have logic where a pinhole camera does not exist to start with, and overall this is a rare enough case that other problems cannot be ruled out. Those would likely involve more complicated workflows than regular bundle adjustment, though, so hopefully the fix is enough for now.

Daniel Evans

May 17, 2024, 10:50:08 AM
to Oleg Alexandrov, Ames Stereo Pipeline Support
Hi Oleg,

Yes, modifying the images to handle the flip does work - it's the solution I've gone with so far, and actually a step we've had to apply with other software before. I mostly wanted to check that I hadn't missed a single-step solution!

Thanks for the help and advice.

Cheers,
Daniel

Michael Studinger

May 18, 2024, 10:13:21 AM
to Ames Stereo Pipeline Support

Thank you both, Daniel and Oleg, for this very informative discussion on a topic I encountered myself this week. I have started to use a new capability that Oleg recently developed and implemented in ASP (build date: 2024-05-16): the cam_gen tool can now use pitch, roll, and yaw data for airborne imagery (16.8.1.6. Geodetic coordinates and angles) to create the extrinsics for pinhole .tsai cameras. My team has flown two different configurations on two different aircraft, with different camera orientations due to space constraints in the nadir viewports:

  • "forward" installation: camera/image orientation: across track, top of image in direction of flight
  • "backward" installation: camera/image orientation: across track, bottom of image in direction of flight

I have previously used ortho2pinhole for creating extrinsics, which does not depend on the camera's orientation. That is not the case for cam_gen when yaw is used. A colleague of mine, who is processing the same data in Metashape, simply subtracted 180° from the yaw angles for the "backward" installation, and that works for ASP processing with cam_gen as well.
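In code form, that yaw fix is just the following (a sketch with a made-up value; the wrap into [0, 360) is my assumption about the angle convention):

```python
def backward_to_forward_yaw(yaw_deg: float) -> float:
    # Convert yaw recorded for the "backward" installation to the "forward"
    # convention by subtracting 180 degrees and wrapping into [0, 360).
    return (yaw_deg - 180.0) % 360.0

print(backward_to_forward_yaw(30.0))   # 210.0
```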

Thanks for sharing that the u/v/w_direction parameters are not the way to go for accounting for camera orientations. Good to know!

Thanks,

Michael
