Batch coregistering HiRISE *_RED.JP2/LBL files


mikita belikau

Aug 24, 2021, 4:47:13 PM
to Ames Stereo Pipeline Support
Hello!
I am trying to find a way to batch-coregister HiRISE RDR files (in particular, *_RED.JP2/LBL files).

Why RDR and not EDR? Because for the area(s) I am interested in there are no EDR.

My plan is to use something similar to what the ASAP wrapper implements: a combination of ipfind/ipmatch, then somehow apply the resulting transformation, wrapping it all with GDAL. From the threads in this group I see that pc_align could also be used somehow. That is the plan.

How I tried to achieve this:

Let's say I have three .LBL files that look like this:

[screenshot: map footprints of the three observations]

Here we can see that these three files overlap.

So what I did was:

  1. created a .cub file for each image with pds2isis
  2. called

ipfind ESP_042315_1985_RED.LBL.cub ESP_046060_1985_RED.LBL.cub ESP_062530_1985_RED.LBL.cub --normalize --debug-image 1 --ip-per-tile 50

The parameters and their values were taken from the im_feeling_lucky function of ASAP.

The output was the following:

Finding interest points in "ESP_042315_1985_RED.LBL.cub".
Read in nodata value: 0
Normalizing the input image...
Detected 52 raw keypoints!
Found 52 points.
Running sift descriptor generator.
Writing output file ESP_042315_1985_RED.LBL.vwip
Writing debug image: ESP_042315_1985_RED.LBL_debug.png with downsample: 0.0164025
Finding interest points in "ESP_046060_1985_RED.LBL.cub".
Read in nodata value: 0
Normalizing the input image...
Detected 41 raw keypoints!
Found 41 points.
Running sift descriptor generator.
Writing output file ESP_046060_1985_RED.LBL.vwip
Writing debug image: ESP_046060_1985_RED.LBL_debug.png with downsample: 0.0174001
Finding interest points in "ESP_062530_1985_RED.LBL.cub".
Read in nodata value: 0
Normalizing the input image...
Detected 13 raw keypoints!
Found 13 points.
Running sift descriptor generator.
Writing output file ESP_062530_1985_RED.LBL.vwip
Writing debug image: ESP_062530_1985_RED.LBL_debug.png with downsample: 0.0177379

What worries me at this step are these debug images. They look like this:

[three debug images; two of them are almost entirely black]

I believe this is the core problem behind the failure in the next step.

I do not know what this means, but these two black images look like a bad sign to me.

  3. called

ipmatch --debug-image --ransac-constraint homography ESP_042315_1985_RED.LBL.cub ESP_042315_1985_RED.LBL.vwip ESP_046060_1985_RED.LBL.cub ESP_046060_1985_RED.LBL.vwip ESP_062530_1985_RED.LBL.cub ESP_062530_1985_RED.LBL.vwip

The output was the following:

Matching between ESP_042315_1985_RED.LBL.cub (52 points) and ESP_046060_1985_RED.LBL.cub (41 points).
Using distance metric: L2
Matching:[****************************************************************] 100%Found 11 putative matches before duplicate removal.
Found 6 putative matches.
RANSAC Error. Number of requested inliers is less than min number of elements needed for fit. (3/4)

Attempting RANSAC with 2 of output inliers.
RANSAC Error. Number of requested inliers is less than min number of elements needed for fit. (2/4)

RANSAC Failed: RANSAC was unable to find a fit that matched the supplied data.
Matching between ESP_042315_1985_RED.LBL.cub (52 points) and ESP_062530_1985_RED.LBL.cub (13 points).
Using distance metric: L2
Matching:[****************************************************************] 100%Found 4 putative matches before duplicate removal.
Found 2 putative matches.
RANSAC Error. Not enough potential matches for this fitting functor. (2/4)

RANSAC Failed: RANSAC was unable to find a fit that matched the supplied data.
Matching between ESP_046060_1985_RED.LBL.cub (41 points) and ESP_062530_1985_RED.LBL.cub (13 points).
Using distance metric: L2
Matching:[***************************************************************.] 100%Found 4 putative matches before duplicate removal.
Found 3 putative matches.
RANSAC Error. Not enough potential matches for this fitting functor. (3/4)

RANSAC Failed: RANSAC was unable to find a fit that matched the supplied data.

So, it is complaining about the low number of detected common points.
How can I fix this?

In general, ASAP implements the idea that was discussed here: https://groups.google.com/g/ames-stereo-pipeline-support/c/NVrAIQy3Gv0. A similar approach was also described here: https://github.com/NeoGeographyToolkit/StereoPipeline/issues/265

I suspect that my problem is that for some reason these cubes from RDR files are not good enough for ipfind.

Oleg Alexandrov

Aug 24, 2021, 4:54:14 PM
to mikita belikau, Ames Stereo Pipeline Support
I don't know why you get so few interest point matches. Maybe there are stray pixels somewhere that confuse this tool. You can try:

 ipfind with --ip-per-tile 200

 ipfind --interest-operator obalog --descriptor-generator sgrad

Also one of --normalize or --per-tile-normalize.

I would start with the normalization options, then try obalog, and then increase --ip-per-tile (that should not be necessary given how big the images are, but it may be tried if nothing else works).
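For example, something along these lines (using the file names from your message; an untested sketch):

ipfind ESP_042315_1985_RED.LBL.cub ESP_046060_1985_RED.LBL.cub ESP_062530_1985_RED.LBL.cub --per-tile-normalize --ip-per-tile 50
ipfind ESP_042315_1985_RED.LBL.cub ESP_046060_1985_RED.LBL.cub ESP_062530_1985_RED.LBL.cub --normalize --ip-per-tile 200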



mikita belikau

Aug 24, 2021, 5:00:45 PM
to Ames Stereo Pipeline Support
Oleg, thanks for your response!
I was hoping to see a reply from you, so now I am quite positive about finding a solution :-)
I already tried playing with --ip-per-tile: going from 50 to 100 gave the same result, so I believe that is not the issue.
Anyway, I will try the combinations you mentioned and post the outcome here.

Beyer, Ross A. (ARC-SST)[SETI INSTITUTE]

Aug 24, 2021, 5:14:37 PM
to mikita belikau, Ames Stereo Pipeline Support
Mikita,

I’m hopeful that you’ll find a direct solution with the RDR data that you’re working with, but this isn't true:

> Why RDR and not EDR? Because for the area(s) I am interested in there are no EDR

The RDR data are the "reduced" data records, and these are never released by HiRISE unless the source "experiment" data records have also been released. Fundamentally, an RDR cannot exist without EDRs existing first.

For example, based on your screen shot, I can go to this URL:

https://hirise.lpl.arizona.edu/ESP_062530_1985

And scroll down to the “EDR Products” link: https://hirise-pds.lpl.arizona.edu/PDS/EDR/ESP/ORB_062500_062599/ESP_062530_1985/

Where I could download all of the EDRs that constitute this observation.

That being said, working with a derived RDR rather than attempting to assemble the EDRs yourself may be easier.


Ross

http://RossBeyer.net/science/

mikita belikau

Aug 24, 2021, 5:53:27 PM
to Ames Stereo Pipeline Support
Ross, thanks a lot for your reply!
Yes, I just figured out how to get the EDRs if I know the RDR.
I just discovered that the map search in Mars ODE does not support coordinate-based search for EDRs (which from one point of view is logical, because EDRs have no map information). This is quite misleading and confusing, because for RDRs it does support this.
Anyway, now I know how to get the EDRs for a particular area, but I still want to see how far I can go with RDRs directly: if RDRs are good enough (I hope they are) for ipfind/ipmatch, then I think they are the preferable input, not EDRs.

Thanks a lot.

mikita belikau

Aug 25, 2021, 5:56:38 PM
to Ames Stereo Pipeline Support
Oleg, as you suggested, I tried playing with the parameters, and I decided to start with --per-tile-normalize.
I called

ipfind ESP_042315_1985_RED.LBL.cub ESP_046060_1985_RED.LBL.cub ESP_062530_1985_RED.LBL.cub --debug-image 1 --ip-per-tile 10 --per-tile-normalize --nodata-radius 3

which gave me the following output:

Finding interest points in "ESP_042315_1985_RED.LBL.cub".
Read in nodata value: 0
Detected 16866 raw keypoints!
Found 16729 points.
Running sift descriptor generator.
Writing output file ESP_042315_1985_RED.LBL.vwip
Writing debug image: ESP_042315_1985_RED.LBL_debug.png with downsample: 0.0164025
Finding interest points in "ESP_046060_1985_RED.LBL.cub".
Read in nodata value: 0
Detected 14353 raw keypoints!
Found 14239 points.
Running sift descriptor generator.
Writing output file ESP_046060_1985_RED.LBL.vwip
Writing debug image: ESP_046060_1985_RED.LBL_debug.png with downsample: 0.0174001
Finding interest points in "ESP_062530_1985_RED.LBL.cub".
Read in nodata value: 0
Detected 12673 raw keypoints!
Found 12500 points.
Running sift descriptor generator.
Writing output file ESP_062530_1985_RED.LBL.vwip
Writing debug image: ESP_062530_1985_RED.LBL_debug.png with downsample: 0.0177379


Much better!

Now I have 16729, 14239, and 12500 points. For now I set --ip-per-tile 10, so maybe when I set it to 100-200 the results will be even better. I need to check this.

So now the debug images look like this:

ESP_042315_1985_RED.LBL_debug.png

ESP_046060_1985_RED.LBL_debug.png

And

ESP_062530_1985_RED.LBL_debug.png

I am a bit confused that nearly the whole area of each image falls within the radii of the detected circles. Is this OK? Anyway, with these results I then ran

ipmatch --debug-image --ransac-constraint homography ESP_042315_1985_RED.LBL.cub ESP_042315_1985_RED.LBL.vwip ESP_046060_1985_RED.LBL.cub ESP_046060_1985_RED.LBL.vwip ESP_062530_1985_RED.LBL.cub ESP_062530_1985_RED.LBL.vwip

which gives me the following output:

Matching between ESP_042315_1985_RED.LBL.cub (16729 points) and ESP_046060_1985_RED.LBL.cub (14239 points).
Using distance metric: L2
Matching:[***************************************************************.] 99%Found 436 putative matches before duplicate removal.
Found 368 putative matches.
RANSAC was unable to find a fit that matched the supplied data.
Attempting RANSAC with 122 of output inliers.
RANSAC was unable to find a fit that matched the supplied data.
Attempting RANSAC with 81 of output inliers.
--> Homography: Matrix3x3((0.998078,0.000185939,-16772.8)(-0.00265371,1.00605,-6403.56)(-6.47686e-08,8.11789e-08,1))
Found 103 final matches.
Writing match file: ESP_042315_1985_RED.LBL__ESP_046060_1985_RED.LBL.match
Writing debug image: ESP_042315_1985_RED.LBL__ESP_046060_1985_RED.LBL.tif
Writing Debug:[******************************************************] Complete!
Matching between ESP_042315_1985_RED.LBL.cub (16729 points) and ESP_062530_1985_RED.LBL.cub (12500 points).
Using distance metric: L2
Matching:[***************************************************************.] 99%Found 173 putative matches before duplicate removal.
Found 160 putative matches.
RANSAC was unable to find a fit that matched the supplied data.
Attempting RANSAC with 53 of output inliers.
RANSAC was unable to find a fit that matched the supplied data.
Attempting RANSAC with 35 of output inliers.
RANSAC was unable to find a fit that matched the supplied data.
Attempting RANSAC with 23 of output inliers.
RANSAC was unable to find a fit that matched the supplied data.
Attempting RANSAC with 15 of output inliers.
RANSAC was unable to find a fit that matched the supplied data.
Attempting RANSAC with 10 of output inliers.
RANSAC was unable to find a fit that matched the supplied data.
Attempting RANSAC with 6 of output inliers.
RANSAC was unable to find a fit that matched the supplied data.
Attempting RANSAC with 4 of output inliers.
--> Homography: Matrix3x3((-2.4393,0.182904,15666)(-2.86632,0.560033,11411.8)(-0.000150837,6.62159e-06,1))
Found 4 final matches.
Writing match file: ESP_042315_1985_RED.LBL__ESP_062530_1985_RED.LBL.match
Writing debug image: ESP_042315_1985_RED.LBL__ESP_062530_1985_RED.LBL.tif
Writing Debug:[******************************************************] Complete!
Matching between ESP_046060_1985_RED.LBL.cub (14239 points) and ESP_062530_1985_RED.LBL.cub (12500 points).
Using distance metric: L2
Matching:[***************************************************************.] 99%Found 125 putative matches before duplicate removal.
Found 114 putative matches.
RANSAC was unable to find a fit that matched the supplied data.
Attempting RANSAC with 38 of output inliers.
RANSAC was unable to find a fit that matched the supplied data.
Attempting RANSAC with 25 of output inliers.
RANSAC was unable to find a fit that matched the supplied data.
Attempting RANSAC with 16 of output inliers.
RANSAC was unable to find a fit that matched the supplied data.
Attempting RANSAC with 10 of output inliers.
RANSAC was unable to find a fit that matched the supplied data.
Attempting RANSAC with 6 of output inliers.
RANSAC was unable to find a fit that matched the supplied data.
Attempting RANSAC with 4 of output inliers.
--> Homography: Matrix3x3((-0.208809,0.00970083,10261.9)(1.35583,-0.409195,16893.1)(1.46242e-05,-1.89501e-05,1))
Found 4 final matches.
Writing match file: ESP_046060_1985_RED.LBL__ESP_062530_1985_RED.LBL.match
Writing debug image: ESP_046060_1985_RED.LBL__ESP_062530_1985_RED.LBL.tif
Writing Debug:[******************************************************] Complete!


I am just a bit confused that out of these more than 10,000 points the final selection is "Found 4 final matches." But I think that if I increase the points-per-tile value from 10 to 200 the results will be much better. I will try this. A question, though: is there any "average" value for the number of matched points? I understand that the more it detects the better the result will be, but is there a minimal number of points that is considered OK-ish?

Next I need to figure out how to use these match files in the following steps to coregister the images. I do not want to create a mosaic, just to change the position of each image. I assume that somehow GDAL's gdalwarp should be used, or pc_align. Will try.

Alexandrov, Oleg (ARC-TI)[KBR Wyle Services, LLC]

Aug 25, 2021, 6:10:23 PM
to mikita belikau, Ames Stereo Pipeline Support
Glad you are making progress.

> I am a bit confused that nearly whole area of all images are in radiuses of detected circle. Is this Ok? 

I don't know what convention those debug images use. 

> Found 4 final matches.

> I am just a bit confused that out of these more than 10,000 points the final selection is "Found 4 final matches." But I think that if I increase the points-per-tile value from 10 to 200 the results will be much better. I will try this. A question, though: is there any "average" value for the number of matched points? I understand that the more it detects the better the result will be, but is there a minimal number of points that is considered OK-ish?

It is hard to say how many points you need. It depends on what you use the matches for. For determining a homography matrix, a dozen are likely enough. For estimating the disparity all over the image you may want a huge number of them. For registration a dozen may do as well; maybe 50 will be enough if you want a margin.

I don't know why, out of the large number of input candidate interest points, you were left with so few. That may be a symptom of the left and right images being dissimilar, for example when there is different illumination, a different perspective, etc. For now you can just increase --ip-per-tile and hope for the best.

You should visualize the found matches and see if they make sense. The stereo_gui program can be used for that (see the doc).

Note that bundle_adjust can be used instead of ipfind/ipmatch, and it will do additional outlier filtering based on the cameras, which ipfind and ipmatch cannot do. You don't need to use the results of bundle adjustment, but you can use the *clean.match file it will write.
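Roughly (a sketch; the output prefix ba/run is illustrative, and this assumes the cubes have usable camera information):

bundle_adjust ESP_042315_1985_RED.LBL.cub ESP_062530_1985_RED.LBL.cub -o ba/run
stereo_gui ESP_042315_1985_RED.LBL.cub ESP_062530_1985_RED.LBL.cub ba/run

The second command should let you inspect the match files written with that prefix on top of the images.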


mikita belikau

Aug 25, 2021, 6:22:26 PM
to Ames Stereo Pipeline Support

Oleg, again, thanks for your reply. I will try to follow your advice.

mikita belikau

Aug 28, 2021, 12:43:06 PM
to Ames Stereo Pipeline Support
So, now I can run the ipfind/ipmatch commands and I have .match files.

mikita belikau

Aug 28, 2021, 12:57:03 PM
to Ames Stereo Pipeline Support
Now I am trying to run pc_align.

Again, my final goal is to have 'aligned' RDR files, which in this case are JP2 files with their corresponding .LBL files.
This is how I try to do it - I run

pc_align --num-iterations 0 --max-displacement 300 --match-file ESP_042315_1985_RED.LBL__ESP_062530_1985_RED.LBL.match ESP_042315_1985_RED.LBL.cub ESP_062530_1985_RED.LBL.cub --save-transformed-source-point -o aligning_result

Here I use the cubes that were created from the corresponding .LBL files.

This gives me the following output:

--> Setting number of processing threads to: 4
Writing log info to: aligning_result-log-pc_align-08-28-1342-38379.txt
Detected datum from ESP_042315_1985_RED.LBL.cub:
Geodetic Datum --> Name: D_Mars  Spheroid: Mars_localRadius  Semi-major axis: 3394839.8133163  Semi-minor axis: 3394839.8133163  Meridian: Reference_Meridian at 0  Proj4 Str: +proj=eqc +lat_ts=15 +lat_0=0 +lon_0=180 +x_0=0 +y_0=0 +a=3394839.8133163 +b=3394839.8133163 +units=m +no_defs
Will use datum (for CSV files): Geodetic Datum --> Name: D_Mars  Spheroid: Mars_localRadius  Semi-major axis: 3394839.8133163  Semi-minor axis: 3394839.8133163  Meridian: Reference_Meridian at 0  Proj4 Str: +proj=eqc +lat_ts=15 +lat_0=0 +lon_0=180 +x_0=0 +y_0=0 +a=3394839.8133163 +b=3394839.8133163 +units=m +no_defs
Reading match file: ESP_042315_1985_RED.LBL__ESP_062530_1985_RED.LBL.match
Transform computed from source to reference using a match file:
    1.02429  -0.0359572  -0.0079876      104426
  0.0359796     1.02432  0.00276293     -104659
 0.00788572 -0.00304155     1.02492    -22769.4
          0           0           0           1
Computing the intersection of the bounding boxes of the reference and source points using 9000000 sample points.
Reference box: (Origin: (77.2973, 18.325) width: 0.164487 height: 0.269035)
Source box:    (Origin: (77.2273, 18.3453) width: 0.142319 height: 0.269579)
Intersection reference box:  (Origin: (77.2973, 18.3421) width: 0.0741431 height: 0.252007)
Intersection source    box:  (Origin: (77.259, 18.3453) width: 0.110638 height: 0.245349)
Intersection of bounding boxes took 81.7773 [s]
Reading: ESP_042315_1985_RED.LBL.cub
        --> [********************************************************] Complete!
Loaded points: 77158839
Loading the reference point cloud took 64.8074 [s]
Reading: ESP_062530_1985_RED.LBL.cub
        --> [********************************************************] Complete!
Loaded points: 41173266
Loading the source point cloud took 50.6867 [s]
Data shifted internally by subtracting: Vector3(705612,3.14231e+06,1.0757e+06)
Loading reference as DEM.
Building the reference cloud tree.
Reference point cloud processing took 198.786 [s]
Filtering gross outliers
Filtering gross outliers took 933.751 [s]
Reducing number of source points to 100000
Number of errors: 100000
Input: error percentile of smallest errors (meters): 16%: 2.11771, 50%: 5.21519, 84%: 44.1612
Input: mean of smallest errors (meters): 25%: 1.80614, 50%: 2.80052, 75%: 4.98651, 100%: 31.7621
Initial error computation took 17.4019 [s]
Alignment took 0.001482 [s]
Number of errors: 100000
Output: error percentile of smallest errors (meters): 16%: 2.11771, 50%: 5.21519, 84%: 44.1612
Output: mean of smallest errors (meters): 25%: 1.80614, 50%: 2.80052, 75%: 4.98651, 100%: 31.7621
Final error computation took 6.60767 [s]
Alignment transform (origin is planet center):
    1.02429  -0.0359572  -0.0079876      104426
  0.0359796     1.02432  0.00276293     -104659
 0.00788572 -0.00304155     1.02492    -22769.4
          0           0           0           1
Centroid of source points (Cartesian, meters): Vector3(706355,3.14204e+06,1.0755e+06)
Centroid of source points (lat,lon,z): Vector3(18.4673,77.3301,458.807)
Translation vector (Cartesian, meters): Vector3(17.0689,144.854,47.2951)
Translation vector (North-East-Down, meters): Vector3(-1.09344,15.1181,-152.582)
Translation vector magnitude (meters): 153.333
Maximum displacement of points between the source cloud with any initial transform applied to it and the source cloud after alignment to the reference: 0 m
Translation vector (lat,lon,z): Vector3(-1.84511e-05,0.000268957,152.582)
Transform scale - 1 = 0.0249564
Euler angles (degrees): Vector3(-0.17003,-0.440822,2.01176)
Euler angles (North-East-Down, degrees): Vector3(2.05584,0.064431,-0.191486)
Axis of rotation and angle (degrees): Vector3(-0.0785498,-0.214807,0.973493) 2.06586
Writing: aligning_result-transform.txt
Writing: aligning_result-inverse-transform.txt
Writing: aligning_result-trans_source.tif
        --> [********************************************************] Complete!
Writing: aligning_result-beg_errors.csv
Writing: aligning_result-end_errors.csv
Writing: aligning_result-iterationInfo.csv
Saving to disk took 1744.51 [s]


From the output messages it looks like it is OK.
pc_align created the file aligning_result-trans_source.tif, which is 22 GB (I have only 16 GB RAM)!!! I tried to open it in QGIS, obviously without success - I mean, QGIS didn't show any error, but after a couple of hours it was still trying to open it. So I cannot check visually what the result of pc_align looks like.

I have a couple of questions about this:

1) Why is this output tif of pc_align so big - 22 GB?
2) Since my final goal is to have aligned JP2 (LBL) files - which I use as input, converting them to cubes as a first step - at the end I want to have aligned JP2 (LBL) files. The question is: how can I do this?
I was trying to find out whether pc_align has options for converting its output to JP2 (LBL), or at least to GDAL .VRT files (which could then be converted to JP2), but I didn't find this info. So, is there a way to specify this conversion somehow?
3) From what I see in other threads, people also use point2dem to process the result of pc_align, but I am not sure I need this if pc_align already produces an output file for me. I just need to find a way to make this file relatively small (e.g., convert it to .VRT) and then convert it to JP2 (LBL). How can I do this?

mikita belikau

Aug 28, 2021, 1:06:08 PM
to Ames Stereo Pipeline Support
4) What are the units of the --max-displacement parameter of pc_align? Meters, pixels, etc.? I used the value 300, because in the documentation I saw that this value can be big but should be reasonable, while in other threads I saw -1.

Oleg Alexandrov

Aug 28, 2021, 2:38:19 PM
to mikita belikau, Ames Stereo Pipeline Support
> pc_align --num-iterations 0 --max-displacement 300 --match-file ESP_042315_1985_RED.LBL__ESP_062530_1985_RED.LBL.match ESP_042315_1985_RED.LBL.cub ESP_062530_1985_RED.LBL.cub --save-transformed-source-point -o aligning_result

This is a bit of an experimental approach. pc_align is used to align point clouds. The match file is meant to give it a helping hand. The tool was not designed to align images, though I guess it may produce something if an image value is interpreted as a DEM height.

> 1) Why is this output tif of pc_align so big - 22 GB?

Because for every pixel it writes 3 doubles (the x, y, z coordinates) to disk. This tool is meant to align point clouds.
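As a rough sanity check, assuming the RDR is on the order of a billion pixels (typical for a full-resolution HiRISE RED product):

10^9 pixels x 3 doubles x 8 bytes = 24 GB

which is consistent with the 22 GB you are seeing.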
 
> 2) Since my final goal is to have aligned JP2 (LBL) files - which I use as input, converting them to cubes as a first step - at the end I want to have aligned JP2 (LBL) files. The question is: how can I do this?

> 3) From what I see in other threads, people also use point2dem to process the result of pc_align, but I am not sure I need this if pc_align already produces an output file for me. I just need to find a way to make this file relatively small (e.g., convert it to .VRT) and then convert it to JP2 (LBL). How can I do this?

Yeah, the tool is not meant to be used that way. When you asked about computing matches I did not know you wanted to go down that path. I guess you can run point2dem on that cloud and you may get some kind of quasi-image at the end.
  
A sensible way of using the ASP tools is to bundle adjust a stereo pair, create a DEM, and mapproject those stereo images onto the DEM. Or do some alignment first and then mapproject with the aligned cameras onto some other DEM.
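In outline, the first workflow would be something like this (a sketch; the names and prefixes are illustrative):

bundle_adjust left.cub right.cub -o ba/run
stereo left.cub right.cub stereo/run --bundle-adjust-prefix ba/run
point2dem stereo/run-PC.tif
mapproject stereo/run-DEM.tif left.cub left.map.tif --bundle-adjust-prefix ba/run
mapproject stereo/run-DEM.tif right.cub right.map.tif --bundle-adjust-prefix ba/run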

I don't know of any toolset which can be used for large-scale image alignment. Maybe some folks have experience with that.

Sorry if I somehow encouraged you to go down a path which maybe is not the most productive one.

Oleg 

mikita belikau

Aug 28, 2021, 3:00:11 PM
to Ames Stereo Pipeline Support

Oleg, thanks for your reply.
I know that there is no tool that can do this, so it's OK to experiment and to try some tools that were not originally designed for this :-)
OK, so if pc_align cannot do what I want, then I will not experiment with it.
The thing is that I do not need stereo images - I want the same 2D images, just coregistered to some base image. That's all I want.
From what I gathered, one possible way to do what I need is to use an affine transformation based on the info that ipfind/ipmatch produce.
This thread https://groups.google.com/g/ames-stereo-pipeline-support/c/NVrAIQy3Gv0/m/u9UCzJkPBgAJ has this idea, and it is even implemented in the ASAP wrapper. I just wanted to try something easier, and I was wondering whether pc_align could do what I need (even though it was mentioned here in another context). If not, then I will go with this affine transformation and then use gdalwarp in the final conversion step. Probably that's the only way to do this for now.

mikita belikau

Aug 28, 2021, 3:04:14 PM
to Ames Stereo Pipeline Support
And, BTW, I was also looking at bundle_adjust - it is often mentioned in threads about questions similar to mine. Can this tool somehow be useful here?


Oleg Alexandrov

Aug 28, 2021, 4:32:21 PM
to mikita belikau, Ames Stereo Pipeline Support
Bundle adjustment can be used to make the cameras consistent with each other. Then, in combination with pc_align and stereo, one can make the cameras consistent with the ground. After that, the mapprojected images will be aligned to the ground and to each other. I don't think that tool can do anything about the original input images.


mikita belikau

Aug 28, 2021, 4:57:14 PM
to Ames Stereo Pipeline Support
OK, got it. Thanks.
Then I will go with the affine transformation.

Oleg Alexandrov

Sep 1, 2021, 11:50:13 PM
to Ames Stereo Pipeline Support
> Then I will go with the affine transformation

For the record, if stereo is used with a left and a right image and with the options

   --alignment-method homography --part-of-multiview-run

it will keep the left image fixed, precisely the input you provided, with no cropping (that is what --part-of-multiview-run is for; long story as to why this option exists), and it will determine, based on interest point matches, a homography transform which maps the second image onto the first, and apply that transform. The results will be output-prefix-L.tif (the same as the left input image) and the right image aligned to the left, named output-prefix-R.tif.
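A minimal sketch, assuming that running just the preprocessing stage (stereo_pprc) is enough to produce the aligned pair:

stereo_pprc ESP_042315_1985_RED.LBL.cub ESP_062530_1985_RED.LBL.cub run/run --alignment-method homography --part-of-multiview-run

Then run/run-L.tif is the unchanged left image and run/run-R.tif is the right image warped onto it.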

Hence, ASP does have some logic for applying a transform to an image for the purpose of registration using interest points, which is what you are trying to achieve. 

Not sure how much this observation will help with your goal. 

mikita belikau

Sep 3, 2021, 2:29:00 PM
to Ames Stereo Pipeline Support
Ohhhh... Oleg, this sounds like exactly what I need!
Great!
I will try to follow your suggestion for sure, even though I am now at the final stage of implementing the affine-transformation approach.
I will see what works better (if it works at all) for me.

mikita belikau

Sep 5, 2021, 5:15:29 PM
to Ames Stereo Pipeline Support
So, I managed to complete this process with the affine transformation.
Here's what I can say right now:

1) The alignment of the images after this affine transformation is definitely better than without it, but still not as good as I had imagined.
E.g., the alignment after the affine transformation looks like this:

Screen Shot 2021-09-05 at 22.56.23.png


Without the alignment it looks like this:

Screen Shot 2021-09-05 at 23.12.15.png

2) The whole process, end to end, is extremely slow. Even though I have an OK-ish configuration for this kind of work (16 GB RAM, 3.1 GHz dual-core Intel Core i7), it is very slow, and I am not sure that adding more hardware resources will increase the speed.
For now the slowest parts are (ordered from slowest to fastest):

- ipfind
- gdalwarp
- ipmatch
- cube generation with pds2isis

Can we somehow speed up the process? I know that for gdalwarp this is probably not possible currently, but what about ipfind and ipmatch?
What about a GPU? Can ASP use one to speed up the process?

3) Currently I run ipfind like this (a snippet from my Python script):

import os

# cubesFilesList is a space-separated string of the input .cub files
ipfindCommand = 'ipfind {0} --debug-image 1 --ip-per-tile 500 --per-tile-normalize --nodata-radius 5'.format(cubesFilesList)
os.system(ipfindCommand)

And for one .cub file it didn't manage to find any points at all. What could be the problem here? It was able to find points when --ip-per-tile was equal to 250.
I will double-check this, but it looks a bit strange.

Oleg, BTW, when I tried to use --interest-operator obalog together with the per-tile options (I do not remember exactly whether it was --ip-per-tile or --per-tile-normalize), it complained that this is not possible. So either the per-tile options or the interest operator can be used, but not both.

I will play with this a bit more and give an update with more details.

Oleg Alexandrov

Sep 5, 2021, 11:13:08 PM
to mikita belikau, Ames Stereo Pipeline Support
Alignment is a hard problem. The left and right images may be taken from different perspectives, and then a homography transform is better than an affine one, but even that may not be enough, as things on the ground change in a nonlinear way depending on the height. A tool which can do such things is Hugin, but it may not work at the full resolution and high dynamic range of HiRISE, or may not account for a drastic perspective change, I don't know.

Your best bet may be aligning the images to the ground and to each other using bundle adjustment and pc_align, followed by mapprojection (also called "orthorectification"), which is aware of the ground changes and the perspective difference, which your own affine transform is not.

> Can we somehow speed up the process? I know that for gdalwarp this is probably not possible currently, but what about ipfind and ipmatch?

By using fewer ip, maybe?
 
> What about a GPU? Can ASP use one to speed up the process?

Nope. ASP does not and likely never will support GPUs. If you use mapprojection, however, it can be run as multiple processes on many machines in parallel.
 
> And for one .cub file it didn't manage to find any points at all. What could be the problem here? It was able to find points when --ip-per-tile was equal to 250.

Maybe illumination conditions are different? I don't know. Try to use bundle_adjust with those two images instead of ipfind and ipmatch, and see what match file you get. 
 
> Oleg, BTW, when I tried to use --interest-operator obalog together with the per-tile options (I do not remember exactly whether it was --ip-per-tile or --per-tile-normalize), it complained that this is not possible. So either the per-tile options or the interest operator can be used, but not both.

--interest-operator obalog needs --descriptor-generator sgrad (or sgrad2, patch, or pca). I will make this clearer in the doc. I think the per-tile switch is an unrelated thing.
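So something like this combination should be accepted (an untested sketch):

ipfind ESP_042315_1985_RED.LBL.cub ESP_046060_1985_RED.LBL.cub --interest-operator obalog --descriptor-generator sgrad --normalize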
 

mikita belikau

Sep 6, 2021, 2:23:48 PM
to Ames Stereo Pipeline Support
Oleg, thanks for your reply.
Indeed, all the images that I use will have different perspectives, illumination, and other important parameters, and for sure this will complicate, I think, every step in this pipeline.

OK, so now I want to try what you suggested: "bundle adjustment and pc_align followed by mapprojection".
I have a few questions about this:

0) Can you please have a look at my current "affine coregistering" algo? It looks like this (a GDAL sketch of steps 4-6 follows the list):

1. get matches for image pairs
2. convert each match file to CSV in the following way:

x1 y1 x2 y2
25019.5703125 8066.5439453125 8122.2998046875 1786.2694091796875
25449.59765625 8149.630859375 8559.6552734375 1870.1402587890625
25547.470703125 8059.01171875 8655.9140625 1778.660888671875
25467.24609375 8146.11572265625 8577.169921875 1864.910888671875

Here x1, y1 are pixel coordinates in the left (reference) image, and x2, y2 are pixel coordinates in the right (source) image.

3. for the left (reference) image, get the affine transform from GDAL that converts pixels (x, y) to lat/lon
4. for every x2, y2 create a GCP using the corresponding x1, y1: lat = a * x1 + b * y1 + c, lon = d * x1 + e * y1 + f
5. use gdal_translate to add the GCPs from step 4 to the source image
6. use gdalwarp on the source image from step 5, with the reference (left) image's spatial reference as the target spatial reference
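A hypothetical sketch of steps 4-6 with the GDAL command line (the GCP pixel and lon/lat values are illustrative, not taken from my actual match files; ref_srs.wkt is assumed to hold the reference image's SRS):

gdal_translate -a_srs ref_srs.wkt -gcp 8122.30 1786.27 77.31 18.35 -gcp 8559.66 1870.14 77.32 18.36 -gcp 8655.91 1778.66 77.33 18.34 source.JP2 source_with_gcps.tif
gdalwarp -t_srs ref_srs.wkt source_with_gcps.tif source_aligned.tif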

That's how I implemented it. Do you see any missing/incorrect steps here?

1) So, the sequence should be:

bundle adjustment
pc_align
mapprojection

No ipfind/ipmatch, right? As I understand it, ipfind/ipmatch are effectively performed somewhere inside this bundle adjustment/pc_align/mapprojection sequence. Is this correct?

2) I am not quite sure what input I need for this bundle adjustment/pc_align/mapprojection combination.
My goal is to stick to the original .LBL (.JP2) RDR files. I see in many places in the documentation that these commands assume using a DEM at some step. That's not really what I want, because I am not sure I have a DEM for the required area.
So, the question is: can this sequence of commands work with .LBL (.JP2) RDR files, without a DEM?

Oleg Alexandrov

Sep 6, 2021, 3:14:45 PM
to mikita belikau, Ames Stereo Pipeline Support
Your algorithm looks reasonable enough at first glance. There can be outliers to watch for and filter out. 

The steps I suggested cannot work without a DEM. Ideally one would do bundle adjustment, do stereo to get a DEM, align that DEM to a preexisting bigger DEM, if needed, with pc_align, align the cameras as well, with bundle adjustment with --initial-transform and zero iterations, then mapproject onto that DEM using the new adjusted cameras.
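A hedged sketch of that sequence (names and prefixes illustrative; the camera-alignment step follows the pc_align documentation):

bundle_adjust left.cub right.cub -o ba/run
stereo left.cub right.cub stereo/run --bundle-adjust-prefix ba/run
point2dem stereo/run-PC.tif
pc_align --max-displacement 300 ref_dem.tif stereo/run-DEM.tif --save-transformed-source-points -o align/run
point2dem align/run-trans_source.tif
bundle_adjust left.cub right.cub --input-adjustments-prefix ba/run --initial-transform align/run-transform.txt --num-iterations 0 -o ba_align/run
mapproject align/run-trans_source-DEM.tif left.cub left.map.tif --bundle-adjust-prefix ba_align/run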

I know you want pure image alignment, without orthorectification, but then, as before, you cannot correct the distortion due to perspective and the ground.

I never did large-scale mosaicking myself. Maybe other folks know better.



mikita belikau

Sep 6, 2021, 3:27:10 PM
to Ames Stereo Pipeline Support
OK, got it. So, I need a DEM.
Let me just break down what you wrote.

"one would do bundle adjustment, do stereo to get a DEM"
So this means I can create my own DEM for my images using the stereo command, passing it the cubes derived from the LBLs, right?

"align that DEM to a preexisting bigger DEM"
That's the biggest question for me. Where do I get this? Should it be a global DEM for the whole of Mars, or for some big region of Mars? If so, where can I find it? I am not sure it exists, and I don't know what to do if there is no big DEM for the area I am interested in.

"if needed, with pc_align, align the cameras as well, with bundle adjustment with --initial transform and zero iterations, then mapproject onto that DEM using the new adjusted cameras."

This is just calling

bundle adjustment
pc_align
mapprojection

with the correct set of parameters, right?

"I never did large-scale mosaicking myself. Maybe other folks know better."
It's OK. Everything happens for the first time at some point :-)

Alexandrov, Oleg (ARC-TI)[KBR Wyle Services, LLC]

Sep 6, 2021, 5:56:54 PM
to mikita belikau, Ames Stereo Pipeline Support
> So, this means I can create my own DEM for my images using stereo command and passing cubes derived from LBLs, right?

At least that's how I know how to do this kind of thing.

> "align that DEM to a preexisting bigger DEM"
> That's for me the biggest question. Where to get this? This should be a global DEM for the whole Mars, or for some big region of Mars? If yes, then where can I find it? I am not > sure if it exists and what to do if there is no big DEM for the area I am interested.

I don't know either. Maybe the MOLA layer will work. Or otherwise there should be some global Mars DEM somewhere, or DEMs produced from stereo pairs.

> "if needed, with pc_align, align the cameras as well, with bundle adjustment with --initial transform and zero iterations, then mapproject onto that DEM using the new adjusted cameras."
>This is just calling bundle adjustment pc_align mapprojection with correct set of parameters, right?

Yep. See if this can be of help: https://stereopipeline.readthedocs.io/en/latest/tools/pc_align.html#applying-the-pc-align-transform-to-cameras Again, I am not sure that's the right way; just sharing what I know.


mikita belikau

Sep 6, 2021, 6:17:54 PM
to Ames Stereo Pipeline Support
OK, got it.
I will try to implement this and follow your suggestions.
I also found this: https://gi.copernicus.org/preprints/gi-2019-11/gi-2019-11.pdf It seems similar to what you described. The only thing is that here, again, it is mentioned that the input cubes should be created from EDR data, not RDR, which is not what I want, so we'll see if this works.

mikita belikau

Sep 7, 2021, 5:25:07 PM
to Ames Stereo Pipeline Support
As I was afraid, it seems like stereo does not like cubes created from RDR (LBL/JP2) files...
When I run

stereo ESP_062530_1985_RED.LBL.cub ESP_042315_1985_RED.LBL.cub results/run

I get this:

Warning: Stereo file ./stereo.default could not be found. Will use default settings and command line options only.

[ 2021-Sep-07 23:18:27 ] : Stage 0 --> PREPROCESSING
--> Setting number of processing threads to: 4
Warning: Stereo file ./stereo.default could not be found. Will use default settings and command line options only.
Writing log info to: results/run-log-stereo_pprc-09-07-2318-2392.txt
Using session: isis.
Loading camera model: ESP_062530_1985_RED.LBL.cub
Using image files:  ESP_062530_1985_RED.LBL.cub, ESP_042315_1985_RED.LBL.cub
Using "./stereo.default"
--> Computing statistics for left
  left: [ lo:5 hi:980 m: 461.134 s: 150.99]
--> Adjusting hi and lo to -+2 sigmas around mean.
    left changed: [ lo:159.155 hi:763.113]
--> Computing statistics for right
  right: [ lo:6 hi:991 m: 550.123 s: 160.82]
--> Adjusting hi and lo to -+2 sigmas around mean.
    right changed: [ lo:228.483 hi:871.763]
--> Normalizing globally to: [159.155 871.763]
Loading camera model: ESP_062530_1985_RED.LBL.cub

Error: **ERROR** Unable to initialize camera model in Camera Factory.
**ERROR** Unable to find PVL group [Instrument] in file [ESP_062530_1985_RED.LBL.cub].
Stereo step 0: Preprocessing failed


Does anybody know the right way to pass RDR files to stereo?

Alexandrov, Oleg (ARC-TI)[KBR Wyle Services, LLC]

Sep 7, 2021, 6:03:51 PM
to mikita belikau, Ames Stereo Pipeline Support
That is bad luck. It looks as if the cameras cannot be initialized. Normally spiceinit does that.
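For an EDR-derived cube that would be something like:

spiceinit from=ESP_062530_1985_RED.cub

(the cube name here is just illustrative).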


mikita belikau

Sep 8, 2021, 2:02:57 PM
to Ames Stereo Pipeline Support
Right, and spiceinit also wants cubes from EDR (.IMG) files, which do not have map-projection info. So it is again forcing me to use the EDR (.IMG) files.

I was just wondering: is it possible to somehow remove this map-projection info from the cubes I got from the RDR (.LBL) files, so that spiceinit can handle them properly?

Beyer, Ross A. (ARC-SST)[SETI INSTITUTE]

Sep 8, 2021, 2:10:27 PM
to mikita belikau, Ames Stereo Pipeline Support
Mikita,

Sadly, no. RDR products "are" maps. They cannot be "spiceinit"ed. They are the result of taking camera-geometry images and projecting them onto a terrain model (in the case of HiRISE RDRs, a smoothed version of the MOLA gridded data, but still). Once an image has been map-projected, it is quite difficult to "reverse" the process, and you are generally better off starting from the camera-geometry images and working forwards.


Ross

http://RossBeyer.net/science/

mikita belikau

Sep 8, 2021, 2:24:09 PM
to Ames Stereo Pipeline Support
Ross, thanks. This is what I suspected, but I still wanted to be sure there was no way to hack around it; it seems there is not.
OK, got it.
Then there is no option other than to use the EDRs.

Thanks a lot 

mikita belikau

Sep 24, 2021, 4:39:18 PM
to Ames Stereo Pipeline Support
Trying to replicate the commands for EDR files from here: https://www.hou.usra.edu/meetings/lpsc2016/eposter/1241.pdf
I faced a couple of issues.

In particular:
1) I successfully ran

hiedr2mosaic.py

which gave me the image on the left - it looks fine to me.
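(For reference, I invoked it on the full set of RED EDR .IMG files, along the lines of: hiedr2mosaic.py ESP_042315_1985_RED*.IMG)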


Screen Shot 2021-09-24 at 22.28.57.png

Then I ran cam2map4stereo.py on the output from hiedr2mosaic.py, which gave me the image on the right - it does not look OK to me, because even though the orientation is OK, it is "cut off". Why is it like this?

For another image I have these results from hiedr2mosaic.py (left) and cam2map4stereo.py (right):

Screen Shot 2021-09-24 at 22.35.19.png

The left is OK, but the right (cam2map4stereo.py) is again "cut off" and very "light". What could be the problem?

2) I even tried to run stereo on this "cut off" and "light" input:

stereo ESP_042315_1985_RED.map.cub ESP_065985_1985_RED.map.cub results/output

Here it complains:


Warning: Stereo file ./stereo.default could not be found. Will use default settings and command line options only.

[ 2021-Sep-24 22:02:05 ] : Stage 0 --> PREPROCESSING
--> Setting number of processing threads to: 4
Warning: Stereo file ./stereo.default could not be found. Will use default settings and command line options only.
Writing log info to: results/output-log-stereo_pprc-09-24-2202-26060.txt
Using session: isis.
Loading camera model: ESP_042315_1985_RED.map.cub
Loading camera model: ESP_065985_1985_RED.map.cub
Using image files:  ESP_042315_1985_RED.map.cub, ESP_065985_1985_RED.map.cub
Using "./stereo.default"
--> Computing statistics for left
  left: [ lo:0.0969188 hi:0.199513 m: 0.145607 s: 0.0160078]
--> Adjusting hi and lo to -+2 sigmas around mean.
    left changed: [ lo:0.113591 hi:0.177622]
--> Computing statistics for right
  right: [ lo:0 hi:0.0908073 m: 0.00742464 s: 0.0179881]
--> Adjusting hi and lo to -+2 sigmas around mean.
    right changed: [ lo:0 hi:0.0434009]
--> Normalizing globally to: [0 0.177622]
Loading camera model: ESP_065985_1985_RED.map.cub

Error: **ERROR** Unable to initialize camera model in Camera Factory.
**ERROR** Unable to find PVL group [Instrument] in file [ESP_065985_1985_RED.map.cub].

How can I check this PVL group in the cube? And how do I fix it if it is wrong?

Scott McMichael

Sep 26, 2021, 3:11:47 PM
to Ames Stereo Pipeline Support
Can you post the exact command you ran to get those image results? cam2map4stereo.py does limit the output images to the region where it believes they overlap. Looking at the two images above, the computed output overlap looks reasonable for the first image. For the second, have you ruled out a scaling issue in qview?

.cub files have their header information written as plain text at the beginning of the file, so you can use a text editor to at least see what is in there.
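For example (catlab is the ISIS utility that prints a cube's label; a sketch):

catlab from=ESP_065985_1985_RED.map.cub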

Scott

Beyer, Ross A. (ARC-SST)[SETI INSTITUTE]

Sep 27, 2021, 5:25:15 PM
to mikita belikau, Ames Stereo Pipeline Support
Mikita,

These two HiRISE observations, ESP_042315_1985 and ESP_065985_1985, are not a stereo pair; they do not appear to overlap.

When you give two HiRISE images to cam2map4stereo.py, the program extracts the lon/lat bounding boxes of the two images, determines the intersection bounding box, and then map-projects each of the images into that intersecting area, cropping off the portions that are outside of that intersected box. These two images are close enough that an overlapping maxlon, minlon, maxlat, minlat box can be computed, but the program doesn't determine whether the *actual* image footprints overlap, only that their bounding boxes do (and since MRO orbits aren't exactly N/S, the footprints often appear "slanted" when projected).
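One way to check the actual footprints yourself is to create footprint polygons with the ISIS footprintinit program and view them in qmos, along the lines of (cube names illustrative):

footprintinit from=ESP_042315_1985_RED.mos_hijitreged.norm.cub
footprintinit from=ESP_065985_1985_RED.mos_hijitreged.norm.cub
qmos ESP_042315_1985_RED.mos_hijitreged.norm.cub ESP_065985_1985_RED.mos_hijitreged.norm.cub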

The maps on the right-hand side in both of your examples are the result of this. You can confirm it by looking at the scene in each of the left-hand images: there aren't any features that I can see in one image that are also in the other. They appear to cover different portions of the floor of Jezero.



Ross

http://RossBeyer.net/science/
