Issues with solution accuracy and discrepancies


Umair Khan
Apr 4, 2018, 11:24:07 PM
to OpenStartracker
I've been working with the code for a while now, and I decided to compare the DEC/RA results I was getting to nova.astrometry.net to make sure things are still working correctly. It turns out that they aren't, and I was hoping you could give some insight into what's happening.

I guess it can be broken down into three separate problems: inaccurate solutions, calibration discrepancies, and images not solving when they should be. (All of this testing was with the original xmas images.)

Issue 1: Just running the image test with the original xmas samples and calibration data results in inaccurate DEC/RA numbers compared to solving the images with nova.astrometry.net.

Issue 2: There are massive discrepancies when you change the number of photographs to calibrate with. Just using the xmas images, calibrating with three photos and running the image test will result in different DEC/RA values compared to calibrating with six photos, which will result in different values compared to calibrating with all nine photos. These discrepancies are vast: image1 solved to a DEC/RA/ORI of 308/263/264 when using a camera calibrated with six images, and then solved to a DEC/RA/ORI of 19/279/214 when using a camera calibrated with all nine xmas images. (And none of the calibration permutations have resulted in an accurate solver. All three calibrations give inaccurate DEC/RA values compared to nova.astrometry.net.)

Issue 3: Photos that were used for calibration, and that solved successfully during calibration, are not being solved during image tests. For example, I calibrated a camera with six sample xmas images, image1 through image6. When running the image test, image2, image3, image5, and image6 do not solve. And the images that do solve give inaccurate DEC/RA numbers.

(Note: the DEC/RA numbers given by astrometry.net during the calibration process are correct. The inaccuracies appear when using startracker.py.)

I've kept an untouched version of OpenStarTracker around from when I started to work on it in earnest (commit 5909f2b on March 7) and these errors happen with that "clean" version, as well.

The funny thing is, I remember comparing the system to nova.astrometry.net back when I was first looking into OpenStarTracker (with the previous commit, 5f224bc on Feb 1), and everything was working back then.

Why is this happening, and how can the issues be resolved?

Thanks.

Andrew Tennenbaum
Apr 4, 2018, 11:38:55 PM
to Umair Khan, OpenStartracker
Wow, this is super interesting - I'll look into it tomorrow and get back to you. Would you mind sending me the images you are using?

Also something I was curious about - are any of you guys going to smallsat this summer?




Umair Khan
Apr 5, 2018, 12:00:09 AM
to OpenStartracker

This happens with basically any image set, but the tests I mentioned in the original post were with the xmas images from the original repository. I've attached these to this reply - they're in the PNG format to keep the files small, but I used the original BMPs in the testing.

And I don't know if we're going to the smallsat conference - I'll ask about it during one of our meetings and get back to you.

Thanks.
image0.png
image9.png
image1.png
image2.png
image3.png
image4.png
image5.png
image6.png
image7.png
image8.png

Andrew Tennenbaum
Apr 5, 2018, 8:44:33 PM
to Umair Khan, OpenStartracker
Hi - the first thing I'd do is make sure you've got the latest version of openstartracker from git, as we recently fixed an issue that would sometimes result in false positives on images that can't solve.

19/279/214 is actually correct - the last term is just offset by 360 degrees from what astrometry reports, but the values are equivalent. To avoid confusion, I have just pushed a change so that it now reports orientation the same way astrometry.net does.
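
If you want to check the equivalence yourself, it's just a comparison modulo 360 degrees. A minimal sketch in Python (not code from the repository):

    def angles_equivalent(a, b, tol=1e-6):
        # True if the two angles (in degrees) differ only by a multiple of 360
        d = abs(a - b) % 360.0
        return d < tol or abs(d - 360.0) < tol

    # e.g. an ORI of 214 and an ORI of 214 - 360 = -146 describe the same rotation
    print(angles_equivalent(214.0, -146.0))  # True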

The reason you see different results when using different subsets of the images is that the calibration script uses the image in which astrometry.net was able to identify the most stars. With fewer images, you get a less accurate calibration.
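
Roughly speaking (hypothetical names and star counts, not the actual calibration script), the selection step amounts to:

    # Hypothetical illustration: use whichever image astrometry.net matched the
    # most stars in as the reference for deriving the camera calibration.
    star_counts = {"image1.bmp": 42, "image4.bmp": 57, "image6.bmp": 31}
    best_image = max(star_counts, key=star_counts.get)

With only three images to pick from, the best available reference may simply have fewer identified stars.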

The feb16 images are much more representative than the xmas images - when the xmas images were taken, the camera was not properly aligned with the TV we were using to simulate the star field. astrometry.net has automatic distortion compensation, so it happily solves them anyway.





Andrew Tennenbaum
Apr 5, 2018, 8:48:06 PM
to Umair Khan, OpenStartracker
I should add - in the most recent version, repeating your test with the first three xmas images, none of them solve, which is the correct behavior.

Umair Khan
Apr 5, 2018, 11:57:23 PM
to OpenStartracker
Awesome, this is really helpful. Thanks a lot.