TAN Projection


Ryan Powers

Mar 1, 2022, 10:03:34 AM
to astrometry
Hi,

I'm trying to figure out what TAN projection actually means in terms of photography lenses. From what I gather, it's referencing a gnomic projection, which corresponds with an ideal rectilinear lens vs. an ideal fisheye (equidistant projection). For a small field of view, the two are essentially identical (thanks, small angle approximation!), but as you get into wider fields the error starts to creep in when comparing the two.

An ideal fisheye gives you a uniform field of view for each pixel, but an ideal rectilinear lens has a shrinking per-pixel field of view as you move away from the center, to the point where, about 15 degrees off axis, a star lands about 1 degree away from where a fisheye would put it.
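A minimal sketch of that divergence, assuming ideal lenses with the focal length normalized to 1 (my own check, not anything from astrometry.net):

```python
# Tabulate how far an ideal rectilinear mapping (gnomonic, r = f*tan(theta))
# drifts from an ideal fisheye (equidistant, r = f*theta) as the off-axis
# angle grows. Focal length is normalized to 1.
import numpy as np

theta_deg = np.array([1, 5, 10, 15, 20, 30])
theta = np.radians(theta_deg)

r_equidistant = theta           # ideal fisheye
r_gnomonic = np.tan(theta)      # ideal rectilinear

# Interpret the radial difference as an angle on the sky (exact under the
# equidistant convention, where radius and angle are the same thing).
diff_deg = np.degrees(r_gnomonic - r_equidistant)

for t, d in zip(theta_deg, diff_deg):
    print(f"{t:2d} deg off axis: gnomonic radius exceeds equidistant by ~{d:.3f} deg")
```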

So, assuming my understanding above is all correct, my questions are as follows:
  • If I am specifying a pixel field of view (scale-low, scale-high, with scale-units arcsecperpix), what am I actually specifying? Is that the pixel FOV at CRPIX1,CRPIX2, or an average over the whole image (i.e. total FOV divided by size)?
  • If I want to undistort the image, should my distortion matrix/SIP polynomial map from pixel space to nominal degree space (i.e. equidistant projection), or to nominal tangent space (i.e. gnomic projection)?
  • Wouldn't it be easier to work on images that have a consistent pixel FOV across the image (fisheye/equidistant)? I don't understand how the solver guesses scale, so there's probably a good reason you implemented it the way you did, but while at small FOV the two projections are essentially the same, at large FOV the pixel distance between two stars corresponds to a different angular distance depending on where in the image they are.
I ask these questions because I have had solutions that succeed but give clearly incorrect results when the solved quad was at the edges of the image. This is, of course, with neither an ideal rectilinear nor an ideal fisheye lens, because I'm using consumer lenses. However, a lot of these lenses have known distortion curves that can be used to correct them; I just want to make sure I'm correcting them to the right thing.

As an aside, lenses that don't have known distortion curves can be calibrated with something like OpenCV. OpenCV comes up with a radial distortion map (and a tangential one, but I think that's less of an issue for me). See https://docs.opencv.org/4.x/d4/d94/tutorial_camera_calibration.html for the formulae. I've been trying to figure out how to convert between that and the SIP polynomial for astrometry. Is it as easy as substituting x^2+y^2 for every r^2 (and higher powers for the higher-order terms) and reading the result off as SIP coefficients?
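A sketch of the algebra behind that substitution, using sympy. It only shows how the r^2 = u^2 + v^2 expansion spreads a radial coefficient across SIP-style monomials; it deliberately ignores the mapping direction (OpenCV models pinhole-to-distorted, while SIP's A/B polynomials go from the observed, distorted pixels to the corrected ones, closer to SIP's inverse AP/BP) and the units (OpenCV works in normalized coordinates, SIP in pixels):

```python
# Hedged sketch: expand OpenCV's radial terms k1*r^2 + k2*r^4 into the
# per-monomial coefficients a SIP-style polynomial in (u, v) would use.
import sympy as sp

u, v, k1, k2 = sp.symbols('u v k1 k2')
r2 = u**2 + v**2

# OpenCV-style radial correction on the u axis: u_d = u*(1 + k1*r^2 + k2*r^4).
# A SIP "A"-style polynomial is the correction term, i.e. u_d - u.
A_poly = sp.expand(u * (k1 * r2 + k2 * r2**2))

# Collect the coefficient of each monomial u^p * v^q, as in an A_pq matrix.
poly = sp.Poly(A_poly, u, v)
for (p, q), coeff in poly.terms():
    print(f"A_{p}{q} = {coeff}")
```

So a single k1 turns into A_30 = A_12 = k1, a single k2 into A_50 = A_14 = k2 and A_32 = 2*k2, and so on; the direction and unit conversions still have to be handled separately.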

Thanks,
Ryan

Dustin Lang

Mar 1, 2022, 10:31:50 AM
to Ryan Powers, astrometry, David W Hogg
Hi,

Yeah, I believe TAN = gnomonic.  And yes, I think the scale limits are effectively the scale at CRPIX.  You may want to try setting the CRPIX-at-center option, which puts the reference point in the middle of your image.  Then, yeah, as you say, it should be straightforward to convert between different distortion model parameterizations.  The SIP convention is described here: https://fits.gsfc.nasa.gov/registry/sip.html

And yeah, if you want to undistort, you would do so to the TAN projection.
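A minimal sketch of checking both points with astropy (my assumption; the thread doesn't mention it), using a placeholder filename for the WCS file that solve-field writes: the pixel scale at CRPIX comes from the linear CD matrix, and the difference between wcs_pix2world (TAN only) and all_pix2world (TAN + SIP) shows what the undistortion to TAN amounts to at a given pixel:

```python
# Hedged sketch; "solved.wcs" and the example pixel are placeholders.
from astropy.io import fits
from astropy.wcs import WCS
from astropy.wcs.utils import proj_plane_pixel_scales

header = fits.getheader("solved.wcs")      # placeholder filename
wcs = WCS(header)

# Pixel scale at the reference point, in arcsec/pixel, from the CD matrix.
scales = proj_plane_pixel_scales(wcs) * 3600.0
print("scale at CRPIX (arcsec/pix):", scales)

# For a pixel far from CRPIX, compare the plain TAN mapping with TAN + SIP.
x, y = 4000.0, 2500.0                      # arbitrary example pixel
ra_tan, dec_tan = wcs.wcs_pix2world(x, y, 0)     # TAN only
ra_sip, dec_sip = wcs.all_pix2world(x, y, 0)     # TAN + SIP distortion
print("TAN only: ", ra_tan, dec_tan)
print("TAN + SIP:", ra_sip, dec_sip)
```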

I haven't thought much about using different projections, or what parts of Astrometry.net we would have to change.  There are multiple steps:
- during matching, we're taking 4-star mini-constellations and computing a 4-dimensional "shape code" for them, and matching those (in "shape space") to pre-computed shapes of stars from the reference catalog.  The shape code must be invariant to translation, rotation, and scale.  As you've observed, Astrometry.net doesn't work well for fisheye projections because, I guess, that projection doesn't preserve angles in pixel space, so it's introducing a shear to the shapes, and the shape codes are not shear-invariant.  If we knew the angular scale of the image, I guess we could undo that, but... that's the point, we don't know the angular scale :)
- after we have a match, we construct a WCS projection based on that match, and then predict the locations of other stars and check how many of those predictions are correct.  For this part it would be easier to swap in a different projection.
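A minimal sketch of such a translation-, rotation-, and scale-invariant 4-star code, along the lines of the construction described for Astrometry.net (the most widely separated pair A, B defines a frame with A at (0,0) and B at (1,1), and the code is the frame coordinates of the other two stars); this is an illustration, not the project's actual code:

```python
# Translation/rotation/scale-invariant code for a 4-star quad (illustration only).
import numpy as np
from itertools import combinations

def quad_code(stars):
    """stars: (4, 2) array of pixel positions; returns the 4-D shape code."""
    stars = np.asarray(stars, dtype=float)
    # Pick the most widely separated pair as A, B.
    ia, ib = max(combinations(range(4), 2),
                 key=lambda ij: np.hypot(*(stars[ij[0]] - stars[ij[1]])))
    A, B = stars[ia], stars[ib]
    C, D = stars[[i for i in range(4) if i not in (ia, ib)]]
    # z -> (z - A) / (B - A) * (1 + 1j) is a similarity transform sending
    # A to (0, 0) and B to (1, 1), so the result ignores translation,
    # rotation, and uniform scale.  (Real code also has to handle the
    # A/B and C/D ordering ambiguities; skipped here.)
    z = lambda p: complex(p[0], p[1])
    scale = (1 + 1j) / (z(B) - z(A))
    c = (z(C) - z(A)) * scale
    d = (z(D) - z(A)) * scale
    return np.array([c.real, c.imag, d.real, d.imag])

# Sanity check: the code survives a rotation, a uniform scale, and a shift.
rng = np.random.default_rng(0)
quad = rng.uniform(0, 1000, size=(4, 2))
th = 0.7
R = np.array([[np.cos(th), -np.sin(th)], [np.sin(th), np.cos(th)]])
print(quad_code(quad))
print(quad_code(2.5 * quad @ R.T + np.array([300.0, -150.0])))
```

A shear breaks exactly this invariance, which is the fisheye problem described above.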

cheers,
--dustin

Ryan Powers

Mar 1, 2022, 11:16:54 AM
to astrometry
Thanks for all the info; it's very much appreciated. Also, apparently I've been going around referring to the gnomonic projection as if it were created by gnomes (gnomic).

I'm still having trouble wrapping my head around solving with a wide field TAN projection. I've taken some sample imagery from a page I found about distortion (fisheye vs rectilinear) with some night sky photos on it: https://www.lonelyspeck.com/defish/comment-page-3/

I took one of the sample undistorts and grabbed the subset in the top-right corner. Left (first, depending on how it shows) is fisheye, right (second) is rectilinear. My confusion is that with the fisheye projection, the shape of the stars seen in this corner should remain relatively unchanged no matter where it appears in the image. With the rectilinear projection, you can see how it has stretched: the angles have changed and the distances have changed (though I'm not sure how much they have changed relative to each other). With a rectilinear projection, this collection of stars would look like the fisheye sample if near the center, but like the rectilinear sample at the edges (and, of course, somewhere in between when it's somewhere in between). Even without knowing the angular scale, it still seems to me like it would be easier to work with the fisheye projection, where the shape is relatively invariant to location in the field of view.
[attached: fisheye.png, rectilinear.png]
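A hedged way to quantify that intuition, assuming ideal projections (my own construction, not from the thread): place the same small triangle of stars at the field centre and 30 degrees off-axis, project it with an equidistant and a gnomonic mapping, and compare the projected side lengths.

```python
# Project a ~1 deg triangle of stars at two field positions with an ideal
# equidistant (fisheye) and a gnomonic (TAN) mapping; the equidistant side
# lengths stay nearly the same, the gnomonic ones stretch off-axis.
import numpy as np

def radec_to_vec(ra, dec):
    ra, dec = np.radians(ra), np.radians(dec)
    return np.array([np.cos(dec)*np.cos(ra), np.cos(dec)*np.sin(ra), np.sin(dec)])

def project(v, tangent, kind):
    """Project unit vector(s) v about the tangent point onto plane coords."""
    # Orthonormal basis of the tangent plane (assumes tangent is not a pole).
    e1 = np.cross([0.0, 0.0, 1.0], tangent); e1 /= np.linalg.norm(e1)
    e2 = np.cross(tangent, e1)
    x, y, c = v @ e1, v @ e2, v @ tangent
    if kind == "gnomonic":                       # r = tan(theta)
        return np.stack([x / c, y / c], axis=-1)
    theta = np.arccos(np.clip(c, -1.0, 1.0))     # equidistant: r = theta
    rho = np.hypot(x, y)
    scale = theta / np.maximum(rho, 1e-12)       # safe at the exact tangent point
    return np.stack([scale * x, scale * y], axis=-1)

triangle = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]   # ~1 deg star triangle
tangent = radec_to_vec(0.0, 0.0)
for offset in (0.0, 30.0):
    stars = np.array([radec_to_vec(ra + offset, dec) for ra, dec in triangle])
    for kind in ("equidistant", "gnomonic"):
        p = project(stars, tangent, kind)
        sides = [np.linalg.norm(p[i] - p[j]) for i, j in ((0, 1), (1, 2), (2, 0))]
        print(f"{offset:4.0f} deg off-axis, {kind:11s}: sides = "
              + ", ".join(f"{s:.4f}" for s in sides))
```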

Just to reiterate, I am not an astronomer, and come at this from the photography side. There are probably some fundamental things I don't understand, particularly since astronomy works at much smaller angular scales than typical photography. I appreciate you taking the time to explain this to me.

--Ryan

John Murrell

Mar 2, 2022, 5:37:00 PM
to Ryan Powers, astrometry
Mapping the surface of a sphere to a flat plane such as a map or camera chip has always been a problem; lots of different projections have been created over the years, and a lot have fallen by the wayside.

A search on map projections will provide diagrams of how the common projections work and what their advantages and disadvantages are.

One way of experimenting, other than using the night sky, is to photograph a celestial globe, if you can find one, from various angles and magnifications.

Also, don't forget that camera lenses have their own distortions, particularly wide-angle ones. Imaging something like a large sheet of squared paper or a well-built brick wall can be quite instructive about a lens's distortions. However, you need to remember that a lot of digital cameras do an 'in-camera' distortion correction for the lens. In some cameras this can be turned on and off. Also, in higher-end cameras the RAW files are not corrected, and the correction is done in the camera software on your computer.
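A hedged sketch of turning that "photograph a grid" idea into numbers, following the OpenCV camera-calibration tutorial Ryan linked earlier (which uses a printed checkerboard rather than plain squared paper); the filenames and checkerboard size are placeholders:

```python
# Estimate the camera matrix and distortion coefficients from checkerboard photos.
import glob
import cv2
import numpy as np

pattern = (9, 6)                      # inner-corner count of the printed checkerboard
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)

objpoints, imgpoints = [], []
for fname in glob.glob("calib_*.jpg"):        # placeholder file pattern
    gray = cv2.cvtColor(cv2.imread(fname), cv2.COLOR_BGR2GRAY)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        objpoints.append(objp)
        imgpoints.append(corners)

# dist holds (k1, k2, p1, p2, k3): the radial/tangential coefficients
# discussed earlier in the thread.
ret, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    objpoints, imgpoints, gray.shape[::-1], None, None)
print("camera matrix:\n", K)
print("distortion coefficients:", dist.ravel())
```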

Good Luck

John Murrell