Spherical offset between two pointings

Falco Peregrinus

Nov 8, 2024, 5:38:32 PM
to Bangalore Astronomical Society

Hi, I have a really basic question and need some help.

Let's say we have two cameras pointing at the sky: cam1 and cam2. Both cameras are at the same location, but there's a fixed offset between them (i.e., their optical axes are not perfectly parallel). I want to measure the angle between them.

How can I do that?

Here’s what I’m thinking:

If we point both cameras to the night sky and have the RA and Dec coordinates of the center of the field of view (FoV) for both cameras (which we can get by solving the images using platesolving/astrometry, for example), I can calculate the angle between them. I know the location of the cameras and the datetime, so I can easily convert the RA and Dec to AltAz coordinates. From this, I can compute the angular separation between the two pointings using something like Astropy's separation function or by calculating the haversine distance. Alternatively, I can transform both RA/Dec coordinates to AltAz and then calculate the delta in Alt and Az between them.
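
For concreteness, a minimal astropy sketch of the separation computation (the coordinates are placeholders standing in for the two plate-solve results):

from astropy.coordinates import SkyCoord
import astropy.units as u

p1 = SkyCoord(ra=123.4 * u.deg, dec=45.6 * u.deg)   # cam1 plate-solve center
p2 = SkyCoord(ra=124.1 * u.deg, dec=44.9 * u.deg)   # cam2 plate-solve center
print(p1.separation(p2).deg)   # great-circle angle between the two pointings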

Why do I want to do this?

Later in the night, I plan to point both cameras in the sky again. If I have the RA and Dec coordinates of the center of the FoV for one camera, could I use the previously derived angle/angles between the two cameras to calculate the RADec of the second camera's FoV? My assumption is that the angle between them won't stay the same as the sky moves, and I also assume that the delta in Alt and Az will change over time.

So my question is: Is it possible to calculate the angle/angles between the two pointings, and use that angle/angles to derive the pointing of camera 1 from camera 2 (and vice versa) at a later time?

Cheers
Thank you

Akarsh Simha

Nov 9, 2024, 1:14:08 AM
to b-...@googlegroups.com
On Fri, Nov 8, 2024 at 2:38 PM Falco Peregrinus <downsh...@gmail.com> wrote:

Hi, I have a really basic question and need some help.


This isn't at all a basic question :-P
 

Let's say we have two cameras pointing at the sky: cam1 and cam2. Both cameras are at the same location, but there's a fixed offset between them (i.e., their optical axes are not perfectly parallel). I want to measure the angle between them.

How can I do that?

Here’s what I’m thinking:

If we point both cameras to the night sky and have the RA and Dec coordinates of the center of the field of view (FoV) for both cameras (which we can get by solving the images using platesolving/astrometry, for example), I can calculate the angle between them. I know the location of the cameras and the datetime, so I can easily convert the RA and Dec to AltAz coordinates. From this, I can compute the angular separation between the two pointings using something like Astropy's separation function or by calculating the haversine distance. Alternatively, I can transform both RA/Dec coordinates to AltAz and then calculate the delta in Alt and Az between them.


This is correct. If both images are taken at the exact same time, you can plate-solve to find the center (RA, Dec) of each frame, and compute the angular distance using the haversine law.

If they are not taken at the same time, you will need to compute the alt/az from the plate-solved RA/Dec for each of the frames using the correct time of each exposure. Then you can compute the angular distance using alt/az coordinates to get the angle difference between the cameras.
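
A minimal astropy sketch of that, with placeholder site, times, and coordinates (note that I compare the two alt/az directions numerically with angular_separation rather than with SkyCoord.separation across frames, which would re-transform the coordinates):

from astropy.coordinates import (EarthLocation, AltAz, SkyCoord,
                                 angular_separation)
from astropy.time import Time
import astropy.units as u

site = EarthLocation(lat=12.97 * u.deg, lon=77.59 * u.deg, height=900 * u.m)
aa1 = SkyCoord(ra=123.4 * u.deg, dec=45.6 * u.deg).transform_to(
    AltAz(obstime=Time("2024-11-08T20:00:00"), location=site))
aa2 = SkyCoord(ra=124.1 * u.deg, dec=44.9 * u.deg).transform_to(
    AltAz(obstime=Time("2024-11-08T20:05:00"), location=site))
# angle between the two fixed alt/az directions
print(angular_separation(aa1.az, aa1.alt, aa2.az, aa2.alt).to(u.deg))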
 

Why do I want to do this?

Later in the night, I plan to point both cameras in the sky again. If I have the RA and Dec coordinates of the center of the FoV for one camera, could I use the previously derived angle/angles between the two cameras to calculate the RADec of the second camera's FoV? My assumption is that the angle between them won't stay the same as the sky moves, and I also assume that the delta in Alt and Az will change over time.


Why would the angle between the cameras change as the sky rotates?

So my question is: Is it possible to calculate the angle/angles between the two pointings, and use that angle/angles to derive the pointing of camera 1 from camera 2 (and vice versa) at a later time?


I don't understand your final goal. The Alt/Az pointing of both cameras remains the same unless you have them on some sort of mount that rotates. You can use plate solving to find the present RA/Dec of either camera, convert that to an Alt/Az, and if you're interested in asking the question "where is my camera pointing now?" at a later time, simply convert that Alt/Az back to RA/Dec at the later time instant.
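
In astropy terms, with placeholder numbers:

from astropy.coordinates import SkyCoord, EarthLocation, AltAz
from astropy.time import Time
import astropy.units as u

site = EarthLocation(lat=12.97 * u.deg, lon=77.59 * u.deg, height=900 * u.m)
# the camera's alt/az stays fixed; re-evaluate its RA/Dec at the later instant
altaz = AltAz(alt=55 * u.deg, az=210 * u.deg,
              obstime=Time("2024-11-09T01:00:00"), location=site)
print(SkyCoord(altaz).transform_to("icrs"))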

Regards
Akarsh

 



Falco Peregrinus

Nov 10, 2024, 9:49:57 AM
to Bangalore Astronomical Society
On Saturday 9 November 2024 at 07:14:08 UTC+1 Akarsh Simha wrote:
On Fri, Nov 8, 2024 at 2:38 PM Falco Peregrinus <downsh...@gmail.com> wrote:
 

Why would the angle between the cameras change as the sky rotates?

If we point that setup at the sky but at a different point (a different RA/Dec), would the previously measured dAlt and dAz, or the haversine distance, be the same?

I don't understand your final goal. The Alt/Az pointing of both cameras remains the same unless you have them on some sort of mount that rotates. You can use plate solving to find the present RA/Dec of either camera, convert that to an Alt/Az, and if you're interested in asking the question "where is my camera pointing now?" at a later time, simply convert that Alt/Az back to RA/Dec at the later time instant.

My goal is to determine the angle(s) between these cameras and whether that angle is fixed (meaning dAlt and dAz do not change with altitude/azimuth), or, equivalently, to fit a model that explains how dAlt and dAz depend on the pointing.

I want to derive the pointing of camera1 from camera2, and vice versa, for a different RA/Dec and/or a different time of night.

I can get the RA/Dec (and Alt/Az) of the center of the FoV for camera1 and camera2 over an entire range of altitudes, if that could help.

Akarsh Simha

Nov 11, 2024, 5:23:43 AM
to b-...@googlegroups.com
On Sun, Nov 10, 2024 at 6:49 AM Falco Peregrinus <downsh...@gmail.com> wrote:

On Saturday 9 November 2024 at 07:14:08 UTC+1 Akarsh Simha wrote:
On Fri, Nov 8, 2024 at 2:38 PM Falco Peregrinus <downsh...@gmail.com> wrote:
 

Why would the angle between the cameras change as the sky rotates?

If we point that setup at the sky but at a different point (a different RA/Dec), would the previously measured dAlt and dAz, or the haversine distance, be the same?

I don't understand your final goal. The Alt/Az pointing of both cameras remains the same unless you have them on some sort of mount that rotates. You can use plate solving to find the present RA/Dec of either camera, convert that to an Alt/Az, and if you're interested in asking the question "where is my camera pointing now?" at a later time, simply convert that Alt/Az back to RA/Dec at the later time instant.

My goal is to determine the angle(s) between these cameras and whether that angle is fixed (meaning dAlt and dAz do not change with altitude/azimuth), or, equivalently, to fit a model that explains how dAlt and dAz depend on the pointing.

I want to derive the pointing of camera1 from camera2, and vice versa, for a different RA/Dec and/or a different time of night.

I can get the RA/Dec (and Alt/Az) of the center of the FoV for camera1 and camera2 over an entire range of altitudes, if that could help.


So it looks like this is the setup you're considering:

Camera 1 and Camera 2 are mounted rigidly on the same platform, but the platform itself can be re-oriented. So in some sense, if I move Camera 1 30° to the left, Camera 2 also moves 30° to the left because they are on the same rigid platform.

If this is your case, the dAlt / dAz values will not, in general, remain fixed. If your camera mount can rotate along all three axes (think roll, pitch, yaw), then the dAlt/dAz will not remain fixed. If your cameras can only rotate around two axes (pitch and yaw, but no roll), the dAlt/dAz will remain fixed. Not only that, you will need to determine not just ra/dec but also the roll angle along the optic axis (i.e. a position angle) from the plate-solver.

It is easiest to use quaternions if your cameras can move along all three axes. The process would be as follows:

1. Capture an exposure on both cameras simultaneously, and plate-solve to find (ra1, dec1, pa1), (ra2, dec2, pa2).
2. Convert the equatorial coordinates into unit vectors representing the directions (x1, y1, z1), (x2, y2, z2), where x1 = cos(dec1) cos(ra1), z1 = sin(dec1), and so on.
3. Compute and pre-store the quaternions transforming a standard-frame into the two camera axes from yaw = ra, pitch = dec, roll = pa – there is some work to figure out the details, perhaps there is already a ready reference that I'm not aware of. Let us call them q1 and q2.

With the aforementioned yaw/pitch/roll convention, the standard frame we are thinking of has the z-axis pointing to the north celestial pole and the x-axis toward the first point of Aries, and the quaternions computed represent rotations from that frame to the camera frame.

So later when you have to compute the position of camera 2 from the position of camera 1, you can use the following method:

1. Plate solve camera 1 to get (ra1', dec1', pa1') the new positioning.
2. Compute the new quaternion q1' in the same manner
3. Compute the transformation quaternion R = q1' q1^{-1}, which will represent the rigid rotation that ensued in going from the old attitude of camera1 to the new attitude
4. Apply this quaternion to the camera 2 quaternion, i.e. q2' = R q2
5. Convert q2' into yaw-pitch-roll representation, so you can read off the new ra2' = yaw, dec2' = pitch

Regards
Akarsh
 



Akarsh Simha

Nov 11, 2024, 5:44:13 AM
to b-...@googlegroups.com
Okay, worked out some more details. The method I suggested uses quaternions computed from Tait-Bryan angles. You should be able to simply use the `pyquaternion` library in python to do your computations:

from pyquaternion import Quaternion

q1 = Quaternion(axis=[1, 0, 0], degrees=pa1) * Quaternion(axis=[0, 1, 0], degrees=dec1) * Quaternion(axis=[0, 0, 1], degrees=ra1)
q2 = Quaternion(axis=[1, 0, 0], degrees=pa2) * Quaternion(axis=[0, 1, 0], degrees=dec2) * Quaternion(axis=[0, 0, 1], degrees=ra2)

Finally, to retrieve the ra2', dec2' from q2', you can use this Unity Answers post: https://answers.unity.com/questions/416169/finding-pitchrollyaw-from-quaternions.html
I believe the convention matches the one I'm using above, so in that case theta will be declination and psi will be RA.

There could still be convention and sign mistakes...
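
To make that concrete, here is a self-contained sketch of the whole recipe using this composition. The positions are made-up test values, and the angle extraction at the end inverts the rotation matrix R = Rx(pa) Ry(dec) Rz(ra) directly instead of using the linked formulas; the same convention/sign caveats apply.

import math
from pyquaternion import Quaternion

def attitude(ra, dec, pa):
    # yaw about z by RA first, then pitch about y by Dec, then roll about x by PA
    return (Quaternion(axis=[1, 0, 0], degrees=pa)
            * Quaternion(axis=[0, 1, 0], degrees=dec)
            * Quaternion(axis=[0, 0, 1], degrees=ra))

def to_angles(q):
    # invert R = Rx(pa) Ry(dec) Rz(ra): dec from R[0,2], ra and pa via atan2
    m = q.rotation_matrix
    dec = math.degrees(math.asin(m[0, 2]))
    ra = math.degrees(math.atan2(-m[0, 1], m[0, 0]))
    pa = math.degrees(math.atan2(-m[1, 2], m[2, 2]))
    return ra, dec, pa

# alignment exposure: simultaneous plate solves of both cameras
q1 = attitude(45.0, 20.0, 10.0)
q2 = attitude(47.5, 21.3, 12.0)

# later in the night: camera 1 alone is re-solved at a new attitude
q1_new = attitude(80.0, 35.0, 10.0)
R = q1_new * q1.inverse   # rigid rotation of the whole platform
q2_new = R * q2           # carry camera 2 along
print(to_angles(q2_new))  # predicted (ra2', dec2', pa2')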

Akarsh Simha

Nov 11, 2024, 7:37:31 PM
to b-...@googlegroups.com
BTW, I solve a similar problem in my plate-solving finder system (described here: https://github.com/kstar/zero-in), but the offset between my camera and scope is small. Therefore I can get away with simply assigning my scope (“camera2”) a pixel position (x, y) on the plate of my finder (“camera1”) and reading out the plate solution at the same pixel. This would be an exact solution if the camera produced a perfect gnomonic projection, but with lens distortion I don't know if there is a guarantee that the polynomial solution extends correctly and consistently beyond the plate. It probably does, so you can probably use this method too.
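
In astropy terms the trick is only a few lines; this sketch assumes a solved finder frame saved as FITS (the file name and the scope's calibrated pixel position are placeholders):

from astropy.io import fits
from astropy.wcs import WCS

header = fits.getheader("finder_solved.fits")   # hypothetical solved frame
wcs = WCS(header)

# the scope's one-time calibrated pixel position; it may fall outside the frame
scope_x, scope_y = 2500.0, -300.0
scope = wcs.pixel_to_world(scope_x, scope_y)
print(scope.ra.deg, scope.dec.deg)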


Falco Peregrinus

Nov 12, 2024, 5:28:06 AM
to Bangalore Astronomical Society


So it looks like this is the setup you're considering:

Camera 1 and Camera 2 are mounted rigidly on the same platform, but the platform itself can be re-oriented. So in some sense, if I move Camera 1 30° to the left, Camera 2 also moves 30° to the left because they are on the same rigid platform.

If this is your case, the dAlt / dAz values will not, in general, remain fixed. If your camera mount can rotate along all three axes (think roll, pitch, yaw), then the dAlt/dAz will not remain fixed. If your cameras can only rotate around two axes (pitch and yaw, but no roll), the dAlt/dAz will remain fixed. Not only that, you will need to determine not just ra/dec but also the roll angle along the optic axis (i.e. a position angle) from the plate-solver.

That’s correct. Camera1 and the telescope (actually not a second camera) are mounted on the same platform. The telescope uses an alt-az mount, so it can move left/right and up/down, which I understand correspond to yaw and pitch. I hadn’t heard these terms before (I’m not an engineer), but thank you for the clarification; I’ll look into them.
Since only pitch and yaw are involved, do I still need the position angle? I can obtain the position angle for Camera1 through astrometry. However, for the telescope, I only have approximate RA and Dec coordinates, as there’s no image from it.
Camera1 is tilted relative to the telescope (their optical axes aren’t parallel, so that would be dAlt and dAz), and it is also slightly rotated with respect to the horizon.


It is easiest to use quaternions if your cameras can move along all three axes. The process would be as follows:

1. Capture an exposure on both cameras simultaneously, and plate-solve to find (ra1, dec1, pa1), (ra2, dec2, pa2).
2. Convert the equatorial coordinates into unit vectors representing the directions (x1, y1, z1), (x2, y2, z2), where x1 = cos(dec1) cos(ra1), z1 = sin(dec1), and so on.
3. Compute and pre-store the quaternions transforming a standard-frame into the two camera axes from yaw = ra, pitch = dec, roll = pa – there is some work to figure out the details, perhaps there is already a ready reference that I'm not aware of. Let us call them q1 and q2.

With the aforementioned yaw/pitch/roll convention, the standard frame we are thinking of has the z-axis pointing to the north celestial pole and the x-axis toward the first point of Aries, and the quaternions computed represent rotations from that frame to the camera frame.

So later when you have to compute the position of camera 2 from the position of camera 1, you can use the following method:

1. Plate solve camera 1 to get (ra1', dec1', pa1') the new positioning.
2. Compute the new quaternion q1' in the same manner
3. Compute the transformation quaternion R = q1' q1^{-1}, which will represent the rigid rotation that ensued in going from the old attitude of camera1 to the new attitude
4. Apply this quaternion to the camera 2 quaternion, i.e. q2' = R q2
5. Convert q2' into yaw-pitch-roll representation, so you can read off the new ra2' = yaw, dec2' = pitch

I will look into quaternions; I have never used them. I need some time to try to implement what you wrote. I really appreciate the effort. Thank you.

Falco Peregrinus

Nov 12, 2024, 5:47:55 AM
to Bangalore Astronomical Society

Actually, I have a similar setup. I have a telescope and a camera on the same platform, with the telescope mounted on an alt-az mount (as I explained in my response above). The issue is that I can’t see where the telescope is pointing through my camera, as it’s outside the camera's field of view.

You’re absolutely right about lens distortion: it’s more pronounced toward the edges of the image, and I’ve been using astrometry with SIP coefficients to correct for it. While I can extend the WCS solution beyond the image, higher-order polynomial extrapolations aren’t very reliable. I mean, I can use the WCS from plate-solving camera1 to transform the RA/Dec of the telescope to (x, y) coordinates (camera1 image coordinates). Then I have the optical axis (OA) of camera1 in pixel coordinates (it is actually the center of the image) and the pointing of the telescope in pixel coordinates, and I can calculate the distance in x and y.
I see two problems there:
1. The distance would change with changing altitude/azimuth
2. As we already said, transformation from RADec/AltAz to pixel coordinates outside of the image is not reliable due to polynomial extrapolation of the distortion.

This is why I’m trying to determine the offset angle(s) between the pointing of Camera1 and the telescope, so I can use the solution from the plate-solved Camera1 and found offset angles to estimate where the telescope is pointing.

Thank you very much for sharing your Git project! At first glance, it looks fantastic. I’ll need some time to understand exactly what you’ve done and to see if I can reuse some of your code. I will probably have some questions about it soon.

Cheers

Akarsh Simha

Nov 12, 2024, 6:36:44 AM
to b-...@googlegroups.com
Regarding extrapolating from the WCS solution (I have not dug into SIP coefficients, but I assume it's some form of polynomial fit), I have seen it "sort of work" even when my scope position is outside the camera FOV, but it's probably a source of error that can be eliminated. My proposal to circumvent this is below:

On Tue, Nov 12, 2024 at 2:28 AM Falco Peregrinus <downsh...@gmail.com> wrote:




So it looks like this is the setup you're considering:

Camera 1 and Camera 2 are mounted rigidly on the same platform, but the platform itself can be re-oriented. So in some sense, if I move Camera 1 30° to the left, Camera 2 also moves 30° to the left because they are on the same rigid platform.

If this is your case, the dAlt / dAz values will not, in general, remain fixed. If your camera mount can rotate along all three axes (think roll, pitch, yaw), then the dAlt/dAz will not remain fixed. If your cameras can only rotate around two axes (pitch and yaw, but no roll), the dAlt/dAz will remain fixed. Not only that, you will need to determine not just ra/dec but also the roll angle along the optic axis (i.e. a position angle) from the plate-solver.

That’s correct. Camera1 and the telescope (actually not a second camera) are mounted on the same platform. The telescope uses an alt-az mount, so it can move left/right and up/down, which I understand correspond to yaw and pitch. I hadn’t heard these terms before (I’m not an engineer), but thank you for the clarification; I’ll look into them.
Since only pitch and yaw are involved, do I still need the position angle? I can obtain the position angle for Camera1 through astrometry. However, for the telescope, I only have approximate RA and Dec coordinates, as there’s no image from it.
Camera1 is tilted relative to the telescope (their optical axes aren’t parallel, so that would be dAlt and dAz), and it is also slightly rotated with respect to the horizon.


Okay, it looks like you're solving the same offset problem that I solved for my plate-solving finder scope. Don't bother trying to understand quaternions. Your dAlt/dAz offset will remain constant if the scope is truly moving in alt/az only. Technically, that's not fully the case: the axes of a real telescope are not orthogonal, and you may not have it fully level on the ground. But for starters you can use dAlt/dAz.

If you want to relax that assumption and not use quaternions, you could try the following:

1. Capture a picture on the camera, plate-solve it to find (RA1, Dec1) and pa1
2. Given the scope's known (RA2, Dec2) corresponding to (RA1, Dec1), i.e. the alignment point, identify the pixel (x, y) that it would correspond to assuming a perfect gnomonic projection (or really any projection of the sphere to a plane should work; I'm picking gnomonic because that's a simple model of a pinhole camera). You could do this as follows:

x = tan(angDist) * cos(angle12), y = tan(angDist) * sin(angle12)

where angDist is the angular distance from the camera RA/Dec to the scope RA/Dec as computed by the haversine law (or you could approximate it using just the Pythagorean theorem, correcting RA by cos(dec), if the offset is expected to be small), and angle12 is the position angle of camera 2's position on the plate of camera 1. For small differences, the latter would simply be atan2(cos(dec1)*(ra2 - ra1), dec2 - dec1) + pa1 or something like that (not sure of my signs), but if you want the full solution you'll need to apply the spherical law of cosines. I haven't checked my algebra, but this is what I got from applying it to the triangle involving the plate center, the scope position, and the NCP:

cos(angle12 - pa1) = [sin(dec2) - sin(dec1) cos(angDist)]/[cos(dec1) sin(angDist)] (eqn 1)

Something more is needed to resolve the ambiguity in inverting the cosine, maybe the spherical law of sines, which gives

sin(angle12 - pa1) = sin(ra2-ra1) cos(dec2) / sin(angDist) (eqn 2).

Again, my signs could be wrong.

Now, this (x, y) value you get will remain fixed; in other words, both the angDist and the bearing of scope w.r.t. camera coordinates (for example, "up" of the camera sensor) as computed in angle12 will remain fixed as the two move around the sky in unison. You can also see why the projection used to derive this conclusion is irrelevant as long as it is unchanged shot-to-shot. You now save (angDist, angle12) as the alignment data.

So at a later stage, to compute (ra2', dec2') you can invert eqn1 and eqn2 along with the haversine law, using the stored angDist / angle12 values and the measured (ra1', dec1') and pa1' values.
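
Here is a minimal numpy sketch of both steps, with estimate() implementing the haversine law plus eqn 1 and eqn 2, and predict() the standard "destination point" inversion of the same triangle. The angles are made-up test values and, as above, the signs may need flipping against real data:

import numpy as np

def ang_dist(ra1, dec1, ra2, dec2):
    # angular separation via the haversine law; degrees in, degrees out
    ra1, dec1, ra2, dec2 = map(np.radians, (ra1, dec1, ra2, dec2))
    h = (np.sin((dec2 - dec1) / 2) ** 2
         + np.cos(dec1) * np.cos(dec2) * np.sin((ra2 - ra1) / 2) ** 2)
    return np.degrees(2 * np.arcsin(np.sqrt(h)))

def estimate(ra1, dec1, pa1, ra2, dec2):
    # alignment step: store (angDist, angle12) from one simultaneous solve
    d = np.radians(ang_dist(ra1, dec1, ra2, dec2))
    ra1, dec1, ra2, dec2 = map(np.radians, (ra1, dec1, ra2, dec2))
    # eqn 1 and eqn 2 give the cosine and sine of (angle12 - pa1)
    c = (np.sin(dec2) - np.sin(dec1) * np.cos(d)) / (np.cos(dec1) * np.sin(d))
    s = np.sin(ra2 - ra1) * np.cos(dec2) / np.sin(d)
    return np.degrees(d), pa1 + np.degrees(np.arctan2(s, c))

def predict(ra1, dec1, pa1, dist, angle12):
    # prediction step: recover (ra2', dec2') from a new camera-1 solve
    d = np.radians(dist)
    theta = np.radians(angle12 - pa1)   # bearing of the scope from north
    ra1, dec1 = np.radians(ra1), np.radians(dec1)
    dec2 = np.arcsin(np.sin(dec1) * np.cos(d)
                     + np.cos(dec1) * np.sin(d) * np.cos(theta))
    ra2 = ra1 + np.arctan2(np.sin(theta) * np.sin(d) * np.cos(dec1),
                           np.cos(d) - np.sin(dec1) * np.sin(dec2))
    return np.degrees(ra2) % 360, np.degrees(dec2)

# round-trip sanity check at the alignment point itself
dist, angle12 = estimate(10.0, 30.0, 5.0, 12.0, 33.0)
print(predict(10.0, 30.0, 5.0, dist, angle12))   # should recover ~(12.0, 33.0)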

Much easier using quaternions.

BTW, please feel free to e-mail me off-list if you have questions about my code.

Regards
Akarsh

Falco Peregrinus

Nov 18, 2024, 4:15:44 PM
to Bangalore Astronomical Society

Thank you very much for your detailed response; I really appreciate it.

I tried to go through your calculations and even implemented them in the code, but I’m not sure I completely understood everything. I’m getting some unexpected results.

If you have some spare time, I would truly appreciate it if you could take a look at my implementation. I’ve shared a sample dataset (in CSV format, around 20 points with altitudes from 20 to 90), the code you can run ("code.py"), and the output I’m currently getting; it is all together in "data.csv". I explain below which columns are which.

Here’s the context:
  1. Cam1 refers to the camera whose image I plate-solve to determine RA and Dec.

    • I use a tweak order of 4 for distortion correction.
    • From the plate-solving process, I also retrieve the orientation (pa1) using the following calculation (output from astrometry.net):
import numpy as np

det = np.tan(header["CD1_1"]) * np.tan(header["CD2_2"]) - \
      np.tan(header["CD1_2"]) * np.tan(header["CD2_1"])
parity = 1.0 if det >= 0 else -1.0
T = parity * np.tan(header["CD1_1"]) + np.tan(header["CD2_2"])
A = parity * np.tan(header["CD2_1"]) - np.tan(header["CD1_2"])
orient = -1 * np.rad2deg(np.arctan2(A, T))
However, I’m not entirely sure if this part is correct.

I am not sure I completely understood your point 2. Given the scope's known (RA2, Dec2) corresponding to (RA1, Dec1), i.e. the alignment point, identify the pixel (x, y) that it would correspond to assuming a perfect gnomonic projection (or really any projection of the sphere to a plane should work; I'm picking gnomonic because that's a simple model of a pinhole camera).

Cam2 refers to the telescope, for which I also have RA and Dec coordinates.

  • Using these coordinates, I calculate the haversine distance and angle12.

Next, I attempt to "go back" to the coordinates of Cam2 by using the previously calculated haversine distance, angle12, and the RA and Dec of Cam1. I call these reconstructed coordinates "predicted."

Finally, I compare the actual Cam2 coordinates with the predicted ones, and the differences are also included in the output.

If you could clarify this part and verify my implementation, it would be immensely helpful.
Here is the link to my dropbox with code.py and data.csv.

https://www.dropbox.com/scl/fo/9kxy59elfzv9rk62wek4f/AFN9b1ZqwqtaCOCTSeSay8w?rlkey=46r81ea4y5sph1b1b9x0g4kwc&st=4rumtl7z&dl=0

Thank you again for your time and expertise!

Cheers

Falco Peregrinus

Nov 19, 2024, 9:54:56 AM
to Bangalore Astronomical Society
I tried to implement your advice about quaternions; could it be done like this?
(I don't have a PA for camera2 (the telescope); that is why q21 is built from only the [0, 1, 0] and [0, 0, 1] rotations.)

import math
from pyquaternion import Quaternion

# PA, Dec, RA
# Camera1 first position is q11, and the second position is q12
# Camera2 (telescope) first position is q21; the second position, which we try to predict, is q22
q11 = Quaternion(axis=[1, 0, 0], degrees=1) * Quaternion(axis=[0, 1, 0], degrees=0) * Quaternion(axis=[0, 0, 1], degrees=0)
q12 = Quaternion(axis=[1, 0, 0], degrees=1) * Quaternion(axis=[0, 1, 0], degrees=90) * Quaternion(axis=[0, 0, 1], degrees=0)

R = q12 * q11.inverse
q21 = Quaternion(axis=[0, 1, 0], degrees=0) * Quaternion(axis=[0, 0, 1], degrees=0)
q22 = R * q21

# unpack the quaternion components (pyquaternion stores [w, x, y, z])
w, x, y, z = q22.q

# Calculate Euler angles (formulas from the Unity post linked earlier)
roll = math.degrees(math.atan2(2 * y * w - 2 * x * z, 1.0 - 2 * y * y - 2 * z * z))
pitch = math.degrees(math.atan2(2 * x * w - 2 * y * z, 1 - 2 * x * x - 2 * z * z))
yaw = math.degrees(math.asin(2 * x * y + 2 * z * w))
print("RA:", yaw, "Dec:", pitch, "PA:", roll)


Cheers

prithvi gautham

Nov 19, 2024, 9:35:55 PM
to b-...@googlegroups.com
@Falco Peregrinus Are you trying to propagate the estimates?


Falco Peregrinus

Nov 20, 2024, 2:11:28 AM
to Bangalore Astronomical Society
@prithvi gautham
If you’re referring to the general idea, my goal is to measure the angular offset(s) between two optical axes: one belonging to my camera, whose images can be plate-solved to determine the RA and Dec of the center of its field of view, and the other belonging to the telescope, for which I also know the RA and Dec at specific altitude angles.

Why? I aim to obtain an independent estimate of the telescope's pointing by combining the plate-solved coordinates from the camera with the previously measured angular offsets (or possibly a pointing model, if the offsets vary).

Does this answer your question?


Cheers

prithvi gautham

Nov 20, 2024, 5:31:37 AM
to b-...@googlegroups.com
Yes, I understand your primary goal. Quaternions usually provide a numerically robust, processor-friendly way to estimate the next step (in your case, the next coordinates), where Euler-angle estimates risk getting locked. If that's not a concern and you would like to start simple, you may first want to get the current error estimate. It may slow things down a bit, but taking a step back gives you a deeper understanding of the problem at hand (I am not aware of your background). If this is already done, then quaternions are the way to go. QUEST (QUaternion ESTimator) is one such already-implemented algorithm, on which you can easily find plenty of material online, with tutorials as well.

Hope it helps.

Cheers
Prithvi

Akarsh Simha

Nov 20, 2024, 8:28:02 PM
to b-...@googlegroups.com
 Hello

On Mon, Nov 18, 2024 at 13:15 Falco Peregrinus <downsh...@gmail.com> wrote:

Thank you very much for your detailed response; I really appreciate it.

I tried to go through your calculations and even implemented them in the code, but I’m not sure I completely understood everything. I’m getting some unexpected results.


Usually a sign mistake, a mistake in converting degrees to radians, or a swapping of sine and cosine (equivalently x and y axes) is likely to be the culprit.

If you have some spare time, I would truly appreciate it if you could take a look at my implementation. I’ve shared a sample dataset (in CSV format, around 20 points with altitudes from 20 to 90), the code you can run ("code.py"), and the output I’m currently getting; it is all together in "data.csv". I explain below which columns are which.


I don’t have a lot of time to spare this month, but I’ll take a look if I can.


Here’s the context:
  1. Cam1 refers to the camera whose image I plate-solve to determine RA and Dec.

    • I use a tweak order of 4 for distortion correction.
    • From the plate-solving process, I also retrieve the orientation (pa1) using the following calculation (output from astrometry.net):
import numpy as np

det = np.tan(header["CD1_1"]) * np.tan(header["CD2_2"]) - \
      np.tan(header["CD1_2"]) * np.tan(header["CD2_1"])
parity = 1.0 if det >= 0 else -1.0
T = parity * np.tan(header["CD1_1"]) + np.tan(header["CD2_2"])
A = parity * np.tan(header["CD2_1"]) - np.tan(header["CD1_2"])
orient = -1 * np.rad2deg(np.arctan2(A, T))
However, I’m not entirely sure if this part is correct.

I’m not familiar enough with the WCS polynomial fit to reason it out here. But I can suggest a simple, albeit slightly less efficient, solution, which is what I have done (I believe).

Before that, a comment on the parity aspect: are you not allowed to assume the parity of your camera? Most astronomical imaging systems should have the same positive parity, unless you’re thinking of a guider that has a pickup mirror or are attaching the camera to a prism diagonal.

In any event, you can always “discover” the parity of your system by checking the sign of the cross product of, say, north and east.

The easy way to find the roll with the WCS would be to take a small displacement from the frame center in declination and use the WCS to map it to a pixel. Then use atan2 to compute the north angle. This will work everywhere except near the poles and unless your FOV is very large. More precisely, if (α, δ) is the center of your frame as reported by the plate solver, compute the point (x, y) corresponding to (α, δ + FOV/10), say. Now all you need to do is atan2(y - h/2, x - w/2) where (w, h) are the width and height of the image. Now if the pole is in the frame, you’ll have to make small displacements. If the pole is close to the center, this idea may not work so well, I imagine. Different points in your frame will have different north angles. It may instead be better to work with a different coordinate system if you expect to hit problems here.
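
A sketch of that displacement trick with astropy's WCS (hypothetical solved file; the "FOV/10" displacement is hard-coded as 0.2 degrees here):

import numpy as np
from astropy import units as u
from astropy.coordinates import SkyCoord
from astropy.io import fits
from astropy.wcs import WCS

header = fits.getheader("cam1_solved.fits")   # hypothetical solved frame
wcs = WCS(header)
w, h = header["NAXIS1"], header["NAXIS2"]

center = wcs.pixel_to_world(w / 2, h / 2)
# displace the frame center slightly toward north and map back to pixels
north = SkyCoord(ra=center.ra, dec=center.dec + 0.2 * u.deg)
x, y = wcs.world_to_pixel(north)
print(np.degrees(np.arctan2(y - h / 2, x - w / 2)))   # the north angle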

If the parity is unknown, you could do the same in RA to find east (or west) and use the relative orientation of east and north to determine parity.


I am not sure I completely understood your point 2. Given the scope's known (RA2, Dec2) corresponding to (RA1, Dec1), i.e. the alignment point, identify the pixel (x, y) that it would correspond to assuming a perfect gnomonic projection (or really any projection of the sphere to a plane should work; I'm picking gnomonic because that's a simple model of a pinhole camera).

Cam2 refers to the telescope, for which I also have RA and Dec coordinates.

  • Using these coordinates, I calculate the haversine distance and angle12.

Next, I attempt to "go back" to the coordinates of Cam2 by using the previously calculated haversine distance, angle12, and the RA and Dec of Cam1. I call these reconstructed coordinates "predicted."


I still think it’s far easier to work with quaternions than with this approach. You can try using a solver like tetra3 instead of astrometry.net, which I believe can directly output a quaternion representing your solve. I’m guessing this is the case because PiFinder (which uses tetra3) uses the roll value to determine how to orient the image of an object on the screen to match the eyepiece.

Once you have the quaternion representation, there are several algorithms for smoothing, interpolation, estimation, whatnot in the literature.

Your problem in general has two steps: firstly, estimation — where you estimate the alignment between the two cameras using the known correspondence between the two. Then there is the prediction step, where you use the previously computed estimate to determine the position of one camera with respect to the other.

Normally, estimation is a continuous process as you get more data points (thinking in terms of Kalman filters for example), but plate solving is so accurate and drift-free that you can just use a single point to estimate unless your demands on precision are exacting. If your accuracy requirement is high, I would think about modeling flexure and temperature related changes, or mitigating them through mechanical means.

I have to look up my earlier email but my comments were largely around estimation and less around prediction.

I’ll see if I can find the time to look into your data and code, it depends on how my other projects and deadlines go this week.