geo-tf? geo-slam?

Nick Armstrong-Crews

May 24, 2013, 6:27:12 PM
to ros-sig-geo...@googlegroups.com
Both tf and all ROS SLAM solutions are based on Euclidean coordinate systems. ECEF is one such system for the globe, but numerical precision can become an issue (esp. for noobs) and it becomes a pain to impose simple constraints like "the robot's scanning lidar is ground-parallel." Lat/long and UTM are the formats we're more likely to get out of sensors or geo-referenced databases, but not being Euclidean makes them hard for tf and SLAM. Finally, there's this: http://xkcd.com/977/

Why is this relevant? Let's say I have a plane with a nice GPS/INS taking pictures or lidar scans from 10,000 feet, or even a satellite. It's currently very tough to use this kind of data with ROS, but the tf and SLAM solutions in ROS would still apply.

Thoughts:
* can we track round-off errors from various coord conversions/transformations and add them in naturally as a pose covariance? (see the sketch after this list)
* often, a local translation offset can be used in conjunction with ECEF coords... transformation operations on sets of points could do this under-the-hood and be transparent to user
* is the GPS time offset (15 or so seconds from UTC?) accounted for "naturally" ? ... since tf lookups are all time-based.
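
For the first bullet, a minimal sketch of what that could look like, assuming the error model is just a conservative relative bound (the ~0.1% in-zone UTM distortion figure quoted later in this thread); the function and the isotropic model are illustrative, not an existing ROS API:

import numpy as np

def inflate_covariance(cov_xyz, position_xyz, relative_error=1e-3):
    """Fold a bound on projection/round-off error into a 3x3 position
    covariance, treated as isotropic noise that grows with distance
    from the projection/frame origin."""
    sigma = relative_error * np.linalg.norm(position_xyz)
    return np.asarray(cov_xyz, dtype=np.float64) + (sigma ** 2) * np.eye(3)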

Jack O'Quin

May 24, 2013, 7:58:16 PM
to ros-sig-geo...@googlegroups.com
On Fri, May 24, 2013 at 5:27 PM, Nick Armstrong-Crews <nickarmst...@gmail.com> wrote:
Both tf and all ROS SLAM solutions are based on Euclidean coordinate systems. ECEF is one such system for the globe, but numerical precision can become an issue (esp. for noobs) and it becomes a pain to impose simple constraints like "the robot's scanning lidar is ground-parallel." Lat/long and UTM are the formats we're more likely to get out of sensors or geo-referenced databases, but not being Euclidean makes them hard for tf and SLAM. Finally, there's this: http://xkcd.com/977/

UTM *is* Euclidean. It is a Transverse Mercator projection, valid within a few degrees East or West of a central meridian of longitude.

Almost all of my autonomous vehicle work used UTM. As long as you stay within a grid cell several hundred kilometers across, it works fine. There are no problems transforming vehicle-relative laser scans, etc.
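
To make that concrete, a minimal sketch assuming pyproj >= 2 and the WGS84 ellipsoid (the example point, roughly Austin, TX in UTM zone 14N, is illustrative):

from pyproj import Transformer

# Lat/long (EPSG:4326) to UTM zone 14N (EPSG:32614).
to_utm = Transformer.from_crs("EPSG:4326", "EPSG:32614", always_xy=True)

lon, lat = -97.74, 30.27
easting, northing = to_utm.transform(lon, lat)

# Within a single zone these easting/northing values behave like ordinary
# Euclidean x/y in metres, which is what tf and the SLAM stacks expect.
print(easting, northing)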

Why is this relevant? Let's say I have a plane with a nice GPS/INS taking pictures or lidar scans from 10,000 feet, or even a satellite. It's currently very tough to use this kind of data with ROS, but the tf and SLAM solutions in ROS would still apply.

Thoughts:
* often, a local translation offset can be used in conjunction with ECEF coords... transformation operations on sets of points could do this under-the-hood and be transparent to user

Beware that rviz (and OGRE) do all their graphical 3D computations in 32-bit float, not double. I had problems with that when displaying coordinates in meters even within a single UTM latitude band and MGRS grid cell. For that, you need to define a local region small enough for float calculations.

IIRC, tf does use 64-bit double, so it's not a problem.
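
A quick numeric illustration of the float limit (a NumPy sketch; the northing value is just an example):

import numpy as np

northing = 3825205.37        # a typical UTM northing in metres, as float64
print(np.float32(northing))  # float32 carries ~7 significant digits, so the
                             # value is quantised to about 0.25 m here --
                             # sub-decimetre detail is already gone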
 
* is the GPS time offset (15 or so seconds from UTC?) accounted for "naturally" ? ... since tf lookups are all time-based.

I believe the tf buffers only track information for the previous five seconds, so this could become an issue.
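
A minimal sketch of handling the offset explicitly (the constant and function are hypothetical, not part of tf; GPS time led UTC by 16 s as of mid-2013):

# GPS time runs ahead of UTC by the leap seconds accumulated since 1980;
# this constant must be maintained by hand or taken from the receiver.
GPS_MINUS_UTC_SECONDS = 16.0  # value as of mid-2013

def gps_stamp_to_utc(gps_seconds):
    """Convert a GPS-referenced timestamp to UTC seconds."""
    return gps_seconds - GPS_MINUS_UTC_SECONDS

# If a driver stamps data with raw GPS time while the rest of the system
# uses UTC, every tf lookup is off by 16 s -- far outside a typical tf
# buffer, so lookups would fail rather than silently use stale transforms.
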
--
 joq

Eric Perko

May 24, 2013, 8:08:45 PM
to ros-sig-geo...@googlegroups.com
On Fri, May 24, 2013 at 7:58 PM, Jack O'Quin <jack....@gmail.com> wrote:

On Fri, May 24, 2013 at 5:27 PM, Nick Armstrong-Crews <nickarmst...@gmail.com> wrote:
Both tf and all ROS SLAM solutions are based on Euclidean coordinate systems. ECEF is one such system for the globe, but numerical precision can become an issue (esp. for noobs) and it becomes a pain to impose simple constraints like "the robot's scanning lidar is ground-parallel." Lat/long and UTM are the formats we're more likely to get out of sensors or geo-referenced databases, but not being Euclidean makes them hard for tf and SLAM. Finally, there's this: http://xkcd.com/977/

UTM *is* Euclidean. It is a Transverse Mercator projection, valid within a few degrees East or West of a central meridian of longitude.

Almost all of my autonomous vehicle work used UTM. As long as you stay within a grid cell several hundred kilometers across, it works fine. There are no problems transforming vehicle-relative laser scans, etc.

Why is this relevant? Let's say I have a plane with a nice GPS/INS taking pictures or lidar scans from 10,000 feet, or even a satellite. It's currently very tough to use this kind of data with ROS, but the tf and SLAM solutions in ROS would still apply.

Thoughts:
* often, a local translation offset can be used in conjunction with ECEF coords... transformation operations on sets of points could do this under-the-hood and be transparent to user

Beware that rviz (and OGRE) do all their graphical 3D computations in 32-bit float, not double. I had problems with that when displaying coordinates in meters even within a single UTM latitude band and MGRS grid cell. For that, you need to define a local region small enough for float calculations.

IIRC, tf does use 64-bit double, so it's not a problem.

For reference, the ROS Answers post with more explanation and links to a bug with a workaround is: http://answers.ros.org/question/33624/how-to-view-map-regions-with-large-coordinate-values-using-rviz/

- Eric
 
 
* is the GPS time offset (15 or so seconds from UTC?) accounted for "naturally" ? ... since tf lookups are all time-based.

I believe the tf buffers only track information for the previous five seconds, so this could become an issue.
--
 joq

Nick Armstrong-Crews

May 25, 2013, 6:39:57 PM
to ros-sig-geo...@googlegroups.com
Wait - UTM is not Euclidean! It's a piecewise non-linear projection that maps the ellipsoidal surface of the Earth onto a cylinder. You can't do that without distortion!

But as you say, within a Zone, it's darn close (I think <0.1% error in translation/distance, <0.5% error in orientation). I understand that more serious problems can happen when you cross zone boundaries...

So how would tf calculate a transform between the feet of a person straddling a UTM zone boundary? One foot at 12S 724475mE 3825205mN and the other at 13S 230304mE 3825205mN. Should we define a global tf tree with a coordinate frame at each UTM zone boundary corner? We'd also need logic to look up the nearest parent frame for the robot's moving frame, and users would have to be careful about zone changes: if they only look at the coordinates and not the zone, there will be a huge discontinuity at a zone change.
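
One way to see that the discontinuity lives only in the representation (a sketch assuming pyproj >= 2, WGS84, and reading "12S"/"13S" as MGRS latitude band S, i.e. the northern hemisphere):

from pyproj import Transformer

# Re-express the zone-13 foot in the zone-12 frame so both feet live in
# one Euclidean coordinate system (EPSG:32612 = WGS84 / UTM 12N,
# EPSG:32613 = WGS84 / UTM 13N).
zone13_to_zone12 = Transformer.from_crs("EPSG:32613", "EPSG:32612",
                                        always_xy=True)

left_e, left_n = 724475.0, 3825205.0                                 # zone 12
right_e, right_n = zone13_to_zone12.transform(230304.0, 3825205.0)   # zone 13

# The apparent ~500 km jump in raw easting disappears once both points are
# expressed in the same zone; what remains is the real separation, plus a
# little extra distortion for working outside zone 12 proper.
print(right_e - left_e, right_n - left_n)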

Again, these concerns are probably not relevant for common ROS use cases (indoor turtlebot), but wouldn't it be nice to be able to track the path of a plane from LAX to NYC? Maybe try to do visual odometry / stitching on the way? And view in rviz?

I think ECEF is a reasonable back-end for tf to do calculations, but it may still be worthwhile to consider some sort of automated "local offset" system of coordinate frames, so that rviz and other 32-bit precision computations don't get in trouble.
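
A minimal sketch of what such an automated local offset could look like (class and method names are hypothetical; the datum is simply the first point seen):

import numpy as np

class LocalFrame:
    """Re-express large ECEF/UTM coordinates relative to a local datum so
    that float32 consumers (rviz, OGRE) keep useful precision."""

    def __init__(self):
        self.origin = None  # set lazily from the first point seen

    def to_local(self, xyz):
        xyz = np.asarray(xyz, dtype=np.float64)
        if self.origin is None:
            self.origin = xyz.copy()
        return xyz - self.origin  # small values, safe to hand to float32 code

    def to_global(self, local_xyz):
        return np.asarray(local_xyz, dtype=np.float64) + self.origin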

It should be noted that ECEF is not great for spaceships - the Earth is not *really* in a fixed position for them (the sun is a better reference).

Jack O'Quin

May 25, 2013, 7:45:27 PM
to ros-sig-geo...@googlegroups.com
On Sat, May 25, 2013 at 5:39 PM, Nick Armstrong-Crews <nickarmst...@gmail.com> wrote:
Wait - UTM is not Euclidean! It's a piecewise non-linear projection that maps the ellipsoidal surface of the Earth onto a cylinder. You can't do that without distortion!

But as you say, within a Zone, it's darn close (I think <0.1% error in translation/distance, <0.5% error in orientation). I understand that more serious problems can happen when you cross zone boundaries...

As you say, it's close enough for practical purposes, as long as the limitations are understood.
 
So how would tf calculate a transform between the feet of a person straddling a UTM zone boundary? One foot at 12S 724475mE 3825205mN and the other at 13S 230304mE 3825205mN. Should we define a global tf tree with a coordinate frame at each UTM zone boundary corner? We'd also need logic to look up the nearest parent frame for the robot's moving frame, and users would have to be careful about zone changes: if they only look at the coordinates and not the zone, there will be a huge discontinuity at a zone change.

We are only concerned with outdoor robots in this context.

When dealing with a regional UTM frame that is small enough for rviz and other float tools, there is relatively little extra distortion even if you extend the coordinates across a zone boundary for tens of kilometers into an adjacent grid.
 
Again, these concerns are probably not relevant for common ROS use cases (indoor turtlebot), but wouldn't it be nice to be able to track the path of a plane from LAX to NYC? Maybe try to do visual odometry / stitching on the way? And view in rviz?

I have only considered this issue theoretically, but I do believe it's important.

I have been planning to use lat/long for long-range mapping and path planning, with UTM for regional planning and navigation. For surface vehicles and atmospheric aircraft, that seems the most natural.

I have given no thought at all to spacecraft.
 
I think ECEF is a reasonable back-end for tf to do calculations, but it may still be worthwhile to consider some sort of automated "local offset" system of coordinate frames, so that rviz and other 32-bit precision computations don't get in trouble.

Perhaps so. I had not previously considered it, and I doubt I grasp all the advantages and disadvantages. But they are worth discussing.

I presume that the PROJ.4 library handles ECEF, but we should check to verify that it does.
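
A quick check from Python (a sketch using pyproj, which wraps PROJ; EPSG:4979 is WGS84 3D geographic and EPSG:4978 is the WGS84 Earth-centred, Earth-fixed frame; older PROJ.4 exposes the same conversion as +proj=geocent):

from pyproj import Transformer

# Geodetic lon/lat/ellipsoidal-height to ECEF X/Y/Z in metres.
to_ecef = Transformer.from_crs("EPSG:4979", "EPSG:4978", always_xy=True)
x, y, z = to_ecef.transform(-97.74, 30.27, 200.0)  # lon, lat, height (m)
print(x, y, z)                                      # metres from Earth's centre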
 
It should be noted that ECEF is not great for spaceships - the Earth is not *really* in a fixed position for them (the sun is a better reference).

I suppose one could construe spacecraft navigation as outside the scope of "geographic information". But, surface navigation on other planets or moons probably qualifies, given a suitable ellipsoid definition.
--
 joq