Fixing Z axis to be 0 for robot known to be on the floor


Torchtopher

Jan 18, 2024, 4:59:31 PM
to tagslam
Hi everyone,

I am trying to use tagslam to improve the accuracy of my existing odometry. I have what I think is a good use case for tagslam: I can map all the tags beforehand, the robot is known to be fixed on the floor (Z=0), and there are odometry updates to use. The main potential limitation is that I cannot add more tags to the environment (see taglocations.png).
[attachment: tagslam_odom.png]
So far I have seen results that are very far off (the robot only moves on a plane) and am not sure what I am doing wrong. Here is my config: https://github.com/FRC900/2023RobotCode/tree/tagslam/zebROS_ws/src/tagslam_launch/config_hw_localize
[attachment: robot.jpg]
Some things I think I might be doing wrong: I don't understand what T_body_odom is. My camera is near the top of the robot but the odometry comes from the wheels, so maybe I need the transform from the center of the robot to the camera?
I can provide some bags if they would be helpful. I don't really understand why the odometry is jumping around so much, and in 3D, especially because the wheel odometry alone is more accurate.


Bernd Pfrommer

Jan 18, 2024, 5:37:10 PM
to tagslam
Hard to say what's wrong without data. Please send me a bag with images and odometry.
In situations like this you want to first get the localization just from tags working.
That your distortion coefficients are all zero is suspect. Where did you get that calibration from?
As far as the transform for the odom is concerned: whatever your source of odometry is, when that algorithm publishes an odometry update, that odometry update is with respect to a certain coordinate system on your robot. Could be the center between the wheels. To find out, rotate your robot in place. If it's the center, then your odometry updates will not have a translation component. Now move the robot straight forward to see what (x,y) direction is forward. That should give you the location and orientation of the odom coordinate system (assuming it's only 2d odom).
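
A minimal sketch of that rotate-in-place check in Python (ROS1): read the odometry from a bag recorded while the robot spins in place and watch the reported translation. The topic name "/odom" and the bag filename are assumptions, substitute your own.

```python
# Sketch: check where the odom frame sits by watching the reported translation
# while the robot spins in place. Topic name and bag file are placeholders.
import rosbag

bag = rosbag.Bag("spin_in_place.bag")
for topic, msg, t in bag.read_messages(topics=["/odom"]):
    p = msg.pose.pose.position
    # If the odom origin coincides with the center of rotation,
    # x and y should stay near zero during the spin.
    print(f"{t.to_sec():.3f}  x={p.x:+.3f}  y={p.y:+.3f}")
bag.close()
```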

How to find the proper transform? I have never used tagslam with wheel-based odometry, only with VIO, in which case the reference frame for the odom is typically the IMU, and the IMU-to-camera transform is already known from previous calibration steps.
In your particular case I would use a calibration board or a single big tag and affix it flat on the ground (hopefully your robot can see it). Then run tagslam to get the camera pose with respect to that tag, and use a tape measure to measure from the center of the tag to the center of your robot. Now you can calculate the tag-to-odom transform, and you already know the tag-to-camera transform, which allows you to compute the odom-to-camera transform.
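
To make the transform chaining concrete, here is a small Python sketch with 4x4 homogeneous matrices. The numeric values are placeholders (assumptions), not real measurements: the tag pose in the odom frame would come from the tape measurement, the camera pose in the tag frame from tagslam.

```python
# Sketch of the transform chaining: T_odom_cam = T_odom_tag * T_tag_cam.
# All numbers below are placeholders.
import numpy as np
from scipy.spatial.transform import Rotation as R

def make_tf(rot, trans):
    """Build a 4x4 homogeneous transform from a scipy Rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = rot.as_matrix()
    T[:3, 3] = trans
    return T

# Tag pose in the odom frame, from the tape measurement:
# tag lying flat on the floor, 1.2 m in front of the odom origin (placeholder).
T_odom_tag = make_tf(R.identity(), [1.2, 0.0, 0.0])

# Camera pose in the tag frame, as reported by tagslam (placeholder).
T_tag_cam = make_tf(R.from_euler("xyz", [0.0, -90.0, 0.0], degrees=True),
                    [0.0, 0.0, 0.8])

# Chain them: pose of the camera in the odom frame.
T_odom_cam = T_odom_tag @ T_tag_cam
print(T_odom_cam)
```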

But send me a short bag file and I'll try to wing it. The odom is probably assuming x-axis forward and is centered between the wheels. I also need to know the tag size *for certain* (measure twice!).



Torchtopher

Jan 18, 2024, 7:12:04 PM
to tagslam

Hi Bernd,


First off, thanks so much for the (insanely quick) response.

Here is a bag we recorded: https://drive.google.com/file/d/1ZA8WobeOa8IAlF9f_tmSnK8wo8T-FRVL/view?usp=sharing (it is probably best to rerun the tag detections, since running them live doesn't keep up with the camera frame rate).


The distortion coefficients were obtained from the camera info (/zed_objdetect/left/camera_info in the bag) of an already-rectified image. We also have the non-rectified image (/zed_objdetect/left_raw/camera_info), which has non-zero distortion coefficients. The camera being used is https://www.stereolabs.com/products/zed-2
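
For reference, a small Python sketch for double-checking the two camera_info topics in the bag (the bag filename is a placeholder): the rectified topic should report all-zero distortion coefficients, the raw one should not.

```python
# Sketch: print the distortion model and coefficients from the first message
# on each camera_info topic in the bag. Bag filename is a placeholder.
import rosbag

topics = ["/zed_objdetect/left/camera_info",      # rectified image
          "/zed_objdetect/left_raw/camera_info"]  # raw image
bag = rosbag.Bag("robot_run.bag")
seen = set()
for topic, msg, t in bag.read_messages(topics=topics):
    if topic in seen:
        continue
    seen.add(topic)
    print(topic)
    print("  model:", msg.distortion_model, " D:", list(msg.D))
    if len(seen) == len(topics):
        break
bag.close()
```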


We checked the odometry while spinning in place, and there was no translation. We also believe X and Y to be correct: X is forward and Y is to the left.


We believe the TF tree for odom->camera is correct; it should be the transform between base_link and zed_objdetect_camera_center. We have the camera-to-odometry transform from CAD and it's currently in camera_poses.yaml; is that correct? We will also run the method with the tag flat on the floor to determine the camera transform and make sure the two agree.
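
To compare against camera_poses.yaml, here is a small sketch (assuming a running ROS1 system with our TF tree being broadcast) that prints the currently published base_link -> zed_objdetect_camera_center transform.

```python
# Sketch: look up and print the base_link -> zed_objdetect_camera_center transform
# so it can be compared against the pose stored in camera_poses.yaml.
import rospy
import tf2_ros

rospy.init_node("check_camera_tf")
buf = tf2_ros.Buffer()
listener = tf2_ros.TransformListener(buf)  # keeps the buffer filled

tf_msg = buf.lookup_transform("base_link", "zed_objdetect_camera_center",
                              rospy.Time(0), rospy.Duration(5.0))
tr, q = tf_msg.transform.translation, tf_msg.transform.rotation
print("translation:", (tr.x, tr.y, tr.z))
print("rotation (xyzw):", (q.x, q.y, q.z, q.w))
```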


Just to confirm, the tag size is the width of the black border? That is what we have measured: 0.161 m.


Thanks for your help! We would be more than happy to provide other data.

Bernd Pfrommer

Jan 18, 2024, 10:31:11 PM
to tagslam
- your calib was good
- you only had tags 7 & 8 in the data set, but neither of them in tagslam.yaml. I guess the tagslam.yaml was for a different data set?
- you didn't use the odom at all (I suppose you were still testing without it), so I added it.
- I will send you a tar file with the config I used to get this picture. It will also contain a README.
- you can dial in how much you want to rely on the odometry by tweaking the odom_rotation/translation_noise parameters in tagslam.yaml. The noise is the error assumed between frames (see the comments in tagslam.yaml).

Make sure you test the heck out of this before you use it. The problem with tagslam (any mapping/place recognition, in fact) is that it produces jumps in your trajectory when it doesn't see tags for a while, the odom drifts, and then it sees a tag again. Typically people run the *controller* based on odometry, but the *planner* based on SLAM. Robot controllers do funky stuff when they get jumps in position. If your planner gets a jump, it can replan and send updated waypoints to the controller, so the planner can tolerate a jump. Just don't feed jump data to the controller...
[attachment: from_the_top.png]

Torchtopher

Jan 21, 2024, 2:09:26 PM
to tagslam
Thanks for all of your help! TagSLAM seems to be working on our robot!

We had a few more questions about how to integrate tagSLAM data into our pathing. Our current setup involves planning the path once beforehand and then following it using our wheel odometry (250 Hz). We were wondering how tagslam would best be used in a setup like this, where we don't replan our path and follow it using odom (not relative to the map frame). I tried hacking up a script to publish empty tag detections at 100 Hz to lean on the odometry more, but that ended up with tagslam publishing more slowly and complaining about high graph errors compared to just running it normally on the camera detections. The bag this was tested on is here: https://drive.google.com/file/d/1lF0X8LBdk7u3kN9C7dKQcLh1Q_LxUCp8/view?usp=drive_link (the robot drives backwards at decently high speeds).

Ideally we want something where we can follow tagSLAM-influenced odometry, but I am not sure if that is possible. We also experimented with a Kalman filter to fuse our wheel and tagslam odometry, but saw somewhat jerky results; we could explore that more. We can also try to get faster tag detections: we are working on https://github.com/FRC900/2023RobotCode/tree/tagslam/zebROS_ws/src/deeptag_ros which might help with running tagslam faster.

Any ideas you have about how we should approach this in general would be most appreciated.

Again, thank you for your help so far! It means a lot to us.

(Sorry if you already got a message similar to this; I think my first one didn't send.)