Fwd: Placement of a Lidar on a mobile robot


Pito Salas

unread,
Jul 28, 2022, 5:00:42 PM7/28/22
to hbrob...@googlegroups.com
Hi all,

When it comes to mounting a lidar on a robot, what are some of the considerations?

My robot runs ROS, and I've mounted the Lidar and set up the various transforms ("tf"s) correctly, I believe.

When I display the Lidar data while moving the robot forward and backward, it is fairly stable. In other words, I think the odometry data reporting the motion of the robot correctly "compensates" for the movement, so that the lidar data as displayed stays more or less in the same place.

However, when I turn the robot in place, the Lidar data drifts a little and then compensates somewhat. I was also able to create a decent SLAM map with this setup, although not as reliable as I would like.

It turns out that the lidar is mounted near the casters, so in an in-place turn the lidar doesn't simply rotate in place but moves a lot: it is not over the center of rotation; in fact it's about as far away from it as it could be.

My question is: does the math used to compute the lidar data during an in-place turn compensate for the placement? Does it use the various transforms (which reflect the relative placement of the lidar), or does it just use the odometry of the robot as a whole?

(Hard to explain, but I hope you follow)

Yes, I could rebuild my robot to do the more intuitive thing and place the lidar over the center of the turn. But I would want to avoid all that work if it's not really needed.


Pito Salas
Faculty, Computer Science
Brandeis University


camp .

unread,
Jul 28, 2022, 5:07:04 PM7/28/22
to hbrob...@googlegroups.com
Sweet robot, Pito!  :] 

--
You received this message because you are subscribed to the Google Groups "HomeBrew Robotics Club" group.
To unsubscribe from this group and stop receiving emails from it, send an email to hbrobotics+...@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/hbrobotics/8CBCC9A4-3003-4862-AAC9-BBB62E14FAC9%40gmail.com.

Michael Ferguson

unread,
Jul 28, 2022, 5:33:01 PM7/28/22
to hbrob...@googlegroups.com
TF should negate any physical offsets - but it really depends on the SLAM package using TF correctly. The widely used ones (karto, cartographer, slam_toolbox) all should do this.

That said, you might also have timing issues - which TF won't handle (since the timing reported to it will be wrong!). If your odometry or laser are lagging/etc relative to one another, that could cause some issues.
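A minimal sketch of what that transform chain does, in plain 2D math (not actual tf2 calls; the frames and all numbers are hypothetical): a beam seen by an off-center lidar lands at the same odom-frame point as one seen by a centered lidar, once the laser→base offset is part of the chain.

```python
import math

def laser_hit_in_odom(robot_pose, laser_offset, beam_angle, beam_range):
    """Chain laser->base_link and base_link->odom transforms for one beam.
    robot_pose = (x, y, theta) of base_link in odom;
    laser_offset = (x, y, yaw) of the laser in base_link."""
    # Point in the laser frame
    lx = beam_range * math.cos(beam_angle)
    ly = beam_range * math.sin(beam_angle)
    # Laser frame -> base_link
    ox, oy, oyaw = laser_offset
    bx = ox + lx * math.cos(oyaw) - ly * math.sin(oyaw)
    by = oy + lx * math.sin(oyaw) + ly * math.cos(oyaw)
    # base_link -> odom
    rx, ry, rtheta = robot_pose
    wx = rx + bx * math.cos(rtheta) - by * math.sin(rtheta)
    wy = ry + bx * math.sin(rtheta) + by * math.cos(rtheta)
    return wx, wy

# A wall point 2 m ahead of a robot at the odom origin. A lidar mounted
# 0.15 m behind base_link measures a longer range (2.15 m), but the
# odom-frame point comes out identical once the offset is in the chain.
hit_centered = laser_hit_in_odom((0, 0, 0), (0.0, 0, 0), 0.0, 2.0)
hit_offset = laser_hit_in_odom((0, 0, 0), (-0.15, 0, 0), 0.0, 2.15)
```

This is exactly why, with correct tfs and timestamps, the physical placement of the lidar washes out.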

-Fergs

Pito Salas

unread,
Jul 28, 2022, 6:36:14 PM7/28/22
to hbrob...@googlegroups.com
Note that the effect is visible without SLAM. In rviz, as the robot moves forward, the "image" of the wall or obstacle stays more or less in place relative to the odom frame. However, if I rotate or turn, the image moves around, and when the rotation stops it settles back down.

Is the timing problem exacerbated by having the lidar offset from the center of rotation? I could imagine that if the lidar is over the center of rotation, the timing is less critical... but I'm just going by gut feel.

When you watch the physical robot rotate in place, the lidar follows a pretty wide circular arc around the center of rotation. You can easily imagine that this causes havoc with the calculations that produce the /scan topic.

My big question is: is it worth rearranging things to bring the lidar back to the center of rotation? I think my answer is yes.

Thanks for your insights and knowledge, as always!


Pito Salas
Faculty, Computer Science
Brandeis University

Michael Ferguson

unread,
Jul 28, 2022, 6:50:31 PM7/28/22
to hbrob...@googlegroups.com
Almost every real robot out there has the laser offset from the center of the robot, so it's pretty much a solved problem (with TF and proper drivers giving correct timestamps).

If the timing is wrong, the change in offset really doesn't help much, since small angular errors result in LARGE x/y offsets when a scan point is several meters away from a robot (and this will be the majority of your error, far larger than the small offset from your non-centered laser).
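That back-of-the-envelope is easy to check: the lateral displacement of a scan point is roughly range × heading error. A sketch with hypothetical numbers:

```python
import math

def lateral_error(range_m, heading_error_deg):
    """Approximate sideways displacement of a scan point caused by a
    heading error (e.g. odometry sampled at the wrong timestamp)."""
    return range_m * math.radians(heading_error_deg)

# A robot turning at 90 deg/s whose odometry and scan are mistimed by
# 50 ms carries a 4.5 deg heading error; at 5 m range that is ~0.39 m,
# dwarfing a few centimeters of physical lidar offset.
err = lateral_error(5.0, 90 * 0.050)
```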

Your comment that "when the rotation stops it settles back down", really makes me think it is a timing related issue. One way to get a better idea is to go into RVIZ, and set the "decay time" of the laser scanner display to something like 60 seconds. Then do your driving around - this will layer all the scans on top of each other and give you a better idea of how accurate the odometry/laser relationship is. In particular - if you just do a bit of rotation, does the final scan when you stop rotating line up with the first scan before you started rotating? If so, it's almost certainly timing related.

-Fergs


Pito Salas

unread,
Jul 28, 2022, 7:09:18 PM7/28/22
to hbrob...@googlegroups.com
Such good advice, thanks!

And if there is a timing problem, what causes it and how can I address it? I'm guessing that the timestamps on corresponding tf messages don't line up; in other words, given real-world time t, the tf message for the lidar would be t +/- some error and the one for odometry would be t +/- some other error.

But... how would that happen?

Pito Salas
Faculty, Computer Science
Brandeis University

Cheuksan Wang

unread,
Jul 28, 2022, 7:28:28 PM7/28/22
to hbrob...@googlegroups.com

The sensor, network, and ROS all add delays to the data. In industry,
we add a per sensor constant to each ROS timestamp. We find the
constants by temporal calibration. You can learn about it from this
video:

https://www.coursera.org/lecture/state-estimation-localization-self-driving-cars/lesson-3-sensor-calibration-a-necessary-evil-jPb2Y

Michael Wimble

unread,
Jul 28, 2022, 11:26:53 PM7/28/22
to hbrob...@googlegroups.com
There are a few parts to the answer. And I’ll mostly deal with what happens in a two wheel, differential drive robot.

It’s not clear what you mean by LIDAR data drifting. If, as I suspect, you mean visualizing the LIDAR with rviz, then what you may be seeing is the averaging effect. In rviz, the display component for LIDAR has an option to include the last “N” points, as I recall, showing up as the Decay Time parameter for the plugin. So, the LIDAR data is likely just fine, and you’re seeing the effect of history disappearing over time. Crank down the Decay Time parameter to, say, 1 and see if things look better.

Otherwise, the problem with LIDAR usually only shows up in something like a SLAM algorithm, and then it’s because the odometry and LIDAR disagree with each other. And this usually has nothing to do with LIDAR, which is typically pretty truthful.

In SLAM, the algorithm gets two competing truths (typically), odometry and LIDAR. And, besides the normal problems with wheel odometry, which have been bitched about repeatedly in this group, there is also the geometry problem that typically shows up in the motor driver.

Whether using Gazebo (simulation) or real motor drivers, both typically rely on knowing the distance between the two wheels and the wheel circumference. With those two constants and the wheel encoder counts (plus some constants such as how many encoder ticks there are per revolution), simple math computes how far the left and right wheels moved during some small change in time, which in turn allows computing the angle of rotation for that same time period.
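That arithmetic can be sketched directly (plain Python with hypothetical constants; real drivers read these from configuration):

```python
TICKS_PER_REV = 1440        # encoder ticks per wheel revolution (hypothetical)
WHEEL_CIRCUMFERENCE = 0.20  # metres (hypothetical)
TRACK_WIDTH = 0.30          # distance between the wheels, metres (hypothetical)

def odom_step(left_ticks, right_ticks):
    """Forward distance and heading change for one small time slice."""
    d_left = left_ticks / TICKS_PER_REV * WHEEL_CIRCUMFERENCE
    d_right = right_ticks / TICKS_PER_REV * WHEEL_CIRCUMFERENCE
    d_center = (d_left + d_right) / 2.0          # forward motion of base_link
    d_theta = (d_right - d_left) / TRACK_WIDTH   # rotation, radians
    return d_center, d_theta

# Equal ticks: straight line, no rotation.
ds, dth = odom_step(144, 144)
# Opposite ticks: in-place rotation, no forward motion.
ds2, dth2 = odom_step(-144, 144)
```

Note how a wrong circumference scales d_center (the forward-motion error described below), while a wrong track width scales d_theta (the rotation error).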

You can really see when the circumference is wrong by moving the robot back and forth with SLAM happening and the resulting map showing. If you find that the map shows a wall while the robot is moving and the LIDAR shows that what it thinks is the wall changes as the robot moves, then you have the wheel circumference constant wrong. 

That is, with a wall ahead of the robot, move the robot in a straight line forward. The map wall will stay in place in the rviz map display, but if the LIDAR dots corresponding to the wall move closer or farther than the map wall while movement is happening, the wheel circumference constant is wrong. Just adjust your wheel circumference until moving forward and backward shows the LIDAR points corresponding to the wall ahead of the robot staying atop the map's wall.

An error in the distance between the two wheels shows up when the robot rotates. For example, with a wall ahead of the robot again, rotate in place: if the wall stays stable in the rviz map display but the LIDAR points corresponding to the wall change the wall's angle as the robot rotates, then you have the inter-wheel distance wrong. Change that value, usually in the URDF and often repeated in other YAML files as well, and try the experiment again.

My saying that the wall in the map stays stable also implies that in rviz you are setting your global frame to the map frame.

When you get both the circumference and the inter-wheel distance correct, the SLAM map will closely match the LIDAR points even while the robot moves. It won't be perfect while the robot moves, for reasons I'll partly explain next, but it does mean that when the robot slows down or stops, SLAM will fix the error very quickly.

I’m sure people will talk about the time stamp of data as well. In ROS, sensor data is given a time stamp. This should faithfully record when the sensor reading took place. If you have multiple computers in your robot, the sense of time on all computers must match within, at most, say a couple of milliseconds. Less than a millisecond discrepancy between all the computers is better. 

Note that if you are getting sensor data from an Arduino over, say, ROS serial, you will have to deal with getting the time stamp adjusted before the sensor data gets posted as a ROS message. Since the Arduino didn't tag the time, and is unlikely to know the time to high accuracy, you have to figure out the latency in reading the data over a serial port, plus the serialization, deserialization, and message conversion delays.

When various components work with sensor data, they often predict what the sensor values will be in some future after the readings actually took place. That’s because the sensor values are coming in with some delay (lag), and the algorithms, especially SLAM, want to predict the current state of the robot. SLAM is monitoring the commanded movement of the robot (the cmd_vel sent to the motors) and when odometry and LIDAR data comes in, it needs to predict where the odometry and LIDAR points would be NOW, not when they were last observed. If the timestamp of the sensor data is wrong, the guess is wrong and SLAM gets very sloppy.

To cope with bad time and unexpected latencies, especially the notorious big scheduling delays in a Linux system (think of what preemptive time sharing really does to your threads when they think they know what the current time is), one simple filter ROS usually provides is to ignore any data that is too "old". There are configuration parameters, for instance for SLAM, that say to just ignore data older than some relatively small time from now (on the order of, say, 10 milliseconds might be typical).

This is especially a problem with multiple computers in a robot, but it occurs even with a single computer. Remember that Linux is trying to run on the order of 100 threads at the same time. On a Raspberry Pi, it's trying to give each of those 100 threads the illusion of running in real time by giving each a tiny slice of time before going on to the next thread. A thread can ask "what time is it now?" and, as soon as it gets the answer, before the very next instruction in that thread executes, tens or hundreds of milliseconds may have gone by.

Finally, as for positioning of the LIDAR, ROS provides a marvelous mathematical modeling package in the TF system. If you have pretty good modeling of fixed components, like LIDARs mounted solidly to the robot and you get the offsets correct to within a millimeter or two from 0,0,0 in the base_link frame of reference, you needn’t worry about where you put the LIDAR. Make sure you correctly account for all the rotations as well, though. For instance, with the casters on my robot, the plate holding the LIDAR isn’t exactly level with the floor, and my URDF needs to include the rotation of the LIDAR.

My LIDAR is NOT mounted at the center of rotation and rviz shows things just fine. And I’m about to replace my single LIDAR with 4 LIDARS mounted at the 4 corners of my robot’s frame at different heights. It will be no problem for rviz and SLAM to deal with this.

Finally, there is an issue with computation speed versus robot speed. The faster your robot moves, the faster it needs to get sensor data. SLAM works best when it gets, say, 20 to 100 sensor readings a second for a robot moving on the order of a couple of meters per second; SLAM wants the robot to not have moved very far between sensor frame readings. If your odometry and LIDAR readings are coming in at, say, 5 frames per second, your computer is slow and loaded (such as trying to do everything on a single Raspberry Pi), and your robot is moving at the equivalent of a couple of miles per hour, all bets are off.

Chris Albertson

unread,
Jul 29, 2022, 2:01:09 AM7/29/22
to hbrob...@googlegroups.com
Actually, this is a very good question because:
  1. Some robots use multiple LIDAR units, and obviously only one of them can be at the center point of a turn; and
  2. If the robot uses four-wheel "Ackermann steering" (as almost every car does), the center of rotation is not within the footprint of the robot, but on the ground some distance to the side of it.
So mounting the Lidar at the center of rotation is physically impossible in those cases. I hope the software "works".

It could be that you have wheel slip.  In fact I'd guess this is the problem and you should not be using raw odometry but rather the ROS Robot_Localization package.    This package will "fuse" odometry, IMUs, GPS, Visual Odometry and other sources to get the robot's position and orientation and account for wheel slip and other sensor errors. 
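robot_localization itself uses an EKF/UKF internally; just to illustrate the flavor of fusing a gyro with another heading source, here is a much simpler complementary filter on heading (an illustration only, not what the package does, and all constants are hypothetical):

```python
def fuse_heading(theta_prev, gyro_rate, dt, theta_odom, alpha=0.98):
    """Complementary filter: trust the gyro's integrated rate for
    short-term changes, and the odometry heading to bound drift."""
    return alpha * (theta_prev + gyro_rate * dt) + (1 - alpha) * theta_odom

# Wheels slipping (odometry heading stuck at 0) while the gyro reports a
# real 0.5 rad/s turn: over 0.2 s the fused heading still tracks most of
# the true motion (~0.08 rad of the true 0.1 rad).
theta = 0.0
for _ in range(20):            # 0.2 s of turning, sampled at 100 Hz
    theta = fuse_heading(theta, 0.5, 0.01, 0.0)
```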



--

Chris Albertson
Redondo Beach, California

dpa

unread,
Jul 29, 2022, 1:25:05 PM7/29/22
to HomeBrew Robotics Club
The "drift" you describe is caused by the fact that, as the robot rotates, the lidar is not sampled at exactly the same spot. This causes flat surfaces to appear to bend, but only while the robot is rotating.
Chris Netter has documented this extensively and written code that corrects for this effect on his ROS robot, which I can't seem to locate at the moment.

But it won't be fixed by moving the lidar to the center of rotation. It is a result of the fact that both the robot and the lidar are rotating while sampling.
It's not seen when the robot is going straight, hence the "settling down" which you mention.

Hope this helps.
dpa

Chris Albertson

unread,
Jul 29, 2022, 1:57:47 PM7/29/22
to hbrob...@googlegroups.com
But Pito says he is using software to compensate for the movement of the LIDAR during a turn. In theory, what he did should work. The ROS tf2 package should be calculating the LIDAR's real-world location as it swings around the center of rotation and interpolating its position at the time the LIDAR scan was processed.

He said he was using this already. For more on tf2 see http://wiki.ros.org/tf2

My guess is that the odometry data driving the transforms tf2 uses is not accurate. Odometry assumes a perfect differential drive system, and none are perfect.

One way to test my guess would be to run the robot in a simulation where odometry is in fact "perfect" and see if the problem goes away.

dpa

unread,
Jul 29, 2022, 3:27:26 PM7/29/22
to HomeBrew Robotics Club
Thanks.   I read the link, which seems to have to do with correcting for offsets of the various sensors from the robot's "center."  Unless I missed it I saw no references to corrections for the lidar rotation + robot rotation errors while in motion that he is experiencing.   Can you clarify?

His description does match what we've seen before with lidar mapping.

dpa

Pito Salas

unread,
Jul 29, 2022, 3:27:54 PM7/29/22
to hbrob...@googlegroups.com
This mailing list and club is amazing. That experienced people are willing to really get in deep, to teach us subtleties and techniques is really awesome. Thank you to all. 

I took the liberty of throwing this thread onto my class web site (which is public, but no one knows about it). There are no names or attributions. This is to capture it for myself and make it easier for someone else to learn from. I am asking your permission after the fact: if you have any concern that your words appear there, let me know and I will remove them ASAP.

Thanks again!


Pito Salas
Faculty, Computer Science
Brandeis University

Michael Wimble

unread,
Jul 29, 2022, 3:31:38 PM7/29/22
to hbrob...@googlegroups.com
I’m good with my post. Let me know if you need my address to share publication royalties 😀


Chris Albertson

unread,
Jul 29, 2022, 4:49:45 PM7/29/22
to hbrob...@googlegroups.com
It's not magic. You have to use it to transform points in the point cloud into the real-world frame. If the points are time-tagged, you can use the transformation that was relevant at the time each point was scanned; tf2 has some memory of past motion.

As said.  I think running in a simulation would sort this out.



dpa

unread,
Jul 29, 2022, 5:08:36 PM7/29/22
to HomeBrew Robotics Club
Hi

I think I've described the phenomenon at work here poorly. I've sent an email to Chris Netter and will post his link if he gets back to me.

The "magic" involved arises from the fact that when the robot and the lidar are both rotating, the rotations add.

So when the robot is rotating in the same direction as the Lidar, the Lidar is actually rotating slightly faster than designed,  in reference to the real-world environment.   

Similarly, when the robot is rotating in the opposite direction to the Lidar's rotation, the Lidar is rotating slightly slower than designed, again with reference to the real-world environment.

This results in the readings being offset slightly from their expected locations, so flat surfaces like walls tend to bend and curve away from their actual locations until the robot stops rotating.
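The size of this effect is easy to estimate: over one sweep, the last beam is referenced to a heading that differs from the first beam's by roughly (robot turn rate × sweep time). With hypothetical numbers:

```python
def sweep_warp_deg(lidar_hz, robot_rate_deg_s):
    """Heading change of the robot over one full lidar sweep, i.e. the
    angular 'warp' between the first and last beam of the scan."""
    sweep_time = 1.0 / lidar_hz      # seconds per 360-degree sweep
    return robot_rate_deg_s * sweep_time

# An RPLidar-class unit sweeping at 7.5 Hz on a robot turning at
# 60 deg/s: the scan is smeared by 8 degrees end to end.
warp = sweep_warp_deg(7.5, 60.0)
```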

Chris has some data and plots that makes this more obvious, along with his fix.   Let me dig around a bit and see if I can find the video of his actual presentation.

cheers!

Chris Albertson

unread,
Jul 29, 2022, 9:59:32 PM7/29/22
to hbrob...@googlegroups.com
This article gives a walkthrough on how to handle an off-center LIDAR. I knew I had read this, and it was clear; Google "re-found" it for me.

Actually, I've read more of what this guy writes. He is VERY clear.

Joep Suijs

unread,
Jul 30, 2022, 1:49:12 AM7/30/22
to hbrob...@googlegroups.com
Maybe the issue is that ROS doesn't timestamp each lidar point, only the set of 360 points? A TF using a single pose for such a set won't be accurate while moving.
You probably don't record the pose at each point-scan either, but you could correct individual points by interpolating both. If this is the issue...

Joep

On Fri, Jul 29, 2022 at 11:08 PM dpa <dav...@smu.edu> wrote:

Chris Albertson

unread,
Jul 30, 2022, 2:16:52 AM7/30/22
to hbrob...@googlegroups.com

On Fri, Jul 29, 2022 at 10:49 PM Joep Suijs <jsu...@gmail.com> wrote:
Maybe the issue is ROS doesn't timestamp each lidar point but a set of 360 points?

This is easy to look up.
Notice that in the comment they specifically say this is to handle the case of a moving sensor, so that the location of the sensor can be determined for each point.

So the next question: does the software bother to make use of this data? We don't know what Pito did, so the answer could be yes or no. This is explained here.
I think it is clear you want to use the "complex projection" and not the simple one.

Then you still need good localization; odometry is not that good during turns, so it's better to use ROS localization to fuse it with an IMU.

Alan Federman

unread,
Jul 30, 2022, 12:26:49 PM7/30/22
to hbrob...@googlegroups.com, Pito Salas
A couple of thoughts about SLAM on small robots.

SLAM as done on research or commercial vehicles involves multiple sensors and large CPU capacity. It isn't realistic to think that an RPi 4 or even a Jetson Nano can integrate lidar, optical, and odometry data to do SLAM, obstacle avoidance, and people detection simultaneously and in real time.

Even using multiple processors would be problematic, as network lag becomes a factor.
For small robots, perhaps SLAM in an unmapped space isn't the best approach.

Chris N

unread,
Jul 30, 2022, 3:14:25 PM7/30/22
to HomeBrew Robotics Club
My attempt at solving this problem, which I refer to as "warping", is here:   nettercm/lidar_dewarping: Experiments with de-warping lidar scans (github.com)

In my experience ROS does not magically compensate for the distortion / warping introduced by a moving (and especially rotating) lidar - at least not adequately.

tf / tf2:   
This will of course take care of compensating for the location of the lidar on your robot base. As others have stated, the lidar does not need to be centered. It will also take care of the coordinate transforms so that you know at what x,y (e.g., within the world/map coordinate frame) each individual "hit" of the laser scan occurred. However, as I understand it and from what I have observed, the assumption is made that the entire 360-degree laser scan takes place instantaneously; i.e., the "exposure time", if you will, is zero/negligible. The faster your lidar spins, the more valid this assumption becomes. An RPLidar A1 spinning at its nominal 7.5 Hz has an "exposure time" of about 130 ms.

SLAM:
I have used only gmapping and amcl so far.  I have not seen evidence that those packages compensate for the distortion.  In fact, they get totally confused by it, which is precisely the reason why I started looking for a solution at one point. 

laser_geometry:
This package does supposedly address exactly this problem.  It has a "high fidelity projection" function that transforms the Laser Scan into a point cloud while - supposedly - compensating for movement that has occurred while the scan was taken.   I have not been able to get this to work.   I did take a 2nd look at this not too long ago (i.e. some time last year), but I don't recall my exact findings. 


My de-warping:
My "de-warping" code takes as input the laser scan and odometry and produces as output a new laser scan (on a different topic name) plus some other output for diagnostic and visualization purposes.

It works by converting the laser scan into a point cloud, and then applying a transform to every point.

Laser scan samples are coming in at a rate of about 2000 per second (in batches, i.e. one batch per 360-degree scan).  Odometry is coming in at 100Hz on my robot.  Therefore, I perform a bit of linear interpolation to determine the pose of my robot at the exact time that a given laser sample (i.e., one ray, one measurement) was obtained (see note below about possible error and uncertainty in this timing).
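That interpolation step can be sketched as follows (a simplified illustration in plain Python, not the actual lidar_dewarping code; it assumes the per-beam time offset within a scan is known):

```python
import math

def interp_pose(t, odom):
    """Linearly interpolate an (x, y, theta) pose at time t from a list
    of timestamped odometry samples [(t, (x, y, theta)), ...]."""
    (t0, p0), (t1, p1) = odom[0], odom[-1]
    for a, b in zip(odom, odom[1:]):
        if a[0] <= t <= b[0]:
            (t0, p0), (t1, p1) = a, b
            break
    f = (t - t0) / (t1 - t0) if t1 > t0 else 0.0
    return tuple(p0[i] + f * (p1[i] - p0[i]) for i in range(3))

def dewarp(scan, scan_t0, beam_dt, odom):
    """Re-express each (angle, range) beam as an odom-frame point using
    the robot pose interpolated at that beam's own timestamp."""
    points = []
    for i, (ang, rng) in enumerate(scan):
        x, y, th = interp_pose(scan_t0 + i * beam_dt, odom)
        world_ang = th + ang
        points.append((x + rng * math.cos(world_ang),
                       y + rng * math.sin(world_ang)))
    return points

# Robot spinning in place at 1 rad/s: two beams taken 0.1 s apart both
# hit the same wall point; after de-warping they land at the same
# odom-frame location instead of being smeared apart.
odom = [(0.0, (0.0, 0.0, 0.0)), (0.2, (0.0, 0.0, 0.2))]
pts = dewarp([(0.0, 2.0), (-0.1, 2.0)], 0.0, 0.1, odom)
```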

As you can see in the pictures on my GitHub page, there is noticeable improvement.  Unfortunately, the improvement did not seem to be good enough to avoid confusing the SLAM packages.  For example, there are some strange artifacts in the output that I could not explain.

I can think of a couple of reasons why I'm not able to clean up the distorted / warped laser scan as much as what I think theoretically should be possible:

1) My odometry data is not good enough. It is coming in much more slowly, and my encoders are not very high resolution. (I think I at one point changed this to use the orientation provided by my BNO085 instead of the orientation calculated via wheel odometry, but I don't think there was tangible improvement.)

2) I am not adequately accounting for (angular) acceleration.   

3) Imprecise timing of the lidar scan.   The lidar driver provides one 360deg scan at a time, each containing hundreds of samples.   There is a timestamp associated with this scan, i.e. presumably corresponding to the time of the first sample.   There could be a fixed offset error.  There could be jitter.


Cheuksan Wang

unread,
Jul 30, 2022, 7:53:16 PM7/30/22
to hbrob...@googlegroups.com
You're on the right track. There is a circular dependency here: Proper
LiDAR "warping" depends on accurate localization. However, accurate
SLAM depends on an accurate "warped" point cloud.

State-of-the-art methods do LiDAR "warping" and SLAM simultaneously.
BTW, IMU is better than odometry for angular acceleration.

You can watch a video here:
https://www.youtube.com/watch?v=8ezyhTAEyHs

LOAM is a ROS module for 3D LiDARs.
https://github.com/laboshinl/loam_velodyne

Dave Everett

unread,
Aug 1, 2022, 12:44:47 AM8/1/22
to hbrob...@googlegroups.com
I have my lidar and IMU mounted over the centre of the differential drive wheels to minimise the drift.

Dave

Chris N

unread,
Aug 2, 2022, 11:50:41 PM8/2/22
to HomeBrew Robotics Club
Hi Pito,

We are probably all curious to find out what your next steps will be after receiving all this input.

One more suggestion from my end on this - low tech  / keep-it-simple style :

Because my de-warping logic didn't work as well as I hoped (and also because it's poorly written in Python and a real CPU hog, even on a Pi 4), I have been using the following approach more:

I run the original laser scan through a filter which throws out any scans that have been taken while the angular speed is beyond some threshold.   It's basically a ROS node that subscribes to a laser scan topic and to odometry, and which publishes laser scan data on a different topic.

I have found that the SLAM packages get less confused when they receive scans with intermittent gaps (gaps of up to ~ 2 seconds - or however long my robot is making a fast turn) , compared to receiving scans that are warped / distorted.
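That filter amounts to a few lines; a sketch (plain Python rather than an actual ROS node, with a hypothetical threshold):

```python
MAX_ANGULAR_SPEED = 0.5  # rad/s, hypothetical gating threshold

def gate_scan(scan, angular_speed):
    """Pass the scan through unchanged when the robot is turning slowly;
    drop it (return None) when the turn is fast enough to warp the scan."""
    if abs(angular_speed) > MAX_ANGULAR_SPEED:
        return None
    return scan

kept = gate_scan([1.0, 1.1, 1.2], 0.1)     # slow turn: scan forwarded
dropped = gate_scan([1.0, 1.1, 1.2], 2.0)  # fast turn: gap in the stream
```

In a real node, the subscriber would republish `kept` scans on the filtered topic and simply publish nothing while `None` is returned, producing the intermittent gaps described above.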

Chris.


Pito Salas

unread,
Aug 3, 2022, 10:24:07 AM8/3/22
to hbrob...@googlegroups.com
Hi Chris,

I agree totally with your keep-it-simple philosophy. At this point I am digging in and rechecking all my tfs to make sure they are correct. I am seeing a systematic error in the map/AMCL alignment, which is suspicious:



And I saw this also which I am trying to explain and is probably related:


Notice that the ODOM arrow is offset to the left from the scan and base links.

With regard to the spacers, here's a picture of the robot:


I am going to try the min_obstacle_distance approach if the vertical spacers turn out to be the problem. Notice from this picture that there are in fact 6 spacers, and that the lidar (as mentioned before) is near the very *back* of the robot. But a more careful measurement shows that the farthest two spacers are 25 cm away from the lidar and the others are closer.

Which brings up a question: is the "min_obstacle_distance" parameter relative to the lidar or the robot as a whole?

Anyway, that's a quick update. I would love your comments on the offsets in my amcl/map and the odom arrow on the little screenshot!

Best,



Pito Salas
Faculty, Computer Science
Brandeis University

Michael Wimble

unread,
Aug 3, 2022, 11:22:05 AM8/3/22
to hbrob...@googlegroups.com
In rviz, you can expand either the tf plugin or the robot description and see all the frames of reference. I like to look via robot description, expand the frames, and you can see the x-y-z, rotation of each link from the URDF to see if it agrees with what you expect.

On Aug 3, 2022, at 7:23 AM, Pito Salas <pito...@gmail.com> wrote:

Hi Chris,

I agree totally with your keep in simple philosophy. At this point I am digging and rechecking all my tfs to make sure they are correct. I am seeing a systematic error in the map/AMCL alignment which is suspicious:

<Screen Shot 2022-08-02 at 7.17.40 AM.png>


And I saw this also which I am trying to explain and is probably related:

<Screen Shot 2022-08-02 at 6.56.30 AM.png>

Notice that the arrow which is the ODOM arrow is offset to the left from the scan and base links. 

With regard to the spacers, here's a picture of the robot:

<IMG_1407.jpeg>

I am going to try the min_obstacle_distance approach if the vertical spacers seem to be the problem. Notice from this picture tbat in fact there are 6 spacers, and that the lidar (as mentioned before) is near the very *back* of the robot. But a more careful measurement shows that the farthest two spacers are 25 cm away from the lidar and the others are closer.

Which brings up a question: is the "min_obstacle_distance" parameter relative to the lidar or the robot as a whole?

Anyway, that's a quick update. I would love your comments on the offsets in my amcl/map and the odom arrow on the little screenshot!

Best,



Pito Salas
Faculty, Computer Science
Brandeis University

On Aug 2, 2022, at 9:54 PM, Chris N <nette...@gmail.com> wrote:

Hi Pito,

We are probably all curious to find out what your next steps will be after receiving all this input.

One more suggestion from my end on this - low tech  / keep-it-simple style :

Because my de-warping logic didn't work as well as I hoped (and also because its poorly written in Python and a real CPU hog even on a Pi 4), I have been using the following approach more:

I run the original laser scan through a filter which throws out any scans that have been taken while the angular speed is beyond some threshold.   It's basically a ROS node that subscribes to a laser scan topic and to odometry, and which publishes laser scan data on a different topic.

I have found that the SLAM packages get less confused when they receive scans with intermittent gaps (gaps of up to ~ 2 seconds - or however long my robot is making a fast turn) , compared to receiving scans that are warped / distorted.

Chris.






Pito Salas

unread,
Aug 3, 2022, 12:02:59 PM8/3/22
to hbrob...@googlegroups.com
You just made me think of something... In my rviz.launch I have the following lines to set up the robot model. I never looked into what exactly robot_state_publisher and joint_state_publisher do. But if they are publishing static transforms from the info in the urdf file, and I also have explicit static transform launches in other launch files, and the parameters don't match exactly (which they don't), then that could be causing havoc in tf and throwing things off. I don't have time right now to check, but it might be an important clue...

# rviz.launch, run on the 'remote' computer
<launch>
<!-- Robot URDF definition -->
<arg name="urdf_file" default="$(find xacro)/xacro '$(find platform)/urdf/platform.urdf'"/>
<param name="robot_description" command="$(arg urdf_file)
distance:=true
pi_camera:=true
lds:=true
imu:=true "/>

<!-- Send joint values -->
<node pkg="joint_state_publisher" type="joint_state_publisher" name="joint_state_publisher">
</node>
<!-- Combine joint values to TF-->
<node name="robot_state_publisher" pkg="robot_state_publisher" type="robot_state_publisher"/>

<node name="rviz" pkg="rviz" type="rviz" args="-d $(find platform)/rviz/slam_setup.rviz" required="true" />
<!-- (required = "true") if rviz dies, entire roslaunch will be killed -->

</launch>

# static_transforms.launch run on the robot itself:
<launch>
<!-- Publish all the static transforms in one place so we can make sure they are all
     correct and consistent. Usage is
     static_transform_publisher x y z yaw pitch roll frame_id child_frame_id -->

<node pkg="tf2_ros" type="static_transform_publisher" name="base_footprint_to_base_link" args="0.0 0.0 0.045 0.0 0.0 0.0 base_footprint base_link"/>
<node pkg="tf2_ros" type="static_transform_publisher" name="base_link_to_camera" args="0.11 0.07 0.04 0.0 1.0 0.0 base_link raspicam" />

<node pkg="tf2_ros" type="static_transform_publisher" name="base_footprint_to_imu_link" args="-0.14 0 0.0 0 0 0 base_link imu_link"/>

<node pkg="tf2_ros" type="static_transform_publisher" name="base_link_to_scan_link" args="-0.17 0.0 0.1 0.0 0.0 0.0 base_link scan_link"/>

</launch>
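To check whether the URDF and my static transforms actually disagree, a few lines of Python can diff them. A sketch, with a hypothetical URDF fragment inlined — the real script would read platform.urdf instead:

```python
import xml.etree.ElementTree as ET

# Hypothetical fragment standing in for platform.urdf.
URDF = """
<robot name="platform">
  <joint name="scan_joint" type="fixed">
    <parent link="base_link"/>
    <child link="scan_link"/>
    <origin xyz="-0.17 0.0 0.1" rpy="0 0 0"/>
  </joint>
</robot>
"""

# x y z from the static_transform_publisher line for base_link -> scan_link
LAUNCH_XYZ = (-0.17, 0.0, 0.1)

def urdf_joint_xyz(urdf_text, child_link):
    """Return the origin xyz of the joint whose child link is child_link."""
    root = ET.fromstring(urdf_text)
    for joint in root.iter("joint"):
        if joint.find("child").get("link") == child_link:
            return tuple(float(v) for v in joint.find("origin").get("xyz").split())
    raise KeyError(child_link)

print(urdf_joint_xyz(URDF, "scan_link") == LAUNCH_XYZ)  # True when the two agree
```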



Pito Salas
Faculty, Computer Science
Brandeis University

Michael Wimble

unread,
Aug 3, 2022, 1:02:14 PM8/3/22
to hbrob...@googlegroups.com
You should probably never have two sources of transform publishers for the same joint. Try to do things only in the urdf, if possible. I wonder if the ros2 wtf analysis tool would have pointed out the problem. 

On Aug 3, 2022, at 9:03 AM, Pito Salas <pito...@gmail.com> wrote:


Pito Salas

unread,
Aug 3, 2022, 2:33:37 PM8/3/22
to hbrob...@googlegroups.com
I find editing URDF a real pain. Does anyone have a trick or a tool to make it less painful?

Pito Salas
Faculty, Computer Science
Brandeis University

Steve " 'dillo" Okay

unread,
Aug 5, 2022, 8:02:17 AM8/5/22
to HomeBrew Robotics Club
I just started experimenting recently with a URDF editor/viewer plugin for JupyterLab (what was previously called Jupyter Notebook).
It's just been released (so there are bugs), and it has a bit of an install procedure, but it's worth it for being able to view and edit your URDF in one window.
You still edit the URDF by hand in a text editor, but at least you don't have to keep launching and stopping Gazebo or RViz every time you want to look at your changes.

I haven't had a chance to write up a full HOWTO, but you install Robostack in a conda virtual env:
https://github.com/RoboStack/ros-noetic

Then JupyterLab-ROS:

Then finally JupyterLab-URDF:

HTH,
'dillo

Michael Wimble

unread,
Aug 5, 2022, 10:31:19 AM8/5/22
to hbrob...@googlegroups.com
I use Visual Studio Code, god's gift to programmers, with an XML and URDF plugin.

On Aug 5, 2022, at 5:02 AM, Steve " 'dillo" Okay <espre...@gmail.com> wrote:



Jim DiNunzio

unread,
Aug 14, 2022, 10:01:32 PM8/14/22
to hbrob...@googlegroups.com, Pito Salas

My robot Big Orange is the size of a small trash can. It can do SLAM using lidar and odometry, obstacle avoidance using a depth camera and/or sonar, and people detection using a stereo AI camera, all simultaneously and in real time. The hardware is a LattePanda Windows SBC as the main computer, an STM32 microcontroller, a proprietary Linux-based SLAM core-card module from SLAMTEC, and two Luxonis Oak-D stereo AI cameras, one for depth and one for running Tiny YOLO or other vision algorithms.

 

See all the videos of Big Orange here: https://www.youtube.com/playlist?list=PLmb0WjGtyZZTXYqRmrmorwfBD2mkl4XiM

 

Jim

Chris Albertson

unread,
Aug 14, 2022, 11:48:56 PM8/14/22
to hbrob...@googlegroups.com, Pito Salas
Is there a reason you cannot consolidate the sensors? Maybe there are two depth cameras because they need to point in different directions?

I have an Oak-D Lite camera on a robot that is not big enough for multiple cameras.  I'm hoping to do an "N-way" fanout and send the same data to multiple processes.
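The fanout itself doesn't need anything camera-specific; a generic one-producer/N-consumer pattern with queues would look something like this (a sketch — frame grabbing from the Oak-D would replace the fake producer, and multiprocessing queues have the same shape if you really need separate processes rather than threads):

```python
import queue
import threading

def fan_out(frames, consumer_queues):
    """Put each frame reference onto every consumer's queue."""
    for frame in frames:
        for q in consumer_queues:
            q.put(frame)
    for q in consumer_queues:
        q.put(None)  # sentinel: no more frames

def consumer(name, q, results):
    count = 0
    while (frame := q.get()) is not None:
        count += 1  # stand-in for depth / NN processing on the frame
    results[name] = count

queues = [queue.Queue() for _ in range(3)]
results = {}
threads = [threading.Thread(target=consumer, args=(f"proc{i}", q, results))
           for i, q in enumerate(queues)]
for t in threads:
    t.start()
fan_out(range(10), queues)   # 10 fake frames in place of camera output
for t in threads:
    t.join()
print(results)  # every consumer should have seen all 10 frames
```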



Jim DiNunzio

unread,
Aug 15, 2022, 4:53:58 PM8/15/22
to hbrob...@googlegroups.com, Pito Salas
Yes, you could consolidate, but the camera limits which configuration settings are compatible with the NN models you want to use, has limited processing power, etc. I have one Oak-D for depth sensing mounted in a fixed orientation lower on the robot, below the tray, running depth-map publishing only. The Oak-D for AI vision is mounted on a servo-controlled pan/tilt rig and can point anywhere, looking for people or objects.

Jim

On Aug 14, 2022, at 8:49 PM, Chris Albertson <alberts...@gmail.com> wrote:

