Mid-40 timestamp

darrel...@ncl.ac.uk

Feb 21, 2019, 3:21:47 PM
to Livox LiDARs
Apologies if this is a dumb question.

I am intending to use the Mid-40 for UAV mapping. I have read your integration PDF but remain a little confused. I can see how the PPS output from my Emlid Reach GPS can synchronize the signals, but I do not see how the Livox data receives an actual timestamp. How do we know the time of the first observation, as this is needed for your pseudo-code to work?

I am probably missing something simple!


ano T

Feb 21, 2019, 5:38:52 PM
to Livox LiDARs
I am just learning, so I am not sure this will be great advice, but here goes.
The data comes from the Mid-40 1,000 times a second, i.e. every 1 millisecond. When you get a GPS lat/lon sample it comes with UTC time.
Take the UTC time and add 1 millisecond to it for every received sample until you get your next GPS sample. This should make sure you're always in sync. Hope that helps a little.
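That running count can be sketched in a few lines of Python (illustrative only; `stamp_samples` and the exact 1 kHz rate are my assumptions, not from the Livox docs):

```python
# Sketch of the idea above: stamp each LiDAR sample by counting
# milliseconds from the most recent GPS fix, assuming samples arrive
# at a steady 1 kHz (one per millisecond).

def stamp_samples(gps_fix_utc_ms, samples):
    """Assign a UTC time in milliseconds to each sample received
    after a GPS fix, one millisecond apart."""
    return [(gps_fix_utc_ms + i, s) for i, s in enumerate(samples)]

stamped = stamp_samples(1550763600000, ["pkt0", "pkt1", "pkt2"])
# stamped[2] == (1550763600002, "pkt2")
```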


darrel...@ncl.ac.uk

Feb 21, 2019, 7:25:04 PM
to Livox LiDARs
Thanks, that is what the pseudo-code in the integration PDF suggests. The problem, unless I am being a little thick, is where, or rather when, do you start the sequence? Assuming GPS is sending data continuously but the scanner is only turned on at a specific time, since there is no direct timestamp in the scanner data, when (and therefore also where) exactly did it start? It gets a pulse from the GPS but not a timestamp, as far as I can tell.

Perhaps I am over complicating things? Or perhaps I am just confused :(

ano T

Feb 22, 2019, 9:23:01 AM
to Livox LiDARs
The problem, unless I am being a little thick, is where, or rather when, do you start the sequence?

Maybe I can help by explaining it this way. When your app starts and you are able to get a GPS fix, you cache the current UTC time and GPS position. Your app is always running, so when you hit capture on the LiDAR you have everything you need: the cached time and position plus the LiDAR input, and you start counting milliseconds from there. Does this help?
If you need to know exactly, to the millisecond, when the LiDAR data started, you can wait for the next GPS message to come in before you start capturing LiDAR data.

longh...@qq.com

Feb 22, 2019, 11:25:17 AM
to Livox LiDARs
I think @ano T is right. The Livox Mid-40 can't receive UTC time from the GPS over UART, so time sync has to be done outside the LiDAR. That is how hardware sync works with other sensors too.

darrel...@ncl.ac.uk

Feb 22, 2019, 11:27:37 AM
to Livox LiDARs
Thanks, that does help. I see how that can work from a static position on the ground, although how we precisely synchronise the start of recording with the GPS message I will need to think about (presumably done through some code written using the SDK; this is not a problem, just something I will need to factor in). I guess I am concerned that the area of interest for mapping is likely to be several minutes into the flight time, and moving even at a modest 5 m/s (I have no idea what ground/airspeed will be viable), errors could quickly become significant if the post-process time matching is out of sync.

darrel...@ncl.ac.uk

Feb 22, 2019, 11:28:50 AM
to Livox LiDARs
Yes, I realise that, but I have not done this before and commercial companies do not readily give up their secrets :)

Livox Dev

Feb 23, 2019, 2:46:32 AM
to Livox LiDARs
The "post-processing" time will not go out of sync if the processing delay is less than the PPS cycle.

I think @ano T has made the point; I will just add a few more comments to clarify.

Basically, you need a microcomputer onboard the UAV. The microcomputer, on one hand, connects to the LiDAR to receive data packets (and also timestamps) and, on the other hand, connects to the GPS to receive GPS messages.

There are two facts to notice: (a) each data point carries a timestamp indicating the time relative to the most recent rising edge of the GPS PPS signal received by the LiDAR. This is implemented inside the LiDAR unit. (b) In the meantime, the GPS is sending out the absolute time of exactly the same rising edge (it is the GPS that generates and sends the PPS, so it knows the time of the rising edge).

So now, your program running on the onboard microcomputer knows from (b) the absolute time for the rising edge, and from (a) the time relative to that rising edge of a data packet. Then, the absolute time for the data packet is simply the addition of (a) and (b).

The only problem to solve is to find the right GPS time message (i.e., (b)) for a received data packet (i.e., (a)). To do this, every time you receive a data packet from the LiDAR, check whether its timestamp has increased from the last data packet. If not, it has to have dropped from a value very close to 1 s to a value very close to 0 s, meaning the new data packet's timestamp is referred to a new PPS rising edge (and thus a new GPS time should be used to correct the data packet's timestamp). If yes, the new data packet is referred to the same PPS rising edge (and thus the same GPS time) as the previous data packet. In this way, you can determine which data packets belong to the same PPS cycle. Now, when you receive a GPS time message, it should correspond to the cycle that the most recently received data packet belongs to.

In this process, as long as the microcomputer delay is less than 1 sec (inverse of the PPS frequency), the synchronization will be good. It does not drift over time, does not depend on the movement of the UAV (static or moving), and does not depend on the powering order of GPS or LiDAR. 

So, to answer your question at the very beginning: how to determine the GPS time for the first observation (i.e., data packet). The short answer is, if the first observation arrived after the first GPS time message, you add the time inside the first GPS time message to the timestamp of the first observation. Otherwise, you wait until the first GPS time message comes, subtract 1 from the time inside it, and then add that to the timestamp of the first observation.
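A rough Python sketch of the matching logic described above (my own illustration, not Livox code; `absolute_times` and its inputs are hypothetical names):

```python
# Packet timestamps are seconds relative to the last PPS rising edge;
# GPS messages carry the absolute time of that same edge. A timestamp
# that decreases between consecutive packets signals a new PPS cycle.

def absolute_times(packets, gps_times):
    """packets: relative timestamps (0 <= t < 1, increasing within a
    PPS cycle). gps_times: absolute times in seconds, one per PPS
    cycle, aligned so gps_times[0] is the edge of the first packet."""
    out = []
    cycle = 0
    prev = -1.0
    for t in packets:
        if t < prev:          # timestamp wrapped: a new PPS edge began
            cycle += 1
        out.append(gps_times[cycle] + t)
        prev = t
    return out

packets = [0.2, 0.7, 0.99, 0.01, 0.5]   # wrap between 0.99 and 0.01
print(absolute_times(packets, [100.0, 101.0]))
# approximately [100.2, 100.7, 100.99, 101.01, 101.5]
```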

Let me know if this is clear.

darrel...@ncl.ac.uk

Feb 23, 2019, 12:15:57 PM
to Livox LiDARs
Very many thanks. I think I now understand how to do this, but I guess I will see when I start. I will first try simply to get the pulse from my GPS (Emlid Reach) to the Livox via the TTL-to-RS485 converter (it should arrive early in the week).

Many thanks to everyone.

Darrel

att...@gmail.com

Feb 24, 2019, 9:14:42 AM
to Livox LiDARs
Hey Darrel, sorry to hijack this for a second, but if I'm not interested in having absolute positioning of the point cloud, just the data (relative positioning), would I need to buy an RTK system and install it on the drone/vehicle?

My idea of usage: mount the lidar, connect it to my laptop (vehicle) or Raspberry Pi (drone) and start mapping; stop the sensor, save the point cloud in .lvx format, convert it to .las, and that's all.

Thanks in advance

darrel...@ncl.ac.uk

Feb 24, 2019, 9:52:07 AM
to Livox LiDARs
Hi Dragos,

All data coming from the Mid-40 is relative to the sensor, therefore if your sensor is moving you need to correct for the motion, otherwise you will just have all the data points in one amalgamated cloud, which is not what you want, I assume. The requirement for accurate positioning will depend on how accurate you want your 3D point cloud to be at any given distance. Given the relative cost of an L1 GNSS system I would suggest you use at least this. I will be using post-processed kinematics (PPK), as I have found using radios for RTK unreliable (PPK is in any case better, as you can usually use the highest sampling intervals, e.g. 10 Hz+). Indeed, if you are about to invest, I would look into a dual-frequency (L1/L2) system, as this will give you higher location accuracy (look for those just appearing that use the u-blox ZED-F9P). If you are going to do this on a UAV you will need to correct for roll, pitch and yaw (I do not yet know whether the IMU in the flight controller will be good enough for my purpose). The three data streams need to be synchronized for post-processing (which is where this thread started).

As you will gather I am learning as I go - hopefully I will be able to do the corrections once I have the correct data. :) 

Dragos Ioan Coste

Feb 24, 2019, 10:21:27 AM
to Livox LiDARs
Hi Darrel,

I'm just realising the stupidity of my question :)

Since the sensor doesn't know if it's moving, rotating or translating at all (lacking an IMU), all points would overlap the data from one second ago, and so on, so you'd end up with a mess.

One way out of it would be to use a SLAM solution, something like RTAB-Map.

Another would be a GNSS RTK system, like Emlid's.

The integrated IMU should be precise enough, but I don't know if or how you'd be able to output that data to the lidar or to the compute board.

I think DJI have released a Phantom 4 with RTK, but I'm not sure how good it is.

I would assume Livox would come up with some sort of integration at some point.

Anyway, thanks for taking the time to reply! How is that Emlid mapping kit working out for you? It's not that cheap at $1,124, but I'd look at it seriously if it's good enough...

darrel...@ncl.ac.uk

Feb 24, 2019, 10:28:29 AM
to Livox LiDARs
You don't need the Reach RS, just two Reach+ units; they are much cheaper. Actually, I would currently look at ArduSimple ( https://www.ardusimple.com/store/ ) dual-frequency now; I am sure Emlid will get there at some point this year. I like the Emlid stuff, it is well supported, but I also have limited funds :)

I build my own drones so will not be using DJI controllers; I'm actually using an Emlid Edge for this purpose. The IMU data is embedded in the flight log, so it is not difficult to extract during post-processing, and it is already time-stamped. I suspect you could also get a live stream if you want to do things in real time.

kikiestt...@gmail.com

Feb 26, 2019, 6:58:39 AM
to Livox LiDARs
Darrel, you need a more precise IMU than the cheap one present in a Pixhawk, Reach or Emlid Edge.

Using an expensive Applanix 15 (more than €10k but 16x better than a Pixhawk IMU), or at least a VectorNav VN-200 or uINS (around €2k and 2x better than a Pixhawk IMU), is recommended.

darrel...@ncl.ac.uk

Feb 26, 2019, 7:29:37 AM
to Livox LiDARs
Hi 

Perhaps you can help me out here. I have read a lot of discussion about IMU accuracy but cannot find any definitive answers. Generally, most opinion seems to support the idea that more expensive IMUs are more robust (i.e. suitable for mission-critical operation) and have greater MTBF. The precision claimed for the various units differs very little, however, and I note that Livox are working on a way to use the DJI controller IMU data (the sensors in their controller are certainly not superior to those used in a Pixhawk). With a $600 LiDAR it makes no sense to use an expensive IMU (they will not have much of a market if you have to spend more on the IMU than you do on the drone and the LiDAR combined).

So my question is: on what do you base your relative (i.e. 2x and 16x) performance figures? Surely the need for accuracy will depend on how accurate you want your point cloud to be. As most LiDAR data is subsampled to 1 m resolution, could this influence the choice of IMU? I will, of course, be testing the data I produce using a suitable set of ground control points until I am happy that the cloud is "close enough" for my intended use.

The uINS unit is within my budget possibilities, but I need to be certain there is sufficient added benefit.

Darrel

Sylvain POULAIN

Feb 26, 2019, 8:27:10 AM
to Livox LiDARs
Hi,

Sample rate drives IMU precision; if you want a high sample rate it's expensive. A $600 lidar, maybe, but what about precision and accuracy? Precision and accuracy depend on other parameters, like the GNSS and IMU used with the lidar. So for me it's not stupid to pair a more expensive INS with a $600 lidar!
Darryl Zubot uses a VectorNav VN-200 for traditional photogrammetry (seemingly not SfM) on an ultralight and seems to be happy. Given the price of this lidar module it makes more sense to buy that one or the uINS instead of an Applanix.
The question is why we never use Pixhawk IMU data, even for camera feedback: because it's not accurate, and photogrammetry computes it better than a $10 Pixhawk IMU can. Maybe it could be improved with software, but I doubt it.

Sylvain

darrel...@ncl.ac.uk

Feb 26, 2019, 8:46:07 AM
to Livox LiDARs
Dear Sylvain,

Please do not misunderstand, I have no problem buying a better INS if there is a tangible benefit. I guess if you are going to invest in an Applanix you would probably buy an off-the-shelf, ready-to-roll complete system. Livox, I am assuming, is aimed mainly at enthusiasts (and poor academics like me :) ).

You talk only about sample rates, but the limiting factor here will most likely be the GNSS. While an INS may be able to exceed 100 Hz, I do not know of a low-cost GNSS system that can even get close to that, hence there is going to be a considerable amount of interpolation in any case. For the record, most of the SfM software can use the IMU data, but whether it improves the solution is not clear. I have done it with and without and not found any tangible benefit to having it (but then again I am not looking for anything better than sub-meter accuracy).

Do you know anyone who has actually compared results using different IMU units, or is this opinion based on conventional wisdom? I like to see data; although it may not exist (I have not seen any), I am certainly happy to give this a try, and if the results are terrible then move on to try something better (i.e. more expensive). If you have experience then please share it, with empirical data to back up the argument. Again, I am not saying you are wrong, but I have learnt over the years that simply because someone else says it is so does not make it so :)

Darrel

kikiestt...@gmail.com

Feb 26, 2019, 1:07:05 PM
to Livox LiDARs
Dear Darrel,

I'm an enthusiast too; there are some papers on low-cost IMUs on ResearchGate. It's true I am comparing specs only, but it's proven that the high sample rates and noise filtering present in higher-priced INS/IMU units provide better results.

Maybe the 2x InvenSense ICM-20602 in the Emlid Edge is better; try with what you have at this time. Just look at this: https://github.com/ArduPilot/ardupilot/issues/8211
In one paper you can see that they tried to implement lidar but concluded they would switch to a better IMU: doi:10.5194/isprsarchives-XL-1-W4-195-2015
Even if that was in 2015, let's try with the Emlid Edge, but I am pretty sure you will come to the same conclusion.

An IMU that is not as good or as precise as it should be never improves photogrammetry processing. You also mention GPS, but, for example, an eBee is set at 1 Hz for the RTK corrections in RTCM format. It's not lidar, but between points it's interpolation, and it's the same with lidar on a plane; see slide 9 (GPS at 2 Hz): https://www.umr-cnrm.fr/ecole_lidar/IMG/pdf/Mallet-Topo_Bathy_Veget.pdf


ano T

Feb 26, 2019, 2:00:17 PM
to Livox LiDARs
If you are looking at GPS chips, look at NavSpark. You will need two kits: one for the base and the other for the rover.

Joe

Apr 29, 2019, 3:35:17 PM
to Livox LiDARs
I am still unclear about how to determine the GPS time for the first data packet. Assume the lidar has just received a PPS and sends the first packet after this PPS to the microcomputer. Due to delay, the GPS time of this PPS reaches the microcomputer a little later. Then "GPS time - 1 + time in the packet" is not the exact GPS time of the packet but one second earlier than the time of the packet.

darrel...@ncl.ac.uk

Apr 29, 2019, 3:42:24 PM
to Livox LiDARs
Joe,

I would also like an answer to this.

My plan (which may not work) is to synchronize the onboard computer's clock to GPS time, record the time, send the Livox the start-record command, and then record the PC's GPS time again. If the two times are less than one second apart then I am hoping this will suffice, i.e. I can identify which second the Livox timestamp relates to. If the latency from command request to recording of data is greater than 1 s then I do not know how to resolve this. It is a common problem, however, so I assume a solution exists. Does that make sense to you?

Livox Dev Team

Apr 30, 2019, 2:13:31 AM
to livox-...@googlegroups.com
Hi Joe and Darrel:
Yes, the GPS time of the first data packet cannot be determined because of the delay of the message to the microcomputer. From the second second on, you can just use the "last GPS time" + 1 as the current GPS time each time the timestamp of the data packets clears to near 0 ns.

cats...@gmail.com

Jun 3, 2019, 8:47:15 PM
to Livox LiDARs
The accuracy of an IMU is primarily related to its pitch/roll and heading specs. Those angles amplify the size of the error at a distance. For example, if the pitch/roll spec is, say, 1 degree (VectorNav VN-100) then at 50 meters you have a size error of 0.87 meters, or 87 centimeters: almost a full meter. Twice that (2 degrees on the VN-100) for heading (yaw).

What this means is that as your LiDAR unit is pitching, rolling and yawing (heading), and you use the IMU to correct for those motions, the resulting cloud will suffer by those size inaccuracies at whatever distance you are from what you are scanning. So the smaller those angles are, the better the cloud will be. You CAN calculate how much better they should be (ignoring other problems like boresighting). Get ready to spend north of $10k if you want centimeter-level accuracy at 50 meters.
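The arithmetic above is easy to check for yourself (a small sketch; `size_error` is my name, and the formula is the standard distance x tan(angle) approximation for lateral error):

```python
import math

def size_error(angle_deg, distance_m):
    """Lateral (size) error at a given distance caused by an angular
    error, using the distance * tan(angle) approximation."""
    return distance_m * math.tan(math.radians(angle_deg))

print(round(size_error(1.0, 50.0), 2))   # 0.87 -> ~87 cm for a 1-degree spec at 50 m
```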

Try it yourself with different specs here: (Select "Size" and enter angle and distance)

The higher the sample rate the less jumpiness (or time difference) between actual LiDAR "position" and IMU "position" so this helps as well. 

kikiestt...@gmail.com

Jun 4, 2019, 2:27:48 AM
to Livox LiDARs
Which unit do you recommend? The Applanix 15?

darrel...@ncl.ac.uk

Jun 4, 2019, 9:13:50 AM
to Livox LiDARs
I assume you realise that 1 cm is not possible with the Livox? I have worked with many point clouds, some from very expensive equipment, and none of them could achieve this on a moving platform (and indeed they would struggle on a static platform). The errors in your trajectory file alone would be greater than this, and that is before you consider the limitations of the lidar itself and the IMU errors.

The answer above is just basic common sense, but it does not answer my original question: the stats for the IMUs do not vary that much, so at what resolution of reconstruction will you notice a difference?

For the record I bought the inertialsense sensor but as yet I have not reached the point of comparing it against the sensors in the flight controller.

cats...@gmail.com

Jun 4, 2019, 10:59:42 AM
to Livox LiDARs
I don't know what IMUs you have been looking at, but the specs for the ones I have looked at do vary quite a bit. For example, take pitch/roll degrees within the VectorNav product line: the VN-100 = 1, the VN-200 = 0.1, and the VN-210 = 0.03. So the VN-200 is 10 times (1/0.1 = 10) more accurate than the VN-100, and the VN-210 is 33 times (1/0.03 = 33) more accurate than the VN-100.

The specifics of how those specs translate into problem-solving details depend completely on what problem you face. Are you using the IMU on a car, a boat, a plane, a missile or a UAV? You have to do the math that is specific to what you are doing and what problems you have. If you are on a UAV and you are trying to correct for point placement on the ground 100 meters away, you have a different problem than someone in a submarine trying to stay on course.

So I think the ultimate answer to your question is that you have to do the math for your problem to determine what the accuracy differences ultimately mean to you. So again, if you are in a UAV at 100 meters distance to your target, what level of accuracy do you require? Can you live with less accuracy if you fly at a lower altitude, say 25 or 10 meters? Each of those altitudes has a different resulting error size even though the spec hasn't changed. Only you can decide if that is sufficient in your particular case.

cats...@gmail.com

Jun 4, 2019, 11:12:02 AM
to Livox LiDARs
I have been using the VectorNav products. I started with the VN-100 because I wasn't sure what ultimate level of accuracy I needed when I started, and I wanted an upgrade path that would require the least amount of change as I went up in accuracy. I have not used the Applanix, but I wouldn't rule it out. I don't have all the details about it, but if it does include RTK on board then that would be better than another choice that doesn't. There are other factors as well, such as what is required to assemble a completed product in both hardware and software, and what pieces you already have. It's difficult to recommend any product without knowing a lot more about your specific problem and what you (can) bring to its solution.

darrel...@ncl.ac.uk

Jun 4, 2019, 11:16:13 AM
to Livox LiDARs
I think you took that comment the wrong way; I obviously did not express myself very well.

I think we are agreeing :) It all depends on what you are trying to do. I have no doubt that a more expensive IMU will make better measurements. What I was trying to estimate is the trade-off, given the errors elsewhere in the toolchain. For me, the Inertial Sense IMU is as much as I am prepared to pay given the price of the LiDAR. In practice I sincerely doubt the ability of this unit to achieve better than 50 cm accuracy when flying at multi-rotor UAV altitudes. However, that will be sufficient for my needs.

If you can afford the IMU prices of the VectorNav or Applanix products then you may want to consider a better lidar unit as well.

cats...@gmail.com

Jun 4, 2019, 12:46:01 PM
to Livox LiDARs
Agreed. You must take the system as a whole. The system is never going to be better than the weakest link, so there is no point in paying more for any one link in that chain. However, each chain is different depending on what you are doing. As a different example, the Mid-40 has a distance range of more than twice the Velodyne VLP-16's, but its angular error is greater. So which is better? It depends on what you expect from your LiDAR unit. If you have it in a car and you are interested in detecting objects in front of you while travelling at 100 miles per hour, then the Mid-40 is actually the better unit for the job because of its range, and you don't really care so much about the angular error, as you are more interested in object avoidance, not object identification.

BTW, it can be helpful to look at what others have done with a given set of hardware and what constraints they were working with. For example, if you look at the Mid-40 showcase https://www.livoxtech.com/showcase/1 you will see the end result of the hardware choices and distance constraints they had to work with.

darrel...@ncl.ac.uk

Jun 4, 2019, 12:54:34 PM
to Livox LiDARs
Yes, that showcase uses the same type of hardware chain as I intend to use (except I will use an S900 airframe with a Pixhawk controller and an NVIDIA Jetson Nano for data collection). The results look very promising. I have everything here, just waiting a few days to set it all up properly. Perhaps one day I might be able to afford a higher-resolution IMU :)

cats...@gmail.com

Jun 4, 2019, 1:02:52 PM
to Livox LiDARs
And as you have pointed out, that may not ultimately get you anything better in the end. While a better IMU will have less angular error and thus theoretically give you greater range (at a given size error), the LiDAR unit may not scale to that range because of its angular error. So you may need to upgrade it as well to fully benefit.

cats...@gmail.com

Jun 4, 2019, 1:11:01 PM
to Livox LiDARs
Ultimately what you are buying is precision at a distance, and each element in the chain has to be capable of achieving it. As the distance increases, the cost of maintaining a given level of accuracy goes up with it, usually very dramatically.

Jon H

Jun 4, 2019, 1:28:12 PM
to Livox LiDARs
Darrel, once you get going on the install, please share the results! I'm a hobbyist who hopes to eventually turn this into a grant project for heritage mapping and modelling historic buildings and other structures; thus, I can't yet stomach even buying the Inertial Sense unit until I feel confident in plunking down a decent amount of quid (though not an obscene amount) for this system. I wish there were more instructional materials out there for novices like me, but you seem to know much more about it than I do, so I'd love to learn from your process and lessons learned. I'm also curious whether anyone here knows how to do good sensor fusion, and what software or tools exist for this...

One question about the Inertial Sense IMU: isn't it integrated with a GNSS receiver? Does that mean it can record both sets of data without the need for an additional receiver like the F9P? I'm referring to the unit here: https://inertialsense.com/product/rugged-%C2%B5ins/

darrel...@ncl.ac.uk

Jun 4, 2019, 3:13:00 PM
to Livox LiDARs
Hi Jon,

Yes, happy to share everything, but I don't expect this to be straightforward so it may take me some time. In your showcase thread I note that Livox claim the process of data fusion is straightforward and have clearly successfully processed the data. It is unclear whether they used their own software or a commercial solution. I hope they have their own software, and we should be pressing them to release it. It does not have to be free, but a sensible price (given the cost of their hardware) would get them a lot more sales. I am hoping they will release something affordable. I have sensible money waiting :)

forrests...@gmail.com

Jun 4, 2019, 7:13:32 PM
to Livox LiDARs
I for one do not believe the Livox showcase results. They provide *zero* tangible evidence that their mid-40 platform was the platform that generated those results or that they are capable of successfully post-processing dynamic lvx files... The proof is very much in the pudding.

cats...@gmail.com

Jun 6, 2019, 9:51:22 AM
to Livox LiDARs


On Tuesday, June 4, 2019 at 5:13:32 PM UTC-6, forrests...@gmail.com wrote:
I for one do not believe the Livox showcase results. They provide *zero* tangible evidence that their mid-40 platform was the platform that generated those results or that they are capable of successfully post-processing dynamic lvx files... The proof is very much in the pudding.

Granted, the showcase has some questionable elements and may be heavily post-processed, but I do think it is doable. I recently got my hands on a Mid-40 and within a couple of days was able to capture the following point cloud: http://mypointclouds.com/projects/e8e3d677-d378-490d-88dc-2c03cc19dd1f-Points%20-%20Cloud

This was done inside my little workspace, on a tripod. The Mid-40 was spun slowly through 360 degrees using a servo. No IMU was used. It's upside down (easier mounting). No post-processing was done other than to add the motion. It has some issues, but it does show what should be possible when all the necessary hardware elements are in place and a little additional post-processing is done.

cats...@gmail.com

Jun 6, 2019, 9:57:48 AM
to Livox LiDARs
BTW, I think the circle/halo outside the walls is probably due to the fact that I haven't yet removed the clear protective plastic from the laser opening on the Mid-40. I'll update this thread when I know for sure...

Dragos Ioan Coste

Jun 6, 2019, 10:05:50 AM
to Livox LiDARs
Looks fairly good. How did you program in the circular motion from the servo, and what servo did you use? Was it something you built yourself, or a cheap tripod head like the ones you'd use with a DSLR?

I've also been thinking of using this without GNSS/RTK, with maybe only a cheap IMU, since I don't need absolute referencing; but I was thinking of using SLAM for that (like Zeb Revo does).

cats...@gmail.com

Jun 6, 2019, 10:39:11 AM
to Livox LiDARs
The servo is a simple pan unit I got from Servo City. It's connected to and controlled by a Pololu Maestro.  It's mounted on a regular tripod with some 1/4-20 bolts. It spins the Mid-40 at the slowest speed in one complete 360 degree circle. To apply the motion I simply divide the 360 degrees into the number of points and apply that fraction to each point. This was done inside a modified version of the "lidar_lvx_file" sample. I added the eigen3 library for rotation and saved the result as a simple timestamp,x,y,z,intensity CSV file. I then converted the CSV file to LAS using txt2las. 
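A minimal sketch of that motion correction, assuming a constant pan rate over the capture (`apply_pan` is my illustration, not the poster's actual code):

```python
import math

def apply_pan(points):
    """Distribute one full 360-degree pan across the capture, rotating
    each sensor-frame point (x, y, z) about the vertical axis by its
    fraction of the sweep."""
    n = len(points)
    out = []
    for i, (x, y, z) in enumerate(points):
        a = 2 * math.pi * i / n          # this point's share of the sweep
        ca, sa = math.cos(a), math.sin(a)
        out.append((ca * x - sa * y, sa * x + ca * y, z))
    return out

pts = apply_pan([(1, 0, 0), (1, 0, 0), (1, 0, 0), (1, 0, 0)])
# second point is rotated 90 degrees: approximately (0, 1, 0)
```

In the real capture you would weight the angle by each point's timestamp rather than its index, since packets may not arrive at a perfectly uniform rate.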

SLAM may be an issue with the Mid-40. The problem is that it's difficult to derive motion between frames because of the laser pattern: the inter-frame locations of laser points do not line up as they do in a more conventional LiDAR.

cats...@gmail.com

Jun 6, 2019, 10:47:11 AM
to Livox LiDARs
Oh, and keep in mind that the IMU gives you attitude (or pose) corrections; it does nothing for external position. Without some kind of external positional reference (like GPS) you aren't going to know how far you have moved between locations in space.

cats...@gmail.com

Jun 6, 2019, 11:17:33 AM
to Livox LiDARs
One other point: if you are trying to do something like the Zeb Revo, then external position (or internal position like SLAM) may not be needed at all. You may be able to take separate scans at different fixed locations and register the different point clouds (in something like CloudCompare, or DIY) to create a larger scene. But again, because I haven't tried this, there may be an issue with the unusual laser pattern you get from the Mid-40. Having said all that, since SLAM is essentially inter-frame registration, if you can get larger cloud registration to work you may be able to get SLAM to work as well.

cats...@gmail.com

Jun 6, 2019, 11:28:28 AM
to Livox LiDARs
One other thing I have to say: the Zeb Revo looks like it uses one of the Hokuyo LiDAR units. These are fine units, but they are very short-range (30 m) compared to the Livox units (260 m). So depending on what you are trying to do, one may be more suited to the job than the other.

Jon H

Jun 6, 2019, 11:44:36 AM
to Livox LiDARs
Registration in CloudCompare works through point-to-point matching followed by (I think) the ICP filter. It's actually quite good; I've successfully merged five fixed scans of a scene. SLAM is beyond my capabilities, but I've read somewhere that the Risley-type scan produced by the Livox units isn't ideal for it, and it's better suited to line-based scanners like the Ouster or Velodyne. SLAM also only works in environments with a lot of proximal features (probably not suitable for drone-based scanning if you're doing large landscapes).

Dragos Ioan Coste

Jun 6, 2019, 12:14:42 PM
to Livox LiDARs
A Revo is $40k, although they're using a $400 Hokuyo soldered to some Tegra board. But hey.

I don't need absolute positioning at all, really. I just need to get some castles mapped down to a decent level of detail. I could do photogrammetry, but it would take ages. Also, if SLAM worked, I'd get both point clouds (from lidar + photogrammetry) into Reality Capture and end up with a solid result.

Jon, I've been eyeing Ouster; even at around $4k/unit, I'd be happy to see some comparative reviews between those and Livox.

Anyway, I've broken your initial thread, sorry about that! So I've created a dedicated topic for Livox + SLAM above.

Thanks everyone for chipping in.

Dragos Ioan Coste

Jun 6, 2019, 12:23:25 PM
to Livox LiDARs
Jon, if I have the lidar pointing down at nadir or close to it (90 degrees), with a max altitude of say 70-100 meters, and without anything above the horizon messing with the data capture, then there should be plenty of features and differentiators for the unit to register correctly.

cats...@gmail.com

Jun 6, 2019, 12:38:12 PM
to Livox LiDARs
To be honest with you, SLAM is beyond my capabilities as well, at least from an implementation perspective. However, I'm pretty sure that ICP (Iterative Closest Point) is at least one way to do it; you just have to do it for each frame versus the entire cloud. Of course, you also have to account for attitude changes as well as different surface features, which makes the whole process very complex (and beyond me). As you mention, it's the Risley pattern that causes problems. In a line-based LiDAR (like the ones you mentioned) a frame is one complete azimuth rotation. In a Risley-based LiDAR you don't have that, so you have to arbitrarily (or in a time-limited way) pick a set of points and call that a frame. Then, between those frames, at a given XYZ (or rho, theta and phi if you prefer) point, you attempt to match. Since there will be a distance between those points and some elapsed time, you can derive motion from that. However, since the Risley pattern is not going to give you a corresponding point that has moved uniformly from the previous frame at that position (as a line-based scanner would), deriving motion becomes problematic.
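To make the ICP idea concrete, here is a deliberately tiny sketch of one nearest-neighbour step that estimates pure translation between two frames (my own illustration; real ICP iterates to convergence and also solves for rotation, usually via an SVD of point correspondences):

```python
# One crude ICP-style step: pair each point in the current frame with
# its nearest neighbour in the previous frame, then average the offsets
# to estimate the translation between frames.

def icp_translation_step(prev_frame, frame):
    def nearest(p, cloud):
        return min(cloud, key=lambda q: sum((a - b) ** 2 for a, b in zip(p, q)))
    offsets = [[a - b for a, b in zip(p, nearest(p, prev_frame))] for p in frame]
    n = len(offsets)
    return tuple(sum(c) / n for c in zip(*offsets))

prev = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
cur = [(0.1, 0.0), (1.1, 0.0), (0.1, 1.0)]   # prev shifted by +0.1 in x
print(icp_translation_step(prev, cur))        # approximately (0.1, 0.0)
```

With the Risley pattern the nearest-neighbour pairing is exactly what degrades, since consecutive "frames" do not sample the same directions.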

Yao Xiao

Jul 17, 2019, 11:46:26 PM
to Livox LiDARs
Hi everyone, a dumb question:
Which document does "the integration pdf" refer to? Can anyone please give me a download link?
Thanks!

kikiestt...@gmail.com

Jul 17, 2019, 11:55:28 PM
to Livox LiDARs

Yao Xiao

Jul 18, 2019, 12:08:44 AM
to Livox LiDARs
Thank you very much!

onias.m...@gmail.com

Sep 29, 2019, 8:31:49 PM
to Livox LiDARs

Noli Sicad

Sep 30, 2019, 1:56:21 AM
to Livox LiDARs
Any ROS driver for the OpenIMU300?