Copter precision landing in master for IRLock on Pixhawk


Randy Mackay

Aug 31, 2015, 5:09:46 AM
to drones-...@googlegroups.com, Thomas Stone, bra...@irlock.com

 

     I’ve pushed the precision landing using IRLock into master. I’m sure everyone here has seen the blog posts by Thomas & Brandon Stone.

               http://diydrones.com/profiles/blogs/safe-landings-in-tight-spaces

    ..and I successfully reproduced this a few months ago:  https://www.youtube.com/watch?v=2z14S_a6bNk

 

     At the moment, the precision landing only kicks in when the IRLock sensor is healthy and the vehicle is landing in the regular GPS-assisted LAND flight mode (i.e. not RTL, not Auto). Also, if the pilot overrides the roll or pitch inputs, the vehicle stops using the IRLock sensor for that landing.

 

     The precision landing will go out with AC3.4 so we have some time to fix up other issues including:

·         Create a wiki page showing how to connect the IRLock sensor to the Pixhawk, although I think some of that info is already on the IRLock web page (http://irlock.com/).

·         Implement velocity control (currently it uses a position controller).

·         Extend the precision landing to RTL and Auto.

 

-Randy

Randy Mackay

Aug 31, 2015, 8:50:34 AM
to drones-...@googlegroups.com, Thomas Stone, bra...@irlock.com

 

    Here’s a wiki page with details on the setup in case anyone wants to give it a try.

       http://copter.ardupilot.com/wiki/precision-landing-with-irlock/

 

-Randy

Thomas Stone

Aug 31, 2015, 12:03:26 PM
to drones-discuss, tho...@irlock.com, bra...@irlock.com
Excellent! The wiki looks great. Also, there are extra details regarding the sensor and IR markers in the IR-LOCK docs (although they make many references to the old AC3.2.1).

Randy:
Extending to RTL and Auto should not be too difficult for us to do, but I guess the first step is the velocity control development. You have probably explained this before, but I don't think I have a good understanding of the velocity control.... The position control uses the IR target to compute a new position in the x-y plane, and then shifts the desired GPS coordinates to that location (relative to the current vehicle coordinates). I assume that velocity control will be significantly different from the position control strategy?

-Thomas

Randy Mackay

Sep 1, 2015, 4:02:21 AM
to tho...@irlock.com, drones-...@googlegroups.com, Leonard Hall, bra...@irlock.com

Thomas,

 

     Leonard is the controls expert but I think we should take the 2D earth-frame angle to the target from the precision landing library and convert it into a 3D velocity request.  To do that conversion we can use the pi_precland 2-axis PI controller (class is here: https://github.com/diydrones/ardupilot/blob/master/libraries/AC_PID/AC_PI_2D.h).  This object is already declared in the code but not used.

 

     So say the target is 10deg North and 10deg East: we would take that (10,10) and pass it into pi_precland.set_input(Vector2f(10,10)). Then we would call pi_precland.get_PI() to retrieve a 2D vector which we can use for the north and east velocities. For the downward velocity we could just use the g.land_speed parameter (normally 50cm/s). So now we have a 3-axis desired velocity which we want to command the vehicle to fly.
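A minimal sketch of that conversion, with a plain proportional term standing in for the real AC_PI_2D controller (the function name, the gain value, and the stand-in vector types are all illustrative, not ArduPilot's):

```cpp
// Minimal stand-ins for ArduPilot's vector types.
struct Vector2 { float x, y; };
struct Vector3 { float x, y, z; };

// Hypothetical sketch: turn the 2D earth-frame angle-to-target (degrees,
// North/East) into a 3-axis desired velocity (cm/s, NED frame, z positive
// down). A single P term stands in for pi_precland's PI controller; kP and
// land_speed_cms are illustrative tuning values, not ArduPilot defaults.
Vector3 angles_to_desired_velocity(Vector2 angle_to_target_deg,
                                   float kP, float land_speed_cms)
{
    Vector3 vel;
    vel.x = angle_to_target_deg.x * kP;  // North velocity request
    vel.y = angle_to_target_deg.y * kP;  // East velocity request
    vel.z = land_speed_cms;              // descend at g.land_speed (~50cm/s)
    return vel;
}
```

With the (10,10) example and a hypothetical kP of 5, this requests 50cm/s North and 50cm/s East while descending at the normal landing speed.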

 

     Using the velocity controller is quite different from using the existing LAND mode’s loiter controller, so we should create new landing functions called “land_precision_init” and “land_precision_run”, similar to the Guided mode’s velocity controller functions “guided_vel_control_start” and “guided_vel_control_run”.

            https://github.com/diydrones/ardupilot/blob/master/ArduCopter/control_guided.cpp#L83

            https://github.com/diydrones/ardupilot/blob/master/ArduCopter/control_guided.cpp#L256

 

     Within these new functions we would call the pos_control.set_desired_velocity() with the 3-axis desired velocity vector from above.

 

     In order to call these new land_precision_init and land_precision_run functions, we need to make some changes to the land_init() and land_run() functions so they call them when the vehicle has GPS and a healthy precision landing sensor.
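The dispatch described above could be sketched roughly like this; `select_land_controller` and the state struct are hypothetical names invented for illustration, only land_init()/land_run()/land_precision_run() come from the message:

```cpp
// Illustrative dispatch for land_run(): fall through to the precision
// landing controller only when both GPS and the precision-landing sensor
// are usable; otherwise keep the existing LAND behaviour.
struct LandState {
    bool gps_ok;            // vehicle has a usable GPS position
    bool precland_healthy;  // precision landing sensor is healthy
};

enum class LandController { Loiter, PrecisionVelocity };

LandController select_land_controller(const LandState &s)
{
    if (s.gps_ok && s.precland_healthy) {
        return LandController::PrecisionVelocity;  // land_precision_run()
    }
    return LandController::Loiter;                 // existing LAND behaviour
}
```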

 

-Randy

--
You received this message because you are subscribed to the Google Groups "drones-discuss" group.
To unsubscribe from this group and stop receiving emails from it, send an email to drones-discus...@googlegroups.com.
For more options, visit https://groups.google.com/d/optout.

Thomas Stone

Sep 2, 2015, 1:46:07 AM
to drones-discuss, tho...@irlock.com, leonar...@gmail.com, bra...@irlock.com
Randy, 

Thanks for the details. I remember that Brandon spoke with Leonard a while back. Thanks!

If I understand correctly, the advantages of the velocity control strategy are:
(1) It doesn't require a distance-from-target input (i.e., altitude, in most cases). 
(2) Control performance (after sufficient development), and new tuning capabilities

However, I also think there are some advantages for the position control strategy. The advantages are closely related to the limitations of target detection (e.g., machine vision):
(1) If the vision sensor loses sight of the target, the last-known position estimate is very useful. The copter will move to that position, and either (a) re-acquire the target, or (b) loiter/land at that last-known position. 
(2) Vision sensors typically do not provide input at a consistent rate. For example, a '50Hz vision sensor' may detect a long-range target at only ~1Hz, and then ~50Hz at close-range. The velocity controller may (or may not) be more sensitive to this variability. 

We have received positive feedback from UAV developers regarding the performance of the 3.2.1 precision landing code (position control). The negative responses typically refer to a 'landing offset', which can be as high as ~30cm. 

Let me know what you think. I am probably missing some key points here. 

-Thomas

Randy Mackay

Sep 2, 2015, 2:15:01 AM
to drones-...@googlegroups.com, tho...@irlock.com, leonar...@gmail.com, bra...@irlock.com

 

     Daniel Nugent brought up the same point regarding the camera update rate and the control method used.

 

     I don’t think the sensor’s update rate changes the decision on which controller to use. Neither IRLock nor a slower vision camera will be as fast as the 400Hz update rate of the velocity controller, so in both cases we keep using the last known desired velocity until it’s older than a defined timeout period (like 0.2 seconds). After that timeout we decay the desired velocity back to zero and perhaps perform some kind of failsafe action (like climbing back up until we see the target again).

 

     Imagine we only got a single update from the sensor. Whether that update was used to come up with a desired velocity which we then used indefinitely, or used to calculate an estimated position, the vehicle would end up in the same spot (assuming no errors in the distance estimate).
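The keep-then-decay behaviour described above can be sketched as follows; the 0.2s timeout comes from the message, while the linear decay rate and the function shape are assumptions for illustration:

```cpp
// Sketch of the timeout-and-decay idea: reuse the last known desired
// velocity while it is fresh, then decay it toward zero once it is older
// than a timeout. decay_per_s is an illustrative choice.
struct Vec3 { float x, y, z; };

Vec3 desired_velocity_with_timeout(Vec3 last_vel, float age_s,
                                   float timeout_s = 0.2f,
                                   float decay_per_s = 2.0f)
{
    if (age_s <= timeout_s) {
        return last_vel;  // sensor update is recent enough: use as-is
    }
    // Past the timeout: fade the request linearly down to zero. A real
    // implementation might also trigger a failsafe (e.g. climb) here.
    float k = 1.0f - (age_s - timeout_s) * decay_per_s;
    if (k < 0.0f) {
        k = 0.0f;
    }
    return {last_vel.x * k, last_vel.y * k, last_vel.z * k};
}
```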

Thomas Stone

Sep 2, 2015, 2:58:39 AM
to drones-discuss, tho...@irlock.com, leonar...@gmail.com, bra...@irlock.com
Sounds good. 

Regarding the 'single update' scenario, it is hard for me to understand how velocity control can arrive at the desired location, but I trust you and Leonard. :) Obviously, you have already thought these things through. 

Finally, another issue with vision systems is super-close-range detection (e.g., during the last 1-2 feet of landing). The IR-LOCK system actually does a decent job of handling this, because the IR markers are relatively small.... I imagine there are many potential solutions, so I am not too worried about this.

-Thomas

Randy Mackay

Sep 2, 2015, 3:34:27 AM
to drones-...@googlegroups.com, tho...@irlock.com, leonar...@gmail.com, bra...@irlock.com

 

     I think the velocity controller thing will become obvious later.  It’s a bit like I can point at the door across the street and say, “walk that direction at 1m/s” or I can give an estimated lat,lon position of the door.  Either way, you’ll get there.

 

     Re issues: on my latest test, the overall solution was having quite a bit of trouble with the earth-frame X and Y angles (see attached pic). I think the issue was either that it was seeing multiple targets (because I hadn’t calibrated the sensor for the light levels) or that I introduced a bug during some refactoring.

 

     If it turns out that it’s because of multiple targets, we may need to add some diagnostics to make it easy for users to check for this. Maybe messages to the GCS with the target info, and maybe the ability to set the light sensitivity through the Pixhawk.

Randy_PL_earthXY.PNG

Daniel Nugent

Sep 2, 2015, 12:52:53 PM
to drones-discuss, tho...@irlock.com, leonar...@gmail.com, bra...@irlock.com
Thomas,
I started using a velocity controller and it worked very well. If the camera updated and there was no target, we set the velocity to zero. This worked pretty well at 15-30fps, but it might be twitchy at 50fps with sporadic targets, and it will fail at 1Hz. I think both modes should be implemented: target shifting (loiter controller) and copter shifting (velocity controller). Proposed implementation: precision landing starts with the loiter controller, and if the last X readings came from the sensor at a rate greater than 15~30Hz, we switch to the high-precision velocity controller mode.

Randy,
I still disagree with you on the purely velocity controller approach. To add to your analogy: if you point me to the other side of the street and I am unable to see distance, then I will run right into the shop and not stop. Or in the case of the landing target, I will fly right past it. I talked to our controls team here, and they agreed that for a low-rate sensor, velocity control is not the approach. It does work well for high-rate sensors. I would like to hear Leonard's input on this.

Leonard Hall

Sep 2, 2015, 6:41:59 PM
to drones-discuss, tho...@irlock.com, leonar...@gmail.com, bra...@irlock.com
Hi all,

The main advantage of the velocity controller is that this results in the smallest control input, both magnitude and variation, needed to do the job of getting the copter directly above the target landing point in time for touchdown. This means your sensor measurements are made as easy and consistent as possible. The only enhancement that may be beneficial is to constrain the total length of the velocity vector to the desired descent rate. This would help keep the movement of the copter very docile even when converging from large angular errors.
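The constraint suggested above can be sketched as a simple magnitude clamp on the desired-velocity vector; the function name and units are illustrative assumptions, not the actual ArduPilot code:

```cpp
#include <cmath>

// Sketch of the suggested enhancement: cap the total length of the 3D
// desired-velocity vector at the desired descent rate, so the copter stays
// docile even when converging from large angular errors. Units are cm/s.
struct V3 { float x, y, z; };

V3 constrain_to_descent_rate(V3 v, float max_speed_cms)
{
    float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    if (len <= max_speed_cms || len == 0.0f) {
        return v;  // already within limits
    }
    float scale = max_speed_cms / len;  // shrink uniformly, keep direction
    return {v.x * scale, v.y * scale, v.z * scale};
}
```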

The position based controller is simply bad, and only works because you are taking off and landing at the same location and can therefore approximate your distance to the target using your altitude. Take off from the ground and attempt to do a precision landing on the roof of your house, and the position estimate will be so far off that you will lose track of the beacon immediately.

On Randy's analogy. You have two approaches here. You can look at the direction of the doorway you want to walk through, draw a straight line (velocity vector) directly to that door, then walk directly towards the door. Very simple geometry where all the variables are known.

Using the position approach, you look at the angle towards the door relative to the edge of the street, calculate the distance up the street relative to yourself, walk up the street by that distance, then walk directly across the street. Any error in your approximation of the street width will cause you to miss the target. Oh, but we update it repeatedly? Yes, and every time you update it your copter has to do an acceleration and a deceleration, causing a change in your attitude and reducing the accuracy of the next measurement. Even if your range approximation is accurate, the position based approach makes things hard for your measurement and looks ugly. The geometry is a little more complicated, and you don't know all the variables.

In describing why you don't like the velocity controller, you said that the copter will fly right past the point. This can't happen; the ground stops the copter quite quickly whether it hits the beacon or not. In any case, the position controller doesn't know the actual distance to the ground either, so it doesn't actually know when it needs to stop and relies on the ground providing that feedback.

One other thing is worth pointing out. When we control the copter with a velocity vector, the copter makes sure it is following that vector by checking its position. So the copter doesn't drift around; it does travel in a straight line.

This is a very simple problem that I am sure we can make look very beautiful.

Leonard


Thomas Stone

Sep 2, 2015, 8:18:30 PM
to drones-discuss, tho...@irlock.com, leonar...@gmail.com, bra...@irlock.com
Leonard: 
Sounds great. We are happy to send over a Sensor, cables, and Beacons, although it may take a while for the package to arrive.... I agree that constantly updating the 'position' of the Loiter controller is ultimately not the correct approach to this problem. It was a convenient starting point, since it runs on top of the existing Loiter controller, which we already know works nicely.... And yes, the relative altitude of the landing area has to be predetermined in order for the position controller to work. Most developers are simply testing on level ground, so it hasn't bothered them much (yet).

Randy:
Regarding the calibration for light levels, our goal is to have one setting/calibration that works in all conditions. We are quickly approaching that goal. Here is the status of the iterations: [1.0] The Pod is ok for testing, but it definitely requires sensor 'calibration' according to sunlight levels. [2.0] The Beacon has much more IR power than the Pod, and the sensor also runs a corresponding interference rejection algorithm (see video clip). Thus, the default sensor settings can be used with the Beacon in most operating environments, excluding extremely bright glares (e.g., over water or shiny vehicles). [3.0] We are working on completely eliminating false detections (video clip)..... In the meantime, I definitely agree that there need to be some helpful diagnostic tools for these issues.

Daniel:
Many thanks for the input. With the IR-LOCK sensor, the Beacon is typically detected at ~25Hz (the algorithm runs at 50Hz, but the Beacon is not constantly on). However, if the target is very far away, the detection rate may be sporadic.... The main reason I like position control is because it was essentially already implemented. :) We simply adjusted the target of the existing Loiter controller. I look forward to exploring other ideas. But I was never confident in my ability to develop another controller from the ground up. We devote a lot of time to the machine vision side of things. 


-Thomas

Daniel Nugent

Sep 3, 2015, 4:59:21 PM
to drones-discuss, tho...@irlock.com, bra...@irlock.com
Leonard,
I think I figured out what you are trying to say!! It took me a while, but when I got it, it was my own little eureka moment. I have attached a picture which depicts my original understanding of a velocity controller implementation versus my current one. I have added a bit extra also.

Looking at my current understanding: if we were to use the blue vector as our desired velocity vector, we would arrive at the target and the ground at the same moment. I think we should offset the green vector by some scalar. 2D example: the green unit vector is 30 deg off the gravity vector. Multiply it by 1.2 to get 36 deg off the gravity vector, giving you the orange vector. Then give the orange vector the magnitude of the red vector to give you the pink vector. This allows a quicker response when we have a large angular displacement and has little effect for small target offsets, i.e. 4 deg off the gravity vector * 1.2 = 4.8 deg off the gravity vector. So you feed forward the green unit vector to get the orange vector, then give it the magnitude of the red vector -> pink vector. This will help us center the vehicle before we impact the ground. Obviously the scalar we use can be discussed. Am I understanding the velocity controller properly now?
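The 2D construction described here could be sketched like this; the function name, the 30-degree example, and the descent-speed value are illustrative, and this is one reading of the proposal rather than a definitive implementation:

```cpp
#include <cmath>

// 2D sketch of the proposed scheme: measure the angle of the line-of-sight
// to the target from the gravity (straight down) axis, multiply that angle
// by a tunable scalar (1.2 in the example), then give the resulting
// direction the magnitude of the desired descent speed.
struct V2 { float horiz, down; };

V2 lead_target_vector(float angle_from_gravity_rad, float gain, float speed)
{
    float led_angle = angle_from_gravity_rad * gain;  // 30deg * 1.2 = 36deg
    return { speed * std::sin(led_angle),   // horizontal component
             speed * std::cos(led_angle) }; // downward component
}
```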

Daniel

Leonard Hall

Sep 3, 2015, 6:37:17 PM
to drones-discuss, tho...@irlock.com, bra...@irlock.com
Hi Daniel,

It looks like you are pretty much there. The only thing you haven't articulated is that your 1.2 is actually a variable we can use to tune. If P is 1 then we descend directly onto the pad; if it is over 1, the copter moves over the pad and gradually reduces the angle. The bigger P is, the faster it will move over the plate, but the more likely it will be to oscillate.

There are three variables in play here: the reaction speed of the copter, the descent speed, and the value of P. I would need to set up a model in MATLAB, but we may find there is a relationship between descent rate and P that we can take advantage of so the pilot doesn't need to re-tune P for a higher descent rate. If we are lucky it will work fine with small values of P and will be perfectly stable for all reasonable descent rates.

The only thing missing is the output filtering of the direction vector. This will tend to average out jitter in your measurement of the landing pad direction. If we do it right, it will actually slow the descent rate if the measurement is jumping around a lot.
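One possible reading of that filtering idea, sketched here purely as an assumption (not the actual design): exponentially smooth the measured direction vector. Jittery measurements partially cancel in the average, so the filtered vector shortens, and scaling the descent rate by its length would then automatically slow the descent during noisy tracking.

```cpp
#include <cmath>

// Hypothetical sketch: exponential low-pass filter on the 2D direction
// vector to the target. Names and the filter form are assumptions.
struct Dir2 { float x, y; };

Dir2 lowpass_direction(Dir2 filtered, Dir2 measured, float alpha)
{
    // Standard exponential smoothing: move a fraction alpha toward the
    // new measurement each update.
    return { filtered.x + alpha * (measured.x - filtered.x),
             filtered.y + alpha * (measured.y - filtered.y) };
}

// Length of the filtered vector: unit-length measurements that jitter in
// direction average to something shorter than 1, which could be used to
// scale down the descent rate.
float direction_strength(Dir2 d)
{
    return std::sqrt(d.x * d.x + d.y * d.y);
}
```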

The thing I really like about this approach is that it will look very natural in the final stages of landing. If there is a small position error in the last 300mm, for example, the angle will grow quickly, causing the descent to slow and the copter to start moving back over the target. As the target moves under the copter, the copter will increase its descent rate to drop the final few mm onto the target. This is exactly what we would do as a human pilot, and I think it will look very confidence-inspiring.

Chat soon,
Leonard

eric1221bday

Sep 3, 2015, 6:47:44 PM
to drones-discuss, tho...@irlock.com, leonar...@gmail.com, bra...@irlock.com
Thomas Stone:

Sorry to barge in. I'm a CMU undergrad trying to use the Pixycam with an APM, and I was wondering if the old AC3.2.1 firmware for the IRLock precision landing is usable with an APM 2.6 board, since only the Pixhawk is mentioned in the wiki but the firmware is shared (at least up to that version).

Thomas Stone

Sep 3, 2015, 8:50:57 PM
to drones-discuss, tho...@irlock.com, leonar...@gmail.com, bra...@irlock.com
Hi Eric, 

It may be possible to run on an APM board, but I honestly do not know for sure. I remember a while back somebody attempted to get the Pixy and an APM board to communicate, and they ran into some issues. But they were not using our 3.2.1 firmware, so it may have been a bug in their code.

Best,
Thomas

Thomas Stone

Sep 4, 2015, 10:05:01 AM
to drones-discuss, tho...@irlock.com, bra...@irlock.com
Daniel:
Thanks for the illustrations and explanation. 

Have you ever investigated the accuracy of the angle-to-target readings? In particular, I have been curious about whether or not 'zero angle' really corresponds to 'zero angle'.... When we fix the camera to the copter, we assume that the camera is level with the flight controller. Also, when we account for pitch/roll, we assume the flight controller's pitch/roll outputs are correct.

I have spent a considerable amount of time investigating logs and such, but I haven't arrived at anything conclusive. Ultimately, it will be good to know what sources of error are significant and/or important. 

-Thomas

Daniel Nugent

Sep 6, 2015, 1:04:33 AM
to drones-discuss
Thomas,

I have not looked into copter-camera angle errors. I think a more likely source of error is the image and IMU being out of sync. I am working on syncing the frame capture with the IMU using timestamps. Haven't had the chance to test it yet.

I also ran an off-board (companion computer) test of the new velocity controller proposal and it works quite well. Still tweaking to make it better, but it looks promising.

Daniel

Randy Mackay

Sep 6, 2015, 1:48:49 AM
to drones-...@googlegroups.com
Thomas,

If it becomes a problem for many users, I guess an angle-offset for x and y could be added to compensate for any lean in the sensor's mounting. That complaint from the user would sound a bit like, "the vehicle doesn't come straight down on the target but instead always approaches it from a 10 degree angle".

@Eric,
The low-level IRLock driver is pixhawk/vrbrain only for the moment and although we plan to make it work on Linux boards (NAVIO+, ErleBrain), it would be quite difficult to try and back port it to the low powered APM2 boards. I think it's best to just bite the bullet and get a high powered CPU board.

As a side note, I’m hoping that we can also get precision landing working with this python enabled open source camera on kickstarter that is shipping soon.
https://www.kickstarter.com/projects/botthoughts/openmv-cam-embedded-machine-vision/description

-Randy


Cameron Owen

Sep 7, 2015, 4:05:41 AM
to drones-discuss, tho...@irlock.com, bra...@irlock.com
Hey All,

So not so long ago I put together a GPS-denied localisation system based on time-of-flight radio technology. It was a network processing setup where a tablet talked to the virtual GPS environment (TrackX) and then controlled up to 6 UAVs simultaneously (you can see our case study here: https://drive.google.com/file/d/0B2q6TTnoPKO0bHZleEtkb05zdEE/view?usp=sharing).

My end goal with this system is to do the position processing on-board, not just for precision landing but for indoor/GPS-denied localisation. If I can mimic the positioning messages the PixyCam sends over I2C, then I can piggyback off the precision landing code already in the Pixhawk.

Is there somewhere I can access the Pixy Cam source code or messages?

Regards,

Cameron

Thomas Stone

Sep 7, 2015, 10:06:45 AM
to drones-discuss
Daniel:
Interesting. What sort of time differences are you trying to resolve? For example, do you think ~50ms causes a problem?

The velocity controller news sounds great. 

Randy:
The angle-offset is a good idea.... Unfortunately, our observations during testing were never 100% conclusive. Some days, it seemed like there was a consistent 'angle-error' wrt body frame, but it was difficult for us to diagnose with certainty. 

We have pre-ordered the OpenMV sensor, and we look forward to working with it. I like the smaller form-factor. However, the reduced fps may be a concern (not sure). The Pixy has an incredibly efficient machine vision algorithm, but I guess the benefit of OpenMV is that it will be easier for people to develop their own machine vision code.

-Thomas

Thomas Stone

Sep 7, 2015, 10:27:16 AM
to drones-discuss, tho...@irlock.com, bra...@irlock.com
Hi Cameron,
We have had requests about working in GPS-denied environments. It was challenging for us because the velocity measurement (from the GPS) was not available. 

You can find Pixy/IR-LOCK source code here and here. In particular, you may be looking for this I2C-related code and communication protocol.... You may need to do some extra work, since I think you also need to supply the velocity measurement (which could be calculated from the time-of-flight positioning). The other developers can correct me on this if I am mistaken. 

-Thomas 

Thaddeus Bort, Jr

Sep 25, 2015, 5:17:03 PM
to drones-discuss, tho...@irlock.com, bra...@irlock.com
Has there been any update on the velocity controller? I've had fair results landing with the IRLock using the position controller, but I'd really like to improve the landing accuracy, and it sounds like the velocity controller could help with that.

Teddy

maxmay2005 Maxmay2005

Sep 26, 2015, 12:35:46 PM
to drones-discuss, tho...@irlock.com, bra...@irlock.com

I need tutoring (I will pay) for building/compiling the IRLock precision landing Pixhawk firmware. Please help! I am an experienced real-time C programmer. I need a kickstart from someone on this forum subject. I am in Montreal, Canada. My phone is 5144474920.

Jason Franciosa

Sep 27, 2015, 12:38:00 PM
to drones-discuss, tho...@irlock.com, bra...@irlock.com
Could this be implemented in ArduPlane with a front-facing camera for net landings? Just put the beacon in the center of the net and the plane will crash itself into it?

Thomas Stone

Sep 27, 2015, 12:55:21 PM
to drones-discuss, tho...@irlock.com, bra...@irlock.com
@Maxmay  (I thought I had posted this message publicly, so here it is again)

If you are flying a standard quad frame (e.g., IRIS+), you can use one of the compiled AC3.2.1 firmwares at docs.irlock.com. If you need to make modifications and re-compile, the code repositories are also linked from that docs page.

There are instructions for compiling on the APM dev wiki. That will probably get you started.

-Thomas

Thomas Stone

Sep 27, 2015, 1:03:58 PM
to drones-discuss, tho...@irlock.com, bra...@irlock.com
Hi Jason, 

I mostly fly copters, but we have discussed the plane idea with a few people. On the surface, it sounds like a great idea. I think the limiting factor is the detection range. With the Copter, a detection range of ~15 meters is fine, but that may not be good enough for a net landing with a plane. My assumption is that the plane will cover 15 meters too quickly, but I don't know for sure. 

The detection range can be increased by using multiple IR Markers, so if money (and power consumption) is not an issue, the desired detection range could probably be achieved..... Also, a variable zoom lens could be used, but that would require a bit of development effort. 

-Thomas

Jason Franciosa

Sep 27, 2015, 4:23:17 PM
to drones-discuss, tho...@irlock.com, bra...@irlock.com
Hello Thomas,

Thank you for the response. I think the detection range is a bit of an issue, but in combination with GPS I think it could be possible. The GPS is accurate enough to get the plane into the general area of the net, and as it gets closer the IRLock could do the final corrections to ensure it hits center mass. The biggest challenge without RTK GPS is elevation, as the baro drift on the Pixhawk is quite horrific over long flights. A rangefinder can help with this, but there are quite a few limitations of rangefinders depending on the terrain. They are great for flat, level ground, but in a mountainous area where net landing is needed they are basically worthless. 15 meters would take about 1 second to cover with most planes, as most are flying between 15-20 meters per second, so that range may be a bit too short for proper corrections. However, if we could get 60-90 meters I think it would work well. Who knows what the future will hold!

Thomas Stone

Sep 27, 2015, 8:55:05 PM
to drones-discuss, tho...@irlock.com, bra...@irlock.com
That is good info. Thanks.... Theoretically, I guess four markers could be combined to get 90 meters of range, although we haven't actually tried that.

Also, maybe you saw this post on DIYdrones:

Thomas Stone

Sep 27, 2015, 9:00:52 PM
to drones-discuss, tho...@irlock.com, bra...@irlock.com
...*60m (not 90)

Frank Wang

Dec 20, 2015, 3:23:50 AM
to drones-discuss, tho...@irlock.com, leonar...@gmail.com, bra...@irlock.com
Hi all,

I've tried to implement the velocity-based precision landing recently, using the code from Randy's ardupilot precland12 directory. But there are a few things I don't understand well; I wonder if anyone could give me some help.

1. In AC_PrecLand.cpp calc_angles():
Vector3f vec_to_target_bf(sinf(-_angle_to_target.y), sinf(_angle_to_target.x), 1.0f);
I think vec_to_target_bf represents some kind of vector rather than angles here, but I'm not sure what it really means. And why use sinf() and set vec_to_target_bf.z = 1.0f?

2. In AC_PrecLand.cpp calc_desired_velocity():
_desired_vel.x = _vec_to_target_ef.x * _pi_precland_xy.kP() * gain;
_desired_vel.y = _vec_to_target_ef.y * _pi_precland_xy.kP() * gain;
_desired_vel.z = min(0.0f, (-1.0f+(pythagorous2(_vec_to_target_ef.x, _vec_to_target_ef.y)*_pi_precland_xy.kI())));
We do not use the Caution Gain any more, is that right? And how do we make sure the vector _desired_vel points at the target? Does it keep the length of _desired_vel the same as the desired descent rate?

3. In copter_precland.parm:
PLAND_ENABLED   1
PLAND_TYPE         1
These two parameters have been defined, so do we still need to modify them in Mission Planner? If so, should the values be (1,1) or (1,2)?

Best,
Frank

Thomas Stone

Dec 20, 2015, 1:11:17 PM
to drones-discuss, tho...@irlock.com, leonar...@gmail.com, bra...@irlock.com
#3
If the parameters are the same as in previous versions, PLAND_TYPE corresponds to the sensor type (and the IR-LOCK sensor is PLAND_TYPE=2). 

-Thomas

Nick McCarthy

Dec 30, 2015, 1:29:39 PM
to drones-discuss, tho...@irlock.com, leonar...@gmail.com, bra...@irlock.com
I've just compiled Randy's precland12 branch, and am also curious about the tuning parameters.

Randy, can you provide some insight? I'm happy to test and provide feedback & logs.