The effects of nozzle runout.


Henrik

unread,
Jun 15, 2016, 3:29:14 PM
to OpenPnP
So as not to pollute Ray's thread, I thought I'd start a new one where we can discuss what effects (if any) nozzle runout actually has. In an effort to figure this out for myself, I drew it up in CAD and moved things around to see what's actually happening. I make no claims of this being absolutely correct, but let's start with what I have and discuss around it.

In the following images, the large red cross represents the center point of the camera image, the blue circle is the motor shaft, the smaller magenta circle (I know it's hard to see, but stick with me) is the nozzle, and the green cross is the actual center point of the part.

I started with the case of a perfect motor/adapter/nozzle, i.e. no runout whatsoever. The part is picked up slightly off center, with a 10° rotation. The vision system finds the offset to be 2 units in X, 1 unit in Y and 10° of rotation. It then commands the motion platform to rotate the part 10° and move 2 units in X and 1 unit in Y. The result is that the part's center does not end up where we want it to be, even with a perfect motor/nozzle - provided the vision system doesn't already compensate for this, of course.
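This can be checked with a quick numeric sketch (the rotation sign convention and units here are illustrative assumptions, not something from the machine):

```python
import math

def rotate(x, y, deg):
    """Rotate point (x, y) about the origin by deg degrees (CCW positive)."""
    a = math.radians(deg)
    return (x * math.cos(a) - y * math.sin(a),
            x * math.sin(a) + y * math.cos(a))

# Bottom vision result: part center is (2, 1) from the rotation axis,
# and the part is 10 degrees off.
dx, dy, theta = 2.0, 1.0, 10.0

# Naive correction: rotate by -10 degrees about the axis, then move (-2, -1).
rx, ry = rotate(dx, dy, -theta)   # where the part center lands after rotating
residual = (rx - dx, ry - dy)     # error remaining after the (-2, -1) move
error = math.hypot(*residual)     # ~0.39 units: the part misses the target
```

The residual is nonzero because the rotation happens about the nozzle axis, not the part's own center, which is exactly the effect described above.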

I have to admit that the fact that the part does not end up in the correct place in the above scenario came as a surprise to me, but it's of course obvious in hindsight, since the part isn't rotated around ITS center point. But what happens if the nozzle doesn't run true with the rotation of the motor? Well, not much, as it happens... (sorry for the difference in scale).

Here the nozzle is offset (quite a bit) from the center of the motor shaft. This means the part is picked up at another location, but since it's the center of the motor shaft that is being moved to the camera, the offset found by the vision system is still the same. Then the part is rotated around the motor shaft (that's what is actually turning) and moved two units in X, one unit in Y. It ends up in the same location as in the first example - despite the nozzle not running true to the motor.

So, as far as I can tell (which may be wrong), there has to be some translation of position in the vision system, because simply rotating and moving the part by the amounts found does not make it end up in the correct location - and it doesn't matter whether the nozzle runs out or not.

/Henrik.


Jason von Nieda

unread,
Jun 15, 2016, 4:09:17 PM
to OpenPnP
Henrik,

You are correct, and the vision system does compensate for this. The algorithm is documented in the code at https://github.com/openpnp/openpnp/blob/develop/src/main/java/org/openpnp/machine/reference/ReferencePnpJobProcessor.java#L593.

I have not yet wrapped my head around whether or not runout affects this, but you are the third person today to say that it doesn't, so it sounds quite likely. I'll be experimenting with this some more when I have more time.

Jason




--
You received this message because you are subscribed to the Google Groups "OpenPnP" group.
To unsubscribe from this group and stop receiving emails from it, send an email to openpnp+u...@googlegroups.com.
To post to this group, send email to ope...@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/openpnp/31c4ab64-27bf-4414-9bba-a508e0830ced%40googlegroups.com.
For more options, visit https://groups.google.com/d/optout.

alex

unread,
Jun 15, 2016, 4:20:27 PM
to OpenPnP
Henrik, this sounds very reasonable; moreover, your conclusion fully matches practical observations on my machine.

Question for Jason: if the component has a 90 degree placement position (for example), will OpenPnP rotate it first and then perform bottom vision, or will bottom vision be done first and then the component rotated by 90 + [pick error] degrees?

The first way would be preferable, as the pick error is usually not more than 10-20 degrees in the worst case, so even a big runout should not introduce a big error at a small turn angle.
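That intuition can be quantified: a nozzle tip sitting at radius r from the rotation axis moves along a chord of length 2·r·sin(θ/2) when rotated by θ, so small correction angles keep the runout-induced shift small. A sketch with illustrative numbers (not from the thread):

```python
import math

def runout_shift(runout, theta_deg):
    # Chord length traced by a point at radius `runout` from the
    # rotation axis when the nozzle is rotated by theta degrees.
    return 2 * runout * math.sin(math.radians(theta_deg) / 2)

runout_shift(0.1, 20)    # ~0.035: 20 deg pick error with 0.1 mm runout
runout_shift(0.1, 180)   # 0.2: worst case, the full runout diameter
```

This also means a 180 degree rotation is the worst case, since the tip sweeps across the full runout diameter.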

Richard Sim

unread,
Jun 15, 2016, 4:35:14 PM
to OpenPnP
From Fred's comments earlier, here's a video I put together of this with actual components:


Despite the huge runout of the nozzle and the offset of the component from the nozzle, you can see that the component's rotation axis is actually the motor's rotation axis.

For those that want to play with the CAD file: http://a360.co/1ZRLlUP

Henrik Olsson

unread,
Jun 15, 2016, 4:57:01 PM
to ope...@googlegroups.com

Thanks Richard!

 

Your 3D simulation shows it much better than my stone-age 2D sketches :)

Again, I have to admit that I’m surprised about the result, I really did expect the part to end up in another location. It’s sort of unintuitive at first, at least for my brain.

 

But the fact that it seems to work just fine even when the nozzle isn't concentric with the motor shaft is just great (though I'm not totally convinced there isn't something we're missing ;-) ).

 

The important thing to remember, then, is that if there's nozzle runout you need to make sure you set the camera location to the center of the rotational axis, and not use the center of the nozzle to calibrate that position. As long as you do that, and the runout isn't bad enough to miss the part altogether, it'll work - no need to chase runout down to the microns.

 

/Henrik.

 

 


From: ope...@googlegroups.com [mailto:ope...@googlegroups.com] On Behalf Of Richard Sim
Sent: June 15, 2016 22:35
To: OpenPnP
Subject: [OpenPnP] Re: The effects of nozzle runout.

Jason von Nieda

unread,
Jun 15, 2016, 10:02:26 PM
to ope...@googlegroups.com
Alex,

Currently we do option 2, bottom vision and then rotate. Option 1 is being worked on by another list member, so we should be able to support that soon.

Jason



Jason von Nieda

unread,
Jun 15, 2016, 10:03:37 PM
to ope...@googlegroups.com
Thanks Richard and Henrik, 

These drawings and video are very helpful. It makes sense to me, but I don't find it intuitive. My brain keeps fighting me on it :) I'll run some tests here when I have a chance and see how it comes out.

Jason


Ray Kholodovsky

unread,
Jun 15, 2016, 11:43:21 PM
to OpenPnP
Nice work guys.  My brain after laser fumes does not understand much more than "pretty pictures" and "cool sim" but I will gladly accept the deductions you come up with from these tests.  

Michael Anton

unread,
Jun 16, 2016, 1:50:45 AM
to OpenPnP
This new way of thinking about it makes sense to me too.  Once a part has been picked, it now has a fixed orientation with respect to the stepper shaft.  So, instead of knowing where the center of the nozzle is with respect to the camera, we need to know where the center of rotation is with respect to the camera.  Probably this could be calculated from a series of images taken of the nozzle (or a part on the nozzle), as it is rotated.

The only time nozzle runout is likely to matter is when you are not using bottom vision.

Mike

Matt Baker

unread,
Jun 16, 2016, 8:12:01 PM
to OpenPnP
Very interesting.  After reading through, I agree that nozzle runout will not affect the vision-assisted rotation.  However, it will affect the X-Y correction if the offset is calculated relative to the nozzle rather than the rotational axis.

The rotation axis could be calculated from four images at 90 degree rotations, or possibly as few as two.  The tip of a nozzle with runout should trace a circle concentric with the axis of rotation as it turns, so we need to find the center point of the detections across all images.
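A sketch of that center-finding step: for detections taken at equal angular steps over a full revolution, the centroid of the detected nozzle positions is the rotation axis (all numbers below are made up for illustration):

```python
import math

def axis_center(points):
    # Centroid of nozzle detections taken at equal angular steps over a
    # full revolution; for points on a circle this is the circle's center.
    n = len(points)
    return (sum(x for x, _ in points) / n, sum(y for _, y in points) / n)

# Simulated detections of a nozzle with 0.5 units of runout, axis at (10, 20)
pts = [(10 + 0.5 * math.cos(math.radians(a)),
        20 + 0.5 * math.sin(math.radians(a))) for a in (0, 90, 180, 270)]
axis_center(pts)  # -> (10.0, 20.0)
```

With only two images 180 degrees apart, the midpoint of the two detections gives the same center.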

Re: Henrik: "So, as far as I can tell (which may be wrong), there has to be some translation of position in the vision system, because simply rotating and moving the part by the amounts found does not make it end up in the correct location." The X and Y offsets need to be calculated along the X and Y axes of the component rather than the axes of the camera.  If we know the component center point and the rotation angle, this can be calculated with a little trig.  Those offsets will be the same as the post-rotation offsets (but can be calculated from the single, pre-rotation image).
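That trig can be sketched directly: rotate the measured offset by the correction angle, and the negated result is the translation to command - all from the single pre-rotation image. The sign convention below is an assumption for illustration:

```python
import math

def correction(dx, dy, theta_deg):
    # (dx, dy): part center relative to the rotation axis, from bottom vision.
    # theta_deg: rotation that will be applied to fix the part's angle.
    # Rotating about the axis moves the center to R(theta)·(dx, dy), so the
    # head must translate by minus that rotated vector, not by (-dx, -dy).
    t = math.radians(theta_deg)
    rx = dx * math.cos(t) - dy * math.sin(t)
    ry = dx * math.sin(t) + dy * math.cos(t)
    return (-rx, -ry)

correction(2.0, 1.0, -10.0)  # ~(-2.143, -0.638) with the example numbers above
```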

Curious about the capabilities of the machine vision: can we accurately identify both the center of the component and the lines of its edges?  The edges would be sufficient for rotational correction, but center finding would be necessary for X-Y correction.

Jason von Nieda

unread,
Jun 16, 2016, 8:26:46 PM
to OpenPnP
Matt,

Right now we don't consider the nozzle center to be different from the axis center, which is the source of the issue. What ends up happening is that people set their head offsets with nozzles that have runout, so the offset reflects the nozzle at a given rotation rather than the motor axis center.

There is a start on nozzle runout calibration in the code. It's not finished and it's not fully tested. It was written such that it applies an XY offset based on the rotation. It uses images captured every 30 degrees to interpolate between rotations.
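A rough sketch of what such interpolation might look like; the table shape, the 30 degree handling, and the function name are assumptions for illustration, not the actual OpenPnP code:

```python
def runout_offset(table, angle_deg):
    # table: dict mapping calibration angles (0, 30, ..., 330) to the
    # (dx, dy) nozzle offset measured at that rotation; linearly
    # interpolate between the two neighboring samples.
    a = angle_deg % 360
    lo = int(a // 30) * 30
    hi = (lo + 30) % 360
    t = (a - lo) / 30.0
    x0, y0 = table[lo]
    x1, y1 = table[hi]
    return (x0 + t * (x1 - x0), y0 + t * (y1 - y0))

cal = {a: (0.1, 0.0) for a in range(0, 360, 30)}  # dummy constant-offset table
runout_offset(cal, 45)  # -> (0.1, 0.0)
```

Since true runout traces a sinusoid in X and Y, linear interpolation between 30 degree samples is only an approximation.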

Based on the learning in this thread, it seems we'd be better served by just doing the nozzle calibration one time and using the data to calculate the axis center, which we would then use as the head offset. It sounds like this would improve things for placements using bottom vision but would not improve "blind" placements, since these would still be affected by the nozzle runout. Perhaps a combination of the two would be ideal.

As for the machine vision, we currently calculate the bounding box of the component, which tends to line up with its edges and gives us the center. If you are CV savvy, have a look at https://github.com/openpnp/openpnp/wiki/Bottom-Vision#default-pipeline which describes the current algorithm. Improvements are most certainly welcome!

Jason





Matt Baker

unread,
Jun 16, 2016, 9:46:41 PM
to OpenPnP
Jason, very interesting.

It seems that large rotations would experience the largest magnitude offset, the worst case being a 180 degree rotation for a unidirectional component like a diode.

Agree with your conclusion on one-time calibration when using vision.

Thought about blind placement briefly, and it seems like a difficult problem.  My quick thoughts would be to:
1. Always pick components at the same nozzle rotation angle; that way the X-Y offset from the runout being rotated is constant and gets folded into the relative position of the pick.
2. Manually tune in the place positions as well, so that they compensate for the x-y offset from any rotation after the pick.
3. Do not allow any, or minimize the rotation of line items from board to board or within a panel.

If picked and placed components are 1:1, you could potentially eliminate step two by pre-rotating for the final placement before setting the pick position.  I.e., start at 0,0 or another reference point, pre-rotate counter to the component placement angle, and set the pick location.  When you return to 0,0 the component is perfectly aligned and in position, and you can use relative coordinates from the CAM export, etc., to get to the board.  If there are multiple placements of a component with different rotations, then this cannot work.

Of course, you probably want to use the down-facing camera for identifying the pick and place positions, so that technique is probably moot.  So really the next step for blind placement is how to calibrate the nozzle with only a down-facing camera.

Matt Baker

unread,
Jun 16, 2016, 9:51:56 PM
to OpenPnP
Worth noting, if it is not already apparent, that the offset from the component center to the rotation axis will have to be compensated for when rotating from the camera frame of reference to the destination board frame of reference -- the rotation will add positioning offsets if the centers are not perfectly aligned.

Cri S

unread,
Jun 17, 2016, 7:31:46 AM
to OpenPnP
On vision: if the optical center differs from the chip-center ROI and it's not corrected, then the axis center is highly height dependent.
As an example, for a COTS endoscope camera with a not-so-bad offset (ELM cameras have 5x that offset), the offset between PCB height and feeder height was 6 thou, and that from the PCB to the nozzle change plate was 13 thou. The height difference between the PCB and the plate was about 6 or 7 mm. 0201 parts (strip feeder only) needed offsets.