"Global" Rotation Offset


fxframes

Jun 2, 2021, 9:22:52 AM6/2/21
to OpenPnP
Hello Everyone,

I'm having trouble with a rotation offset that seems to be applied to all the parts on this one job. I have tried turning "Rotate parts prior to vision?" on and off, to no avail.

I've also checked that bottom vision is recognising the parts well, and these are three very different packages as you can see in the images below.

This did not seem to affect a previous, different job.

I did a visual check on steps/mm ($4tr=160, TinyG) and it seems to be about right (made a pen mark on the rotation axis) and anyway the offset is much larger.

Any tips?

Thanks!

IMG_1903.jpg

IMG_1901.jpg

fxframes

Jun 2, 2021, 9:24:54 AM6/2/21
to OpenPnP
Just adding something, my PCB is detected through fiducials at a slight angle, but again much smaller than the angle at which the parts are being placed.

Screenshot 2021-06-02 at 14.23.28.png

johanne...@formann.de

Jun 2, 2021, 9:33:39 AM6/2/21
to OpenPnP
Is your bottom vision camera rotated?
Would try to adjust (either in hardware or in the openpnp settings) that.

fxframes

Jun 2, 2021, 10:14:45 AM6/2/21
to OpenPnP
Thanks, I did it again just in case, but that doesn't seem to be what's causing this. Just ran those parts again:

IMG_1905.jpg 
IMG_1904.jpg

johanne...@formann.de

Jun 2, 2021, 10:46:13 AM6/2/21
to OpenPnP
Just to be sure: you tried rotating the bottom vision camera (SW or HW), and that didn't change the behavior?
Does the problem persist, if you disable bottom vision?

fxframes

Jun 2, 2021, 11:51:46 AM6/2/21
to OpenPnP
Correct, I redid the bottom camera calibration and rotation for all nozzle tips and the problem persists.
However, even though all nozzle tips calibrated successfully and the result shown for the bottom camera is around 1.7º,

Screenshot 2021-06-02 at 16.46.40.png

when aligning the nozzle between placements I noticed how the bottom camera "square" is skewed by a similar angle (close to 5º) to the placement error as you can see below.
Shouldn't the nozzle tip calibration have caught that?

Screenshot 2021-06-02 at 16.40.42.png


fxframes

Jun 2, 2021, 1:04:35 PM6/2/21
to OpenPnP
My bad, after taking the bottom camera down I understood that the "skew" on the image is a consequence of me trying to manually adjust the Bottom camera rotation...
So I'm back to square one.

Mike Menci

Jun 2, 2021, 1:07:31 PM6/2/21
to OpenPnP
Try this - JOG nozzle rotation 9 times 10 deg, on top of the up-looking camera. Does your part turn 90 deg on the nozzle?

fxframes

Jun 2, 2021, 1:39:24 PM6/2/21
to OpenPnP
Thanks for the suggestion.
From the images below it does seem to turn a bit more than 90º.
Would it be the case of adjusting steps/mm?

Start.

Screenshot 2021-06-02 at 18.34.54.png

End

Screenshot 2021-06-02 at 18.35.29.png

Mike Menci

Jun 2, 2021, 3:06:20 PM6/2/21
to ope...@googlegroups.com
Correct - what is your current value? 
Mike

--
You received this message because you are subscribed to the Google Groups "OpenPnP" group.
To unsubscribe from this group and stop receiving emails from it, send an email to openpnp+u...@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/openpnp/50bc1d31-954f-4f0d-ae1d-600c6530c369n%40googlegroups.com.

bert shivaan

Jun 2, 2021, 3:12:54 PM6/2/21
to OpenPnP
It appears like the part is turning more than OpenPnP thinks. Look at the crosshair in the top pic: it is next to the notch, but in the second it is much further away.


bert shivaan

Jun 2, 2021, 3:13:33 PM6/2/21
to OpenPnP
OMG, sorry, I misread your post and thought you were asking if it looks like it is.


fxframes

Jun 2, 2021, 3:41:18 PM6/2/21
to OpenPnP
Well, I'm not sure now, because it seems that on the LitePlacer the 2.25 gear reduction on the rotation axis dictates that the correct number here is 160 (360º/2.25), which is what it is right now.
Jason posted this here some time ago. For the record, I did play a little bit with it but not much changed. I'm not sure if this can or even should be set somewhere else.

I agree that OpenPnP seems to be turning more/less than it should, what's baffling me is that it seems to add/subtract a number of degrees to all placements, regardless of nozzle tip or vision settings.
In the images in my first post there are 4 different parts which were placed using 3 different nozzles and of course used their own individual vision pipelines.

fxframes

Jun 2, 2021, 3:42:12 PM6/2/21
to OpenPnP
@cncmachineguy no worries. :)

johanne...@formann.de

Jun 2, 2021, 4:01:28 PM6/2/21
to OpenPnP
Really wild guess: totally messed up backlash compensation on that axis?

bert shivaan

Jun 2, 2021, 4:10:03 PM6/2/21
to OpenPnP
Maybe a log output so we can see what is being sent to the controller?

fxframes

Jun 2, 2021, 5:34:53 PM6/2/21
to OpenPnP
Interesting... I’ll take a look tomorrow. Thanks.

Mike Menci

Jun 3, 2021, 2:03:08 AM6/3/21
to ope...@googlegroups.com
Check as well the actual gear ratio on belt pulleys - count the teeth and see if it matches with your calculation. 


ma...@makr.zone

Jun 3, 2021, 3:59:56 AM6/3/21
to ope...@googlegroups.com

Hi fxframes

Some thoughts:

  1. If you use pre-rotate then moderately wrong steps/degree should automatically be compensated. So during these tests, do not use pre-rotate. But during production do use it (it is way better).
  2. If it is a steps/degree issue, then different placement angles should result in different angle offsets. Parts at 0° should show no offset, at 90° it should show, at 180° double that. If this does not happen, then it is not a steps/degrees issue.
  3. If it is a steps/degree issue, then OpenPnP is not the place to fix this. Check your controller settings instead. The images you posted earlier would suggest that.
  4. If you checked all that and still see the angular offset, then look at the fiducial PCB orientation. Double check that your fiducial locations E-CAD import file contains the right coordinates. You would not be the first user to mistakenly import an older revision of a design. ;-)
  5. The bottom camera has both a Position Rotation and a Image Transforms Rotation (on two separate tabs). Make sure the Position Rotation is zero (Frankly, I don't know if a non-zero Position Rotation can be made to work correctly at all or if this field should better be removed).


  6. If all this does not help, then yes, send log and machine.xml
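Point 2 can be sanity-checked with a few lines of arithmetic (a sketch of my own, not part of OpenPnP): a wrong steps/degree scale produces an error that grows linearly with the commanded angle, so it cannot explain a constant offset on every placement.

```python
# If the controller's steps/degree is off by a constant factor, the part
# actually turns commanded_deg * scale, so the error is proportional to
# the commanded angle - zero at 0 degrees, doubled at 180 vs. 90.
def angle_error(commanded_deg: float, scale: float) -> float:
    """Angular error in degrees for a given scale factor (1.0 = perfect)."""
    return commanded_deg * (scale - 1.0)

# Example with an assumed 2% scale error:
for theta in (0.0, 90.0, 180.0):
    print(theta, round(angle_error(theta, 1.02), 2))  # 0.0, 1.8, 3.6
```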
_Mark

fxframes

Jun 3, 2021, 8:31:12 AM6/3/21
to OpenPnP
Hi Everyone,

Thanks for all your ideas.

So first I took a look at the backlash compensation settings for the rotational axis.
I have "None" set up. Jogging with 1.0 seems to move the same amount in both directions. Jogging with ≤ 0.1 is really hard to tell, though, even with some zoom on the bottom camera.

Screenshot 2021-06-03 at 13.04.45.png

And these are the settings for that axis in TinyG:

[4ma] m4 map to axis              3 [0=X,1=Y,2=Z...]
[4sa] m4 step angle               0.900 deg
[4tr] m4 travel per revolution  160.0000 mm
[4mi] m4 microsteps               8 [1,2,4,8]
[4po] m4 polarity                 0 [0=normal,1=reverse]
[4pm] m4 power management         2 [0=disabled,1=always on,2=in cycle,3=when moving]
[aam] a axis mode                 1 [standard]
[avm] a velocity maximum      50000 deg/min
[afr] a feedrate maximum     200000 deg/min
[atn] a travel minimum            0.000 deg
[atm] a travel maximum          600.000 deg
[ajm] a jerk maximum           2160 deg/min^3 * 1 million
[ajh] a jerk homing            5000 deg/min^3 * 1 million
[ajd] a junction deviation        0.1000 deg (larger is faster)
[ara] a radius value              5.3052 deg
[asn] a switch min                0 [0=off,1=homing,2=limit,3=limit+homing]
[asx] a switch max                0 [0=off,1=homing,2=limit,3=limit+homing]
[asv] a search velocity        2000 deg/min
[alv] a latch velocity         2000 deg/min
[alb] a latch backoff             5.000 deg
[azb] a zero backoff              2.000 deg
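For what it's worth, those motor settings can be cross-checked with a quick calculation (my own sketch; note that [4tr] is really degrees of nozzle rotation per motor revolution on this axis, despite the "mm" label):

```python
# Microsteps per degree of nozzle rotation from the TinyG settings above.
step_angle = 0.9      # [4sa] deg per full step
microsteps = 8        # [4mi]
travel_per_rev = 160  # [4tr] deg of nozzle rotation per motor rev (360 / 2.25)

steps_per_motor_rev = (360.0 / step_angle) * microsteps  # 400 full steps * 8
steps_per_degree = steps_per_motor_rev / travel_per_rev
print(steps_per_motor_rev, steps_per_degree)  # 3200.0 20.0
```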

fxframes

Jun 3, 2021, 9:01:13 AM6/3/21
to OpenPnP
Ok now to Mark's suggestions.

1. I've disabled pre-rotation BUT the parts still rotate on their way to the bottom camera?

Screenshot 2021-06-03 at 13.37.32.png

Please check out the video below. The first cap is set to 0º rotation in the job and the second one is set to 180º, but both rotate ~180º before arriving at the bottom camera.

Screenshot 2021-06-03 at 13.56.37.png
Screenshot 2021-06-03 at 13.57.13.png




IMG_1915.mp4

fxframes

Jun 3, 2021, 9:10:40 AM6/3/21
to OpenPnP
Re #2 & #3

The offset seems to be the same for all placements regardless of their rotation setting in the job.

fxframes

Jun 3, 2021, 9:21:37 AM6/3/21
to OpenPnP
Re #4

Yes I've double-checked that. :)
Rotation for all 4 fiducials on this board is set to 0.
When moving to them the center falls within the center area of the fiducials for all of them.
This is the "worst" one:

Screenshot 2021-06-03 at 14.18.12.png 

Re #5

I've double-checked that as well.

Screenshot 2021-06-03 at 13.50.10.pngScreenshot 2021-06-03 at 13.50.02.png

Thanks.
––––––

ma...@makr.zone

Jun 3, 2021, 9:52:13 AM6/3/21
to ope...@googlegroups.com

> I've disabled pre-rotation BUT the parts still rotate on their way to the bottom camera ?

> ...but both rotate ~180º before arriving at the bottom camera.

The feeder itself and the part inside the tape can also each have a rotation, so this is normal. The difference is that the part must be visible at 0° in the camera when it is aligned. The rotation 0° means: "I see the part in the same orientation as when I look at the footprint as drawn in the E-CAD library".

Conversely, with pre-rotate: "I see the part as it will be placed on the PCB on the machine". So it will already have the rotation of the design plus the rotation of the PCB itself. The advantage of pre-rotate is that any inaccuracies through the rotation (including runout, backlash etc.) will already be compensated out. Important for large parts, where a few degrees offset will result in relatively large pad offsets, due to leverage.
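The leverage effect Mark mentions is easy to quantify (a rough sketch with made-up numbers, not from OpenPnP): the outermost pads shift by roughly half the part length times the sine of the angular error.

```python
import math

def pad_offset_mm(part_length_mm: float, angle_error_deg: float) -> float:
    """Approximate shift of the outermost pads for a given angular error."""
    return (part_length_mm / 2.0) * math.sin(math.radians(angle_error_deg))

# A 2 deg error barely matters on a 1 mm part, but on a 10 mm package it
# approaches a typical fine-pitch pad width:
print(round(pad_offset_mm(1.0, 2.0), 3))   # 0.017 (mm)
print(round(pad_offset_mm(10.0, 2.0), 3))  # 0.174 (mm)
```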

_Mark


fxframes

Jun 3, 2021, 10:32:34 AM6/3/21
to OpenPnP
So I've set up a partial job so that I could send you the log file. Maybe you can also help me with the Java error at the beginning of the log file?

This is the job:

Screenshot 2021-06-03 at 14.58.27.png

This is the result:

IMG_1917.jpg

And here are the log file and the machine.xml file.

OPnP.Log.210603.txt
machine.copy.210603.xml

tony...@att.net

Jun 3, 2021, 11:01:39 AM6/3/21
to OpenPnP
I don't know if this is the source of your problem but you can't home the rotation axis when using TinyG:

2021-06-03 14:52:29.283 GcodeAsyncDriver$WriterThread TRACE: [serial://tty.usbserial-D308QQ68] >> G28.2X0Y0Z0A0

2021-06-03 14:52:43.918 GcodeDriver$ReaderThread TRACE: [serial://tty.usbserial-D308QQ68] << Homing error - A axis settings misconfigured

Remove A0 from your HOME_COMMAND - Mark, can you check the recommended TinyG HOME_COMMAND to make sure it doesn't include the rotation axis?

ma...@makr.zone

Jun 3, 2021, 11:35:04 AM6/3/21
to ope...@googlegroups.com

Good catch Tony! I'll fix it.

@fxframes, in the meantime you can set it manually. I would be surprised if this is related to the problem here, though...

https://github.com/openpnp/openpnp/wiki/GcodeDriver%3A-Command-Reference#home_command

_Mark


fxframes

Jun 3, 2021, 12:39:46 PM6/3/21
to ope...@googlegroups.com
» I don't know if this is the source of your problem but you can't home the rotation axis when using TinyG:

Thanks Tony, I’ve edited the command manually, but unfortunately it didn’t help with the rotation issue. 

» The feeder itself and the part inside the tape can also each have a rotation, so this is normal. The difference is that the part must be visible at 0° in the camera when it is aligned. The rotation 0° means: "I see the part in the same orientation as when I look at the footprint as drawn in the E-CAD library

If I’m understanding this correctly, Mark, should the part be correctly aligned before it leaves the bottom camera and is moved over to the PCB to be placed? Because this is not what’s happening. These images are from the bottom camera right after the part is brought down for alignment (pre-rotation is still off), just before the nozzle lifts it up. It’s the image that “stays on” for a little bit while the head moves over to the PCB. You can see that the center and orientation are correctly identified, but the nozzle lifts the part just like that and then rotates it further on its way to the PCB.

Thanks.




ma...@makr.zone

Jun 3, 2021, 1:05:29 PM6/3/21
to ope...@googlegroups.com

> If I’m understanding this correctly Mark, should the part be correctly aligned before it leaves the bottom camera and is moved over to the pcb to be placed?

Well, not really. When pre-rotate is disabled, what you see is what OpenPnP initially thinks is zero degrees. So what you effectively see is the pick angle error.

In your images it is huge and both the part and the nozzle angle (visible as the crosshairs) are strange.

Something is really wrong here.

  • Check the angle(s) of the feeders/rotation in tape.
  • Check if the rotation axis turns the right way around (Issues & Solutions should have told you to use one of the Flip options to mirror the image, but if you have a mirror in your camera view path, then that advice is false)
  • Check the pulley on the rotation axis. When I assembled my Liteplacer I forgot to fasten the pulley screw there and "I almost bit the table edge" (to translate a Swiss-German idiom verbatim) trying to find that bug. :-}

_Mark


fxframes

Jun 3, 2021, 2:56:12 PM6/3/21
to ope...@googlegroups.com
Hey Mark,

Thanks for the support.
Like I said before, what’s weird is that the same thing is happening for all the parts, with different nozzles and of course different vision pipelines. I’m saying this because the first thing I’d normally check is the vision pipeline. :)

To your points:

» Check the angle(s) of the feeders/rotation in tape.

My tape orientations are all 0 and I adjust for that in the job.

» Check if the rotation axis turns the right way around

It does.

» Check the pulley on the rotation axis.

I did, I even re-tightened it just to make sure.

–––

Here’s the job setup for this part:



A question: if the tape rotation is set at 0 and the part rotation is set at -90º, why does OpenPnP rotate the part close to 180º when bringing it to the bottom camera? I know «alles ist relativ» ("everything is relative") :) but why would it expect that if the parameters it has are 0 and -90º…?

I’ve noticed what seems to be a strange behaviour: after the camera finds the index hole on a tape, when moving to pick the part the nozzle always rotates ~180º. Then, when bringing the part to the bottom camera, it “undoes” those ~180º, turning back in the other direction. This happens regardless of the tape and job settings. Why would OpenPnP do that? This movement seems to be causing quite an error (as you noticed) by the time the part reaches the bottom camera, even though I understand it should be able to compensate for that.

On the other hand, when picking loose parts I’m "snapping to 0” inside OrientRotatedRects, so the nozzle does not rotate between the feeder and the bottom camera, yet the part gets placed with the same offset. Just thinking out loud at this point, I guess. :P

Thanks



ma...@makr.zone

Jun 3, 2021, 3:33:54 PM6/3/21
to ope...@googlegroups.com

The Strip-Feeder calculates the feeder rotation by looking at the sprocket holes. The Rotation in Tape is on top of that. Looking at your photos, your parts may well be turned 180°.

Now that I look at your feeder image: there is only one hole highlighted in red!

It is possible that all this comes from a faulty Strip Feeder sprocket hole recognition! Yes, that finally could explain why both the part and the nozzle are rotated so oddly.

Are you sure the feeder sprocket hole recognition pipeline works well? After all these are transparent or black plastic tapes (it seems), which is very difficult!!

I never managed to create a reliable pipeline with these on my PushPullFeeder:

circular-symmetry-transparent

I only recently developed a new vision stage that can do it reliably, but that is not yet ready for multi-hole recognition.

https://github.com/openpnp/openpnp/pull/1179#issuecomment-823295084

_Mark


fxframes

Jun 3, 2021, 3:51:55 PM6/3/21
to ope...@googlegroups.com

» Now that I look at your feeder image: there is only one hole highlighted in red!

Yes, I edited the pipeline to reliably find the index hole that corresponds to the part being picked! 😁
How many holes is it expecting to find? The default pipeline found holes all over the place!

I’m hoping that would solve it, I was about to re-flash the TinyG. 🔥


fxframes

Jun 3, 2021, 4:01:08 PM6/3/21
to ope...@googlegroups.com
This is the pipeline I’m using; even if it’s wrong in finding just one hole, hopefully it contains some ideas to help with clear tapes.

Screen Recording 2021-06-03 at 20.55.51.mp4

dennis bonotto

Jun 3, 2021, 4:24:54 PM6/3/21
to ope...@googlegroups.com
I’m manually providing the first hole location, the second hole location and the tape dimensions, that’s why I didn’t think I needed the pipeline to find anything but the correct index hole. 🙂


ma...@makr.zone

Jun 3, 2021, 5:52:26 PM6/3/21
to ope...@googlegroups.com

I'm no expert on the strip feeder; I always thought it needs multiple holes. But maybe that's only for Auto Setup.

The last thing that comes to my mind is that the strip feeder will do very crazy things when not set up exactly right, for instance when the reference/last holes do not match reality. The strip feeder only tries to correct the tape "course"; it cannot correct its initial position.

See this animation:

 https://makr.zone/strip-feeder-crazy.gif

So if, for instance, you shifted your home coordinate and all your feeder holes are off by a certain distance, then the strange pick angle could happen.

But then again, alignment should fix this. Gotta go to bed...

_Mark

fxframes

Jun 4, 2021, 9:26:25 AM6/4/21
to ope...@googlegroups.com
* SOLVED *

First of all thank you all again for the support and ideas.

The fact that I was observing the nozzle rotate 180º back and forth between the pick and the bottom camera alignment, combined with the fact that the angle was off by approximately the same amount for all parts made me think of something I had observed before on my machine: the nozzle adapter could become a little loose over time, especially after occasionally hitting the nozzle tip support. 😁

So I took it out and applied just one turn of thread seal tape and voilà! The offset went away, just like that.


Mark, just regarding the strip feeder pipeline, if you ever figure out if it needs to find multiple holes even in manual setup mode, would you please let me know? 😁

Thanks,

ma...@makr.zone

Jun 4, 2021, 9:32:00 AM6/4/21
to ope...@googlegroups.com

Thanks for reporting this back. This was one heck of an odyssey! ;-)

> Mark, just regarding the strip feeder pipeline, if you ever figure out if it needs to find multiple holes even in manual setup mode, would you please let me know?

Yeah, I had a look. It does not need more than one hole in update mode; multiple holes are only needed in Auto Setup.

_Mark


ma...@makr.zone

Jun 4, 2021, 9:39:36 AM6/4/21
to ope...@googlegroups.com

Oh, and don't forget to re-enable pre-rotate. Like I said, it gives better results.

fxframes

Jun 4, 2021, 9:41:24 AM6/4/21
to ope...@googlegroups.com
Thanks for the reminder, I did forget. :)

Clemens Koller

Jun 5, 2021, 2:35:01 PM6/5/21
to ope...@googlegroups.com
Hi!

On 02/06/2021 21.41, fxframes wrote:
> Well I'm not sure now because it seems that on the LitePlacer the 2.25 gear reduction on the rotation axis dictates that the correct number here is 160 (360º/2.25) which is what it is right now.
> Jason posted this here some time ago. For the record, I did play a little bit with it but not much changed. I'm not sure if this can or even should be set somewhere else.

On the Liteplacer+TinyG, the A-Axis (rotational) can be set up as a linear axis as well as a rotational ("radius mode") axis via $aam.
The scaling differs significantly between these modes. I am currently kind of happy with the radius mode, $aam=3.
With some effort studying the *urgh* documentation, I came to the conclusion that it makes sense to set $ara to 57.2957795 deg, which corresponds to 1 rad, or 360 deg / (2*PI). Then the Travel per Revolution $4tr is 160 because of the 2.25 gear ratio from the stepper to the nozzle axis (360 deg / 2.25 = 160 deg).
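Those two numbers check out (a quick sketch of the arithmetic, nothing machine-specific):

```python
import math

# $ara = 1 rad expressed in degrees, i.e. 360 / (2 * pi).
ara = 360.0 / (2.0 * math.pi)

# $4tr = degrees of nozzle rotation per motor revolution with a 2.25:1 reduction.
travel_per_rev = 360.0 / 2.25

print(round(ara, 7), travel_per_rev)  # 57.2957795 160.0
```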

Here are my A-Axis parameters for TinyG in "radius mode":

$4ma=3
$4sa=0.9
$4tr=160
$4mi=8
$4po=0
$4pm=2
$aam=3
$avm=100000
$afr=100000
$atn=-1
$atm=-1
$ajm=57600
$ajh=57600
$ajd=0.05
$ara=57.2957795
$asn=0
$asx=0
$asv=21600
$alv=21600
$alb=5
$azb=2

Maybe somebody can post the parameters for linear mode - is anybody using it?

As mentioned some time ago, I strongly recommend avoiding setting $ parameters from within OpenPnP, as TinyG is unable to update them internally as fast as it accepts them, and OpenPnP just cannot know that it's pushing TinyG beyond its limits!!!

Greets,

Clemens


Clemens Koller

Jun 5, 2021, 3:27:02 PM6/5/21
to ope...@googlegroups.com
Hi!

I forgot to mention that I've set up my A-Axis differently from the default Liteplacer+TinyG: the A-Axis stepper is mounted facing upwards rather than
downwards, to get more Z travel range. In any case, take care that the direction of rotation is correct. When I press the right "C-Axis" button, my A-Axis turns (regular view onto the PCB) negatively, i.e. clockwise, the same direction as shown on the button image.

You can change the rotation of the stepper simply by inverting the current through one winding (swapping the leads) or, of course, in the TinyG configuration.

Just in case you want to check the precision/repeatability of a rotational axis, you can glue a little mirror onto the axis and use a laser pointer to
magnify the positional error with the distance to a wall. It still sticks on my nozzle axis.
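To get a feel for how much the mirror trick magnifies things (my own back-of-envelope numbers, not from the thread): a mirror doubles the beam deflection, so an axis error of e degrees moves the dot by about D * tan(2e) at distance D.

```python
import math

def dot_shift_mm(axis_error_deg: float, wall_distance_mm: float) -> float:
    """Laser dot displacement on the wall; the mirror doubles the deflection angle."""
    return wall_distance_mm * math.tan(math.radians(2.0 * axis_error_deg))

# A 0.1 deg axis error seen from 2 m away is already a visible ~7 mm shift:
print(round(dot_shift_mm(0.1, 2000.0), 1))  # 7.0 (mm)
```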

Clemens
20210605_Liteplacer_TinyG_Alternate_Setup_Mirror_cut.jpg

Clemens Koller

Jun 5, 2021, 3:51:09 PM6/5/21
to ope...@googlegroups.com
Hi!

On 03/06/2021 22.00, fxframes wrote:
> This is the pipeline I’m using, even if it’s wrong in finding just one hole, hopefully it contains some ideas to help with clear tapes.

This doesn't look okay or robust in my opinion.
The DetectCirclesHough is supposed to detect all circles in the image (unless masked).

My thoughts are:
Do not use the MaskCircle d=200, as you cannot really see how robust the circle detection is. I recommend using a MaskRectangle if necessary.
Do not use BlurGaussian at all (I tend to say: generally in OpenPnP) - use BlurMedian 5 because of its edge- and round-corner-preserving behaviour.
(In OpenPnP, the only reason I use BlurGaussian followed by a Threshold operation is to do some lazy erosion/dilation.)
Do not use DetectEdgesCanny to prepare an image for DetectCirclesHough, as DetectCirclesHough (unfortunately?) already has a Canny edge detector built in. If you use DetectEdgesCanny before DetectCirclesHough, the Hough operation works on edges of edges (= two edges), which leads to positional jitter.

With that in mind, I strongly suggest replacing OpenPnP's default image pipelines.

Attached is my image pipeline for regular 1608M resistors in a white tape on black background.
Since all my tapes are aligned horizontally quite close to each other, I am using MaskRectangle accordingly. But this is optional.

Greets,

Clemens
imgpipeline-R1608M-Hough.xml

fxframes

Jun 7, 2021, 5:31:10 AM6/7/21
to ope...@googlegroups.com
Hello

My A axis is set to linear, here are the parameters.

[4ma] m4 map to axis 3 [0=X,1=Y,2=Z...]
[4sa] m4 step angle 0.900 deg
[4tr] m4 travel per revolution 160.0000 mm
[4mi] m4 microsteps 8 [1,2,4,8]
[4po] m4 polarity 0 [0=normal,1=reverse]
[4pm] m4 power management 2 [0=disabled,1=always on,2=in cycle,3=when moving]
[aam] a axis mode 1 [standard]
[avm] a velocity maximum 50000 deg/min
[afr] a feedrate maximum 200000 deg/min
[atn] a travel minimum 0.000 deg
[atm] a travel maximum 600.000 deg
[ajm] a jerk maximum 5000 deg/min^3 * 1 million
[ajh] a jerk homing 5000 deg/min^3 * 1 million
[ajd] a junction deviation 0.1000 deg (larger is faster)
[ara] a radius value 5.3052 deg
[asn] a switch min 0 [0=off,1=homing,2=limit,3=limit+homing]
[asx] a switch max 0 [0=off,1=homing,2=limit,3=limit+homing]
[asv] a search velocity 2000 deg/min
[alv] a latch velocity 2000 deg/min
[alb] a latch backoff 5.000 deg
[azb] a zero backoff 2.000 deg
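For reference, a quick sanity check of what those settings imply (a sketch in Python; the 2.25:1 figure only holds under the assumption that OpenPnP commands this axis in degrees with 1 unit = 1 degree):

```python
# Sanity check of the TinyG A-axis scaling from the settings above.
step_angle = 0.9       # [4sa] degrees per full step
microsteps = 8         # [4mi]
travel_per_rev = 160.0 # [4tr] commanded units per motor revolution

# 400 full steps * 8 microsteps = 3200 microsteps per motor revolution.
steps_per_rev = (360 / step_angle) * microsteps
# What TinyG uses internally as steps per commanded unit.
steps_per_unit = steps_per_rev / travel_per_rev
# If 1 commanded unit = 1 degree of nozzle rotation, one motor revolution
# covers 160 degrees, implying a 360/160 = 2.25:1 reduction motor-to-nozzle.
implied_ratio = 360 / travel_per_rev

print(steps_per_rev, steps_per_unit, implied_ratio)
```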


fxframes

unread,
Jun 7, 2021, 6:13:50 AM6/7/21
to ope...@googlegroups.com
Hi again,

Thanks for your comments.
I use different pipelines for different parts/feeders.
The pipeline that I posted is specific to that part which is supplied in a clear plastic tape and it’s been very reliable so far.

Do not use the MaskCircle d=200 as you cannot really see how robust the circle detection is. I recommend to use a MaskRectangle if necessary.

I use MaskRectangle and MaskCircle interchangeably. Either is fine as long as you ensure the dimensions of the area of interest always falls within the resulting image.

Do not use BlurGaussian at all (I tend to say: generally in OpenPnP) - use BlurMedian 5 because of it's edge and round corner preserving behaviour.

I typically combine both to achieve a good contrast and minimize the 3D printed lines of my feeders. Because the feeders are in different positions, with slightly different lighting, with different tape types, I always play with both to achieve a clear edge. For the clear plastic tapes, I also add a HistogramEqualize stage to help with that.


Do not use DetectEdgesCanny to prepare an image for DetectCirclesHough, as the DetectCirclesHough has (unfortunately?) already an Canny Edge Detector built. 

You can see in my video that that stage is actually disabled.

Attached is my image pipeline for regular 1608M resistors in a white tape on black background.

Yes for white paper tapes on a dark background my pipeline is similar to yours.

Cheers

--
You received this message because you are subscribed to a topic in the Google Groups "OpenPnP" group.
To unsubscribe from this topic, visit https://groups.google.com/d/topic/openpnp/HPZlD0ohRcE/unsubscribe.
To unsubscribe from this group and all its topics, send an email to openpnp+u...@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/openpnp/84593f71-c3f4-3884-5c18-b89fd48d35f0%40gmx.net.

ma...@makr.zone

unread,
Jun 7, 2021, 11:34:46 AM6/7/21
to ope...@googlegroups.com
On 05.06.2021 at 21:51, Clemens Koller wrote:
Do not use BlurGaussian at all (I tend to say: generally in OpenPnP) - use BlurMedian 5 because of it's edge and round corner preserving behaviour.

If you say it this general, I disagree ;-)

BlurMedian should only be used with essentially binary (black and white/very high contrast) images, to erode away insignificant specks typically after a thresholding/channel masking/edge detection operation has taken place.

On color/gray-scale images with soft gradients, BlurMedian loses location information, i.e. it unpredictably "shifts around" the image by as much as its kernel size. An unevenly lit gradient - e.g. a rounded edge on a paper sprocket hole, a ball on a BGA or a bevel on a pin lit slightly from the side - may appear to shift to one side. As you cannot determine the percentile (it always takes the fiftieth percentile, a.k.a. the median), it effectively creates an artificial edge at an unpredictable gradient level.

Look at this animation I made from images by Tutorials Point:

https://makr.zone/blur-median.gif

Source:
https://www.tutorialspoint.com/opencv/opencv_median_blur.htm

See how the rucksack is "growing" into the back of the boy, how the top is "lifted", how the boy's face seems to be "pushed in", how the balloon seen closest to him seemingly shifts position down/left.

Of course the effect is exaggerated here, but you get the idea. 

If there is a rule, it is:
  1. Use BlurGaussian before a thresholding/channel masking/edge detection operation.
  2. Use BlurMedian after a thresholding/channel masking/edge detection operation.
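A minimal sketch of that rule in Python with SciPy (not an OpenPnP pipeline; the image and parameter values are made up for illustration): Gaussian smoothing before the threshold, median filtering afterwards to erode lone speck pixels.

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(0)
# Noisy gray-scale image: a bright square on a dark background.
img = np.zeros((64, 64))
img[16:48, 16:48] = 1.0
noisy = img + rng.normal(0, 0.2, img.shape)

# Rule 1: Gaussian *before* thresholding (preserves edge location on average).
smoothed = ndimage.gaussian_filter(noisy, sigma=1.5)
binary = smoothed > 0.5
# Rule 2: Median *after* thresholding, to remove insignificant specks.
cleaned = ndimage.median_filter(binary.astype(np.uint8), size=3)
```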

_Mark


Clemens Koller

unread,
Jun 7, 2021, 2:05:05 PM6/7/21
to ope...@googlegroups.com
Hi, Mark!

On 07/06/2021 17.34, ma...@makr.zone wrote:
> On 05.06.2021 at 21:51, Clemens Koller wrote:
>> Do not use BlurGaussian at all (I tend to say: generally in OpenPnP) - use BlurMedian 5 because of it's edge and round corner preserving behaviour.
>
> If you say it this general, I disagree ;-)
>
> BlurMedian should only be used with essentially binary (black and white/very high contrast) images, to erode away insignificant specks typically after a thresholding/channel masking/edge detection operation has taken place.

I disagree, too, if you say "only". ;-)

There is a quite good explanation here: https://en.wikipedia.org/wiki/Median_filter#Edge_preservation_properties
An important part is also: "For small to moderate levels of Gaussian noise..." which I assume is the usual case in OpenPnP's camera images.

I agree with you that a BlurMedian (as well as a BlurGauss+Threshold) on a binary image can be (mis-)used as an erosion/dilatation operation.
However, I would not advise doing that. If you get a lot of undesired speckles (i.e. after applying a threshold), your preprocessing is likely not very robust.

> On color/gray-scale images with soft gradients, BlurMedian loses location information, i.e. it unpredictably "shifts around" the image by as much as its kernel size.

Yes, but it's not unpredictable.
And a BlurGaussian + Threshold operation is also displacing edges with an uneven amount depending on the Blur radius + threshold value.

> An unevenly lit gradient - e.g. a rounded edge on a paper sprocket hole, a ball on a BGA or a bevel on a pin lit slightly from the side - may appear to shift to one side. As you cannot determine the percentile (it always takes the fiftieth percentile, a.k.a. the median), it effectively creates an artificial edge at an unpredictable gradient level.

This is true. I was however not talking about edge displacement / shifting (losing spatial information), which can be a problem in some cases. I was talking about edge-preserving behaviour (maintaining contrast information, which is what Canny chews on). Edge displacement is very often not an issue in OpenPnP's use cases when the filter operation is sufficiently isotropic.

> If there is a rule, it is:
>
> 1. Use BlurGaussian before a thresholding/channel masking/edge detection operation.
> 2. Use BlurMedian after a thresholding/channel masking/edge detection operation.

I don't think these rules are helpful or valid in a general sense. Of course, it all depends on the final goal of the image operations.

I would vote to implement one of the advanced bilateral filters to improve robustness in this case.
It's a pity that I don't have more time to dig deeper into the code, here. :-(

Clemens

ma...@makr.zone

unread,
Jun 7, 2021, 3:55:09 PM6/7/21
to ope...@googlegroups.com

> I was however not talking about edge displacement / shifting (loosing spatial information), which can be a problem in some cases. I was talking about edge preserving behaviour (maintaining contrast information...

OK, I understand. For filters in photography, you're absolutely right. Sorry, I was talking about performing machine vision, not about erm... beautifying photos ;-) I guess you agree that for machine vision, it is 100% about spatial information.

> ... which is what Canny chews on

Canny specifically should be preceded by a Gaussian filter:
https://en.wikipedia.org/wiki/Canny_edge_detector#Gaussian_filter
https://docs.opencv.org/master/da/d5c/tutorial_canny_detector.html

If you want to improve on it, you'd need a special replacement for the Gaussian filter "in order to reach high accuracy of detection of the real edge":
https://en.wikipedia.org/wiki/Canny_edge_detector#Replace_Gaussian_filter

Edges are always fickle, lots of tuning needed. The most robust solution is not to detect edges in the first place but instead work probabilistically. Like with template image matching.

... or with my circular symmetry stage, see the "Example Images" section here:
https://github.com/openpnp/openpnp/pull/1179

I since tested it on the machine. It nails everything, zero detection failures so far (except those caused by the one sitting behind the keyboard). Sprocket holes in tapes of all colors/transparent tapes on all backgrounds, nozzle tips, (bad) fiducials. Even completely out of focus with barely any contrast. Doesn't care a bit about changing ambient light.

One original pipeline, zero setup: all it requires is the expected diameter. Once the camera Units per Pixel are known, it's a no-brainer. Everybody can use a caliper on a nozzle tip or read the datasheet, no pipeline editing skills required. OpenPnP provides the diameter dynamically to the pipeline from easy-to-set GUI settings or existing data (like footprints, if available).
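For readers curious how such a stage can work in principle, here is a hedged Python/NumPy sketch of the basic idea only (this is not Mark's actual implementation, which is in Java and far more optimized): pixels at equal radius from the true center should have equal brightness, so the summed per-ring brightness variance is lowest at the center.

```python
import numpy as np

def circular_symmetry_score(img, cx, cy, rmax):
    """Lower score = more circularly symmetric around (cx, cy).
    Sums the brightness variance within each integer-radius ring."""
    h, w = img.shape
    yy, xx = np.mgrid[0:h, 0:w]
    rr = np.sqrt((xx - cx) ** 2 + (yy - cy) ** 2).astype(int)
    score = 0.0
    for r in range(rmax):
        ring = img[rr == r]
        if ring.size:
            score += ring.var()
    return score

# A dark disc (radius 10) centred at (20, 20) on a light background.
img = np.ones((40, 40))
yy, xx = np.mgrid[0:40, 0:40]
img[(xx - 20) ** 2 + (yy - 20) ** 2 < 100] = 0.0

s_true = circular_symmetry_score(img, 20, 20, 15)  # at the true centre
s_off = circular_symmetry_score(img, 25, 20, 15)   # 5 px off-centre
```

Scanning this score over candidate positions (within maxDistance of the expected position) and taking the best-scoring ones is the gist of the search.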

In the meantime I gave it sub-pixel accuracy. Working on multi-target-detection now...

Sorry about the blab, I'm just really happy how this turned out. 8-D

_Mark

P.S. It took me a while to come back to the idea...
https://groups.google.com/g/openpnp/c/0-S2DMXe3t0/m/0FCu8kTzBQAJ

Clemens Koller

unread,
Jun 8, 2021, 10:18:56 PM6/8/21
to ope...@googlegroups.com
Hi, Mark!

On 07/06/2021 21.55, ma...@makr.zone wrote:
> > I was however not talking about edge displacement / shifting (losing spatial information), which can be a problem in some cases. I was talking about edge-preserving behaviour (maintaining contrast information...
>
> OK, I understand. For filters in photography, you're absolutely right. Sorry, I was talking about performing machine vision not about erm... beautifying photos ;-) I guess you agree, for machine vision, it is 100% about spatial information.

I was and I am talking about OpenPnP, not about any photography purposes. There is no need to play dumb with me.

> > ... which is what Canny chews on
>
> Canny specifically should be prepended by a Gaussian filter:
> https://en.wikipedia.org/wiki/Canny_edge_detector#Gaussian_filter
> https://docs.opencv.org/master/da/d5c/tutorial_canny_detector.html

This (what is written there) is true for the general case. It would not be a good idea to suggest the median filter *generally*, as it will fail badly when you have to deal with salt-and-pepper noise, for example. But this is usually/very likely *not* the case in OpenPnP, where people are using normal CMOS (CCD) cameras, which have a relatively uniform Gaussian (thermal) noise distribution (AWGN+x). In this case, the median filter will simply perform better, as I've tried to explain already. That's why I included the Wikipedia link to the median filter last time. Maybe you didn't notice:
"For small to moderate levels of Gaussian noise, the median filter is demonstrably better than Gaussian blur at removing noise whilst preserving edges for a given, fixed window size."

I worked for several years in a company where we had to deal with edge detection on all kinds of solar cells and ceramic patches with sub-pixel precision, up to the point where we started to go with telecentric optics and shorter wavelengths to achieve higher precision. I am not talking about general approaches. Machine vision only. 8-)

> If you want to improve on it, you'd need a special replacement for the Gaussian filter "in order to reach high accuracy of detection of the real edge":
> https://en.wikipedia.org/wiki/Canny_edge_detector#Replace_Gaussian_filter
>
> Edges are always fickle, lots of tuning needed. The most robust solution is not to detect edges in the first place but instead work probabilistically. Like with template image matching.

Well... it all depends on the final goal and how to wisely tailor the intermediate processing steps to achieve it. Adaptive filtering is only as good as the model of your imaging channel. It is likely not a big deal to do better than the "blind" approach (one that does not take the image content into account to adjust the filters), which is more or less what the current default image pipelines in OpenPnP are.


> ... or with my circular symmetry stage, see the "Example Images" section here:
> https://github.com/openpnp/openpnp/pull/1179
>
> I since tested it on the machine. It nails everything, zero detection fails so far (that were not sitting behind the keyboard). Sprocket holes in tapes of all colors/transparent on all backgrounds, nozzle tips, (bad) fiducials. Even completely out of focus with barely any contrast. Doesn't care a bit about changing ambient light.
>
> One original pipeline, zero setup: All it requires is the expected diameter. Once the camera units per pixel are known, it's a no-brainer. Everybody can use a caliper on a nozzle tip or read the datasheet, no pipeline editing skills required. OpenPnP provides the diameter dynamically to the pipeline from easy to set GUI settings or existing data (like footprints if available).
>
> In the meantime I gave it sub-pixel accuracy. Working on multi-target-detection now...
>
> Sorry about the blab, I'm just really happy how this turned out. 8-D

I am looking forward to test this and read the code when I setup the next PCB on the machine.


Otherwise, as I wrote some weeks ago, it would be fun to bring my code based on https://users.fmrib.ox.ac.uk/~steve/susan/ back to life. But the stuff is highly optimized C++ code with lookup tables and pointer arithmetic, because we had to go for maximum throughput... probably not so easy to carry over into the Java world. The math behind SUSAN is IMO very interesting and can be tailored (adaptively) to all kinds of feature extraction stages (adaptive denoising, edge detection, corner detection, and even anisotropic variants). I think I've read that the patent has expired in the meantime... There are several sources out in the wild. I am not sure if something has already ended up in OpenCV or not. YMMV.


Clemens

ma...@makr.zone

unread,
Jun 9, 2021, 3:53:24 AM6/9/21
to ope...@googlegroups.com

Hi Clemens,

To explain why I bother: I was originally responding to this:

> Do not use BlurGaussian at all (I tend to say: generally in OpenPnP) - use BlurMedian 5 because of it's edge and round corner preserving behaviour.

All I was saying is that this is not true in its general and absolute form. I would still argue it is more often wrong than true.

And I started to care because this has a potency to mislead users.

> "For small to moderate levels of Gaussian noise, the median filter is demonstrably better than Gaussian blur at removing noise whilst preserving edges for a given, fixed window size."

Well, I still believe this sentence applies to photography. Immediately before the sentence you cited, it says: "Edges are of critical importance to the visual appearance of images, for example."

https://en.wikipedia.org/wiki/Median_filter#Edge_preservation_properties

It does preserve an edge, yes, but not necessarily at the right place, as I demonstrated with the boy+balloons image:

https://makr.zone/blur-median.gif

Like I said, the median blur is fine if the image at hand is already very high contrast, ideally already binary. If there are no relevant smooth gradients or artifacts involved in or around the edge, then OK.

If in doubt, use Gaussian. Gaussian better preserves spatial information, at least above the channel (integral) resolution and noise level. Hence it is benign for reducing noise and other artifacts. Most common cameras use MJPEG or other compression methods that produce artifacts. These look nice to our brains but are bad for machine vision. Compression often involves an underlying 8x8-pixel block size. Gaussian will typically restore a weaker, but more likely correct, edge signal out of that (probabilistically speaking).

> I am looking forward to test this and read the code when I setup the next PCB on the machine.

You can already do that, if you want. It's already in newer OpenPnP 2.0 (not yet with sub-pixel accuracy). The pipelines are posted in the PR. Just paste them and try. First-version code is also linked:

https://github.com/openpnp/openpnp/pull/1179

_Mark

fxframes

unread,
Jun 15, 2021, 12:40:42 PM6/15/21
to ope...@googlegroups.com
Thank you Mark, this stage works really well. 👍🏻

fxframes

unread,
Jun 15, 2021, 2:42:32 PM6/15/21
to OpenPnP
Mark,

When using your new stage with a strip feeder I'm getting an error when trying to pick a part:

Screenshot 2021-06-15 at 19.39.37.png

Any easy way around it?

Thanks.

ma...@makr.zone

unread,
Jun 15, 2021, 2:50:23 PM6/15/21
to ope...@googlegroups.com

Hi fxframes,

this is not yet supported in the current stage, it can only detect one hole. But I'm just in the process of testing this. ;-)

Coming soon!

_Mark


fxframes

unread,
Jun 15, 2021, 4:11:25 PM6/15/21
to OpenPnP
Hi Mark,

As discussed previously, for the "manual" feeder mode only one hole is needed, so I thought it was more of a formatting issue (key points vs circles).
Anyway thanks for the feedback. :)

ma...@makr.zone

unread,
Jun 15, 2021, 4:44:24 PM6/15/21
to ope...@googlegroups.com

Ah yes, if you're only using it for the feed vision and not for Auto Setup then it should work.

But the PR is already done:
https://github.com/openpnp/openpnp/pull/1217

Please update your OpenPnP 2.0 version.

You could help me by testing the pipelines as proposed in the PR. ;-)

They should work out-of-the-box, the goal is no tuning with any tape color or material, any background color or material, transparent tapes etc.

Also for Auto Setup.

But be mindful that you need quite accurate Units per Pixel set on the camera, and your feeders must be close to the camera focal plane in Z.

If it does not work as-is with your feeders, please send the input images (insert an ImageWriteDebug stage right after ImageCapture). Appreciated!

_Mark

fxframes

unread,
Jun 16, 2021, 4:50:41 AM6/16/21
to ope...@googlegroups.com
Thanks Mark, I will test this and report back.

fxframes

unread,
Jun 16, 2021, 5:30:37 AM6/16/21
to OpenPnP
Hello Mark,

This doesn't seem quite right yet.

Out of the box and running auto setup it gives me

Screenshot 2021-06-16 at 10.20.31.png

This is what the pipeline looks like. You can see the hole in the center isn't "found".

Screenshot 2021-06-16 at 10.21.27.png

Screenshot 2021-06-16 at 10.21.50.png

Also when maxTargetCount=1 it finds the wrong hole.

Screenshot 2021-06-16 at 10.24.19.png

fxframes

unread,
Jun 16, 2021, 5:36:41 AM6/16/21
to ope...@googlegroups.com
At another position, with maxTargetCount=6, in this image it can only find 2 holes, and none is at the center.



—————

With maxTargetCount=1 it finds a hole far from the center.





fxframes

unread,
Jun 16, 2021, 5:51:09 AM6/16/21
to ope...@googlegroups.com
Hi Mark,

I know you’re focusing on auto-setup but I was reading this

• maxTargetCount: Maximum number of targets (e.g. sprocket holes) to be found. The targets with best symmetry score are returned. Overlap is automatically eliminated.

And thinking that when we're looking for just 1 result (as in setting up the feeders manually), an off-center hole could easily have a higher symmetry score than the centermost target, which is the one we'd usually be looking for.

I can make the pipeline work (for maxTargetCount=1) by adding a mask.


 


ma...@makr.zone

unread,
Jun 16, 2021, 8:08:27 AM6/16/21
to ope...@googlegroups.com

Hi fxframes

thanks for testing!

The DetectCircularSymmetry stage has a search range (the maxDistance property) that limits the search to this maximum distance from the expected position, which is usually the camera center (but it can be overridden by the vision operation, e.g. in the camera calibration associated with nozzle tip calibration).

The search range controls both the scope and the computational cost of the stage. Remember: I had to develop it in Java, and Java is clearly not a good match for this low-level pixel-crunching stuff. Conclusion: there is no need for a mask.

But both the expected position and the search range are only parametrized by OpenPnP inside the actual vision function of the specific (feeder) operation. It is different in Auto Setup (range goes to the camera edge) and in the feed operation (range is only half a sprocket pitch).

In the Editor, the search range is like in Auto Setup, i.e. it extends to the camera edge, so the number and selection of holes detected might be misleading.

You can make the search range visible by enabling the diagnostics switch. Then the difference becomes visible in the result images with the overlaid heat map, see the ImageWriteDebug images below.

The important question for me is this: Does it work when used in normal operation, i.e. with Auto Setup and with feed operations?

For a feed operation (positional calibration):


In Auto-Setup / Editor (note my camera's field of view is too narrow at the moment, because I lifted the table by 20mm but not yet the camera, so the Auto-Setup range is a bit narrow):

_Mark


ma...@makr.zone

unread,
Jun 16, 2021, 8:53:05 AM6/16/21
to OpenPnP
Just saw your other post online where you said it did not work (the e-mail swallowed the images and I misunderstood).

I must have pasted the wrong version of the pipeline into the Wiki/PR; the target count should not be this limited, and the correlation is also too high.

What I did not foresee was that the parts can be this close to the sprocket hole, so the outer margin was too high.

I corrected it, please try this one:


I also added ImageDebugWrite stages, so you can easily enable them and send me the original images when there are still problems.

This is certainly more challenging than the simple fiducial and nozzle tip use case, thanks for helping me! :-D

_Mark

fxframes

unread,
Jun 16, 2021, 9:19:24 AM6/16/21
to ope...@googlegroups.com
Hi Mark,

Thank you for the continued support and improvement. 👍🏻
This is the least I can do.

The new pipeline did improve things.
Auto-setup worked for one of my clear plastic tapes, but not for the other one. I must say, though, that this second one has been very challenging to get right even in manual mode: unlike most "clear" tapes, which aren't actually that transparent, this one is very clear, which makes things more difficult. Any suggestions are welcome.

I’ve attached a couple of videos.

Screen Recording 2021-06-16 at 14.06.25.mp4
Screen Recording 2021-06-16 at 14.08.25.mp4

ma...@makr.zone

unread,
Jun 16, 2021, 10:54:47 AM6/16/21
to ope...@googlegroups.com

Could you take this pipeline (if you haven't already):

https://github.com/openpnp/openpnp/wiki/DetectCircularSymmetry#referencestripfeeder

And then enable the first ImageDebugWrite stage and then send me the images?

Found here:

$HOME/.openpnp2/org.openpnp.vision.pipeline.stages.ImageWriteDebug/

Don't forget to disable again, it creates a ton of images in Auto Setup.

_Mark


ma...@makr.zone

unread,
Jun 16, 2021, 11:51:47 AM6/16/21
to ope...@googlegroups.com

Got the images. Can you please post your down-looking camera's Units per Pixel?

Thanks,
Mark

fxframes

unread,
Jun 16, 2021, 12:22:16 PM6/16/21
to ope...@googlegroups.com
Sure thing.

ma...@makr.zone

unread,
Jun 16, 2021, 2:37:45 PM6/16/21
to ope...@googlegroups.com

Yes, very difficult.

These are the gotchas I see:

  1. Your light diffuser has a hole in the middle, where the camera looks through. The clear tape therefore has no reflection there. Having a co-axial light (a half-mirror in front of the camera, bouncing light down the optical axis) would help (I have the same problem on mine).
  2. The feeder holder has a strong layer pattern that is seen through the clear tape, because of point 1. The layer pattern disrupts circular symmetry outside the hole. You may have to reduce subSampling down from 8, so it is not fooled by interference effects. The stage will then be slower.
  3. The feeder holder has an outcropping that keeps the tape in. This outcropping (or its shadow) goes right to the sprocket hole edge. It therefore breaks the circular symmetry around the hole. Ideally you would reduce the outcropping for the next feeders you print; it seems less would do.
  4. You can try to work around that by reducing the outerMargin to 0.1, so the ring margin will not be cut so much. However, this may not work (i.e. you'll have to experiment with other values) due to the next point.
  5. In your video it seems the Units per Pixel are not accurate for the tape surface. I guess it is higher in Z than the PCB surface, i.e. it appears larger. You see how the machine moves much farther than where you clicked. And you see how four ticks on the cross hairs do not align with the sprocket hole pitch (4mm).

  6. I tried reconstructing Units per Pixel and got ~0.0206mm/pixel. I guess your calibrated value is significantly higher, if I got that right ;-).
  7. Once I apply the right Units per Pixel and reduce outerMargin to 0.1, I get detection on the image that seems like the one that fails in the video: strip_7555303524774678967.png
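The reconstruction in point 6 is just the EIA-481 sprocket pitch divided by the measured pixel distance. A sketch (the 194 px figure below is a hypothetical measurement, not taken from the actual images):

```python
# Reconstructing Units per Pixel from the EIA-481 sprocket-hole pitch.
sprocket_pitch_mm = 4.0    # standard tape sprocket pitch
measured_pitch_px = 194.0  # hypothetical: hole-to-hole distance in the image

upp = sprocket_pitch_mm / measured_pitch_px  # mm per pixel
print(round(upp, 4))
```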

Regarding the feeder surface Z:

@tonyluken has introduced "3D" Units per Pixel. Unfortunately it is not yet applied to feeder Z. Once this is available, it will be possible to compensate:

https://github.com/openpnp/openpnp/pull/1112

However, until then you should have the feeder tape surfaces very close in Z to the PCB surface, i.e. where you calibrated your Units per Pixel. Everything must ideally be on the same Z plane. Otherwise, you will likely always have some problems, because these feeders' vision works with well-known absolute geometry from the EIA 481 standard.
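The Z dependence follows from simple pinhole scaling: mm-per-pixel shrinks as the surface moves closer to the lens. A hedged sketch (all values here are hypothetical, for illustration only):

```python
# Why a tape surface above the calibration plane appears "larger":
# with a pinhole model, mm-per-pixel scales with distance from the lens.
cam_z = 50.0           # lens height above the calibration plane, mm (assumed)
upp_calibrated = 0.02  # mm/px measured at the calibration plane (assumed)
feeder_z = 5.0         # tape surface sits 5 mm higher, i.e. closer to the lens

upp_at_feeder = upp_calibrated * (cam_z - feeder_z) / cam_z
# Each pixel now covers less tape, so features look bigger on screen.
print(upp_at_feeder)
```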

For the ReferenceStripFeeder, the issue is mostly with Auto-Setup (I guess that's why you didn't use it even before trying this new stage). There is some tolerance in the code, and maybe you can get it working by playing with innerMargin/outerMargin.

For the routine feed vision, the camera will be centered on the sprocket hole and Units per Pixel will not be so important (maybe for 0402 or 0201 parts where the hole offset detected with the wrong Units per Pixel might start to matter, but I doubt it).

Conclusion: The reason it failed can be well explained (so far) and most of these issues will create similar problems with other stages.

I'm afraid the new stage cannot perform miracles ;-)

_Mark

fxframes

unread,
Jun 16, 2021, 3:32:46 PM6/16/21
to ope...@googlegroups.com
Thanks for the thorough explanation.
I’ll try some of the suggestions tomorrow. 👍🏻

On 16 Jun 2021, at 19:37, ma...@makr.zone wrote:

Yes, very difficult.

These are the gotchas I see:

  1. Your light diffuser has a hole in the middle, where the camera looks through. The clear tape therefore has no reflection there. Having a co-axial light (a half-mirror in front of the camera, bouncing light down) would help (I have the same problem on mine).
  2. The feeder holder has a strong layer pattern that is seen through the clear tape, because of point 1. The layer pattern disrupts the circular symmetry outside the hole. You may have to reduce subSampling from 8, so it is not fooled by interference effects. The stage might then be slower.
  3. The feeder holder has an outcropping that keeps the tape in. This outcropping (or its shadow) goes right to the sprocket hole edge and therefore breaks the circular symmetry around the hole. Ideally you would reduce the outcropping for the next feeders you print; it seems a smaller one would do.
  4. You can try to work around that by reducing the outerMargin to 0.1, so the ring margin will not be cut so much. However, this may not work (i.e. you'll have to experiment with other values) due to the next point.
  5. In your video it seems the Units per Pixel value is not accurate for the tape surface. I guess the tape is higher in Z than the PCB surface, i.e. it appears larger. You see how the machine moves much farther than where you clicked, and how four ticks on the crosshairs do not align with the sprocket hole pitch (4 mm).
  <noljkpaaiceennln.png>
  6. I tried reconstructing Units per Pixel and got ~0.0206 mm/pixel. I guess your calibrated value is significantly higher, if I got that right ;-).
  7. Once I apply the right Units per Pixel and reduce outerMargin to 0.1, I get a detection on the image that seems to be the one that fails in the video: strip_7555303524774678967.png
  <docdkaioemdihpdm.png>

fxframes

unread,
Jun 17, 2021, 10:55:56 AM6/17/21
to ope...@googlegroups.com
Hello Mark,

Once again thanks for the support.

You were absolutely correct that the feeder was higher than the PCB level. I guess that for most components you can get pretty close, but there are quite a number of larger components for which it’s just impossible unless you raise the PCB itself. IMHO, 3D Units per Pixel is an essential add-on for “standard” installations, and hopefully it will come out in the near future.

And yes, the layer pattern was from the print orientation. I re-printed that feeder at a much lower layer height and in a different orientation, and the stage got much more consistent results. After that, changing subSampling to 4 did the trick; even auto setup worked. So far I haven’t messed with the margins. 👍🏻

One final note, did you get one of those coax LEDs for yourself? There doesn’t seem to be a lot of them around. This one seems like it could work.

Thanks!



    ma...@makr.zone

    unread,
    Jun 17, 2021, 11:41:29 AM6/17/21
    to ope...@googlegroups.com

    Glad it works out.

    > One final note, did you get one of those coax LEDs for yourself? There doesn’t seem to be a lot of them around. This one seems like it could work.

    If you want even lighting across the full camera view plus high Z clearance, co-axial lighting becomes problematic, because the half-mirror needs to reach the edge of the reflected light cone (or pyramid?). The mirror glass will need to be large and reach far away from the lens front, reducing Z clearance. You will likely need a longer focal length (which in itself is good, but requires buying a new lens) and a higher camera mounting point (which is difficult on an existing machine design).
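
A back-of-envelope way to see the size problem (my own geometry sketch, not part of the design described above): a 45° half-mirror that must span a field of width w needs glass of length w·√2, and it occupies roughly w of vertical space in front of the lens:

```python
import math

def half_mirror_size(field_width_mm):
    """Footprint of a 45-degree half-mirror covering a given field width.

    Returns (glass length along the mirror, vertical extent it occupies).
    """
    length = field_width_mm * math.sqrt(2)  # mirror plane is at 45 degrees
    z_loss = field_width_mm                 # vertical clearance it consumes
    return length, z_loss

# Covering a 30 mm field costs ~42 mm of glass and ~30 mm of Z clearance.
length, z_loss = half_mirror_size(30.0)
```

This is why the field width (and hence focal length and mounting height) dominates whether full co-axial lighting is practical.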

    Therefore, I was thinking about creating a hybrid design, with only the center part (where the camera needs to peek through the diffuser) being half-mirrored and the rest conventional. You could use one of those very thin microscope cover glasses, which are available in optical quality. LEDs would point up from a ring towards the diffuser, and towards the mirror from a small "side-car" PCB angled at 90°.

    But so far I only got to design a very basic "light cone" in OpenSCAD (6.2 mm lens):


    Another design is not co-axial, but has a diffuser that leaves only a tiny gap (hope you get the two pictures):


    _Mark
