--
You received this message because you are subscribed to the Google Groups "OpenPnP" group.
To unsubscribe from this group and stop receiving emails from it, send an email to openpnp+u...@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/openpnp/ea3d72b7-d683-4f14-8fc2-18cc9c4b92e8%40googlegroups.com.
Hi Trinh

Have you tested with the new Camera Settle Diagnostics? According to my tests, it is not the camera that requires a long Settle Time, but the vibration of the machine.

https://makr.zone/openpnp-advanced-camera-settling/431/

Maybe your machine is mechanically much better, but before you invest a lot of money, you should use the new diagnostics to make sure.

And I would greatly appreciate it if you posted your images afterwards, because after developing this I'd love to see how these diagnostics work on other machines. :-)

You can get images as follows (using the newest OpenPnP 2.0):

_Mark
> I doubt it's the camera that's slowing it down, more the image processing pipeline (OpenCV). Have you profiled it to see where it's spending the time?

If it is the image processing, you can use my new AffineWarp/AffineUnwarp pipeline stages for small parts' Bottom Vision and for the Fiducial Locator. They cut out the center piece of the camera image, so all subsequent stages only have to process far fewer pixels. Plus you get a nice magnification effect for free. :-)

Here are my stages...
Bottom Vision:
<pipeline>
  <stages>
    <cv-stage class="org.openpnp.vision.pipeline.stages.ImageCapture" name="0" enabled="true" settle-first="true" count="1"/>
    <cv-stage class="org.openpnp.vision.pipeline.stages.AffineWarp" name="warp" enabled="true" length-unit="Millimeters" x-0="-2.5" y-0="2.5" x-1="2.5" y-1="2.5" x-2="-2.5" y-2="-2.5" scale="1.0" rectify="false" region-of-interest-property="regionOfInterest"/>
    <cv-stage class="org.openpnp.vision.pipeline.stages.ImageWriteDebug" name="13" enabled="true" prefix="bv_source_" suffix=".png"/>
    <cv-stage class="org.openpnp.vision.pipeline.stages.BlurGaussian" name="10" enabled="true" kernel-size="9"/>
    <cv-stage class="org.openpnp.vision.pipeline.stages.MaskCircle" name="4" enabled="true" diameter="300"/>
    <cv-stage class="org.openpnp.vision.pipeline.stages.ConvertColor" name="1" enabled="false" conversion="Bgr2HsvFull"/>
    <cv-stage class="org.openpnp.vision.pipeline.stages.MaskHsv" name="2" enabled="false" auto="false" fraction-to-mask="0.0" hue-min="0" hue-max="255" saturation-min="0" saturation-max="255" value-min="0" value-max="128" invert="false" binary-mask="false"/>
    <cv-stage class="org.openpnp.vision.pipeline.stages.ConvertColor" name="6" enabled="true" conversion="Bgr2Gray"/>
    <cv-stage class="org.openpnp.vision.pipeline.stages.Threshold" name="12" enabled="true" threshold="160" auto="false" invert="false"/>
    <cv-stage class="org.openpnp.vision.pipeline.stages.BlurMedian" name="17" enabled="true" kernel-size="5"/>
    <cv-stage class="org.openpnp.vision.pipeline.stages.MinAreaRect" name="results1" enabled="true" threshold-min="100" threshold-max="255"/>
    <cv-stage class="org.openpnp.vision.pipeline.stages.ImageRecall" name="3" enabled="true" image-stage-name="warp"/>
    <cv-stage class="org.openpnp.vision.pipeline.stages.DrawRotatedRects" name="8" enabled="true" rotated-rects-stage-name="results1" thickness="2" draw-rect-center="false" rect-center-radius="40" show-orientation="true">
      <color r="255" g="255" b="0" a="255"/>
    </cv-stage>
    <cv-stage class="org.openpnp.vision.pipeline.stages.AffineUnwarp" name="results" enabled="true" warp-stage-name="warp" results-stage-name="results1"/>
    <cv-stage class="org.openpnp.vision.pipeline.stages.ImageWriteDebug" name="15" enabled="true" prefix="bv_result_" suffix=".png"/>
  </stages>
</pipeline>
Fiducial:
<pipeline>
  <stages>
    <cv-stage class="org.openpnp.vision.pipeline.stages.CreateFootprintTemplateImage" name="template" enabled="true"/>
    <cv-stage class="org.openpnp.vision.pipeline.stages.ImageWriteDebug" name="debug_template" enabled="true" prefix="fidloc_template_" suffix=".png"/>
    <cv-stage class="org.openpnp.vision.pipeline.stages.ConvertColor" name="template_gray" enabled="true" conversion="Bgr2Gray"/>
    <cv-stage class="org.openpnp.vision.pipeline.stages.ImageCapture" name="image" enabled="true" settle-first="true" count="1"/>
    <cv-stage class="org.openpnp.vision.pipeline.stages.AffineWarp" name="warp" enabled="true" length-unit="Millimeters" x-0="-3.0" y-0="3.0" x-1="3.0" y-1="3.0" x-2="-3.0" y-2="-3.0" scale="1.0" rectify="true" region-of-interest-property="regionOfInterest"/>
    <cv-stage class="org.openpnp.vision.pipeline.stages.ConvertColor" name="image_gray" enabled="true" conversion="Bgr2Gray"/>
    <cv-stage class="org.openpnp.vision.pipeline.stages.ImageWriteDebug" name="debug_original" enabled="true" prefix="fidloc_original_" suffix=".png"/>
    <cv-stage class="org.openpnp.vision.pipeline.stages.MatchTemplate" name="match_template1" enabled="true" template-stage-name="template_gray" threshold="0.699999988079071" corr="0.8500000238418579" normalize="true"/>
    <cv-stage class="org.openpnp.vision.pipeline.stages.ImageRecall" name="1" enabled="true" image-stage-name="warp"/>
    <cv-stage class="org.openpnp.vision.pipeline.stages.DrawTemplateMatches" name="0" enabled="true" template-matches-stage-name="match_template1">
      <color r="255" g="255" b="0" a="255"/>
    </cv-stage>
    <cv-stage class="org.openpnp.vision.pipeline.stages.ConvertModelToKeyPoints" name="2" enabled="true" model-stage-name="match_template1"/>
    <cv-stage class="org.openpnp.vision.pipeline.stages.DrawKeyPoints" name="draw_keypoints" enabled="true" key-points-stage-name="2">
      <color r="153" g="0" b="0" a="255"/>
    </cv-stage>
    <cv-stage class="org.openpnp.vision.pipeline.stages.DrawTemplateMatches" name="draw_matches" enabled="false" template-matches-stage-name="match_template1"/>
    <cv-stage class="org.openpnp.vision.pipeline.stages.AffineUnwarp" name="match_template" enabled="true" warp-stage-name="warp" results-stage-name="match_template1"/>
    <cv-stage class="org.openpnp.vision.pipeline.stages.ImageRecall" name="recall_image" enabled="false" image-stage-name="image"/>
    <cv-stage class="org.openpnp.vision.pipeline.stages.ConvertModelToKeyPoints" name="results" enabled="true" model-stage-name="match_template"/>
    <cv-stage class="org.openpnp.vision.pipeline.stages.ImageWriteDebug" name="debug_results" enabled="true" prefix="fidloc_results_" suffix=".png"/>
  </stages>
</pipeline>
My camera is only 720p at 60 fps; at 1080p it only does 30 fps.
_m
Hi Trinh, Hi Everybody

Thanks for the images. They do look good.

I wrote the following as a draft for the OpenPnP Wiki and have now pasted it here, so please help me improve the text by giving feedback. English is not my native language, so language feedback is also welcome.
>>
Go to the Camera's Vision Wizard, choose one of the advanced Settle Methods (i.e. not Fixed Time), and enable Diagnostics. Then just press the Camera Settle Test button.

This should now plot a graph. The blue curve shows how fast your camera delivers frames; the scale is in milliseconds. If you move the mouse over the graph, you see the corresponding frames in the Camera View.


There's an info text showing the frame number and the time it took to capture that many frames, so you can calculate the average frame rate your camera delivers (tip: go to the 10th frame, which is easy to divide by :-). Well, I should add the fps to the info text in a future OpenPnP version.
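Until then, the average frame rate is easy to derive from the info text yourself. A minimal sketch (the function name is mine, purely illustrative):

```python
def average_fps(frame_number, elapsed_ms):
    """Average frame rate from the diagnostics info text:
    frames captured divided by the elapsed time (shown in milliseconds)."""
    return frame_number / (elapsed_ms / 1000.0)

# e.g. the info text shows frame 10 after 250 ms:
print(average_fps(10, 250.0))  # 40.0 fps
```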
Where the blue curve is low, it shows how long the frame-settling analysis takes to compute. This gives you a first indication of your CPU's vision processing power.

If you see an irregular or unexpectedly slow rate, reduce your preview FPS on the Camera's Device Settings tab, because the Camera Preview steals frames away from Computer Vision / Settling and makes it slower. I recommend an FPS of only 3 to 5.
Go back to the Camera's Vision tab and retry. Only the occasional irregularity should now appear in the blue curve.
Basic Tuning
You can start tuning the settings with values similar to those in the screenshot, but start with a Settle Threshold of 0.
Then perform one of the settle tests using the icon buttons. These will move the Nozzle in front of the Bottom Camera (or move the Camera itself, if it is the down-looking Camera) and then immediately settle the camera. Read the tool tips for more info about each one.

For finer tuning provoke a real Vision operation, like Part Alignment aka Bottom Vision or Board Fiducials, Nozzle Calibration etc. and go back to the Camera's Vision tab.

You should now see the red curve descend and settle to some value above 0. Move the mouse over the graph ("scratch") to see the settle frames. The relative motion is highlighted by overlaying a "heat map".

When you scratch over the graph, you see a red horizontal line with the indicated difference value. Go to where you think the frame is stable enough and set this value as your threshold. The last settle frame is also the one taken for computer vision. The higher the threshold, the faster your frame settling is. For best speed you need to judge how much residual motion you want to tolerate (if your machine vibrates at all ;-).
You will need to repeat this for different motion and Computer
Vision scenarios and then take the threshold that works for all of
them.
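Since a single threshold must satisfy every scenario, the combined value is simply the most conservative of the per-scenario thresholds. A trivial sketch with hypothetical numbers (function name is mine):

```python
def combined_threshold(per_scenario_thresholds):
    """Each scenario yields the highest threshold whose residual motion
    is still acceptable; the one threshold that works for all scenarios
    is the lowest (most conservative) of these."""
    return min(per_scenario_thresholds)

# hypothetical: bottom vision tolerates 0.8, fiducials 0.5,
# nozzle calibration only 0.3
print(combined_threshold([0.8, 0.5, 0.3]))  # 0.3
```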
Advanced Tuning
If the red curve shows strong oscillation (vibration of the machine), choose a Debounce Frames value to make sure the settling is complete and the oscillation has abated enough. Otherwise a freak coincidence of frames might dip below the threshold prematurely.
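The debounce idea can be sketched as requiring several consecutive frames under the threshold before declaring the camera settled (my own illustration of the principle, not OpenPnP's actual Java code):

```python
def settled_index(scores, threshold, debounce_frames):
    """Index of the frame at which `debounce_frames` consecutive
    difference scores have stayed at or below `threshold`, or None if
    the camera never settles. With debounce_frames=1 a single freak dip
    below the threshold suffices; higher values require the oscillation
    to have truly abated."""
    run = 0
    for i, score in enumerate(scores):
        if score <= threshold:
            run += 1
            if run >= debounce_frames:
                return i
        else:
            run = 0  # oscillation came back up, start counting again
    return None

# an oscillating machine: one freak dip at frame 2, truly settled later
scores = [5.0, 2.0, 0.8, 3.0, 1.5, 0.9, 0.7, 0.6]
print(settled_index(scores, 1.0, 1))  # 2 (fooled by the freak dip)
print(settled_index(scores, 1.0, 3))  # 7 (waits for 3 quiet frames)
```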

If you have varied brightness and limited contrast levels across your vision scenarios (like black Samsung CP40 nozzle tips in calibration, or white paper tape strips on white double-sided tape / a shiny metal desk), use the Contrast Enhance setting to equalize between the scenarios.
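Conceptually, Contrast Enhance is a normalization of the frame's brightness range before frames are compared. A one-dimensional sketch under that assumption (not OpenPnP's exact algorithm; the `strength` parameter is my own illustration):

```python
def contrast_enhance(pixels, strength=1.0):
    """Linearly stretch 8-bit pixel values so the darkest maps toward 0
    and the brightest toward 255 (full stretch at strength=1.0). Frames
    of differing brightness/contrast then become comparable."""
    lo, hi = min(pixels), max(pixels)
    if hi == lo:
        return list(pixels)  # flat image, nothing to stretch
    stretched = [(p - lo) * 255.0 / (hi - lo) for p in pixels]
    # blend between the original and the fully stretched version
    return [p + (s - p) * strength for p, s in zip(pixels, stretched)]

# a dim, low-contrast frame gets expanded to the full range:
print(contrast_enhance([64, 128, 192]))  # [0.0, 127.5, 255.0]
```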
Use the Center Mask to remove unwanted image parts that might limit Contrast Enhance (like a diffuser or shade partially in view).
If some of your vision subjects are low in contrast, but despite using the Center Mask you still have some very dark or very bright blurred objects in the camera view, then Contrast Enhance alone might not work. You might then want to combine it with the Edge Sensitive mode. A correctly set Denoise (Pixels) value is important, as it defines the relevant sharpness of the edges. Note that the Edge Sensitive mode will not score large motion higher than medium motion, but we only need to judge the small motions. Edge Sensitive mode might be fooled by motion blur, so you may need to combine it with Debounce Frames to make sure there is no longer any motion blur (it depends on the latency of your camera).
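The interplay between Edge Sensitive and Denoise can be illustrated in one dimension: blur at the denoise scale first, then take neighbor differences as edge strength. This is only my sketch of the principle, not OpenPnP's implementation:

```python
def box_blur(row, radius):
    """Simple box blur: each pixel becomes the mean of its neighborhood."""
    out = []
    for i in range(len(row)):
        lo, hi = max(0, i - radius), min(len(row), i + radius + 1)
        out.append(sum(row[lo:hi]) / (hi - lo))
    return out

def edge_map(row, denoise_radius):
    """Edge strength after denoising: blur first, so only edges broader
    than the denoise scale keep their full strength, then take absolute
    neighbor differences."""
    blurred = box_blur(row, denoise_radius)
    return [abs(blurred[i + 1] - blurred[i]) for i in range(len(blurred) - 1)]

row = [0, 0, 0, 255, 255, 255]  # one hard edge
print(max(edge_map(row, 0)))  # 255.0 (no denoising, full edge strength)
print(max(edge_map(row, 1)))  # 85.0 (denoising softens the edge)
```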
Use Color Sensitive for exotic scenarios where the vision subject has a color pattern but hardly any black-and-white contrast. Or if you simply like to watch your settle images in full color. It's even cooler to watch when combined with Edge Sensitive and a colorful subject :-).
If you have a weak CPU and/or a very high FPS camera and it takes too long to analyze the frames (the blue curve stays down too long), you might want to set and reduce the Center Mask so that only a small part of the frame is analyzed. You can also increase the Denoise (Pixels) value, as this partially scales down the image, leaving fewer pixels to analyze.
Different Settle Methods can be used to score how different two subsequent frames are, i.e. whether the camera has settled enough.
Maximum
The easiest method to understand is Maximum (the one Jason had implemented before). It just takes the largest single pixel difference as the settle score. Use Denoise (Pixels) to make this resilient against camera artifacts like compression, sensor noise, faulty pixels, aliasing (Moiré), etc.
Mean
Takes the mean (average) pixel difference over the whole image. This might be used together with Edge Sensitive and full Contrast Enhance, but it is not recommended.
Euclidean
This takes the Euclidean distance between the frames (the square root over all the squared differences). The largest pixel differences dominate the score, but small differences can still contribute significantly if large areas change. Therefore this handles motion blur better than Maximum, where only the single largest pixel difference counts. Unlike with Maximum, it matters to a degree how large and how structured the subject is, so tune this with the smallest/most uniform computer vision subjects (e.g. use nozzle calibration of the finest tip, not alignment of a pin-monster MCU).
Square
Like Euclidean, this takes the squared differences over the whole image. It's the classical difference (error) indicator from science. Unlike Euclidean, no square root is taken, so very small numbers will result and will be needed for the threshold. For our purpose of setting a threshold it is equivalent to Euclidean; it's more for the math purist.
All scores are normalized to a theoretical maximum of 100%.
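Under these definitions, the four scores can be sketched as follows for 8-bit grayscale frames given as flat pixel lists (my own illustrative code, normalized to the 100% theoretical maximum as described; OpenPnP's Java implementation will differ in detail):

```python
import math

def settle_scores(frame_a, frame_b):
    """The four Settle Method scores for two equally sized 8-bit
    grayscale frames, each normalized so that 100.0 is the theoretical
    maximum (every pixel changing by the full 255)."""
    n = len(frame_a)
    diffs = [abs(a - b) for a, b in zip(frame_a, frame_b)]
    return {
        # largest single pixel difference
        "Maximum": max(diffs) / 255.0 * 100.0,
        # average pixel difference over the whole image
        "Mean": sum(diffs) / n / 255.0 * 100.0,
        # root of the summed squares: large differences dominate, but
        # many small changes over large areas still contribute
        "Euclidean": math.sqrt(sum(d * d for d in diffs))
                     / math.sqrt(n * 255.0 ** 2) * 100.0,
        # summed squares without the root: equivalent for thresholding,
        # but yields smaller numbers
        "Square": sum(d * d for d in diffs) / (n * 255.0 ** 2) * 100.0,
    }

# one of four pixels flips completely:
print(settle_scores([0, 0, 0, 0], [255, 0, 0, 0]))
# Maximum 100.0, Mean 25.0, Euclidean 50.0, Square 25.0
```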
<<
Is this understandable, as far as you've used it?
_Mark
> My takeaway is "wow, amazingly useful"
Thanks :-) Appreciate the feedback.
> Is there a way to up level this one more notch, and somehow show ALL the various settle methods in parallel to help understand the differences and choose?
Funny you should ask. It worked that way in the early stages: one curve per method. But then I saw that the added computation changed the rate of frame grabbing (at least for the 120 fps camera mode I also tested), so I removed it. Changing the frame rate will obviously also change the (relative) differences between frames, so the result is not usable for tuning.

The settle algorithm is really optimized to the hilt. People are talking about using Raspberry Pis etc., so I don't want this to become an artificial bottleneck. It is a very central thing.

Also, the method is only one thing that makes sense to compare. What about all the other settings? You could compare different Denoise settings etc.
One thing that could work is to record the original frames from the last settle and then retroactively apply new settings to them (not just the method), adding a new curve each time. Add a combo box to browse through the recorded settings and highlight the currently selected curve. Press Apply to keep. But that would be quite some work. :-]
_Mark
Hi Keith
Sorry for the delay.
> I had to restart OpenPnP after changing the Settle Method before other buttons would show up.
> Same after selecting the diag check; nothing would show up until I restarted OpenPnP.
I finally found the culprit: the same bug as in the Vacuum Detect Diagnostics. It works on Windows due to "lucky" side effects, which is why I missed this one. I guess you work on a different OS?

See the PR:
https://github.com/openpnp/openpnp/pull/1002

This should now be fixed, and I hope Jason can review/merge it quickly. I would be very grateful if you could test this again on your OS.
In the meantime: does pressing Apply, then going to a different Wizard in the Machine Tree view, and then going back to the original Wizard not help? I don't see any difference between the Wizard loading after restarting OpenPnP and the Wizard loading after a Machine Tree selection change.
> BTW how do you post pics?
You mean here in the group? I just attach them. It seems to work if they're not too large. If I remember correctly, this works both with e-mail replies (like this one) and on the web.
_Mark