Using Hugin for gridded macro pictures from a USB microscope


Ahron

Mar 13, 2020, 2:13:30 AM
to hugin and other free panoramic software
Hi,

I have a macro scanner made by basically smooshing a USB microscope into the extruder hole of a 3D printer. The microscope captures 480x720 images in a regular grid with equal overlaps of ~20-50%. The mechanical repeatability of the system is good to within a few pixels, depending on magnification. Is there a set of Hugin settings for this scenario? I feel like it should be almost "easy mode", but as things stand (mostly running the auto-align and fumbling with the controls) I can't seem to get a good picture out of anything!

I currently use Microsoft ICE, since it usually does a fantastic job with no muss or fuss, but I have recently been looking for an alternative because it has no API and sometimes produces artifacts. I am sure you have heard this story before, but right now, to do batching, we have an open-loop VBScript that walks through the menus with physical keypresses!

Beyond the actual stitching, I am sure there is plenty to deal with in blending, exposure changes, etc. My hope is that by controlling so many variables I will be able to automate the process down to minimal human input most of the time. Does anyone in the community have faith that this is possible? It would mean a great deal to me, as this project means a great deal to me.

Thank you so much for any possible assistance!
[Attachment: piece of burlap.jpg]

Quantrix

Mar 30, 2020, 4:44:36 AM
to hugin and other free panoramic software
Hi there,
I am looking to solve a similar problem with microscopy images. Does anyone have any input, insights, or guidance on this?
Thanks
Quantrix

AKS-Gmail-IMAP

Mar 30, 2020, 2:00:46 PM
to hugi...@googlegroups.com
Maybe this task is similar to what is described in the Hugin tutorial "Stitching murals using mosaic mode", though I wonder whether your microscopy work also calls for deconvolution. I suppose any deconvolution would need to happen before the Hugin stitching. The results at the image boundaries may not be suitable for your needs without some tight masking. It would be interesting to see what you come up with.

Ahron

Mar 30, 2020, 5:10:49 PM
to hugin and other free panoramic software
Hey, thanks for the reply. I did see that tutorial, but I was hoping for something designed for the case of many pictures with exactly known locations --- it just seems like a waste to have that information so precisely and not use it.

I would love to try techniques for improving image quality, but plain stitching is what I'm after for now. I recently changed my setup and was able to get a 400-megapixel image of a coin (in Microsoft ICE) that I was very happy with:


It's not really microscopy so much as normal macro photography --- just with many pictures. The one above was about a thousand pictures, which is about the limit for me with ICE. I'd like to see the same quality, but for images that are tens or hundreds of times bigger in area.

Thanks so much!
 

AKS-Gmail-IMAP

Mar 30, 2020, 6:36:22 PM
to hugi...@googlegroups.com
There are ways to script the applications Hugin uses; maybe someone here can help with those. Another possible method, which I am not sure has been discussed but seems like an obvious brute-force automation scheme, is to write a simple program that populates a Hugin .pto file with all, or nearly all, of the required information. The .pto file is a simple text file that you can reverse-engineer. Then run that .pto through Hugin or its batch utilities.
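
To make that concrete, here is a minimal sketch of such a generator in Python. It pre-places a grid scan using mosaic-mode translation parameters (TrX/TrY). The keyword layout was reverse-engineered from a project saved by Hugin, so treat the exact fields, and especially the STEP scale, as assumptions to verify against a .pto written by your own Hugin version; the grid size, field of view, and file names are placeholders.

    # Sketch only: write a Hugin .pto that pre-places a grid scan using
    # mosaic-mode translation parameters (TrX/TrY).  Field names follow a
    # reverse-engineered Hugin project; STEP is a made-up scale that must be
    # calibrated against one known pair of overlapping frames.
    import glob

    IMG_W, IMG_H = 720, 480      # frame size in pixels
    COLS, ROWS = 10, 10          # scan grid dimensions (placeholder)
    HFOV = 5.0                   # assumed horizontal field of view, degrees
    STEP = 0.1                   # assumed TrX/TrY increment per grid step

    images = sorted(glob.glob("scan_*.jpg"))
    assert len(images) == COLS * ROWS, "expected one image per grid cell"

    lines = [
        "# hugin project file (script-generated)",
        f'p f0 w{IMG_W * COLS} h{IMG_H * ROWS} v{HFOV * COLS:.2f} E0 R0 n"TIFF_m c:LZW"',
        "m i0",
    ]
    for idx, name in enumerate(images):
        col, row = idx % COLS, idx // COLS
        lines.append(
            f"i w{IMG_W} h{IMG_H} f0 v{HFOV} r0 p0 y0 "
            f"TrX{col * STEP:.4f} TrY{row * STEP:.4f} TrZ0 "
            f'a0 b0 c0 d0 e0 n"{name}"'
        )
    # let the optimiser refine only the translations of images 1..N-1,
    # keeping image 0 fixed as the anchor
    for i in range(1, len(images)):
        lines.append(f"v TrX{i} TrY{i}")
    lines.append("v")

    with open("grid.pto", "w") as fh:
        fh.write("\n".join(lines) + "\n")

The resulting grid.pto can then go through cpfind and the usual batch tools; the point is simply that all of the known geometry is in the file before any optimisation starts.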


Shaunak De

Jul 8, 2020, 1:24:16 AM
to hugin and other free panoramic software
This is an exciting application! Thanks for posting!

Gunter Königsmann

Jul 8, 2020, 2:53:06 AM
to Shaunak De, hugin and other free panoramic software
That application would be the "stitching murals" case; see the tutorials section. To tell Hugin that each image was taken from a different position, you give each image its own, but identical, lens.

See also https://github.com/mpetroff/stitch-scanned-images for automating that.
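
For a rough end-to-end picture of that kind of automation, here is a sketch that drives the stock Hugin command-line tools from Python. The tool names are the ones shipped with Hugin; the specific flags, and the choice to optimise translations (TrX/TrY) rather than angles for a flat target, are assumptions worth checking against the man pages, and the file names are placeholders.

    # Sketch: one scan folder in, one stitched TIFF out, via Hugin's CLI tools.
    import glob
    import subprocess

    def run(args):
        print("+", " ".join(args))
        subprocess.run(args, check=True)

    images = sorted(glob.glob("scan_*.jpg"))

    run(["pto_gen", "-o", "scan.pto", *images])                   # create project
    run(["cpfind", "--multirow", "-o", "scan.pto", "scan.pto"])   # control points
    run(["cpclean", "-o", "scan.pto", "scan.pto"])                # drop bad points
    # flat target moved under the lens: optimise translations, not rotations
    run(["pto_var", "--opt", "TrX,TrY", "-o", "scan.pto", "scan.pto"])
    run(["autooptimiser", "-n", "-m", "-o", "scan.pto", "scan.pto"])
    run(["pano_modify", "--canvas=AUTO", "--crop=AUTO", "-o", "scan.pto", "scan.pto"])
    run(["nona", "-m", "TIFF_m", "-o", "remap_", "scan.pto"])     # remap frames
    run(["enblend", "-o", "stitched.tif", *sorted(glob.glob("remap_*.tif"))])

In practice you would probably also pin the first frame as an anchor and reuse a pre-calibrated lens, but the sequence above is the basic shape of it.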

Kind regards,

Gunter.
--
Sent from my Android device with K-9 Mail. Please excuse my brevity.

Greg Warrington

Jul 8, 2020, 9:34:01 AM
to hugin and other free panoramic software
I've done macro stitching using a DSLR on a stand, so the physical repeatability isn't there, but it's similar in other ways. I'll say one thing in case it helps anyone: for my setup, the most significant improvement came after really dialing in the lens distortion characteristics. When those were off, it was just an exercise in frustration.
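
For what it's worth, that distortion dial-in can be done once on a small, heavily overlapped test grid and then reused for every scan. A hedged sketch with the stock Hugin CLI tools follows; the calib_*.jpg names are placeholders and the flag spellings are worth double-checking on your version.

    # Sketch: solve the shared lens model (hfov v plus radial distortion a, b, c)
    # once on a well-overlapped calibration scan, then copy those values into
    # production projects so per-scan optimisation only has to place the frames.
    import glob
    import subprocess

    def run(args):
        subprocess.run(args, check=True)

    calib = sorted(glob.glob("calib_*.jpg"))   # hypothetical calibration frames

    run(["pto_gen", "-o", "calib.pto", *calib])
    run(["cpfind", "--multirow", "-o", "calib.pto", "calib.pto"])
    run(["cpclean", "-o", "calib.pto", "calib.pto"])
    # optimise the frame translations plus the shared lens parameters
    run(["pto_var", "--opt", "TrX,TrY,v0,a0,b0,c0", "-o", "calib.pto", "calib.pto"])
    run(["autooptimiser", "-n", "-o", "calib.pto", "calib.pto"])

    # the solved v/a/b/c values sit on the i-lines of calib.pto; print one to copy
    for line in open("calib.pto"):
        if line.startswith("i "):
            print(line.strip())
            break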

greg

privacyis...@gmail.com

Oct 9, 2020, 5:45:40 PM
to hugin and other free panoramic software
What you call "easy mode" is more like an exercise in advanced calculus, but sure, let's give it a go!

The hardest thing with macro, dSLR or otherwise, is parallax distortion. In your scenario you're physically moving so freaking far, from the optic's perspective. Take that burlap shot. Without ICE and its complicated math, if you shot a single thread lying dead-centered on top of a bundle of threads, and then moved the printer head a few millimeters off in any direction (assume for the moment pure X/Y motion with Z held fixed, which isn't quite reality), that thread would now look like it's floating way off to the left, right, top or bottom of said bundle. If you reversed the rig, embedded the camera in a stationary arm, and moved a platen/table that held the object being photographed, you'd still have the same issue.
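
To put a rough number on that shift (pinhole approximation, with made-up figures for the optics and the burlap's depth):

    # Back-of-the-envelope parallax estimate.  The relative image shift between
    # two depths z_near and z_far when the camera translates sideways by t is
    # roughly f_px * t * (1/z_near - 1/z_far); all numbers below are assumed.
    f_px = 2000.0     # focal length of the microscope optics, in pixels
    z_near = 28.0     # mm, thread lying on top of the bundle
    z_far = 30.0      # mm, threads underneath
    t = 5.0           # mm of lateral head travel between frames

    shift_px = f_px * t * (1.0 / z_near - 1.0 / z_far)
    print(f"relative shift: {shift_px:.1f} px")   # about 24 px with these numbers

A couple of dozen pixels of relative displacement between overlapping frames is exactly the kind of floating-thread ghosting described above.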

This is why ICE can do it but Hugin cannot. ICE used the magic of SeaDragon, a.k.a. Photosynth. The Microsoft Live service went offline what, four years ago now? Some of it can still work independently of the Live/MSN site, but you'll need to find a safe, clean download of Photosynth (it'll likely be from 2014 and 32-bit). But THAT is how your ICE deep-zoom panos work: they're not real, they're synthetic, synthesized content. It's the same way ICE can fabricate missing sections of panoramas instead of cropping them; it just invents them! It analysed the images and made up what the missing parts would have looked like, the same way Photosynth made freaking 3D images from multiple 2D photo planes at different angles. It's freaking incredible technology, and it's back again in for-profit apps, Microsoft Pix and Hyperlapse Pro, which apparently spy on you too (what doesn't in 2020?). ICE is still around too, as the Image Composite Editor.

If, however, you mounted the microscope on a 6-axis gimbal and programmed it to pivot around the no-parallax point of the lens, Hugin would definitely be your macro-gigapanorama tool! You could pan in fraction-of-a-millimeter steps, at whatever the most repeatable step size of your steppers is for the camera weight, and be 100% repeatable. And yes, THEN, in that scenario, automating the multirow capture would be as simple as knowing the overlap of the 480x720 frames: you could just increment the Tpy (translation plane yaw) by the equivalent of (720 minus the overlap) per frame and likely get some subpixel accuracy.

But, no. What we have here is a comparison of apples to oranges. Plastic oranges. Although very realistic looking ones, granted!
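
As a footnote on the step arithmetic above, with a made-up 30% overlap:

    # Advance per grid step for a 720 px wide frame: width minus the overlap.
    frame_w = 720
    overlap = 0.30                      # assumed overlap fraction
    step_px = frame_w * (1 - overlap)
    print(step_px)                      # 504.0 px of new material per step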