Frame straddling between two cameras

Erich's Lab

Aug 9, 2021, 4:07:03 AM
to openpiv-users
I am currently working on a budget, modular, and extremely portable PIV system. At present, I'll be evaluating a turbulent vortex exhibiting multiple secondary circulations (multiple vortices) at a Reynolds number of about 4.51x10^4. My cellphone doesn't quite make the cut; however, frame straddling with two phones and a synchronizer might be a relatively budget-friendly way to attain higher-speed image pairs. Has anyone tried something like this, or am I confusing the concept of frame straddling with something else?

Additional information:
I am using a CW 525 nm laser with a cylindrical rod to get a laser sheet roughly 2 mm thick. The seed particles for my current setup are of unknown size and are produced by a 400 W fog machine previously tested in low-Re experiments. The phone I'll be using is a OnePlus 6T, but according to a recent article I read, a GoPro could also be used.
References:
Käufer, T., König, J. & Cierpka, C. Stereoscopic PIV measurements using low-cost action cameras. Exp Fluids 62, 57 (2021). https://doi.org/10.1007/s00348-020-03110-6

Regards,
Zimmer

alex.l...@gmail.com

Aug 9, 2021, 9:44:48 AM
to openpiv-users
Hi Erich, 

It's an interesting idea, but I do not know if there is a synchronisation tool for smartphones. It's probably better to use a Raspberry Pi camera, which should be able to accept an external trigger; it is probably cheaper than a smartphone and, if you use a frame-straddling concept, also cheaper than a GoPro. The sync signal could be generated by an Arduino or by the Raspberry Pi itself, I guess. It's just an idea, I have not tried it myself.
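
For illustration only, a minimal Arduino-style sketch of the kind of sync signal Alex describes might look like the following (an untested sketch, not something from this thread; the pin numbers and timings are placeholders, and it assumes both cameras start exposing with the same, repeatable delay after their trigger edge):

// Hypothetical two-camera frame-straddling trigger generator (Arduino-style C++).
// All pins and timings are placeholders; real values depend on the cameras used.

const int CAM_A_PIN = 2;                        // trigger output to camera A
const int CAM_B_PIN = 3;                        // trigger output to camera B
const unsigned long TRIGGER_PULSE_US  = 100;    // trigger pulse width (assumed)
const unsigned long STRADDLE_DELAY_US = 500;    // desired delta-t between the two frames (assumed)
const unsigned long PAIR_PERIOD_MS    = 100;    // time between successive image pairs (assumed)

void firePulse(int pin) {
  digitalWrite(pin, HIGH);                      // hold the trigger line high briefly
  delayMicroseconds(TRIGGER_PULSE_US);
  digitalWrite(pin, LOW);
}

void setup() {
  pinMode(CAM_A_PIN, OUTPUT);
  pinMode(CAM_B_PIN, OUTPUT);
}

void loop() {
  firePulse(CAM_A_PIN);                                     // camera A takes frame 1 of the pair
  delayMicroseconds(STRADDLE_DELAY_US - TRIGGER_PULSE_US);  // so the two triggers are delta-t apart
  firePulse(CAM_B_PIN);                                     // camera B takes frame 2 of the pair
  delay(PAIR_PERIOD_MS);                                    // wait before the next pair
}

Whether a given phone or camera module actually exposes on such a trigger, and with what delay and jitter, would of course have to be verified per device.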


There is also a recent article about the SmartPIV app. I downloaded it from the Apple App Store, but it does not work for me yet. I think the GUI is nice.


I'd be glad to know what "multiple secondary circulations" or "multiple vortices" means - do you mean the vortices that appear after a large vortex breaks down?

Best regards,
Alex

Erich's Lab

Aug 11, 2021, 11:47:48 PM
to openpiv-users
Dear Prof. Alex,
Thanks for your suggestions; I hadn't thought about using Raspberry Pi cameras (I didn't know they even existed). As for "multiple vortices", you are correct: the vortices are caused by the high swirl ratio of the simulated vortex. Here are results from a numerical simulation of the prototype concept of my current apparatus that a friend of mine performed.
[Attached images: twocell1.png, horizvort.png, suctionvorts.png, multi1.png]

As for SmartPIV, I tried it, but it "freezes" on the smartphone I used, possibly due to its low computational power. It does look nice, and it offers two correlation types.

Regards,
Zimmer

PS: sorry for any spelling mistakes, in case they transform a sentence into something completely different - I am low on time. :(

Alex Liberzon

Aug 12, 2021, 4:50:49 AM
to Erich's Lab, openpiv-users
Got it. Very nice.

William Thielicke

Aug 16, 2021, 2:16:52 PM
to openpiv-users
If you need support with a cheap synchronizer: I recently built one. The current firmware is meant for controlling dual-cavity Q-switched lasers, but I can easily modify the code. It is, however, not open source (yet?) because we are trying to sell it as part of PIVlab.

Erich's Lab

Oct 15, 2021, 8:56:39 PM
to openpiv-users
Are rolling shutter cameras any good for PIV? I am currently using a 1 MP monochrome global shutter sensor capable of imaging objects moving at less than 85 m/s. However, I want to use a sensor with more than 3 MP, and a global shutter sensor of that size is over my budget. Rolling shutter sensors, on the other hand, might introduce some errors from motion blurring. Has anyone tried using a consumer-grade, low-budget sensor with particle-laden flows at moderate velocities?

Alex Liberzon

Oct 16, 2021, 8:41:08 AM
to Erich's Lab, openpiv-users
I don't think it is a good idea. There are relatively cheap options, like a DSLR with single-frame capture and autocorrelation, or USB-based low-frame-rate double-shutter CMOS cameras, that might be more suitable. Or even a smartphone in video mode with some strobe light.

Erich's Lab

Nov 5, 2021, 9:27:59 PM
to openpiv-users
What about using a non-polarizing beam splitter and two global shutter CMOS sensors? I found some promising 8 MP global shutter sensors at ~$400 USD each with frame rates of 60 Hz, and I think that, with a synchronizer and a laser (most likely a CW 524 nm DPSS), they should be able to produce high-quality particle images.

Alex Liberzon

Nov 6, 2021, 1:34:56 AM
to Erich's Lab, openpiv-users
Possible. But if it is a CW laser - what do you need to synchronize, the shutter and grabber of the two cameras? 
A beam splitter will work, but it typically limits your viewing angles. Two separate cameras are a more flexible arrangement.

William Thielicke

Nov 6, 2021, 6:31:13 AM
to openpiv-users
I just ordered two UVC global shutter webcams with 2 MP, and a 5 W blue CW laser diode. This is a very cheap solution (the hardware is below 300 Euro), but it requires quite some effort for programming and timing measurements. In the end, it might be possible to measure a 20x30 cm field with velocities up to 2 m/s. The company I am working for allows me to do these experiments during my work time, so in the end it will not be open source but a commercial solution. However, I hope that it works and that it makes PIV affordable for a broader audience.
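
For a rough sense of what such a setup demands of the timing (a back-of-the-envelope estimate; the roughly 1600 px across the 30 cm dimension, the 32 px interrogation windows and the usual one-quarter-window displacement limit are my assumptions, not numbers from the thread):

\[
\frac{300~\text{mm}}{1600~\text{px}} \approx 0.19~\text{mm/px}, \qquad
\Delta x_{\max} \approx \frac{32~\text{px}}{4} \times 0.19~\text{mm/px} \approx 1.5~\text{mm}, \qquad
\Delta t \lesssim \frac{1.5~\text{mm}}{2~\text{m/s}} \approx 0.75~\text{ms},
\]

so the separation between the two exposures needs to be on the order of a few hundred microseconds.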

Alex Liberzon

Nov 6, 2021, 6:51:16 AM
to William Thielicke, openpiv-users
Thanks William, 

It would be great if you could share some advice on what hardware to use and what kind of programming is required to sync the two cameras and a CW laser (do you chop it, or synchronize only the cameras?).

Thanks

William Thielicke

Nov 6, 2021, 8:03:39 AM
to openpiv-users
I am using two Arducam 2 MP global shutter USB OV2311 cameras. As I have already designed a very flexible synchronizer, I am using it to trigger the cameras and the laser. It doesn't make sense to let the laser run at full power CW when the cameras only expose for 1/1000 s (more dangerous, more heat, less energy per exposure). Also, the timing is much more precise and flexible when I modulate the laser beam. The camera exposure also does not seem to have a perfectly sharp (vertical) onset, so using exposure alone makes PIV measurements less precise with these cameras. A trigger signal also doesn't start the exposure immediately; there is quite a long delay. In the end, it seems possible to achieve an interframe time of about 500 microseconds. I am still experimenting and also don't know yet if I can use an MJPEG stream (45 fps, but compressed data) or if I will prefer the YUY2 stream (5 fps, uncompressed).
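
For what it's worth, a very rough Arduino-style timing sketch of this kind of arrangement could look like the following (this is not William's firmware; the pins, the trigger-to-exposure delay and the exposure time are made-up placeholders that would have to be measured, and it assumes the sensor accepts a second trigger this soon after the first one):

// Hypothetical frame straddling with one triggered camera and a modulated CW laser
// (Arduino-style C++). All numbers below are assumptions, not measured values.

const int CAM_TRIG_PIN  = 2;                // camera external-trigger output (placeholder pin)
const int LASER_MOD_PIN = 3;                // TTL modulation input of the laser driver (placeholder pin)

const unsigned long PULSE_US       = 100;   // laser pulse width (assumed)
const unsigned long PAIR_PERIOD_MS = 200;   // repetition period of the image pairs (assumed)

// Assumed camera behaviour: exposure starts 1000 us after a trigger and lasts 1000 us.
// Frame A then exposes ~1000-2000 us and frame B ~2100-3100 us after t0, so laser pulse 1
// is seen only by frame A and pulse 2 (fired 500 us later, the PIV delta-t) only by frame B.
const unsigned long T_TRIG_B_US = 1100;
const unsigned long T_PULSE1_US = 1850;
const unsigned long T_PULSE2_US = 2350;

void firePulse(int pin, unsigned long widthUs) {
  digitalWrite(pin, HIGH);
  delayMicroseconds(widthUs);
  digitalWrite(pin, LOW);
}

void waitUntil(unsigned long t0, unsigned long offsetUs) {
  while (micros() - t0 < offsetUs) { /* busy wait for tighter timing */ }
}

void setup() {
  pinMode(CAM_TRIG_PIN, OUTPUT);
  pinMode(LASER_MOD_PIN, OUTPUT);
}

void loop() {
  unsigned long t0 = micros();
  firePulse(CAM_TRIG_PIN, 10);              // t = 0: trigger frame A
  waitUntil(t0, T_TRIG_B_US);
  firePulse(CAM_TRIG_PIN, 10);              // trigger frame B so it exposes right after frame A
  waitUntil(t0, T_PULSE1_US);
  firePulse(LASER_MOD_PIN, PULSE_US);       // laser pulse 1, near the end of frame A's exposure
  waitUntil(t0, T_PULSE2_US);
  firePulse(LASER_MOD_PIN, PULSE_US);       // laser pulse 2, early in frame B's exposure
  delay(PAIR_PERIOD_MS);                    // wait before the next image pair
}

The measured trigger delay and exposure window of the real camera would determine the actual T_* values; the point is only that the delta-t between the two laser pulses, not the camera's frame rate, sets the PIV time separation.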
Unfortunately, Matlab's implementation of reading a webcam is not documented, and it seems to switch between YUY2 and MJPEG depending on the Matlab version, with no possibility to change the behaviour... So I will most likely use ffmpeg to set up the cameras and capture the data, and then read the image files into PIVlab during recording.
Two different laser colors, two bandpass filters and a (by default hardware-synchronized) stereo camera are another option that I might check out. If anyone has a recommendation for a global shutter USB stereo camera, let me know :-D

William Thielicke

Nov 8, 2021, 3:54:34 PM
to openpiv-users
Hmmm, I am currently stuck. It seems that in Windows it is not possible to read two USB cameras more or less simultaneously if they are connected to the same USB bus. The bandwidth simply does not seem to be enough. Or maybe there are conflicts, but other people seem to have experienced similar issues... A workaround might be to use one Nvidia Jetson Nano (with two camera interfaces), or even two RPis that record data from each webcam independently... But this makes everything more complex :(

Erich's Lab

Nov 13, 2021, 2:26:55 AM
to openpiv-users
I have a few questions.
1. What is the effect of the interframe time?
2. Would rolling shutter sensors be okay for velocities of ~2 m/s?
3. I had a system where a pulsed laser sheet was created by rotating a disk (with a hole in it) in front of the CW laser. However, I just can't get things to line up (camera and pulsed laser). Would an external trigger based on the flash of the laser be precise enough to get frames with a reliable ∆t?
4. How hard is it to synchronize a pulsed LED (I have optics at hand based on the article Alex mentioned earlier) with Arducam global and rolling shutter sensors?

Erich's Lab

Dec 18, 2021, 3:35:41 AM
to openpiv-users
Dear William,
Regarding various image sensors (mainly ONSemi and OmniVision sensors), I heard that using them in external trigger mode (300 μs pulse, 8 ms interval between triggers), controlled via an Arduino Uno, provided some decent results.

Regarding the external trigger pulse delay, the Aptina (ONSemi) sensors might provide a lower delay, but I haven't tried that manufacturer yet and cannot confirm this claim. This might lower the interframe time to below 500 μs.
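
For reference, the trigger train described above (a 300 μs pulse every 8 ms, i.e. 125 Hz) is only a few lines on an Arduino Uno; this is an illustrative, untested sketch with a placeholder output pin:

// Hypothetical periodic trigger generator: 300 us pulse every 8 ms (Arduino-style C++).

const int TRIGGER_PIN = 2;                    // camera trigger output (placeholder pin)
const unsigned long PULSE_US  = 300;          // 300 us trigger pulse, as described above
const unsigned long PERIOD_US = 8000;         // 8 ms between rising edges (125 Hz)

void setup() {
  pinMode(TRIGGER_PIN, OUTPUT);
}

void loop() {
  digitalWrite(TRIGGER_PIN, HIGH);
  delayMicroseconds(PULSE_US);
  digitalWrite(TRIGGER_PIN, LOW);
  delayMicroseconds(PERIOD_US - PULSE_US);    // stay low for the rest of the 8 ms period
}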

Ivan Nepomnyashchikh

Dec 20, 2021, 12:35:23 PM
to openpiv-users
Oh, William! Good to see you here! You might remember me: I asked a question on ResearchGate about open-source software for acquiring PIV images. I think it would be good if OpenPIV had the same feature as PIVlab; we even had a conversation about it in this group. I have started working on that, but super-duper extremely slowly. I respect your desire to make some money. Still, the information you shared in this conversation is helpful for me and I'm grateful to you for it.
Ivan