IR lights underwater


Renaud Bastien

Jun 10, 2021, 10:40:28 AM
to multi-camera software from the Straw Lab

Hi,
We are exploring different solutions to get good IR lighting with the bowl, but absorption of IR is quite significant in water. Do you have any recommendations on the setup to use?

Andrew Straw

Jun 17, 2021, 5:09:01 AM
to multi-camera software from the Straw Lab
Ah, sorry for the delay - I just saw this message. I haven't done much more than what's described in the 2017 Nature Methods paper. We did quite extensive validation that mitfa mutant zebrafish larvae (missing pigments and much more transparent than wildtype) were tracked equivalently to wildtype larvae throughout the bowl at a variety of depths. So although the intensity of IR light may be reduced in the water, the data convinced me that this is not a practical concern.

With this setup, we were quite capable of tracking minuscule specks of dust. (And these were painful to eliminate.) The darkfield illumination design is, I think, very good - it gives huge contrast.

Jana Mach

Jun 23, 2021, 5:22:57 PM
to multi-camera software from the Straw Lab
How big is your setup, and what size fish are you planning to track? Are you planning to use VR?

The original setup uses 6-8 3W 860nm LEDs mounted on magic arms and positioned to create dark-field illumination of the object of interest. In other words, a fish, regardless of its level of transparency, will reflect the light, making it look like a light spot on an otherwise dark background in the "eye" of an IR-sensitive camera. As Andrew mentioned above, this approach is quite robust.
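
Just to illustrate why this is so robust on the software side: detecting such a light spot essentially boils down to finding the centroid of pixels brighter than some threshold in an otherwise dark frame. A rough sketch of the idea in Rust (not the actual Braid code; the frame layout and threshold are only placeholders):

fn bright_centroid(frame: &[u8], width: usize, height: usize, thresh: u8) -> Option<(f64, f64)> {
    // Average the coordinates of all pixels above the threshold.
    let (mut sx, mut sy, mut n) = (0.0_f64, 0.0_f64, 0u64);
    for y in 0..height {
        for x in 0..width {
            if frame[y * width + x] > thresh {
                sx += x as f64;
                sy += y as f64;
                n += 1;
            }
        }
    }
    if n == 0 { None } else { Some((sx / n as f64, sy / n as f64)) }
}

fn main() {
    // Tiny synthetic frame: dark background with one bright "fish" spot.
    let (w, h) = (8usize, 8usize);
    let mut frame = vec![10u8; w * h];
    frame[3 * w + 4] = 200;
    frame[3 * w + 5] = 200;
    println!("{:?}", bright_centroid(&frame, w, h, 128));
}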

In my limited experience, tracking older zebrafish (approx. 1 cm in length) is no problem, but tracking tiny zebrafish larvae (5-7 dpf) is more challenging. Right now I am exploring a new idea -- illuminating the fish bowl with an IR LED strip, specifically, this one:
The first attempts look promising; the LED strip seems to provide more homogeneous illumination, which will hopefully lead to more reliable tracking results.

Hope that helps. Feel free to ask more questions - I would love to hear what you are doing.

Renaud Bastien

Jun 28, 2021, 9:12:35 AM
to multi-camera software from the Straw Lab
The bowl is 50 cm wide and 20 cm deep. The fish we are trying to track are 2 to 4 cm, so we should not have too much trouble tracking them. I worked with the original design, and I remember that it could be painful to set up to get good illumination.

We tried some IR light projectors with limited success, so we just ordered 8 powerful IR LEDs with magic arms. We also looked at those LED strips, but I was unsure whether we could dissipate the heat properly. I am very curious to know if that works.

Jana Mach

Jun 29, 2021, 4:20:59 AM
to multi-camera software from the Straw Lab
The IR LED strip is really nice. I mounted it on a 20" bicycle rim, which acts both as a mounting frame and as a heat sink. The LEDs and the rim get warm as expected, but not too hot to touch.

IMG_1659.JPG

The LED strip provides nice homogeneous illumination of the whole bowl, making small objects dark-field illuminated when the mirror is covered. Unfortunately, adding the mirror to the equation makes things more complicated.

Mirror covered, styrofoam piece floating on the surface:

Screenshot from 2021-06-25 10-54-54.png  

Mirror uncovered:

Screenshot from 2021-06-25 10-53-42.png

Right now I am thinking about how to solve the reflection problem; using lenses and hoods (as with single LEDs) is not really an option.

Renaud Bastien

Jun 30, 2021, 9:09:53 AM
to multi-camera software from the Straw Lab
That looks amazing! We are probably going to replicate the setup when the first system is working. We might try your solution.
I would imagine that adding a circular mask on the wheel might help prevent reflections from the mirror, but it is possible that there are other issues I haven't considered.

Jana Mach

Jul 30, 2021, 11:37:49 AM
to multi-camera software from the Straw Lab
Hi Renaud,

Just writing to let you know that after trying out different things, what worked for me was two rows of LEDs on a single bicycle wheel plus a black acrylic hood that protrudes by roughly 35 mm:

IMG_2066.JPG

This resulted in nice homogeneous illumination, making the tiny unpigmented fish larvae visible even in the "eye" of the cameras that previously had the reflection problem (you still see some residual reflection from the mirror, but in the end it does not affect tracking) -- see the video attached.

Cheers,
Jana
fish_example.mkv

Renaud Bastien

Aug 11, 2021, 7:20:19 AM
to multi-camera software from the Straw Lab
That looks amazing! We might try that when we reproduce our first system, or if we struggle too much with the current light situation.

We now have 8 LEDs, and it is still painful to get homogeneous illumination for 4 cameras. I tried something different, where I use only one light per camera and the mirror to flood the zone with IR light. It seems that we are able to get reasonable tracking in those conditions. It is not excellent yet, and I still need to play with the different parameters, but it was much easier than moving the lights around trying to get good images on each camera.

I guess there are good arguments for struggling with the direction of the light rather than doing what I do here, but I am not sure why, and I wonder if I am missing something obvious. Let me know if this method is not appropriate.



IMG_20210810_163843.jpg
IMG_20210810_163846.jpg

Jana Mach

Aug 11, 2021, 7:45:02 AM
to multi-camera software from the Straw Lab
Hi Renaud,

It's nice to see you're making good progress! Here are my two cents on your IR light situation: from what I can tell, you are mixing two different illumination approaches -- dark-field and bright-field -- which is most apparent in the right window. There your fish will appear either dark on a light background or light on a dark background (with a possible low-contrast area at the intersection). Since your fish is fairly large and not transparent, the tracking will probably work well, as long as the contrast between the fish and the background is high enough. These conditions would not work for me, though, as I am using tiny unpigmented (transparent) fish, which become completely invisible under bright-field illumination.

This said, Andrew might have a different opinion on this :-)

Good luck!
Jana

Renaud Bastien

Aug 12, 2021, 8:10:46 AM
to multi-camera software from the Straw Lab
Thanks for the answer. I know that my setup is not optimized yet, but I wanted to check first that the approach was not problematic before setting up the lights definitively. We are using adult Hemigrammus, which are not too transparent - certainly less transparent than zebrafish larvae - so this approach should work fine for us. Progress on tracking remains a bit slow at the moment: the fish is not so pleased to be left alone in the bowl and often does not move. I am trying to generate an animated background that the fish will appreciate and that will push it to move more reliably.

I'll keep you updated.

Andrew Straw

Aug 12, 2021, 1:29:47 PM
to Renaud Bastien, multi-camera software from the Straw Lab
Well, I can't help with the Hemigrammus behavior (sounds like an interesting challenge), but as far as tracking goes, it may be worth saying that the basic background-subtraction code in Braid hasn't really been reworked since an original, more-or-less hacky version was written around 2004. (The Rust implementation for Strand Cam was basically meant to copy it, quirks and all.) So if it seems like the basic 2D tracking should work a lot better - it probably should!
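
To give a rough idea of the flavor of that approach (a simplified sketch only, not the actual Braid or Strand Cam code; the alpha and threshold values are arbitrary placeholders): keep a slowly adapting per-pixel background estimate and flag pixels that differ from it by more than a threshold.

struct BgModel {
    bg: Vec<f32>,   // per-pixel background estimate
    alpha: f32,     // background update rate (placeholder value)
    thresh: f32,    // difference threshold (placeholder value)
}

impl BgModel {
    fn new(first_frame: &[u8], alpha: f32, thresh: f32) -> Self {
        BgModel { bg: first_frame.iter().map(|&p| p as f32).collect(), alpha, thresh }
    }

    // Update the running-average background and return a foreground mask.
    fn process(&mut self, frame: &[u8]) -> Vec<bool> {
        frame.iter().zip(self.bg.iter_mut()).map(|(&p, b)| {
            let diff = (p as f32 - *b).abs();
            *b += self.alpha * (p as f32 - *b); // slowly adapt the background
            diff > self.thresh
        }).collect()
    }
}

fn main() {
    let first = vec![10u8; 16];
    let mut model = BgModel::new(&first, 0.01, 30.0);
    let mut frame = first.clone();
    frame[5] = 200; // a bright object appears
    let n = model.process(&frame).iter().filter(|&&fg| fg).count();
    println!("foreground pixels: {}", n);
}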

Two approaches seem reasonable to me. The first is a new basic image-processing approach like the current one, but simpler and not depending on the closed-source Intel IPP. There is some fast, SIMD-accelerated image processing code at https://github.com/strawlab/strand-braid/blob/1c1a84821e311b17713e94d20d80c106ce1a13e0/imops/src/lib.rs . It is roughly similar in speed to Intel IPP, perhaps a bit faster, perhaps a bit slower, and is used in a prototype fast detection path in Strand Cam: https://github.com/strawlab/strand-braid/blob/1c1a84821e311b17713e94d20d80c106ce1a13e0/strand-cam/src/strand-cam.rs#L1189-L1214 . I do intend to evolve this approach, but I don't want to promise any fixed timetable.

The other approach would be to use low-latency object-detection code from a deep learning framework. Which framework, and how to plug it in, would be an interesting question. This is also very interesting, but again I don't want to promise anything specific.
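
Purely as a thought experiment (none of these names exist in strand-braid; this is only a sketch of one possible shape), the plug-in point could be a small trait that any back-end - the existing image-processing path or a deep network - implements to return per-frame 2D detections for the 3D tracker:

struct Detection2d {
    x: f64,
    y: f64,
    confidence: f64,
}

// Any back-end (classic image processing or a deep-learning model) implements this.
trait FrameDetector {
    fn detect(&mut self, frame: &[u8], width: usize, height: usize) -> Vec<Detection2d>;
}

// Toy stand-in for a classic path: report the brightest pixel if it exceeds a threshold.
struct BrightSpotDetector {
    thresh: u8,
}

impl FrameDetector for BrightSpotDetector {
    fn detect(&mut self, frame: &[u8], width: usize, _height: usize) -> Vec<Detection2d> {
        match frame.iter().enumerate().max_by_key(|&(_, &p)| p) {
            Some((i, &p)) if p > self.thresh => vec![Detection2d {
                x: (i % width) as f64,
                y: (i / width) as f64,
                confidence: p as f64 / 255.0,
            }],
            _ => vec![],
        }
    }
}

fn main() {
    // A deep-learning detector would simply be another FrameDetector implementation.
    let mut det: Box<dyn FrameDetector> = Box::new(BrightSpotDetector { thresh: 128 });
    let (w, h) = (8usize, 8usize);
    let mut frame = vec![0u8; w * h];
    frame[2 * w + 3] = 255;
    for d in det.detect(&frame, w, h) {
        println!("x={:.1} y={:.1} conf={:.2}", d.x, d.y, d.confidence);
    }
}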

Best,
Andrew



Renaud Bastien

Aug 20, 2021, 9:06:33 AM
to multi-camera software from the Straw Lab
Thanks for your answer. I am not sure exactly what I would need, but I knew that the lights would be difficult to set up, so I am not too surprised to be spending so much time on this problem.

First, I made an animated background for the Hemigrammus. It is good enough to get the fish to move so that I can set up the tracking. It seems that I am now getting much better results by flooding the camera with light and tracking the dark fish on the white background. Visually, the green circular target is clearly identified on the fish. However, when I look at the UDP stream, it is very jumpy. I compared with a single LED to check whether there was a problem with the calibration; there was no problem on that side. It seems that, due to the size of the fish, the measured position moves slightly from frame to frame. I am attaching a movie to show what is currently happening.
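
As a stopgap I could of course smooth the positions on my side after receiving them. This is the kind of thing I mean - a rough sketch only, where the plain-text "x y z" packet layout and the port are assumptions for the example, not Braid's actual output format:

use std::net::UdpSocket;

fn main() -> std::io::Result<()> {
    // Placeholder address and port, not Braid's actual defaults.
    let socket = UdpSocket::bind("127.0.0.1:28931")?;
    let mut buf = [0u8; 1500];
    let alpha = 0.2_f64; // smoothing factor: smaller = smoother but laggier
    let mut smoothed: Option<[f64; 3]> = None;

    loop {
        let (len, _src) = socket.recv_from(&mut buf)?;
        // Assumed packet layout for this sketch: a plain-text "x y z" line.
        let vals: Vec<f64> = String::from_utf8_lossy(&buf[..len])
            .split_whitespace()
            .filter_map(|s| s.parse().ok())
            .collect();
        if vals.len() != 3 {
            continue; // ignore anything that does not match the assumed layout
        }
        let p = [vals[0], vals[1], vals[2]];
        let s = match smoothed {
            None => p,
            Some(prev) => [
                prev[0] + alpha * (p[0] - prev[0]),
                prev[1] + alpha * (p[1] - prev[1]),
                prev[2] + alpha * (p[2] - prev[2]),
            ],
        };
        smoothed = Some(s);
        println!(
            "raw=({:.3}, {:.3}, {:.3})  smoothed=({:.3}, {:.3}, {:.3})",
            p[0], p[1], p[2], s[0], s[1], s[2]
        );
    }
}

Of course this only hides the jitter rather than fixing the underlying 2D detection, so I would still prefer to improve the lighting or the detection itself.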


I am quite unsure how to proceed. I know that the lighting could benefit from some tuning, but it is unclear whether that would be enough. I did not grasp exactly which algorithms you are using, so the directions for improvement remain unclear to me. Do you have any advice?