custom lenses


Peter Bircham

May 28, 2012, 9:31:01 PM
to Oper...@googlegroups.com
Has anybody had any experience, or heard of anybody, having custom lenses installed? We do all our work with yeast, and having a 100X lens would make a big difference.

cheers

Peter

Ghislain Bonamy

May 29, 2012, 2:12:28 AM
to Oper...@googlegroups.com
Hi Peter,

I do not personally have experience with swapping the lens for a 100x. I think it should be feasible, but if you are using a 100x water immersion lens (not sure this is even available from Olympus), you will probably have issues with the water collar. If you are using a dry lens, I think the only thing that may require some adjustment is the auto-focus, but this should be feasible.

Out of curiosity, have you tried the 60x water immersion lens using a binning of 1? 

Best,

Ghislain

Michael Wyler

May 29, 2012, 12:11:42 PM
to Ghislain Bonamy, Oper...@googlegroups.com
Hi Ghislain,

  I was curious whether you routinely run plates with the 60x objective using a binning of 1. I recently tried to run a few plates that way and noticed they take about 3-4 hrs with 7 images per well. The images are acquired in about 1.5 hrs, with about half of them stored locally. It takes an additional 1.5-2 hrs for transferring to the CIAs and analysis. I could try running the plates with no script to see if this helps, but I was wondering whether you see the same thing or if this is a problem specific to our machine.
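To put rough numbers on the timings quoted above (a sketch only; the 384-well plate format is an assumption, not stated in the post):

```python
# Rough throughput estimate for the run described above.
# The 384-well format is an assumption; the other figures are from the post.
wells = 384            # assumed plate format
images_per_well = 7    # from the post
acquisition_hrs = 1.5  # from the post
total_hrs = 3.5        # midpoint of the quoted 3-4 hrs

n_images = wells * images_per_well
acq_rate = n_images / (acquisition_hrs * 3600)   # images/s acquired
end_to_end_rate = n_images / (total_hrs * 3600)  # images/s through analysis

print(f"{n_images} images")
print(f"acquired at {acq_rate:.2f} img/s, processed at {end_to_end_rate:.2f} img/s")
```

When the end-to-end rate is well below the acquisition rate, images necessarily pile up in local storage while analysis catches up.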

Mike



--
You received this message because you are subscribed to the Google Groups "Opera High Content Imaging" group.
To view this discussion on the web visit https://groups.google.com/d/msg/OperaHCI/-/JJsnlxOxUiAJ.


Ghislain Bonamy

May 29, 2012, 5:46:33 PM
to Oper...@googlegroups.com, Ghislain Bonamy
Hi Mike,

From the description of your problem, I think the issue is that the analysis is indeed slower than the actual imaging (explaining why half your images get stored locally before analysis completes). Removing the script from the analysis will definitely solve that issue. Here we have set up Acapella on our cluster so that we can perform the analysis using about 170 CPUs. This allows us to quickly analyze and re-analyze data without slowing down the acquisition. I think that during assay development the online analysis is useful, but during screening it is usually best to turn it off in order to avoid issues related to RMCA interactions.

Let me know if you have further questions.

Best,

Ghislain

Peter Bircham

May 29, 2012, 6:40:48 PM
to Opera High Content Imaging
We use the 60X lens with a binning of 2 for most of our screens and
only use a binning of 1 if we want to confirm a few hits from a screen
or want higher quality images.
We don't really have enough data storage to do all our screens with a
binning of 1, and like Mike said, we also get problems with images
being stored locally, sometimes even if only a single position in a
well is imaged and no scripts are run.

Martin Stoeter

May 30, 2012, 5:42:59 AM
to Opera High Content Imaging
Hi Peter,

We mostly use the 40xW and binning 1. According to PerkinElmer, 60xW
with binning 1 is oversampling; however, when we were imaging yeast I
still had the impression that 60xW binning 1 gives you a bit more
resolution and you see subcellular structures a bit more clearly.

Anyway, we don't have the problem with file transfer and the locally
stored files. Without a script, even one CIA is able to handle 13 fields
and 4 channels per well. Because of memory issues on the Opera PC, I never
use more than 2 CIAs. Once we ran something like 40x, binning 1, 18
fields, 2 channels, and an Acapella script detecting spots. The 384-well
plate took about 3.5 h, and the 3 CIAs were a bit slower than the
acquisition (maybe the acquisition took 3 h); the locally stored images
were then analyzed and transferred at the end of the acquisition. No
problem. But I agree with Ghislain: in these cases one should uncouple
acquisition and analysis.

Did you check the Ethernet connection between the Opera and the CIAs? We
have a 1 Gbit network. Our CIAs write directly to a temporary image store
computer on a local LAN (independent from our big network and servers).
Make sure the Opera-PC-to-CIA connection is as fast as possible!
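A quick way to judge whether the link is the bottleneck is to compare the raw data volume of a run against line rate. A minimal sketch, using the 384-well, 13-field, 4-channel run mentioned above; the 1.3-megapixel, 16-bit frame size is an assumption, not from the thread:

```python
# Estimate network load for a 384-well, 13-field, 4-channel run
# over a 1 Gbit/s link. Frame size (1.3 Mpixel x 16 bit) is assumed.
wells, fields, channels = 384, 13, 4
bytes_per_image = 1_300_000 * 2  # assumed 1.3 Mpixel, 16-bit
link_bits_per_s = 1e9            # 1 Gbit Ethernet, ideal line rate

total_bytes = wells * fields * channels * bytes_per_image
transfer_s = total_bytes * 8 / link_bits_per_s

print(f"{total_bytes / 1e9:.1f} GB total, ~{transfer_s / 60:.1f} min at line rate")
```

At 1 Gbit the transfer is minutes, small next to a multi-hour acquisition; at 100 Mbit the same run would take over an hour to move, so confirming the actual link speed is worthwhile.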

Best regards,

Martin

Ghislain Bonamy

May 30, 2012, 10:23:52 AM
to Oper...@googlegroups.com
Hi Everyone,

I think PE might be incorrect about oversampling at 60x bin1, at least following the Nyquist sampling guidelines. That being said, I wanted to get back to the original question regarding the 100x objective. Using 60x bin1 is a better solution than 100x bin2, since one covers a larger area and achieves better resolution with the 60x bin1 strategy.
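To make the comparison concrete, here is a sketch; the 13 µm camera pixel pitch is an assumption (inferred from the ~217 nm/pixel figure quoted later in the thread):

```python
# Compare effective pixel size and field coverage for 60x bin1 vs 100x bin2.
# The 13 um camera pixel pitch is an assumption, consistent with the
# ~217 nm/pixel figure for 60x bin1 mentioned elsewhere in the thread.
cam_pixel_um = 13.0

def effective_pixel_nm(mag, binning):
    """Sample spacing at the object plane, in nm/pixel."""
    return cam_pixel_um * 1000 * binning / mag

def relative_field_area(mag):
    """Field area scales with 1/magnification^2 for the same sensor."""
    return 1.0 / mag**2

px_60 = effective_pixel_nm(60, 1)    # ~217 nm/pixel
px_100 = effective_pixel_nm(100, 2)  # ~260 nm/pixel
area_ratio = relative_field_area(60) / relative_field_area(100)

print(f"60x bin1: {px_60:.0f} nm/px, 100x bin2: {px_100:.0f} nm/px")
print(f"60x field covers {area_ratio:.1f}x the area of a 100x field")
```

So 60x bin1 both samples more finely than 100x bin2 and covers roughly 2.8x the area per field, which is the point being made above.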

One thing I wanted to add to my earlier comment is that uncoupling the analysis from the acquisition makes the most sense if you have a separate analysis pipeline (at least this would be the major reason), because no matter what you do you will need to spend some time analyzing the data.

The points raised by Martin are all correct, including decreasing the number of RMCAs. The one caveat to that solution is that if you use 2 RMCAs, then when you run a re-analysis of the data only 2 RMCAs would be used, which means the analysis would not be as quick as if you had started with 4 or 8 RMCAs. This is not an issue with our analysis pipeline, which runs Acapella Linux on our Linux cluster, since we create our own jobs. I also implemented a scheduler that can work on a multi-core Windows machine, so if you had an 8- or 16-core Windows server running Acapella, that would be a good way to speed up your analysis; the one caveat there is that it uses a command-line interface rather than a GUI.
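The multi-core scheduler idea can be sketched with Python's standard library. Everything here is hypothetical (the command line, script name, and well IDs are illustrations, not the actual tool); it just shows dispatching one analysis job per worker:

```python
# Minimal sketch of a multi-core job scheduler for offline analysis.
# analyze() is a placeholder; in practice it would invoke the analysis
# engine via subprocess (the command line below is hypothetical).
from concurrent.futures import ThreadPoolExecutor

def analyze(well_id: str) -> str:
    # Placeholder for something like:
    # subprocess.run(["acapella", "-s", "script.proc", well_id], check=True)
    return f"{well_id}: done"

def schedule(well_ids, workers=8):
    """Run one analysis job per well, `workers` at a time."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(analyze, well_ids))

if __name__ == "__main__":
    wells = [f"{row}{col:02d}" for row in "AB" for col in range(1, 4)]
    for line in schedule(wells, workers=2):
        print(line)
```

Threads (rather than processes) are a reasonable choice here because each job would spend its time waiting on an external analysis process, not on Python itself.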

Finally, if storage and analysis are an issue at 60x bin1, the issue will persist at 100x bin2, mostly because you would need to take more pictures to cover the same area. As far as timing goes, the increased number of fields required to cover a similar area would mean about as much total time dedicated to the screen, except that the extra ~1.5 hrs, instead of going to data processing only, would go to data processing AND data acquisition (i.e., acquisition plus processing would still take about ~3 hrs). Altogether, I think the 60x bin1 strategy is the better solution.

I think the solution to your problem would come from getting new RMCAs for processing data offline (i.e., after reading the plates), or, if your institute has a Linux cluster, maybe you can look into the Acapella Linux solution. For the Windows solution described above, you can look at the Google Code repository I created at http://code.google.com/p/operahci/; there are also other Acapella resources there that may be of interest to you.

Best,

Ghislain

Ghislain Bonamy

May 30, 2012, 5:59:40 PM
to Michael Wyler, Oper...@googlegroups.com

As David mentions, the 60x bin 1 does oversample, since the resolution is about 217 nm/pixel and the diffraction-limited resolution is about 248 nm (at 488 nm, with the NA 1.2 60x water lens); however, the Nyquist theorem calls for sampling at about 2.3-fold the best achievable resolution (i.e., the diffraction-limited resolution). So even though 60x bin1 does oversample, it does NOT reach the optimal sampling dictated by Nyquist's theorem (which would be about 107 nm/pixel).

This is why the 100x lens would be better: it would provide more oversampling and bring images closer to the Nyquist optimal sampling (although not reaching it, since the resolution would be ~130 nm/pixel). However, as highlighted before, there are quite a few issues associated with swapping the lens that make it not worthwhile (sample size, imaging time, focusing, etc.).
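The numbers above can be reproduced from the Rayleigh criterion. A small check, using only the figures quoted in this thread plus an assumed 13 µm camera pixel pitch:

```python
# Reproduce the sampling numbers quoted above.
# Rayleigh resolution: r = 0.61 * wavelength / NA
wavelength_nm = 488
na = 1.2
nyquist_factor = 2.3  # oversampling factor quoted in the post

rayleigh_nm = 0.61 * wavelength_nm / na          # ~248 nm
nyquist_pixel_nm = rayleigh_nm / nyquist_factor  # ~108 nm/pixel target

# Effective pixel sizes at bin 1, assuming a 13 um camera pixel pitch:
px_60x_bin1 = 13_000 / 60    # ~217 nm/pixel
px_100x_bin1 = 13_000 / 100  # 130 nm/pixel

print(f"Rayleigh limit: {rayleigh_nm:.0f} nm")
print(f"Nyquist target: {nyquist_pixel_nm:.0f} nm/pixel")
print(f"60x bin1: {px_60x_bin1:.0f} nm/px, 100x bin1: {px_100x_bin1:.0f} nm/px")
```

Both lenses sample below the Rayleigh limit, but only the 100x gets close to the ~107-108 nm Nyquist target, matching the argument above.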

 

As far as local storage is concerned, I think the issue is that, when the analysis is too slow, PE decided to store images in RAM and then locally as temporary storage, instead of writing images directly to the final location and working from there. That is a different story, though.

Best,

Ghislain

Peter Bircham

Jun 5, 2012, 9:01:48 PM
to Opera High Content Imaging
Thanks for all the feedback.
I think you're probably right about the 60X offering the best solution.


cheers

Peter