Andor Zyla 5.5 Camera integration into BLACS


Rohit Prasad Bhatt

unread,
Feb 12, 2020, 4:07:11 AM2/12/20
to The labscript suite
Hi all,
We will soon start using the Andor Zyla 5.5 camera for our experiment. Since we don't have NI Vision, we need code to integrate the camera into labscript. Andor also provides SDK3 support for the camera.

I see that there is a Python wrapper from David Meyer (LINK). As far as I understand, some code still needs to be written to integrate this camera into labscript using the Python wrapper.

I did a similar thing for our Mako camera, writing code to integrate it into labscript using the Pymba wrapper.

So I would like to ask if someone has already written integration code using this wrapper to include ANDOR cameras in labscript.

Regards,
Rohit Prasad Bhatt

Philip Starkey

unread,
Feb 12, 2020, 6:53:33 AM2/12/20
to labscri...@googlegroups.com
Hi Rohit,


Might not work with your camera immediately but might be a good starting point!

Cheers,
Phil

--
You received this message because you are subscribed to the Google Groups "The labscript suite" group.
To unsubscribe from this group and stop receiving emails from it, send an email to labscriptsuit...@googlegroups.com.
To view this discussion on the web, visit https://groups.google.com/d/msgid/labscriptsuite/1bd34514-2aa4-429c-9e16-5066a23e8f09%40googlegroups.com.

Rohit Prasad Bhatt

unread,
Feb 17, 2020, 1:28:04 AM2/17/20
to 'Philip Starkey' via The labscript suite
Hi Phil,
Sorry for the late response. Thanks a lot for mentioning this!

I have a question regarding this. Does this approach require ANDOR SOLIS software to be installed?

Or will it also work if I just have ANDOR SDK3 installed?

We did not buy SOLIS with our cameras. But we did buy the SDK3.

To access our ZYLA cameras for testing, I just used the free third-party software MicroManager along with the freely downloadable support library from ANDOR.

Regards,
Rohit Prasad Bhatt

Russell Anderson

unread,
Feb 17, 2020, 6:02:18 AM2/17/20
to labscri...@googlegroups.com
Hi Rohit,

The AndorSolis device class does not require you to have Solis installed, and on the other hand it should work just the same alongside an existing Solis installation (so long as the camera isn't open in Solis when you're trying to program it via labscript).

Is the SDK installed in C:\Program Files\Andor SDK? If not, you will need to point the device class at the location you installed the SDK. The default is defined as sdk_dir in labscript_devices\AndorSolis\andor_solis.py.
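For illustration only: the `resolve_dll_path` helper and `ANDOR_SDK_DIR` environment variable below are hypothetical, not part of the device class (which hard-codes `sdk_dir` in `andor_solis.py`), but they show the idea of pointing the loader at a non-default SDK location:

```python
import os

# Hypothetical sketch: the real andor_solis.py hard-codes sdk_dir, so in
# practice you edit it there. The ANDOR_SDK_DIR variable is an assumption
# of this sketch, not something the labscript code reads.
DEFAULT_SDK_DIR = r"C:\Program Files\Andor SDK"

def resolve_dll_path(sdk_dir=None, dll_name="atmcd64d.dll"):
    """Return the full path of the SDK DLL, preferring an explicit sdk_dir."""
    sdk_dir = sdk_dir or os.environ.get("ANDOR_SDK_DIR", DEFAULT_SDK_DIR)
    return os.path.join(sdk_dir, dll_name)

# On Windows the device class would then load it with ctypes, e.g.
# ctypes.WinDLL(resolve_dll_path(r"D:\Andor\SDK"))
```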

Your workaround using μManager sounds interesting, especially as it is a very popular open-source tool in microscopy. Were you able to integrate it into your labscript workflow, or was the testing you refer to separate from an experiment shot conducted using labscript?

– Russ



--
Russell Anderson

Rohit Prasad Bhatt

unread,
Feb 18, 2020, 3:35:31 AM2/18/20
to 'Philip Starkey' via The labscript suite
Hi Russ,
Thanks for the clarification. I will make sure to set the SDK path properly.

About MicroManager: I didn't integrate it into labscript. Since we just got our ZYLA cameras delivered, I wanted to make sure they were taking pictures.

So I used MicroManager, since we didn't buy ANDOR SOLIS. This method was pretty quick, and ANDOR themselves give guidelines on how to test their cameras with third-party software.

Regards,
Rohit Prasad Bhatt


Rohit Prasad Bhatt

unread,
Apr 2, 2020, 6:08:22 PM4/2/20
to The labscript suite
Hi all,
I started testing the code for ANDOR integration into labscript. I ran into a problem where the code could not find the DLL even though it was there. I resolved this by adding the ANDOR SDK3 installation directory to the system PATH variable.
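For reference, a sketch of the PATH fix (the helper name is my own, and the note about Python 3.8+ on Windows needing os.add_dll_directory as well is a general Python fact, not something from the labscript code):

```python
import os

# Sketch of the fix described above: make the SDK directory visible to the
# DLL loader. On Python 3.8+ under Windows, PATH is no longer searched for
# DLL dependencies, so os.add_dll_directory is also needed there.
def make_dll_visible(dll_dir):
    """Prepend dll_dir to PATH and, where supported, register it for DLL loading."""
    os.environ["PATH"] = dll_dir + os.pathsep + os.environ.get("PATH", "")
    if hasattr(os, "add_dll_directory"):  # Windows, Python >= 3.8 only
        os.add_dll_directory(dll_dir)

# Example (the install path is an assumption -- use your actual location):
# make_dll_visible(r"C:\Program Files\Andor SDK3")
```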

After this the camera did initialize as a device in BLACS, but I could not take a snap. I thought this might be because the code was tailored for an EMCCD, so I tried to keep a minimum of initialization attributes. But as the code repeatedly failed to take any images, I tried searching for the API functions in the SDK3 manual which came with our camera purchase.

On going through the manual, I realized that the present code in labscript is written for ANDOR SDK2, while we have ANDOR SDK3. ANDOR changed everything in this update, so the commands look very different. See the attached image for an example.

So I finally concluded that I would have to rewrite the entire code for SDK3 compatibility, though I could try to keep the same structure. Is there an efficient (and perhaps quicker) way of updating the code? Or do you have any other suggestions?

SDK2vsSDK3.png



Regards,
Rohit Prasad Bhatt



Rohit Prasad Bhatt

unread,
Apr 2, 2020, 6:10:43 PM4/2/20
to The labscript suite
Somehow the image was not showing!!
SDK2vsSDK3.png

David Meyer

unread,
Apr 3, 2020, 5:38:58 PM4/3/20
to The labscript suite
Hi Rohit,

Somehow I never noticed the new AndorCamera class uses SDK2. It's always super annoying when vendors decide to completely deprecate their own API for no apparent reason. It's been a while since I've worked with our Andor cameras but can't the Zyla 5.5 also run on SDK2? You might be able to convince Andor to let you have access to both APIs which would be a quick solution.

As for making an SDK3 wrapper: while the two SDKs differ a lot, their general function and operation are pretty similar. There is a decent chance the only thing you would need to change is the BLACS worker itself, leaving everything else intact. I don't know how helpful it would be to you, but that old Python wrapper I wrote was for SDK3, and you are certainly welcome to make use of it if it helps you get things done faster. I even have an old-style camera server that shows how to use it with labscript (unfortunately it is only on my lab computer and I'm not allowed to go to work at the moment).

Anyway, I'm stuck working from home for the time being and have a little time to spare so if you decide my old wrapper is the way to go I can probably give you a hand in porting to SDK3.

-David

Rohit Prasad Bhatt

unread,
Apr 10, 2020, 8:58:24 AM4/10/20
to 'Philip Starkey' via The labscript suite
Hi David,
Thanks a lot for your help and suggestions. After reading them, I managed to adapt the code so that I can now use the ZYLA as a device in BLACS. I have already tested it in manual mode by taking a snap and a continuous live stream. To make it work I mostly had to edit the andor_utils file. I have not yet updated the comments properly in the code, but the latest code is available here [https://github.com/fretchen/synqs_devices/tree/master/AndorSolis]. However, the final test of the code in buffered mode still needs to be done.

But I have some general questions/comments on the ANDOR code:
1. I still don't understand how it worked. The library file imported by the andor_solis file is called 'atmcd64d.dll'. I looked up the properties of this file and the description says 'SDK2 adapter for SDK3'. Maybe this file does the magic, or maybe it is the C header file 'ATMCD32D.H', which defines codes for various features, actions, etc.

2. Since the ANDOR SDK supports a variety of cameras, it would be nice to write the code in such a way that different labscript users can use the same core module and only change/add a config file for their camera model. For example, this is already done in labscript for NI cards. A similar structure might be helpful for ANDOR cameras.

3. Another point I have is regarding the very low frame rate in BLACS. In manual continuous mode, I am getting 1.2 FPS at most. However, on testing with MicroManager, I got a frame rate of 22 FPS in live-stream mode at similar settings. So I wonder what is causing the difference...
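To sketch the idea from point 2 (all names, models and defaults below are hypothetical, chosen only to illustrate the structure of a shared core module plus per-model config):

```python
# Hypothetical sketch of per-model configuration: a shared core module plus
# small model-specific attribute dicts. None of these names exist in the
# current labscript code; they just illustrate the proposal.
MODEL_DEFAULTS = {
    'zyla_5.5': {
        'acquisition': 'run_till_abort',
        'trigger': 'internal',
        'fan_mode': 0,      # air cooling only
    },
    'ixon_888': {
        'acquisition': 'single',
        'trigger': 'internal',
        'fan_mode': 1,      # water cooling available
    },
}

def acquisition_attrs(model, **overrides):
    """Merge a model's defaults with per-experiment overrides."""
    attrs = dict(MODEL_DEFAULTS[model])
    attrs.update(overrides)
    return attrs
```

A lab would then pick its model and override only what its experiment needs, e.g. `acquisition_attrs('zyla_5.5', trigger='external')`.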

Regards,
Rohit Prasad Bhatt


David Meyer

unread,
Apr 13, 2020, 5:11:18 PM4/13/20
to The labscript suite
Hi Rohit,

Again, it's been a while since I've really looked at the ANDOR SDKs but I'll say what I remember/know.

1) For completeness, what edits did you actually make to the andor_utils.py file? Looking over the code, it appears the AndorSolis wrapper in labscript is in fact written for SDK3, which explains why using SDK3 functions is fine. What made you think the labscript code was SDK2?

2) This is definitely the way things are generally done with labscript. Aside from API differences, I believe the code is generally valid for the entire Andor range of cameras, with the finer control elements set using the configuration dictionaries at initialization. Going the extra step of the NI cards is basically just pre-making those dictionaries and saving them to a file that calls the more general code underneath. I imagine that once the camera integration into labscript matures and gains wider adoption, those kinds of simplifications will arise. Adding true SDK2 support is also fairly simple and could be made very transparent, but I suspect it won't happen, since SDK2 is deprecated and likely not in demand for the vast majority of labscript users.

3) This one I actually know without even looking at code! It is a subtle problem that is a bit tricky to get around, though the other labscript camera wrappers do a decent job of it. Basically, the issue is in how the manual mode acquisitions are implemented. 

In order to enable fast frame rates, camera APIs have to implement a high-speed data transfer layer (TL) to receive the images as they are being captured. This requires a fair bit of clever, asynchronous code that essentially runs in the background, separate from the configuration and triggering code. To get the fastest frame rates, these TLs need to make a number of optimizations depending on the specifics of each acquisition: how many frames are being captured, how much of the sensor is being used, are kinetics being employed, is the shutter global or rolling, is the sensor a CCD or CMOS, which interface is being used (USB3, GigE, CL)? As a result, when you want to take images, you need to tell the camera all of these things and then tell it to start acquiring. In the case of the Andor API, calling acquire() is when the TL gets configured, and this incurs a significant overhead (basically the camera/API can be a little slow doing all the necessary optimizations for data transfer). The Andor API also takes its sweet time stopping acquisition (basically undoing many of those changes and going back to steady state). My recollection for our Zyla over USB3 was that the start/stop overhead was at least 200 ms.

The way the labscript code you are using works for manual acquisition is to call the snap() method repeatedly. This method configures the acquisition, calls acquire(), takes a single frame, reads it out, then does the acquisition-stop cleanup. Obviously this is extremely inefficient if you are trying to take a series of images, especially since it is largely pure Python (not the compiled C of the DLL). It would be better to set up and call acquire() once, get a stream of images via a generator through the TL, then stop acquiring at the end. This isn't always a trivial thing to do, though all camera APIs support it in some fashion, since that's how videos work. I don't think the AndorSolis wrapper in labscript is set up for it at the moment. Given that it is just a ctypes wrapper, it may also be difficult to do well, depending on exactly how the API works and what functions are available through it. This exact issue is why I generally like to go through the extra effort of Cython for my camera wrappers, since it is easy to make a fast, Python-compatible generator function for image streaming.
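As a rough illustration (a mock camera with made-up numbers, based on the ~200 ms overhead mentioned above; none of this is the real API), the difference between repeated snap() and a single streamed acquisition looks like this:

```python
# Toy model: per-frame start/stop pays the TL setup/teardown overhead every
# frame, while a streamed acquisition pays it exactly once.
class MockCamera:
    START_STOP_OVERHEAD = 0.2   # ~200 ms per StartAcquisition/stop cycle
    FRAME_TIME = 0.01           # actual exposure/readout per frame

    def snap(self):
        # configure TL, grab one frame, tear down -- overhead every time
        return self.START_STOP_OVERHEAD + self.FRAME_TIME

    def stream(self, n):
        # configure TL once, then yield frames from the running acquisition
        yield self.START_STOP_OVERHEAD   # one-off setup cost
        for _ in range(n):
            yield self.FRAME_TIME

def manual_mode_time(cam, n):
    """Total time for n frames taken via repeated snap() calls."""
    return sum(cam.snap() for _ in range(n))

def streamed_time(cam, n):
    """Total time for n frames from a single streamed acquisition."""
    return sum(cam.stream(n))
```

With these numbers, 10 frames cost about 2.1 s via repeated snap() (roughly the ~1 FPS Rohit sees) versus about 0.3 s streamed, which is the gap between the BLACS manual mode and MicroManager's live view.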

Anyway, that's a lot of detail. If you don't need fast frame rates for manual mode, you are probably good to go as is. The buffered mode labscript code does configure the TL such that it takes all frames as a single acquisition so the frame rate limitation you are seeing now is not the same as the "frame rate" for buffered images.

-David



Rohit Prasad Bhatt

unread,
Apr 15, 2020, 7:48:35 AM4/15/20
to 'Philip Starkey' via The labscript suite
Hi David,
Thanks a lot for the detailed info. I would like to tell you why I thought it was SDK2 and not SDK3. I also have some questions.
1. I thought it was SDK2 and not SDK3 because the manual which came with our purchase of SDK3 did not mention any of the functions defined in the andor_solis file. Rather, it describes the features which can be used (and which of them are ZYLA-specific), and then how to call the functions for setting those features. This left me puzzled, so I tried to see if there is any other manual for SDK3 with a different description. Doing this, I came across a file from ANDOR themselves which clearly lays out the differences between SDK2 and SDK3, and that is where I saw some of the functions defined in the andor_solis file listed as SDK2 functions. I had also uploaded that comparison table in one of my previous posts.
Now I got a bit suspicious, so I searched for the SDK2 manual and found one online. That manual had a full description of all the functions defined in the andor_solis file, so I became almost sure that this code is for SDK2. Also, what I understood from the andor_solis file is that the functions defined there are passed to the SDK in the def uint_winapi(argtypes=[]): function. I thought that since the SDK will only understand functions predefined by ANDOR, the current andor_solis functions should be SDK2 functions. All in all, I find the ANDOR documentation confusing.
Also, I had to change some parameters in the andor_utils file. But I thought this might be due to the different camera model we are using, since I believed the ANDOR code was mainly tested with an EMCCD camera.
2. Now I will describe exactly what changes I made to the andor_utils file. But I must start with a disclaimer: I made those changes mostly to skip over the errors rather than actually trying to solve the underlying issues. If I ran into an error, I would go to that part of the code and try to guess how critical the section was for taking images. If I thought it was not critical, I would just comment it out to see if I could proceed to taking images. Moreover, the changes I made are minor edits. With this preface, here are the changes:
  • First I tried to keep the minimum number of parameters in the default_acquisition_attrs dictionary in the andor_utils file (e.g. I disabled the shutter attributes). This is bad practice, because I could also pass the dictionary when instantiating the camera in the connection table. But I am just reporting the changes I made, and some are dirty.
  • I set up a call to setup_acquisition() function in the initialize_camera(self) function.
  • Then I changed the parameter in the SetFanMode() call. Since we don't have water cooling, the code would call SetFanMode(1), which threw an error. I resolved this by changing it to SetFanMode(0). Then I checked whether the sensor was getting cooled and stabilized, and it was.
  • Next I made changes in the setup_acquisition() function. I commented out the calls to setup_horizontal_shift(), setup_vertical_shift() and setup_shutter() because they were giving errors. Also, I set wait_until_stable to False in the call to enable_cooldown().
Apart from that, to take images in buffered mode I commented out two print statements in blacs_workers. They print the horizontal and vertical shift speeds and are in the grab_multiple() function.
3. Now I have a question about using the ANDOR ZYLA in kinetic series mode for a buffered sequence. I tried it by setting 'acquisition': 'kinetic_series', 'number_kinetics': 2 and 'trigger': 'external_exposure', because we want two images per shot. We control the camera exposure via the width of an external TTL pulse, so we work in external exposure mode. I call the expose function 'number_kinetics' times and make sure there is enough interval between two expose calls. Now when I run the sequence, the camera only takes and saves one image per shot. I checked with 'number_kinetics': 3 and it still takes only one image, namely the first one. I also checked by setting 'number_kinetics': 1 and calling expose only once in the experiment script; then the script ran without any problems.
Basically, the error for higher 'number_kinetics' was that the circular buffer of the camera only contains one image, so the shape of the data array mismatches and an error is thrown. Could it be that the acquire function called from blacs_workers is only taking one image when it should take multiple? I could not resolve this issue.
But I read that you had tested this code with your ZYLA cameras, so could you please tell me how you would configure the ZYLA for a kinetic series with external exposure mode? Maybe I am making a mistake.
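In code form, the configuration I described above is the following (the key names are the ones I used; whether they map one-to-one onto the current AndorSolis attributes is my reading of the code):

```python
# Acquisition attributes for two externally-timed images per shot, as
# described in the thread. Exposure duration comes from the TTL pulse width.
kinetic_series_attrs = {
    'acquisition': 'kinetic_series',   # fixed-length multi-frame acquisition
    'number_kinetics': 2,              # two images per shot
    'trigger': 'external_exposure',    # exposure set by external TTL width
}

# The experiment script then calls expose() once per frame, i.e.
# number_kinetics times, with enough spacing between calls.
n_exposures = kinetic_series_attrs['number_kinetics']
```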
Whatever I wrote here is what I understood from the code and literature. I would be happy to be corrected wherever I am wrong. Thanks again.
Regards,
Rohit Prasad Bhatt



David Meyer

unread,
Apr 22, 2020, 3:16:38 PM4/22/20
to The labscript suite
Rohit,

Thanks for your detailed response. Getting little things like that out in a searchable place really helps the community, especially when it comes to the less well tested portions of labscript ;)

1) I see. Well the andor_solis.py file is definitely SDK3. However, the paths that are hard-coded into it appear to be from a computer that also has Andor Solis installed (Andor's stand-alone viewer program, not to be confused with the SDK itself) and possibly SDK2. In any case, I can certainly agree that Andor's docs are not exactly user friendly.

2) What you are describing sounds spot on. Basically, the person who wrote this driver was working with an iXon camera and hard-coded some of the details for that camera. Ideally, non-general options should be handled through the configuration dictionaries (think things like the fan mode). It isn't overly difficult to do, but it does require some thought and ideally at least a few different types of cameras for testing. Maybe I will make some time to work on this code a bit to make it friendlier to Zylas, since I have access to one and some free time on my hands. I suppose that would also let me improve the frame rates of manual-mode captures, which is likely to be a big plus all around.

3) My memory of the details of kinetics mode is very fuzzy. I probably tested it but ended up not using it day-to-day, so I don't remember. Again, very quickly perusing the andor_utils.py methods, I wonder if you might actually want to use fast kinetics instead of kinetic series. I'm not entirely sure what the actual difference is between them, but fast kinetics sounds like the kinetic modes I have used in the past. It looks like kinetic series has a number of limitations and is designed for something very specific. I don't have access to the full SDK docs yet so I can't say more, but my guess is that you don't really want kinetic series, assuming you are doing typical absorption-imaging-like things. In fact, unless you are trying to get the images very close to each other, you should skip kinetics altogether. Even if you do need them close, you should probably confirm your triggers by running without kinetics first.

If you'd like, I'll try to get my hands on our lab's Zyla and see if I can do some tests on my own. It will likely take me a few days to sort all that out though if time is of the essence.

-David



David Meyer

unread,
Jun 16, 2020, 4:40:34 PM6/16/20
to The labscript suite
Hi Rohit,

I'm guessing you have somewhat moved on from this but I've made a little progress on the continuous mode acquisitions that I'll post here so I remember it for the future.

For starters, I've found the slowdown. Basically, the AndorCam.acquire() method calls the SDK function StartAcquisition, which is quite time-consuming, for every single image acquired. This method also has an apparently unnecessary 50 ms wait built into its loop. More fundamentally, the issue is that the current implementation uses a single-image-at-a-time acquisition mode for everything. There is mention of a run_till_abort acquisition mode, but it does not appear to be implemented fully enough to actually be usable.

Perhaps more interestingly, I discovered that the Andor SDK3 has two levels of API: a very low-level API (all functions beginning with AT_) found in the atcore.dll library, and a higher-level API with standard helper functions in the atmcd64d.dll library. I've only ever used the lower API in the past, because the documentation that came with my installation only covers those functions. However, the current Andor_Solis camera in labscript uses the higher-level API exclusively. The only documentation I can find on the higher-level API is the function names in the corresponding header file and the function comments ported over by Pablo in andor_solis.py. Without that documentation I'm struggling to make any headway getting the run-till-abort acquisition mode working. It's also likely that other improvements to this implementation, such as making it more general with regard to camera options, will be hard without that documentation as well.

I'm hoping I can get that documentation from Andor though it isn't going well so far. If someone has it I'd be interested to take a look.

-David

Rohit Prasad Bhatt

unread,
Jun 21, 2020, 6:48:02 AM6/21/20
to 'Philip Starkey' via The labscript suite
Hi David,
Thanks a lot for reviving this conversation. I read your previous reply and it seems there were some questions I had not fully answered. For your current efforts I am attaching the documentation from our SDK3 installation, though I am not sure how different it is from yours.
######### Reply to your previous post #########
The reason for using kinetic series mode is that we typically take more than one image per camera during a shot, and kinetic series fits this scenario well because you can specify (before the start of acquisition) how many images you will take. For the trigger type we use "External exposure", because it allows us to change the exposure time by varying the width of the TTL pulse, which is a global in runmanager. So I think kinetic series mode is suitable for us.

I do not think the "fast kinetics" mode is even possible with sCMOS cameras. ANDOR support told me earlier that this mode is specific to EMCCDs (although I think CCDs also have it). The point of fast kinetics is that you take an image and shift it to the dark part of the camera sensor chip. The readout takes place at the end, once you have taken the required number of images or the entire chip is filled with images. This mode allows very fast exposures followed by a single slow readout at the end. You also have to physically mask the part of the sensor that buffer-stores the images coming from the exposed part of the chip. It is easy to see that the number of consecutive images you can take depends on the frame size of one image and the total size of the camera sensor.

However, in sCMOS cameras you can get fast exposures by reducing the number of sensor rows in the frame. I think one row of the Zyla can give an exposure time of 10 μs, so the ROI settings of the camera govern the speed. The selected ROI is read out after every exposure of the kinetic series, unlike the single final readout of fast kinetics.
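The scaling of frame time with ROI height can be sketched numerically. The 10 μs per-row figure is the one quoted above and should be treated as approximate, and the simple non-overlapped model below ignores rolling-shutter overlap between exposure and readout:

```python
# Rough frame-time estimate for an sCMOS ROI: readout scales with the
# number of rows. The ~10 us per-row figure is approximate (quoted above).

ROW_TIME_US = 10  # microseconds per sensor row (assumed)

def frame_time_us(n_rows, exposure_us=0):
    """Approximate minimum time per frame for an ROI that is n_rows tall.

    Simple worst case: exposure and readout do not overlap. In the
    camera's overlap/rolling modes the real figure would be smaller.
    """
    return n_rows * ROW_TIME_US + exposure_us

full_frame = frame_time_us(2160)  # full Zyla 5.5 height: 2160 rows
strip = frame_time_us(8)          # narrow 8-row ROI: ~270x faster readout
```

This is why shrinking the ROI to a few rows makes such a large difference to the achievable repetition rate within a kinetic series.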

######### Reply to your present post #########
The fact that the Andor SDK3 has two levels of API might explain my earlier confusion between SDK2 and SDK3.

At the moment we use our ANDOR Zyla cameras heavily, but the Zyla is not a device in BLACS. We use a dirty hack wherein labscript just sends triggers to the camera, and the camera configuration is controlled entirely from Micro Manager. For kinetic series, we use the Multi-Dimensional Acquisition of Micro Manager. At the end of a shot, once Micro Manager has fetched all images and stored them in our specified directory, we use the Function_runner class of labscript to pack these images into the HDF file for the current shot. We pack the images into the default "images" group that labscript uses for supported cameras. Since the camera is not fully in sync with labscript timing, we have to add an additional delay in the stop(t) statement at the end of the experiment to make sure the camera has finished saving all images in the directory where Function_runner will look.

This strategy could be used for any camera that labscript does not support but whose vendor provides software to control it. Of course, writing a proper BLACS integration is the cleaner solution.
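The packing step can be sketched as follows. The directory layout, file pattern, and group path are illustrative assumptions (not our actual configuration), and the h5py write is left as comments so the ordering logic stands on its own:

```python
# Sketch of packing Micro Manager-saved images into the shot's HDF5 file.
# Paths, patterns, and the "images/..." group name are all hypothetical.
from pathlib import Path

def collect_images(save_dir, pattern="*.tif"):
    """Return image files in acquisition order (here: sorted by filename)."""
    return sorted(Path(save_dir).glob(pattern))

def pack_images(shot_file, save_dir, group="images/side/zyla"):
    """Map each saved image file to a dataset path inside the shot file."""
    mapping = {}
    for i, path in enumerate(collect_images(save_dir)):
        dataset = f"{group}/frame{i}"
        mapping[dataset] = path
        # With h5py one would then do something like:
        #   with h5py.File(shot_file, 'r+') as f:
        #       f[dataset] = imread(path)
    return mapping
```

A function like pack_images would be what the Function_runner callback invokes after stop(t), once the added delay guarantees Micro Manager has finished writing to the directory.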

Please find attached herewith the documentation for our SDK3 install.

Regards,
Rohit Prasad Bhatt

Andor Software Development Kit 3.pdf