Implementing NI PXIe-6535 and PXI-6733 cards


Cameron Straatsma

Jun 11, 2016, 3:29:51 PM
to The labscript suite
I have the labscript suite set up and running on an NI PXI chassis, but I am having some issues getting the digital input/output card (PXIe-6535) to operate properly. I am using a single PineBlaster to generate the clock, and I have confirmed that it is functioning correctly. I have written my own .py device files (attached), including an edit of the NIBoard.py file (myNIBoard.py), since the DIO card I am using spreads its 32 channels over 4 different ports. At the moment I am trying to implement the first 8 channels, located in port 0. So far I have had the following successes:
  • I can compile files in runmanager and pass runs to BLACS
  • I can operate all output channels on all cards (digital and analog) in manual mode in BLACS without any errors
  • I can program a ramp for an analog channel and execute it in buffered mode without any errors (confirmed by looking at the clock signal and analog output simultaneously on a scope)
I can't seem to get the digital card to operate properly in buffered mode. Looking at the HDF5 file, it looks like no data is being passed to the card, and therefore it doesn't output anything when it gets clocked. I have attached the HDF5 files for my connection table and for a test run with a single digital channel instantiated. For this test run, watching the output of the digital channel on a scope, I see it go high and stay high, rather than toggling and then staying low after the run is complete.
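
For context, the sequence I'm compiling is essentially just a toggle on one line. A rough sketch of what the test script does (with placeholder names rather than my actual connection table) is:

from labscript import start, stop
from labscript_utils import import_or_reload

# The PineBlaster clock, NI cards, and channel definitions are assumed to be
# set up in connectiontable.py; 'do0' here stands in for a DigitalOut wired
# to port0/line0 of the PXIe-6535.
import_or_reload('connectiontable')

start()
t = 0
do0.go_high(t)   # the line should go high here...
t += 1e-3
do0.go_low(t)    # ...then return low 1 ms later and stay low after the run
t += 1e-3
stop(t)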

Does anyone have any insight here? I'm hoping it's a simple fix or a simple error in one of the .py device files... Thanks!

Cheers,

Cameron
connectiontable.h5
20160611T130554_tests_0.h5
myNIBoard.py
NI_PXIe_6535.py
NI_PXI_6733.py
connectiontable.py
tests.py

Cameron Straatsma

Jun 11, 2016, 4:56:50 PM
to The labscript suite
I think I have it figured out now. Turns out the HDF5 file did contain data for the card, so that wasn't the problem.

In the original .py device file for the NI PCIe-6363, which I modified to implement the PXIe-6535, there seems to be a mixture of numpy.uint32 and numpy.uint8 in use (e.g. lines 29, 139, 206, and 218). I think this should always be numpy.uint8, since the DAQmx function DAQmxWriteDigitalLines takes an array of uInt8 as the input for the samples to write to the channel.
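
For reference, the call I have in mind looks roughly like this with PyDAQmx (task setup abbreviated; the device name, channel string, and clock terminal are placeholders, not my actual configuration):

import numpy as np
from ctypes import byref
from PyDAQmx import Task, int32
from PyDAQmx import DAQmx_Val_ChanPerLine, DAQmx_Val_GroupByScanNumber
from PyDAQmx import DAQmx_Val_Rising, DAQmx_Val_FiniteSamps

# DAQmxWriteDigitalLines wants one uint8 (0 or 1) per line per sample.
n_samples = 4
data = np.array([[1, 0, 0, 1, 0, 1, 1, 0]] * n_samples, dtype=np.uint8)

task = Task()
task.CreateDOChan('Dev1/port0/line0:7', '', DAQmx_Val_ChanPerLine)
task.CfgSampClkTiming('/Dev1/PFI0', 1e6, DAQmx_Val_Rising,
                      DAQmx_Val_FiniteSamps, n_samples)
written = int32()
task.WriteDigitalLines(n_samples, False, 10.0, DAQmx_Val_GroupByScanNumber,
                       data, byref(written), None)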

Anyways, changing all instances to numpy.uint8 appears to have fixed the problem, and I now get the correct output on the digital lines in buffered mode.

These changes might have an unintended consequence, so hopefully someone who has a deeper understanding of this code can comment.

The updated device files myNIBoard.py and NI_PXIe_6535.py are attached in case anyone is interested. These implement all channels (ports 0:3, lines 0:7) on the card.

Cheers,

Cameron
myNIBoard.py
NI_PXIe_6535.py

Philip Starkey

Jun 12, 2016, 12:58:38 AM
to The labscript suite
Hi Cameron,

I believe the current usage of np.uint32 and np.uint8 is correct for the NI PCIe 6363 in the original configuration. 

The code using np.uint32 relates to the storage of the state of all 32 channels in the HDF5 file. Here one channel corresponds to one binary bit. However, as you say, the NI function requires each uint32 number to be converted to an array of 32 uint8 numbers (one for each channel). The places where you see uint8 are where the data is in the correct format for the NI function, and the uint32 is where it is in the format for storage in the HDF5 file. The reason for the different formats is simply to reduce the HDF5 file size (one 32-bit number is significantly smaller than 32 8-bit numbers!).

In your case, because you were only trying to get the first 8 channels to work, you are correct in changing the uint32 to uint8 so that the conversion to the format for the NI function doesn't include data for 24 channels that don't exist! (this is likely why it didn't work correctly for you at first). I believe that the code for the NI PCI-6733 uses uint8 everywhere because it only has 8 buffered output channels.
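
As a rough illustration of the conversion I mean (not the exact NIBoard.py code, just the idea):

import numpy as np

# One uint32 per sample, one bit per digital line; this is what goes in the HDF5 file.
stored = np.array([0b0101, 0b0011, 0b1000], dtype=np.uint32)

# Expand to one uint8 per line per sample, which is the layout the NI write
# function wants: shape (n_samples, 32), values 0 or 1.
lines = ((stored[:, None] >> np.arange(32)) & 1).astype(np.uint8)

# Packing back the other way for storage:
repacked = (lines.astype(np.uint32) << np.arange(32, dtype=np.uint32)).sum(axis=1, dtype=np.uint32)
assert np.array_equal(repacked, stored)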

I notice in your follow-up email that you indicate you have all 4 ports working. Have you tested simultaneous output of all 32 digital lines? In changing it to use uint8 everywhere, I believe you will only be storing the first 8 channels... Oh, I think I see the issue. Line 37 of myNIBoard.py is outputarray[line] = output.raw_output. This ignores the port, so you are overwriting the first 8 channels with the second 8, then the second 8 with the third 8, and so on. I think you will find that the output of each port is identical, regardless of how you command them. To fix this, I think you need to change back to using uint32 in the appropriate places (but not everywhere, see above) and then change that line to read outputarray[port*8+line] = output.raw_output.
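
Schematically, something along these lines (not a drop-in patch; parse_port_and_line and digital_outputs are just stand-ins for however your file iterates over the DigitalOut objects and their 'portN/lineM' connection strings):

import numpy as np

n_samples = 1000  # placeholder
outputarray = [np.zeros(n_samples, dtype=np.uint8) for _ in range(32)]  # one entry per line

for output in digital_outputs:
    # e.g. 'port2/line5' -> (2, 5); a hypothetical helper
    port, line = parse_port_and_line(output.connection)
    # Index by port*8 + line so port 1 doesn't overwrite port 0, and so on
    outputarray[port * 8 + line] = output.raw_output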

There may be additional changes required as well, but I think that is the main one. A good way to check is to convert the integers stored in the HDF5 file into a binary string and see if the output state of each bit matches what you expect.
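
For example (the dataset path below is just a guess; use whatever group your shot file actually puts the 6535's digital data in):

import h5py
import numpy as np

with h5py.File('20160611T130554_tests_0.h5', 'r') as f:
    # Adjust the path to wherever the card's digital output table lives
    data = f['devices/ni_pxie_6535/DIGITAL_OUTS'][:]

for i, word in enumerate(data):
    # Bit 0 is port0/line0, bit 8 is port1/line0, and so on
    print('%d: %s' % (i, np.binary_repr(int(word), width=32)))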

Let us know how you go!

Cheers,
Phil

P.S. We'd be keen to make NIBoard.py compatible with your myNIBoard.py file, and then have your NI_PXIe_6535.py file subclass NIBoard like the other NI cards do. But it is probably best to wait on this until you have confirmed all of your ports are working correctly.

Cameron Joseph Edgar Straatsma

Jun 12, 2016, 1:26:27 AM
to labscri...@googlegroups.com
Hi Phil,

Thanks for the response!

I definitely jumped the gun a bit (my bad), and ran into some issues with implementing all 4 ports. I thought I had it worked out, but it didn't go quite as expected. After playing around a bit more I agree that the current usage of uint32 and uint8 is correct. Going through the code again, you've certainly identified an issue on line 37 of myNIBoard.py. I'll get to this first thing on Monday, and let you know how things pan out.

Thanks for the tip with the integers in the HDF5 file. That's a nice way to check that the data being written to the card is correct.

I'd be happy to add the functionality here to your existing code once I've got it debugged and working. Do you have a preferred way of going about this? Perhaps pushing these additions to my own fork of the Bitbucket repository and opening a pull request?

Cheers,

Cameron


Philip Starkey

Jun 12, 2016, 1:35:33 AM
to The labscript suite, Cameron....@colorado.edu
Hi Cameron,

No worries!

Yes, the best way would be to commit to your fork and then make a pull request. We'll then review the code and get you to make any changes if necessary. If you can make sure you're working from the tip of the labscript_devices repo prior to making the pull request, that would be preferable (the labscript suite installer defaults to the highest tagged version of the repository, which may not be the tip, so you might need to update in Hg Workbench and merge your changes locally first).

Cheers,



Cameron Straatsma

Jun 13, 2016, 2:17:54 PM
to The labscript suite, Cameron....@colorado.edu
Hi Phil,

The fix you suggested on line 37 of the myNIBoard.py file worked. I did some tests this morning and it looks like all 32 digital channels (spread over 4 ports on the PXIe-6535 card) are functioning correctly. I tested in manual mode using the front panel of BLACS as well as in buffered run mode and looked at the outputs of each channel on a scope.

I forked the labscript_devices repository, added changes to the NIBoard.py file, and added my .py device files for the PXI cards I'm using. I submitted a pull request with you as the reviewer...hopefully that all worked. I'm kind of new to using Mercurial and Bitbucket, so please let me know if I messed something up.

Cheers,

Cameron