pyControl / photometry integration in Neurodata Without Borders framework

Julien Carponcy

Mar 11, 2022, 8:23:46 AM
to pyControl
Hello,

I was wondering if anybody was aware of people/projects having made efforts in integrating pyControl and/or pyPhotometry data into a more complex and general framework, particularly in NWB (https://pynwb.readthedocs.io/).
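To make the question concrete, the kind of thing I have in mind looks roughly like the pynwb sketch below, with a pyPhotometry trace stored as an acquisition TimeSeries and pyControl event times under a behaviour processing module. All file names, signal names and values are made up for illustration.

from datetime import datetime
from dateutil.tz import tzlocal
import numpy as np
from pynwb import NWBFile, NWBHDF5IO, TimeSeries
from pynwb.behavior import BehavioralEvents

# Placeholder data standing in for imported pyControl / pyPhotometry output.
photo_times = np.arange(0, 10, 0.01)            # seconds
photo_signal = np.random.rand(photo_times.size)
poke_times = np.array([1.2, 3.4, 7.8])          # e.g. 'left_poke' event times (s)

nwbfile = NWBFile(
    session_description='pyControl + pyPhotometry session (illustrative)',
    identifier='example-session-001',
    session_start_time=datetime.now(tzlocal()),
)

# Raw photometry trace as an acquisition time series.
nwbfile.add_acquisition(TimeSeries(
    name='photometry_signal', data=photo_signal,
    unit='V', timestamps=photo_times))

# pyControl behavioural events stored as timestamp-only series.
behaviour = nwbfile.create_processing_module(
    name='behavior', description='pyControl events')
events = BehavioralEvents(name='pycontrol_events')
events.create_timeseries(
    name='left_poke', data=np.ones(poke_times.size),
    unit='n/a', timestamps=poke_times)
behaviour.add(events)

with NWBHDF5IO('example_session.nwb', 'w') as io:
    io.write(nwbfile)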

I am especially interested in whether there are attempts out there to merge these with DeepLabCut data and possibly ephys/spike-sorting outputs. I have not come across anything like that yet (with pyControl/pyPhotometry).

I am also aware of DataJoint Neuro, which is NWB compatible, but this is just a general prompt to see whether people know of nice shared work for organising all of these data neatly together.

On a side point, I was wondering whether people have tried to extend the Session class and/or integrate it into NWB to add a trial level, or whether there is a framework for handling pyPhotometry/pyControl data already pre-processed by trial.
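As one possible shape for that trial level, NWB files already have a built-in trials table, so a pre-processed trial-by-trial structure could be written as extra columns, something like the sketch below (the trials_df DataFrame is a hypothetical per-trial summary derived from pyControl data; the column names are illustrative).

import pandas as pd

# Hypothetical per-trial summary derived from pyControl data
# (times in seconds relative to session start).
trials_df = pd.DataFrame({
    'start_time': [0.0, 5.2, 11.7],
    'stop_time':  [4.8, 10.9, 16.3],
    'choice':     ['left', 'right', 'left'],
    'outcome':    ['rewarded', 'unrewarded', 'rewarded'],
})

# Continuing from an open NWBFile object (nwbfile) as in the sketch above.
nwbfile.add_trial_column(name='choice', description='side chosen on this trial')
nwbfile.add_trial_column(name='outcome', description='reward outcome')

for trial in trials_df.itertuples():
    nwbfile.add_trial(start_time=trial.start_time, stop_time=trial.stop_time,
                      choice=trial.choice, outcome=trial.outcome)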

Anyway, this is an open-ended question to see how people are tackling the issue of integrating pyControl data with other modalities, especially in frameworks that make them more "shareable" or "reusable" in standardised (open-source) formats. As such, I will appreciate almost any kind of feedback before I try to implement this systematically myself.

Best regards to everyone, congratulations to Thomas on the published work, and thanks again for this amazing piece of software.

Julien

thoma...@neuro.fchampalimaud.org

Mar 11, 2022, 2:19:59 PM
to pyControl
Hi Julien,

Thanks for raising this question. I am not aware of any work to integrate pyControl or pyPhotometry data with frameworks like Neurodata Without Borders, but I think this could be very valuable. I've never worked with the NWB or DataJoint formats myself, so I don't have a sense of how complicated this would be to implement, or what the strengths and limitations of the formats are.

We do work routinely on experiments that integrate pyControl and/or pyPhotometry data with Open Ephys and/or DeepLabCut. The initial steps of this workflow are fairly standardised. We synchronise the data by sending sync pulses, generated using the pyControl Rsync class, to all the other devices including the camera (as detailed in the synchronisation docs). The first step of the analysis is then to import all the different data types and set up Rsync_aligner instances using the sync pulse times recorded by each system. These can then be used to convert times between systems, e.g. video frame times to pyControl clock ms.
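Roughly, that step looks like the sketch below. This is simplified, and the file names, event names and units arguments are just placeholders, so treat it as an illustration of Rsync_aligner usage rather than working pipeline code.

import numpy as np
from data_import import Session    # pyControl data import tools
from rsync import Rsync_aligner    # pyControl synchronisation tools

# pyControl behavioural session: rsync pulse times in pyControl clock ms.
session = Session('example_session.txt')
pycontrol_pulses_ms = session.times['rsync']

# The same pulses as recorded by another system (e.g. the photometry
# clock, here assumed to be in seconds), loaded however that system
# stores them.
photometry_pulses_s = np.load('photometry_sync_pulses.npy')

# Build the aligner; the units arguments express each system's time
# units in milliseconds (seconds -> 1000).
aligner = Rsync_aligner(pulse_times_A=pycontrol_pulses_ms,
                        pulse_times_B=photometry_pulses_s,
                        units_A=1, units_B=1000)

# Convert times between the two clocks in either direction.
pokes_in_photometry_time = aligner.A_to_B(session.times['left_poke'])
pulses_in_pycontrol_ms = aligner.B_to_A(photometry_pulses_s)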

For tasks that have a discrete trial structure we normally then align activity across trials by linearly time-warping between reference points (e.g. trial initiation and choice). I've put a version of the code I use for this on GitHub here. I often then attach the trial-aligned data for each session onto the Session object as an attribute, and I will normally have another attribute which contains the behavioural data processed into a trial-by-trial data structure. This is quite convenient for writing analysis code, as you can just pass lists of Session objects, each containing both the behavioural and neural data, to the analysis functions. An example of analysis code written in this way is the imaging analysis here from this paper (it does not use Rsync for alignment, as that was not implemented when the data were acquired).
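For readers who have not seen it before, the core of the linear time-warping step can be written in a few lines of numpy, something like the sketch below. This is not the code from the GitHub link above, just a minimal illustration, and the function and argument names are made up.

import numpy as np

def warp_trial(signal_times, signal, t_init, t_choice, n_points=100):
    # Evenly spaced sample times between the two reference points
    # (trial initiation and choice).
    sample_times = np.linspace(t_init, t_choice, n_points)
    # Interpolating the signal at these times maps every trial, whatever
    # its duration, onto a common n_points-long 'trial fraction' axis,
    # so the warped traces can be stacked and averaged across trials.
    return np.interp(sample_times, signal_times, signal)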

If you do end up implementing anything more systematic than this I would be really interested to hear about it.

best,

Thomas