data sync in Bonsai and openephysGUI


王策群

Oct 14, 2021, 3:39:59 AM
to Bonsai Users
Dear all,

I am using an Intan RHD2000 evaluation board and a Logitech USB webcam for video recording. I have no idea how to sync the video and the ephys data; the Intan RHD2000 interface software doesn't support camera data.
With brunocruz's help, I tried to record both streams in Bonsai: the ephys workflow from the bonsai-examples, to which I added another plugin written by Jonnewman that sends the ephys data to openephysGUI for visualization, plus the video workflow from CINPLA's tracking plugin. However, the ephys data shown in openephysGUI looks a little bit weird. It seems I used some wrong settings? Does anyone use this plugin, and how do you handle the syncing of the video and ephys data?

My Best,
Cequn

微信图片_20211014150757.png
微信图片_20211014150751.png

brunocruz

Oct 14, 2021, 5:02:37 AM
to Bonsai Users
Hi Cequn,

Regarding the OpenEphys plugin, I am not sure I can be of help since I never used it before. However, if you can, include the workflow (bonsai and openEphys) you are currently using, I'd be curious to take a look :D.

Regarding the syncing of the ephys system with the camera, I can try to explain a bit better what a simple solution would look like.
First of all, as you correctly pointed out, the camera is acquired by a separate system, not by the RHD2000, so the two streams are not synchronized. As a result, we need to "calibrate" one stream against the other. One way to achieve this calibration is to know when a certain event happens in both the ephys and camera streams; that way you know (making some assumptions, like stable acquisition rates) which camera frame a given ephys sample corresponds to.
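To make that calibration concrete, here is a minimal Python sketch of the frame-to-sample mapping under the stable-rate assumption. All of the rates and indices below are made-up numbers for illustration, not values from the thread:

```python
# Map camera frames to ephys samples using one shared sync event,
# assuming both acquisition rates are stable over the recording.

FRAME_RATE_HZ = 30.0       # camera frame rate (hypothetical)
SAMPLE_RATE_HZ = 30000.0   # RHD2000 sampling rate (hypothetical)

sync_frame = 152           # frame index where the sync event (LED on) was seen
sync_sample = 1_523_400    # ephys sample index of the same pulse onset

def frame_to_sample(frame):
    """Return the ephys sample index corresponding to a camera frame."""
    dt = (frame - sync_frame) / FRAME_RATE_HZ       # seconds since sync event
    return round(sync_sample + dt * SAMPLE_RATE_HZ)

print(frame_to_sample(152))   # 1523400 (the sync event itself)
print(frame_to_sample(182))   # 30 frames = 1 s later -> 1553400
```

With two or more sync pulses you can additionally fit the effective frame rate instead of trusting the nominal one, which corrects for small clock drift.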

Now, how do we do this? Different cameras afford different ways to do it; however, if you are using a "vanilla" webcam you probably can't trigger frame acquisition or embed the state of a sync signal in the metadata of a given frame. What we can do instead is "embed" a sync signal directly in the image itself.

To achieve this:
> Connect a pair of wires to one of your devices (Arduino, OpenEphys, etc.) capable of generating a TTL pulse.
> Split this into 2 (or more, if you want to synchronize more devices) wires. Connect one pair to one of the digital input ports on the OpenEphys board. Use the second pair to power an LED positioned somewhere it can be seen by the camera.
> Now, whenever you send a pulse through this line (make sure the pulse lasts more than a couple of frames so you don't miss it), it will be recorded simultaneously in the ephys data (through the digital input) and in the camera (since the LED will turn on). Afterward, you just need to track the luminosity of that small LED ROI and you will be able to tell which sample and which frame "saw" the same pulse onset.
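The last step above, tracking the LED ROI in the video, can be sketched in a few lines of Python with NumPy. The ROI coordinates and threshold here are hypothetical and would need tuning for a real recording:

```python
import numpy as np

def led_onset_frame(frames, roi, threshold=128):
    """Return the index of the first frame whose mean luminance inside
    the LED ROI exceeds `threshold` (i.e. the frame where the LED
    turned on).

    frames: iterable of 2-D grayscale arrays
    roi:    (y0, y1, x0, x1) bounding box around the LED
    """
    y0, y1, x0, x1 = roi
    for i, frame in enumerate(frames):
        if frame[y0:y1, x0:x1].mean() > threshold:
            return i
    return None  # LED never turned on in this clip

# Toy movie: a bright patch (the "LED") turns on at frame 3.
movie = [np.zeros((48, 64), dtype=np.uint8) for _ in range(6)]
for f in movie[3:]:
    f[10:14, 20:24] = 255
print(led_onset_frame(movie, roi=(10, 14, 20, 24)))  # 3
```

The returned frame index, paired with the sample index of the same pulse edge in the digital input channel, gives you the shared event needed for the calibration described above.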

I hope this is clearer!
Cheers,
Bruno

王策群

Oct 15, 2021, 4:01:13 AM
to Bonsai Users
Hi Brunocruz,

I just uploaded the simple workflow.
For my simple syncing approach, I just referred to the script here: https://github.com/nikolaskaralis/OE_Bonsai_network_sync/blob/master/startExperiment.py
The author wrote a Python script that sends recording start/stop commands to OE (over ZeroMQ) and Bonsai (over OSC).
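For reference, the ZeroMQ side of such a script can be sketched roughly as below. This is only a sketch, not the linked script itself: it assumes the Open Ephys GUI is listening for network commands on port 5556 (its usual default), and the exact command strings and reply text depend on your GUI version and configuration:

```python
import zmq  # pyzmq

def send_oe_command(cmd, address="tcp://localhost:5556"):
    """Send one command string to the Open Ephys GUI over a ZeroMQ
    REQ/REP socket and return the GUI's reply string.

    The address/port and accepted commands are assumptions here;
    check your own Open Ephys network configuration.
    """
    ctx = zmq.Context.instance()
    sock = ctx.socket(zmq.REQ)
    sock.connect(address)
    try:
        sock.send_string(cmd)
        return sock.recv_string()
    finally:
        sock.close()

# Typical usage (command names as used by the linked script):
# send_oe_command("StartRecord")
# ... run the experiment ...
# send_oe_command("StopRecord")
```

Note that this kind of start/stop command only aligns the two systems to within network latency; the LED method gives a much tighter, per-pulse alignment.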

But you are right, that would be more precise. I found that the Intan 32-ch headstage does support adding an LED: https://intantech.com/files/Intan_RHD2000_adding_LED.pdf
Can you also help me modify the workflow?

My best,
Cequn
Ephys-video.bonsai

brunocruz

Oct 17, 2021, 9:32:14 AM
to Bonsai Users
Hi Cequn, 

"I just uploaded the simple workflow.
For my simple syncing approach, I just referred to the script here: https://github.com/nikolaskaralis/OE_Bonsai_network_sync/blob/master/startExperiment.py
The author wrote a Python script that sends recording start/stop commands to OE (over ZeroMQ) and Bonsai (over OSC)."

Sorry for the confusion. I actually meant the .bonsai file and the signal chain file from the OpenEphys GUI that you used in your attached screenshot. Regardless, looking at your workflow, I suspect the "AdcScale" node might be changing the data format from the one expected by OpenCVMatUDPClient. Can you try to apply the AdcScale transformation AFTER sending the data to the socket?

"But you are right, that would be more precise. I found that the Intan 32-ch headstage does support adding an LED: https://intantech.com/files/Intan_RHD2000_adding_LED.pdf
Can you also help me modify the workflow?"

You can simply use a normal LED positioned in your camera's field of view. Preferably, this LED should be static, which will make your life much easier when tracking its luminosity. Perhaps another post will help clarify? https://groups.google.com/g/bonsai-users/c/2YVFDX4yaJo/m/OAsPzYgQAgAJ
To toggle the LED you can use any external device capable of generating pulses (e.g. an Arduino).

Hope this helps,
Bruno



jonathan...@gmail.com

Oct 22, 2021, 8:51:23 AM
to Bonsai Users
Relevant thread over on open ephys forums addressing the usage of the communication between Bonsai and Open Ephys GUI: https://groups.google.com/g/open-ephys/c/B_ECysQoCYA/m/QszBiEWqBQAJ