Re: [open-ephys] New Open Ephys user at IMM : xyt position tracking anyone?


Goncalo Lopes

Aug 31, 2015, 10:04:22 AM
to Mikkel Elle Lepperød, Josh Siegle, Open Ephys, bonsai...@googlegroups.com
Hey Mikkel,

The way I usually do it is start by synchronizing the frame acquisition. This can be done in two ways, depending on the kind of camera you have:

1) If you have a professional/industrial-grade camera (e.g. PointGrey, Basler, IDS), then it usually provides a GPIO output that can be configured to send a digital pulse whenever the shutter is open for integration. If you feed this pulse directly into one of the OpenEphys digital inputs, you get precise hardware synchronization of when each individual frame was grabbed. If your camera supports this, it is definitely the gold standard, and you're pretty much done: as long as you don't drop frames, no matter how much extra latency your analysis adds, you will always know which frame you were working on and, by extension, which slice of the neural data it matches. To account for dropped frames you should also store the hardware frame counter of the camera (the first sketch after this list shows how to put these pieces together).

2) If you just have a webcam or a lower-grade camera with no GPIO capabilities, then you will probably have to hack together an external sync pulse, something like an external LED that blinks periodically in a place where your camera can see it at all times (i.e. away from the animal). The most common version of this is simply an Arduino running the Blink sample. In addition to driving the LED for the camera, you run a wire from the same Arduino output to an OpenEphys digital input. This way, whenever the output goes high, the ephys system sees the digital pulse and the camera sees the optical signal. You can track when the optical signal lights up in Bonsai using a Crop node (see an example of Crop at https://www.youtube.com/watch?v=736G93Qaak0) followed by Sum, which you can then threshold to find your sync pulse rising edges (the second sketch after this list shows one way to match those edges to the ephys pulses).
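For option 1, here is a minimal offline sketch of how the exposure TTL edges and the hardware frame counter combine to place every saved frame on the ephys clock. This is only an illustration: the file names, the 30 kHz sampling rate, and the assumption that the first saved frame corresponds to the first recorded pulse are placeholders you would adapt to your own recordings.

import numpy as np

# Placeholder inputs: adapt the loading to your own file layout.
# ttl_samples[i]   = ephys sample index of the i-th exposure pulse (rising edge)
# frame_counter[j] = hardware frame counter stored with the j-th saved frame
ttl_samples = np.load("frame_ttl_samples.npy")
frame_counter = np.load("frame_counter.npy")

# The hardware counter says which exposure pulse each saved frame came from,
# so dropped frames leave gaps instead of shifting the alignment.
pulse_index = frame_counter - frame_counter[0]
frame_samples = ttl_samples[pulse_index]

# Convert to seconds on the ephys clock (assuming 30 kHz here).
frame_times = frame_samples / 30000.0

And for option 2, a sketch of matching the optical edges seen in the video to the electrical edges recorded by OpenEphys, assuming you logged the Crop/Sum intensity trace from Bonsai (e.g. with a CsvWriter) and extracted the Arduino rising-edge sample indices from the ephys events. Again, the file names and the simple threshold are placeholder choices.

import numpy as np

intensity = np.loadtxt("led_roi_sum.csv")          # Crop -> Sum output, one value per frame
ephys_edges = np.load("arduino_ttl_samples.npy")   # rising-edge sample indices from OpenEphys

# Rising edges of the optical sync signal in the video.
threshold = intensity.mean() + 3 * intensity.std()
above = intensity > threshold
video_edges = np.flatnonzero(above[1:] & ~above[:-1]) + 1

# If no pulses were missed on either side, the k-th optical edge matches the
# k-th electrical edge, giving frame -> ephys sample anchor points; interpolate
# between anchors to assign an ephys sample to every frame.
n = min(len(video_edges), len(ephys_edges))
frames = np.arange(len(intensity))
frame_samples = np.interp(frames, video_edges[:n], ephys_edges[:n])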

As long as you have one of these methods in place, then you can play with your online analysis without worrying so much about synchronization.

Regarding visualization of tracking coverage, it is also possible to do it in a couple of ways:

1) If you prefer to do it in OpenEphys, then yes, you need to find a way to pipe the XY values to the GUI. I guess in principle you could set up Bonsai to report the XY to a NI-DAQ analog output and then read that into OpenEphys, but this seems to me like a bit of overkill and likely to introduce more latency than necessary.

The best way would be for Bonsai to communicate directly with the OpenEphys GUI using some kind of IPC (inter-process communication). I usually do this using the Bonsai.OSC module. OSC is a lightweight binary communication protocol widely used in the electronic music scene to control various input and output devices (think of it as MIDI 2.0). Bonsai supports sending most data types directly through OSC, including Points, orientations, times, etc. It would be nice if the OpenEphys GUI had an OSC source (and a sink while you're at it). This would let Bonsai and the OpenEphys GUI exchange data back and forth much more effectively (Josh et al., what do you think?). The spec is very simple and there are a number of reference implementations you could probably grab and plug into a source (the first sketch after this list gives an idea of how little code a receiver needs).

2) Another way is to build the visualization in Bonsai itself. There are also a number of options for this. The easiest is to simply use the built-in visualizers; you can just follow the steps in this tutorial (https://youtu.be/_uJVtsGtI1M). Another way is to accumulate a 2D histogram of tracked positions over time. This is easy enough to do with OpenCV and will be included in the next Bonsai release (2.2). If you prefer to go this way let me know and I can show a couple of ways to do it in the current version using Python nodes (the second sketch after this list illustrates the computation itself). There are other options as well, but they're probably too involved for what you need right now.
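To give an idea of how little code the OSC side needs, here is a sketch of a receiver for the XY stream in Python, assuming Bonsai's SendMessage operator is pointed at localhost port 2323 with the address /position, and that the python-osc package is installed. The port and the address are arbitrary choices for the example and must match whatever you configure in the workflow.

from pythonosc import dispatcher, osc_server

# Print every XY point Bonsai sends to the /position address.
def on_position(address, x, y):
    print(f"{address}: x={x:.1f}, y={y:.1f}")

disp = dispatcher.Dispatcher()
disp.map("/position", on_position)

# Port and address must match the SendMessage settings in the Bonsai workflow.
server = osc_server.BlockingOSCUDPServer(("127.0.0.1", 2323), disp)
server.serve_forever()

And to make the histogram idea concrete, the computation is just a 2D bin count over the visited positions. A minimal offline sketch with numpy, assuming you logged the centroid coordinates to a CSV (the file name and bin count are placeholders):

import numpy as np
import matplotlib.pyplot as plt

xy = np.loadtxt("centroid.csv", delimiter=",")  # two columns: x, y in pixels

# Coarse occupancy map, e.g. 20 x 20 bins over the arena.
counts, xedges, yedges = np.histogram2d(xy[:, 0], xy[:, 1], bins=20)

# Coverage check: fraction of bins visited at least once.
print("coverage:", np.mean(counts > 0))

plt.imshow(counts.T, origin="lower", cmap="hot")
plt.colorbar(label="samples per bin")
plt.show()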

Regarding alternative ways of tracking 3D position, you can use the Bonsai.Aruco module to extract the 3D position and orientation of known fiducial markers using the ArUco library. This works by printing out a square black-and-white code pattern which the camera can easily identify and extract the geometry of (see also Figure 4D in the Bonsai paper for an example). You also need to calibrate the camera's intrinsic parameters (either using OpenCV or a manual procedure), but it's better to ask in another thread if you are interested in this option.
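To give a flavour of what the underlying library does, here is a sketch using OpenCV's aruco bindings directly. The function names follow the older cv2.aruco (contrib) interface and differ slightly in recent OpenCV releases, and the intrinsics here are placeholder values; in practice they come from your camera calibration.

import cv2
import numpy as np

# Placeholder intrinsics; in practice these come from calibration.
camera_matrix = np.array([[800.0, 0.0, 320.0],
                          [0.0, 800.0, 240.0],
                          [0.0, 0.0, 1.0]])
dist_coeffs = np.zeros(5)

dictionary = cv2.aruco.Dictionary_get(cv2.aruco.DICT_4X4_50)

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    corners, ids, _ = cv2.aruco.detectMarkers(frame, dictionary)
    if ids is not None:
        # 3D pose of each detected marker, given its printed side length (5 cm here).
        rvecs, tvecs = cv2.aruco.estimatePoseSingleMarkers(
            corners, 0.05, camera_matrix, dist_coeffs)[:2]
        cv2.aruco.drawDetectedMarkers(frame, corners, ids)
        print(ids.ravel(), tvecs.reshape(-1, 3))
    cv2.imshow("aruco", frame)
    if cv2.waitKey(1) == 27:  # Esc to quit
        break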

Regarding the overlay of physiology data directly on the video, this would also be very interesting to do. Again you could mix up OpenEphys and Bonsai.

Probably the easiest way here would be to start with Bonsai since it already has built-in video visualizers. If OpenEphys could send OSC messages to Bonsai (see above), you could imagine streaming in a series of color codes (one for each clustered cell) as an OSC stream to Bonsai. You could then combine that stream with the current tracking state and simply color the current position with the color of the cell, which would give you the classical place-cell visualization online. There are also many other options, but it's probably better to trail off here as this e-mail is getting long.
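For reference though, the offline version of that place-cell plot really is only a few lines. A quick sketch, assuming you already have per-frame positions and the (integer) frame indices at which one clustered cell fired; the file names are placeholders.

import numpy as np
import matplotlib.pyplot as plt

xy = np.loadtxt("centroid.csv", delimiter=",")       # per-frame x, y positions
spike_frames = np.load("cell3_spike_frames.npy")     # frame indices where this cell fired

# Grey trajectory with the positions at spike times coloured on top:
# the classical place-field plot, rendered offline here for illustration.
plt.plot(xy[:, 0], xy[:, 1], color="0.8", linewidth=0.5)
plt.scatter(xy[spike_frames, 0], xy[spike_frames, 1], s=10, color="crimson")
plt.gca().set_aspect("equal")
plt.show()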

I'm taking the liberty of forwarding this also to the Bonsai mailing list, as it turned out to be a useful synthesis of a very frequently asked category of questions :-) Feel free to sign up if you have future questions about how to use Bonsai.

Cheers and hope this helps. Thanks for the feedback,
Gonçalo

On 31 August 2015 at 09:06, Mikkel Elle Lepperød <lep...@gmail.com> wrote:
Hi Josh,

So we will do research on spatially modulated cells, i.e. grid cells, place cells, head direction cells, etc. Here we have the animal moving freely in an open field - a 1 m² box. So my first dream would be to have the tracking write a position graph so that we can monitor (make sure) that the animal has explored the entire environment during one experiment. Preferably, the position data would be saved to the .kwik output of Open Ephys for offline analysis.

Then, my next dream is to connect LFP and spikes to the tracking so that we can make online activity-vs-position graphs and feedback loops with optogenetics.

I think the first solution would fit us well, and we do have programming expertise in the lab, so we could make our own module for this in the Open Ephys GUI. But it would be really nice to know where to start off :)

Goncalo Lopes: On our old system we track two LEDs in order to get both position and head direction. One problem with this is that reflections from the LEDs in the box can produce noise. Is this easily fixed with Bonsai, or do you have some other idea on how to do this? In addition, do you have some input on how to feed (synced) x/y position to the Open Ephys GUI?

Best regards

Mikkel


--
Mikkel Elle Lepperød
Centre for Integrative Neuroplasticity (CINPLA)
University of Oslo (UiO)
Kristine Bonnevies hus, room 2513



--
Gonçalo Cardoso Lopes
Intelligent Systems / Learning Laboratory
Champalimaud Centre for the Unknown
Av. Brasília, Doca de Pedrouços
1400-038 Lisboa, Portugal
www.neuro.fchampalimaud.org

Klára Gerlei

Nov 10, 2017, 7:20:17 AM
to Bonsai Users
Hi Gonçalo,

I have two questions and both are related to these things, so I hope it is okay to reply here.

1) I'm recording ephys data using Open Ephys and tracking movement in Bonsai on the same computer, so they both have their timestamps saved. Do you think it is still necessary to synchronize them? (Wouldn't the signal coming from the Arduino with the LED potentially take a different amount of time to arrive at the Open Ephys GUI than the detection of the light intensity in the video takes?)

2) I would also really like to detect when the animal has covered the whole surface (twice). Is that something that has been implemented since? If not, could you please give me some pointers on how to get and analyze the histograms in Bonsai? I'm happy to write things in Python, but I'm a bit lost in the Bonsai GUI. The motion and head-direction tracking is already working really well, and the xy coordinates are saved using a script you helped with (for Sarah Tennant and Elizabeth Allison).

Thank you!

Best wishes,
Klara

Gonçalo Lopes

Nov 11, 2017, 6:49:54 AM
to Klára Gerlei, Bonsai Users
Hi Klára and welcome to the forums!

1) I'm recording ephys data using Open Ephys and tracking movement in Bonsai on the same computer, so they both have their timestamps saved. Do you think it is still necessary to synchronize them? (Wouldn't the signal coming from the Arduino with the LED potentially take a different amount of time to arrive at the Open Ephys GUI than the detection of the light intensity in the video takes?)

It all depends on what your acceptable synchronization jitter is. There is of course always some minimum jitter depending on how your systems discretize time. For example, if you are using a 120 Hz camera, your minimum time unit is 1 frame, which corresponds roughly to 8 ms. This means you will have a minimum jitter of +/- 1 frame, or 8 ms.

OpenEphys, on the other hand, typically operates at 20-30 kHz, which means it has a minimum time unit below 1 ms. Transmission of raw electrical signals can be considered to operate at the speed of light (so delays are negligible in our case). The clever bit about the GPIO method outlined earlier is that it couples an electrical signal to the mechanical act of opening the camera shutter. This way, we know exactly the moment when the camera collected information from the world, no matter how much delay is present in the electronics/communication/driver software, etc.

The second method with the LED is the best you can do if you do not have access to this coupling. You generate a pulse of light and then look in the camera for when you see the photons. Again, turning on an LED is pretty much instantaneous for our purposes. The catch is that the LED can turn on out of phase with the camera, e.g. it can turn on at the end of an exposure (and so show up very dark). In practice, using this method you have to accept a minimum jitter of +/- 1 frame. How bad 1 frame is depends on how fast your camera is. For cheap webcams this can be as high as 30 ms, so yeah, in that case maybe it doesn't matter so much.

However, if your camera can go to 60 Hz (16 ms jitter) it may start to pay off, as it will be more reliable than the timestamp jitter from the computer clock (I usually assume jitter on the order of 10 ms between CPU timestamps of two distinct physical sources, and it can sometimes be higher).
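To put numbers on it, the frame-quantization floor alone works out roughly as follows (a trivial sketch, just spelling out the arithmetic above):

# Minimum synchronization jitter imposed by frame quantization alone.
for fps in (30, 60, 120):
    print(f"{fps:>3} Hz camera: +/- {1000.0 / fps:.1f} ms per frame")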

To sum up, if you can do everything with direct electrical signals, this will definitely be much better than just software timestamps in almost any situation and usually is the only way to do sub-millisecond synchronization. Other than that, it depends on your hardware and requirements.

2) I would also really like to detect when the animal has covered the whole surface (twice). Is that something that has been implemented since? If not, could you please give me some pointers on how to get and analyze the histograms in Bonsai? I'm happy to write things in Python, but I'm a bit lost in the Bonsai GUI. The motion and head-direction tracking is already working really well, and the xy coordinates are saved using a script you helped with (for Sarah Tennant and Elizabeth Allison).

Ah, actually yes, this has been included in Bonsai since then :-)
You just have to take the Centroid output of the tracking and send it to the Histogram2D node. Here is an example (also attached):

[workflow screenshot: Centroid -> Histogram2D]
You can do the same for 1D signals.

Hope this helps!
 


histogram2d.bonsai

Amina Kinkhabwala

Nov 15, 2018, 1:22:52 PM
to Bonsai Users
Hi Goncalo, 
One thing I'm still trying to wrap my head around is when to acquire electrophysiology using Bonsai rather than the Open Ephys app. My original hope was to trust Bonsai to give timestamps for both Open Ephys acquisition and video acquisition, for synchronization. I guess my question is whether there is any benefit to acquiring Open Ephys data in Bonsai besides feedback-control experiments where data is processed online to do something?
Another question I've had a hard time solving from the Google Groups threads is how to get a global timestamp from Bonsai. I'm trying to do something similar to a lot of people: just video acquisition paired with Open Ephys acquisition (and an Arduino to control light timing). If anyone has code to share for how to synchronize video and ephys acquisition, I'd love to look at some examples.

These are pretty basic questions, but I've been surfing the group threads for a while and still don't have a great grasp on the answers.
Thanks!
amina

Gonçalo Lopes

unread,
Nov 22, 2018, 7:00:06 PM11/22/18
to Amina Kinkhabwala, bonsai...@googlegroups.com
Hi Amina,

The OpenEphys GUI and the Bonsai Ephys plugins were developed independently of each other (some time ago) and targeted slightly different goals. The Bonsai Ephys plugin is compatible with all Rhythm boards (including the Intan RHD2000 and the OE USB 3 version). From Bonsai's perspective, the Ephys board is just another acquisition device. All available data that is streamed from the board (including timestamps) can be collected with Bonsai.

The OpenEphys GUI has some very nice built-in visualization plugins such as tetrode spike clustering and online PSTHs. With Bonsai you have all operations up to spike detection and basic visualization of time series and spike waveforms. So when should you use one versus the other? It really depends on your application. If you just want to collect and save raw Ephys data to a binary file, both should be roughly equivalent.

As an example, we originally used Bonsai Ephys to trigger moving windows of data on spikes coming from an external patch system through the I/O board, for guiding paired recordings online. We also use it frequently just for recording, when we are already collecting data with Bonsai and there is nothing special to be done on the Ephys side. This way we avoid running two separate programs, but that's not really a strong argument if you already have your workflow set up around the OpenEphys GUI.

Regarding the question about global timestamps, the answer really depends on what the "global" clock means for you. For a simple setup of Ephys + video, there are usually at least 3 clocks involved (PC time, Ephys time, and video time). Every device counts time from its own perspective, and for precise sub-millisecond synchronization you really need hardware triggers across the different devices. Which device counts as your "global" clock is a matter of choice, but we tend to go for the fastest clock (typically Ephys), so the easiest solution is to send all other events to be sampled synchronously with the Ephys trace (e.g. all camera frame triggers, behavior board events, etc., to the I/O board of OpenEphys).
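If you do route the sync events into the ephys I/O board, mapping everything else onto that clock afterwards is straightforward. A sketch, assuming you have the same sync pulses timestamped by both the PC and the ephys board; the file names and array contents here are placeholders you would fill from your own recordings.

import numpy as np

# The same sync pulses, timestamped on both clocks.
pc_times = np.load("sync_pc_times.npy")            # seconds, PC clock
ephys_samples = np.load("sync_ephys_samples.npy")  # samples, ephys clock

# A linear fit absorbs both the constant offset and any slow drift between the clocks.
slope, intercept = np.polyfit(pc_times, ephys_samples, 1)

def to_ephys_samples(t_pc):
    # Map any PC-timestamped event onto the ephys sample clock.
    return slope * np.asarray(t_pc) + intercept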

That said, there are many other possibilities that depend on setup: do you have a projector display that needs to be synchronized with video acquisition? You may want to have a photodiode detect the projector refresh rate, and have that trigger the camera. Are you playing a sound stimulus from the sound card that needs to be precisely synchronized with ephys data? You may need to duplicate the sound card output on the ephys analog or digital input to catch the onset of stimulus presentation.

In some cases, when using just behavior, you can go even simpler and just use video frame counts as your measure of time, or use the Timestamp operator in Bonsai to sample the 50 MHz high-precision event timing clock on the CPU. That would be the closest to an automatic global clock. Unfortunately, although this clock is very precise, there is communication latency from the device to the computer that makes it impossible to use this for millisecond precision.

Sorry for the long email; I'm not really sure it helped to clear up the confusion, but the bottom line is that precise synchronization of signals is, at the moment, a problem that is only really solved at the hardware level. You cannot trust non-real-time operating systems to precisely synchronize your data automatically, so unfortunately we need these custom hardware solutions, because we care too much about the flexibility and user-friendliness of consumer-level operating systems.

Feel free to ask questions about the specific synchronization setup that you were after and I'm happy to help suggest possible solutions.
Hope this helps somewhat.
