Real-time stimulation conditioned to a region in space


Xiong Xiao

Oct 12, 2015, 12:26:44 AM
to Bonsai Users
Hi,

Can anyone tell me how to perform real-time stimulation conditioned on a region in space (Bonsai paper, Figure 4E)? Can anyone send me an example workflow to do that?

Thanks in advance!

Xiong
Bonsai paper_Figure 4E.png

goncaloclopes

Oct 12, 2015, 5:34:14 AM
to Bonsai Users
Hi Xiong and welcome to the forums!

Attached you can find a workflow for activating an Arduino conditioned on a region in space. To get this to work you need an Arduino board configured with Firmata; if you have questions about setting that up, you can refer to the last part of this thread.

The way this works is by checking whether the center of the largest object is inside the defined bounding box. To redefine the bounding box, edit the properties of the RoiActivity node.
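
For reference, the containment test the workflow performs is simple. Here is a minimal sketch in plain Python (not the Bonsai workflow itself; all names are illustrative):

    def inside_box(cx, cy, x0, y0, x1, y1):
        # (cx, cy): centroid of the largest tracked object
        # (x0, y0) and (x1, y1): opposite corners of the bounding box
        return x0 <= cx <= x1 and y0 <= cy <= y1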

Hope this helps,
tracking_boxcenter_v2.bonsai

Xiong Xiao

Oct 12, 2015, 10:07:10 AM
to Bonsai Users

Hi Gonçalo,

Thank you so much. I just tried the Bonsai workflow you sent me and it worked wonderfully!

Weeks ago I was still considering buying a commercial behavioral system to do the real-time stimulation experiment. But now, with your help, I can use Bonsai together with an Arduino to beautifully achieve my goals. It is so nice!

Thanks!

Xiong

Zack Shortt

Apr 12, 2016, 7:40:29 PM
to Bonsai Users
Hi Gonçalo,

I'm wondering if this can be modified to use a circular area of interest rather than a square.

We are trying to project a circle at a random position onto the bottom of a test area for mice, using a clear base and a projector. We will get a random x and y coordinate from a Python function, along with a constant radius, and use these to draw a circle in a shader which is projected onto the bottom of the test area. I want to use this circle as an area of interest in the camera image, to detect when the mouse crosses into the inside of the circle.

Any help would be greatly appreciated.

Thanks,
Zack

goncaloclopes

Apr 12, 2016, 8:21:08 PM
to bonsai...@googlegroups.com
Hey Zack,

In general, this is a problem of collision detection. There are a couple of ways to solve it: in image-space or in vector-space. Conceptually, they go like this:

Image-space: get the thresholded (b/w) image of the animal, get the b/w image of the circle, and multiply the two images together; the result will be white wherever the two shapes overlap.
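
A minimal image-space sketch using OpenCV in Python (mask sizes and names are illustrative; for binary masks, a bitwise AND plays the role of the multiplication):

    import cv2
    import numpy as np

    # animal_mask: thresholded b/w image of the animal (placeholder here)
    # circle_mask: the projected circle redrawn on the CPU
    animal_mask = np.zeros((480, 640), np.uint8)
    circle_mask = np.zeros((480, 640), np.uint8)
    cv2.circle(circle_mask, (320, 240), 50, 255, -1)  # filled white circle

    overlap = cv2.bitwise_and(animal_mask, circle_mask)  # white where both are white
    hit = cv2.countNonZero(overlap) > 0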

Vector-space: get the position of the animal (e.g. its centroid), take the center of the circle (the X/Y point), and compute the distance between the two, e.g. the Euclidean distance D = sqrt((Xrat - X)^2 + (Yrat - Y)^2). If the distance is smaller than the radius of the circle, you have a hit. If you want, you can also define a circle around the rat and count a hit whenever the two circles intersect, i.e. whenever the distance is smaller than the sum of the two radii.
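
The same test as a minimal Python sketch (function and parameter names are illustrative):

    import math

    def circle_hit(rat_x, rat_y, x, y, radius, rat_radius=0.0):
        # Euclidean distance between the animal position and the circle center
        d = math.hypot(rat_x - x, rat_y - y)
        # With rat_radius = 0 this is a point-in-circle test; with a positive
        # rat_radius it becomes a circle-circle intersection test instead.
        return d <= radius + rat_radius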

The thing with shaders is that the data you send to them is used to generate the output in the projector, but is then gone: you can't read it back (technically this is not exactly true, but believe me, you don't want to read GPU data back because it's just too slow...).

The easiest way is to take the same parameters you send to the shader and use them either to compute the intersection directly in vector-space (probably the easiest), or to generate an image on the CPU to intersect with your computer vision pipeline (more costly, but tests every pixel). If you are generating images on the CPU, though, then maybe you just want to send those images to the GPU directly rather than redrawing the circle in the shader code. All of these are doable, with different trade-offs and implications.

EDIT: Yet another option, if you want shape-perfect collisions in vector-space, is to test for collision with every single point in the contour of the animal returned by FindContours.
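
A sketch of that contour-based variant (assuming a contour with the (N, 1, 2) shape that OpenCV's findContours returns; names are illustrative):

    import math

    def contour_hits_circle(contour, x, y, radius):
        # Report a hit if any point on the animal's contour falls inside the circle
        for px, py in contour.reshape(-1, 2):
            if math.hypot(px - x, py - y) <= radius:
                return True
        return False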

Hope this helps, but do let me know if it would be useful to clarify anything about this in more detail.
Best,

Kensaku Nomoto

Apr 14, 2016, 7:10:07 PM
to Bonsai Users
Hi Gonçalo,

I would like to do the same thing that Xiong mentioned, but using the AnalogOutput of an NI DAQ.
My workflow is something like the attached image.

Without a sink like AudioPlayback or AnalogOutput it seemed to work, judging from the output of ExpressionTransform (which is actually Item1 ? Item2 : Item3). However, once I added AnalogOutput, it didn't seem to send the sequence. I used 441 samples of a sine wave and 1000 samples of a scalar value, and set the buffer size to 1000 and the sampling rate to 1000 Hz in AnalogOutput (I don't understand how to set these parameters, though).

Any comments would be appreciated!

Best,
Kensaku
workflow.png

goncaloclopes

Apr 15, 2016, 5:28:45 AM
to Bonsai Users
Hi Kensaku,

The AudioPlayback node actually requires a constant stream of inputs in order to work, meaning that if it gets to the end of your buffer, playback will stop.

MouseMove, on the other hand, will only generate events when you physically move the mouse, so it probably cannot generate buffers at the required rate. One way to work around this is to have another clock regularly sampling the current state of the buffers at a high enough frequency.

Can you try something like this?


What I did was connect a second FunctionGenerator that does nothing except Sample from CombineLatest. Basically, this node is simply used as a master clock to decide when it is time to send another buffer to the output.
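
Outside Bonsai, the underlying pattern is just a timer that keeps re-emitting the most recent value. A rough plain-Python sketch of the idea (not Bonsai code; names are illustrative):

    import threading
    import time

    latest = None  # most recent buffer from the slow/irregular source
    lock = threading.Lock()

    def on_new_buffer(buf):
        # Called whenever the upstream source (e.g. MouseMove) produces a value.
        global latest
        with lock:
            latest = buf

    def master_clock(rate_hz, send):
        # Re-emit the current buffer once per clock tick, so the sink
        # always receives a steady stream regardless of the source rate.
        while True:
            time.sleep(1.0 / rate_hz)
            with lock:
                buf = latest
            if buf is not None:
                send(buf)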

Regarding the parameters, if I understand correctly, both FunctionGenerator and ScalarBuffer should send out buffers of the same size (I am assuming you only want to change the state of the output line in order to simulate a digital output). I believe this size does not necessarily need to match the size of the AnalogOutput internal ring buffer, but I'm not entirely sure (it's been some time since I've used these nodes, and I don't have an NI-DAQ system with me at the moment).
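
As a sanity check on the numbers from your message: at a 1000 Hz sampling rate, a 1000-sample buffer covers one second of output, so the master clock has to deliver a fresh buffer at least once per second to keep the stream going:

    sample_rate_hz = 1000
    buffer_size = 1000
    buffer_duration_s = buffer_size / sample_rate_hz  # = 1.0 s
    min_clock_rate_hz = 1.0 / buffer_duration_s       # at least one buffer per second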

Let me know if you run into any difficulties.

Pedro Feliciano

Jan 25, 2017, 5:38:51 PM
to Bonsai Users
Hi all,

I'm new to Bonsai. I have the same question; however, my setup uses a Blackfly GigE Point Grey camera for video tracking. My goal is to activate the phase detector plugin of the Open-Ephys GUI when the animal (a mouse) is located in the region of interest. Can anyone provide me with an updated workflow to do that? At the same time, I would like to save the original video and the XYT position data. Thank you in advance.

goncal...@gmail.com

Jan 25, 2017, 5:54:06 PM
to Pedro Feliciano, Bonsai Users

Hi Pedro,

This sounds like another great idea to include as a built-in example! I will add this in the next couple of days and reply back to this thread to discuss the OpenEphys communication.


Gonçalo Lopes

Feb 2, 2017, 6:49:00 PM
to Pedro Feliciano, Bonsai Users
Hi Pedro,

Very sorry for the extended delay. I have finally uploaded an example to the Gallery that should cover this use case. If you open Bonsai and look in the menu Tools->Gallery and search for the Roi Trigger example, you should be able to find it.

The example basically demonstrates how to trigger an effect (in this case playing a sound), conditioned on movement detected in an ROI. There are many variants that could be designed depending on what signal you want to detect, and what effect you want to cause, but hopefully this can help you get started.

Do let me know if you have questions on the example and in general any feedback on whether it is useful.

Pedro Feliciano

Feb 10, 2017, 3:54:06 PM
to Bonsai Users, pdr...@gmail.com
Hi Goncalo,

Thank you for your quick reply. I have downloaded the ROI Trigger example; however, it gives me the following error (see the attached picture) when using it with the FlyCapture module.
I'm wondering what could be wrong?
*FlyCaptureSDK2 is installed and working properly on my computer.
Thank you for your help.

Best,
Pedro


Auto Generated Inline Image 1

Gonçalo Lopes

Feb 10, 2017, 5:54:27 PM
to Pedro Feliciano, Bonsai Users
Hi Pedro,

The FlyCapture node does not send out just an image, but rather a FlyCaptureDataFrame, which includes both the image and all the available embedded metadata (timestamp, frame counter, GPIO pin states, etc.).

In order to use the Crop, you need to select the Image that is part of the data frame. To do this, you can right-click on the FlyCapture node and select Image from the Output drop-down.

Hope this helps.


Pedro Feliciano

Feb 10, 2017, 6:30:10 PM
to Bonsai Users, pdr...@gmail.com
Thank you for the quick response. I think there is a problem with the source.image node (see the attached picture). I cannot see the PointGrey video.

Best,
Pedro
Auto Generated Inline Image 1

goncal...@gmail.com

Feb 10, 2017, 8:29:20 PM
to Pedro Feliciano, Bonsai Users, pdr...@gmail.com

Hmm, this looks like a problem with your OpenGL drivers. Can you visualize a normal webcam on that computer, or even just a video? If you switch to the ObjectTextVisualizer, can you see image information (right-click on the source.image node -> Visualizer -> ObjectTextVisualizer)?

If you cannot visualize any images, I would suggest trying to update your graphics card drivers.

Pedro Feliciano

Feb 11, 2017, 5:42:02 PM
to Bonsai Users, pdr...@gmail.com
Thanks, installing the OpenGL drivers fixed the problem. However, the problem now is that Bonsai crashes when using the FlyCapture node (see the attached picture). Let me know if you want me to post this in another thread.

Best,
Pedro
Auto Generated Inline Image 1

Gonçalo Lopes

Feb 11, 2017, 6:11:18 PM
to Pedro Feliciano, Bonsai Users
Hmmm, yeah, this looks like a different issue.

My guess would be that the PointGrey camera is having USB link issues. If you open the FlyCap Viewer (the app installed with the FlyCap SDK), do you see any "Link Recovery Counts" or "Skipped Frames" marked in red and growing while the camera is running? If so, you are having USB cable issues and should try switching cables or using a different USB port on the back of the computer.

You can also check the relevant troubleshooting section on the PointGrey website.

If the problem persists, let's debug it in another post (start a new question).


Pedro Feliciano

Feb 15, 2017, 7:23:15 PM
to Bonsai Users, pdr...@gmail.com
You were absolutely right. Just for the record, I did the following: 1) I installed the GigE performance driver, 2) I enabled jumbo packets, and 3) I turned on packet resend. These steps eliminated the "image consistency errors" and prevented the Bonsai crashes. In addition, in case somebody is interested, I have attached a workflow which I successfully used to activate an LED connected to an Arduino. Thank you for all your help.

Best,
Pedro
PointGreyArduinoTrackingTrigger.bonsai