Hi Gonçalo,
Thank you so much. I just tried the Bonsai workflow you sent me and it worked wonderfully!
Weeks ago I was still considering buying a commercial behavioral system to do the real-time stimulation experiment. But now, with your help, I can use Bonsai together with an Arduino to achieve my goals beautifully. It is so nice!
Thanks!
Xiong
Hi Pedro,
This sounds like another great idea to include as a built-in example! I will add this in the next couple of days and reply back to this thread to discuss the OpenEphys communication.
From: Pedro Feliciano
Sent: 25 January 2017 22:38
To: Bonsai Users
Subject: [bonsai-users] Re: Real-time stimulation conditioned to a region in space
Hi all,
I'm new to Bonsai. I have the same question; however, my setup uses a Blackfly GigE Point Grey camera for video tracking. My goal is to activate the phase detector plugin of the Open Ephys GUI when the animal (a mouse) is located in the region of interest. Can anyone provide me with an updated workflow to do that? At the same time, I would like to save the original video and the XYT position data. Thank you in advance.
On Monday, October 12, 2015 at 5:34:14 AM UTC-4, goncaloclopes wrote:
Hi Xiong, and welcome to the forums!
Attached you can find a workflow for activating an Arduino conditioned to a region in space. To get this working you need an Arduino board configured with Firmata; if you have questions about setting that up, you can refer to the last part of this thread.
The workflow checks whether the center of the largest tracked object is inside the defined bounding box. To redefine the bounding box, edit the properties of the RoiActivity node.
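For illustration only, here is a minimal Python sketch of the same logic outside Bonsai, assuming OpenCV 4 and pyFirmata; the serial port, pin number, threshold, and ROI coordinates are placeholders, and the attached workflow is still the intended way to run this.

# Rough sketch: largest-object centroid tested against a bounding box,
# driving an Arduino digital pin over Firmata while the animal is inside the ROI.
import cv2
from pyfirmata import Arduino

ROI = (100, 100, 200, 150)          # x, y, width, height of the region of interest (placeholder)
board = Arduino('COM3')             # Arduino running the standard Firmata sketch (placeholder port)
stim_pin = board.get_pin('d:13:o')  # digital output pin driving the stimulator (placeholder pin)

cap = cv2.VideoCapture(0)           # any camera OpenCV can open
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 50, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    inside = False
    if contours:
        largest = max(contours, key=cv2.contourArea)
        m = cv2.moments(largest)
        if m['m00'] > 0:
            cx, cy = m['m10'] / m['m00'], m['m01'] / m['m00']
            x, y, w, h = ROI
            inside = x <= cx <= x + w and y <= cy <= y + h
    stim_pin.write(1 if inside else 0)  # pin high while the centroid is inside the ROI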
Hope this helps,
On Monday, 12 October 2015 05:26:44 UTC+1, Xiong Xiao wrote:
Hi,
Can anyone tell me how to perform real-time stimulation conditioned to a region in space (Bonsai paper, Figure 4E)? Could anyone send me an example workflow for doing that?
Thanks in advance!
Xiong
Hmm, this looks like a problem with your OpenGL drivers. Can you visualize a normal webcam on that computer, or even just a video? If you switch to the ObjectTextVisualizer, can you see image information (right-click on the source.image node -> Visualizer -> ObjectTextVisualizer)?
If you cannot visualize any images, I would suggest trying to update your graphics card drivers.
From: Pedro Feliciano
Sent: 10 February 2017 23:30
To: Bonsai Users
Cc: pdr...@gmail.com
Subject: Re: [bonsai-users] Re: Real-time stimulation conditioned to a region in space
Thank you for the quick response. I think there is a problem with the source.image. I cannot see the Point Grey video.
Best,
Pedro
On Friday, February 10, 2017 at 5:54:27 PM UTC-5, goncaloclopes wrote:
Hi Pedro,
The FlyCapture node does not send out just an image, but rather a FlyCaptureDataFrame, which includes both the image and all available embedded metadata (timestamp, frame counter, GPIO pin states, etc.).
In order to use Crop, you need to select the Image that is part of the data frame. To do this, right-click on the FlyCapture node and select Image from the Output drop-down.
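As a rough Python analogy (not the actual FlyCaptureDataFrame type; the field names below are illustrative), the camera output is a bundle from which you pick out the image member before passing it on, which is what the Output drop-down does in the workflow:

from dataclasses import dataclass
import numpy as np

@dataclass
class FrameBundle:
    image: np.ndarray     # the raw camera frame
    timestamp: float      # embedded camera timestamp
    frame_counter: int    # embedded frame counter
    gpio_pins: int        # GPIO pin states packed into an integer

def crop(image: np.ndarray, x: int, y: int, w: int, h: int) -> np.ndarray:
    return image[y:y + h, x:x + w]

bundle = FrameBundle(np.zeros((480, 640), np.uint8), 0.0, 0, 0)
roi = crop(bundle.image, 100, 100, 200, 150)  # select the image member first, then crop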
Hope this helps.
On 10 February 2017 at 20:54, Pedro Feliciano <pdr...@gmail.com> wrote:
Hi Goncalo,
Thank you for your quick reply. I have downloaded the ROI Trigger; however, it gives me the following error (in the picture) when using it with the FlyCapture module.