Question regarding Raspberry Pi lick monitor


jlsaun...@gmail.com

Sep 29, 2020, 5:20:18 PM
to autopilot-users

Received via email (name removed for privacy):

Hello Jonny,

Hope this email finds you well. My name is [removed] and I am a user on GitHub. I have been reading and learning from your code for the autopilot Nafc task, and it has been really helpful for me, so thank you greatly.

If you don't mind, I have several questions. How do you monitor events such as licks or nose pokes throughout the task? Is this execution separate from the task (in a parallel process)? And do you save the timestamps of the events within the task itself, or would it be possible to save them in the separate process along with all the recorded events, as I mentioned above?

I hope you can answer some of my questions, and I want to say sorry ahead of time if I'm being imprudent. Thank you again for all your contributions and resources.

Best,
[removed]

jlsaun...@gmail.com

Sep 30, 2020, 6:45:45 PM
to autopilot-users
These are great questions (and not at all imprudent!)

```
How do you monitor events such as licks or nose pokes throughout the task?
```

Digital logic events can be monitored using the `hardware.Digital_In` class -- they are recorded by default. To use it in a task, you provide the parameters for the object in the `prefs.json` file, so you would have something like this in your `HARDWARE` dictionary in prefs:
```
"INPUT_1": {
    "name": "INPUT_1",
    "pin": 10,
    "record": true
}
```
This makes it so logic events are recorded as `(logic_level, timestamp)` tuples in the `events` attribute.
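
As a rough illustration (untested and from memory, so double-check the import path and exact names against the hardware docs), you could read those tuples back like this:
```
# rough sketch -- assuming Digital_In lives in autopilot.hardware.gpio
from autopilot.hardware.gpio import Digital_In

pin_in = Digital_In(pin=10, name="INPUT_1", record=True)

# ... let the task run for a while, then inspect what was recorded ...
for logic_level, timestamp in pin_in.events:
    print(timestamp, "high" if logic_level else "low")

pin_in.release()  # assumed cleanup method to free the pin
```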

If you would like to save all the events in the subject's data file, you would declare the data in the `ContinuousData` descriptor in the heading of a task (e.g. here: https://docs.auto-pi-lot.com/en/latest/guide.task.html#go-no-go-parameterization ), and then use the `assign_cb` method to assign a callback function that sends the logic level and timestamp back to the terminal whenever there is a logic event.
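
Sketching it out (again untested and from memory -- treat the column types, the `self.hardware['INPUT_1']` lookup, and the callback signature as placeholders rather than the exact API), a task might look something like:
```
# rough, untested sketch -- column types, the self.hardware lookup path,
# and the callback signature are placeholders, not the exact API
import datetime
import tables
from autopilot.tasks import Task


class Lick_Task(Task):
    """Hypothetical task that saves logic events as continuous data."""

    class ContinuousData(tables.IsDescription):
        # one row per logic event
        level = tables.BoolCol()
        timestamp = tables.StringCol(26)  # ISO-formatted time string

    def __init__(self, *args, **kwargs):
        super(Lick_Task, self).__init__(*args, **kwargs)
        self.init_hardware()
        # assign a callback that fires on every logic event -- how you reach
        # the instantiated pin object depends on how HARDWARE is nested in prefs
        self.hardware['INPUT_1'].assign_cb(self.on_event)

    def on_event(self, level):
        # package the event -- sending it back to the terminal (e.g. through
        # the task's networking node) is left out of this sketch
        event = {'level': bool(level),
                 'timestamp': datetime.datetime.now().isoformat()}
        return event
```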

It's a little clunky -- the goal is to mature the API to a point where data collection is automatic and hardware data can be streamed around the program with an `input_pin.event.connect(other_object.some_attribute)` syntax.

```
Is this execution separate from the task (in a parallel process)?
```

Yes! All the communication with the GPIO pins is done with pigpio, which runs as a separate process.

```
And do you save the timestamps of the events within the task itself, or would it be possible to save them in the separate process along with all the recorded events, as I mentioned above?
```

Let me know if I didn't answer this above :)

Felix Jng

Oct 23, 2020, 7:20:16 AM
to autopilot-users

Hi Jonny,

I am currently struggling a bit to record continuous data within autopilot, which I assume is mainly due to my very rudimentary understanding of how to best use a callback function for sending the logic level and timestamp to the terminal.
My aim is to record sync pulses at 30 Hz from another system throughout the session and store the data. The pin to record is configured in my prefs.json file like this (I am not sure about the event/trigger options):

        "S": {
                "event": "",
                "name": "S",
                "pin": 15,
                "polarity": 1,
                "pull": 0,
                "record": true,
                "trigger": "",
                "type": "gpio.Digital_In"
            }

and I am able to have callback functions called when the pin receives input (e.g. self.triggers['S'] = [*callback function*]). I also configured the ContinuousData descriptor in the task heading (so I now find an empty 'continuous_data' key next to 'trial_data' in my .h5 file).

My question now is: how do I return the continuous data (basically, the timestamps from the continuous inputs to my pin) to the terminal so that I can store it in my .h5 file? From your response above, I understand that the way to go would be to trigger a callback function each time the pin receives input, and that this function sends the timestamp (and logic level) to the terminal. Unfortunately, this is the point where I am currently struggling to understand how to solve the issue, and I would be very glad if you could help me. :)
Maybe as a side note: for now, I was thinking of recording the sync pulses with the same pilot that is controlling my task, i.e. not using a child pilot just for recording the sync pulse.

Thanks a lot in advance!
Best,
Felix

jlsaun...@gmail.com

Oct 25, 2020, 6:28:33 PM
to autopilot-users
Running out the door just now, but I need to think about the best way to do this --

The `triggers` (if they're an attribute of the task class) are cleared every time they are called, but I think that should be configurable in Task subclasses, so I'm making a note of that.

The networking modules have a method `get_stream` that is intended for exactly this; see an example of using it in the camera class here:
https://github.com/wehr-lab/autopilot/blob/6c437a926fbee2ee5c2e69d948b98d6c296d7e9b/autopilot/hardware/cameras.py#L355

where one just puts timestamps and values in a queue like this:
https://github.com/wehr-lab/autopilot/blob/6c437a926fbee2ee5c2e69d948b98d6c296d7e9b/autopilot/hardware/cameras.py#L294

So you would want to create a stream queue with your hardware object's `node` (it should have one from the Hardware metaclass), write a `stream` method that puts timestamps into the queue, and then use `add_callback` on the Digital_In object to have it called on every logic event.
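
Something like this, maybe (completely untested, and the `get_stream` argument names here are approximated from the camera code, so check them against its docstring):
```
# untested sketch -- assumes the object already has .node (a Net_Node),
# .name, and .subject attributes, and that get_stream takes these args
import datetime


class Streaming_Digital_In(object):
    """Hypothetical mixin/wrapper around a Digital_In-like object."""

    def init_stream(self):
        self.stream_q = self.node.get_stream(
            id="{}_stream".format(self.name),
            key='CONTINUOUS',   # message key the terminal listens for
            min_size=1,         # don't batch, send each event immediately
            subject=self.subject)

    def stream(self, level):
        # use this as the logic-event callback (add_callback / assign_cb)
        self.stream_q.put_nowait({
            'timestamp': datetime.datetime.now().isoformat(),
            self.name: level})

    def end_stream(self):
        # close the stream with an 'END' sentinel, like the camera class does
        self.stream_q.put_nowait('END')
```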

That answer is ugly to me, so I think I'll add a method similar to the Camera object's `stream` methods to the Digital_In class so it's easier to stream logic events (it seems like a basic thing everyone would want to do...).

Another way: if `record` is set, timestamps of logic events are stored in the `events` list, which you could then poll from another place in the Task object, but that seems even clunkier.
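
That polling could look something like this, called from wherever in the task you want to collect the events (just a sketch of the idea):
```
# sketch: drain whatever a Digital_In-like object has recorded so far
def drain_events(pin_in):
    """Pop and return any (logic_level, timestamp) tuples recorded so far."""
    drained = []
    while pin_in.events:
        drained.append(pin_in.events.pop(0))
    return drained
```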

Felix Jng

Oct 27, 2020, 4:08:47 AM
to autopilot-users
Great, thanks :)
I'll try to implement a solution using the stream method. One question on data storage: do I still need to declare the `ContinuousData` descriptor? And if I understand it correctly, I send the data in the queue (i.e. the pulses and timestamps) to the terminal each time a logic transition happens on the pin. How does the terminal handle (and store) this data? Sorry for all these questions and thanks a lot in advance :)

jlsaun...@gmail.com

Oct 27, 2020, 8:13:02 PM
to autopilot-users
Tell you what -- I have to prepare for presenting at Neuromatch on Friday, but afterwards I'll write a stream method for Digital_In devices to make this much easier.

A stream will group and batch events depending on the parameters you use to instantiate it; see those here (another place where I'm seeing the docs are woefully inadequate): https://docs.auto-pi-lot.com/en/latest/autopilot.core.networking.html#autopilot.core.networking.Net_Node.get_stream

Since the docs are so bad, you can check out the source for now to see how the params work: https://docs.auto-pi-lot.com/en/latest/_modules/autopilot/core/networking.html#Net_Node.get_stream

Batching is more important for larger data like video streams; for booleans and timestamps you could set min_size=1 to just send every sample as soon as you put it in the queue.
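
e.g. something like (same caveats as before -- `node` here stands for the hardware object's Net_Node, and the other argument names are approximate):
```
# no batching: every sample is sent as soon as it's put in the queue
stream_q = node.get_stream(id='S_stream', key='CONTINUOUS', min_size=1)
stream_q.put_nowait({'timestamp': '2020-10-27T12:00:00', 'S': True})
```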

You need some ContinuousData descriptor to indicate that the data-storage Subject class should be expecting it, yes... I initially wanted to have all data declarations be explicit in the interest of really clear data provenance, but that pretty naturally conflicts with effortlessness/making sure *everything* gets documented no matter what, so I think I might move towards making all data storage automatic.

All networked events are handled with "listen" callbacks that specify some action for different types of messages (i.e. messages with different "key"s). The "station" networking object first expands the streamed message (which is a container for multiple messages), and in the case of continuous data calls `l_continuous`, which sends the data to the plot widget and to the terminal for storage. You can see the details of how that works in the Subject class here: https://docs.auto-pi-lot.com/en/latest/_modules/autopilot/core/subject.html#Subject.data_thread

Please do not apologize for your questions! They are extremely useful for me to see where the program and docs need to be improved, and as you can see there are certainly a lot of places where they can be improved!!

Chris Rodgers

Oct 27, 2020, 8:26:46 PM
to autopilot-users
Oh cool, you're presenting at NMC? Everyone should try to go! Looks like it's Friday at 1:30pm Eastern.
https://neuromatch.io/abstract?submission_id=recI5D0QaJ857Y4JI

I'm hosting a seminar speaker right at that time, so I won't be able to make it, unfortunately. But I think they're recorded so I'll try to watch it offline.



jlsaun...@gmail.com

Oct 27, 2020, 9:00:39 PM
to autopilot-users
I'm absolutely charmed :) Yes, of course, I would love to see whoever can make it to the talk. I'll be talking broadly about the fever dream of the project, where it's going, and how we've been using it recently, including the DeepLabCut-Live stuff and integrating some very esoteric custom hardware into this very cool visual task! I'd be happy to talk about any visions y'all have for experiments you want to build :)

Felix Jng

Oct 28, 2020, 6:03:50 AM
to autopilot-users
Super cool, thanks! Looking forward to your talk and good luck with the prep :)

Mikkel Roald-Arbøl

Nov 2, 2020, 4:16:57 AM
to autopilot-users
If anyone wants to hear Jonny's talk, you can find it on YouTube (4:13:50-4:27:30) here: https://www.youtube.com/watch?v=OecxzqzmzRU :-D

jlsaun...@gmail.com

Nov 20, 2020, 7:16:13 PM
to autopilot-users
Opened an issue to track writing a stream method for Digital_In objects:

https://github.com/wehr-lab/autopilot/issues/43