Eyetracking and coroutines / multithreading for interactive evaluation of gaze data


Denis-Alexander Engemann

unread,
Apr 16, 2012, 8:01:43 AM4/16/12
to psychop...@googlegroups.com
Dear List,

I am looking for some help with regard to multithreading / sockets / subprocesses / coroutines in PsychoPy.

Background: 
Currently, I am preparing an eyetracking experiment in which subjects are asked to respond to dynamic face stimuli by performing saccades. The eyetracker (SMI) provides samples at 1250 Hz and comes with an API (iViewXaPI) that allows convenient access to the structs and functions of the related C header file / DLL from within Python. To better control data quality, I implemented an interactive routine that allows me to evaluate the subject's gaze position and to classify and process his or her responses online during the running experiment. This works fine, but unfortunately the API only supports single-sample updates instead of arrays or lists of the positions recorded since the last update (which would be PsychoPy's event.getKeys approach). Therefore, using a while loop like this:

while thistrial:
    stimulus.draw()
    win.flip()
    dosomethingwith(myeyetracker.currentgazecoordinates)

I won't be able to take advantage of the SMI's full sampling frequency, as the sample only gets updated once per iteration of a loop that waits for the win.flip() method. As a result, the effective sampling rate of my routine is reduced to my monitor's refresh rate.

Problem: How can I use the eyetracker's full, or at least a higher, sampling rate for interactive evaluation?
I have only a few rather vague ideas about how to improve my interactive gaze evaluation.

i) Draw more updates per loop iteration (this would be more of a hack and would just shift the problem: I could then draw only as many samples as the speed of the method and function calls between the first and the last line of my loop permits).
ii) See how event.getKeys() does it and maybe subclass / imitate it for processing eyetracking data interactively (event.getKeys relies on pygame, which presumably uses some compiled code that does not support "plugging in" my eyetracker in a straightforward way).
iii) Compile some code that continuously draws samples from the eyetracker's API and runs via a socket / subprocess / etc. (Unfortunately, I'm not really experienced with this. Moreover, I wonder whether such a process would be easy to launch and stop, that is, to tightly sync with my experiment.)

Does anyone have experience with this kind of problem?
Is it worth pursuing one of the possibilities mentioned?

I would be grateful for any kind of help or pointer.

Cheers,
Denis

Jeremy Gray

unread,
Apr 16, 2012, 9:16:48 AM4/16/12
to psychop...@googlegroups.com
Hi Denis,

How about having a second thread just to monitor the eye-tracker
thread and construct a list of samples?

--Jeremy


Denis-Alexander Engemann

unread,
Apr 16, 2012, 9:26:36 AM4/16/12
to psychop...@googlegroups.com
Hi Jeremy, 
thanks for your fast reply.

But how can I do that? My naive assumption was that the GIL does not allow for that.
As I tried to explain, I'm not yet experienced with multithreading; a glance at the standard library documentation (socket, threading, multiprocessing (only from Python 2.6)) was not that instructive for me. Is there any example in the PsychoPy source code you could recommend?

Denis

Jeremy Gray

unread,
Apr 16, 2012, 10:14:23 AM4/16/12
to psychop...@googlegroups.com
I've only written ~6 thread classes (two examples in
psychopy.hardware.emulator.py) and have not run into any GIL issues,
so I could well be naive here.

my intuition is that a second thread would do the trick if it:
a) takes an eye-tracker thread as an argument
b) has a while loop in its run() method that simply detects whether a
value is available from the eye-tracker, and appends any such value to
an internal list, and
c) has a method that returns the stored list of eye-tracker values,
and resets the list back to empty

So it's nothing fancy: just a loop to see whether a given object has a
specific value set and, if so, append that value to an internal list.
The object in question happens to be another thread.
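A rough sketch of (a)-(c); the `current_sample()` call here is a hypothetical placeholder for however the tracker thread exposes its latest value, not a real iViewX API method:

```python
import threading

class GazeSampler(threading.Thread):
    """Polls a tracker object and accumulates its samples in a list."""
    def __init__(self, tracker):
        threading.Thread.__init__(self)
        self.daemon = True
        self.tracker = tracker        # (a) the object exposing samples
        self.samples = []
        self.lock = threading.Lock()
        self.running = True

    def run(self):
        while self.running:           # (b) poll for new values
            sample = self.tracker.current_sample()  # hypothetical call
            if sample is not None:
                with self.lock:
                    self.samples.append(sample)

    def get_samples(self):
        # (c) return the stored samples and reset, like event.getKeys()
        with self.lock:
            collected, self.samples = self.samples, []
        return collected
```

In the experiment loop one would then call `sampler.get_samples()` once per frame, analogous to event.getKeys().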

--Jeremy


On Mon, Apr 16, 2012 at 9:26 AM, Denis-Alexander Engemann

Gary Lupyan

unread,
Apr 16, 2012, 10:17:36 AM4/16/12
to psychop...@googlegroups.com
This is only a problem if you need to move stuff around while keeping
track of eye-samples. If you just have a stationary stimulus and want
to capture the eye-data, move the drawing code out of the loop:

stimulus.draw()
win.flip()
while thistrial:
    dosomethingwith(myeyetracker.currentgazecoordinates)

Also keep track of whatever terminates the trial: timeout, response, etc.

-----------------------------------------------------
Gary Lupyan - lup...@wisc.edu
Assistant Professor of Psychology
University of Wisconsin, Madison
http://sapir.psych.wisc.edu
-----------------------------------------------------

On Mon, Apr 16, 2012 at 8:26 AM, Denis-Alexander Engemann

Denis-Alexander Engemann

unread,
Apr 16, 2012, 10:37:00 AM4/16/12
to psychop...@googlegroups.com

Thanks Gary,
As I am studying responses to dynamic stimuli, this unfortunately does not work; I have used that approach in other cases ;-)

@Jeremy,
thanks, this sounds really encouraging, and thanks for pointing this out. Actually, I was looking for something like that in the hardware module, but the emulator class/module was not obvious (I had to explicitly import it from psychopy.hardware).
I will try to construct something similar to the psychopy.hardware.emulator.ResponseEmulator class.
Might this actually be of interest for the hardware module, maybe together with some convenience methods I wrote for handling the SMI tracker?

Best,
Denis

Jonathan Peirce

unread,
Apr 16, 2012, 10:57:11 AM4/16/12
to psychop...@googlegroups.com


On 16/04/2012 15:14, Jeremy Gray wrote:
I've only written ~6 thread classes (two examples in
psychopy.hardware.emulator.py), and have not run into any GIL issues.
so I could well be naive here.
The GIL prevents python running on multiple cores, but doesn't stop you having multiple python threads (interleaved) on a single core.

So threads would be a good start, but if you're wanting to check the eyetracker at a very high rate you may run into issues: you'll consume a lot of CPU time, and that will affect your rendering. Balancing the load between drawing and reading the tracker is likely to be an issue, I imagine.

There are also libraries that allow you to get around this and use separate cores (multiprocessing or Parallel Python), but then the communication between your processes needs to be managed. From my quick look at the multiprocessing docs (actually available from 2.6, but probably not yet in the Standalone PsychoPy - I'll add it) it looks like that should be reasonably straightforward:

    from multiprocessing import Process, Pipe

    def f(conn):
        conn.send([42, None, 'hello'])
        conn.send('something else')
        conn.close()  # if you're done with the connection

    if __name__ == '__main__':
        parent_conn, child_conn = Pipe()
        p = Process(target=f, args=(child_conn,))
        p.start()
        print parent_conn.recv()   # prints "[42, None, 'hello']"
        p.join()

Even then, my guess is that there could be some performance hit sending data at 1.2 kHz between your processes. I'm wondering what you need the high rate of samples for? E.g., if your aim is to create a gaze-contingent display or something: because the display is only updating at 60 Hz, I imagine you'd only need to update your eye-gaze measure at the same rate?

Jon
-- 
Jonathan Peirce
Nottingham Visual Neuroscience

http://www.peirce.org.uk


Denis-Alexander Engemann

unread,
Apr 16, 2012, 11:10:58 AM4/16/12
to psychop...@googlegroups.com
Thanks for this extended and helpful comment, Jon


Am 16. April 2012 16:57 schrieb Jonathan Peirce <jonatha...@nottingham.ac.uk>:


> On 16/04/2012 15:14, Jeremy Gray wrote:
>> I've only written ~6 thread classes (two examples in
>> psychopy.hardware.emulator.py) and have not run into any GIL issues,
>> so I could well be naive here.
>
> The GIL prevents Python running on multiple cores, but doesn't stop you having multiple Python threads (interleaved) on a single core.
>
> So threads would be a good start, but if you're wanting to check the eyetracker at a very high rate you may run into issues: you'll consume a lot of CPU time, and that will affect your rendering. Balancing the load between drawing and reading the tracker is likely to be an issue, I imagine.
>
> There are also libraries that allow you to get around this and use separate cores (multiprocessing or Parallel Python), but then the communication between your processes needs to be managed. From my quick look at the multiprocessing docs (actually available from 2.6, but probably not yet in the Standalone PsychoPy - I'll add it) it looks like that should be reasonably straightforward:


It seems both are available in the PyShell in PsychoPy 1.73.05.
 

>     from multiprocessing import Process, Pipe
>
>     def f(conn):
>         conn.send([42, None, 'hello'])
>         conn.send('something else')
>         conn.close()  # if you're done with the connection
>
>     if __name__ == '__main__':
>         parent_conn, child_conn = Pipe()
>         p = Process(target=f, args=(child_conn,))
>         p.start()
>         print parent_conn.recv()   # prints "[42, None, 'hello']"
>         p.join()

> Even then, my guess is that there could be some performance hit sending data at 1.2 kHz between your processes. I'm wondering what you need the high rate of samples for? E.g., if your aim is to create a gaze-contingent display or something: because the display is only updating at 60 Hz, I imagine you'd only need to update your eye-gaze measure at the same rate?


This is true; however, a sampling rate above 60 Hz might be useful for classifying responses as saccades to, or fixations in, ROIs, e.g., on the basis of their velocity profile, or the mean and variance of the gaze coordinates as provided by a sliding-window buffer (a numpy array). My current approach is OK for detecting fixations but less so for calculating reaction times or analyzing movement characteristics online. I think a somewhat higher sampling rate might add some precision.

Denis

Jeremy Gray

unread,
Apr 16, 2012, 11:14:54 AM4/16/12
to psychop...@googlegroups.com
> Might this actually be of potential interest for the hardware module, maybe
> together with some convenience methods I wrote for handling the SMI Tracker?

sounds great to me. I was hoping you'd be interested :-)

--Jeremy

Michael MacAskill

unread,
Apr 16, 2012, 6:50:50 PM4/16/12
to psychop...@googlegroups.com, Daniel Myall
Hi Denis,

We use the SMI 1250 Hz system too and have been controlling it with PsychoPy for a few years now. That was actually before SMI released their Python SDK, so everything can be controlled without access to it (and so I've never felt the need to investigate the SDK further, relying instead on direct UDP messaging).

The one outstanding thing for us to do is implement gaze-contingent responses, as our work doesn't really require that at the moment. But I would like to get back into it to examine some ideas in saccadic suppression and adaptation…

When we have discussed how to implement it, the idea was just to use a simple thread outside the main PsychoPy loop. There would be no need to subclass anything in pyglet etc. or to use compiled code. One just monitors the UDP port that iView is sending to, which can be done in pure Python in a cross-platform way. Then one would be getting iView samples at something like the full 1250 Hz speed, and be able to do online saccade/fixation classification, or filter out erroneous position values. In PsychoPy itself, there would then be no need to do anything faster than the frame rate: just once a frame, query the thread to get the latest saccade/fixation state or filtered location, with those values based on the last n samples rather than just the latest single sample.
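A minimal sketch of that monitoring thread; the port number is a placeholder, packets are kept as raw unparsed strings, and iView is assumed to already be streaming to this machine:

```python
import socket
import threading

class IViewListener(threading.Thread):
    """Collects raw iView sample datagrams arriving on a UDP port."""
    def __init__(self, port=4444):  # placeholder: match iView's output port
        threading.Thread.__init__(self)
        self.daemon = True
        self.sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        self.sock.bind(('', port))
        self.sock.settimeout(0.01)  # so the stop flag is checked regularly
        self.samples = []
        self.lock = threading.Lock()
        self.stopped = False

    def run(self):
        while not self.stopped:
            try:
                data, _ = self.sock.recvfrom(1024)
            except socket.timeout:
                continue
            with self.lock:
                self.samples.append(data)

    def latest(self, n=10):
        """Called once per frame from the main loop: last n raw samples."""
        with self.lock:
            return self.samples[-n:]
```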

I'm happy to share our existing code which handles direct control of iView, calibration, etc, in pure Python/PsychoPy code. Might also be the spur I need to implement the gaze contingent thread stuff: perhaps we could benefit from each other's work there?

Cheers,

Michael

--
Michael R. MacAskill, PhD michael....@nzbri.org
Research Director,
New Zealand Brain Research Institute

66 Stewart St http://www.nzbri.org/macaskill
Christchurch 8011 Ph: +64 3 3786 072
NEW ZEALAND Fax: +64 3 3786 080


Michael MacAskill

unread,
Apr 16, 2012, 7:12:42 PM4/16/12
to psychop...@googlegroups.com

>> Even then, my guess is that there could be some performance hit sending data at 1.2Khz between your processes. I'm wondering what you need the high rate of samples for? eg. If your aim is to create a gaze-contingent display or something, because the display is only updating at 60Hz, I imagine you'd only need to update your eye-gaze measure at the same rate?
>
> This is true; however, a sampling rate above 60hz might be useful for classifying responses as saccades or fixations to or in ROIs; e.g., on the basis of their velocity profile, the mean and the variance of the gaze coordinate as provided by a sliding window buffer (numpy array). My current approach is ok for detecting fixations but somewhat less for calculating reaction times or analyzing movement characteristics online. I think a somewhat higher sampling rate might add some precision.

Hi,

As Denis says, 60 Hz is not fast enough to react to the onset of a saccade. The gaze processing generally needs to be faster than the display update rate as one can't usefully detect saccade onset based on a comparison of where the eye was 16 ms ago. We need velocity, position, or acceleration thresholds based on multiple successive samples. Even simple position tracking (e.g. just detecting which stimulus a person is looking at), should be based on multiple samples rather than a single instantaneous measurement, because the signal can be noisy, contain blinks, etc. But the good news is that 1250 Hz is indeed overkill, and anything above 200 Hz is sufficient for most saccade-contingent tasks. So as long as the second thread could operate at that sort of rate, everything should be good. i.e. missed samples aren't a problem as long as the samples themselves are accurately time-stamped.
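For illustration, a velocity criterion over a sliding window of samples might be sketched like this; the 30 deg/s threshold and the degree units are illustrative assumptions, not recommendations:

```python
import numpy as np

def is_saccade(times, xs, ys, threshold=30.0):
    """Classify a window of gaze samples as saccadic if the mean
    point-to-point velocity exceeds `threshold`.

    times  : sample timestamps in seconds
    xs, ys : gaze positions in degrees of visual angle
    """
    times = np.asarray(times, dtype=float)
    xs = np.asarray(xs, dtype=float)
    ys = np.asarray(ys, dtype=float)
    dt = np.diff(times)                        # seconds between samples
    dist = np.hypot(np.diff(xs), np.diff(ys))  # degrees between samples
    velocity = dist / dt                       # deg/s, per interval
    return velocity.mean() > threshold
```

Because each packet carries its own time stamp, the same computation works even if the window contains irregularly spaced samples.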

We have done (pre-PsychoPy) effective intrasaccadic stimulus changes on a 60 Hz display with a 200 Hz analog eye tracker, but the speed of the display can be limiting (i.e. one can be limited to dealing with large amplitude saccades whose durations exceed the frame interval). The main sequence relationship can mean that small saccades are completed within one or two refreshes of a 60 Hz screen. Ideally, one should use a faster display (e.g. 100 Hz and above CRT/DLP). LCD displays are still often limited not only by the 60 Hz refresh but also additional absolute lag times.

Cheers,

Mike

Michael MacAskill

unread,
Apr 16, 2012, 11:43:58 PM4/16/12
to psychop...@googlegroups.com

> So threads would be a good start, but I you may run into issues if you're wanting to check the eyetracker at a very high rate you'll consume a lot of CPU time and that will affect your rendering. Balancing the load between drawing and rending is likely to be an issue I imagine.
>
> There are also libraries to allow you to get around this and use separate cores (multiprocessing or parallel python), but then the communication between your processes needs to be managed. From my quick look at multiprocessing docs (actually available from 2.6 but probably not yet in the Standalone PsychoPy - I'll add it) it looks like that should be reasonably straight forward:

After a year or two of putting this off, it only took an hour or two to implement a thread to monitor the real-time output from the SMI iView X. Even on my personal laptop, running Mail, Excel and web browsers etc, I had to **slow down** the thread so that it wasn't polling faster than the 1250 Hz UDP stream. Python didn't get above 13% of one core in CPU usage (the Terminal was using about 50%, but that may have been due to all the printing going on).

That is, in the run event of the monitoring thread:

# method which will run when the thread is called:
def run(self):
    while True:
        if self.__stop:
            break
        print(self.receiveNoBlock())
        time.sleep(0.0005)

If the time.sleep() call is not there, the thread runs so fast that it gathers many more empty values from the UDP port than actual samples. With the small sleep value above, the mean time between samples was 0.800 ms (i.e. exactly 1250 Hz), with near 0 standard deviation.

iView's ET_STR command (i.e. "start streaming") has an optional sample rate parameter, so one could set it to, say, 500 Hz, put a longer sleep time in the thread, and that would hopefully leave enough time for PsychoPy to do its stuff (setting a lower output rate isn't necessary, though, as we could just sample at a lower rate from the 1250 Hz stream. Each packet contains its own microsecond time stamp, so irregular sampling isn't too problematic).
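A hedged sketch of sending that command over UDP; the host, port, and exact command syntax are assumptions here and should be checked against the iView X manual:

```python
import socket

def start_streaming(host, port, rate=500):
    """Send iView's ET_STR command with its optional sample-rate
    argument (command name per this thread; verify the authoritative
    syntax in the iView X manual). Returns the bytes sent."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    msg = ('ET_STR %d\n' % rate).encode('ascii')
    sock.sendto(msg, (host, port))
    sock.close()
    return msg
```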

Have tested this stuff just in Python so far, only with text output of the eye tracker samples. Next step is to wire it into an actual PsychoPy task, extract the eye pixel coordinates and get a gaze-controlled image implemented. That will be the proof of the pudding as to whether simple threads give the necessary performance, or if the multiprocessing approach is necessary.

Cheers,

Mike


Denis A. Engemann

unread,
Apr 17, 2012, 1:29:52 AM4/17/12
to psychop...@googlegroups.com, psychop...@googlegroups.com
Thanks, Michael, this is really encouraging; I fiddled around a bit with it and wrote a thread as well, but your solution seems even simpler / more straightforward. I'm still on my way to the lab but will try this for my current experiment as soon as I have arrived ;-)

On 17.04.2012, at 05:43, Michael MacAskill <michael....@otago.ac.nz> wrote:

>
>> So threads would be a good start, but I you may run into issues if you're wanting to check the eyetracker at a very high rate you'll consume a lot of CPU time and that will affect your rendering. Balancing the load between drawing and rending is likely to be an issue I imagine.
>>
>> There are also libraries to allow you to get around this and use separate cores (multiprocessing or parallel python), but then the communication between your processes needs to be managed. From my quick look at multiprocessing docs (actually available from 2.6 but probably not yet in the Standalone PsychoPy - I'll add it) it looks like that should be reasonably straight forward:
>
> After a year or two of putting this off, it only took an hour or two to implement a thread to monitor the real time output from the SMI iView X. Even on my personal laptop, running Mail, Excel and web browsers etc, I had to **slow down** the thread so that it wasn't polling faster than the 1250 Hz UDP stream. Python didn't get above 13% of one core in CPU usage (the Terminal was using about 50% but that may been due to all the printing going on).
>
> That is, in the run event of the monitoring thread:
>
> # method which will run when the thread is called:
> def run(self):
>     while True:
>         if self.__stop:
>             break
>         print(self.receiveNoBlock())

What kind of method is .receiveNoBlock()? Or is it just for demo purposes?

>         time.sleep(0.0005)
>
> If the time.sleep() call is not there, the thread runs so fast that it gathers many more empty values from the UDP port than actual samples. With the small sleep value above, the mean time between samples was 0.800 ms (i.e. exactly 1250 Hz), with near 0 standard deviation.
>

Excellent!!!

> iView's ET_STR command (i.e. "start streaming") has an optional sample rate parameter, so one could set it to, say, 500 Hz, put a longer sleep time in the thread, and that would hopefully leave enough time for PsychoPy to do its stuff (setting a lower output rate isn't necessary, though, as we could just sample at a lower rate from the 1250 Hz stream. Each packet contains its own microsecond time stamp, so irregular sampling isn't too problematic).

Good point.

>
> Have tested this stuff just in Python so far, only with text output of the eye tracker samples. Next step is to wire it into an actual PsychoPy task, extract the eye pixel coordinates and get a gaze-controlled image implemented. That will be the proof of the pudding as to whether simple threads give the necessary performance, or if the multiprocessing approach is necessary.
>
> Cheers,
>
> Mike
>

Let's compare results at some point. That would be interesting / helpful.

Cheers, denis

Denis A. Engemann

unread,
Apr 17, 2012, 1:54:48 AM4/17/12
to psychop...@googlegroups.com

On 17.04.2012, at 00:50, Michael MacAskill <michael....@otago.ac.nz> wrote:

> Hi Denis,
>
> We use the SMI 1250 Hz system too and have been controlling it with PsychoPy for a few years now. That was actually before SMI released their Python SDK, so everything is actually able to be controlled without access to that (and so I've never felt the need to investigate it further, relying on direct UDP messaging).
>

This is good to hear!

> The one outstanding thing for us to do is implement gaze-contingent responses, as our work doesn't really require that at the moment. But I would like to get back into it to examine some ideas in saccadic suppression and adaptation…
>
> When we have discussed how to implement it, it was just by using a simple thread outside the main PsychoPy loop. There would be no need to subclass anything in pyglet etc or to use compiled code. One just monitors the UDP port that iView is sending to, which can be done in pure Python code in a cross-platform way.

This is an approach I haven't thought about yet. Sometimes it may be good just not to have an API. I should learn more about the UDP port.

> Then one would be getting iView samples at something like the full 1250 Hz speed, and be able to do online saccade/fixation classification, or filter out erroneous position values. Then in PsychoPy itself, there would be no need to do anything faster than the frame rate. i.e. just once a frame, query the thread to get the latest saccade/fixation state or filtered location, but those values would be based on the last n samples, and not just on the latest single sample.
>
> I'm happy to share our existing code which handles direct control of iView, calibration, etc, in pure Python/PsychoPy code. Might also be the spur I need to implement the gaze contingent thread stuff: perhaps we could benefit from each other's work there?
>

I would be happy to have a look at your code and to share what I have done so far to drive the SMI using PsychoPy. If this is of interest to you, we could switch to the developers list and keep discussing this over there. Maybe we will arrive at some solution worth implementing in the psychopy.hardware module.

Cheers,

Denis

> Cheers,

Jonathan Peirce

unread,
Apr 17, 2012, 5:22:27 AM4/17/12
to psychop...@googlegroups.com
This all sounds great guys. I agree with Denis that it's probably worth
moving discussion to the dev list.

If any users are getting particularly addicted to the conversation they
can head over there! ;-)
http://groups.google.com/group/psychopy-dev/browse_thread/thread/cd4ad31bb68bc1c4

Jon

--

Nate Vack

unread,
Apr 17, 2012, 12:23:21 PM4/17/12
to psychop...@googlegroups.com
On Mon, Apr 16, 2012 at 10:43 PM, Michael MacAskill
<michael....@otago.ac.nz> wrote:

>    # method which will run when the thread is called:
>    def run (self):
>        while True:
>            if self.__stop:
>                break
>            print (self.receiveNoBlock())
>            time.sleep(0.0005)
>
> If the time.sleep() call is not there, the thread runs so fast that it gathers many more empty values from the UDP port than actual samples.

Maybe I'm being thick here, but couldn't you ditch the sleep() and do
a blocking recv()?
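A sketch of that suggestion: a blocking recvfrom() with a short timeout, where the timeout exists only so a stop flag can be checked, not to pace the samples (socket setup is assumed to match the earlier posts):

```python
import socket

def receive_samples(sock, should_stop, timeout=0.1):
    """Gather datagrams with a blocking recvfrom() instead of a
    sleep/poll loop; samples arrive as fast as the tracker sends
    them, with no empty reads."""
    sock.settimeout(timeout)  # lets the loop notice should_stop()
    samples = []
    while not should_stop():
        try:
            data, _ = sock.recvfrom(1024)
        except socket.timeout:
            continue  # no packet within `timeout`; re-check stop flag
        samples.append(data)
    return samples
```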

-n
