How about having a second thread just to monitor the eye-tracker
thread and construct a list of samples?
--Jeremy
my intuition is that a second thread would do the trick if it:
a) takes an eye-tracker thread as an argument
b) has a while loop in its run() method that simply detects whether a
value is available from the eye-tracker, and appends any such value to
an internal list, and
c) has a method that returns the stored list of eye-tracker values,
and resets the list back to empty
so it's nothing fancy, just a loop to see whether a given object has a
specific value set, and if so, append that value to an internal list.
the object in question happens to be another thread.
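in code, a minimal sketch of that idea (hypothetical names throughout; it assumes the eye-tracker thread exposes a non-blocking getSample() that returns None when no new value is available):

import threading
import time

class SampleMonitor(threading.Thread):
    """Accumulates samples produced by an eye-tracker thread."""
    def __init__(self, tracker):  # (a) takes the eye-tracker thread as an argument
        threading.Thread.__init__(self)
        self.tracker = tracker
        self.samples = []
        self.lock = threading.Lock()
        self.daemon = True  # don't keep the program alive on exit

    def run(self):  # (b) loop, appending any available value to an internal list
        while True:
            value = self.tracker.getSample()  # hypothetical non-blocking call
            if value is not None:
                with self.lock:
                    self.samples.append(value)
            time.sleep(0.0005)  # brief sleep so the loop doesn't hog the CPU

    def getSamples(self):  # (c) return the stored list and reset it to empty
        with self.lock:
            out, self.samples = self.samples, []
        return out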
--Jeremy
On Mon, Apr 16, 2012 at 9:26 AM, Denis-Alexander Engemann wrote:
stimulus.draw()
win.flip()
while thistrial:
    dosomethingwith(myeyetracker.currentgazecoordinates)
also keep track of whatever terminates the trial: timeout, response, etc.
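spelled out a little more (a sketch only; win, stimulus and myeyetracker are placeholders for objects set up earlier in your script):

from psychopy import core, event

trialClock = core.Clock()
thistrial = True
while thistrial:
    gazeX, gazeY = myeyetracker.currentgazecoordinates  # latest sample
    stimulus.setPos((gazeX, gazeY))  # e.g. make the stimulus gaze-contingent
    stimulus.draw()
    win.flip()  # one pass through the loop per screen refresh
    # keep track of whatever terminates the trial:
    if trialClock.getTime() > 5.0 or event.getKeys():
        thistrial = False  # timeout or keypress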
-----------------------------------------------------
Gary Lupyan - lup...@wisc.edu
Assistant Professor of Psychology
University of Wisconsin, Madison
http://sapir.psych.wisc.edu
-----------------------------------------------------
On Mon, Apr 16, 2012 at 8:26 AM, Denis-Alexander Engemann wrote:
I've only written ~6 thread classes (two examples in psychopy.hardware.emulator.py), and have not run into any GIL issues. So I could well be naive here.
-- Jonathan Peirce Nottingham Visual Neuroscience http://www.peirce.org.uk
The GIL prevents Python running on multiple cores, but doesn't stop you having multiple Python threads (interleaved) on a single core.
On 16/04/2012 15:14, Jeremy Gray wrote:
> I've only written ~6 thread classes (two examples in psychopy.hardware.emulator.py), and have not run into any GIL issues. So I could well be naive here.
So threads would be a good start, but you may run into issues: if you want to check the eyetracker at a very high rate you'll consume a lot of CPU time, and that will affect your rendering. Balancing the load between drawing and reading the tracker is likely to be an issue, I imagine.
There are also libraries to allow you to get around this and use separate cores (multiprocessing or parallel python), but then the communication between your processes needs to be managed. From my quick look at the multiprocessing docs (actually available from 2.6 but probably not yet in the Standalone PsychoPy - I'll add it) it looks like that should be reasonably straightforward:
from multiprocessing import Process, Pipe

def f(conn):
    conn.send([42, None, 'hello'])
    conn.send('something else')
    conn.close()  # if you're done with the connection

if __name__ == '__main__':
    parent_conn, child_conn = Pipe()
    p = Process(target=f, args=(child_conn,))
    p.start()
    print parent_conn.recv()  # prints "[42, None, 'hello']"
    p.join()
Even then, my guess is that there could be some performance hit sending data at 1.2 kHz between your processes. I'm wondering what you need the high rate of samples for? e.g. if your aim is to create a gaze-contingent display or something, because the display is only updating at 60 Hz, I imagine you'd only need to update your eye-gaze measure at the same rate?
sounds great to me. I was hoping you'd be interested :-)
--Jeremy
We use the SMI 1250 Hz system too and have been controlling it with PsychoPy for a few years now. That was actually before SMI released their Python SDK, so everything can be controlled without access to that (and so I've never felt the need to investigate it further, relying on direct UDP messaging).
The one outstanding thing for us to do is implement gaze-contingent responses, as our work doesn't really require that at the moment. But I would like to get back into it to examine some ideas in saccadic suppression and adaptation…
When we have discussed how to implement it, it was just by using a simple thread outside the main PsychoPy loop. There would be no need to subclass anything in pyglet etc or to use compiled code. One just monitors the UDP port that iView is sending to, which can be done in pure Python code in a cross-platform way. Then one would be getting iView samples at something like the full 1250 Hz speed, and be able to do online saccade/fixation classification, or filter out erroneous position values. Then in PsychoPy itself, there would be no need to do anything faster than the frame rate. i.e. just once a frame, query the thread to get the latest saccade/fixation state or filtered location, but those values would be based on the last n samples, and not just on the latest single sample.
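For anyone who hasn't tried it, listening to that UDP stream needs nothing beyond the standard library. A minimal sketch (the port number is a placeholder; the packet format is whatever iView is configured to send, see its manual):

import socket

UDP_PORT = 4444  # placeholder: whatever port iView streams to

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(('', UDP_PORT))  # listen on all interfaces

while True:
    data, addr = sock.recvfrom(4096)  # blocks until a packet arrives
    print(data)  # each sample arrives as a plain-text message to be parsed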
I'm happy to share our existing code which handles direct control of iView, calibration, etc, in pure Python/PsychoPy code. Might also be the spur I need to implement the gaze contingent thread stuff: perhaps we could benefit from each other's work there?
Cheers,
Michael
--
Michael R. MacAskill, PhD michael....@nzbri.org
Research Director,
New Zealand Brain Research Institute
66 Stewart St http://www.nzbri.org/macaskill
Christchurch 8011 Ph: +64 3 3786 072
NEW ZEALAND Fax: +64 3 3786 080
>> Even then, my guess is that there could be some performance hit sending data at 1.2 kHz between your processes. I'm wondering what you need the high rate of samples for? e.g. if your aim is to create a gaze-contingent display or something, because the display is only updating at 60 Hz, I imagine you'd only need to update your eye-gaze measure at the same rate?
>
> This is true; however, a sampling rate above 60 Hz might be useful for classifying responses as saccades or fixations to or in ROIs, e.g. on the basis of their velocity profile, or the mean and variance of the gaze coordinates as provided by a sliding-window buffer (numpy array). My current approach is OK for detecting fixations but somewhat less so for calculating reaction times or analyzing movement characteristics online. I think a somewhat higher sampling rate might add some precision.
Hi,
As Denis says, 60 Hz is not fast enough to react to the onset of a saccade. The gaze processing generally needs to be faster than the display update rate, as one can't usefully detect saccade onset based on a comparison of where the eye was 16 ms ago. We need velocity, position, or acceleration thresholds based on multiple successive samples. Even simple position tracking (e.g. just detecting which stimulus a person is looking at) should be based on multiple samples rather than a single instantaneous measurement, because the signal can be noisy, contain blinks, etc. But the good news is that 1250 Hz is indeed overkill, and anything above 200 Hz is sufficient for most saccade-contingent tasks. So as long as the second thread can operate at that sort of rate, everything should be good, i.e. missed samples aren't a problem as long as the samples themselves are accurately time-stamped.
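To illustrate the point, a crude velocity criterion over a buffer of recent samples might look like this (a sketch only; the threshold and window length are arbitrary placeholders, not recommendations):

import numpy as np

def saccade_started(t, x, y, vel_thresh=30.0):
    """t: timestamps (s); x, y: gaze position (deg), most recent sample last."""
    t, x, y = np.asarray(t), np.asarray(x), np.asarray(y)
    vel = np.hypot(np.diff(x), np.diff(y)) / np.diff(t)  # deg/s per interval
    # average the last few intervals so one noisy sample can't trigger it
    return vel[-3:].mean() > vel_thresh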
We have done (pre-PsychoPy) effective intrasaccadic stimulus changes on a 60 Hz display with a 200 Hz analog eye tracker, but the speed of the display can be limiting (i.e. one can be limited to dealing with large-amplitude saccades whose durations exceed the frame interval). The main-sequence relationship means that small saccades are completed within one or two refreshes of a 60 Hz screen. Ideally, one should use a faster display (e.g. a CRT/DLP at 100 Hz or above). LCD displays are still often limited not only by the 60 Hz refresh but also by additional absolute lag times.
Cheers,
Mike
After a year or two of putting this off, it only took an hour or two to implement a thread to monitor the real time output from the SMI iView X. Even on my personal laptop, running Mail, Excel and web browsers etc, I had to **slow down** the thread so that it wasn't polling faster than the 1250 Hz UDP stream. Python didn't get above 13% of one core in CPU usage (the Terminal was using about 50% but that may have been due to all the printing going on).
That is, in the run() method of the monitoring thread:
# method which runs when the thread is started:
def run(self):
    while True:
        if self.__stop:
            break
        print(self.receiveNoBlock())
        time.sleep(0.0005)
If the time.sleep() call is not there, the thread runs so fast that it gathers many more empty values from the UDP port than actual samples. With the small sleep value above, the mean time between samples was 0.800 ms (i.e. exactly 1250 Hz), with near 0 standard deviation.
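(Interval statistics like those are easy to check from the packets' own time stamps, along these lines — a sketch, assuming a list of microsecond stamps collected by the thread:)

import numpy as np

# stamps_us: list of microsecond time stamps, one per received packet
intervals_ms = np.diff(np.array(stamps_us)) / 1000.0  # microseconds -> ms
print(intervals_ms.mean(), intervals_ms.std())  # ~0.800 ms, ~0 SD at 1250 Hz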
iView's ET_STR command (i.e. "start streaming") has an optional sample rate parameter, so one could set it to, say, 500 Hz, put a longer sleep time in the thread, and that would hopefully leave enough time for PsychoPy to do its stuff (setting a lower output rate isn't necessary, though, as we could just sample at a lower rate from the 1250 Hz stream. Each packet contains its own microsecond time stamp, so irregular sampling isn't too problematic).
Have tested this stuff just in Python so far, only with text output of the eye tracker samples. Next step is to wire it into an actual PsychoPy task, extract the eye pixel coordinates and get a gaze-controlled image implemented. That will be the proof of the pudding as to whether simple threads give the necessary performance, or if the multiprocessing approach is necessary.
Cheers,
Mike
On 17.04.2012, at 05:43, Michael MacAskill <michael....@otago.ac.nz> wrote:
>
>> So threads would be a good start, but you may run into issues: if you want to check the eyetracker at a very high rate you'll consume a lot of CPU time, and that will affect your rendering. Balancing the load between drawing and reading the tracker is likely to be an issue, I imagine.
>>
>> There are also libraries to allow you to get around this and use separate cores (multiprocessing or parallel python), but then the communication between your processes needs to be managed. From my quick look at the multiprocessing docs (actually available from 2.6 but probably not yet in the Standalone PsychoPy - I'll add it) it looks like that should be reasonably straightforward:
>
> After a year or two of putting this off, it only took an hour or two to implement a thread to monitor the real time output from the SMI iView X. Even on my personal laptop, running Mail, Excel and web browsers etc, I had to **slow down** the thread so that it wasn't polling faster than the 1250 Hz UDP stream. Python didn't get above 13% of one core in CPU usage (the Terminal was using about 50% but that may have been due to all the printing going on).
>
> That is, in the run event of the monitoring thread:
>
> # method which runs when the thread is started:
> def run(self):
>     while True:
>         if self.__stop:
>             break
>         print(self.receiveNoBlock())
What kind of method is .receiveNoBlock()? Or is it just for demo purposes?
>         time.sleep(0.0005)
>
> If the time.sleep() call is not there, the thread runs so fast that it gathers many more empty values from the UDP port than actual samples. With the small sleep value above, the mean time between samples was 0.800 ms (i.e. exactly 1250 Hz), with near 0 standard deviation.
>
Excellent!!!
> iView's ET_STR command (i.e. "start streaming") has an optional sample rate parameter, so one could set it to, say, 500 Hz, put a longer sleep time in the thread, and that would hopefully leave enough time for PsychoPy to do its stuff (setting a lower output rate isn't necessary, though, as we could just sample at a lower rate from the 1250 Hz stream. Each packet contains its own microsecond time stamp, so irregular sampling isn't too problematic).
Good point.
>
> Have tested this stuff just in Python so far, only with text output of the eye tracker samples. Next step is to wire it into an actual PsychoPy task, extract the eye pixel coordinates and get a gaze-controlled image implemented. That will be the proof of the pudding as to whether simple threads give the necessary performance, or if the multiprocessing approach is necessary.
>
> Cheers,
>
> Mike
>
Let's compare results at some point. That would be interesting / helpful.
Cheers, denis
On 17.04.2012, at 00:50, Michael MacAskill <michael....@otago.ac.nz> wrote:
> Hi Denis,
>
> We use the SMI 1250 Hz system too and have been controlling it with PsychoPy for a few years now. That was actually before SMI released their Python SDK, so everything can be controlled without access to that (and so I've never felt the need to investigate it further, relying on direct UDP messaging).
>
This is good to hear!
> The one outstanding thing for us to do is implement gaze-contingent responses, as our work doesn't really require that at the moment. But I would like to get back into it to examine some ideas in saccadic suppression and adaptation…
>
> When we have discussed how to implement it, it was just by using a simple thread outside the main PsychoPy loop. There would be no need to subclass anything in pyglet etc or to use compiled code. One just monitors the UDP port that iView is sending to, which can be done in pure Python code in a cross-platform way.
This is an approach I haven't thought about yet. Sometimes it may be good just not to have an API. I should learn more about the UDP port.
> Then one would be getting iView samples at something like the full 1250 Hz speed, and be able to do online saccade/fixation classification, or filter out erroneous position values. Then in PsychoPy itself, there would be no need to do anything faster than the frame rate. i.e. just once a frame, query the thread to get the latest saccade/fixation state or filtered location, but those values would be based on the last n samples, and not just on the latest single sample.
>
> I'm happy to share our existing code which handles direct control of iView, calibration, etc, in pure Python/PsychoPy code. Might also be the spur I need to implement the gaze contingent thread stuff: perhaps we could benefit from each other's work there?
>
I would be happy to have a look at your code and to share what I have done so far to control the SMI using PsychoPy. If this is of interest to you, we could switch to the developers list and keep on discussing this over there. Maybe we'll arrive at some solution worth implementing in the psychopy.hardware module.
Cheers,
Denis
> Cheers,
If any users are getting particularly addicted to the conversation they
can head over there! ;-)
http://groups.google.com/group/psychopy-dev/browse_thread/thread/cd4ad31bb68bc1c4
Jon
--
> # method which runs when the thread is started:
> def run(self):
>     while True:
>         if self.__stop:
>             break
>         print(self.receiveNoBlock())
>         time.sleep(0.0005)
>
> If the time.sleep() call is not there, the thread runs so fast that it gathers many more empty values from the UDP port than actual samples.
Maybe I'm being thick here, but couldn't you ditch the sleep() and do
a blocking recv()?
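i.e. something like this sketch (placeholder port as before; stopped and samples are hypothetical names, and a timeout lets the loop still notice a stop flag):

import socket

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(('', 4444))  # placeholder port
sock.settimeout(1.0)   # so the thread can still check a stop flag periodically

while not stopped:  # hypothetical stop flag
    try:
        data, addr = sock.recvfrom(4096)  # sleeps in the OS until data arrives
    except socket.timeout:
        continue
    samples.append(data)  # hypothetical sample buffer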
-n