The rest of this message is about the notion of threading/multiprocessing.
I think using the multiprocessing module is going to be the way to go
(and it looks like it /is/ already included on the Standalone
distribution). I expect any code you've already written to work on
multiple threads will translate really easily to that module and run on
multiple cores, and would have a performance advantage whenever more
than one core is available (nearly all recent machines). On a single
core machine there might be a very slight additional overhead, but
otherwise it will operate just as a second thread alternating for
control of the core. Potentially on a second core you could do some more
detailed analysis of the inputs and just send back the result when
needed, or simply keep storing the current result in some shared location.
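As a rough illustration of that last idea (this is not PsychoPy code; the `analyse` function and the input values are invented), a worker process on a second core could keep storing its latest result in a shared location that the main process reads whenever it needs it:

```python
# Hedged sketch: run analysis in a separate process and keep the
# current result in shared memory. "analyse" is a placeholder.
import multiprocessing

def analyse(value):
    # placeholder for a more detailed analysis of an input sample
    return value * 2

def worker(in_queue, latest):
    # keep storing the current result in a shared location;
    # None on the queue is the stop sentinel
    for value in iter(in_queue.get, None):
        latest.value = analyse(value)

if __name__ == "__main__":
    in_queue = multiprocessing.Queue()
    latest = multiprocessing.Value('d', 0.0)  # shared double
    p = multiprocessing.Process(target=worker, args=(in_queue, latest))
    p.start()
    for sample in (1.0, 2.0, 3.0):
        in_queue.put(sample)
    in_queue.put(None)  # ask the worker to finish
    p.join()
    print(latest.value)  # last analysed result: 6.0
```

The main process never blocks on the analysis itself; it just reads `latest.value` when needed.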
Having had a little play with this it looks fairly straightforward and I
might well add a class to hardware/__init__ like
class AsyncProcess():
    def addFunction(self, f, args=[]):
        """Add a function that you want to run as part of the
        process. Multiple functions could be run in turn.
        """
    def readOutputs(self):
        """Read the contents of an output buffer, containing
        anything that the functions have returned in running.
        """
Already I'm thinking that this is going to be the easiest way to handle
the new methods for fetching keyboard events too in a separate process.
Jon
On 17/04/2012 04:43, Michael MacAskill wrote:
>> So threads would be a good start, but you may run into issues: if you're wanting to check the eyetracker at a very high rate you'll consume a lot of CPU time, and that will affect your rendering. Balancing the load between drawing and eyetracker polling is likely to be an issue, I imagine.
>>
>> There are also libraries to allow you to get around this and use separate cores (multiprocessing or parallel python), but then the communication between your processes needs to be managed. From my quick look at the multiprocessing docs (actually available from 2.6, but probably not yet in the Standalone PsychoPy - I'll add it) it looks like that should be reasonably straightforward:
> After a year or two of putting this off, it only took an hour or two to implement a thread to monitor the real time output from the SMI iView X. Even on my personal laptop, running Mail, Excel and web browsers etc, I had to **slow down** the thread so that it wasn't polling faster than the 1250 Hz UDP stream. Python didn't get above 13% of one core in CPU usage (the Terminal was using about 50%, but that may have been due to all the printing going on).
>
> That is, in the run event of the monitoring thread:
>
> # method which will run when the thread is started:
> def run(self):
>     while True:
>         if self.__stop:
>             break
>         print(self.receiveNoBlock())
>         time.sleep(0.0005)
>
> If the time.sleep() call is not there, the thread runs so fast that it gathers many more empty values from the UDP port than actual samples. With the small sleep value above, the mean time between samples was 0.800 ms (i.e. exactly 1250 Hz), with near 0 standard deviation.
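That check is just arithmetic on the inter-sample intervals, e.g. (with made-up timestamps standing in for the real %TU values):

```python
# Hedged sketch: deriving the effective sample rate from microsecond
# timestamps, as in the 0.800 ms / 1250 Hz figures quoted above.
timestamps_us = [0, 800, 1600, 2400, 3200]  # ideal 1250 Hz timestamps
intervals_ms = [(b - a) / 1000.0
                for a, b in zip(timestamps_us, timestamps_us[1:])]
mean_ms = sum(intervals_ms) / len(intervals_ms)
print(mean_ms)           # 0.8 ms between samples
print(1000.0 / mean_ms)  # i.e. 1250.0 Hz
```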
>
> iView's ET_STR command (i.e. "start streaming") has an optional sample rate parameter, so one could set it to, say, 500 Hz, put a longer sleep time in the thread, and that would hopefully leave enough time for PsychoPy to do its stuff (setting a lower output rate isn't necessary, though, as we could just sample at a lower rate from the 1250 Hz stream. Each packet contains its own microsecond time stamp, so irregular sampling isn't too problematic).
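Assuming the ET_FRM format above produces sample datagrams shaped like `ET_SPL <timestamp> <x> <y>` (the `ET_SPL` token and exact field layout should be checked against the iView manual), extracting the timestamp and gaze coordinates is a one-liner split:

```python
# Hedged sketch: parse one iView datagram into (timestamp_us, x, y),
# assuming the ET_FRM format '"%ET %TU %SX %SY"' set above.
def parse_sample(line):
    fields = line.split()
    if len(fields) < 4 or fields[0] != "ET_SPL":
        return None  # not a streamed sample (e.g. a command reply)
    timestamp_us = int(fields[1])
    x, y = float(fields[2]), float(fields[3])
    return timestamp_us, x, y

print(parse_sample("ET_SPL 4403261229 511.8 383.2"))
# → (4403261229, 511.8, 383.2)
```

Since each packet carries its own timestamp, irregular sampling can be handled downstream from these tuples.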
>
> Have tested this stuff just in Python so far, only with text output of the eye tracker samples. Next step is to wire it into an actual PsychoPy task, extract the eye pixel coordinates and get a gaze-controlled image implemented. That will be the proof of the pudding as to whether simple threads give the necessary performance, or if the multiprocessing approach is necessary.
>
> Cheers,
>
> Mike
>
>
--
Jonathan Peirce
Nottingham Visual Neuroscience
> Maybe I'm being thick here, but couldn't you ditch the sleep() and do
> a blocking recv()?
For testing purposes, by not blocking, I was able to see how fast the thread would max out, which seemed to be about 10 times as fast as the 1250 Hz stream (i.e. there were about 10 null receives for every real packet). It also let me simulate (with larger values of the sleep time) what would happen if the thread was only able to sample slower than 1250 Hz due to competing activities.
Agree that blocking receives would be easier to deal with, as they would avoid still receiving occasional null packets, and would probably be the way to go when doing this for real. But the blocking is a black box to me: I have no idea whether that would still entail a very tight (but hidden under the hood) polling loop until a valid packet is received. The control freak in me likes explicitly releasing time to other activities via sleep. But no doubt the under-the-hood stuff is documented somewhere.
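For what it's worth, a blocking recv() on a UDP socket does not spin in a hidden polling loop: the process sleeps in the kernel until a datagram arrives. A middle ground between busy-polling and blocking forever is socket.settimeout(), which blocks for at most a given interval, so the thread still gets periodic chances to check its stop flag. A minimal sketch (standalone, not wired to iView):

```python
# Hedged sketch: a bounded blocking receive via settimeout().
import socket

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(('127.0.0.1', 0))  # any free port, just for illustration
sock.settimeout(0.01)        # block for at most 10 ms per recv()
try:
    data = sock.recv(4096)
except socket.timeout:
    data = None              # no packet arrived within the timeout
sock.close()
print(data)  # nothing was sent here, so this prints None
```

In the monitoring thread this replaces the sleep(): each loop iteration either returns a packet promptly or yields the CPU for up to the timeout.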
Cheers,
Mike
> I would be happy to have a look at your code and to share what I have done so far to master the SMI using PsychoPy. If this is of interest for you we could switch to the developers list and keep on discussing this over there. Maybe we arrive at some solution worth implementing in the psychopy.hardware module.
Hi Denis,
Here is some simple code for just displaying the data stream. We have a lot more for detailed control of the eye tracker, which I'll send you directly as it won't be of interest to most others.
As Jon suggests, we should shift this to multiprocessing for its various advantages.
****First file: test.py:****
# test gaze thread
import gazeContingent
import time
iViewThread = gazeContingent.GazeContingent()
iViewThread.start()
time.sleep(5)
"""just sit here for n seconds monitoring the data stream. In practice,
this is where PsychoPy would be doing its stuff, hopefully yielding sufficient
time to the thread.
You should see n seconds worth of data being printed to the terminal.
Need to add code to the thread to extract the gaze position, detect saccades,
etc, and allow these to be queried periodically."""
# then signal the thread to stop gracefully:
iViewThread.stop()
****Second file: named gazeContingent.py ****
# Class to monitor iView eye tracker in real time in a separate thread
# =====================================================================
#
# Example Usage:
#
# import gazeContingent
# gazeThread = gazeContingent.GazeContingent()
# gazeThread.start()
# #do something here
# gazeThread.stop()
#
# 2012-04-17: v0.1: initial implementation
#
# Written by Michael MacAskill <michael....@nzbri.org>
import threading # this class is a thread sub-class
import socket # it listens on a UDP port
import time # for the sleep function
class GazeContingent(threading.Thread):
    # initialise:
    # use defaults appropriate to your lab IP addresses and ports:
    def __init__(self, port=5555, iViewIP="192.168.110.63", iViewPort=4444):
        # create self as a thread:
        threading.Thread.__init__(self)
        # UDP port to listen for iView data on.
        # Set the iView software to duplicate its stream to this port number,
        # so that we don't conflict with the listening and sending on the main
        # port number.
        # Ports that we receive and send on:
        self.port = port
        self.iViewPort = iViewPort
        # address to send messages to iView:
        self.iViewIP = iViewIP
        # bind to all interfaces:
        self.host = '0.0.0.0'
        # set up the socket:
        self.sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        # the size of the buffer we use for receiving:
        self.buffer = 4096
        # bind to the local host and port:
        self.sock.bind((self.host, self.port))
        # get iView to start streaming data:
        self.send('ET_FRM "%ET %TU %SX %SY"')  # set the format of the datagram (see iView manual)
        self.send('ET_STR')  # start streaming (can also add an optional integer to specify the rate)
        self.__stop = False

    def receiveNoBlock(self):
        # copied from iview.py
        """Get any data that has been received.
        If there is no data waiting it will return immediately.
        Returns data (or 0 if nothing)."""
        self.sock.setblocking(0)
        try:
            data = self.sock.recv(self.buffer)
        except socket.error:
            return 0
        else:
            return data

    def receiveBlock(self):
        # copied from iview.py
        """Get any data that has been received, or wait until some is.
        If there is no data waiting it will block until some is received.
        Returns data."""
        self.sock.setblocking(1)
        data = self.sock.recv(self.buffer)
        return data

    def send(self, message):
        # send messages to iView when required
        # note: iView requires a trailing newline (\n)
        entire_message = message + "\n"
        try:
            self.sock.sendto(entire_message, (self.iViewIP, self.iViewPort))
        except socket.error:
            print("Could not send UDP message:")
            print(message)

    # method which will run when the thread is started:
    def run(self):
        i = 0
        while True:
            if self.__stop:
                break
            i = i + 1
            print(i, self.receiveNoBlock())
            # could receive with block and skip the sleep()
            time.sleep(0.0005)

    # so the caller can ask for the thread to stop monitoring:
    def stop(self):
        self.send("ET_EST")  # tell iView to stop streaming
        self.__stop = True   # the run() method monitors this flag
Michael R. MacAskill, PhD michael.maca...@nzbri.org