-----------------------------------------------------
Gary Lupyan - lup...@wisc.edu
Assistant Professor of Psychology
University of Wisconsin, Madison
http://mywebspace.wisc.edu/lupyan/web/
-----------------------------------------------------
I'd love to support more hardware like this natively in PsychoPy (e.g.
in the Builder), but:
a) I (or someone) need the device to test on
b) it takes a while to write the code for each device
One day...
cheers,
Jon
--
Dr. Jonathan Peirce
Nottingham Visual Neuroscience
We've got an EyeLink here in the lab and I'm in the process of writing
code for an eye-tracking experiment. I will post updates.
Manuel
--
Manuel Spitschan
UG Research Assistant/Lab Programmer
University of St Andrews
Vision Lab
School of Psychology
South Street
St Andrews
KY16 9JP
E-mail: ms...@st-andrews.ac.uk
A good way to start would be to download the Pylink library from the SR
website [1] -- note that a login is required to access the file. There
are some examples in the files provided but, if I recall correctly,
they're for pygame and VisionEgg. That doesn't matter, though, because
the Pylink library doesn't care which software you use for stimulus
display. The example code is therefore a good starting point: you can
strip out all the pygame/VisionEgg calls and replace them with your
PsychoPy code.
Eventually you will also have to come to grips with the EyeLink API; how
much of it you need obviously depends on the type of experiment you're
running, that is, whether you just want to record eye movements
time-locked to stimulus onset, or whether you want to do gaze- or
saccade-contingent stuff.
[1] https://www.sr-support.com/forums/showthread.php?t=14
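For example, connecting to the tracker takes only a couple of lines (the
100.1.1.1 address below is the EyeLink host PC's factory default -- an
assumption, so adjust it for your own setup):
| import pylink
|
| # Connect to the tracker over the Ethernet link; 100.1.1.1 is
| # the default host PC address (check your own configuration)
| el = pylink.EyeLink("100.1.1.1")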
One thing I did struggle with is forcing it to do drift correction on
demand, i.e. how do you force it to realign the track based on the
error between target position and eye position?
-----------------------------------------------------
Gary Lupyan - lup...@wisc.edu
Assistant Professor of Psychology
University of Wisconsin, Madison
http://mywebspace.wisc.edu/lupyan/web/
-----------------------------------------------------
After playing around with this for quite some time now, I think the best
way to integrate Pylink into PsychoPy is to use the Pylink module from
the SR Research forum (log-in required), have the calibration done with
the pygame code supplied by SR Research, and only then move to the main
experiment in PsychoPy (as suggested by Britt earlier). That is, take
their example code and strip it down to the bare minimum -- namely the
commands to communicate with the host PC, and the calibration routine --
and do everything else in PsychoPy.
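As a rough sketch, that calibration step using pylink's bundled
pygame-based graphics might look like this (the screen resolution here
is an assumption):
| import pylink
|
| # Open pylink's built-in (pygame-based) calibration display;
| # the 1024x768 / 32-bit settings are only an example
| pylink.openGraphics((1024, 768), 32)
|
| # Bring up the tracker's camera setup / calibration screen
| pylink.getEYELINK().doTrackerSetup()
|
| # Close the calibration display before handing over to PsychoPy
| pylink.closeGraphics()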
Prior to the experiment, you have to define the name of the EDF file in
which all data are recorded, using
pylink.getEYELINK().openDataFile(edf_file_name). Note that the name has
an 8-character limit (in total, i.e. including the file extension).
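For example (the file name is arbitrary, as long as it fits the limit):
| import pylink
|
| # "test.edf" is 8 characters including the extension
| edf_file_name = "test.edf"
| pylink.getEYELINK().openDataFile(edf_file_name)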
You can start the real-time mode with pylink.beginRealTimeMode(100) and
start recording with pylink.getEYELINK().startRecording(1, 1, 1, 1).
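Put together (the argument to beginRealTimeMode is a delay in ms, and
the four flags to startRecording control writing samples/events to the
EDF file and sending samples/events over the link, respectively):
| import pylink
|
| # Enter real-time mode, allowing 100 ms for the mode switch
| pylink.beginRealTimeMode(100)
|
| # Record samples and events to the EDF file (first two flags)
| # and make them available over the link (last two flags)
| error = pylink.getEYELINK().startRecording(1, 1, 1, 1)
| if error:
|     raise RuntimeError("startRecording failed: %d" % error)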
If you want to recalibrate every N trials, and do drift correction every
M trials, I would put a conditional statement at the beginning of the
trial loop that executes the pygame stuff, along the lines of the sketch
below.
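A rough sketch (N, M, n_trials and the screen-centre coordinates cx, cy
are hypothetical names, defined elsewhere); doDriftCorrect is also the
answer to Gary's question about forcing a drift correction on demand:
| for trial in range(n_trials):
|     if trial > 0 and trial % N == 0:
|         # Full recalibration every N trials
|         pylink.getEYELINK().doTrackerSetup()
|     elif trial > 0 and trial % M == 0:
|         # Drift correction at the screen centre every M trials;
|         # the last two flags draw the target and allow escaping
|         # into the setup screen
|         pylink.getEYELINK().doDriftCorrect(cx, cy, 1, 1)
|     # ... run the trial in PsychoPy ...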
If you want to do gaze-contingent stuff, the best way would be to define
a function get_current_gaze that returns the gaze x-y position. It gets
the most recent sample from the EyeLink with
pylink.getEYELINK().getNewestSample() -- which checks for a sample
update -- and then reads the gaze position with
el_sample.getRightEye().getGaze() or el_sample.getLeftEye().getGaze().
Prior to this, and contingent on whether you do monocular or binocular
tracking, you want to check which eye is available using
pylink.getEYELINK().eyeAvailable().
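A sketch of such a function (assuming monocular tracking;
pylink.LEFT_EYE and pylink.RIGHT_EYE are the constants eyeAvailable
returns):
| import pylink
|
| def get_current_gaze():
|     """Return the newest (x, y) gaze position, or None."""
|     sample = pylink.getEYELINK().getNewestSample()
|     if sample is None:
|         return None
|     # eyeAvailable() returns 0 (left), 1 (right) or 2 (binocular)
|     eye = pylink.getEYELINK().eyeAvailable()
|     if eye == pylink.RIGHT_EYE and sample.isRightSample():
|         return sample.getRightEye().getGaze()
|     elif eye != pylink.RIGHT_EYE and sample.isLeftSample():
|         return sample.getLeftEye().getGaze()
|     return None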
A convenient way to record, mark or tag events in the EyeLink EDF file
is pylink.getEYELINK().sendMessage("$MESSAGE"). This adds an event
containing $MESSAGE to the FEVENT structure of the EDF file.
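For instance, to time-stamp a stimulus onset (the message text is
free-form):
| # Tag stimulus onset for trial 1 in the EDF file
| pylink.getEYELINK().sendMessage("TRIAL_1_STIM_ONSET")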
After the whole experimental session, you need to close the link to the
eye tracker and receive the data file:
| # End real-time mode, take the tracker offline and allow
| # 600 ms for the mode switch
| pylink.endRealTimeMode()
| pylink.getEYELINK().setOfflineMode()
| pylink.msecDelay(600)
| # Close the EDF file on the EyeLink host PC, copy it over to
| # the display PC, and close the link
| pylink.getEYELINK().closeDataFile()
| pylink.getEYELINK().receiveDataFile(edf_file_name, edf_file_name_out)
| pylink.getEYELINK().close()
This is fairly technical info, probably not too helpful for beginners,
but hopefully it will be of use to some.
Best,
Manuel
I agree, it would be nice to have this functionality to add eye-tracker
events in the Builder. I never use the Builder, though, so I would first
have to figure out how that works.