EyeLink II eye-tracker


shark_scott

Jan 28, 2011, 2:09:04 AM
to psychopy-users
Hi,
I was wondering if there is any plan to integrate communication
with the EyeLink II eye-tracker from SR Research. They already have
a component written in Python. My supervisor wants to run an eye-
tracking experiment with this particular eye-tracker, and I thought
PsychoPy would be the best bet for running it.
Thanks!
Mark

Gary Lupyan

Jan 28, 2011, 2:10:36 AM
to psychop...@googlegroups.com
Mark, you can just use the Pylink library to control the eye tracker
and use PsychoPy functions to control stimulus presentation and collect
responses.

-----------------------------------------------------
Gary Lupyan - lup...@wisc.edu
Assistant Professor of Psychology
University of Wisconsin, Madison
http://mywebspace.wisc.edu/lupyan/web/
-----------------------------------------------------


Jonathan Peirce

Jan 28, 2011, 6:20:51 AM
to psychop...@googlegroups.com
Exactly as Gary says. If you're using the Standalone PsychoPy then
you'll need to know how to download and add that module to the PsychoPy
path (making it accessible to PsychoPy scripts). For info on that, see
this thread for now:
http://groups.google.com/group/psychopy-users/browse_thread/thread/9a4eb71e0f418d9a

I'd love to support more hardware like this natively in PsychoPy (e.g.
in the Builder), but:
a) I (or someone) need the device to test on
b) it takes a while to write the code for each device

One day...

cheers,
Jon

--
Dr. Jonathan Peirce
Nottingham Visual Neuroscience

http://www.peirce.org.uk/



Manuel Spitschan

Jan 28, 2011, 7:06:02 AM
to psychop...@googlegroups.com

We've got an Eyelink here in the lab and I'm in the process of writing
code for an eye-tracking experiment. I will post updates.

Manuel

--
Manuel Spitschan
UG Research Assistant/Lab Programmer

University of St Andrews
Vision Lab
School of Psychology
South Street
St Andrews
KY16 9JP

E-mail: ms...@st-andrews.ac.uk

shark_scott

Jan 29, 2011, 12:25:17 PM
to psychopy-users
Hi all,
Thanks very much for the information - sounds like the task will
be less daunting than I feared. Manuel, it's great to hear you'll
have some code that will help - can't wait to take a look at it -
thanks!
Mark

Manuel Spitschan

Jan 29, 2011, 12:42:23 PM
to psychopy-users

A good way to start would be to download the Pylink library from the SR
Research website [1] -- note that a login is required to access the file.
There are some examples in the files provided, but if I recall correctly
they're for pygame and VisionEgg. That doesn't matter much, because the
Pylink library doesn't care which software you use for stimulus display.
So the example code there might be a good starting point: you could just
strip out all the pygame/VisionEgg parts and replace them with your
PsychoPy code.

Eventually you will also have to come to grips with the EyeLink API; how
much depends on the type of experiment you're running, that is, whether
you just want to record eye movements time-locked to stimulus onset, or
whether you want to do gaze- or saccade-contingent stuff.

[1] https://www.sr-support.com/forums/showthread.php?t=14

shark_scott

Feb 1, 2011, 9:02:02 PM
to psychopy-users
Hi,
Thanks for the information - I'm a ways away from getting into the
guts of the API, but I'll have to take a look in a while. I have
access to the website, so I'll be trying to get to grips with it.
Thanks!
Mark

He Jibo

Feb 1, 2011, 9:05:07 PM
to psychop...@googlegroups.com
I would like to hear about solutions for combining PsychoPy with pylink. I have read through pylink's example code several times, and have used their Experiment Builder for three years. Let me know if you run into any problems.

---------------------------
He Jibo
Department of Psychology,
Beckman Institute for Advanced Science and Technology
University of Illinois, Urbana Champaign,
603 East Daniel St.,
Champaign, IL 61820
website: http://hejibo.appspot.com/file/index.html



Britt

Feb 2, 2011, 9:53:34 AM
to psychopy-users
I have recently begun using PsychoPy and the EyeLink. I am not an
expert, but I have gotten things to work. If you are an expert, you
won't learn anything here. If you are a fellow amateur, here are some
thoughts on my recent experience.

EyeLink will talk to all platforms: Windows, OS X, and Linux, but I
have only used it with Windows and Linux.

Windows is much easier, and even if, like me, you plan on using things
under Linux, it will help you get up to speed if you can begin playing
on the Windows side of a dual-boot machine. Also, I have not gotten
the EyeLink companion program (Data Viewer) to work on the Linux side.
Even if you don't plan on using that in the long term, it helps to get
you started. There is a program that runs under Linux
(http://www.cogsci.nl/software/elascii) that will do some of the same
things.

Another advantage of beginning to "play" under Windows is that you can
easily project the tracker computer's screen onto your Windows display
machine; this doesn't seem to work for me under Linux because of some
missing Adobe fonts.

The EyeLink system has a reasonable learning curve. If you are new to
EyeLink and Pylink, I'd advise you to stay on the Windows side at first,
using the EyeLink example programs, like Track. Once you are comfortable
with EyeLink itself, the transition to Python will be much easier.

The pylink module available from SR Research has different versions
for different versions of Python. Make sure that you get the right
one. Once you have it, make sure you put the pylink module folder
somewhere your version of Python (the one that runs PsychoPy) can find
it. You also need to rename the folder to pylink, instead of pylink2.X
or whatever. (N.B. Although the automated standalone PsychoPy
installers were great for me in the beginning, I now just install all
the dependencies, grab the psychopy.zip file, expand it, and do an
old-fashioned python setup.py configure ... to put things where I can
find them. This may only work for me, because I never use any of the
Builder features. If you use the standalone version, there will be a
lot of people who can tell you exactly where to move the pylink
folder.)

The easiest way to check that you have the right version of pylink
installed and in the right place is to open a Python interpreter from
a terminal or command window and "import pylink". If that doesn't give
you an error, you are over one hurdle. If Python can't find it, you
need to update the path Python searches (you can do this with the sys
module or by moving the pylink folder). If you have the wrong version
of pylink for your version of Python, you will get an error that
mentions __init__. I think I also had to create one symlink for a
library EyeLink wanted that was older than what I had.
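To make that import check concrete, here is a small sketch of the sys-module route mentioned above. The directory name and the helper are purely illustrative (my own, not part of pylink or PsychoPy); substitute wherever you actually unpacked the module:

```python
import sys

# Hypothetical parent directory of the unpacked "pylink" folder --
# replace with wherever you actually put it.
PYLINK_PARENT = "/opt/sr-research"

def ensure_on_path(directory, path_list=None):
    """Append *directory* to a module search path (sys.path by default)
    if it is not already present, and return the list."""
    if path_list is None:
        path_list = sys.path
    if directory not in path_list:
        path_list.append(directory)
    return path_list

# Usage sketch -- after this, "import pylink" should succeed if the
# pylink version matches your Python version:
# ensure_on_path(PYLINK_PARENT)
# import pylink
```
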

If you do all the above, then you are mostly there. The basic method
I use is to run calibration and validation before I create the
visual.Window. Then you can use pygame or pyglet or whatever you
prefer. For me this means taking my PsychoPy script and adding
"import pylink as pl". Then the basic recipe is well illustrated in the
examples that come with the pylink archive.

Basically,

el = pl.EyeLink()  # creates an instance of the EyeLink object

Then you send commands to tell the EyeLink what the screen size and
parse method are. You might choose to change some calibration
parameters, or you might not. Then you can start the calibration with
el.doTrackerSetup(). After that is done, call el.setOfflineMode().

Now you create your PsychoPy window, and for each trial in your
protocol you start by making sure you are connected to the eye tracker
and start recording.

At the end of the trial, stop recording and put the tracker in offline
mode. When you loop to the next trial, it all starts again.
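A bare-bones sketch of that per-trial recipe. The pylink calls follow the description above and the pylink examples; I haven't verified every signature, so treat it as a sketch rather than a definitive implementation. The "TRIALID n" message format is the convention the SR Data Viewer expects:

```python
try:
    import pylink as pl   # only available where the SR Research module is installed
except ImportError:
    pl = None

def trial_message(trial_idx):
    """Message marking trial onset in the EDF file; 'TRIALID n' is the
    convention the SR Data Viewer looks for."""
    return "TRIALID %d" % trial_idx

def run_trial(el, trial_idx, show_stimulus):
    """One trial: check the connection, record, run the stimulus, stop.
    *show_stimulus* is your own PsychoPy drawing/response code."""
    if el.isConnected():
        el.sendMessage(trial_message(trial_idx))
        el.startRecording(1, 1, 1, 1)   # samples + events, to EDF and link
    result = show_stimulus()
    if el.isConnected():
        el.stopRecording()
        el.setOfflineMode()
    return result
```
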

This is the most basic setup. You can also do drift correction at the
start of each trial (the examples folder from pylink contains a
VisionEgg program that does this; the relevant commands have nothing to
do with VisionEgg, I am just pointing you to that example program to
look at).

The other decision to make is whether you want to send data from your
PsychoPy program to the eye tracker, or grab data from the eye tracker
into your PsychoPy program. You do the former by sending messages. For
example, if you record the reaction time in a variable RT, you might
do:

rtmsg = 'trialRT = %.4f' %RT
el.sendMessage(rtmsg)

This formats a string (you can only send strings as messages) and then
sends it to the eye tracker. It will end up as a time-stamped line in
the eye tracker's EDF file. If you do this for all your important
output, you can get everything in the EDF file, interwoven with the
eye-tracking data.

If your protocol requires gaze-contingent actions, then you have no
choice but to grab EyeLink data into your program. For that there are
different commands like getNextSample() and getLatestSample() (or, for
saccades, fixations, and blinks: the Event equivalents). These are all
detailed in the pylink manual PDF. You will then have to parse what you
get back to find the data you want (which eye, pixel coordinates (GAZE)
or head-referenced rotations (HREF)) and convert it into units that
will mean something to PsychoPy. PsychoPy does have conversion
functions that will take you from pixels to other units once you have
the data you want.
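As a concrete example of that last conversion step, here is a small helper (my own, not a PsychoPy function) mapping EyeLink GAZE coordinates, whose origin is the top-left corner with y increasing downwards, into PsychoPy 'pix' units, whose origin is the screen centre with y increasing upwards:

```python
def eyelink_to_psychopy_pix(gx, gy, screen_w, screen_h):
    """Map an EyeLink GAZE sample (origin top-left, y down) to
    PsychoPy 'pix' units (origin at screen centre, y up).
    Assumes both coordinate systems share the same pixel grid."""
    x = gx - screen_w / 2.0
    y = screen_h / 2.0 - gy
    return x, y
```

For example, the centre of a 1024x768 display, (512, 384) in GAZE coordinates, maps to (0, 0) in 'pix' units; from there PsychoPy's own unit-conversion tools can take you on to deg, norm, etc.
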

It is rather clunky, and while it may seem a little complicated to get
going, once you have even a trivial example working you can mostly cut
and paste it into any PsychoPy script you have and it will work.

Mostly, I would like to encourage others to give it a try so that we
can have a bit more of a community trading experiences and examples.

Cheers,
Britt




Gary Lupyan

Feb 2, 2011, 1:18:50 PM
to psychop...@googlegroups.com
Thanks Britt! I've used pylink with VisionEgg and it worked quite well.
I have since transitioned to PsychoPy and will post a couple of code
examples when I port the eye tracker code (I have an EyeLink 1000, but
it should all work with the EyeLink II).

One thing I did struggle with is forcing it to do drift correction on
demand, i.e., how do you force it to realign the track based on the
error between target position and eye position?

-----------------------------------------------------
Gary Lupyan - lup...@wisc.edu
Assistant Professor of Psychology
University of Wisconsin, Madison
http://mywebspace.wisc.edu/lupyan/web/
-----------------------------------------------------


Manuel Spitschan

Mar 21, 2011, 5:05:22 PM
to psychopy-users

After playing around with this for quite some time now, I think the
best way to integrate Pylink into PsychoPy is to use the Pylink module
from the SR Research forum (log-in required), have the calibration done
with the pygame code supplied by SR Research, and only then move to the
main experiment in PsychoPy (as suggested by Britt earlier). That is,
one way would be to take their example code and strip it down to the
bare minimum -- namely the commands to communicate with the host PC,
and the calibration stuff -- and do everything else in PsychoPy.

Prior to the experiment, you have to define the name of the EDF file in
which all data is recorded, using
pylink.getEYELINK().openDataFile(edf_file_name). Note that this has an
8-character limit (in total, i.e. including the file extension).
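A trivial guard for that naming restriction. This helper is my own, and the rule it encodes (8 characters in total, extension included) is as described in the post above:

```python
def valid_edf_name(name):
    """True if *name* fits the EyeLink host's limit as described above:
    at most 8 characters in total, including the '.edf' extension."""
    return name.lower().endswith(".edf") and len(name) <= 8
```

So "test.edf" (exactly 8 characters) is fine, while "mytrial1.edf" would be rejected by the host.
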

You can start the real time mode with pylink.beginRealTimeMode(100) and
start recording with pylink.getEYELINK().startRecording(1, 1, 1, 1).

If you want to recalibrate every N trials, and do drift correction every M
trials, I would put a conditional statement at the beginning of the trials
loop that executes the pygame stuff.

If you want to do gaze-contingent stuff, the best way would be to
define a function get_current_gaze that returns gaze x-y, getting the
most recent sample from the EyeLink with
pylink.getEYELINK().getNewestSample() -- which checks for a sample
update -- and el_sample.getRightEye().getGaze() or
el_sample.getLeftEye().getGaze() -- which get the gaze position.
Prior to this, and contingent on whether you do monocular or binocular
tracking, you want to check which eyes are available using
pylink.getEYELINK().eyeAvailable().
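A duck-typed sketch of that get_current_gaze idea. The method names (isRightSample, getRightEye, getGaze) follow the pylink API as described above, and the eye codes follow the pylink convention as I understand it (treat both as assumptions); the sample argument can be whatever getNewestSample() returns, or a stub:

```python
LEFT_EYE = 0    # pylink's eyeAvailable() convention, as an assumption:
RIGHT_EYE = 1   # 0 = left, 1 = right, 2 = binocular

def gaze_from_sample(sample, eye_used):
    """Return the (x, y) gaze position from one EyeLink sample object,
    or None if no usable sample for the tracked eye is available."""
    if sample is None:
        return None
    if eye_used == RIGHT_EYE and sample.isRightSample():
        return sample.getRightEye().getGaze()
    if eye_used == LEFT_EYE and sample.isLeftSample():
        return sample.getLeftEye().getGaze()
    return None
```
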

A convenient way to record, mark or tag events in the EyeLink EDF file is
by using pylink.getEYELINK().sendMessage("$MESSAGE"). This adds an event
in the FEVENT structure of the EDF file containing $MESSAGE.

After the whole experimental session, you need to close the link to the
eye tracker and receive the data file:

| # End the tracking with a delay of 600 milliseconds
| pylink.endRealTimeMode()
| pylink.getEYELINK().setOfflineMode()
| pylink.msecDelay(600)
| # Close file on Eyelink
| pylink.getEYELINK().closeDataFile()
| pylink.getEYELINK().receiveDataFile(edf_file_name, edf_file_name_out)
| pylink.getEYELINK().close()

This is fairly technical info, probably not too helpful for beginners but
hopefully it will be of use for some.

Best,

Manuel

shark_scott

Mar 23, 2011, 1:48:12 AM
to psychopy-users
That's a lot of very useful information. I'll be going through it
with my labmates as we put together the experiment. Thanks very much
for sharing this.

I should mention - though it feels somehow disloyal to PsychoPy - that
the OpenSesame software that was mentioned on this site recently has
an add-on for the EyeLink II, so that you can drag and drop icons
into your experiment to run eye-tracker events (like calibrating,
logging...). There was some suggestion that the PsychoPy and
OpenSesame teams might somehow collaborate; that sounds like a dream
come true.
Mark


Manuel Spitschan

Mar 23, 2011, 6:48:14 AM
to psychopy-users

I agree, it would be nice to have this functionality to add eye-tracker
events in the Builder. I never use the Builder though, so I would first
have to figure out how that works.

Sebastiaan Mathot

Mar 29, 2011, 11:07:21 AM
to psychopy-users
OpenSesame (of which I'm the author) indeed has experimental support
for the EyeLink, but much of that code could also be used in PsychoPy.
The graphical part of the plug-in is specific to OpenSesame, of
course, but the plug-in includes a fairly straightforward library.
There is a tutorial (see link below) for getting the PyLink libraries
installed (under Windows, which is what we use on the EyeLink PC). You
can download the EyeLink plug-in, extract libeyelink.py from the
eyelink_calibrate folder, and use this in PsychoPy.

http://www.cogsci.nl/blog/tutorials/125-installing-the-opensesame-eyelink-plug-ins

As a final point, because SR Research felt it necessary to include
their own custom version of SDL, you will need to import pylink before
pygame, otherwise it won't work. For the same reason, you may suffer
stability issues, but I definitely found it usable. I hope this is of
some help to anyone!

shark_scott

Apr 6, 2011, 8:26:36 PM
to psychopy-users
Hi,
Thanks for the information - I'm still trying to get our lab's eye
tracker to work at all at the moment (having a problem with drivers).
Once that's up and running, though, I may end up using your suggestion;
since I'm already familiar with PsychoPy (and like it very much), it
may be easiest to take the code developed for the OpenSesame plug-in
and use it in PsychoPy.
Thanks!
Mark


ccbd....@gmail.com

Nov 27, 2014, 7:38:26 AM
to psychop...@googlegroups.com
Hi, 
I programmed my project with PsychoPy and I have a problem when calling 'getEYELINK().doTrackerSetup()': I cannot get the setup window from the EyeLink 1000 (but I do get it when I program with plain Python). Now I want to know whether I can do that in PsychoPy, or whether I must do it some other way. I'd appreciate any messages or suggestions.
Bests,
Jack

Sol Simpson

Nov 28, 2014, 4:51:37 PM
to psychop...@googlegroups.com
Just to confirm my understanding based on your message: you are using the pylink package directly, and not the psychopy.iohub common eye tracker interface, to connect to the eyelink system?

Hause

Sep 14, 2015, 9:56:33 PM
to psychopy-users
Hi,

This is a relatively old thread, but I'm now trying to get an EyeLink 1000 eye tracker to work with PsychoPy and pylink. Before creating a window in PsychoPy, I run pylink.getEYELINK().doTrackerSetup(), and the eye tracker's setup display doesn't show up on the display computer (only on the eye tracker PC). This means I can't calibrate or validate or do anything. If I press 'Exit Setup' on the eye tracker PC, my PsychoPy experiment continues running smoothly and the eye tracker collects data too. I wonder if anyone has found a solution or knows why this might be happening? I'd really appreciate any help. Thanks very much.

Hause

Sol Simpson

Sep 21, 2015, 9:43:50 AM
to psychopy-users
I presume your later post replaces this one; let me know if this is incorrect. Thanks.

Hause

Sep 21, 2015, 9:51:16 AM
to psychopy-users
Yes, Sol. Rather than trying to get pylink to work, I'm trying the iohub solution now. Though I'd be happy to use either pylink directly or iohub, if I can get either to start calibration (sending messages during the experiment works now).