I am just getting started with my Pupil Labs headset and have been reading through the docs. I want to extract real-time data about where the user's eyes are looking from a Python script. For example, I'd like to start with some very basic real-time feature detection, such as "looking left," "looking up," or "looking down." Are there scripts in the pupil-helpers repository that would help me achieve this? Where should I start?
Thank you so much,
Corten
To clarify: I just need to extract the x/y coordinates of the gaze point, so I can determine the gaze direction in my own script.
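For anyone landing here with the same question, here is a minimal sketch of one common approach: Pupil Capture publishes data over a ZMQ-based network API, so you can request the subscription port from Pupil Remote (default `tcp://127.0.0.1:50020`), subscribe to the `gaze.` topic, and read the `norm_pos` field of each datum. The `classify_gaze` helper and its `margin` threshold are illustrative assumptions, not part of the Pupil API; the sketch assumes Pupil Capture is running and that `pyzmq` and `msgpack` are installed.

```python
def classify_gaze(x, y, margin=0.15):
    """Map normalized gaze coordinates (0..1) to a coarse direction label.

    Illustrative helper: `margin` is an arbitrary dead-zone threshold
    around the center, not a value prescribed by Pupil Labs.
    """
    if x < 0.5 - margin:
        return "left"
    if x > 0.5 + margin:
        return "right"
    if y > 0.5 + margin:
        return "up"
    if y < 0.5 - margin:
        return "down"
    return "center"


if __name__ == "__main__":
    # Third-party dependencies: pip install pyzmq msgpack
    import zmq
    import msgpack

    ctx = zmq.Context()

    # Ask Pupil Remote (REQ/REP socket) for the data-subscription port.
    pupil_remote = ctx.socket(zmq.REQ)
    pupil_remote.connect("tcp://127.0.0.1:50020")
    pupil_remote.send_string("SUB_PORT")
    sub_port = pupil_remote.recv_string()

    # Subscribe to gaze datums published on the IPC backbone.
    sub = ctx.socket(zmq.SUB)
    sub.connect(f"tcp://127.0.0.1:{sub_port}")
    sub.setsockopt_string(zmq.SUBSCRIBE, "gaze.")

    while True:
        topic, payload = sub.recv_multipart()
        gaze = msgpack.unpackb(payload)
        x, y = gaze["norm_pos"]  # normalized coordinates in the scene camera
        print(topic.decode(), classify_gaze(x, y))
```

With normalized coordinates, the origin convention matters for the up/down labels, so it's worth printing a few raw `norm_pos` values first and adjusting the thresholds to taste.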