eye tracking vs mouse tracking


j.vo...@rug.nl

Jul 9, 2020, 1:36:58 PM
to Online Experiments
Dear all,

Thank you for a very helpful workshop so far. I had been planning several VWP eye-tracking studies when COVID struck, and I'm looking for ways in which I could continue these studies online if necessary. I've looked into WebGazer a bit, but as Joshua pointed out on Tuesday, things such as lighting are difficult to control, and accuracy will not be up to the standards of actual eye trackers. Therefore I'm wondering whether mouse tracking would be a good alternative. Does anyone have experience with that? How does it compare to (online) eye tracking in terms of quality/accuracy of data and RT lags? What would be things to keep in mind when doing mouse tracking online?

Also, are there any software packages currently including support for online mouse tracking? Joshua mentioned that you could create your own jsPsych plugin, but how easy is that for someone with only basic programming skills?

Connie Bainbridge

Jul 9, 2020, 1:51:33 PM
to Online Experiments
Hello,

It looks like this JavaScript library would make customizing a plugin like this much easier than coding from scratch: https://github.com/ineventapp/musjs

As far as the raw data goes, mouse tracking would certainly bypass the major accuracy variability one would face with eye-tracking. This paper compared eye-tracking and mouse-tracking behavior and found a strong relationship between the two: https://dl.acm.org/doi/abs/10.1145/634067.634234

I'm curious to see if anyone else can chime in with more direct experience with mouse tracking in online studies!

Best,
-Connie

Josh de Leeuw

Jul 9, 2020, 2:00:56 PM
to Online Experiments
I think the correlation between eye movements and mouse movements will depend quite a bit on the task. In a visual world paradigm (at least the canonical ones that I'm familiar with -- not an expert here!), there's not much built-in incentive for participants to actively track with the mouse. I suspect that without modifying the task there wouldn't be very much mouse movement. 

It looks like lab.js may have some mouse tracking capability via this github repository, but I'm having a hard time finding documentation. 

It also looks like PsychoPy's online translator can handle mouse tracking with some work. See this thread.

Adding mouse tracking to existing jsPsych plugins would not be a terribly difficult task for a relatively new programmer. JavaScript is already set up to report the location of the mouse, so it would only require capturing this data and storing it in the trial data. If you'd like to pursue that and want some help I'd encourage you to ask about it over at the jsPsych discussion forum.
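To make the idea concrete, here is a minimal sketch of what capturing mouse coordinates during a trial might look like. The names (`createMouseTracker`, the sample format) are illustrative, not part of jsPsych's API; the only browser machinery it relies on is the standard `mousemove` event:

```javascript
// Minimal sketch: record mouse positions with timestamps during a trial,
// so the samples can later be merged into the trial's data object.
function createMouseTracker(target) {
  const samples = [];
  const startTime = Date.now();

  function onMove(e) {
    // pageX/pageY are standard properties of browser mouse events;
    // t is milliseconds since the tracker was created.
    samples.push({ x: e.pageX, y: e.pageY, t: Date.now() - startTime });
  }

  return {
    samples,
    start() { target.addEventListener('mousemove', onMove); },
    stop() { target.removeEventListener('mousemove', onMove); },
  };
}
```

In a browser you would call `createMouseTracker(document)` when the trial starts, `stop()` when it ends, and attach `tracker.samples` to whatever data the plugin hands back (e.g. via `jsPsych.finishTrial` in a custom plugin). For long trials you may want to downsample or throttle, since `mousemove` can fire at high rates.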


Joshua Hartshorne

Jul 9, 2020, 2:03:35 PM
to Online Experiments
To build on Josh's point, I don't know if there has been systematic work on this, but anecdotally, you get much stronger eye-tracking effects in the visual world paradigm if subjects have to physically interact with the target object. (This could just be pointing at the target.) By analogy, I would expect this to be true of mouse tracking as well. You'll probably want to make sure subjects have to interact with the stimuli using the mouse.

Of course, what this looks like may depend on the nature of your experiment.

j.vo...@rug.nl

Jul 9, 2020, 2:47:50 PM
to Online Experiments
Thanks all for the useful references. One of my studies is about reference resolution, so I could indeed instruct participants to click on the correct referent, and see whether their mouse trajectory diverges in the direction of one of the distractors.
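One common way to quantify that divergence in mouse-tracking work is the maximum deviation of the trajectory from the straight line between its start and end points; trials where a distractor "pulls" the cursor show larger deviations. A sketch of that computation, assuming samples recorded as `{x, y}` objects (the function name is illustrative):

```javascript
// Maximum perpendicular deviation (MD) of a mouse trajectory from the
// straight line connecting its first and last samples.
function maxDeviation(samples) {
  const first = samples[0];
  const last = samples[samples.length - 1];
  const dx = last.x - first.x;
  const dy = last.y - first.y;
  const len = Math.hypot(dx, dy);
  if (len === 0) return 0; // degenerate path: no movement

  let max = 0;
  for (const p of samples) {
    // Perpendicular distance from point p to the start-end line,
    // via the cross-product formula for point-to-line distance.
    const d = Math.abs(dy * (p.x - first.x) - dx * (p.y - first.y)) / len;
    if (d > max) max = d;
  }
  return max;
}
```

A signed variant (keeping the side of the line the cursor strays to) would let you test specifically whether trajectories bend toward the distractor rather than just away from the straight path.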

Another study is about reference production in interaction, so I would need to combine the mouse tracking with both audio recording and a video/audio connection with another participant. Is it feasible to have these three things running in parallel, or could there be interference of some kind (e.g. on RT measurements)?

Best,
Jorrig

Josh de Leeuw

Jul 9, 2020, 3:13:21 PM
to Online Experiments
Hi Jorrig,

I don't know of anyone who has combined live videoconferencing with other experimental elements like mouse tracking. In theory, it should be possible, given the existence of sites like Omegle that create interactive video chats in a browser, and I don't see any in-principle reason why a video chat couldn't be combined with an experiment on the same page. But this would definitely be hard and require a substantial amount of development work.

One critical factor here is whether you need to synchronize the video conferencing with other experimental events. If you only require a very loose synchronization, then one option would be to use standard video conferencing software like Zoom for the communication and implement the experiment in existing experiment software, like jsPsych. You could have these two windows open in parallel. If you need participants to interact with each other outside the video call, like seeing each other's mouse movements, then there are some platforms that are developed for group-interaction studies, including oTree, Empirica, and NodeGame.

Cheers,
Josh