An improved version of this experiment can be found here.

Eye tracking via a webcam

This experiment demonstrates how to conduct eye tracking via a webcam using the webgazer library. The experiment consists of the following phases:

  • loading, which downloads the required library (webgazer) and waits until it is ready
  • webcam, which asks the participant for permission to use their webcam and starts the eye tracking
  • calibration, which calibrates the eye tracker by mapping the participant's gaze to points on the screen that they need to click
  • tracking, which shows a square that follows the participant's gaze
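The loading and tracking phases above boil down to initializing webgazer and registering a gaze listener. A minimal sketch of that flow is below; note that the `smoothGaze` helper and its smoothing factor are my own additions for illustration, not part of the demo itself:

```javascript
// Exponential moving average to reduce jitter in raw gaze estimates.
// Pure helper: returns the updated smoothed position.
function smoothGaze(prev, current, alpha = 0.2) {
  if (prev === null) {
    return { x: current.x, y: current.y };
  }
  return {
    x: alpha * current.x + (1 - alpha) * prev.x,
    y: alpha * current.y + (1 - alpha) * prev.y,
  };
}

// Browser-only glue: start webgazer and react to gaze estimates.
// Guarded so the pure helper above can also run outside the browser.
if (typeof window !== 'undefined' && window.webgazer) {
  let smoothed = null;
  window.webgazer
    .setGazeListener((data, elapsedTime) => {
      if (data === null) return; // no face/eyes detected this frame
      smoothed = smoothGaze(smoothed, data);
      // ...move the tracking square to (smoothed.x, smoothed.y)...
    })
    .begin(); // asks for webcam permission and starts tracking
}
```

Calling `begin()` is what triggers the browser's webcam-permission prompt, which is why the webcam phase comes before calibration.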

How good is it?

Disclaimer: I built this experiment to learn more about integrating complicated software libraries into PsychoJS. I picked eye tracking because it is a popular method and because it showcases how flexible PsychoJS is. Along the way, I learned some basic things about eye tracking, but I never actually conducted a study with it. That said, here are some things I noticed:

  • To yield accurate results, your participant needs to keep their head still
  • The eye tracking can require quite a lot of processing power, so it could be tough on older computers
  • Good calibration is very important. In my version I have participants click squares that form a 5x4 grid on the screen. Accuracy seems lower at the edges of the screen than at the center. Having more calibration squares at the edge of the screen could improve that.
  • In the tracking phase, accuracy seems to drop off over time. Recalibrating occasionally could address this issue. To keep calibrating throughout, have the participant click things on the screen and remove the statement window.webgazer.removeMouseEventListeners(); in the tracking_square code component.
  • This experiment loads version 1.7.3 of the webgazer library, but I have also included version 2.0.1. As pointed out by Alexander Anwyl-Irvine, that version should be a lot more accurate (see Twitter and the PsychMaps Facebook Group)
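For reference, a 5x4 grid of click targets like the one described above can be generated with a few lines of code. This helper is a sketch of my own (its name, the normalized-coordinate convention, and the `margin` parameter are all illustrative, not taken from the demo):

```javascript
// Generate a cols x rows grid of calibration points in normalized
// screen coordinates (0..1), inset from the edges by `margin`.
function calibrationGrid(cols = 5, rows = 4, margin = 0.05) {
  const points = [];
  for (let row = 0; row < rows; row++) {
    for (let col = 0; col < cols; col++) {
      points.push({
        x: margin + (col / (cols - 1)) * (1 - 2 * margin),
        y: margin + (row / (rows - 1)) * (1 - 2 * margin),
      });
    }
  }
  return points;
}
```

Shrinking `margin`, or appending extra points along the borders of the returned grid, would be one way to put more calibration squares near the screen edges.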

What can you do with this experiment?

Whatever you'd like! I delivered a bare-bones version to show that it's possible to eye track with PsychoJS, but as listed above, a lot of improvements could be made. Feel free to clone this experiment, improve it, and adapt it to your needs. Please share your improvements with our community. You can do so by posting about it on the PsychoPy forum or by making a pull request for this Gitlab repo.

WARNING

The webgazer library downloads JS files from servers hosted by an external party. In an actual experiment, this could be a threat to privacy. On the upside, since webgazer is maintained by an academic institution, I deem the risks lower than with libraries that are maintained by commercial parties. Alternatively, you could get in touch with the webgazer team to discuss ways to have the JS files downloaded from Pavlovia. If you do so, I'd love to be in the loop!

More demos

See e2e_experiments for a list of all the demos that I built for PsychoJS.