Student projects on Human-Computer Interaction with Eye-tracking

Lucas C. Parra

Here is a high school student project on controlling navigation in a video game (Minecraft) using eye-tracking:

NYC teen's eye-tracking computer project, Fox 5 News Apr 23, 2015


Here is an MS thesis project by John Ettikkalayil on controlling a drone with an eye-tracker (2013):

Design, Implementation, and Performance Study of an Open Source Eye-Control System to Pilot a Parrot AR.Drone Quadrocopter


Both projects were motivated by earlier work from 2011 that demonstrated eye-gaze controlled navigation, developed with UG students in BME and EE at CCNY. In both instances the goal is to move the point of gaze toward the center of the screen; simply put, "where you look is where it goes".
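
To make this policy concrete, here is a minimal sketch in C++ of such a gaze-to-command mapping; the struct names, the gain, and the normalization are illustrative assumptions and not code from either project:

    // "Where you look is where it goes": map the offset of the point of gaze
    // from the screen center to a proportional control command.
    #include <algorithm>

    struct Gaze    { double x, y; };        // gaze position in screen pixels
    struct Command { double horiz, vert; }; // normalized commands in [-1, 1]

    Command gazeToCommand(const Gaze& g, double screenW, double screenH, double gain)
    {
        // Normalized offset of the gaze from the screen center, in [-1, 1].
        double dx = (g.x - screenW / 2.0) / (screenW / 2.0);
        double dy = (g.y - screenH / 2.0) / (screenH / 2.0);
        Command c;
        c.horiz = std::clamp(gain * dx, -1.0, 1.0);
        c.vert  = std::clamp(gain * dy, -1.0, 1.0);
        return c;  // zero when gazing at the center, stronger the further one looks away
    }

Fixating the center produces no command; the further the gaze strays from the center, the stronger the corrective motion that brings the gazed-at point back toward the center.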

EyeDrone

This is the popular AR.Drone quadricopter controlled with this policy using an EyeLink 2000 eye-tracker. The drone pivots left and right when the operator looks in those directions, and it tilts up and down (making it fly backwards and forwards) when the operator looks up or down. Simple, isn't it?
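
A hypothetical per-frame control step for this kind of gaze-steered flight might look as follows; readGaze() and sendDroneCommand() are stand-ins for the eye-tracker and drone interfaces, and the gain, dead zone, and blink handling are assumptions rather than details of the actual EyeDrone code:

    #include <cmath>
    #include <cstdio>

    struct GazeSample { double x, y; bool valid; };

    // Stand-in for a call into the eye-tracker API (assumption).
    GazeSample readGaze() { return {960.0, 400.0, true}; }

    // Stand-in for a call into the drone API (assumption); commands in [-1, 1].
    void sendDroneCommand(double yaw, double pitch)
    {
        std::printf("yaw %+.2f  pitch %+.2f\n", yaw, pitch);
    }

    void controlStep(double screenW, double screenH)
    {
        const double gain     = 0.5;   // proportional gain (illustrative value)
        const double deadZone = 0.1;   // ignore small offsets near the center

        GazeSample g = readGaze();
        if (!g.valid) {                // e.g. during a blink: send neutral commands
            sendDroneCommand(0.0, 0.0);
            return;
        }
        // Normalized gaze offset from the screen center, in [-1, 1].
        double dx = (g.x - screenW / 2.0) / (screenW / 2.0);
        double dy = (g.y - screenH / 2.0) / (screenH / 2.0);

        // Looking left or right pivots the drone; looking up or down tilts it,
        // which makes it fly backwards or forwards.
        double yaw   = (std::fabs(dx) > deadZone) ? gain * dx : 0.0;
        double pitch = (std::fabs(dy) > deadZone) ? gain * dy : 0.0;
        sendDroneCommand(yaw, pitch);
    }

    int main() { controlStep(1920.0, 1080.0); return 0; }

Calling controlStep() at the eye-tracker's sampling rate (or at the drone's command rate) closes the loop between gaze and flight.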


EyeDrone was implemented by Lucas C Parra in C++ with much help from Michael Quintian.

The chin rest seen here is not really needed: the control algorithm is inherently stable, since the gaze offset from the screen center acts as an error signal that the motion continuously drives back toward zero, so a miscalibration merely shifts the neutral gaze point slightly and is of little concern. When we first tried this we had not realized how stable it would be.

John M. Ettikkalayil, an MS student in BME, reimplemented this using open source software. This solution is much less expensive, as one can use a simple grayscale camera. A full description of the system is on the neural engineering lab site:

EyeMap

This is an image map scrolling application implemented in MATLAB with Psychtoolbox and its plugin for the EyeLink eye-tracker. The object being looked at is moved gradually to the center of the screen; a bit of logic was needed to make the scrolling smooth (see the sketch below).
EyeMap was implemented in MATLAB by Mohammod Arafat with help from Krishan Mistry and Lucas C Parra.
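
One simple way to realize this kind of smooth scrolling is to shift the viewport each frame by a fraction of the gaze offset from the screen center. The sketch below illustrates the idea in C++; the actual EyeMap is written in MATLAB with Psychtoolbox, and the names and smoothing factor here are assumptions:

    // Each frame, shift the visible window by a fraction of the gaze offset
    // from the screen center, so the gazed-at part of the map drifts smoothly
    // toward the center instead of jumping.
    #include <algorithm>

    struct Point { double x, y; };

    struct Viewport {
        Point offset;              // top-left corner of the visible window, in map pixels
        double screenW, screenH;   // size of the visible window
        double mapW, mapH;         // size of the full map image
    };

    // gaze is in screen coordinates; alpha in (0, 1] controls scrolling speed.
    void scrollTowardGaze(Viewport& vp, Point gaze, double alpha)
    {
        // Offset of the gaze from the screen center.
        double dx = gaze.x - vp.screenW / 2.0;
        double dy = gaze.y - vp.screenH / 2.0;

        // Move the viewport a fraction of the way toward centering the gazed point.
        vp.offset.x += alpha * dx;
        vp.offset.y += alpha * dy;

        // Keep the viewport inside the map.
        vp.offset.x = std::clamp(vp.offset.x, 0.0, vp.mapW - vp.screenW);
        vp.offset.y = std::clamp(vp.offset.y, 0.0, vp.mapH - vp.screenH);
    }

Repeating the fractional shift every frame yields an exponential approach, so the map glides until the gazed-at object reaches the center rather than snapping to it.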

If you want to get involved or learn more about this project contact Lucas C Parra.