Adaptation and evaluation of gaze-controlled applications for touch input

As part of the EU project MAMEM [2], we have developed a gaze-controlled Web browser [1] that enables users to navigate and interact with the Web solely through their eye movements. An eye-tracking device records the user's gaze and interprets it as a coordinate on the computer's screen. Because of calibration errors and gaze jitter, interface elements are designed to compensate for these inaccuracies, mostly through enlarged sizes or a zooming behavior. Similar approaches have been chosen by operating systems and applications on mobile touch-based devices, which use rather large buttons to compensate for the inaccuracy of touch input.
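For illustration, a minimal C++ sketch of how raw gaze samples might be smoothed and matched against enlarged interface elements. It is not taken from the GazeTheWeb code base; all names and parameters are assumptions.

    // Illustrative sketch: smoothing raw gaze samples with an exponential
    // moving average to reduce jitter before hit-testing against an
    // enlarged activation area. Hypothetical names and parameters.
    #include <cmath>

    struct GazePoint { float x; float y; };

    class GazeFilter {
    public:
        explicit GazeFilter(float alpha) : mAlpha(alpha) {}

        // Blend the new raw sample with the previous filtered value.
        GazePoint Update(const GazePoint& raw)
        {
            if (mFirst) { mFiltered = raw; mFirst = false; }
            else
            {
                mFiltered.x = mAlpha * raw.x + (1.f - mAlpha) * mFiltered.x;
                mFiltered.y = mAlpha * raw.y + (1.f - mAlpha) * mFiltered.y;
            }
            return mFiltered;
        }

    private:
        float mAlpha;           // 0..1, lower value = stronger smoothing
        GazePoint mFiltered{};
        bool mFirst = true;
    };

    // A button counts as "hit" when the filtered gaze lies inside an
    // enlarged activation area, compensating for residual calibration error.
    bool HitsEnlargedButton(const GazePoint& gaze, float cx, float cy,
                            float width, float height, float margin)
    {
        return std::fabs(gaze.x - cx) <= width  / 2.f + margin
            && std::fabs(gaze.y - cy) <= height / 2.f + margin;
    }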

The offered thesis shall investigate the similarities and differences between gaze- and touch-centered interface designs. The gaze-controlled Web browser must be adapted for touch input and compared with the existing gaze-controlled version in terms of usability and task performance. It has to be decided which interactions can be kept as they are implemented for gaze control and which may perform better in a version tailored to touch input. The gaze- and touch-controlled applications are to be compared in efficiency and joy of use, and a combination of both input sources may be proposed. A basic understanding of C++ is therefore recommended.
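One possible way to keep gaze- and touch-specific interactions exchangeable is to hide both modalities behind a common interface. The following C++ sketch is only a hypothetical design suggestion; the class and method names are assumptions, not part of the existing browser.

    // Hypothetical abstraction over gaze and touch input, so that each
    // interaction (clicking a link, scrolling, ...) can be written once
    // and evaluated per modality or in combination.
    struct ScreenPoint { float x; float y; };

    class InputSource {
    public:
        virtual ~InputSource() = default;
        virtual bool HasNewSample() const = 0;
        virtual ScreenPoint CurrentPoint() const = 0;
        // Activation could mean dwell time for gaze, a tap for touch.
        virtual bool ActivationRequested() const = 0;
    };

    class GazeInput  : public InputSource { /* wraps eye-tracker samples and dwell logic */ };
    class TouchInput : public InputSource { /* wraps touch events from the tablet */ };

    // Interactions are implemented against the common interface only.
    void ProcessFrame(InputSource& input)
    {
        if (input.HasNewSample() && input.ActivationRequested())
        {
            ScreenPoint p = input.CurrentPoint();
            // trigger the interaction at p ...
            (void)p;
        }
    }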

You will be supported in the programming task and given access to a Windows tablet computer for testing and evaluation purposes.

References:

1. Browser on GitHub: https://github.com/MAMEM/GazeTheWeb/tree/master/Browse

2. MAMEM page: http://www.mamem.eu/

3. SMI REDn scientific eye-tracker: http://www.smivision.com/en/gaze-and-eye-tracking-systems/products/redn-scientific.html

4. Google Material Design: https://material.google.com/

5. Combination of Gaze and Touch: https://www.youtube.com/watch?v=DpLR7QqS6K8

Type of thesis:
Master
Date of announcement:
2016