Project/Research Lab "Eye tracking in word processing"
Word processing with tools like Microsoft Word or Google Docs is a core task that users perform on their computers. The interaction with a mouse for placing the cursor and a keyboard for entering text has changed little since the mouse was introduced in the 1970s. Recently, however, eye tracking has been integrated into more and more devices and is part of upcoming augmented and virtual reality systems such as the Microsoft HoloLens 2. Eye tracking can determine which elements of an interface a user is focusing on, and it is even used to make interaction more accessible to motor-impaired people.
In this research lab, you will explore the question “What is the future of word processing?” You will create a Web-based word-processing platform that incorporates eye-tracking input alongside mouse and keyboard input. You can work on various levels of integration of eye tracking: from visualizing the attention of multiple users looking at the same document in real time [1], to improving everyday editing tasks through the application of eye tracking [2], to enabling hands-free experiences [3,4] that make word processing more accessible. Further modalities such as voice-based dictation [5] or touch devices [6] might be incorporated.
The kick-off meeting will take place in room B 016 on 17th February at 2 p.m. Details about the registration process will be discussed there. Participation in the kick-off meeting is mandatory. The research lab is targeted at both bachelor and master students of the computer science department. Master students will additionally evaluate the platform.
The slides from the kick-off meeting with registration information can be accessed here: https://files.west.uni-koblenz.de/studying/courses/ss20/research-lab-eye-tracking/Kickoff.pdf
- [1] Christian Schlösser. 2018. Towards concise gaze sharing. In Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications (ETRA ’18). Association for Computing Machinery, New York, NY, USA, Article 78, 1–3. DOI:https://doi.org/10.1145/3204493.3207416
- [2] Shyamli Sindhwani, Christof Lutteroth, and Gerald Weber. 2019. ReType: Quick Text Editing with Keyboard and Gaze. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (CHI ’19). Association for Computing Machinery, New York, NY, USA, Paper 203, 1–13. DOI:https://doi.org/10.1145/3290605.3300433
- [3] Tanya Beelders and Pieter Blignaut. 2011. The Usability of Speech and Eye Gaze as a Multimodal Interface for a Word Processor. DOI:https://doi.org/10.5772/16604
- [4] Andrew Kurauchi, Wenxin Feng, Ajjen Joshi, Carlos Morimoto, and Margrit Betke. 2016. EyeSwipe: Dwell-free Text Entry Using Gaze Paths. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems (CHI ’16). Association for Computing Machinery, New York, NY, USA, 1952–1956. DOI:https://doi.org/10.1145/2858036.2858335
- [5] K. Sengupta, S. Bhattarai, S. Sarcar, I. S. MacKenzie, and S. Staab. 2020. Leveraging Error Correction in Voice-based Text Entry by Talk-and-Gaze. (CHI ’20, to appear)
- [6] C. Kumar, R. Hedeshy, I. S. MacKenzie, and S. Staab. 2020. TAGSwipe: Touch Assisted Gaze Swipe for Text Entry. (CHI ’20, to appear)