Multimodal interaction support for user-specific needs

People with disabilities face numerous daily challenges, and using a computer is one of them. The mouse and keyboard, the traditional devices for computer interaction, are often impractical for people with disabilities; for example, Parkinson patients can hardly use a mouse because of the precision required to complete any task. Interfaces with too many menus and buttons cause confusion and lead people with disabilities to avoid computers altogether.

In this master thesis, you will develop an approach that caters to the specific needs of these individuals. The focus is to interpret information from a combination of user input modes (those that are most convenient for a patient) and to build a multimodal application interface for them. Multimodal interaction means interaction via several channels such as speech, gesture, eye tracking, graphics, etc. The mode of interaction and the interface design would be tailored to the specific user's requirements.

Eye tracking devices would help you capture the user's point of interest through gaze control. The eyes reveal where the user is focusing attention, and the computer translates every change of focus into a pointer movement. At WeST we already have a project group [2] and expertise in working with eye tracking applications (e.g., a browser [1]). The ultimate goal is to achieve ease and robustness of user communication by integrating further methods, such as automatic speech recognition, to improve the output of a multimodal application. Several research questions can be discussed in the master thesis, e.g.: When should certain interaction modalities be used? How should multiple interaction modalities (speech and eye tracking) be combined? How can modalities be adapted to the context of use for a particular user with limited abilities?
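One common fusion pattern for such a system is "gaze points, speech acts": the most recent fixation supplies the target, while a spoken command supplies the action. The sketch below is only an illustrative Python example of that idea, not part of the project resources; the data classes, the `fuse` function, and the one-second pairing window are all assumptions made for the sake of the example.

```python
from dataclasses import dataclass

@dataclass
class GazeSample:
    x: float          # screen coordinates of the fixation
    y: float
    timestamp: float  # seconds

@dataclass
class SpeechCommand:
    text: str         # recognized command, e.g. "open"
    timestamp: float

def fuse(gaze_history, command, max_age=1.0):
    """Pair a spoken command with the most recent gaze fixation.

    Gaze supplies the *where* (pointing), speech supplies the *what*
    (the action), so neither channel alone has to carry both roles.
    Fixations older than max_age seconds are ignored.
    """
    candidates = [g for g in gaze_history
                  if 0 <= command.timestamp - g.timestamp <= max_age]
    if not candidates:
        return None  # no recent fixation: ask the user to look again
    target = max(candidates, key=lambda g: g.timestamp)
    return (command.text, (target.x, target.y))

# Usage: the user looks at two positions, then says "open";
# the command is bound to the last fixation within the time window.
history = [GazeSample(120.0, 340.0, 10.2), GazeSample(400.0, 90.0, 10.9)]
cmd = SpeechCommand("open", 11.3)
print(fuse(history, cmd))  # → ('open', (400.0, 90.0))
```

In a real assistive application, the time window and the choice of dominant modality would themselves be adapted to the user's abilities, which is exactly one of the research questions above.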

You would involve patients in requirement analysis, user-centered design, and evaluation. We will provide you with contact to targeted users (e.g., our contact at the University of Cologne [7]), the MAMEM project resources [2], a working prototype with eye tracking interaction [1], and a state-of-the-art eye tracking device [3]. Some relevant papers in the field are attached [4,5,6].


1. Eye browser demo
2. MAMEM project
3. SMI REDn Scientific eye tracker
4. Eye Tracking and Eye-Based Human–Computer Interaction
5. Speech-Augmented Eye Gaze Interaction with Small Closely Spaced Targets
6. Gaze as a Supplementary Modality for Interacting with Ambient Intelligence Environments
7. Der körperbehinderte Mensch e.V.