A Middleware for Multi-Modal Mobile Device Input
Bachelor Thesis
16 June 2023, by Oliver Kafke

Photo: Janick Edinger
Abstract:
As with all human-computer interaction, the accessibility of the user interface is of great importance when interacting with mobile devices. Key technologies ensuring accessibility include, but are not limited to, captions for users who are hard of hearing or voice-based interfaces for users who are visually impaired. For users with motor impairments, alternative input modalities need to be provided. We observed that research has discovered a plethora of such modalities; however, it has mostly focused on the modalities themselves and much less on assistive software that uses them. Consequently, little is known about the requirements of such systems, and existing ones tend to be limited in input variety, availability, scalability, integrability, or other factors. Our contributions are as follows: We designed and implemented a framework around a middleware for multi-modal mobile device input on a major mobile operating system. It mediates between inputs and applications, allowing users to control applications through various interchangeable inputs while decoupling the inputs from the application itself. Developers do not need to worry about the inputs used to control their application. Additionally, we provide them with a software development kit (SDK) to integrate our framework. We demonstrate how our middleware solves the issues found in existing solutions and integrate it into sample applications. Furthermore, we conducted a user study and found that, using our middleware, users were able to type slightly faster and with fewer inputs than with existing assistive software.
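To illustrate the idea of decoupling inputs from the application, the following minimal Kotlin sketch shows what integrating such a middleware through an SDK could look like. It is not the thesis' actual API: all names (InputEvent, InputSource, InputMiddleware, DemoSource) are hypothetical, and Kotlin is assumed only because the work targets a major mobile operating system. The application reacts to abstract events without knowing which modality produced them, and input sources can be registered interchangeably.

```kotlin
// Hypothetical sketch: an application consumes abstract, modality-independent
// events, while concrete input modalities are wrapped as pluggable sources.

// Abstract events the application handles, regardless of input modality.
sealed interface InputEvent {
    object Next : InputEvent
    object Previous : InputEvent
    object Select : InputEvent
    data class Text(val value: String) : InputEvent
}

// A pluggable input source; a concrete implementation would wrap a modality
// such as voice, switch access, or eye tracking.
interface InputSource {
    fun start(emit: (InputEvent) -> Unit)
    fun stop()
}

// The middleware mediates between sources and the application: sources can be
// exchanged freely, and the application only ever sees InputEvents.
class InputMiddleware {
    private val sources = mutableListOf<InputSource>()
    private var listener: ((InputEvent) -> Unit)? = null

    fun register(source: InputSource) {
        sources += source
        source.start { event -> listener?.invoke(event) }
    }

    fun setListener(onEvent: (InputEvent) -> Unit) {
        listener = onEvent
    }

    fun shutdown() = sources.forEach(InputSource::stop)
}

// Trivial example modality that emits a fixed sequence of events.
class DemoSource : InputSource {
    override fun start(emit: (InputEvent) -> Unit) {
        emit(InputEvent.Next)
        emit(InputEvent.Select)
        emit(InputEvent.Text("hello"))
    }
    override fun stop() {}
}

fun main() {
    val middleware = InputMiddleware()
    // The application logic is written once, independent of the modality.
    middleware.setListener { event ->
        when (event) {
            InputEvent.Next -> println("move focus forward")
            InputEvent.Previous -> println("move focus backward")
            InputEvent.Select -> println("activate focused element")
            is InputEvent.Text -> println("insert text: ${event.value}")
        }
    }
    middleware.register(DemoSource())
    middleware.shutdown()
}
```

Under this assumed design, swapping DemoSource for another InputSource implementation changes nothing on the application side, which is the decoupling the abstract describes.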
Supervised by:
Prof. Dr. Janick Edinger, Tim Rolff