Publication details


Nonvisual Presentation, Navigation and Manipulation of Structured Documents on Mobile and Wearable Devices
Dorigo M, Harriehausen-Mühlbauer B, Stengel I, Dowland PS
International Conference on Computers for Handicapped Persons, ISBN: 978-3-319-08595-1, pp383-390, 2014
Links:  External link available

Many documents are highly structured, for example newspaper articles and scientific, mathematical or technical literature. As a result of inductive research with 200 blind and visually impaired participants, a multi-modal user interface for the non-visual presentation, navigation and manipulation of structured documents on mobile and wearable devices, such as smartphones, smartwatches and tablets, has been developed. It enables the user to gain a quick overview of the document structure and to skim and scan the document content efficiently by identifying the type, level, position, length, relationship and content text of each element, as well as to focus, select, activate, move, remove and insert structure elements or text. These interactions are presented non-visually using earcons, tactons and speech synthesis, addressing the auditory and tactile senses. Navigation and manipulation are provided through the multi-touch, motion (linear acceleration and rotation) and speech recognition input modalities. The result is a complete solution for reading, creating and editing structured documents in a non-visual way; no special hardware is required. For the development, testing and evaluation of the user interface, a flexible, platform-independent software architecture has been developed and implemented for iOS and Android. The user interface has been evaluated through a structured observation of 160 blind and visually impaired participants using the implemented software (app) over the Internet.
