Seamless Integration of Handwriting Recognition into Pen-Enabled Displays for Fast User Interaction

Marcus Liwicki, Markus Weber, Tobias Zimmermann, Andreas Dengel
10th IAPR International Workshop on Document Analysis Systems (DAS-10)

Abstract:
This paper proposes a framework for integrating handwriting recognition into natural user interfaces. As more and more pen-enabled touch displays become available, we make use of the distinction between touch actions and pen actions. Furthermore, we apply a recently introduced mode detection approach to distinguish between handwritten strokes and graphics drawn with the pen. These ideas are implemented in the Touch & Write SDK, which can be used for various applications. To evaluate the effectiveness of our approach, we conducted experiments in an annotation scenario: we asked several users to mark and label objects in videos, measured the labeling time with our novel user interaction system, and compared it to the time needed with common labeling tools. Furthermore, we compare our handwritten input paradigm to other existing systems. The annotation is performed considerably faster with our method, and the user experience is also markedly better.
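The abstract describes a two-stage dispatch: input is first split by device (touch vs. pen), and pen strokes are then passed to a mode detector that separates handwriting from graphics. The paper's actual classifier is not reproduced here; the following is only a minimal sketch of that dispatch structure, with hypothetical names (`Stroke`, `classify`, `detect_mode`) and a toy aspect-ratio heuristic standing in for the trained mode detection approach.

```python
from dataclasses import dataclass
from enum import Enum, auto

class Device(Enum):
    PEN = auto()
    TOUCH = auto()

class Mode(Enum):
    HANDWRITING = auto()
    GRAPHICS = auto()
    GESTURE = auto()

@dataclass
class Stroke:
    device: Device
    points: list  # sampled (x, y) coordinates

def classify(stroke: Stroke) -> Mode:
    # Stage 1: touch actions are treated as gestures (e.g. pan/zoom);
    # only pen actions are forwarded to mode detection.
    if stroke.device is Device.TOUCH:
        return Mode.GESTURE
    return detect_mode(stroke)

def detect_mode(stroke: Stroke) -> Mode:
    # Toy placeholder for the paper's mode detection approach:
    # handwriting tends to form wide, shallow bounding boxes,
    # while drawn graphics are often as tall as they are wide.
    xs = [p[0] for p in stroke.points]
    ys = [p[1] for p in stroke.points]
    width = max(xs) - min(xs)
    height = max(ys) - min(ys)
    return Mode.HANDWRITING if width > height else Mode.GRAPHICS
```

In such a design the application never asks the user to switch modes explicitly, which is what enables the fast labeling interaction the abstract reports.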
