Dr.-Ing. Nils Petersen receives award for his doctoral thesis on learning-based authoring for Augmented Reality

Each year, an expert jury from the University of Kaiserslautern, together with the Kreissparkasse Kaiserslautern, nominates outstanding doctoral theses in several academic subjects.

We congratulate Dr.-Ing. Petersen on winning this award in Computer Science.

The award was granted for his doctoral thesis, which deals with understanding manual workflows from video examples. An important application of his research is the automatic generation of interactive Augmented Reality manuals from the observation of a reference performance.

Links:

https://www.kskkl.de/module/ueber_uns/pressecenter/upload/preisverleihung_stiftung_fuer_die_tu_kl.pdf

Abstract of the doctoral thesis:

Workflow knowledge comprises both explicit, verbalizable knowledge and implicit knowledge that is acquired through practice. Learning a complex workflow therefore benefits from training with a permanent corrective. Augmented Reality manuals that display instructive step-by-step information directly in the user’s field of view provide an intuitive and provably effective learning environment. However, their creation is rather work-intensive, and current technological approaches offer insufficient interactivity with the user. In this thesis, we present a comprehensive technical approach to algorithmically analyze manual workflows from video examples and to use the acquired information to teach explicit and implicit workflow knowledge using Augmented Reality. The technical realization starts with the unsupervised segmentation of individual work steps and their categorization into a coarse taxonomy. Thereafter, we analyze the individual steps for their modalities using a hand and finger tracking approach optimized for this particular application. Using explicit, work-step-specific generalization, we are able to compensate for morphological differences among users and thus reduce the need for large amounts of training data. To render this information communicable, i.e., understandable by another person, we present the further processed data using Augmented Reality as an interactive tutoring system. The resulting system allows for the fully or semi-automatic creation of Augmented Reality (AR) manuals from video examples as well as their context-driven presentation in AR. The method is able to extract and teach procedural, implicit workflow knowledge from given video examples. In an extensive evaluation, we demonstrate the applicability of all proposed technical components for the given task.
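
To make the pipeline described in the abstract more concrete, the following is a minimal, purely illustrative Python sketch and not the thesis implementation: it segments a synthetic recording into work steps wherever motion drops below a threshold, assigns each step a coarse placeholder category, and emits one AR instruction record per step. All class names, features, and thresholds here are hypothetical assumptions introduced only for illustration.

```python
"""Illustrative sketch of the pipeline outlined in the abstract:
segment a recorded workflow into steps, assign each step a coarse
category, attach hand observations, and emit AR manual steps.
All names and heuristics are hypothetical stand-ins."""

from dataclasses import dataclass, field
from typing import List, Tuple


@dataclass
class Frame:
    timestamp: float                      # seconds into the reference recording
    motion_energy: float                  # placeholder feature; a real system would use richer cues
    hand_positions: List[Tuple[int, int]] = field(default_factory=list)


@dataclass
class WorkStep:
    start: float
    end: float
    category: str                         # coarse taxonomy label (placeholder)
    hand_tracks: List[List[Tuple[int, int]]] = field(default_factory=list)


def segment_steps(frames: List[Frame], pause_threshold: float = 0.1) -> List[WorkStep]:
    """Unsupervised segmentation sketch: cut the recording wherever motion
    energy drops below a threshold, treating low-motion spans as boundaries."""
    steps, start = [], None
    for f in frames:
        active = f.motion_energy >= pause_threshold
        if active and start is None:
            start = f.timestamp
        elif not active and start is not None:
            steps.append(WorkStep(start=start, end=f.timestamp, category="unknown"))
            start = None
    if start is not None:
        steps.append(WorkStep(start=start, end=frames[-1].timestamp, category="unknown"))
    return steps


def categorize(step: WorkStep, frames: List[Frame]) -> WorkStep:
    """Toy stand-in for the coarse taxonomy: label a step by how many hands
    were visible on average while it was performed."""
    in_step = [f for f in frames if step.start <= f.timestamp <= step.end]
    avg_hands = sum(len(f.hand_positions) for f in in_step) / max(len(in_step), 1)
    step.category = "bimanual" if avg_hands > 1.5 else "single-handed"
    step.hand_tracks = [f.hand_positions for f in in_step]
    return step


def build_ar_manual(steps: List[WorkStep]) -> List[dict]:
    """Emit one AR instruction record per step; a real system would anchor
    these overlays to tracked objects in the user's field of view."""
    return [
        {"step": i + 1, "category": s.category, "duration_s": round(s.end - s.start, 2)}
        for i, s in enumerate(steps)
    ]


if __name__ == "__main__":
    # Synthetic recording: two bursts of motion separated by a pause.
    frames = [Frame(t * 0.1, 1.0 if t < 20 or t > 30 else 0.0, [(0, 0)]) for t in range(50)]
    steps = [categorize(s, frames) for s in segment_steps(frames)]
    print(build_ar_manual(steps))
```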