Acquiring and transferring workflow knowledge using Augmented Reality
Nils Petersen
PhD Thesis
- Abstract:
- Workflow knowledge comprises both explicit, verbalizable knowledge and implicit knowledge that is acquired through practice. Learning a complex workflow therefore benefits from training with a permanent corrective. Augmented Reality manuals that display instructive step-by-step information directly in the user’s field of view provide an intuitive and provably effective learning environment. However, their creation is rather work-intensive, and current technological approaches offer insufficient interactivity with the user. In this thesis, we present a comprehensive technical approach to algorithmically analyze manual workflows from video examples and to use the acquired information to teach explicit and implicit workflow knowledge using Augmented Reality. The technical realization starts with the unsupervised segmentation of single work steps and their categorization into a coarse taxonomy. Thereafter, we analyze the single steps for their modalities using a hand and finger tracking approach optimized for this particular application. Using explicit, work-step-specific generalization, we are able to compensate for morphological differences among users and thus reduce the need for large amounts of training data. To render this information communicable, i.e., understandable by a different person, we present the further processed data using Augmented Reality as an interactive tutoring system. The resulting system allows for the fully or semi-automatic creation of Augmented Reality (AR) manuals from video examples as well as their context-driven presentation in AR. The method is able to extract and teach procedural, implicit workflow knowledge from given video examples. In an extensive evaluation, we demonstrate the applicability of all proposed technical components for the given task.
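
The abstract only outlines the pipeline; no code accompanies this entry. As a rough illustration of the first stage, the unsupervised segmentation of a recorded workflow into single work steps, the sketch below applies a simple change-point heuristic to per-frame feature vectors. The function name `segment_work_steps`, its parameters, and the distance-threshold heuristic are assumptions made for illustration only, not the method developed in the thesis.

```python
import numpy as np

def segment_work_steps(frame_features: np.ndarray,
                       threshold: float = 2.0,
                       min_len: int = 15):
    """Split a sequence of per-frame feature vectors into candidate work steps.

    Illustrative heuristic (not the thesis method): a frame starts a new
    segment when the feature change to the previous frame exceeds
    mean + `threshold` * std of the overall frame-to-frame change.
    Segments shorter than `min_len` frames are merged into their predecessor.
    """
    # Frame-to-frame change magnitude (Euclidean distance of consecutive features).
    diffs = np.linalg.norm(np.diff(frame_features, axis=0), axis=1)
    # Adaptive cut threshold derived from the change signal itself.
    cut = diffs.mean() + threshold * diffs.std()
    boundaries = np.where(diffs > cut)[0] + 1  # first frame of each new segment

    # Build (start, end) index pairs and merge segments that are too short.
    segments = []
    start = 0
    for b in list(boundaries) + [len(frame_features)]:
        if b - start >= min_len or not segments:
            segments.append((start, b))
        else:
            prev_start, _ = segments.pop()
            segments.append((prev_start, b))
        start = b
    return segments


# Toy usage: 300 synthetic "frames" with three distinct activity phases.
rng = np.random.default_rng(0)
features = np.concatenate([
    rng.normal(0.0, 0.1, (100, 8)),
    rng.normal(1.0, 0.1, (100, 8)),
    rng.normal(2.0, 0.1, (100, 8)),
])
print(segment_work_steps(features))  # roughly [(0, 100), (100, 200), (200, 300)]
```

In the thesis the per-frame information would come from the hand and finger tracking described above; here synthetic feature vectors stand in so the sketch runs on its own.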