Contact person: Dr.-Ing. Nils Petersen, Alexander Lemken

Digital manuals displayed directly in the user's field of view via a head-mounted display are one of the most frequently cited application examples for Augmented Reality (AR). AR manuals can significantly simplify and accelerate maintenance, repair, or installation work on complex systems.


Intelligent Augmented Reality Handbooks

Show Them How it Works – Worker support for future factories

Digital handbooks, presented as step-by-step instructions in a head-mounted display (HMD) directly in the user's field of view, facilitate and accelerate the maintenance, repair, or installation of complex units. They explain each individual step precisely and clearly on site, can be called up at any time, reduce the safety risk to the employee, and contribute to flawless results.

DFKI's Augmented Reality research department is working on simplifying the creation of these AR handbooks by integrating AI technologies, with the aim of making them fit for real-world operations. In the past, this so-called "authoring" was generally performed manually and at correspondingly high cost. Systems often required manually prepared, scripted descriptions of the actions; furthermore, expert knowledge of the tracking system in use and of how to install tracking aids was necessary.

At the Federal Ministry of Education and Research (BMBF) exhibit stand at CeBIT this year, DFKI introduces the new AR Handbook System that allows for automated documentation and support of simple work processes by means of a lightweight system.

An integrated camera recognizes each manual action performed and superimposes previously recorded video sequences in the HMD to demonstrate the next work step. The method requires no special markers or other aids and, in contrast to many other approaches, recognizes hand movements without relying on a predefined set of gestures. Job sequences can be recorded quickly and easily and require only minimal post-processing. This technology significantly decreases the labor required to create Augmented Reality manuals and, because it is far less complex, encourages widespread use.

The authoring tool independently segments a recorded sequence into its distinguishable individual actions and then links the resulting sections with a stochastic transition model. An action observed during operation can thus be assigned in time to the corresponding section, and pointers for the subsequent section can be displayed at precisely the right moment. This kind of learning ("teach-in") is found in many areas of Artificial Intelligence and is a particularly active research topic in robotics, where it is also known in the literature as "programming by demonstration."
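The idea of assigning observed actions to sections via a transition model can be sketched as a simple left-to-right step tracker. Everything below is illustrative, not DFKI's actual implementation: the per-frame score vectors, the `stay_prob`/`advance_prob` values, and the function name `track_workflow` are assumptions made for the sake of a runnable example.

```python
# Hypothetical sketch of a left-to-right stochastic transition model for
# workflow tracking: each observation is a vector of similarity scores
# (one per recorded step), and the tracker may only hold the current step
# or advance by one, mirroring a fixed task sequence.

def track_workflow(observations, stay_prob=0.6, advance_prob=0.4):
    """Assign each observation to the most likely workflow step.

    observations: list of score vectors; observations[t][s] is the
    likelihood that frame t shows step s.
    """
    n_steps = len(observations[0])
    # Start in step 0 with its first-frame score.
    probs = [observations[0][s] if s == 0 else 0.0 for s in range(n_steps)]
    assignments = [0]
    for obs in observations[1:]:
        new_probs = []
        for s in range(n_steps):
            stay = probs[s] * stay_prob                       # remain in step s
            come = probs[s - 1] * advance_prob if s > 0 else 0.0  # advance from s-1
            new_probs.append(max(stay, come) * obs[s])
        probs = new_probs
        assignments.append(max(range(n_steps), key=lambda s: probs[s]))
    return assignments
```

With strongly peaked scores, the tracker follows the recorded step order, which is what lets the system display pointers for the next section at the right moment.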

Additionally, the method fully automatically creates semi-transparent overlays in which a "shadow image" of the pending action is displayed. Important details or supplemental pointers can be emphasized by adding graphic symbols such as arrows or lines. The simplified authoring and teach-in workflow, which can be performed by employees trained in the specific operation rather than by software experts, opens up additional fields of application, for example in quality management.
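At its core, a semi-transparent "shadow image" overlay amounts to alpha-blending a recorded frame over the live camera image. The following is a minimal sketch under that assumption; the article does not specify the actual rendering pipeline, and the pixel representation and `alpha` value here are illustrative.

```python
# Minimal alpha-blending sketch: overlay a recorded "shadow" frame of the
# pending action onto the live frame with partial opacity.

def blend_shadow(frame, shadow, alpha=0.4):
    """Alpha-blend a recorded 'shadow' frame over the live frame.

    frame, shadow: equally sized lists of grayscale pixel values (0-255).
    alpha: opacity of the shadow overlay (0 = invisible, 1 = opaque).
    """
    return [round((1 - alpha) * f + alpha * s)
            for f, s in zip(frame, shadow)]
```

In a real system the same weighted sum would be applied per color channel on full images (e.g. via an image-processing library) rather than on flat pixel lists.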

Technicians at an assembly workstation can record "reference procedures" to ensure that all future assembly activities follow the same procedural pattern. A limited version of the AR Handbook is now available for Android smartphones and tablets. This means that in the future, even private users can obtain support when assembling furniture or when installing and operating household appliances.

Nils Petersen and Didier Stricker, ‘Learning Task Structure from Video Examples for Workflow Tracking and Authoring’, in Proceedings of the International Symposium on Mixed and Augmented Reality (ISMAR), 2012

The German text is available here: PB_AV_AR-Handbuch_20130218.pdf (512.03 KB)