HCII 2022: Two papers accepted

We are pleased to announce that the Augmented Vision group presented two papers at the HCI International 2022 (HCII 2022) conference, held from June 28th to July 1st, 2022.

The two accepted papers are:

Title: Learning Effect of Lay People in Gesture-Based Locomotion in Virtual Reality

Authors: Alexander Schäfer, Gerd Reis, Didier Stricker

Abstract: Locomotion in Virtual Reality (VR) is an important part of VR applications. Many researchers have enriched the community with different techniques that enable locomotion in VR. Some of the most promising methods are gesture-based and require no additional handheld hardware. Recent work has focused mostly on user preference and performance of the different locomotion techniques, which ignores the learning effect that users go through while exploring new methods. In this work, it is investigated whether and how quickly users can adapt to a hand gesture-based locomotion system in VR. Four different locomotion techniques are implemented and tested by participants. The goal of this paper is twofold: first, to encourage researchers to consider the learning effect in their studies; second, to provide insight into the learning effect of users in gesture-based systems.

Title: Human Intelligent Machine Teaming in Single Pilot Operation: A Case Study

Authors: Nareg Minaskan Karabid, Charles-Alban Dormoy, Alain Pagani, Jean-Marc Andre, Didier Stricker

Abstract: With recent advances in artificial intelligence (AI) and learning-based systems, industries have started to integrate AI components into their products and workflows. In areas where frequent testing and development are possible, these systems have proved quite useful, such as in the automotive industry, where vehicles are now equipped with advanced driver-assistance systems (ADAS) capable of self-driving, route planning, and maintaining safe distances from lanes and other vehicles. However, the more safety-critical a task is, the more difficult and expensive it becomes to develop and test AI-based solutions. Such is the case in aviation; therefore, development must happen over longer periods of time and in a step-by-step manner. This paper focuses on creating an interface between the human pilot and a potential assistant system that helps the pilot navigate through a complex flight scenario. Verbal communication and augmented reality (AR) were chosen as the means of communication, and the verbal communication was carried out in a Wizard-of-Oz (WoOz) fashion. The interface was tested in a flight simulator, and its usefulness was evaluated with the NASA-TLX and SART questionnaires for workload and situation awareness.