News Archive

November 2020

Successful Milestone Review of the ENNOS Project

The ENNOS project integrates color and depth cameras with the capabilities of deep neural networks on a compact FPGA-based platform to create a flexible and powerful optical system with a wide range of applications in production contexts. While FPGAs offer the flexibility to adapt the system to different tasks, they also constrain the size and complexity of the neural networks. The challenge is therefore to transform the large and complex structure of modern neural networks into a small and compact FPGA architecture. To showcase the capabilities of the ENNOS concept, three scenarios have been selected: the first covers the automatic anonymization of people during remote diagnosis, the second addresses semantic 3D scene segmentation for robotic applications, and the third features an assistance system for model identification and stocktaking in large facilities.
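
As a rough illustration of the kind of compression step such a transformation implies, the sketch below applies post-training dynamic quantization to a small stand-in model in PyTorch. It is a generic example of reducing numerical precision to fit constrained hardware, not the actual ENNOS toolchain or FPGA mapping, which this announcement does not describe.

```python
# Generic illustration only: shrinking a network by post-training quantization.
# The model below is a hypothetical stand-in; the ENNOS FPGA toolchain is not
# described in this announcement and is not reproduced here.
import torch
import torch.nn as nn

# Small example network (any trained model would do).
model = nn.Sequential(
    nn.Linear(128, 64),
    nn.ReLU(),
    nn.Linear(64, 10),
)

# Post-training dynamic quantization: weights of Linear layers are stored as
# int8, roughly quartering their memory footprint compared to float32.
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

print(quantized)
```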

During the milestone review, a prototype of the ENNOS camera was presented. It integrates a color and a depth camera as well as an FPGA for executing neural networks on the device. Furthermore, solutions for all three scenarios were demonstrated successfully, with one prototype already running entirely on the ENNOS platform. This shows that the project is on track to achieve its goals and validates its fundamental approach and concept.

Project Partners:
Robert Bosch GmbH
Deutsches Forschungszentrum für Künstliche Intelligenz GmbH (DFKI)
KSB SE & Co. KGaA
ioxp GmbH
ifm electronic GmbH*
PMD Technologies AG*

*Associated Partner

Contact: Stephan Krauß
Click here to visit our project page.

Paper accepted at ISMAR 2020

We are happy to announce that our paper “TGA: Two-level Group Attention for Assembly State Detection” has been accepted for publication at the IEEE International Symposium on Mixed and Augmented Reality (ISMAR), which will take place online from November 9th to 13th. The IEEE ISMAR is the leading international academic conference in the fields of Augmented Reality and Mixed Reality. The symposium is organized and supported by the IEEE Computer Society, IEEE VGTC and ACM SIGGRAPH.

Abstract: Assembly state detection, i.e., object state detection, plays a critical role in computer vision tasks, especially in AR-assisted assembly. Unlike other object detection problems, the visual difference between different object states can be subtle. To better learn such subtle appearance differences, we propose a two-level group attention module (TGA), which consists of inter-group attention and intra-group attention. Both the relationship between feature groups and the representation within each feature group are enhanced simultaneously. We embedded the proposed TGA module in a popular object detector and evaluated it on two new datasets related to object state estimation. The results show that our proposed attention module outperforms the baseline attention module.
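
To illustrate the general idea of attention applied at two levels, the following PyTorch sketch combines an SE-style gate within each channel group with a second gate over pooled group descriptors. The grouping, pooling, and gating choices here are assumptions made for the example and do not reproduce the exact TGA design from the paper.

```python
# Illustrative sketch of two-level (inter-group + intra-group) attention.
# NOT the exact TGA module from the paper; design details are assumptions.
import torch
import torch.nn as nn

class TwoLevelGroupAttention(nn.Module):
    def __init__(self, channels: int, groups: int = 4, reduction: int = 4):
        super().__init__()
        assert channels % groups == 0
        self.groups = groups
        gc = channels // groups  # channels per group
        # Intra-group attention: SE-style gating within each group.
        self.intra = nn.Sequential(
            nn.Linear(gc, gc // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(gc // reduction, gc),
            nn.Sigmoid(),
        )
        # Inter-group attention: gating over the pooled group descriptors.
        self.inter = nn.Sequential(
            nn.Linear(groups, groups),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        g, gc = self.groups, c // self.groups
        xg = x.view(b, g, gc, h, w)
        # Global average pooling: one descriptor per channel, grouped.
        desc = xg.mean(dim=(3, 4))               # (b, g, gc)
        # Intra-group weights: reweight channels inside each group.
        intra_w = self.intra(desc)               # (b, g, gc)
        # Inter-group weights: reweight whole groups.
        inter_w = self.inter(desc.mean(dim=2))   # (b, g)
        weights = intra_w * inter_w.unsqueeze(-1)
        out = xg * weights.view(b, g, gc, 1, 1)
        return out.view(b, c, h, w)

# Example: the module only rescales channels, so it keeps feature-map shapes
# unchanged and could be dropped into a detector backbone.
attn = TwoLevelGroupAttention(channels=256, groups=4)
y = attn(torch.randn(2, 256, 32, 32))  # y has shape (2, 256, 32, 32)
```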

Authors: Hangfan Liu, Yongzhi Su, Jason Raphael Rambach, Alain Pagani, Didier Stricker

Please find our paper here.

Please also check out our YouTube video.

Contact: Yongzhi.Su@dfki.de, Jason.Rambach@dfki.de

PTC buys DFKI spin-off ioxp GmbH

PTC has acquired ioxp GmbH, a German industrial start-up for cognitive AR and AI software. ioxp is a spin-off from the Augmented Vision Department of the German Research Center for Artificial Intelligence GmbH (DFKI). For more information, click here or here (both articles are in German only).