
Kick-off meeting of the KIMBA research project

Participants of the kick-off meeting of the KIMBA research project standing in front of a mobile impact crusher from project partner KLEEMANN.


At the Digital GreenTech Conference 2023 in Karlsruhe, 14 new research projects in the fields of water management, sustainable land management, resource efficiency, and circular economy were recently presented, including KIMBA. In this project, we are working with our project partners on AI-based process control and automated, real-time quality management for recycling construction and demolition waste. This saves costs, time, and resources, and protects the environment. In this way, we support the construction industry on its way into the future.

Further information about KIMBA can be found at: https://www.ants.rwth-aachen.de/cms/IAR/Forschung/Aktuelle-Forschungsprojekte/~bdikqm/KIMBA/

Contact: Dr. Jason Rambach, Dr. Bruno Mirbach

Kick-off meeting of the ReVise-UP research project

Participants of the kick-off meeting of the ReVise-UP research project stand in front of the mining building of RWTH Aachen University.


Research project “ReVise-UP” started to improve the process efficiency of mechanical plastics recycling using sensor technology

In September 2023, the BMBF-funded research project ReVise-UP (“Improving the process efficiency of mechanical recycling of post-consumer plastic packaging waste through intelligent material flow management – implementation phase”) started. In the four-year implementation phase, the transparency and efficiency of mechanical plastics recycling are to be increased by developing and demonstrating sensor-based material flow characterization methods on an industrial scale.


Based on the data transparency generated by sensor data, current plastics recycling is to be improved through three effects: first, data transparency is intended to create positive incentives for improved collection and product quality, and thus increased recyclate quantities and qualities. Second, sensor-based material flow characteristics are to be used to adapt sorting, treatment, and plastics processing operations to fluctuating material flow properties. Third, the improved data basis should enable a holistic ecological and economic evaluation of the value chain.

A total of 18 research institutes, associations, and industrial partners are participating in ReVise-UP. The German Federal Ministry of Education and Research (BMBF) is funding ReVise-UP with €3.92 million as part of the funding guideline “Resource-Efficient Circular Economy – Plastics Recycling Technologies (KuRT)”.

More information about ReVise-UP can be found at: https://www.ants.rwth-aachen.de/cms/IAR/Forschung/Aktuelle-Forschungsprojekte/~bdueul/ReVise-UP/?lidx=1

Project partners in ReVise-UP are:

Associated partners in ReVise-UP are:

Contact: Dr. Jason Rambach, Dr. Bruno Mirbach

DFKI Augmented Vision Researchers win 3 awards in Object Pose Estimation challenge (BOP Challenge, ICCV 2023)

DFKI Augmented Vision researchers Praveen Nathan, Sandeep Inuganti, Yongzhi Su, and Jason Rambach received first-place awards in the prestigious BOP Object Pose Estimation Challenge 2023 in three categories: Overall Best RGB Method, Overall Best Segmentation Method, and Best BlenderProc-Trained Segmentation Method.

The BOP benchmark and challenge address the problem of 6-degree-of-freedom object pose estimation, which is of great importance for many applications such as robot grasping and augmented reality. This year, the BOP challenge was held within the 8th International Workshop on Recovering 6D Object Pose (R6D) (http://cmp.felk.cvut.cz/sixd/workshop_2023/) at the International Conference on Computer Vision (ICCV) in Paris, France (https://iccv2023.thecvf.com/).
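For readers unfamiliar with the task: a 6-degree-of-freedom pose combines a 3D rotation (3 degrees of freedom) with a 3D translation (3 more), and is commonly represented as a 4×4 rigid-body transform. The following minimal Python sketch (purely illustrative, not related to the winning method) shows one simple parameterization and how such a pose maps a point from object to camera coordinates:

```python
import math

def pose_matrix(yaw, tx, ty, tz):
    """Build a 4x4 rigid-body transform from a yaw rotation (radians)
    about the z-axis and a translation (tx, ty, tz). A full 6-DoF pose
    would use three rotation angles; one suffices for illustration."""
    c, s = math.cos(yaw), math.sin(yaw)
    return [
        [c, -s, 0.0, tx],
        [s,  c, 0.0, ty],
        [0.0, 0.0, 1.0, tz],
        [0.0, 0.0, 0.0, 1.0],
    ]

def apply_pose(T, point):
    """Transform a 3D point from object coordinates to camera coordinates."""
    x, y, z = point
    p = [x, y, z, 1.0]
    return [sum(T[r][c] * p[c] for c in range(4)) for r in range(3)]
```

For example, rotating the point (1, 0, 0) by 90° about z and translating by (1, 0, 0) yields (1, 1, 0); pose estimation methods solve the inverse problem of recovering such a transform from an image.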

The awards were received by Yongzhi Su and Dr. Jason Rambach on behalf of the DFKI team, followed by a short presentation of the method. The winning method was based on the CVPR 2022 paper “ZebraPose”.

The winning approach was developed by a team led by DFKI AV, with contributing researchers from Zhejiang University.

List of contributing researchers:

DFKI Augmented Vision: Praveen Nathan, Sandeep Inuganti, Yongzhi Su, Didier Stricker, Jason Rambach

Zhejiang University: Yongliang Lin, Yu Zhang

DFKI AV – Stellantis Collaboration on Radar-Camera Fusion – Papers at GCPR and EUSIPCO

DFKI Augmented Vision is collaborating with Stellantis on the topic of Radar-Camera Fusion for Automotive Object Detection using Deep Learning. Recently, two new publications were accepted to the GCPR 2023 and EUSIPCO 2023 conferences.

The two new publications are:

1. Cross-Dataset Experimental Study of Radar-Camera Fusion in Bird’s-Eye View. Proceedings of the 31st European Signal Processing Conference (EUSIPCO 2023), September 4-8, Helsinki, Finland, IEEE, 2023.

Lukas Stefan Stäcker, Philipp Heidenreich, Jason Rambach, Didier Stricker

This paper investigates the influence of the training dataset and transfer learning on camera-radar fusion approaches, showing that while the camera branch needs large and diverse training data, the radar branch benefits more from a high-performance radar.


2. RC-BEVFusion: A Plug-In Module for Radar-Camera Bird’s Eye View Feature Fusion. Proceedings of the Annual Symposium of the German Association for Pattern Recognition (DAGM GCPR 2023), September 19-22, Heidelberg, Germany, DAGM, 2023.

Lukas Stefan Stäcker, Shashank Mishra, Philipp Heidenreich, Jason Rambach, Didier Stricker

This paper introduces a new bird’s-eye-view fusion network architecture for camera-radar 3D object detection that performs favorably on the nuScenes dataset benchmark.


Contact: Dr. Jason Rambach

ICCV 2023: 4 papers accepted

We are happy to announce that the Augmented Vision group will present 4 papers at the upcoming ICCV 2023 conference, October 2-6, Paris, France. The IEEE/CVF International Conference on Computer Vision (ICCV) is the premier international computer vision event. Homepage: https://iccv2023.thecvf.com/

The 4 accepted papers are:

  1. U-RED: Unsupervised 3D Shape Retrieval and Deformation for Partial Point Clouds. Yan Di, Chenyangguang Zhang, Ruida Zhang, Fabian Manhardt, Yongzhi Su, Jason Raphael Rambach, Didier Stricker, Xiangyang Ji, Federico Tombari
  2. FeatEnHancer: Enhancing Hierarchical Features for Object Detection and Beyond Under Low-Light Vision. Khurram Azeem Hashmi, Goutham Kallempudi, Didier Stricker, Muhammad Zeshan Afzal
  3. Introducing Language Guidance in Prompt-based Continual Learning. Muhammad Gulzain Ali Khan, Muhammad Ferjad Naeem, Luc Van Gool, Federico Tombari, Didier Stricker, Muhammad Zeshan Afzal
  4. DELO: Deep Evidential LiDAR Odometry using Partial Optimal Transport. Sk Aziz Ali, Djamila Aouada, Gerd Reis, Didier Stricker

3rd place in Scan-to-BIM challenge (CV4AEC Workshop, CVPR 2023) for HumanTech project team

The team of the EU Horizon project HumanTech, consisting of Mahdi Chamseddine and Dr. Jason Rambach from DFKI Augmented Vision as well as Fabian Kaufmann from the Department of Civil Engineering at RPTU Kaiserslautern, received the 3rd-place prize in the Scan-to-BIM challenge of the Computer Vision in the Built Environment (CV4AEC) Workshop at the CVPR 2023 conference.

On June 18, the team presented their solution and results as part of the workshop program. Scan-to-BIM solutions are of great importance to the construction community, as they automate the generation of as-built models of buildings from 3D scans and can be used for quality monitoring, robotic task planning, and XR visualization, among other applications.

HumanTech project: https://humantech-horizon.eu/

CV4AEC Workshop page: https://cv4aec.github.io/

Contact: Dr. Jason Rambach, Mahdi Chamseddine

Special Session at the IEEE ARSO 2023 Conference: Human Factors in Construction Robotics

Dr. Jason Rambach, coordinator of the EU Horizon project HumanTech, co-organized a special session on “Human Factors in Construction Robotics” at the IEEE International Conference on Advanced Robotics and its Social Impacts (ARSO 2023) in Berlin, Germany (June 5-7). The special session was organized by Jason Rambach, Gabor Sziebig (Research Manager at SINTEF), and Mihoko Niitsuma (Professor at Chuo University).

The program of the special session included the following talks:

  • Serena Ivaldi (INRIA) – Teleoperating a robot for removing asbestos tiles on roofs: Insights from a pilot study
  • Jason Rambach (DFKI) – Machine perception for human-robot handover scenarios in construction
  • Patricia Helen Rosen (BAUA) – Design recommendations for construction robots – a human-centred perspective
  • Dimitrios Giakoumis (CERTH ITI) – Designing human-robot interaction interfaces for shotcrete construction robots; the RobetArme project case

HumanTech project: https://humantech-horizon.eu/

Contact: Dr. Jason Rambach

Workshop on AI and Robotics in Construction at ERF 2023

Dr. Jason Rambach, coordinator of the EU Horizon project HumanTech, co-organized a workshop on “AI and Robotics in Construction” at the European Robotics Forum 2023 in Odense, Denmark (March 14-16, 2023), in cooperation with the construction robotics projects Beeyonders and RobetArme.

Representing HumanTech, Jason Rambach presented an overview of the project objectives as well as insights into the results achieved by month 9 of the project. Patrick Roth from the partner Implenia presented the construction industry’s perspective on, and the challenges of, using robotics and AI on construction sites, while the project partners Dr. Bharath Sankaran (Naska.AI) and Dr. Gabor Sziebig (SINTEF) participated in a panel session discussing the future of robotics in construction.

Workshop schedule: https://erf2023.sdu.dk/timetable/event/ai-and-robotics-in-construction/

HumanTech project: https://humantech-horizon.eu/

Contact: Dr. Jason Rambach

Dr. Jason Rambach giving his presentation.
Article in the IEEE Robotics and Automation Letters (RA-L) journal

We are happy to announce that our article “OPA-3D: Occlusion-Aware Pixel-Wise Aggregation for Monocular 3D Object Detection” was published in the prestigious IEEE Robotics and Automation Letters (RA-L) journal. The work is a collaboration of DFKI with TU Munich and Google. The article is openly accessible at: https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=10021668

Abstract: Monocular 3D object detection has recently made a significant leap forward thanks to the use of pre-trained depth estimators for pseudo-LiDAR recovery. Yet, such two-stage methods typically suffer from overfitting and are incapable of explicitly encapsulating the geometric relation between depth and object bounding box. To overcome this limitation, we instead propose to jointly estimate dense scene depth with depth-bounding box residuals and object bounding boxes, allowing a two-stream detection of 3D objects that harnesses both geometry and context information. Thereby, the geometry stream combines visible depth and depth-bounding box residuals to recover the object bounding box via explicit occlusion-aware optimization. In addition, a bounding box based geometry projection scheme is employed in an effort to enhance distance perception. The second stream, named as the Context Stream, directly regresses 3D object location and size. This novel two-stream representation enables us to enforce cross-stream consistency terms, which aligns the outputs of both streams, and further improves the overall performance. Extensive experiments on the public benchmark demonstrate that OPA-3D outperforms state-of-the-art methods on the main Car category, whilst keeping a real-time inference speed.
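To illustrate the cross-stream consistency idea from the abstract, here is a deliberately simplified Python sketch under our own assumptions (not the authors’ implementation): each stream predicts a 3D box, and a consistency term penalizes disagreement between the two predictions, which can then be added to the training loss.

```python
def box_consistency_loss(geometry_box, context_box):
    """Mean absolute difference between the two streams' box predictions
    (e.g. center x, y, z and dimensions w, h, l). A simplified stand-in
    for the cross-stream consistency terms described in the paper."""
    assert len(geometry_box) == len(context_box)
    n = len(geometry_box)
    return sum(abs(g - c) for g, c in zip(geometry_box, context_box)) / n
```

During training, minimizing such a term pushes the geometry stream (depth-based) and the context stream (direct regression) toward agreeing box estimates, so each stream regularizes the other.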

Yongzhi Su, Yan Di, Guangyao Zhai, Fabian Manhardt, Jason Rambach, Benjamin Busam, Didier Stricker, and Federico Tombari, “OPA-3D: Occlusion-Aware Pixel-Wise Aggregation for Monocular 3D Object Detection.” IEEE Robotics and Automation Letters (2023).

Contacts: Yongzhi Su, Dr. Jason Rambach

Radar Driving Activity Dataset (RaDA) Released

DFKI Augmented Vision recently released the first publicly available UWB Radar Driving Activity Dataset (RaDA), consisting of over 10k data samples from 10 different participants, annotated with 6 driving activities. The dataset was recorded in the DFKI driving simulator environment. For more information and to download the dataset, please visit the project website: https://projects.dfki.uni-kl.de/rada/
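Since the dataset groups samples by participant, a natural evaluation protocol for such data is a per-subject split, so that a classifier is tested on people it never saw during training. A minimal Python sketch of a leave-one-subject-out split (illustrative only; the paper’s exact protocol may differ):

```python
def leave_one_subject_out(samples, held_out):
    """Split (participant_id, features, label) samples into train and test
    sets, holding out all samples of one participant. Testing on unseen
    subjects avoids overestimating accuracy on person-specific cues."""
    train = [s for s in samples if s[0] != held_out]
    test = [s for s in samples if s[0] == held_out]
    return train, test
```

Iterating `held_out` over all 10 participants and averaging the per-fold accuracies gives a subject-independent performance estimate.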

The dataset release is accompanied by an article publication at the Sensors journal:

Brishtel, Iuliia, Stephan Krauss, Mahdi Chamseddine, Jason Raphael Rambach, and Didier Stricker. “Driving Activity Recognition Using UWB Radar and Deep Neural Networks.” Sensors 23, no. 2 (2023): 818.

Contacts: Dr. Jason Rambach, Iuliia Brishtel