KIMBA

AI-based process control and automated quality management in the recycling of construction and demolition waste through sensor-based inline monitoring of particle size distributions

With 587.4 million t/a of aggregates used, the construction industry is one of the most resource-intensive sectors in Germany. Substituting primary aggregates with recycled (RC) aggregates conserves natural resources and reduces negative environmental impacts such as greenhouse gas emissions by up to 85%. So far, however, RC building materials cover only 12.5 wt% of the aggregate demand, at 73.3 million t/a, and with a use of 53.9 million t/a (73.5 wt%) they have so far been applied mainly in underground construction.

To secure and expand the ecological advantages of RC building materials, it is therefore crucial that more demanding building construction applications can also be served by RC building materials in the future. For this purpose, a sufficient quality of RC building materials must be guaranteed on the one hand, and customer acceptance must be ensured on the other hand through guaranteed compliance with the standards applicable to building construction. An essential quality criterion for RC building materials is the particle size distribution (PSD) according to DIN 66165-1, which is currently determined by manual sampling and sieve analyses, a procedure that is time-consuming and costly. In addition, analysis results are only available with a considerable time delay. Consequently, it is neither possible to react to quality changes at an early stage, nor can treatment processes be parameterized directly in response to changed material flow properties.

This is where the KIMBA project steps in: instead of time-consuming and costly sampling and sieve analyses, PSD analysis in construction waste processing plants is to be automated in the future by sensor-based inline monitoring. The RC material produced is measured inline during processing using imaging sensor technology. Deep-learning algorithms then segment the measured material heap into individual particles, whose grain sizes are predicted and aggregated into a digital PSD. The sensor-based PSDs are to be used intelligently to increase the quality, and thus the acceptance, of RC building materials and hence to accelerate the transition to a sustainable circular economy.

Building on the proof of concept, two applications will be developed and demonstrated on an industrial scale: an automated quality management system that continuously records the PSD of the produced RC product in order to document it to customers and to allow early intervention in the process in case of deviations, and an AI-based assistance system that enables adaptive control of the treatment process on the basis of sensor-monitored PSDs and machine parameters, so that consistently high product qualities can be produced even with fluctuating input qualities.
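
The last step of the measurement chain described above is the aggregation of per-particle size predictions into a cumulative particle size distribution. The following minimal sketch illustrates only this aggregation step under simplifying assumptions (equivalent-sphere volumes as a proxy for particle mass, illustrative sieve sizes and function names); it is not the project's actual implementation.

```python
import numpy as np

def aggregate_psd(particle_diameters_mm, sieve_sizes_mm):
    """Aggregate per-particle size predictions into a cumulative PSD.

    Assumption: each segmented particle is approximated as a sphere, so its
    mass is proportional to d**3. A real system would use calibrated
    volume/mass models instead.
    """
    d = np.asarray(particle_diameters_mm, dtype=float)
    weights = d ** 3                       # volume-proportional weighting
    passing = []
    for s in sieve_sizes_mm:
        frac = weights[d <= s].sum() / weights.sum()
        passing.append(100.0 * frac)       # cumulative percent passing
    return np.array(passing)

if __name__ == "__main__":
    # Hypothetical per-particle diameters predicted from the segmented heap
    rng = np.random.default_rng(0)
    diameters = rng.lognormal(mean=1.5, sigma=0.5, size=2000)  # mm
    sieves = [2, 4, 8, 16, 31.5, 63]                           # mm
    for s, p in zip(sieves, aggregate_psd(diameters, sieves)):
        print(f"<= {s:5.1f} mm : {p:5.1f} % passing")
```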

Partners

MAV Krefeld GmbH
Institut für Anthropogene Stoffkreisläufe (ANTS)
Deutsches Forschungszentrum für Künstliche Intelligenz (DFKI)
KLEEMANN GmbH
Lehrstuhl für International Production Engineering and Management (IPEM) der Universität Siegen
Point 8 GmbH
vero – Verband der Bau- und Rohstoffindustrie e.V.
Verband Deutscher Maschinen- und Anlagenbau e.V. (VDMA)

Contact

Dr. Bruno Walter Mirbach

Dr.-Ing. Jason Raphael Rambach

Revise-UP

Improving the process efficiency of mechanical recycling of post-consumer plastic packaging waste through intelligent material flow management

At 3.2 million tonnes per year, post-consumer packaging waste represents the most significant plastic waste stream in Germany. Despite progress to date, mechanical plastics recycling still has considerable potential for improvement: in 2021, only about 27 wt% (1.02 million Mg/a) of post-consumer plastics could be converted into recyclates, and only about 12 wt% (0.43 million Mg/a) served as substitutes for virgin plastics (Conversio Market & Strategy GmbH, 2022).

So far, mechanical plastics recycling has been limited by the high effort of manual material flow characterisation, which leads to a lack of transparency along the value chain. During the ReVise concept phase, it was shown that post-consumer material flows can be characterised automatically using inline sensor technology. The subsequent four-year ReVise implementation phase (ReVise-UP) will explore the extent to which sensor-based material flow characterisation can be implemented on an industrial scale to increase transparency and efficiency in plastics recycling.

Three main effects are expected from this increased data transparency. Firstly, positive incentives for improving collection and product qualities are to be created in order to increase the quality and use of plastic recyclates. Secondly, sensor-based material flow characteristics are to be used to adapt sorting, treatment and plastics processing operations to fluctuating material flow properties, which promises a considerable increase in the efficiency of the existing technical infrastructure. Thirdly, the improved data situation should enable a holistic ecological and economic evaluation of the entire value chain. As a result, technical investments can be targeted more precisely to systematically optimise both ecological and economic benefits.

Our goal is to fundamentally improve the efficiency, cost-effectiveness and sustainability of post-consumer plastics recycling.

Partners

Deutsches Forschungszentrum für Künstliche Intelligenz GmbH
Deutsches Institut für Normung e. V.
Human Technology Center der RWTH Aachen University
Hündgen Entsorgungs GmbH & Co. KG
Krones AG
Kunststoff Recycling Grünstadt GmbH
SKZ – KFE gGmbH
STADLER Anlagenbau GmbH
Wuppertal Institut für Klima, Umwelt, Energie gGmbH
PreZero Recycling Deutschland GmbH & Co. KG
bvse – Bundesverband Sekundärrohstoffe und Entsorgung e. V.
cirplus GmbH
HC Plastics GmbH
Henkel AG
Initiative „Mülltrennung wirkt“
Procter & Gamble Service GmbH
TOMRA Sorting GmbH

Contact

Dr. Bruno Walter Mirbach

Dr.-Ing. Jason Raphael Rambach

TWIN4TRUCKS

TWIN4TRUCKS – Digital Twin and AI in the Networked Factory for Integrated Commercial Vehicle Production, Logistics and Quality Assurance

The research project Twin4Trucks (T4T) started on 1 September 2022. It combines scientific research and industrial implementation in a unique way. The project consortium consists of six companies from research and industry: Daimler Truck AG (DTAG) is the consortium leader of the project. As the world's largest commercial vehicle manufacturer, it aims to optimise its production with the help of Twin4Trucks by implementing new technologies such as digital twins and a Digital Foundation Layer (DFL). The technology initiative SmartFactory Kaiserslautern (SF-KL) and the German Research Center for Artificial Intelligence (DFKI), as visionary research institutions, set the direction of development with Production Level 4. The IT service provider Atos is responsible for data exchange via Gaia-X, quality assurance using AI methods, and the implementation concept of the DFL. Infosys is responsible for the network architecture, 5G networks and integration services. PFALZKOM is setting up a regional edge cloud as well as a data centre, and additionally contributes the Gaia-X implementation and operating concepts for the networks.

Contact

Simon Bergweiler

Dr.-Ing. Jason Raphael Rambach

HumanTech

Human Centered Technologies for a Safer and Greener European Construction Industry

The European construction industry faces three major challenges: improving its productivity, increasing the safety and wellbeing of its workforce, and shifting towards a green, resource-efficient industry. To address these challenges adequately, HumanTech proposes a human-centered approach involving breakthrough technologies such as wearables for worker safety and support, and intelligent robotic technology that can harmoniously co-exist with human workers while also contributing to the green transition of the industry.

Our aim is to achieve major advances beyond the current state of the art in all of these technologies, advances that can have a disruptive effect on the way construction is conducted.

These advances will include:

Introduction of robotic devices equipped with vision and intelligence to enable them to navigate autonomously and safely in a highly unstructured environment, collaborate with humans and dynamically update a semantic digital twin of the construction site.

Intelligent, unobtrusive worker protection and support equipment, ranging from exoskeletons triggered by wearable body-pose and strain sensors to wearable cameras and XR glasses that provide real-time worker localisation and guidance for the efficient and accurate fulfilment of their tasks.

An entirely new breed of Dynamic Semantic Digital Twins (DSDTs) of construction sites, simulating in detail the current state of a construction site at the geometric and semantic level, based on an extended BIM formulation (BIMxD).
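
To make the DSDT idea more concrete, the sketch below shows one possible, purely illustrative data structure for a digital twin element that combines geometry, semantic information and a timestamp for dynamic updates. All names and fields are assumptions for illustration and are not taken from the project or the BIMxD specification.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Tuple
import time

@dataclass
class DSDTElement:
    """Hypothetical element of a dynamic semantic digital twin (DSDT)."""
    element_id: str                                   # e.g. a BIM GUID
    semantic_label: str                               # e.g. "wall", "rebar", "scaffold"
    mesh_vertices: List[Tuple[float, float, float]]   # current geometric state
    properties: Dict[str, str] = field(default_factory=dict)  # semantic attributes
    last_updated: float = field(default_factory=time.time)

    def update_geometry(self, new_vertices):
        """Apply a geometry update, e.g. from a robot's on-site scan."""
        self.mesh_vertices = list(new_vertices)
        self.last_updated = time.time()

# Minimal usage example: a wall element updated after a new scan
wall = DSDTElement("guid-0001", "wall", [(0, 0, 0), (4, 0, 0), (4, 0, 3), (0, 0, 3)])
wall.update_geometry([(0, 0, 0), (4.02, 0, 0), (4.02, 0, 3), (0, 0, 3)])
```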

Partners

Hypercliq IKE
Technische Universität Kaiserslautern
Scaled Robotics SL
Bundesanstalt für Arbeitsschutz und Arbeitsmedizin
Sci-Track GmbH
SINTEF Manufacturing AS
Acciona Construccion SA
STAM SRL
Holo-Industrie 4.0 Software GmbH
Fundacion Tecnalia Research & Innovation
Catenda AS
Technological University of the Shannon: Midlands Midwest
Ricoh International BV
Australo Interinnov Marketing Lab SL
Prinstones GmbH
Universita degli Studi di Padova
European Builders Confederation
Palfinger Structural Inspection GmbH
Züricher Hochschule für Angewandte Wissenschaften
Implenia Schweiz AG
Kajima Corporation

Contact

Dr. Bruno Walter Mirbach

Dr.-Ing. Jason Raphael Rambach

GreifbAR

Tangible Reality – dexterous interaction of user hands and fingers with real tools in mixed reality worlds

On 1 October 2021, the research project GreifbAR started under the leadership of DFKI (research area Augmented Reality). The goal of the GreifbAR project is to make mixed reality (MR) worlds, including virtual reality (VR) and augmented reality (AR), tangible and graspable by allowing users to interact with real and virtual objects with their bare hands. Hand accuracy and dexterity are paramount for performing precise tasks in many fields, but the capture of hand-object interaction in current MR systems is woefully inadequate. Current systems rely on hand-held controllers or capture devices that are limited to hand gestures without contact with real objects. GreifbAR overcomes this limitation by introducing a sensing system that captures both the full hand grip, including the hand surface, and the object pose when users interact with real objects or tools. This sensing system will be integrated into a mixed reality training simulator that will be demonstrated in two relevant use cases: industrial assembly and surgical skills training. The usability and applicability, as well as the added value for training situations, will be thoroughly analysed through user studies.
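
As a purely illustrative sketch of what captured hand-object interaction data could look like per frame (21 hand keypoints as commonly used in hand pose estimation, a 6-DoF object pose and per-finger contact flags), the structure below shows one possible layout. All field names are assumptions and not the project's actual data format.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class HandObjectFrame:
    """One hypothetical frame of captured hand-object interaction."""
    timestamp: float
    hand_keypoints: List[Tuple[float, float, float]]    # e.g. 21 3D hand joints
    object_translation: Tuple[float, float, float]       # object pose: position (m)
    object_rotation: Tuple[float, float, float, float]   # object pose: quaternion (x, y, z, w)
    finger_contacts: List[bool]                           # contact flag per finger

# Minimal usage example with placeholder values
frame = HandObjectFrame(
    timestamp=0.033,
    hand_keypoints=[(0.0, 0.0, 0.0)] * 21,
    object_translation=(0.10, -0.02, 0.35),
    object_rotation=(0.0, 0.0, 0.0, 1.0),
    finger_contacts=[True, True, False, False, False],
)
```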

Partners

Berliner Charité (University Medicine Berlin)
NMY (mixed reality applications for industrial and communication customers)
Uni Passau (Chair of Psychology with a focus on human-machine interaction)

Contact

Dr. Dipl.-Inf. Gerd Reis

Dr.-Ing. Nadia Robertini

AuRoRas

Automotive Robust Radar Sensing

Radar sensors are very important in the automotive industry because they can directly measure the speed of other road users. DFKI is working with its partners to develop intelligent software solutions that improve the performance of high-resolution radar sensors. We use machine learning and deep neural networks to detect ghost targets in radar data, thereby improving the sensors' reliability and opening up a wide range of possibilities for highly automated driving.
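
A minimal sketch of the kind of approach described above: a small neural network that classifies individual radar detections as real or ghost targets from simple per-detection features (range, azimuth, radial velocity, radar cross-section). The feature choice, architecture and synthetic training data are assumptions for illustration, not the project's actual model.

```python
import torch
import torch.nn as nn

# Per-detection features: [range, azimuth, radial_velocity, rcs]
model = nn.Sequential(
    nn.Linear(4, 32), nn.ReLU(),
    nn.Linear(32, 32), nn.ReLU(),
    nn.Linear(32, 1),            # logit: ghost (1) vs. real (0)
)

loss_fn = nn.BCEWithLogitsLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# Synthetic stand-in data; in practice these would be labelled radar detections.
x = torch.randn(256, 4)
y = torch.randint(0, 2, (256, 1)).float()

for epoch in range(10):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()

# Inference: probability that a new detection is a ghost target
with torch.no_grad():
    p_ghost = torch.sigmoid(model(torch.randn(1, 4)))
    print(f"ghost probability: {p_ghost.item():.2f}")
```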

Partners

ASTYX GmbH (Dr. Georg Kuschk), Lise-Meitner-Straße 2a, 85521, Ottobrunn, DE

BIT Technology Solutions GmbH (Management), Gewerbering 3, 83539 Pfaffing OT Forsting, DE

Contact

Dr.-Ing. Jason Raphael Rambach

VIZTA

Vision, Identification, with Z-sensing Technologies and key Applications

The VIZTA project, coordinated by STMicroelectronics, aims at developing innovative technologies in the field of optical sensors and laser sources for short- to long-range 3D imaging and at demonstrating their value in several key applications, including automotive, security, smart buildings, mobile robotics for smart cities, and Industry 4.0. The key differentiating 12-inch silicon sensing technologies developed during VIZTA are:

1. Innovative SPAD and lock-in pixels for time-of-flight sensor architectures.
2. Unprecedented and cost-effective NIR and RGB-Z on-chip filter solutions.
3. Complex RGB+Z pixel architectures for multimodal 2D/3D imaging.

For short-range sensors: advanced VCSEL sources including wafer-level GaAs optics and associated high-speed drivers. These differentiating technologies allow the development and validation of innovative 3D imaging sensor products with the following highly integrated prototype demonstrators:

1. High-resolution (>77,000 points) time-of-flight ranging sensor module with integrated VCSEL, drivers, filters and optics.
2. Very high-resolution (VGA minimum) depth camera sensor with integrated filters and optics.
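
As background for the time-of-flight ranging listed above, the sketch below shows the standard depth calculation of an indirect (lock-in pixel) time-of-flight sensor, where depth is derived from the phase shift between emitted and received modulated light. The modulation frequency and phase values are illustrative, and the code is not specific to the VIZTA devices.

```python
import numpy as np

C = 299_792_458.0  # speed of light in m/s

def itof_depth(phase_rad, f_mod_hz):
    """Depth from phase shift for an indirect time-of-flight (lock-in pixel) sensor.

    depth = c * phase / (4 * pi * f_mod); unambiguous up to c / (2 * f_mod).
    """
    return C * phase_rad / (4.0 * np.pi * f_mod_hz)

f_mod = 60e6                                  # 60 MHz modulation (illustrative)
print("unambiguous range: %.2f m" % (C / (2 * f_mod)))
for phi in (0.5, 1.0, 2.0):                   # measured phase shifts in radians
    print(f"phase {phi:.1f} rad -> depth {itof_depth(phi, f_mod):.2f} m")
```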

For medium- and long-range sensing, VIZTA also addresses new LiDAR systems with dedicated sources, optics and sensors. Technology developments of sensors and emitters are carried out by leading semiconductor product suppliers (STMicroelectronics, Philips, III-V Lab) with the support of equipment suppliers (Applied Materials, Semilab) and the RTO CEA-Leti.

The VIZTA project also includes the development of 6 demonstrators for key applications including automotive, security, smart buildings, mobile robotics for smart cities, and Industry 4.0, with a good mix of industrial and academic partners (Ibeo, Veoneer, Ficosa, Beamagine, IEE, DFKI, UPC, Idemia, CEA-List, ISD, BCB, IDE, Eurecat). The VIZTA consortium brings together 23 partners from 9 European countries: France, Germany, Spain, Greece, Luxembourg, Latvia, Sweden, Hungary, and the United Kingdom.

Partners

Universidad Politecnica Catalunya
Commissariat à l'Energie Atomique et aux Energies Alternatives (CEA Paris)
Fundacio Eurecat
STMICROELECTRONICS SA
BCB Informática y Control
Alter Technology TÜV Nord SA
FICOMIRRORS SA
Philips Photonics GmbH
Applied Materials France SARL
SEMILAB FELVEZETO FIZIKAI LABORATORIUM RESZVENYTARSASAG
ELEKTRONIKAS UN DATORZINATNU INSTITUTS
LUMIBIRD
IEE S.A.
IBEO Automotive Systems GmbH
STMICROELECTRONICS RESEARCH & DEVELOPMENT LTD
STMICROELECTRONICS SA
IDEMIA IDENTITY & SECURITY FRANCE
Beamagine S.L.
Integrated Systems Development S.A.
VEONEER SWEDEN AB
III-V Lab
STMICROELECTRONICS (ALPS) SAS
STMICROELECTRONICS GRENOBLE 2 SAS

Contact

Dr.-Ing. Jason Raphael Rambach

Be-greifen

Comprehensible, interactive experiments: practice and theory in STEM studies

The project is funded by the Federal Ministry of Education and Research (BMBF). It combines tangible, manipulable objects ("tangibles") with advanced technologies such as augmented reality to develop new, intuitive user interfaces. Interactive experiments are intended to actively support the learning process in STEM studies and to provide learners with additional theoretical information about the physics involved.

The project uses the interfaces of smartphones, smartwatches and smart glasses. One example is a head-worn data device that allows users to select content through a combination of subtle head movements, eyebrow gestures and voice commands and to view it on a display mounted above the eye. Thanks to this casual form of information access, students are not distracted while carrying out the experiment and can still reach and manipulate the objects.

A preliminary study demonstrates these developments. Scientists at DFKI and at the Technical University of Kaiserslautern have developed an app that supports students in determining the relationship between the fill level of a glass and the pitch of the sound it produces. The gPhysics application captures the amount of water, measures the sound frequency and transfers the results into a diagram. The app can be operated purely by head gestures, without manual interaction. In gPhysics, the water quantity is recorded with a camera, and the determined value can be corrected by head gestures or voice commands if required. The microphone of the Google Glass measures the sound frequency. Both values are displayed in a graph that is continuously updated on the Google Glass display. In this way, learners can follow the frequency curve in relation to the water level directly while filling the glass. Since the curve is generated comparatively quickly, learners can test different hypotheses directly during the interaction by varying parameters of the experiment.
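
To illustrate the frequency measurement step that gPhysics performs with the Google Glass microphone, the following minimal sketch estimates the dominant frequency of a short audio buffer with an FFT and pairs it with a fill-level reading. The synthetic signal and function names are illustrative assumptions, not the app's actual code.

```python
import numpy as np

def dominant_frequency(samples, sample_rate):
    """Return the strongest frequency component (Hz) of a mono audio buffer."""
    spectrum = np.abs(np.fft.rfft(samples * np.hanning(len(samples))))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    return freqs[np.argmax(spectrum[1:]) + 1]   # skip the DC bin

# Synthetic stand-in for a microphone buffer: a 440 Hz tone with noise.
sample_rate = 16_000
t = np.arange(0, 0.5, 1.0 / sample_rate)
buffer = np.sin(2 * np.pi * 440 * t) + 0.05 * np.random.randn(t.size)

fill_level_percent = 42.0                       # e.g. estimated from the camera image
frequency = dominant_frequency(buffer, sample_rate)
print(f"fill level {fill_level_percent:.0f} % -> dominant frequency {frequency:.1f} Hz")
```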

In the project, further experiments on the physical fundamentals of mechanics and thermodynamics are being developed. In addition, the consortium is developing technologies that enable learners to discuss video and sensor recordings, to analyse their experiments in a cloud, and to exchange ideas with fellow students or compare results.

Partners

DFKI coordinates five further partners from research and practice: the Technical University of Kaiserslautern, studio klv GmbH & Co. KG from Berlin, the University of Stuttgart, Con Partners GmbH from Bremen and Embedded Systems Academy GmbH from Barsinghausen.

Funding programme: German BMBF

  • Begin: 01.07.2016
  • End: 30.06.2019

Contact

Dr.-Ing. Jason Raphael Rambach