FLUENTLY

Fluently – the essence of human-robot interaction

Fluently leverages the latest advancements in AI-driven decision-making to achieve true social collaboration between humans and machines while adapting to extremely dynamic manufacturing contexts. The Fluently Smart Interface unit features: 1) interpretation of speech content, speech tone and gestures, automatically translated into robot instructions, making industrial robots accessible to any skill profile; 2) assessment of the operator’s state through a dedicated sensor infrastructure that complements persistent context awareness to enrich an AI-based behavioural framework in charge of triggering the generation of specific robot strategies; 3) modelling of products and production changes in a way that robots can recognize, interpret and match in cooperation with humans. Robots equipped with Fluently will not only continuously accommodate humans’ physical and cognitive loads, but will also learn and build experience with their human teammates to establish a manufacturing practice built on quality and wellbeing.
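
To make the first feature concrete, the sketch below shows how speech intent, gesture, and operator state could be fused into a single robot instruction. It is a minimal illustration under stated assumptions: the intent labels, the RobotCommand structure, and the fusion rules are hypothetical and not part of the Fluently specification.

    # Hypothetical sketch: fusing speech intent and gesture into a robot instruction.
    # Intent labels, gesture targets, and RobotCommand are illustrative assumptions,
    # not the actual Fluently interface.
    from dataclasses import dataclass

    @dataclass
    class RobotCommand:
        action: str          # e.g. "pick", "hand_over", "pause"
        target: str | None   # object referred to by speech or pointing
        speed_factor: float  # slowed down when the operator seems loaded

    def fuse(intent: str, spoken_target: str | None,
             pointed_target: str | None, stress_level: float) -> RobotCommand:
        """Combine speech, gesture, and operator state into one command."""
        # Gesture resolves ambiguous references ("take *that* one" + pointing).
        target = spoken_target or pointed_target
        # High cognitive load -> reduce robot speed to ease collaboration.
        speed = 1.0 if stress_level < 0.5 else 0.5
        if intent == "stop":
            return RobotCommand("pause", None, 0.0)
        return RobotCommand(intent, target, speed)

    print(fuse("pick", None, "battery_cell_3", stress_level=0.7))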

FLUENTLY targets three large-scale industrial value chains playing an instrumental role in the present and future manufacturing industry in Europe: 1) lithium cell battery dismantling and recycling (fully manual); 2) inspection and repair of aerospace engines (partially automated); 3) laser-based multi-techs for complex metal component manufacturing, from joining and cutting to additive manufacturing and surface functionalization (fully automated in the equipment but strongly dependent upon human process assessment).

Partners

  • REPLY DEUTSCHLAND SE (Reply), Germany,
  • STMICROELECTRONICS SRL (STM), Italy,
  • BIT & BRAIN TECHNOLOGIES SL (BBR), Spain,
  • MORPHICA SOCIETA A RESPONSABILITA LIMITATA (MOR), Italy,
  • IRIS SRL (IRIS), Italy,
  • SYSTHMATA YPOLOGISTIKIS ORASHS IRIDA LABS AE (IRIDA), Greece,
  • GLEECHI AB (GLE), Sweden,
  • FORENINGEN ODENSE ROBOTICS (ODE), Denmark,
  • TRANSITION TECHNOLOGIES PSC SPOLKA AKCYJNA (TT), Poland,
  • MALTA ELECTROMOBILITY MANUFACTURING LIMITED (MEM), Malta,
  • POLITECNICO DI TORINO (POLITO), Italy,
  • DEUTSCHES FORSCHUNGSZENTRUM FUR KUNSTLICHE INTELLIGENZ GMBH (DFKI), Germany,
  • TECHNISCHE UNIVERSITEIT EINDHOVEN (TUe), Netherlands,
  • SYDDANSK UNIVERSITET (SDU), Denmark,
  • COMPETENCE INDUSTRY MANUFACTURING 4.0 SCARL (CIM), Italy,
  • PRIMA ADDITIVE SRL (PA), Italy,
  • SCUOLA UNIVERSITARIA PROFESSIONALE DELLA SVIZZERA ITALIANA (SUPSI), Switzerland,
  • MCH-TRONICS SAGL (MCH), Switzerland,
  • FANUC SWITZERLAND GMBH (FANUC Europe), Switzerland,
  • UNIVERSITY OF BATH (UBAH), United Kingdom,
  • WASEDA UNIVERSITY (WUT), Japan

Contact

Dipl.-Inf. Bernd Kiefer

Dr.-Ing. Alain Pagani

CORTEX2

Cooperative Real-Time Experience with Extended Reality

The consortium of CORTEX2 — “COoperative Real-Time EXperiences with EXtended reality” — is proud to announce the official start of this European initiative, funded by the European Commission under the Horizon Europe research and innovation programme.

The COVID-19 pandemic pushed individuals and companies worldwide to work primarily from home or completely change their work model in order to stay in business. The share of employees who usually or sometimes work from home rose from 14.6% to 24.4% between 2019 and 2021. In Europe, the proportion of people who work remotely went from 5% to 40% as a result of the pandemic. Today, all the signs are that remote work is here to stay: 72% of employees say their organization is planning some form of permanent teleworking in the future, and 97% would like to work remotely, at least part of their working day, for the rest of their career. But not all organizations are ready to adapt to this new reality, where team collaboration is vital.

Existing services and applications aimed at facilitating remote team collaboration — from video conferencing systems to project management platforms — are not yet ready to efficiently and effectively support all types of activities. And extended reality (XR)-based tools, which can enhance remote collaboration and communication, present significant challenges for most businesses.

The mission of CORTEX2 is to democratize access to the remote collaboration offered by next-generation XR experiences across a wide range of industries and SMEs.

To this aim, CORTEX2 will provide:

  • Full support for AR experiences as an extension of video conferencing systems when using heterogeneous service end devices, through a novel Mediation Gateway platform.
  • Resource-efficient teleconferencing tools through innovative transmission methods and automatic summarization of shared long documents.
  • Easy-to-use and powerful XR experiences with instant 3D reconstruction of environments and objects, and simplified use of natural gestures in collaborative meetings.
  • Fusion of vision and audio for multichannel semantic interpretation and enhanced tools such as virtual conversational agents and automatic meeting summarization.
  • Full integration of Internet of Things (IoT) devices into XR experiences to optimize interaction with running systems and processes.
  • Optimal extension possibilities and broad adoption by delivering the core system with open APIs and launching open calls to enable further technical extensions, more comprehensive use cases, and deeper evaluation and assessment.

Overall, we will invest a total of 4 million euros in two open calls, which will be aimed at: recruiting tech startups/SMEs to co-develop CORTEX2; engaging new use cases from different domains to demonstrate CORTEX2 replication through specific integration paths; and assessing and validating the social impact associated with XR technology adoption in internal and external use cases.

The first call will be published in October 2023 and will collect two types of applications: co-development and use case. The second will be published in April 2024, targeting only co-development projects.

The CORTEX2 consortium is formed by 10 organizations in 7 countries, which will work together for 36 months. The German Research Center for Artificial Intelligence (DFKI) leads the consortium.

More information on the Project Website: https://cortex2.eu

Partners

  1. DFKI – Deutsches Forschungszentrum für Künstliche Intelligenz GmbH, Germany
  2. LINAGORA – GSO, France
  3. ALE – Alcatel-Lucent Entreprise International, France
  4. ICOM – Intracom SA Telecom Solutions, Greece
  5. AUS – AUSTRALO Alpha Lab MTÜ, Estonia
  6. F6S – F6S Network Limited, Ireland
  7. KUL – Katholieke Universiteit Leuven, Belgium
  8. CEA – Commissariat à l’énergie atomique et aux énergies alternatives, France
  9. ACT – Actimage GmbH, Germany
  10. UJI – Universitat Jaume I De Castellon, Spain

Contact

Dr.-Ing. Alain Pagani

HAIKU

Human AI teaming Knowledge and Understanding for aviation safety

It is essential, both for safe operations and for society in general, that the people who currently keep aviation so safe can work with, train and supervise these AI systems, and that future autonomous AI systems make judgements and decisions that would be acceptable to humans. HAIKU will pave the way for human-centric AI by developing new AI-based ‘Digital Assistants’, and associated Human-AI Teaming practices, guidance and assurance processes, via the exploration of interactive AI prototypes in a wide range of aviation contexts.

Therefore, HAIKU will:

  1. Design and develop a set of AI assistants, demonstrated in the different use cases.
  2. Develop a comprehensive Human Factors design guidance and methods capability (‘HF4AI’) on how to develop safe, effective and trustworthy Digital Assistants for Aviation, integrating and expanding on existing state-of-the-art guidance.
  3. Conduct controlled experiments with high operational relevance – illustrating the tasks, roles, autonomy and team performance of the Digital Assistant in a range of normal and emergency scenarios.
  4. Develop new safety and validation assurance methods for Digital Assistants, to facilitate early integration into aviation systems by aviation stakeholders and regulatory authorities.
  5. Deliver guidance on socially acceptable AI in safety-critical operations, and for maintaining aviation’s strong safety record.

Partners

  1. Deep Blue (DBL), Italy
  2. EUROCONTROL (ECTL), Belgium
  3. FerroNATS Air Traffic Services (FerroNATS), Spain
  4. Center for Human Performance Research (CHPR), Netherlands
  5. Linköping University (LiU), Sweden
  6. Thales AVS (TAVS), France
  7. Institut Polytechnique de Bordeaux (Bordeaux INP), France
  8. Centre Aquitain des Technologies de l’Information Electroniques (CATIE), France
  9. Deutsches Forschungszentrum für Künstliche Intelligenz (DFKI), Germany
  10. Engineering Ingegneria Informatica SpA (ENG), Italy
  11. Luftfartsverket, Air Navigation Service Provider Sweden (LFV), Sweden
  12. Ecole Nationale de l’Aviation Civile (ENAC), France
  13. TUI Airways Ltd (TUI), United Kingdom
  14. Suite5 Data Intelligence Solutions Limited (Suite5), Cyprus
  15. Airholding SA (EMBRT), Portugal
  16. Embraer SA (EMBSA), Brazil
  17. Ethniko Kentro Erevnas Kai Technologikis Anaptyxis (CERTH), Greece
  18. London Luton Airport Operations Ltd (LLA), United Kingdom

Contact

Nareg Minaskan Karabid

Dr.-Ing. Alain Pagani

HERON

Self-referenced Mobile Collaborative Robotics applied to collaborative and flexible production systems

The project will deliver a complete novel vision-guided mobile robotic solution to automate assembly and screwdriving operations in final assembly, which are currently performed manually. The solution will include a robotic cell integrating real-time process control to guarantee process quality, together with a digital twin platform for accurate process simulation and trajectory optimization to minimize setup time and increase flexibility. A demonstrator will be built for system validation, performing quality control procedures and the screwing of automotive parts onto the chassis of a vehicle.
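
A minimal sketch of the kind of vision-guided correction such a solution relies on: a nominal screw pose, defined relative to the part, is re-expressed against the part pose the camera actually detected. The pose representation and function names are illustrative assumptions, not HERON's actual interfaces.

    # Illustrative sketch of vision-guided pose correction for screwdriving
    # (assumed workflow; HERON's actual interfaces are not described here).
    import numpy as np

    def correct_target(T_nominal_screw: np.ndarray,
                       T_nominal_part: np.ndarray,
                       T_detected_part: np.ndarray) -> np.ndarray:
        """Re-express a nominal screw pose relative to the part pose the
        vision system actually detected (all poses: 4x4 homogeneous
        matrices in the robot base frame)."""
        # Screw pose in the part's own frame (fixed by the CAD model):
        T_part_screw = np.linalg.inv(T_nominal_part) @ T_nominal_screw
        # Transfer it onto the detected part pose:
        return T_detected_part @ T_part_screw

    # Example: part shifted 5 mm along x -> screw target shifts identically.
    T_part = np.eye(4)
    T_screw = np.eye(4); T_screw[:3, 3] = [0.10, 0.02, 0.00]
    T_detected = np.eye(4); T_detected[0, 3] = 0.005
    print(correct_target(T_screw, T_part, T_detected)[:3, 3])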

Partners

Aldakin S.L (Spain)

Simumatik A.B (Sweden)

Visometry GmbH (Germany)

Contact

Dr.-Ing. Alain Pagani

TWIN4TRUCKS

TWIN4TRUCKS – Digital Twin and AI in the Networked Factory for Integrated Commercial Vehicle Production, Logistics and Quality Assurance

The research project Twin4Trucks (T4T) started on 1 September 2022. It combines scientific research and industrial implementation in a unique way. The project consortium consists of six organizations from research and industry. Daimler Truck AG (DTAG), the world's largest commercial vehicle manufacturer, is the consortium leader; with the help of Twin4Trucks, its production is to be optimized through the implementation of new technologies such as digital twins and a Digital Foundation Layer (DFL). The technology initiative SmartFactory Kaiserslautern (SF-KL) and the German Research Center for Artificial Intelligence (DFKI), as visionary research institutions, set the direction of development with Production Level 4. The IT service provider Atos is responsible for data exchange via Gaia-X, for AI-based quality assurance and for the implementation concept of the DFL. Infosys is responsible for the network architecture, 5G networks and integration services. PFALZKOM is building a regional edge cloud as well as a data center, complemented by the Gaia-X implementation and operating concepts for networks.

Contact

Simon Bergweiler

Dr.-Ing. Jason Raphael Rambach

AI-Observer

Artificial Intelligence for Earth Observation Twinning

Artificial Intelligence (AI) has a major impact on many sectors, and its influence is predicted to expand rapidly in the coming years. One area with considerable untapped potential for AI is the field of Earth Observation (EO), where it can be used to manage large datasets, find new insights in data and generate new products and services. AI is one of the missing core areas that need to be integrated into the EO capabilities of the ERATOSTHENES Centre of Excellence (ECoE).

The AI-OBSERVER project aims to significantly strengthen and stimulate the scientific excellence and innovation capacity of the ECoE, as well as its research management and administrative skills, through several capacity-building activities on AI for EO applications in the Disaster Risk Reduction thematic area. The project will upgrade and modernise the ECoE's existing Resilient Society department as well as its research management and administration departments, and will assist the ECoE in reaching its long-term objective of raised excellence in AI for EO on environmental hazards. A close and strategic partnership between the ECoE from Cyprus (a Widening country) and two internationally top-class research institutions, the German Research Centre for Artificial Intelligence (DFKI) from Germany and the University of Rome Tor Vergata (UNITOV) from Italy, will lead to a research exploratory project on the application of AI to EO for multi-hazard monitoring and assessment in Cyprus. Moreover, CELLOCK Ltd. (CLK), the project's industrial partner, will lead the commercialisation, exploitation and product development aspects of AI-OBSERVER and its exploratory project outputs. All outputs will be disseminated and communicated to stakeholders, the research community and the public, assisting the ECoE in accomplishing its exploitation goals by creating strong links with stakeholders from academia and industry in Cyprus and beyond, on which the ECoE will capitalise long after the end of the project.

Partners

ERATOSTHENES Centre of Excellence (ECoE), Cyprus (coordinator)

DFKI, Germany

University of Rome Tor Vergata, Italy

CELLOCK Ltd., Cyprus

Contact

Dr. Dipl.-Inf. Gerd Reis

OrthoSuPer

Secure Data Platform and Intelligent Sensor Technology for the Orthopaedic Care of the Future

Musculoskeletal disorders, above all back and knee conditions, are among the most frequently diagnosed illnesses in Germany. Being partly work-related, they are responsible for a large share of work incapacity, leading to considerable costs for companies and the healthcare system. The largest share of health insurers' benefit expenditure in this context falls on prescribed physiotherapy. For the affected patients this means a considerable loss of quality of life: the consequences are pain as well as lengthy diagnostic and therapy processes, which in turn are tied to numerous treatment appointments and referrals to the respective physicians and physiotherapists.

OrthoSuPer aims to develop an intelligent wearable and a computer vision technology for orthopaedic cases such as knee rehabilitation and orthopaedic device fittings. A shared data platform in the form of an app will address physicians, physiotherapists and orthopaedic technicians as well as patients. Integrating mobile motion analysis via the wearable and a markerless camera-based system into the digital process chain yields enormous advantages for diagnostics and for monitoring the course of the disease, treatment progress and therapy cycles through to the end of therapy. Aftercare in the form of rehabilitation, check-ups and prevention can also be substantially improved. This considerably eases the workload of the treating physicians and therapists.

In Germany, around 60,000 patients per year are fitted with a total knee endoprosthesis. According to health insurers, roughly 30% of these procedures, corresponding to about 20,000 patients per year, could be avoided or postponed through a smart wearable. The expertise of the individual process participants is linked through digitalized, automated processes, for example through secure communication and documentation via the mobile application. Digitalization saves time in patient care and makes the overall care process more transparent for everyone involved; the data help to objectify outcomes and at the same time improve orthopaedic care. Patients benefit from considerably less effort and from higher-quality diagnostics, therapy and aftercare, and they gain substantially in control and independence. For health insurers and the healthcare system, OrthoSuPer can contribute significantly to optimizing the resources deployed through networking and transparent medical processes across the entire patient journey.
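
As one concrete example of the kind of measure such a motion analysis could stream to the data platform, the sketch below computes a knee flexion angle from three 3D joint positions. The joint inputs and the metric choice are illustrative assumptions, not the project's actual pipeline.

    # Minimal sketch: knee flexion angle from 3D joint positions, the kind
    # of quantity a wearable or markerless camera system could provide.
    # Joint names and data source are illustrative assumptions.
    import numpy as np

    def knee_flexion_deg(hip: np.ndarray, knee: np.ndarray,
                         ankle: np.ndarray) -> float:
        """Angle between thigh and shank vectors; 0 deg = fully extended leg."""
        thigh = hip - knee
        shank = ankle - knee
        cosang = np.dot(thigh, shank) / (
            np.linalg.norm(thigh) * np.linalg.norm(shank))
        # 180 deg between the vectors means a straight leg, so report the
        # deviation from straight as the flexion angle.
        return 180.0 - np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

    print(knee_flexion_deg(np.array([0, 1.0, 0]),
                           np.array([0, 0.5, 0]),
                           np.array([0, 0.0, 0.1])))  # slightly bent knee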

Partners

Deutsches Forschungszentrum für Künstliche Intelligenz GmbH

Ottobock SE & Co. KGaA

SIMI Reality Motion Systems GmbH

wearHEALTH, TU Kaiserslautern (TUK)

Routine Health GmbH

Orthopädisches Krankenhaus Schloss Werneck

Contact

Michael Lorenz, M.Sc.

Daniela Wittmann

SHARESPACE

Embodied Social Experiences in Hybrid Shared Spaces

SHARESPACE will demonstrate a radically new technology for promoting ethical and social interaction in eXtended Reality (XR) Shared Hybrid Spaces (SHS), anchored in human sensorimotor communication. Our core concept is to identify and segment social sensorimotor primitives and reconstruct them in hybrid settings to build continuous, embodied, and rich human-avatar experiences.

To achieve this, three interconnected science-towards-technology breakthroughs will be delivered:

  • novel computational cognitive architectures,
  • a unique self-calibrating body sensor network, and
  • a fully mobile spatial Augmented Reality (AR) and virtual human rendering.

We will create a library of social motion primitives and use them to design the AI-based architectures of our artificial agents. SHARESPACE mobile capturing technologies combine loosely coupled visual-inertial tracking of full-body kinematics, hand pose and facial expression, incorporating novel neural encoding/decoding functionalities, together with local context-aware animations and highly realistic neural rendering.
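
The sketch below illustrates the basic idea of matching an observed motion window against such a primitive library by nearest-neighbour distance. The primitive names, feature representation, and random placeholder templates are assumptions for illustration only.

    # Hedged sketch: matching an observed motion window against a library of
    # social motion primitives. Names and features are illustrative, not
    # SHARESPACE's actual representation.
    import numpy as np

    library = {  # placeholder templates: (frames, features) pose descriptors
        "wave":  np.random.default_rng(0).normal(size=(30, 8)),
        "nod":   np.random.default_rng(1).normal(size=(30, 8)),
        "point": np.random.default_rng(2).normal(size=(30, 8)),
    }

    def classify(window: np.ndarray) -> str:
        """Return the library primitive closest to the observed window."""
        dists = {name: np.linalg.norm(window - tmpl)
                 for name, tmpl in library.items()}
        return min(dists, key=dists.get)

    observed = np.random.default_rng(1).normal(size=(30, 8))  # matches "nod"
    print(classify(observed))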

Our technology will be iteratively tested in two proofs of principle involving human and artificial agents interacting in SHS, and in three real-world use case scenarios in health, sport and art. We will demonstrate a fully functional prototype of SHARESPACE tailored to the agents’ personalized characteristics (gender, culture and social dispositions). SHARESPACE will support community building and exploitation with concrete initiatives, including (i) public engagement around our research and innovation, and (ii) promotion of high-tech innovation and early transfer to our deep-tech companies, as premises for the consolidation of human-centric and sovereign European market areas such as industrial AR and SHS, eHealth and tele-health. Our long-term vision is to bring XR to a radically new level of presence and sociality by reconstructing sensorimotor primitives that enable ethical, trusted and inclusive modes of social interaction.

Partners

  • DEUTSCHES FORSCHUNGSZENTRUM FUR KUNSTLICHE INTELLIGENZ GMBH (DFKI), Germany
  • UNIVERSITE DE MONTPELLIER (UM), France
  • CRDC NUOVE TECNOLOGIE PER LE ATTIVITA PRODUTTIVE SCARL (CRdC), Italy
  • UNIVERSITAETSKLINIKUM HAMBURG-EPPENDORF (UKE), Germany
  • ALE INTERNATIONAL (ALE), France
  • UNIVERSITAT JAUME I DE CASTELLON (UJI), Spain
  • GOLAEM SA (GOLAEM), France
  • SIA LIGHTSPACE TECHNOLOGIES (LST), Latvia
  • CYENS CENTRE OF EXCELLENCE (CYENS), Cyprus
  • RICOH INTERNATIONAL BV (RICOH), Netherlands
  • INSTITUT NATIONAL DE RECHERCHE EN INFORMATIQUE ET AUTOMATIQUE (INRIA), France
  • ARS ELECTRONICA LINZ GMBH & CO KG (AE), Austria
  • FUNDACIO HOSPITAL UNIVERSITARI VALL D’HEBRON -INSTITUT DE RECERCA (VHIR), Spain

Contact

Prof. Dr. Didier Stricker

KIMBA

AI-based Process Control and Automated Quality Management in the Recycling of Construction and Demolition Waste through Sensor-based Inline Monitoring of Particle Size Distributions

With 587.4 million t/a of aggregates used, the construction industry is one of the most resource-intensive sectors in Germany. By substituting primary aggregates with recycled (RC) aggregates, natural resources are conserved and negative environmental impacts such as greenhouse gas emissions are reduced by up to 85%. So far, RC building materials cover only 12.5 wt% of the aggregate demand, at 73.3 million t/a. At 53.9 million t/a (73.5 wt%), their use has so far been limited mainly to underground construction applications. In order to secure and expand the ecological advantages of RC building materials, it is therefore crucial that more demanding applications in building construction can also be covered by RC building materials in the future. For this purpose, a sufficient quality of RC building materials must be guaranteed on the one hand, and on the other hand customer acceptance must be ensured through guaranteed compliance with the standards applicable to building construction.

An essential quality criterion for RC building materials is the particle size distribution (PSD) according to DIN 66165-1, which is currently determined by manual sampling and sieve analysis, a time-consuming and costly procedure. In addition, analysis results are only available with a considerable time delay. Consequently, it is neither possible to react to quality changes at an early stage, nor can treatment processes be parameterized directly to changed material flow properties.

This is where the KIMBA project steps in: instead of time-consuming and costly sampling and sieve analyses, PSD analysis in construction waste processing plants is to be automated by sensor-based inline monitoring. The RC material produced will be measured inline during processing using imaging sensor technology. Deep-learning algorithms then segment the measured heap into individual particles, whose grain sizes are predicted and aggregated into a digital PSD. The sensor-based PSDs are then to be used intelligently to increase the quality, and thus the acceptance, of RC building materials and hence accelerate the transition to a sustainable circular economy. Building on this proof of concept, two applications will be developed and demonstrated on a large scale: an automated quality management system that continuously records the PSD of the produced RC product in order to document it for customers and to allow early intervention in the process in case of deviations, and an AI-based assistance system that enables adaptive control of the treatment process on the basis of sensor-monitored PSDs and machine parameters, so that consistently high product qualities can be produced even with fluctuating input qualities.
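
As an illustration of the final aggregation step, the sketch below turns per-particle size predictions into a cumulative PSD over sieve classes. The sieve cut points are common aggregate gradings and the count-based weighting is a simplification; the classes and weighting actually used in KIMBA may differ.

    # Illustrative sketch: aggregating per-particle size predictions into a
    # cumulative particle size distribution (PSD) over sieve classes.
    import numpy as np

    def cumulative_psd(particle_sizes_mm: np.ndarray,
                       sieves_mm=(2, 4, 8, 16, 32, 63)) -> dict:
        """Percent of particles passing each sieve (count-based; a mass-based
        PSD would weight each particle by its estimated volume)."""
        n = len(particle_sizes_mm)
        return {s: 100.0 * np.sum(particle_sizes_mm < s) / n for s in sieves_mm}

    # Grain sizes (mm) as they might come out of the segmentation stage:
    sizes = np.array([1.2, 3.5, 5.0, 7.8, 12.0, 20.0, 28.0, 40.0])
    for sieve, passing in cumulative_psd(sizes).items():
        print(f"< {sieve} mm: {passing:.1f} % passing")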

Partners

MAV Krefeld GmbH

Institut für Anthropogene Stoffkreisläufe (ANTS)

Deutsches Forschungszentrum für Künstliche Intelligenz (DFKI)

KLEEMANN GmbH

Lehrstuhl für International Production Engineering and Management (IPEM) der Universität Siegen

Point 8 GmbH

vero – Verband der Bau- und Rohstoffindustrie e.V.

Verband Deutscher Maschinen- und Anlagenbau e.V. (VDMA)

Contact

Dr. Bruno Walter Mirbach

Dr.-Ing. Jason Raphael Rambach

Revise-UP

Improving the Process Efficiency of Mechanical Recycling of Post-Consumer Plastic Packaging Waste through Intelligent Material Flow Management

At 3.2 million tonnes per year, post-consumer packaging waste represents the most significant plastic waste stream in Germany. Despite progress to date, mechanical plastics recycling still has significant potential for improvement: in 2021, only about 27 wt% (1.02 million t/a) of post-consumer plastics could be converted into recyclates, and only about 12 wt% (0.43 million t/a) served as substitutes for virgin plastics (Conversio Market & Strategy GmbH, 2022).

So far, mechanical plastics recycling has been limited by the high effort of manual material flow characterisation, which leads to a lack of transparency along the value chain. During the ReVise concept phase, it was shown that post-consumer material flows can be characterised automatically using inline sensor technology. The subsequent four-year ReVise implementation phase (ReVise-UP) will explore the extent to which sensor-based material flow characterisation can be implemented on an industrial scale to increase transparency and efficiency in plastics recycling.
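
As a minimal illustration of what sensor-based characterisation yields, the sketch below aggregates per-object classifications from an inline sensor into a composition of the material flow. The polymer labels and detection tuples are placeholder assumptions, not the sensor technology actually deployed in ReVise-UP.

    # Sketch under assumptions: turning per-object sensor classifications
    # into a material flow composition. Labels and the classifier itself are
    # placeholders for whatever inline sensor ReVise-UP deploys.
    from collections import Counter

    detections = [  # (timestamp_s, predicted_polymer) from an inline sensor
        (0.1, "PET"), (0.3, "PE"), (0.7, "PP"), (1.2, "PET"),
        (1.8, "PS"), (2.2, "PE"), (2.9, "PET"),
    ]

    def composition(dets) -> dict:
        """Share of each polymer in the observed stream (count-based)."""
        counts = Counter(label for _, label in dets)
        total = sum(counts.values())
        return {polymer: counts[polymer] / total for polymer in counts}

    print(composition(detections))  # e.g. {'PET': 0.43, 'PE': 0.29, ...}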

Three main effects are expected from this increased data transparency. Firstly, positive incentives for improving collection and product qualities should be created in order to increase the quality and use of plastic recyclates. Secondly, sensor-based material flow characteristics are to be used to adapt sorting, treatment and plastics processing processes to fluctuating material flow properties. This promises a considerable increase in the efficiency of the existing technical infrastructure. Thirdly, the improved data situation should enable a holistic ecological and economic evaluation of the entire value chain. As a result, technical investments can be used in a more targeted manner to systematically optimise both ecological and economic benefits.

Our goal is to fundamentally improve the efficiency, cost-effectiveness and sustainability of post-consumer plastics recycling.

Partners

Deutsches Forschungszentrum für Künstliche Intelligenz GmbH

Deutsches Institut für Normung e. V.

Human Technology Center der RWTH Aachen University

Hündgen Entsorgungs GmbH & Co. KG

Krones AG

Kunststoff Recycling Grünstadt GmbH

SKZ – KFE gGmbH

STADLER Anlagenbau GmbH

Wuppertal Institut für Klima, Umwelt, Energie gGmbH

PreZero Recycling Deutschland GmbH & Co. KG

bvse – Bundesverband Sekundärrohstoffe und Entsorgung e. V.

cirplus GmbH

HC Plastics GmbH

Henkel AG

Initiative „Mülltrennung wirkt“

Procter & Gamble Service GmbH

TOMRA Sorting GmbH

Contact

Dr. Bruno Walter Mirbach

Dr.-Ing. Jason Raphael Rambach