Let there be IMU data: generating training data for wearable, motion sensor based activity recognition from monocular RGB videos
Vitor Fortes Rey, Peter Hevesi, Onorina Kovalenko, Paul Lukowicz
UbiComp/ISWC '19 Adjunct: Adjunct Proceedings of the 2019 ACM International Joint Conference on Pervasive and Ubiquitous Computing and the 2019 ACM International Symposium on Wearable Computers (UbiComp 2019), September 9-14, London, United Kingdom, pages 699-708, ACM, September 2019.
Abstract:
Recent advances in Machine Learning, in particular Deep Learning, have been driving rapid progress in fields such as computer vision and natural language processing. Human activity recognition (HAR) using wearable sensors, which has been a thriving research field for the last 20 years, has benefited much less from such advances. This is largely due to the lack of adequate amounts of labeled training data. In this paper we propose a method to mitigate the labeled-data problem in wearable HAR by generating wearable motion data from monocular RGB videos, which can be collected from popular video platforms such as YouTube. Our method works by extracting 2D poses from video frames and then using a regression model to map them to sensor signals. We have validated it on fitness exercises as the target domain for activity recognition and shown that we can improve a HAR model using data produced from a YouTube video.
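The pipeline described in the abstract — per-frame 2D pose extraction followed by a regression model that maps poses to wearable sensor signals — can be sketched roughly as follows. This is a minimal illustrative sketch, not the authors' actual model: the finite-difference pose features and the ridge-regression mapping are assumptions chosen for simplicity, and the random arrays stand in for real pose estimates and IMU recordings.

```python
import numpy as np

def poses_to_features(poses):
    """Turn a (T, J, 2) sequence of 2D joint positions into per-frame features.

    Velocity and acceleration via finite differences are a crude proxy for
    the motion information an IMU would sense (illustrative assumption).
    """
    vel = np.gradient(poses, axis=0)
    acc = np.gradient(vel, axis=0)
    feats = np.concatenate([poses, vel, acc], axis=-1)  # (T, J, 6)
    return feats.reshape(len(poses), -1)                # (T, J*6)

def fit_pose_to_imu(X, Y, reg=1e-3):
    """Ridge regression mapping pose features X (T, D) to sensor channels Y (T, C)."""
    XtX = X.T @ X + reg * np.eye(X.shape[1])
    return np.linalg.solve(XtX, X.T @ Y)  # weight matrix (D, C)

# Toy demo with synthetic data standing in for video poses and IMU ground truth.
rng = np.random.default_rng(0)
T, J, C = 100, 13, 6                     # frames, joints, sensor channels
poses = rng.normal(size=(T, J, 2)).cumsum(axis=0)  # smooth-ish trajectories
X = poses_to_features(poses)
true_W = rng.normal(size=(X.shape[1], C))
Y = X @ true_W + 0.01 * rng.normal(size=(T, C))    # noisy "sensor" targets

W = fit_pose_to_imu(X, Y)
pred = X @ W  # synthetic sensor signals generated from poses
```

In the paper's setting the regression model is trained on videos paired with real sensor recordings; afterwards it can synthesize training signals for unlabeled videos, which is what allows HAR models to benefit from public video data.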