Simple and effective deep hand shape and pose regression from a single depth image
Muhammad Jameel Nawaz Malik, Ahmed Elhayek, Fabrizio Nunnari, Didier Stricker
Computers & Graphics (CAG), Vol. 85, pp. 85-91, Elsevier, October 2019.
Abstract:

Simultaneously estimating the 3D shape and pose of a hand in real time is a new and challenging computer graphics problem, which is important for animation and interactions with 3D objects in virtual environments with personalized hand shapes. CNN-based direct hand pose estimation methods are the state-of-the-art approaches, but they can only regress a 3D hand pose from a single depth image. In this study, we developed a simple and effective real-time CNN-based direct regression approach for simultaneously estimating the 3D hand shape and pose, as well as structure constraints for both egocentric and third-person viewpoints, by learning from synthetic depth. In addition, we produced the first million-scale egocentric synthetic dataset, called SynHandEgo, which contains egocentric depth images with accurate shape and pose annotations, as well as color segmentation of the hand parts. Our network is trained on combined real and synthetic datasets with full supervision of the hand pose and structure constraints, and semi-supervision of the hand mesh. Our approach performed better than the state-of-the-art methods on the SynHand5M synthetic dataset in terms of both 3D shape and pose recovery. By learning simultaneously from real and synthetic data, we demonstrated the feasibility of hand mesh recovery on two real hand pose datasets, i.e., BigHand2.2M and NYU. Moreover, our method obtained more accurate estimates of the 3D hand poses on the NYU dataset compared with the existing methods that output more than joint positions. The SynHandEgo dataset has been made publicly available to promote further research in the emerging domain of hand shape and pose recovery from egocentric viewpoints (https://bit.ly/2WMWM5u).
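The abstract describes a single network that directly regresses both a 3D hand pose (joint positions) and a hand mesh from one depth image, with bone-structure constraints used as an auxiliary supervision signal. The sketch below is not the authors' code: the network is abstracted as one linear map, and the image size, joint count, vertex count, and `bone_pairs` chain are illustrative assumptions. It only shows the output layout of such a direct regressor and one plausible way to phrase a structure-constraint loss.

```python
import numpy as np

# Hedged sketch (not the paper's implementation): direct regression of
# J 3D joints and V mesh vertices from a single depth image. A linear
# map stands in for the CNN; all sizes below are assumptions.

J, V = 21, 1193          # joint and vertex counts (illustrative)
H, W = 96, 96            # input depth-image resolution (assumed)

rng = np.random.default_rng(0)
weights = rng.standard_normal((H * W, 3 * (J + V))) * 1e-3

def regress(depth_img):
    """Map a depth image to (joints, vertices) in one direct regression."""
    out = depth_img.reshape(-1) @ weights
    joints = out[: 3 * J].reshape(J, 3)
    verts = out[3 * J:].reshape(V, 3)
    return joints, verts

def structure_loss(joints, bone_pairs, ref_lengths):
    """Penalize deviation of predicted bone lengths from reference
    lengths -- one way to encode the 'structure constraints' the
    abstract mentions (the exact formulation is in the paper)."""
    lengths = np.array([np.linalg.norm(joints[a] - joints[b])
                        for a, b in bone_pairs])
    return float(np.mean((lengths - ref_lengths) ** 2))

depth = rng.standard_normal((H, W))
joints, verts = regress(depth)
bones = [(0, 1), (1, 2), (2, 3)]  # toy kinematic chain, not the real one
loss = structure_loss(joints, bones, np.ones(3))
print(joints.shape, verts.shape, loss >= 0.0)
```

In training, a loss like `structure_loss` would be added to the fully supervised joint-position loss, while the mesh branch can be trained semi-supervised where no vertex annotations exist, as the abstract describes.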