OnEye Tracking Framework 2

{youtube}UeBj1fmiZ-Y{/youtube}

OnEye Generic Object Tracking Framework – Tracking examples – 2011-2012
Clothes tracking (Catwalk sequence 2)


Papers:
1. “OnEye — Producing and broadcasting generalized interactive video”, Alain Pagani, Christian Bailer and Didier Stricker, Proceedings of the Networked and Electronic Media Summit (NEM Summit), 2013
2. “A user supported tracking framework for interactive video production”, Christian Bailer, Alain Pagani and Didier Stricker, Proceedings of the European Conference on Visual Media Production (CVMP), 2013

AlphaView – Sneak Peek

{youtube}u9QZbPETu7c{/youtube}

AlphaView was built from the ground up to provide a state-of-the-art rendering solution for real-time image-based lighting (IBL). This video gives a sneak peek at our solution, which covers the full pipeline: capturing a 360° spherical image of the surroundings, computing a spherical harmonics-based visibility function for environment lighting, and a screen-space scene effect solution for IBL rendering of any 3D scene at about 60 FPS, where geometry, illumination and shading materials can all be tweaked in real time.

A more technical presentation will be forthcoming.
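In the meantime, the spherical harmonics part of the pipeline can be made more concrete with a small illustrative Python sketch. This is not AlphaView code: the function names, the Monte-Carlo sampling scheme and the lat-long environment-map layout are our own assumptions. The sketch projects a captured environment map onto the nine second-order SH coefficients and evaluates the resulting diffuse irradiance for a surface normal, following the standard nine-coefficient irradiance formulation.

```python
import numpy as np

def sh_basis(d):
    """Real-valued spherical harmonics basis up to band 2 (9 coefficients) for unit directions."""
    x, y, z = d[..., 0], d[..., 1], d[..., 2]
    return np.stack([
        0.282095 * np.ones_like(x),          # Y_0^0
        0.488603 * y,                        # Y_1^-1
        0.488603 * z,                        # Y_1^0
        0.488603 * x,                        # Y_1^1
        1.092548 * x * y,                    # Y_2^-2
        1.092548 * y * z,                    # Y_2^-1
        0.315392 * (3 * z * z - 1),          # Y_2^0
        1.092548 * x * z,                    # Y_2^1
        0.546274 * (x * x - y * y),          # Y_2^2
    ], axis=-1)

def project_environment(env, samples=4096):
    """Monte-Carlo projection of a lat-long HDR environment map (H x W x 3) onto 9 SH coefficients."""
    rng = np.random.default_rng(0)
    u, v = rng.random(samples), rng.random(samples)
    theta = np.arccos(1 - 2 * u)             # polar angle, uniform over the sphere
    phi = 2 * np.pi * v                      # azimuth
    dirs = np.stack([np.sin(theta) * np.cos(phi),
                     np.cos(theta),
                     np.sin(theta) * np.sin(phi)], axis=-1)
    h, w, _ = env.shape
    px = np.clip((phi / (2 * np.pi) * w).astype(int), 0, w - 1)
    py = np.clip((theta / np.pi * h).astype(int), 0, h - 1)
    radiance = env[py, px]                    # sampled radiance, (samples, 3)
    basis = sh_basis(dirs)                    # SH basis values, (samples, 9)
    return 4 * np.pi / samples * basis.T @ radiance   # (9, 3) RGB coefficients

def diffuse_irradiance(coeffs, normal):
    """Diffuse irradiance for a unit normal using the 9-coefficient weights (pi, 2pi/3, pi/4)."""
    a = np.array([3.141593, 2.094395, 2.094395, 2.094395,
                  0.785398, 0.785398, 0.785398, 0.785398, 0.785398])
    return (a[:, None] * coeffs * sh_basis(np.asarray(normal))[:, None]).sum(axis=0)
```

In a real-time renderer such as the one shown, these nine RGB coefficients would be recomputed from the captured panorama and evaluated per pixel in a shader; the offline Python version only illustrates the underlying math.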

Visual-inertial head tracking in cars

{youtube}VAwocrga__o{/youtube}

Online Ergonomic Assessment in an Industrial Environment

{youtube}O_3yhNenyN8{/youtube}

Nowadays, ergonomic evaluation of manual workflows is mostly based on subjective assessment and is performed offline. This video demonstrates a system that provides objective measures for global ergonomic evaluation and even permits real-time assessment and feedback. The system continuously estimates the worker's motions based on a body sensor network and derives global biomechanical scores using the ergonomic tool Rapid Upper Limb Assessment (RULA). Based on this, the user receives visual and acoustic feedback in real time through a head-mounted display, which permits the worker to correct his posture immediately and thereby decrease the risk of musculoskeletal disorders. Moreover, the ergonomic scores are documented for offline analysis. The system could be used for planning, optimizing or training new workflows. It has been developed within the European project COGNITO (www.ict-cognito.org) in close cooperation between experts in signal processing, biomechanics and end-user requirements.
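As a rough illustration of how streamed joint angles can be turned into an ergonomic score and immediate feedback, here is a minimal Python sketch. It is not the COGNITO implementation: the data fields, thresholds and the reduction of RULA to a single upper-arm score are simplified assumptions for illustration only.

```python
from dataclasses import dataclass

@dataclass
class UpperBodyPose:
    """Joint angles in degrees, as estimated from a body sensor network (illustrative fields)."""
    upper_arm_flexion: float
    lower_arm_flexion: float
    wrist_flexion: float

def rula_upper_arm_score(flexion_deg: float) -> int:
    """Simplified RULA step 1: score upper-arm flexion/extension only, without posture adjustments."""
    a = abs(flexion_deg)
    if a <= 20:                     # 20 deg extension to 20 deg flexion
        return 1
    if flexion_deg < -20 or a <= 45:  # stronger extension, or 20-45 deg flexion
        return 2
    if a <= 90:                     # 45-90 deg flexion
        return 3
    return 4                        # more than 90 deg flexion

def assess_stream(poses, warn_at=3):
    """Score each incoming pose and yield (score, feedback) pairs for real-time display."""
    for pose in poses:
        score = rula_upper_arm_score(pose.upper_arm_flexion)
        feedback = "lower your arm" if score >= warn_at else "ok"
        yield score, feedback

if __name__ == "__main__":
    stream = [UpperBodyPose(15, 80, 5), UpperBodyPose(70, 95, 10), UpperBodyPose(110, 100, 20)]
    for score, msg in assess_stream(stream):
        print(score, msg)
```

A full RULA assessment additionally combines arm, wrist, neck, trunk and leg scores through lookup tables and posture adjustments; the sketch deliberately omits these steps.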

Contact: Gabriele.Bleser@dfki.de

Full-body motion tracking with 10 wireless IMUs

{youtube}tBnS0NPdvvQ{/youtube}