AlphaView – Sneak Peek

{youtube}u9QZbPETu7c{/youtube}

AlphaView was built from the ground up to provide a state-of-the-art rendering solution for real-time image-based lighting (IBL). This video gives a sneak peek at our pipeline: a 360° spherical image of the surroundings is captured, a spherical harmonics-based visibility function is computed for environment lighting, and a screen-space effect pass performs the IBL rendering of any 3D scene at about 60 FPS. Geometry, illumination and shading materials can all be tweaked in real time.
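As background to the spherical-harmonics step mentioned above, the standard approach (following Ramamoorthi and Hanrahan's irradiance environment maps) projects the captured environment image onto the first nine real spherical harmonics. The sketch below shows that projection for an equirectangular panorama; it is a minimal illustration of the general technique, not AlphaView's actual implementation, and the function names are our own.

```python
import numpy as np

def sh_basis(x, y, z):
    # First 9 real spherical harmonics (bands 0-2), using the standard
    # normalisation constants from Ramamoorthi & Hanrahan.
    return np.stack([
        0.282095 * np.ones_like(x),
        0.488603 * y,
        0.488603 * z,
        0.488603 * x,
        1.092548 * x * y,
        1.092548 * y * z,
        0.315392 * (3.0 * z * z - 1.0),
        1.092548 * x * z,
        0.546274 * (x * x - y * y),
    ], axis=-1)

def project_env_to_sh(env):
    """Project an equirectangular environment map (H x W x 3) onto
    9 SH coefficients per colour channel (returns a 9 x 3 array)."""
    h, w, _ = env.shape
    theta = (np.arange(h) + 0.5) / h * np.pi          # polar angle of each row
    phi = (np.arange(w) + 0.5) / w * 2.0 * np.pi      # azimuth of each column
    theta, phi = np.meshgrid(theta, phi, indexing="ij")
    # Direction vector for every pixel.
    x = np.sin(theta) * np.cos(phi)
    y = np.sin(theta) * np.sin(phi)
    z = np.cos(theta)
    basis = sh_basis(x, y, z)                          # H x W x 9
    # Solid angle covered by each pixel (sin(theta) weighting).
    d_omega = np.sin(theta) * (np.pi / h) * (2.0 * np.pi / w)
    # Integrate each colour channel against each basis function.
    return np.einsum("hw,hwk,hwc->kc", d_omega, basis, env)
```

A renderer then evaluates the same nine basis functions per shaded surface normal and dots them with these coefficients, which is what makes relighting cheap enough for real time.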

A more technical presentation will be forthcoming.

Visual-inertial head tracking in cars

{youtube}VAwocrga__o{/youtube}

Online Ergonomic Assessment in an Industrial Environment

{youtube}O_3yhNenyN8{/youtube}

Nowadays, the ergonomic evaluation of manual workflows is mostly based on subjective assessment and is performed offline. This video demonstrates a system that provides objective measures for global ergonomic evaluation and even permits real-time assessment and feedback. The system continuously estimates the worker's motions using a body sensor network and derives global biomechanical scores with the ergonomic tool Rapid Upper Limb Assessment (RULA). Based on these scores, the user receives visual and acoustic feedback in real time through a head-mounted display, allowing the worker to correct their posture immediately and so reduce the risk of musculoskeletal disorders. The ergonomic scores are also documented for offline analysis, so the system can be used for planning, optimizing or training new workflows. It was developed within the European project COGNITO (www.ict-cognito.org) in close cooperation between experts in signal processing, biomechanics and end-user requirements.
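To give a flavour of the RULA scoring mentioned above: the method maps measured joint angles into small integer sub-scores that are then combined through lookup tables. The sketch below shows only one such sub-score, for the upper arm, with the angle bands taken from the published RULA worksheet; it is an illustrative simplification, not the COGNITO system's actual scoring code, and the function name and parameters are our own.

```python
def rula_upper_arm_score(flexion_deg, shoulder_raised=False,
                         abducted=False, supported=False):
    """RULA upper-arm sub-score from the shoulder flexion angle (degrees).

    Positive angles are flexion (arm forward), negative are extension.
    Angle bands follow the standard RULA worksheet.
    """
    if -20.0 <= flexion_deg <= 20.0:
        score = 1                      # near-neutral posture
    elif flexion_deg < -20.0 or flexion_deg <= 45.0:
        score = 2                      # strong extension or moderate flexion
    elif flexion_deg <= 90.0:
        score = 3
    else:
        score = 4                      # arm raised above shoulder height
    # Posture modifiers defined by RULA.
    if shoulder_raised:
        score += 1
    if abducted:
        score += 1
    if supported:
        score -= 1                     # arm supported or person leaning
    return max(score, 1)
```

In a full RULA assessment, sub-scores like this one for the arm, wrist, neck and trunk are combined via the method's lookup tables into a single action level; streaming joint angles from the body sensor network through such a mapping is what makes per-frame feedback possible.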

Contact: Gabriele.Bleser@dfki.de

Full-body motion tracking with 10 wireless IMUs

{youtube}tBnS0NPdvvQ{/youtube}

AR Handbook – Installing RAM

{youtube}eRX9aqAPkB8{/youtube}

Nils Petersen and Didier Stricker, ‘Learning Task Structure from Video Examples for Workflow Tracking and Authoring’, in Proceedings of the International Symposium on Mixed and Augmented Reality (ISMAR), 2012