A user supported object tracking framework for interactive video production
Christian Bailer, Alain Pagani, Didier Stricker
Journal of Virtual Reality and Broadcasting
- Abstract:
- We present a user-supported tracking framework that combines automatic tracking with extended user input to create error-free tracking results suitable for interactive video production. The goal of our approach is to keep the necessary user input as small as possible. In our framework, the user can select between different tracking algorithms, both existing ones and new ones described in this paper. Furthermore, the user can automatically fuse the results of different tracking algorithms with our robust fusion approach. The tracked object can be marked in more than one frame, which can significantly improve the tracking result. After tracking, the user can validate the results easily, thanks to the support of a powerful interpolation technique. The tracking results are iteratively improved until the complete track has been found. After the iterative editing process, the tracking result of each object is stored in an interactive video file that can be loaded by our player for interactive videos.
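The abstract does not spell out how the robust fusion of several trackers' results works. As an illustration only (not the paper's actual method), a minimal sketch of one common robust strategy is a per-frame, coordinate-wise median over the bounding boxes reported by each tracker, which tolerates a single drifting tracker; the function name and box format `(x, y, w, h)` here are assumptions:

```python
from statistics import median

def fuse_boxes(boxes):
    """Fuse bounding boxes (x, y, w, h) from several trackers for one frame
    by taking the coordinate-wise median, which is robust to one tracker
    drifting away from the object."""
    return tuple(median(coords) for coords in zip(*boxes))

# Three trackers' boxes for the same frame; the third has drifted badly.
frame_boxes = [(100, 50, 40, 60), (102, 52, 42, 58), (300, 400, 40, 60)]
fused = fuse_boxes(frame_boxes)
print(fused)  # -> (102, 52, 40, 60)
```

The median discards the outlier box without any explicit outlier detection; a mean, by contrast, would be pulled far off the object by the drifted tracker.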