Online parameter tuning for object tracking algorithms

Bibliographic Details
Published in: Image and Vision Computing, Vol. 32, No. 4, pp. 287–302
Main Authors: Chau, Duc Phu, Thonnat, Monique, Bremond, Francois, Corvee, Etienne
Format: Journal Article
Language: English
Published: 01-04-2014
Description
Summary: Object tracking quality usually depends on video scene conditions (e.g. illumination, density of objects, object occlusion level). To overcome this limitation, this article presents a new control approach that adapts the object tracking process to scene condition variations. More precisely, the approach learns how to tune the tracker parameters to cope with tracking context variations. The tracking context, or context, of a video sequence is defined as a set of six features: the density of mobile objects, their occlusion level, their contrast with regard to the surrounding background, their contrast variance, their 2D area and their 2D area variance. In an offline phase, training video sequences are classified by clustering their contextual features, and each context cluster is then associated with satisfactory tracking parameters. In the online control phase, once a context change is detected, the tracking parameters are tuned using the learned values. The approach has been evaluated with three different tracking algorithms on long, complex video datasets. This article brings two significant contributions: (1) a classification method of video sequences for learning tracking parameters offline and (2) a new method for tuning tracking parameters online using the tracking context.
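The abstract describes a two-phase control loop: offline, training sequences are clustered by their six contextual features and each cluster is mapped to tracking parameters that worked well; online, a detected context change triggers a parameter update from that mapping. The following Python sketch illustrates one plausible reading of that loop. The six features come from the abstract, but the k-means clustering, the `ContextController` class, and the `tracker.set_params()` interface are illustrative assumptions; the abstract does not specify the clustering algorithm or the tracker API.

```python
# Illustrative sketch only; feature extraction, the clustering algorithm,
# and the tracker interface are assumptions, not the authors' implementation.
from dataclasses import dataclass

import numpy as np
from sklearn.cluster import KMeans


@dataclass
class ContextFeatures:
    """The six contextual features named in the abstract."""
    density: float        # density of mobile objects
    occlusion: float      # their occlusion level
    contrast: float       # contrast w.r.t. the surrounding background
    contrast_var: float   # contrast variance
    area: float           # 2D area
    area_var: float       # 2D area variance

    def as_vector(self) -> np.ndarray:
        return np.array([self.density, self.occlusion, self.contrast,
                         self.contrast_var, self.area, self.area_var])


class ContextController:
    """Offline: cluster contexts and store good parameters per cluster.
    Online: re-tune the tracker whenever the context cluster changes."""

    def __init__(self, n_clusters: int = 8):
        self.kmeans = KMeans(n_clusters=n_clusters, n_init=10)
        self.cluster_params: dict[int, dict] = {}
        self.current_cluster: int | None = None

    def learn_offline(self, contexts, satisfactory_params):
        # Cluster training contexts, then attach the parameter set that
        # produced satisfactory tracking for each cluster (offline phase).
        X = np.stack([c.as_vector() for c in contexts])
        labels = self.kmeans.fit_predict(X)
        for label, params in zip(labels, satisfactory_params):
            # Assumption: one representative parameter set per cluster.
            self.cluster_params[int(label)] = params

    def update_online(self, context, tracker):
        # Re-tune only when the context falls into a new cluster.
        label = int(self.kmeans.predict(context.as_vector()[None, :])[0])
        if label != self.current_cluster:           # context change detected
            self.current_cluster = label
            tracker.set_params(**self.cluster_params[label])  # hypothetical API
```

In use, `update_online` would be called on contexts computed over a sliding window of frames; treating a cluster-assignment switch as the trigger is one simple way to realize "once a context change is detected".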
ISSN: 0262-8856
DOI: 10.1016/j.imavis.2014.02.003