Deformable Object Tracking With Gated Fusion

Bibliographic Details
Published in: IEEE Transactions on Image Processing, Vol. 28, No. 8, pp. 3766-3777
Main Authors: Wenxi Liu, Yibing Song, Dengsheng Chen, Shengfeng He, Yuanlong Yu, Tao Yan, Gerhard P. Hancke, Rynson W. H. Lau
Format: Journal Article
Language:English
Published: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), United States, 01-08-2019
Description
Summary: The tracking-by-detection framework has received growing attention through its integration with convolutional neural networks (CNNs). Existing tracking-by-detection methods, however, fail to track objects under severe appearance variations. This is because the traditional convolution operation is performed on fixed grids and thus may not find the correct response when the object changes pose or the environmental conditions vary. In this paper, we propose a deformable convolution layer to enrich the target appearance representations in the tracking-by-detection framework. We aim to capture the target appearance variations via deformable convolution, which adaptively enhances the original features. In addition, we propose a gated fusion scheme to control how the variations captured by the deformable convolution affect the original appearance. The feature representation enriched through deformable convolution facilitates the CNN classifier in discriminating the target object from the background. Extensive experiments on standard benchmarks show that the proposed tracker performs favorably against state-of-the-art methods.
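The record does not include the authors' implementation. The following is a minimal sketch, assuming a PyTorch backbone and torchvision's DeformConv2d, of how a deformable-convolution branch could be fused with the original appearance features through a learned sigmoid gate, as the abstract describes. The module name GatedDeformableFusion, the gating formulation, and the channel sizes are illustrative assumptions, not the paper's exact design.

```python
# Hypothetical sketch of gated fusion of deformable-conv features with the
# original CNN features; not the authors' released code.
import torch
import torch.nn as nn
from torchvision.ops import DeformConv2d


class GatedDeformableFusion(nn.Module):
    """Enrich backbone features with a deformable-convolution branch, then
    blend the two streams with a per-location sigmoid gate (assumed form)."""

    def __init__(self, channels: int, kernel_size: int = 3):
        super().__init__()
        pad = kernel_size // 2
        # Offsets for the deformable kernel are predicted from the input features.
        self.offset_pred = nn.Conv2d(channels, 2 * kernel_size * kernel_size,
                                     kernel_size, padding=pad)
        self.deform_conv = DeformConv2d(channels, channels, kernel_size,
                                        padding=pad)
        # Gate deciding, per location, how much of the deformable response
        # to mix into the original appearance features.
        self.gate = nn.Sequential(
            nn.Conv2d(2 * channels, channels, kernel_size=1),
            nn.Sigmoid(),
        )

    def forward(self, feat: torch.Tensor) -> torch.Tensor:
        offsets = self.offset_pred(feat)
        deform_feat = self.deform_conv(feat, offsets)
        g = self.gate(torch.cat([feat, deform_feat], dim=1))
        # Gated fusion: g controls how strongly the captured appearance
        # variations affect the original representation.
        return g * deform_feat + (1.0 - g) * feat


if __name__ == "__main__":
    # Dummy feature map standing in for a CNN backbone output.
    fusion = GatedDeformableFusion(channels=256)
    x = torch.randn(1, 256, 28, 28)
    print(fusion(x).shape)  # torch.Size([1, 256, 28, 28])
```

In this sketch the gate is computed from the concatenated original and deformable streams; the paper's actual gating design and where the module sits in the tracking network are given in the full text (see the DOI below).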
ISSN: 1057-7149
EISSN: 1941-0042
DOI:10.1109/TIP.2019.2902784