Head direction estimation from low resolution images with scene adaptation


Bibliographic Details
Published in: Computer Vision and Image Understanding, Vol. 117, No. 10, pp. 1502-1511
Main Authors: Chamveha, Isarun, Sugano, Yusuke, Sugimura, Daisuke, Siriteerakul, Teera, Okabe, Takahiro, Sato, Yoichi, Sugimoto, Akihiro
Format: Journal Article
Language: English
Published: Amsterdam: Elsevier Inc., 01-10-2013
Description
Summary:
• A head pose estimation method that adapts to individual scenes is proposed.
• The method automatically collects a training dataset for head pose estimators.
• The method handles appearance differences within the same scene.
• The method demonstrates high performance without manually collected training data.

This paper presents an appearance-based method for estimating head direction that automatically adapts to individual scenes. Appearance-based estimation methods usually require a ground-truth dataset taken from a scene that is similar to the test video sequences. However, it is almost impossible to acquire many manually labeled head images for each scene. We introduce an approach that automatically aggregates labeled head images by inferring head direction labels from walking direction. Furthermore, in order to deal with the large variations in head appearance that occur even within the same scene, we introduce an approach that segments a scene into multiple regions according to the similarity of head appearances. Experimental results demonstrate that our proposed method achieved higher accuracy in head direction estimation than conventional approaches that use a scene-independent generic dataset.
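The core label-collection idea in the abstract — using a pedestrian's walking direction as a proxy label for head direction — can be illustrated with a minimal sketch. This is not the authors' implementation: the function name, the stationary-frame threshold, and the 2D-trajectory input format are all assumptions made for illustration.

```python
import math

def walking_direction_labels(trajectory, min_step=1e-6):
    """Assign a pseudo head-direction label (degrees in [0, 360)) to each
    frame transition of a walking trajectory, using the direction of motion
    as a proxy for head direction.

    trajectory: list of (x, y) positions, one per frame.
    Frames whose displacement is below min_step are treated as stationary
    and reuse the previous label (None if there is no previous label).
    """
    labels = []
    prev_angle = None
    for (x0, y0), (x1, y1) in zip(trajectory, trajectory[1:]):
        dx, dy = x1 - x0, y1 - y0
        if math.hypot(dx, dy) < min_step:
            labels.append(prev_angle)  # stationary: keep last direction
            continue
        angle = math.degrees(math.atan2(dy, dx)) % 360.0
        labels.append(angle)
        prev_angle = angle
    return labels

# A pedestrian moving right, then up:
path = [(0, 0), (1, 0), (2, 0), (2, 1), (2, 2)]
print(walking_direction_labels(path))  # [0.0, 0.0, 90.0, 90.0]
```

Head crops from frames labeled this way could then serve as scene-specific training data for an appearance-based estimator, which is the role the automatically collected dataset plays in the paper.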
ISSN: 1077-3142, 1090-235X
DOI: 10.1016/j.cviu.2013.06.005