Human-Robot Interaction by Whole Body Gesture Spotting and Recognition

Bibliographic Details
Published in: 18th International Conference on Pattern Recognition (ICPR'06), Vol. 4, pp. 774-777
Main Authors: Hee-Deok Yang, A-Yeon Park, Seong-Whan Lee
Format: Conference Proceeding
Language: English
Published: IEEE, 2006
Description
Summary: An intelligent robot is required for natural interaction with humans. Visual interpretation of gestures can be useful in accomplishing natural human-robot interaction (HRI). Previous HRI research focused on issues such as hand gesture, sign language, and command gesture recognition. Automatic recognition of whole body gestures is required in order for HRI to operate naturally. This presents a challenging problem, because describing and modeling meaningful gesture patterns from whole body gestures is a complex task. This paper presents a new method for recognizing whole body key gestures in HRI. A human subject is first described by a set of features encoding the angular relationships between a dozen body parts in 3D. Each feature vector is then mapped to a codeword of gesture HMMs. In order to spot key gestures accurately, a sophisticated method of designing a garbage gesture model is proposed: model reduction, which merges similar states based on data-dependent statistics and relative entropy. The proposed method has been tested with 20 persons' samples and 200 synthetic data samples, achieving a reliability rate of 94.8% in the spotting task and a recognition rate of 97.4% on isolated gestures.
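The model reduction step described in the abstract merges HMM states whose emission distributions are similar under relative entropy. The paper's exact criterion and data-dependent statistics are not given in this record, so the following is only a minimal, simplified sketch: it assumes 1-D Gaussian emissions per state, measures similarity with a symmetric Kullback-Leibler divergence, and greedily merges any pair of states closer than a chosen threshold (the function names, the averaging merge rule, and the threshold are all illustrative assumptions, not the authors' method).

```python
import numpy as np

def symmetric_kl_gaussian(mu1, var1, mu2, var2):
    """Symmetric KL divergence between two 1-D Gaussian distributions."""
    kl12 = np.log(np.sqrt(var2 / var1)) + (var1 + (mu1 - mu2) ** 2) / (2 * var2) - 0.5
    kl21 = np.log(np.sqrt(var1 / var2)) + (var2 + (mu2 - mu1) ** 2) / (2 * var1) - 0.5
    return kl12 + kl21

def merge_similar_states(states, threshold):
    """Greedily merge HMM states whose Gaussian emission parameters are
    closer than `threshold` under symmetric KL divergence.

    `states` is a list of (mean, variance) pairs. A merge simply averages
    the two parameter sets -- a crude stand-in for a proper weighted
    moment-matching merge, used here only to illustrate the idea.
    """
    merged = []
    for mu, var in states:
        for i, (m, v) in enumerate(merged):
            if symmetric_kl_gaussian(mu, var, m, v) < threshold:
                merged[i] = ((mu + m) / 2, (var + v) / 2)
                break
        else:
            merged.append((mu, var))
    return merged
```

With a small threshold, two nearly identical states collapse into one while a distant state survives, which is the intuition behind building a compact garbage model from the union of gesture states.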
ISBN: 0769525210, 9780769525211
ISSN: 1051-4651, 2831-7475
DOI: 10.1109/ICPR.2006.642