Human Activity Recognition and Location Based on Temporal Analysis
Published in: Journal of Engineering (Cairo, Egypt), Vol. 2018, no. 2018, pp. 1-11
Main Authors: , , , ,
Format: Journal Article
Language: English
Published: Cairo, Egypt: Hindawi Publishing Corporation (Hindawi Limited), 01-01-2018
Summary: Current methods of human activity recognition face many challenges, such as the need for multiple sensors, poor implementation, unreliable real-time performance, and lack of temporal location. In this research, we developed a method for recognizing and locating human activities based on temporal action recognition. For this work, we used a multilayer convolutional neural network (CNN) to extract features. In addition, we used refined actionness grouping to generate precise region proposals. Then, we classified the candidate regions by employing an activity classifier based on a structured segmented network and a cascade design for end-to-end training. Compared with previous methods of action classification, the proposed method adds the time boundary and effectively improves the detection accuracy. To test this method empirically, we conducted experiments utilizing surveillance video of an offshore oil production plant. Three activities were recognized and located in the untrimmed long video: standing, walking, and falling. The accuracy of the results proved the effectiveness and real-time performance of the proposed method, demonstrating that this approach has great potential for practical application.
ISSN: 2314-4904, 2314-4912
DOI: 10.1155/2018/4752191
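The abstract outlines a three-stage pipeline: per-frame feature extraction with a multilayer CNN, actionness-based grouping of frames into temporal region proposals, and classification of each proposal into standing, walking, or falling. The sketch below is a minimal illustration of that general structure in PyTorch, not the authors' implementation: the layer sizes, the `FrameFeatureCNN`, `ActionnessHead`, `ProposalClassifier`, and `group_proposals` names, and the 0.5 actionness threshold are all assumptions, and the refined actionness grouping, structured segment network, and cascade training described in the paper are reduced here to simple thresholding and average pooling.

```python
# Minimal sketch of a temporal action detection pipeline in the spirit of the
# abstract: per-frame features, actionness-based temporal proposals, and
# proposal classification. All names, thresholds, and layer sizes are
# illustrative placeholders, not the authors' implementation.
import torch
import torch.nn as nn

ACTIVITIES = ["standing", "walking", "falling"]  # classes named in the abstract

class FrameFeatureCNN(nn.Module):
    """Stand-in for the multilayer CNN that maps each frame to a feature vector."""
    def __init__(self, feat_dim=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, feat_dim),
        )

    def forward(self, frames):           # frames: (T, 3, H, W)
        return self.net(frames)          # -> (T, feat_dim)

class ActionnessHead(nn.Module):
    """Per-frame 'actionness' score in [0, 1] used to group temporal proposals."""
    def __init__(self, feat_dim=128):
        super().__init__()
        self.fc = nn.Linear(feat_dim, 1)

    def forward(self, feats):            # feats: (T, feat_dim)
        return torch.sigmoid(self.fc(feats)).squeeze(-1)  # -> (T,)

def group_proposals(actionness, thresh=0.5, min_len=4):
    """Group consecutive above-threshold frames into (start, end) temporal proposals."""
    proposals, start = [], None
    for t, score in enumerate(actionness.tolist()):
        if score >= thresh and start is None:
            start = t
        elif score < thresh and start is not None:
            if t - start >= min_len:
                proposals.append((start, t))
            start = None
    if start is not None and len(actionness) - start >= min_len:
        proposals.append((start, len(actionness)))
    return proposals

class ProposalClassifier(nn.Module):
    """Classifies each temporal proposal from its average-pooled frame features."""
    def __init__(self, feat_dim=128, num_classes=len(ACTIVITIES)):
        super().__init__()
        self.fc = nn.Linear(feat_dim, num_classes)

    def forward(self, feats, proposals):  # feats: (T, feat_dim)
        pooled = torch.stack([feats[s:e].mean(dim=0) for s, e in proposals])
        return self.fc(pooled)            # -> (num_proposals, num_classes)

if __name__ == "__main__":
    video = torch.randn(64, 3, 112, 112)          # dummy untrimmed clip, 64 frames
    backbone, head, clf = FrameFeatureCNN(), ActionnessHead(), ProposalClassifier()
    with torch.no_grad():
        feats = backbone(video)
        proposals = group_proposals(head(feats))
        if proposals:
            labels = clf(feats, proposals).argmax(dim=1)
            for (s, e), k in zip(proposals, labels.tolist()):
                print(f"frames [{s}, {e}): {ACTIVITIES[k]}")
```

Even in this simplified form, the proposal stage is what supplies the time boundaries (start and end frames) that clip-level action classification lacks, which is the gap the abstract says the proposed method addresses.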