

Durham Research Online

2D pose-based real-time human action recognition with occlusion-handling.

Angelini, Federico and Fu, Zeyu and Long, Yang and Shao, Ling and Naqvi, Syed Mohsen (2020) '2D pose-based real-time human action recognition with occlusion-handling.', IEEE Transactions on Multimedia, 22(6), pp. 1433-1446.

Abstract

Human Action Recognition (HAR) for CCTV-oriented applications is still a challenging problem. Implementing HAR in real-world scenarios is difficult because of the gap between Deep Learning data requirements and what CCTV-based frameworks can offer in terms of data recording equipment. We propose to reduce this gap by exploiting human poses provided by OpenPose, which has already proven to be an effective detector in CCTV-like recordings for tracking applications. In this work, we first propose ActionXPose, a novel 2D pose-based approach for pose-level HAR. ActionXPose extracts low- and high-level features from body poses, which are provided to a Long Short-Term Memory Neural Network and a 1D Convolutional Neural Network for classification. We also provide a new dataset, named ISLD, for realistic pose-level HAR in a CCTV-like environment, recorded in the Intelligent Sensing Lab. ActionXPose is extensively tested on ISLD under multiple experimental settings, e.g. Dataset Augmentation and Cross-Dataset settings, as well as on revised versions of other existing HAR datasets. ActionXPose achieves state-of-the-art accuracy, very high robustness to occlusions and missing data, and promising results for practical deployment in real-world applications.
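The paper itself details ActionXPose's feature extraction; as a rough illustration of the kind of pose pre-processing such a pipeline typically needs before an LSTM or 1D CNN classifier, the sketch below normalizes a sequence of 2D OpenPose skeletons and masks low-confidence (occluded) keypoints. The keypoint indices, normalization scheme, and confidence threshold are illustrative assumptions, not the paper's exact method.

```python
import numpy as np

# Assumed keypoint indices in an OpenPose BODY-25-style layout (illustrative).
NECK, MID_HIP = 1, 8
# Keypoints below this confidence are treated as occluded (assumed threshold).
CONF_THRESHOLD = 0.1

def normalize_pose_sequence(poses: np.ndarray) -> np.ndarray:
    """poses: (T, K, 3) array of per-frame (x, y, confidence) keypoints.

    Returns a (T, K, 2) array that is translation- and scale-invariant:
    each frame is centered on the neck and scaled by the torso length,
    and occluded keypoints are zeroed so the classifier can ignore them.
    """
    xy = poses[..., :2].astype(float)
    conf = poses[..., 2]

    # Translate each frame so the neck keypoint is the origin.
    xy = xy - xy[:, NECK:NECK + 1, :]

    # Scale by torso length (neck-to-mid-hip distance), guarding against zeros.
    torso = np.linalg.norm(xy[:, MID_HIP, :], axis=-1, keepdims=True)  # (T, 1)
    torso = np.maximum(torso, 1e-6)
    xy = xy / torso[:, None, :]

    # Zero out occluded keypoints (missing-data handling).
    xy[conf < CONF_THRESHOLD] = 0.0
    return xy
```

The resulting (T, K, 2) sequence can be flattened per frame and fed to a recurrent or 1D-convolutional classifier; zeroing occluded joints is one simple way to keep the input shape fixed when detections are missing.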

Item Type: Article
Full text: (AM) Accepted Manuscript
Status: Peer-reviewed
Publisher Web site: https://doi.org/10.1109/TMM.2019.2944745
Publisher statement: © 2020 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
Date accepted: 25 September 2019
Date deposited: 16 June 2020
Date of first online publication: 30 September 2019
Date first made open access: 16 June 2020
