Action recognition using Kinematics Posture Feature on 3D skeleton joint locations

Article


Ahad, M. A. R., Ahmed, M., Antar, A. D., Makihara, Y. and Yagi, Y. 2021. Action recognition using Kinematics Posture Feature on 3D skeleton joint locations. Pattern Recognition Letters. 145, pp. 216-224. https://doi.org/10.1016/j.patrec.2021.02.013
Authors: Ahad, M. A. R., Ahmed, M., Antar, A. D., Makihara, Y. and Yagi, Y.
Abstract

Action recognition is a widely explored research area in computer vision and related fields. We propose Kinematics Posture Feature (KPF) extraction from 3D joint positions based on skeleton data for improving the performance of action recognition. In this approach, we consider the skeleton 3D joints as kinematics sensors. We propose the Linear Joint Position Feature (LJPF) and Angular Joint Position Feature (AJPF) based on 3D linear joint positions and angles between bone segments. We then combine these two kinematics features for each video frame of each action to create the KPF feature sets. These feature sets encode the variation of motion in the temporal domain as if each body joint represented a kinematic position and orientation sensor. In the next stage, we process the extracted KPF descriptor with a low-pass filter and segment it using sliding windows of optimized length, resembling the way kinematics sensor data are processed. From the segmented windows, we compute the Position-based Statistical Feature (PSF), which consists of temporal-domain statistical features (e.g., mean, standard deviation, and variance). These statistical features encode the variation of postures (i.e., joint positions and angles) across the video frames. For classification, we explore Support Vector Machine (Linear), RNN, CNNRNN, and ConvRNN models. The proposed PSF feature sets demonstrate prominent performance in both statistical machine learning- and deep learning-based models. For evaluation, we explore five benchmark datasets, namely UTKinect-Action3D, Kinect Activity Recognition Dataset (KARD), MSR 3D Action Pairs, Florence 3D, and Office Activity Dataset (OAD). To prevent overfitting, we adopt a leave-one-subject-out experimental setup and perform 10-fold cross-validation. Our approach outperforms several existing methods on these benchmark datasets and achieves very promising classification performance.
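To make the pipeline described in the abstract concrete, the following is a minimal sketch (not the authors' released code) of KPF extraction and PSF pooling, assuming the skeleton sequence is a (frames × joints × 3) array. The reference joint, angle triples, filter cutoff, and window/stride lengths are illustrative placeholders, not the optimized values reported in the paper.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def bone_angle(a, b, c):
    """Angle at joint b formed by bone segments (a-b) and (c-b), in radians."""
    v1, v2 = a - b, c - b
    cos = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2) + 1e-8)
    return np.arccos(np.clip(cos, -1.0, 1.0))

def kpf_features(skeleton, joint_triples, ref_joint=0):
    """Per-frame Kinematics Posture Feature: linear (LJPF) + angular (AJPF) parts.

    skeleton: (T, J, 3) array of 3D joint positions over T frames.
    joint_triples: list of (a, b, c) joint indices defining bone-segment angles.
    """
    # LJPF: joint positions relative to a reference joint, flattened per frame.
    ljpf = (skeleton - skeleton[:, ref_joint:ref_joint + 1, :]).reshape(len(skeleton), -1)
    # AJPF: angles between bone segments for each listed joint triple.
    ajpf = np.stack([[bone_angle(f[a], f[b], f[c]) for a, b, c in joint_triples]
                     for f in skeleton])
    return np.hstack([ljpf, ajpf])  # (T, D) KPF descriptor

def psf_features(kpf, fs=30, cutoff=5, win=32, step=16):
    """Low-pass filter the KPF stream, slide windows, and pool temporal statistics (PSF)."""
    b, a = butter(4, cutoff / (fs / 2), btype="low")
    smooth = filtfilt(b, a, kpf, axis=0)
    windows = [smooth[s:s + win] for s in range(0, len(smooth) - win + 1, step)]
    return np.array([np.concatenate([w.mean(0), w.std(0), w.var(0)]) for w in windows])

# Example: a 100-frame clip with 20 joints; the angle triples are placeholders.
seq = np.random.rand(100, 20, 3)
kpf = kpf_features(seq, joint_triples=[(4, 5, 6), (8, 9, 10)])
psf = psf_features(kpf)
print(kpf.shape, psf.shape)  # per-frame KPF and per-window PSF feature matrices
```

The resulting per-window PSF vectors would then be fed to a classifier such as a linear SVM or a recurrent model, as in the paper's evaluation.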

Keywords: AI; Activity recognition; Skeleton; Vision; Deep learning
Journal: Pattern Recognition Letters
Journal citation: 145, pp. 216-224
ISSN: 0167-8655
Year: 2021
Publisher: Elsevier
Publisher's version
License: CC BY 4.0
File Access Level: Anyone
Digital Object Identifier (DOI): https://doi.org/10.1016/j.patrec.2021.02.013
Publication dates
Online: 03 Mar 2021
Print: May 2021
Publication process dates
Accepted: 26 Feb 2021
Deposited: 04 Dec 2023
Copyright holder: © 2021, The Authors
Permalink: https://repository.uel.ac.uk/item/8wz27

Download files


Publisher's version
prl aa.pdf
License: CC BY 4.0
File access level: Anyone

  • 31 total views
  • 12 total downloads
  • 3 views this month
  • 0 downloads this month


Related outputs

Learn Programming with C: An Easy Step-by-Step Self-Practice Book for Learning C
Imran, S. M. S. and Ahad, M. A. R. 2024. Learn Programming with C: An Easy Step-by-Step Self-Practice Book for Learning C. CRC Press: Taylor & Francis Group.
Deep learning with image-based autism spectrum disorder analysis: A systematic review
Uddin, M. Z., Shahriar, M. A., Mahamood, M. N., Alnajjar, F., Pramanik, M. I. and Ahad, M. A. R. 2024. Deep learning with image-based autism spectrum disorder analysis: A systematic review. Engineering Applications of Artificial Intelligence. 127 (Art. 107185). https://doi.org/10.1016/j.engappai.2023.107185
Annotator-dependent uncertainty-aware estimation of gait relative attributes
Shehata, A., Makihara, Y., Muramatsu, D., Ahad, M. and Yasushi, Y. 2023. Annotator-dependent uncertainty-aware estimation of gait relative attributes. Pattern Recognition. 136 (Art. 109197). https://doi.org/10.1016/j.patcog.2022.109197
HID 2022: The 3rd International Competition on Human Identification at a Distance
Yu, S., Huang, Y., Wang, L., Makihara, Y., Wang, S., Ahad, M. and Nixon, M. 2022. HID 2022: The 3rd International Competition on Human Identification at a Distance. IJCB 2022: IEEE International Joint Conference on Biometrics. Abu Dhabi, UAE 10 - 13 Oct 2022 IEEE. https://doi.org/10.1109/IJCB54206.2022.10007993
Automated detection approaches to autism spectrum disorder based on human activity analysis: A review
Rahman, S., Ahmed, S. F., Shahid, O., Arrafi, M. A. and Ahad, M. A. R. 2022. Automated detection approaches to autism spectrum disorder based on human activity analysis: A review. Cognitive Computation. 14, pp. 1773-1800. https://doi.org/10.1007/s12559-021-09895-w
A Sleep Monitoring System Using Ultrasonic Sensors
Shammi, U. A. and Ahad, M. 2022. A Sleep Monitoring System Using Ultrasonic Sensors. International Journal of Biomedical Soft Computing and Human Sciences. 27 (1), pp. 13-20. https://doi.org/10.24466/ijbschs.27.1_13
Can Ensemble of Classifiers Provide Better Recognition Results in Packaging Activity?
Nazmus Sakib, A. H. M., Basak, P., Doha Uddin, S., Mustavi Tasin, S. and Ahad, M. 2022. Can Ensemble of Classifiers Provide Better Recognition Results in Packaging Activity? 3rd International Conference on Activity and Behavior Computing (ABC 2021). Online 22 - 23 Oct 2021 Springer Singapore. https://doi.org/10.1007/978-981-19-0361-8_10
Identification of Food Packaging Activity Using MoCap Sensor Data
Anwar, A., Islam Tapotee, M., Saha, P. and Ahad, M. 2022. Identification of Food Packaging Activity Using MoCap Sensor Data. 3rd International Conference on Activity and Behavior Computing (ABC 2021). Online 22 - 23 Oct 2021 Springer Singapore. https://doi.org/10.1007/978-981-19-0361-8_11
Lunch-Box Preparation Activity Understanding from Motion Capture Data Using Handcrafted Features
Pritom, Y. A., Rahman, M. S., Rahman, H. R., Kowshik, M. A. and Ahad, M. 2022. Lunch-Box Preparation Activity Understanding from Motion Capture Data Using Handcrafted Features. 3rd International Conference on Activity and Behavior Computing (ABC 2021). Online 22 - 23 Oct 2021 Springer Singapore. https://doi.org/10.1007/978-981-19-0361-8_12
Bento Packaging Activity Recognition Based on Statistical Features
Rakib Sayem, F., Sheikh, M. M. and Ahad, M. 2022. Bento Packaging Activity Recognition Based on Statistical Features. 3rd International Conference on Activity and Behavior Computing (ABC 2021). Online 22 - 23 Oct 2021 Springer Singapore. https://doi.org/10.1007/978-981-19-0361-8_13
MUMAP: Modified Ultralightweight Mutual Authentication protocol for RFID enabled IoT networks
Raju, M. H., Ahmed, M. U. and Ahad, M. A. R. 2021. MUMAP: Modified Ultralightweight Mutual Authentication protocol for RFID enabled IoT networks. Journal of the Institute of Industrial Applications Engineers. 9 (2), pp. 33-39. https://doi.org/10.12792/JIIAE.9.33
Emotion Recognition from EEG Signal Focusing on Deep Learning and Shallow Learning Techniques
Islam, M. R., Moni, M. A., Islam, M. M., Rashed-Al-Mahfuz, M., Islam, M. S., Hasan, M. K., Hossain, M. S., Ahmad, M., Uddin, S., Azad, A., Alyami, S. A., Ahad, M. A. R. and Lió, P. 2021. Emotion Recognition from EEG Signal Focusing on Deep Learning and Shallow Learning Techniques. IEEE Access. 9, pp. 94601-94624. https://doi.org/10.1109/ACCESS.2021.3091487
Static Postural Transition-based Technique and Efficient Feature Extraction for Sensor-based Activity Recognition
Ahmed, M., Das Antar, A. and Ahad, M. 2021. Static Postural Transition-based Technique and Efficient Feature Extraction for Sensor-based Activity Recognition. Pattern Recognition Letters. 147, pp. 25-33. https://doi.org/10.1016/j.patrec.2021.04.001
Activity Recognition from Accelerometer Data Based on Supervised Learning for Wireless Sensor Network
Israt, F. A., Hossain, T., Inoue, S. and Ahad, M. A. R. 2021. Activity Recognition from Accelerometer Data Based on Supervised Learning for Wireless Sensor Network. International Journal of Biomedical Soft Computing and Human Sciences. 26 (2), pp. 73-86. https://doi.org/10.24466/ijbschs.26.2_73
Exploring Human Activities Using eSense Earable Device
Islam, M. S., Hossain, T., Ahad, M. and Inoue, S. 2021. Exploring Human Activities Using eSense Earable Device. in: Ahad, M., Inoue, S., Roggen, D. and Fujinami, K. (ed.) Activity and Behavior Computing Springer Singapore. pp. 169–185
Contactless Human Monitoring: Challenges and Future Direction
Mahbub, U., Rahman, T. and Ahad, M. 2021. Contactless Human Monitoring: Challenges and Future Direction. in: Ahad, M., Mahbub, U. and Ahad, M. (ed.) Contactless Human Activity Analysis Springer, Cham. pp. 335-364
Contactless Human Emotion Analysis Across Different Modalities
Nahid, N., Rahman, A. and Ahad, M. 2021. Contactless Human Emotion Analysis Across Different Modalities. in: Ahad, M., Mahbub, U. and Rahman, T. (ed.) Contactless Human Activity Analysis Springer, Cham. pp. 237-269
Contactless Fall Detection for the Elderly
Nahian, M. J. A., Raju, M. H., Tasnim, Z., Mahmud, M., Ahad, M. and Kaiser, M. S. 2021. Contactless Fall Detection for the Elderly. in: Ahad, M., Mahbub, U. and Rahman, T. (ed.) Contactless Human Activity Analysis Springer, Cham. pp. 203-235
Signal Processing for Contactless Monitoring
Billah, M. S., Ahad, M. and Mahbub, U. 2021. Signal Processing for Contactless Monitoring. in: Ahad, M., Mahbub, U. and Rahman, T. (ed.) Contactless Human Activity Analysis Springer, Cham. pp. 113-144
Skeleton-Based Activity Recognition: Preprocessing and Approaches
Sarker, S., Rahman, S., Hossain, T., Faiza Ahmed, S., Jamal, L. and Ahad, M. 2021. Skeleton-Based Activity Recognition: Preprocessing and Approaches. in: Ahad, M., Mahbub, U. and Rahman, T. (ed.) Contactless Human Activity Analysis Springer, Cham. pp. 48-81
IoT Sensor-Based Activity Recognition: Human Activity Recognition
Ahad, M., Antar, A. D. and Ahmed, M. 2021. IoT Sensor-Based Activity Recognition: Human Activity Recognition. Springer, Cham.
A Method for Sensor-Based Activity Recognition in Missing Data Scenario
Hossain, T., Ahad, M. A. R. and Inoue, S. 2020. A Method for Sensor-Based Activity Recognition in Missing Data Scenario. Sensors. 20 (14), pp. 1-23. https://doi.org/10.3390/s20143811
An AI-based Visual Aid with Integrated Reading Assistant for the Completely Blind
Khan, M. A., Paul, P., Rashid, M., Hossain, M. and Ahad, M. 2020. An AI-based Visual Aid with Integrated Reading Assistant for the Completely Blind. IEEE Transactions on Human-Machine Systems. 50 (6), pp. 507-517. https://doi.org/10.1109/THMS.2020.3027534