Exploring Human Activities Using eSense Earable Device

Book chapter


Islam, M. S., Hossain, T., Ahad, M. and Inoue, S. 2021. Exploring Human Activities Using eSense Earable Device. in: Ahad, M., Inoue, S., Roggen, D. and Fujinami, K. (ed.) Activity and Behavior Computing Springer Singapore. pp. 169–185
Authors: Islam, M. S., Hossain, T., Ahad, M. and Inoue, S.
Editors: Ahad, M., Inoue, S., Roggen, D. and Fujinami, K.
Abstract

Detecting head- and mouth-related activities of elderly people is very important for nurse care centers, which need to track activities such as swallowing and eating to assess the health status of the people in their care. In this regard, earable devices open up interesting possibilities for monitoring personal-scale behavioral activities. Here, we introduce activity recognition based on an earable device called ‘eSense’. It has multiple sensors that can be used for human activity recognition: a 6-axis inertial measurement unit, a microphone, and Bluetooth. In this chapter, we propose an activity recognition framework using the eSense device. We collect accelerometer and gyroscope sensor data from the eSense device to detect head- and mouth-related activities along with other everyday activities, and we evaluate the classification performance of the classifiers using both accelerometer and gyroscope data. For this work, we develop a smartphone application for data collection from the eSense. Several statistical features are exploited to recognize head- and mouth-related activities (e.g., head nodding, head shaking, eating, and speaking) and regular activities (e.g., staying, walking, and speaking while walking). We explore several machine learning approaches for classifying these activities, including Convolutional Neural Network (CNN), Random Forest (RnF), K-Nearest Neighbor (KNN), Linear Discriminant Analysis (LDA), and Support Vector Machine (SVM), and obtain satisfactory results. Our results show that using both accelerometer and gyroscope sensors can improve performance: when both are exploited together, we achieve accuracies of 80.45% with LDA, 93.34% with SVM, 91.92% with RnF, 91.64% with KNN, and 93.76% with CNN. The results demonstrate the promise of the eSense device for detecting human activities in various healthcare monitoring systems.
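
To make the framework concrete, the sketch below shows one way such a windowed, statistical-feature pipeline could be assembled with scikit-learn. It is a minimal sketch rather than the chapter's implementation: the 50 Hz sampling rate, the 2-second window, the per-axis feature set (mean, standard deviation, minimum, maximum, RMS), the classifier settings, and the random arrays standing in for real eSense accelerometer and gyroscope recordings are all assumptions, and the CNN variant is omitted.

# Minimal sketch (assumed settings) of a statistical-feature pipeline over
# 6-axis eSense IMU data; not the chapter's actual implementation.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

WINDOW = 100  # samples per window (assumed: 2 s at 50 Hz)

def extract_features(window):
    """window: (WINDOW, 6) array of [ax, ay, az, gx, gy, gz] samples."""
    return np.concatenate([
        window.mean(axis=0),
        window.std(axis=0),
        window.min(axis=0),
        window.max(axis=0),
        np.sqrt((window ** 2).mean(axis=0)),  # RMS per axis
    ])

def windows_to_dataset(signal, labels):
    """Segment a labelled 6-axis stream into non-overlapping windows."""
    n = (len(signal) // WINDOW) * WINDOW
    X, y = [], []
    for start in range(0, n, WINDOW):
        X.append(extract_features(signal[start:start + WINDOW]))
        vals, counts = np.unique(labels[start:start + WINDOW], return_counts=True)
        y.append(vals[np.argmax(counts)])  # majority label of the window
    return np.asarray(X), np.asarray(y)

if __name__ == "__main__":
    # Hypothetical data standing in for a recorded accelerometer+gyroscope stream.
    rng = np.random.default_rng(0)
    signal = rng.normal(size=(6000, 6))     # two minutes of fake 6-axis samples
    labels = rng.integers(0, 7, size=6000)  # seven activity classes

    X, y = windows_to_dataset(signal, labels)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

    classifiers = {
        "LDA": LinearDiscriminantAnalysis(),
        "SVM": SVC(kernel="rbf"),
        "RnF": RandomForestClassifier(n_estimators=100, random_state=0),
        "KNN": KNeighborsClassifier(n_neighbors=5),
    }
    for name, clf in classifiers.items():
        clf.fit(X_tr, y_tr)
        print(f"{name} accuracy: {clf.score(X_te, y_te):.3f}")

Concatenating the accelerometer and gyroscope columns into a single feature vector follows the observation above that fusing both sensors tends to outperform either sensor alone.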

Book title: Activity and Behavior Computing
Page range: 169–185
Year: 2021
Publisher: Springer Singapore
Publication dates
Print: 24 Dec 2020
Publication process dates
Deposited: 26 Jul 2023
Edition: 1
Series: Smart Innovation, Systems and Technologies
ISBN: 9789811589447; 9789811589430
ISSN: 2190-3018
Digital Object Identifier (DOI): https://doi.org/10.1007/978-981-15-8944-7_11
Web address (URL): https://link.springer.com/book/10.1007/978-981-15-8944-7
Permalink: https://repository.uel.ac.uk/item/8w595

Related outputs

Optimizing Endotracheal Suctioning Classification: Leveraging Prompt Engineering in Machine Learning for Feature Selection
Islam, M. R., Ferodous, A. M., Hossain, S., Alnajjar, F. and Ahad, M. 2024. Optimizing Endotracheal Suctioning Classification: Leveraging Prompt Engineering in Machine Learning for Feature Selection. ABC 2024: 6th International Conference on Activity and Behavior Computing. Kyushu, Japan 28 - 31 May 2024 IEEE.
Nurse Activity Recognition based on Temporal Frequency Features
Rahman, M. S., Rahman, H. R., Zarif, A., Pritom, Y. A. and Ahad, M. A. R. 2024. Nurse Activity Recognition based on Temporal Frequency Features. in: Ahad, M. A. R., Inoue, S., Lopez, G. and Hossain, T. (ed.) Human Activity and Behavior Analysis: Advances in Computer Vision and Sensors, Vol. 1 CRC Press: Taylor & Francis Group. pp. 311-322
A Sequential-based Analytical Approach for Nurse Care Activity Forecasting
Sheikh, M. M., Hossain, S. and Ahad, M. A. R. 2024. A Sequential-based Analytical Approach for Nurse Care Activity Forecasting. in: Ahad, M. A. R., Inoue, S., Lopez, G. and Hossain, T. (ed.) Human Activity and Behavior Analysis: Advances in Computer Vision and Sensors: Volume 1 CRC Press: Taylor & Francis Group. pp. 349-368
Psychological Analysis in Human-Robot Collaboration from Workplace Stress Factors: A Review
Nahid, N., Xinyi, M., Inoue, S. and Ahad, M. A. R. 2024. Psychological Analysis in Human-Robot Collaboration from Workplace Stress Factors: A Review. in: Ahad, M. A. R., Inoue, S., Lopez, G. and Hossain, T. (ed.) Human Activity and Behavior Analysis: Advances in Computer Vision and Sensors: Volume 2 Boca Raton, Florida CRC Press: Taylor & Francis Group. pp. 165-197
Static Sign Language Recognition Using Segmented Images and HOG on Cluttered Backgrounds
Sadeghzadeh, A., Islam, B. and Ahad, M. A. R. 2024. Static Sign Language Recognition Using Segmented Images and HOG on Cluttered Backgrounds. in: Ahad, M. A. R., Inoue, S., Lopez, G. and Hossain, T. (ed.) Human Activity and Behavior Analysis: Advances in Computer Vision and Sensors: Volume 2 Boca Raton, Florida CRC Press: Taylor & Francis Group. pp. 23-45
E2ETCA: End-to-end training of CNN and attention ensembles for rice disease diagnosis
Uddin, M. Z., Mahamood, M. N., Ray, A., Pramanik, M. I., Alnajjar, F. and Ahad, M. A. R. 2024. E2ETCA: End-to-end training of CNN and attention ensembles for rice disease diagnosis. Journal of Integrative Agriculture. In Press. https://doi.org/10.1016/j.jia.2024.03.075
Elderly Motion Analysis to Estimate Emotion: A Systematic Review
Hassan, I., Nahid, N., Ahad, M. and Inoue, S. 2024. Elderly Motion Analysis to Estimate Emotion: A Systematic Review. International Journal of Activity and Behavior Computing. (2), pp. 1-23. https://doi.org/10.60401/ijabc.23
Integrating Human Behavioral Model for Intimate-distance Human Robot Collaboration
Nahid, N., Hassan, I., Min, X., Ryoke, N., Ahad, M. and Inoue, S. 2024. Integrating Human Behavioral Model for Intimate-distance Human Robot Collaboration. International Journal of Activity and Behavior Computing. (2), pp. 1-26. https://doi.org/10.60401/ijabc.27
Stereoscopic Video Deblurring Transformer
Imani, H., Islam, M. B., Junayed, M. S. and Ahad, M. A. R. 2024. Stereoscopic Video Deblurring Transformer. Scientific Reports. 14 (Art. 14342). https://doi.org/10.1038/s41598-024-63860-9
Learn Programming with C: An Easy Step-by-Step Self-Practice Book for Learning C
Imran, S. M. S. and Ahad, M. A. R. 2024. Learn Programming with C: An Easy Step-by-Step Self-Practice Book for Learning C. CRC Press: Taylor & Francis Group.
Deep learning with image-based autism spectrum disorder analysis: A systematic review
Uddin, M. Z., Shahriar, M. A., Mahamood, M. N., Alnajjar, F., Pramanik, M. I. and Ahad, M. A. R. 2024. Deep learning with image-based autism spectrum disorder analysis: A systematic review. Engineering Applications of Artificial Intelligence. 127 (Art. 107185). https://doi.org/10.1016/j.engappai.2023.107185
Unsupervised Stereoscopic Video Style Transfer
Imani, H., Islam, M. B. and Ahad, M. A. R. 2023. Unsupervised Stereoscopic Video Style Transfer. ASYU 2023: Innovations in Intelligent Systems and Applications Conference. Sivas, Türkiye 11 - 13 Oct 2023 IEEE. https://doi.org/10.1109/ASYU58738.2023.10296716
Human Identification at a Distance: Challenges, Methods and Results on HID 2023
Yu, S., Weng, C., Zhao, Y., Wang, L., Wang, M., Li, Q., Li, W., Wang, R., Huang, Y., Wang, L., Makihara, Y. and Ahad, M. A. R. 2023. Human Identification at a Distance: Challenges, Methods and Results on HID 2023. IJCB 2023: IEEE International Joint Conference on Biometrics. Ljubljana, Slovenia 25 - 28 Sep 2023 IEEE. https://doi.org/10.1109/IJCB57857.2023.10448952
Autism Spectrum Disorder Classification via Local and Global Feature Representation of Facial Image
Mahamood, M. N., Uddin, M. Z., Shahriar, M. A., Alnajjar, F. and Ahad, M. A. R. 2023. Autism Spectrum Disorder Classification via Local and Global Feature Representation of Facial Image. SMC 2023: IEEE International Conference on Systems, Man, and Cybernetics. Hawaii, USA 01 - 04 Oct 2023 IEEE. https://doi.org/10.1109/SMC53992.2023.10394092
Annotator-dependent uncertainty-aware estimation of gait relative attributes
Shehata, A., Makihara, Y., Muramatsu, D., Ahad, M. and Yasushi, Y. 2023. Annotator-dependent uncertainty-aware estimation of gait relative attributes. Pattern Recognition. 136 (Art. 109197). https://doi.org/10.1016/j.patcog.2022.109197
HID 2022: The 3rd International Competition on Human Identification at a Distance
Yu, S., Huang, Y., Wang, L., Makihara, Y., Wang, S., Ahad, M. and Nixon, M. 2022. HID 2022: The 3rd International Competition on Human Identification at a Distance. IJCB 2022: IEEE International Joint Conference on Biometrics. Abu Dhabi, UAE 10 - 13 Oct 2022 IEEE. https://doi.org/10.1109/IJCB54206.2022.10007993
Advances in Human Action, Activity and Gesture Recognition
Mahbub, U. and Ahad, M. 2022. Advances in Human Action, Activity and Gesture Recognition. Pattern Recognition Letters. 155, pp. 186-190. https://doi.org/10.1016/j.patrec.2021.11.003
Automated detection approaches to autism spectrum disorder based on human activity analysis: A review
Rahman, S., Ahmed, S. F., Shahid, O., Arrafi, M. A. and Ahad, M. A. R. 2022. Automated detection approaches to autism spectrum disorder based on human activity analysis: A review. Cognitive Computation. 14, pp. 1773-1800. https://doi.org/10.1007/s12559-021-09895-w
A Sleep Monitoring System Using Ultrasonic Sensors
Shammi, U. A. and Ahad, M. 2022. A Sleep Monitoring System Using Ultrasonic Sensors. International Journal of Biomedical Soft Computing and Human Sciences. 27 (1), pp. 13-20. https://doi.org/10.24466/ijbschs.27.1_13
Can Ensemble of Classifiers Provide Better Recognition Results in Packaging Activity?
Nazmus Sakib, A. H. M., Basak, P., Doha Uddin, S., Mustavi Tasin, S. and Ahad, M. 2022. Can Ensemble of Classifiers Provide Better Recognition Results in Packaging Activity? ABC 2021: 3rd International Conference on Activity and Behavior Computing. Online 22 - 23 Oct 2021 Springer Singapore. https://doi.org/10.1007/978-981-19-0361-8_10
Identification of Food Packaging Activity Using MoCap Sensor Data
Anwar, A., Islam Tapotee, M., Saha, P. and Ahad, M. 2022. Identification of Food Packaging Activity Using MoCap Sensor Data. ABC 2021: 3rd International Conference on Activity and Behavior Computing. Online 22 - 23 Oct 2021 Springer Singapore. https://doi.org/10.1007/978-981-19-0361-8_11
Lunch-Box Preparation Activity Understanding from Motion Capture Data Using Handcrafted Features
Pritom, Y. A., Rahman, M. S., Rahman, H. R., Kowshik, M. A. and Ahad, M. 2022. Lunch-Box Preparation Activity Understanding from Motion Capture Data Using Handcrafted Features. ABC 2021: 3rd International Conference on Activity and Behavior Computing. Online 22 - 23 Oct 2021 Springer Singapore. https://doi.org/10.1007/978-981-19-0361-8_12
Bento Packaging Activity Recognition Based on Statistical Features
Rakib Sayem, F., Sheikh, M. M. and Ahad, M. 2022. Bento Packaging Activity Recognition Based on Statistical Features. ABC 2021: 3rd International Conference on Activity and Behavior Computing. Online 22 - 23 Oct 2021 Springer Singapore. https://doi.org/10.1007/978-981-19-0361-8_13
MUMAP: Modified Ultralightweight Mutual Authentication protocol for RFID enabled IoT networks
Raju, M. H., Ahmed, M. U. and Ahad, M. A. R. 2021. MUMAP: Modified Ultralightweight Mutual Authentication protocol for RFID enabled IoT networks. Journal of the Institute of Industrial Applications Engineers. 9 (2), pp. 33-39. https://doi.org/10.12792/JIIAE.9.33
Emotion Recognition from EEG Signal Focusing on Deep Learning and Shallow Learning Techniques
Islam, M. R., Moni, M. A., Islam, M. M., Rashed-Al-Mahfuz, M., Islam, M. S., Hasan, M. K., Hossain, M. S., Ahmad, M., Uddin, S., Azad, A., Alyami, S. A., Ahad, M. A. R. and Lió, P. 2021. Emotion Recognition from EEG Signal Focusing on Deep Learning and Shallow Learning Techniques. IEEE Access. 9, pp. 94601-94624. https://doi.org/10.1109/ACCESS.2021.3091487
Static Postural Transition-based Technique and Efficient Feature Extraction for Sensor-based Activity Recognition
Ahmed, M., Das Antar, A. and Ahad, M. 2021. Static Postural Transition-based Technique and Efficient Feature Extraction for Sensor-based Activity Recognition. Pattern Recognition Letters. 147, pp. 25-33. https://doi.org/10.1016/j.patrec.2021.04.001
Recognition of human locomotion on various transportations fusing smartphone sensors
Das Antar, A., Ahmed, M. and Ahad, M. 2021. Recognition of human locomotion on various transportations fusing smartphone sensors. Pattern Recognition Letters. 148, pp. 146-153. https://doi.org/10.1016/j.patrec.2021.04.015
Activity Recognition from Accelerometer Data Based on Supervised Learning for Wireless Sensor Network
Israt, F. A., Hossain, T., Inoue, S. and Ahad, M. A. R. 2021. Activity Recognition from Accelerometer Data Based on Supervised Learning for Wireless Sensor Network. International Journal of Biomedical Soft Computing and Human Sciences. 26 (2), pp. 73-86. https://doi.org/10.24466/ijbschs.26.2_73
Action recognition using Kinematics Posture Feature on 3D skeleton joint locations
Ahad, M. A. R., Ahmed, M., Antar, A. D., Makihara, Y. and Yagi, Y. 2021. Action recognition using Kinematics Posture Feature on 3D skeleton joint locations. Pattern Recognition Letters. 145, pp. 216-224. https://doi.org/10.1016/j.patrec.2021.02.013
Contactless Human Monitoring: Challenges and Future Direction
Mahbub, U., Rahman, T. and Ahad, M. 2021. Contactless Human Monitoring: Challenges and Future Direction. in: Ahad, M., Mahbub, U. and Rahman, T. (ed.) Contactless Human Activity Analysis Springer, Cham. pp. 335-364
Contactless Human Emotion Analysis Across Different Modalities
Nahid, N., Rahman, A. and Ahad, M. 2021. Contactless Human Emotion Analysis Across Different Modalities. in: Ahad, M., Mahbub, U. and Rahman, T. (ed.) Contactless Human Activity Analysis Springer, Cham. pp. 237-269
Contactless Fall Detection for the Elderly
Nahian, M. J. A., Raju, M. H., Tasnim, Z., Mahmud, M., Ahad, M. and Kaiser, M. S. 2021. Contactless Fall Detection for the Elderly. in: Ahad, M., Mahbub, U. and Rahman, T. (ed.) Contactless Human Activity Analysis Springer, Cham. pp. 203-235
Signal Processing for Contactless Monitoring
Billah, M. S., Ahad, M. and Mahbub, U. 2021. Signal Processing for Contactless Monitoring. in: Ahad, M., Mahbub, U. and Rahman, T. (ed.) Contactless Human Activity Analysis Springer, Cham. pp. 113-144
Skeleton-Based Activity Recognition: Preprocessing and Approaches
Sarker, S., Rahman, S., Hossain, T., Faiza Ahmed, S., Jamal, L. and Ahad, M. 2021. Skeleton-Based Activity Recognition: Preprocessing and Approaches. in: Ahad, M., Mahbub, U. and Rahman, T. (ed.) Contactless Human Activity Analysis Springer, Cham. pp. 48-81
IoT Sensor-Based Activity Recognition: Human Activity Recognition
Ahad, M., Antar, A. D. and Ahmed, M. 2021. IoT Sensor-Based Activity Recognition: Human Activity Recognition. Springer, Cham.
A Method for Sensor-Based Activity Recognition in Missing Data Scenario
Hossain, T., Ahad, M. A. R. and Inoue, S. 2020. A Method for Sensor-Based Activity Recognition in Missing Data Scenario. Sensors. 20 (14), pp. 1-23. https://doi.org/10.3390/s20143811
An AI-based Visual Aid with Integrated Reading Assistant for the Completely Blind
Khan, M. A., Paul, P., Rashid, M., Hossain, M. and Ahad, M. 2020. An AI-based Visual Aid with Integrated Reading Assistant for the Completely Blind. IEEE Transactions on Human-Machine Systems. 50 (6), pp. 507-517. https://doi.org/10.1109/THMS.2020.3027534