Exploring Human Activities Using eSense Earable Device

Book chapter


Islam, M. S., Hossain, T., Ahad, M. and Inoue, S. 2021. Exploring Human Activities Using eSense Earable Device. in: Ahad, M., Inoue, S., Roggen, D. and Fujinami, K. (ed.) Activity and Behavior Computing Springer Singapore. pp. 169–185
Authors: Islam, M. S., Hossain, T., Ahad, M. and Inoue, S.
Editors: Ahad, M., Inoue, S., Roggen, D. and Fujinami, K.
Abstract

Detecting head- and mouth-related activities of elderly people is very important for nurse care centers, which need to track activities such as swallowing and eating to assess the health status of elderly people. In this regard, earable devices open up interesting possibilities for monitoring personal-scale behavioral activities. Here, we introduce activity recognition based on an earable device called 'eSense', which carries multiple sensors suitable for human activity recognition: a 6-axis inertial measurement unit, a microphone, and Bluetooth. In this chapter, we propose an activity recognition framework using the eSense device. We collect accelerometer and gyroscope sensor data from the eSense device to detect head- and mouth-related activities along with other regular human activities, and we evaluate the classification performance using both accelerometer and gyroscope data. For this work, we develop a smartphone application for data collection from the eSense. Several statistical features are exploited to recognize head- and mouth-related activities (e.g., head nodding, head shaking, eating, and speaking) and regular activities (e.g., staying, walking, and speaking while walking). We explore different machine learning approaches, such as Convolutional Neural Network (CNN), Random Forest (RnF), K-Nearest Neighbor (KNN), Linear Discriminant Analysis (LDA), and Support Vector Machine (SVM), for classifying activities, and obtain satisfactory results. Our results show that using both accelerometer and gyroscope sensors together improves performance: we achieve accuracies of 80.45% with LDA, 93.34% with SVM, 91.92% with RnF, 91.64% with KNN, and 93.76% with CNN when both accelerometer and gyroscope data are exploited together. The results demonstrate the prospect of the eSense device for detecting human activities in various healthcare monitoring systems.
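The pipeline summarized above (fixed-length windows of accelerometer and gyroscope data, per-window statistical features, then a standard classifier) can be sketched as follows. This is a minimal illustration only: the window length, the specific feature set, the number of activity classes, and the choice of scikit-learn's RandomForestClassifier are assumptions for demonstration, not the chapter's exact configuration, and the random arrays stand in for real eSense recordings.

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

def statistical_features(window):
    # Per-axis statistics for one window of shape (samples, 6):
    # 3 accelerometer axes followed by 3 gyroscope axes (assumed layout).
    feats = []
    for axis in range(window.shape[1]):
        x = window[:, axis]
        feats.extend([x.mean(), x.std(), x.min(), x.max(), np.median(x)])
    return np.array(feats)

def segment(signal, window_size=128, step=64):
    # Split a (samples, 6) stream into overlapping fixed-length windows.
    return [signal[i:i + window_size]
            for i in range(0, len(signal) - window_size + 1, step)]

# Illustrative stand-ins for real eSense data and per-window activity labels.
rng = np.random.default_rng(0)
signal = rng.normal(size=(5000, 6))      # accelerometer + gyroscope stream
windows = segment(signal)
X = np.stack([statistical_features(w) for w in windows])
y = rng.integers(0, 7, size=len(X))      # e.g., nod, shake, eat, speak, stay, walk, walk+speak

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, clf.predict(X_test)))

With real labeled windows, the same feature matrix could be fed to any of the classifiers mentioned in the abstract (LDA, SVM, KNN, or a CNN operating on the raw windows instead of handcrafted features).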

Book title: Activity and Behavior Computing
Page range: 169–185
Year: 2021
Publisher: Springer Singapore
Publication dates
Print: 24 Dec 2020
Publication process dates
Deposited: 26 Jul 2023
Edition: 1
Series: Smart Innovation, Systems and Technologies
ISBN: 9789811589447, 9789811589430
ISSN: 2190-3018
Digital Object Identifier (DOI): https://doi.org/10.1007/978-981-15-8944-7_11
Web address (URL): https://link.springer.com/book/10.1007/978-981-15-8944-7
Permalink: https://repository.uel.ac.uk/item/8w595



Related outputs

Learn Programming with C: An Easy Step-by-Step Self-Practice Book for Learning C
Imran, S. M. S. and Ahad, M. A. R. 2024. Learn Programming with C: An Easy Step-by-Step Self-Practice Book for Learning C. CRC Press: Taylor & Francis Group.
Deep learning with image-based autism spectrum disorder analysis: A systematic review
Uddin, M. Z., Shahriar, M. A., Mahamood, M. N., Alnajjar, F., Pramanik, M. I. and Ahad, M. A. R. 2024. Deep learning with image-based autism spectrum disorder analysis: A systematic review. Engineering Applications of Artificial Intelligence. 127 (Art. 107185). https://doi.org/10.1016/j.engappai.2023.107185
Annotator-dependent uncertainty-aware estimation of gait relative attributes
Shehata, A., Makihara, Y., Muramatsu, D., Ahad, M. and Yagi, Y. 2023. Annotator-dependent uncertainty-aware estimation of gait relative attributes. Pattern Recognition. 136 (Art. 109197). https://doi.org/10.1016/j.patcog.2022.109197
HID 2022: The 3rd International Competition on Human Identification at a Distance
Yu, S., Huang, Y., Wang, L., Makihara, Y., Wang, S., Ahad, M. and Nixon, M. 2022. HID 2022: The 3rd International Competition on Human Identification at a Distance. IJCB 2022: IEEE International Joint Conference on Biometrics. Abu Dhabi, UAE 10 - 13 Oct 2022 IEEE. https://doi.org/10.1109/IJCB54206.2022.10007993
Automated detection approaches to autism spectrum disorder based on human activity analysis: A review
Rahman, S., Ahmed, S. F., Shahid, O., Arrafi, M. A. and Ahad, M. A. R. 2022. Automated detection approaches to autism spectrum disorder based on human activity analysis: A review. Cognitive Computation. 14, pp. 1773-1800. https://doi.org/10.1007/s12559-021-09895-w
A Sleep Monitoring System Using Ultrasonic Sensors
Shammi, U. A. and Ahad, M. 2022. A Sleep Monitoring System Using Ultrasonic Sensors. International Journal of Biomedical Soft Computing and Human Sciences. 27 (1), pp. 13-20. https://doi.org/10.24466/ijbschs.27.1_13
Can Ensemble of Classifiers Provide Better Recognition Results in Packaging Activity?
Nazmus Sakib, A. H. M., Basak, P., Doha Uddin, S., Mustavi Tasin, S. and Ahad, M. 2022. Can Ensemble of Classifiers Provide Better Recognition Results in Packaging Activity? 3rd International Conference on Activity and Behavior Computing (ABC 2021). Online 22 - 23 Oct 2021 Springer Singapore. https://doi.org/10.1007/978-981-19-0361-8_10
Identification of Food Packaging Activity Using MoCap Sensor Data
Anwar, A., Islam Tapotee, M., Saha, P. and Ahad, M. 2022. Identification of Food Packaging Activity Using MoCap Sensor Data. 3rd International Conference on Activity and Behavior Computing (ABC 2021). Online 22 - 23 Oct 2021 Springer Singapore. https://doi.org/10.1007/978-981-19-0361-8_11
Lunch-Box Preparation Activity Understanding from Motion Capture Data Using Handcrafted Features
Pritom, Y. A., Rahman, M. S., Rahman, H. R., Kowshik, M. A. and Ahad, M. 2022. Lunch-Box Preparation Activity Understanding from Motion Capture Data Using Handcrafted Features. 3rd International Conference on Activity and Behavior Computing (ABC 2021). Online 22 - 23 Oct 2021 Springer Singapore. https://doi.org/10.1007/978-981-19-0361-8_12
Bento Packaging Activity Recognition Based on Statistical Features
Rakib Sayem, F., Sheikh, M. M. and Ahad, M. 2022. Bento Packaging Activity Recognition Based on Statistical Features. 3rd International Conference on Activity and Behavior Computing (ABC 2021). Online 22 - 23 Oct 2021 Springer Singapore. https://doi.org/10.1007/978-981-19-0361-8_13
MUMAP: Modified Ultralightweight Mutual Authentication protocol for RFID enabled IoT networks
Raju, M. H., Ahmed, M. U. and Ahad, M. A. R. 2021. MUMAP: Modified Ultralightweight Mutual Authentication protocol for RFID enabled IoT networks. Journal of the Institute of Industrial Applications Engineers. 9 (2), pp. 33-39. https://doi.org/10.12792/JIIAE.9.33
Emotion Recognition from EEG Signal Focusing on Deep Learning and Shallow Learning Techniques
Islam, M. R., Moni, M. A., Islam, M. M., Rashed-Al-Mahfuz, M., Islam, M. S., Hasan, M. K., Hossain, M. S., Ahmad, M., Uddin, S., Azad, A., Alyami, S. A., Ahad, M. A. R. and Lió, P. 2021. Emotion Recognition from EEG Signal Focusing on Deep Learning and Shallow Learning Techniques. IEEE Access. 9, pp. 94601-94624. https://doi.org/10.1109/ACCESS.2021.3091487
Static Postural Transition-based Technique and Efficient Feature Extraction for Sensor-based Activity Recognition
Ahmed, M., Das Antar, A. and Ahad, M. 2021. Static Postural Transition-based Technique and Efficient Feature Extraction for Sensor-based Activity Recognition. Pattern Recognition Letters. 147, pp. 25-33. https://doi.org/10.1016/j.patrec.2021.04.001
Activity Recognition from Accelerometer Data Based on Supervised Learning for Wireless Sensor Network
Israt, F. A., Hossain, T., Inoue, S. and Ahad, M. A. R. 2021. Activity Recognition from Accelerometer Data Based on Supervised Learning for Wireless Sensor Network. International Journal of Biomedical Soft Computing and Human Sciences. 26 (2), pp. 73-86. https://doi.org/10.24466/ijbschs.26.2_73
Action recognition using Kinematics Posture Feature on 3D skeleton joint locations
Ahad, M. A. R., Ahmed, M., Antar, A. D., Makihara, Y. and Yagi, Y. 2021. Action recognition using Kinematics Posture Feature on 3D skeleton joint locations. Pattern Recognition Letters. 145, pp. 216-224. https://doi.org/10.1016/j.patrec.2021.02.013
Contactless Human Monitoring: Challenges and Future Direction
Mahbub, U., Rahman, T. and Ahad, M. 2021. Contactless Human Monitoring: Challenges and Future Direction. in: Ahad, M., Mahbub, U. and Rahman, T. (ed.) Contactless Human Activity Analysis Springer, Cham. pp. 335-364
Contactless Human Emotion Analysis Across Different Modalities
Nahid, N., Rahman, A. and Ahad, M. 2021. Contactless Human Emotion Analysis Across Different Modalities. in: Ahad, M., Mahbub, U. and Rahman, T. (ed.) Contactless Human Activity Analysis Springer, Cham. pp. 237-269
Contactless Fall Detection for the Elderly
Nahian, M. J. A., Raju, M. H., Tasnim, Z., Mahmud, M., Ahad, M. and Kaiser, M. S. 2021. Contactless Fall Detection for the Elderly. in: Ahad, M., Mahbub, U. and Rahman, T. (ed.) Contactless Human Activity Analysis Springer, Cham. pp. 203-235
Signal Processing for Contactless Monitoring
Billah, M. S., Ahad, M. and Mahbub, U. 2021. Signal Processing for Contactless Monitoring. in: Ahad, M., Mahbub, U. and Rahman, T. (ed.) Contactless Human Activity Analysis Springer, Cham. pp. 113-144
Skeleton-Based Activity Recognition: Preprocessing and Approaches
Sarker, S., Rahman, S., Hossain, T., Faiza Ahmed, S., Jamal, L. and Ahad, M. 2021. Skeleton-Based Activity Recognition: Preprocessing and Approaches. in: Ahad, M., Mahbub, U. and Rahman, T. (ed.) Contactless Human Activity Analysis Springer, Cham. pp. 48-81
IoT Sensor-Based Activity Recognition: Human Activity Recognition
Ahad, M., Antar, A. D. and Ahmed, M. 2021. IoT Sensor-Based Activity Recognition: Human Activity Recognition. Springer, Cham.
A Method for Sensor-Based Activity Recognition in Missing Data Scenario
Hossain, T., Ahad, M. A. R. and Inoue, S. 2020. A Method for Sensor-Based Activity Recognition in Missing Data Scenario. Sensors. 20 (14), pp. 1-23. https://doi.org/10.3390/s20143811
An AI-based Visual Aid with Integrated Reading Assistant for the Completely Blind
Khan, M. A., Paul, P., Rashid, M., Hossain, M. and Ahad, M. 2020. An AI-based Visual Aid with Integrated Reading Assistant for the Completely Blind. IEEE Transactions on Human-Machine Systems. 50 (6), pp. 507-517. https://doi.org/10.1109/THMS.2020.3027534