Recognizing activities of daily living from patterns and extraction of web knowledge
Ihianle, I., Naeem, U., Tawil, A. and Azam, Muhammad Awais 2016. Recognizing activities of daily living from patterns and extraction of web knowledge. in: Proceedings of the 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing Adjunct - UbiComp '16 New York, NY, USA ACM. pp. 1255-1262
|Authors||Ihianle, I., Naeem, U., Tawil, A. and Azam, Muhammad Awais|
The ability to infer and anticipate the activities of elderly individuals with cognitive impairment has made it possible to provide timely assistance and support, which in turn allows them to lead an independent life. Traditional non-intrusive activity recognition approaches depend on various machine learning techniques to infer activities from collected object usage data. Current activity recognition approaches are also based on knowledge-driven techniques that require extensive modelling of the activities that need to be inferred. These models can be seen as too restrictive, prescriptive and static, as they are based on a finite set of activities. In this paper, we propose a novel “top down” approach to recognizing activities based on object usage data, which detects patterns associated with the activity-object relationship and utilizes web knowledge in order to build dynamic activity models based on the objects used to perform the activity. Experimental results using the Kasteren dataset show that the proposed approach is comparable to existing approaches.
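The abstract's core idea, inferring an activity from the objects a person uses, can be illustrated with a minimal sketch. Everything below is an illustrative assumption, not the paper's actual method: the activity names, the object sets (which in the paper would be mined dynamically from web knowledge rather than hard-coded), and the simple overlap score.

```python
# Hypothetical sketch: score each candidate activity by the overlap between
# the observed sensor objects and that activity's object set. In the paper's
# approach, these object sets would be built from extracted web knowledge;
# here they are hard-coded purely for illustration.

WEB_MINED_OBJECTS = {
    "make tea": {"kettle", "cup", "teabag", "fridge"},
    "prepare breakfast": {"fridge", "plate", "toaster", "cup"},
    "take shower": {"shower", "towel"},
}

def infer_activity(observed_objects):
    """Return the candidate activity whose object set best matches the observation."""
    def score(activity):
        objects = WEB_MINED_OBJECTS[activity]
        # Fraction of the activity's objects that were actually observed.
        return len(objects & observed_objects) / len(objects)
    return max(WEB_MINED_OBJECTS, key=score)

print(infer_activity({"kettle", "cup", "teabag"}))  # prints "make tea"
```

Because the object sets are looked up rather than fixed in a trained model, swapping in freshly mined sets changes the recognizable activities without retraining, which is the flexibility the abstract claims over static knowledge-driven models.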
|Keywords||Activity Recognition; Pattern Analysis; Topic Model; Web Extraction; Ontology Model|
|Book title||Proceedings of the 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing Adjunct - UbiComp '16|
|Publication process dates|
|12 Sep 2016|
|Deposited||11 May 2017|
|Place of publication||New York, NY, USA|
|Event||ACM International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp 2016)|
|Digital Object Identifier (DOI)||https://doi.org/10.1145/2968219.2968440|
© 2016 Copyright is held by the Authors.