Activity recognition and resident identification in smart home environment

Kashyap, Venkatesh Subramanya
Master of Computing
Unitec Institute of Technology
Barmada, Bashar
Ramirez-Prado, Guillermo
Liesaputra, Veronica
Masters Thesis
Ngā Upoko Tukutuku (Māori subject headings)
New Zealand
smart homes
people with supported needs
behaviour tracking
ultrasonic sensors
machine learning
older people
aged care
resident identification
Kashyap, V. S. (2020). Activity recognition and resident identification in smart home environment. (Unpublished document submitted in partial fulfilment of the requirements for the degree of Master of Computing). Unitec Institute of Technology, Auckland, New Zealand. Retrieved from
The world’s population is ageing rapidly, and there have been various efforts to improve the quality of life for the elderly. Ambient assisted living is one possible solution that enables elderly or disabled people to live a better lifestyle. Current smart home systems utilise a wide range of sensors to predict our everyday activities. However, research into activity recognition and resident identification using ultrasonic sensors is limited. This work introduces machine learning techniques with ultrasonic sensors to predict the activities of one and two people in a smart home environment. The proposed system is capable of recognising activities and identifying residents without the need to manually label prior activities. Our evaluation demonstrates that the proposed approach can predict residents’ activities with high accuracy. The trained model can also be used to predict other residents’ activities and to distinguish residents from each other. This research enables smart home systems to be widely adopted in people’s houses with minimal training, and allows people who need support to live independently with less interference from caregivers, which in turn enables caregivers to manage more people at the same time.
Copyright notice
All rights reserved