Tracking a person’s behaviour in a smart house

Authors

Chand, G.
Ali, M.
Barmada, Bashar
Liesaputra, Veronica
Ramirez-Prado, Guillermo

Date

2018-11

Type

Conference Contribution - Paper in Published Proceedings

Keyword

smart homes
people with support needs
behaviour tracking
ultrasonic sensors
machine learning
older people
aged care

Citation

Chand, G., Ali, M., Barmada, B., Liesaputra, V., & Ramirez-Prado, G. (2018). Tracking a Person's Behaviour in a Smart House. In C. Pahl, J. Yin, M. Vukovic, & Q. Yu (Eds.), The 16th International Conference on Service-Oriented Computing (pp. 1-12).

Abstract

This paper proposes to use machine learning techniques with ultrasonic sensors to predict the behaviour and status of a person who lives alone in their house. The proposed system is tested on a single room. A grid of ultrasonic sensors is placed in the ceiling of the room to monitor the position and status of a person (standing, sitting, lying down). The sensor readings are communicated wirelessly through a microcontroller to the cloud. An intelligent system reads the sensor values from the cloud and analyses them using machine learning algorithms to predict the person's behaviour and status and to decide whether the situation is normal or abnormal. If an abnormal situation is detected, an alert is raised on a dashboard, where a caregiver can take immediate action. The proposed system achieved an accuracy exceeding 90%. The results of this project will help people with support needs, for example older people, to live as independently as possible, without too much interference from caregivers. This will also free the caregivers and allow them to monitor more units at the same time.
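The pipeline the abstract describes — a ceiling grid of ultrasonic range sensors whose readings are classified into standing, sitting or lying — can be sketched as follows. This is an illustrative assumption, not the paper's implementation: the ceiling height, distance values, labelled examples, and the nearest-neighbour rule are all hypothetical stand-ins for the authors' actual data and machine learning models.

```python
# Minimal sketch (illustrative, not the paper's method) of classifying a
# person's status from a ceiling-mounted ultrasonic grid. Each sensor
# reports the distance (cm) from the ceiling to the nearest obstacle
# below it; the smallest distance in the grid approximates the top of
# the person's body. All numbers below are synthetic assumptions.

# Hypothetical labelled examples: (minimum grid distance in cm, status).
# Assumes a ~250 cm ceiling: a standing adult's head is ~75-80 cm from
# the ceiling, a seated head ~125-130 cm, a lying body ~215-220 cm.
TRAIN = [
    (75, "standing"), (80, "standing"),
    (125, "sitting"), (130, "sitting"),
    (215, "lying"), (220, "lying"),
]

def classify(grid_readings):
    """Nearest-neighbour prediction on the minimum grid distance.

    grid_readings: one distance (cm) per sensor in the grid; sensors
    with nothing beneath them report roughly the ceiling height.
    """
    person_dist = min(grid_readings)  # sensor closest to the person
    return min(TRAIN, key=lambda example: abs(example[0] - person_dist))[1]

print(classify([248, 78, 250, 249]))   # person standing under one sensor
print(classify([250, 218, 222, 249]))  # person lying across two cells
```

In the real system these per-room predictions would then be compared against the person's usual patterns so that only abnormal situations raise a dashboard alert for the caregiver.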

Publisher

Springer Verlag

Copyright notice

All rights reserved
