Description:
There is a need for a richer and more detailed view of our daily activity, so that we can better manage our well-being and health conditions and observe positive evolution or disruptive pathological changes. Moreover, human physical activity monitoring is increasingly sought in different areas, such as healthcare, sports, elderly care and safety. In addition, human activity recognition can increase the precision of indoor location systems, which face inherent challenges such as the lack of GPS signal. These systems have several applications, such as indoor navigation, public safety, security management and ambient intelligence, besides holding significant potential for advertising and retail businesses.
There are several solutions for human activity recognition, but most focus on detecting common activities, such as walking, running or standing, and do not consider the sequence of activities. However, in daily monitoring, contextual information and detailed activities should also be taken into account, such as opening a door, transitioning from standing to sitting, turning on the light, dressing or undressing, taking something from a kitchen shelf or buying a product in the supermarket. In an indoor location system, knowing that a door was opened, for example, can narrow down the possible locations and, consequently, ease the localisation process.
This project proposes the development of a framework for detailed human activity recognition and characterisation, based on smartphone and wearable sensors.
Main Outcomes:
Develop novel algorithms to detect users' detailed activities using smartphone or wearable sensor signals and machine learning. These algorithms may complement the current frameworks, increasing their overall range and performance, and can be applied to many other projects, namely PIL and DEMSmartMoves. A minimal sketch of this kind of pipeline is given below.
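As an illustration of the general approach described above (not the thesis's actual method), the sketch below shows one common way to recognise activities from sensor signals: slice a tri-axial accelerometer stream into fixed-length windows, extract simple statistical features per window, and train a standard classifier. The sampling rate, window length, feature set and the synthetic data are all assumptions made for the example.

# Minimal sketch of windowed-feature activity recognition, assuming
# tri-axial accelerometer input; data, features and classifier are
# illustrative stand-ins, not the framework proposed by the thesis.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

FS = 50          # assumed sampling rate (Hz)
WINDOW = 2 * FS  # 2-second windows

def extract_features(window):
    # Per-axis mean and standard deviation, plus signal magnitude area.
    means = window.mean(axis=0)
    stds = window.std(axis=0)
    sma = np.abs(window).sum() / len(window)
    return np.concatenate([means, stds, [sma]])

def make_windows(signal, labels):
    # Slice a (n_samples, 3) signal into non-overlapping labelled windows.
    X, y = [], []
    for start in range(0, len(signal) - WINDOW + 1, WINDOW):
        X.append(extract_features(signal[start:start + WINDOW]))
        y.append(labels[start])  # assume a single label per window
    return np.array(X), np.array(y)

# Synthetic stand-in for real sensor data: two "activities" with
# different noise levels around gravity on the z-axis.
rng = np.random.default_rng(0)
still = rng.normal([0.0, 0.0, 9.8], 0.05, size=(5000, 3))
moving = rng.normal([0.0, 0.0, 9.8], 2.0, size=(5000, 3))
signal = np.vstack([still, moving])
labels = np.array([0] * 5000 + [1] * 5000)

X, y = make_windows(signal, labels)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(random_state=0).fit(X_train, y_train)
print("window-level accuracy:", accuracy_score(y_test, clf.predict(X_test)))

Detecting the finer-grained activities targeted by this project (e.g. opening a door or standing-to-sitting transitions) would plausibly build on such a window-level classifier with richer features and some model of activity sequence, as motivated in the Description.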
Author: Mariana Abreu
Type: MSc thesis
Partner: Faculdade de Ciências e Tecnologia da Universidade Nova de Lisboa
Year: 2018
Download: