The gesture recognition feature automatically detects specific events within normal movement data. Trained machine learning models distinguish between activities of daily living and emergency events. Using this feature requires no additional sensors or other hardware, as all algorithms rely on the built-in sensors of the smartphone.
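The approach described above can be sketched in a few lines: window the built-in accelerometer stream, extract features, and classify each window. This is only an illustrative sketch, not the project's actual pipeline; the feature set, threshold, and function names are assumptions, and a simple peak-acceleration rule stands in for the trained machine learning models.

```python
import math

def extract_features(window):
    """Compute simple features from a window of (x, y, z) accelerometer samples."""
    mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in window]
    mean = sum(mags) / len(mags)
    peak = max(mags)
    var = sum((m - mean) ** 2 for m in mags) / len(mags)
    return {"mean": mean, "peak": peak, "var": var}

def classify(window, peak_threshold=25.0):
    """Toy stand-in for a trained model: flag windows containing a sharp
    acceleration spike (in m/s^2) as a potential emergency event."""
    feats = extract_features(window)
    return "emergency" if feats["peak"] > peak_threshold else "daily_living"

# Example: calm walking-like data vs. a sudden impact spike
walking = [(0.1, 0.2, 9.8)] * 50
impact = walking[:25] + [(5.0, 30.0, 20.0)] + walking[25:]
print(classify(walking))  # daily_living
print(classify(impact))   # emergency
```

In a real system, the hand-written threshold would be replaced by a model trained on the annotated movement data described below.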

To create a reliable system, the training data comprises thousands of hours of recorded movement data. Data collection was also carried out during this project with the help of a dedicated recording application, likewise developed as part of the project (sample image at left). Data analysis, annotation, and further processing take place on evoAID's server infrastructure.

Although the R&D project is nearing its end, data collection will continue in order to further improve the models' already high detection rates.