Real-time human movement prediction based on human habits and emotion

Date

2021-08

Authors

Rodriguez, Roberto

Abstract

This research project develops a new deep neural network model for real-time human movement prediction based on human habits and emotion, built by layering neural networks, computer vision algorithms, and mathematical matching against a dynamic database. By combining multiple residual neural networks with different layering algorithms, the model increases prediction accuracy and reduces error because each network adjusts to fill the gaps left by the others, averaging out a proper evaluation of human motion. Specifically, the model contains the following components: ResNet50 and an altered ResNet34 trained on ImageNet for motion in all three dimensions, the FURIA algorithm applied to the Dlib 68 facial landmarks for emotion, and a nearest-neighbor neural network for prediction based on locomotion. With this combination, the model initiates its prediction path 0.05 milliseconds after an emotion is selected. The prediction module initially follows motion with reduced accuracy until an emotion is supplied, at which point it produces a proper prediction with a visualized projection. As its database of motions grows, accuracy grows as well, leading to near real-time movement prediction. In the experiments, the new model outperforms existing models in both prediction accuracy and training speed because of this dynamic database: while other models require retraining of their neural networks to adjust to new testing data, this model relies only on adding motions to the database, which greatly speeds up overall training and testing. Possible uses of this model include, but are not limited to, health caregiving, tracking culprit movements during police engagements, and child safety monitoring.
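
The distinctive step described above is the dynamic motion database: instead of retraining a network for new behavior, new motion observations are appended to the database and matched by nearest-neighbor search. The sketch below illustrates only that idea, under assumptions of my own: the MotionDatabase class, the scikit-learn NearestNeighbors index, and the stand-in pose/emotion feature vectors are hypothetical, not the author's implementation; the ResNet50/ResNet34 pose features and the Dlib-68/FURIA emotion label are assumed to be computed upstream.

# Minimal sketch (not the author's code) of the dynamic-database matching idea:
# pose + emotion feature vectors are looked up against a growing store of
# observed motions instead of retraining a network. Feature extraction
# (ResNet50 / altered ResNet34 pose features, Dlib-68 + FURIA emotion label)
# is assumed to happen upstream; the vectors below are illustrative stand-ins.

import numpy as np
from sklearn.neighbors import NearestNeighbors

class MotionDatabase:
    """Growing store of (feature vector -> observed next motion) pairs."""

    def __init__(self):
        self.features = []    # pose + emotion feature vectors
        self.next_moves = []  # follow-up motion observed for each vector
        self.index = None

    def add(self, feature_vec, next_move):
        # New motions are appended; no network retraining is required.
        self.features.append(np.asarray(feature_vec, dtype=float))
        self.next_moves.append(next_move)
        self.index = NearestNeighbors(n_neighbors=1).fit(np.vstack(self.features))

    def predict(self, feature_vec):
        # Return the follow-up motion of the closest known state.
        if self.index is None:
            return None
        query = np.asarray(feature_vec, dtype=float).reshape(1, -1)
        _, idx = self.index.kneighbors(query)
        return self.next_moves[int(idx[0, 0])]

# Hypothetical usage: three pose values plus a one-hot emotion code.
db = MotionDatabase()
db.add([0.10, 0.40, 0.20, 1, 0, 0], "step_forward")  # e.g. neutral emotion
db.add([0.90, 0.10, 0.70, 0, 1, 0], "turn_left")     # e.g. agitated emotion
print(db.predict([0.12, 0.38, 0.22, 1, 0, 0]))       # -> "step_forward"

Rebuilding the index on every insertion keeps the example short; a version handling a large motion database would update the index incrementally or batch the additions.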

Keywords

computer vision, Human Prediction, machine learning, Motion Prediction, neural networks, Unity Game Engine

Rights

Attribution-NonCommercial-ShareAlike 4.0 International. This material is made available for use in research, teaching, and private study, pursuant to U.S. Copyright law. The user assumes full responsibility for any use of the materials, including but not limited to, infringement of copyright and publication rights of reproduced materials. Any materials used should be fully credited with their source. All rights are reserved and retained regardless of current or future development or laws that may apply to fair use standards. Permission for publication of this material, in part or in full, must be secured with the author and/or publisher.
