Real-time human movement prediction based on human habits and emotion
Abstract
This research project develops a new deep neural network model for real-time human movement prediction based on human habits and emotion, built by layering neural networks, computer vision algorithms, and mathematical matching against a dynamic database. By combining multiple residual neural networks through different layering algorithms, the model increases prediction accuracy with reduced error, because each network compensates for the gaps left by the others to average out a proper evaluation of human motion. Specifically, the model contains the following components: ResNet50 and a modified ResNet34, trained on ImageNet, for motion in all three dimensions; the FURIA algorithm on the Dlib 68 facial landmarks for emotion; and a nearest-neighbor neural network for prediction based on locomotion. With this combination, the model initiates its prediction path 0.05 milliseconds after an emotion is selected. The prediction module initially follows motion with reduced accuracy until an emotion is provided, after which it produces a proper prediction with a visualized projection. As its database of motions grows, the accuracy grows as well, leading to near real-time movement prediction. In the experiments, the new model outperforms existing models in both prediction accuracy and training speed thanks to this dynamic database: while other models require retraining of their neural networks to adjust to new testing data, this model relies only on adding motions to the database, which greatly speeds up overall training and testing. Possible applications of this model include, but are not limited to, health caregiving, tracking culprit movements during police engagements, and child safety monitoring.
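To illustrate why the dynamic database avoids retraining, the sketch below shows a minimal nearest-neighbor lookup over an append-only store of motion feature vectors. This is only an illustrative sketch, not the paper's implementation: the class and method names (`DynamicMotionDatabase`, `add`, `predict`) are hypothetical, and the feature vectors stand in for whatever embeddings the residual networks would produce.

```python
import numpy as np


class DynamicMotionDatabase:
    """Hypothetical sketch of the dynamic database described above:
    new motions are appended as (embedding, label) pairs, so adapting
    to new data is a database insert rather than network retraining."""

    def __init__(self):
        self.embeddings = []  # motion feature vectors (stand-ins for network embeddings)
        self.labels = []      # the movement associated with each stored motion

    def add(self, embedding, label):
        # Adding a motion is O(1); no weights are updated anywhere.
        self.embeddings.append(np.asarray(embedding, dtype=float))
        self.labels.append(label)

    def predict(self, query):
        # 1-nearest-neighbor match: return the stored movement whose
        # embedding is closest (Euclidean distance) to the query.
        query = np.asarray(query, dtype=float)
        dists = [np.linalg.norm(e - query) for e in self.embeddings]
        return self.labels[int(np.argmin(dists))]


if __name__ == "__main__":
    db = DynamicMotionDatabase()
    db.add([0.0, 0.0, 1.0], "reach")
    db.add([1.0, 1.0, 0.0], "walk")
    print(db.predict([0.1, 0.0, 0.9]))  # closest stored motion is "reach"
```

Because prediction is a distance search rather than a forward pass through freshly retrained weights, accuracy improves simply as more motions accumulate, matching the behavior the abstract attributes to the dynamic database.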