Recognizing Hand Gestures using Temporally Blended Image Data and Deep Learning

Date

2018-12

Authors

Arsekar, Shubharaj Pradeep

ORCID

Journal Title

Journal ISSN

Volume Title

Publisher

DOI

Abstract

Hand gestures offer a natural approach to human-computer interaction. A novel low-computation Hand Gesture Recognition System (HGRS) using temporally blended image data with a convolutional neural network (CNN) is presented. The goal of HGRS is to recognize hand gestures in an optimized and efficient way. We created a dataset using Kinect depth and body data stream frames. The dataset comprised eight different hand gestures; each gesture was performed with the right hand within a duration of three seconds. The data is first processed by segmenting the hand from the background using body data joints mapped onto the depth data. A reduction in the computation of the HGRS was achieved by blending the temporal depth data frames into a single frame. The blending of temporal depth data frames is defined as the addition of the frames into a single frame, with the intensity of each consecutive frame increased. The resolution of the depth data frames was reduced to an empirically evaluated frame size of 50 × 50, which further improved the computational efficiency of HGRS. We trained and validated a CNN model for hand gesture classification consisting of three convolutional layers, each followed by a max pooling layer, and two fully connected layers at the end. We tested the performance of the model and observed a test accuracy of 98.45%. We performed a quantitative analysis to measure the overall performance of the model.
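The temporal blending step described in the abstract can be sketched in NumPy. This is a minimal illustration under stated assumptions: the thesis only says that consecutive frames are added with increasing intensity, so the linear weights, min-max normalization, and nearest-neighbour downsampling to 50 × 50 used here are hypothetical choices, not the author's exact method.

```python
import numpy as np

def blend_frames(frames, out_size=50):
    """Blend a sequence of segmented depth frames into one image.

    Each consecutive frame is weighted more heavily than the previous
    one (assumed linear weights 1, 2, ..., T), the weighted sum is
    normalized to [0, 255], and the result is downsampled to
    out_size x out_size with naive nearest-neighbour sampling.
    """
    frames = np.asarray(frames, dtype=np.float64)      # shape (T, H, W)
    t = frames.shape[0]
    weights = np.arange(1, t + 1, dtype=np.float64)    # later frames brighter
    blended = np.tensordot(weights, frames, axes=1)    # weighted sum over time

    # Normalize intensities to the 0-255 range of an 8-bit image.
    blended -= blended.min()
    peak = blended.max()
    if peak > 0:
        blended = blended / peak * 255.0

    # Nearest-neighbour downsampling to out_size x out_size.
    h, w = blended.shape
    rows = np.arange(out_size) * h // out_size
    cols = np.arange(out_size) * w // out_size
    return blended[np.ix_(rows, cols)].astype(np.uint8)
```

Because later frames dominate the sum, the blended image encodes the motion trajectory of the gesture in a single frame, which is what lets a plain 2-D CNN classify a temporal gesture without recurrent layers or 3-D convolutions.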

Description

Keywords

computer science, computer vision, deep learning, image processing, machine learning, visualization

Sponsorship

Rights

This material is made available for use in research, teaching, and private study, pursuant to U.S. Copyright law. The user assumes full responsibility for any use of the materials, including but not limited to, infringement of copyright and publication rights of reproduced materials. Any materials used should be fully credited with their source. All rights are reserved and retained regardless of current or future development or laws that may apply to fair use standards. Permission for publication of this material, in part or in full, must be secured with the author and/or publisher.

Citation