Hand Gesture Classification based on Inaudible Sound using Convolutional Neural Network

Proceedings of The 5th International Conference on Innovation in Science and Technology

Year: 2018

DOI: https://www.doi.org/10.33422/5ist.2018.12.117


Jinhyuck Kim, Jeongung Kim and Sunwoong Choi

 

ABSTRACT: 

Recognizing and classifying a user's gestures has become important with the increasing use of wearable devices. This study proposes a method for classifying hand gestures by emitting inaudible sound from a smartphone and analyzing the reflected sound signal. The proposed method converts the recorded reflected sound into an image using the short-time Fourier transform (STFT), and the resulting images are fed to a convolutional neural network (CNN) model to classify hand gestures. The results showed an average classification accuracy of 92.17% over six hand gestures. Furthermore, the proposed method is confirmed to achieve higher classification accuracy than other machine learning classification algorithms.
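The following is a minimal sketch of the pipeline described in the abstract (reflected-sound recording → STFT spectrogram image → CNN classifier), not the authors' implementation. The sampling rate, tone frequency, STFT window sizes, and CNN architecture are illustrative assumptions only; the paper's actual parameters may differ.

```python
# Sketch of: inaudible tone -> recorded reflection -> STFT image -> CNN classifier.
# All numeric parameters and the network layout are assumptions for illustration.
import numpy as np
from scipy.signal import stft
import tensorflow as tf

SAMPLE_RATE = 48_000   # assumed smartphone sampling rate (Hz)
TONE_HZ = 20_000       # assumed near-ultrasonic (inaudible) tone
NUM_CLASSES = 6        # six hand gestures, as reported in the paper


def spectrogram_image(recording: np.ndarray) -> np.ndarray:
    """Convert a recorded reflection into a normalized log-magnitude STFT image."""
    _, _, Z = stft(recording, fs=SAMPLE_RATE, nperseg=512, noverlap=384)
    img = 20.0 * np.log10(np.abs(Z) + 1e-9)                    # dB scale
    img = (img - img.min()) / (img.max() - img.min() + 1e-9)   # scale to [0, 1]
    return img[..., np.newaxis]                                # add channel axis


def build_cnn(input_shape) -> tf.keras.Model:
    """Small CNN for gesture classification; the paper's architecture may differ."""
    return tf.keras.Sequential([
        tf.keras.layers.Input(shape=input_shape),
        tf.keras.layers.Conv2D(16, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(32, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
    ])


if __name__ == "__main__":
    # Stand-in for a real recording: 0.5 s of the emitted tone plus noise.
    t = np.arange(int(0.5 * SAMPLE_RATE)) / SAMPLE_RATE
    fake_recording = np.sin(2 * np.pi * TONE_HZ * t) + 0.1 * np.random.randn(t.size)

    x = spectrogram_image(fake_recording)
    model = build_cnn(x.shape)
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    probs = model.predict(x[np.newaxis])   # untrained model: output is arbitrary
    print("predicted gesture class:", int(np.argmax(probs)))
```

In practice the model would be trained on labeled spectrogram images of the six gestures before prediction; the snippet above only shows how a single recording flows through the STFT and CNN stages.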