Development of a Hand Sign Recognition Model for Differently-Abled Persons
A recent study has shown that around five percent of the world's population, roughly half a billion people, suffer from hearing loss, and if this rate continues, the number is projected to double by 2050. The main aim of this paper is to bridge the gap between differently-abled persons and the general population. It is proposed to create a chatbot that takes hand signs as input from the end user and translates them into speech. A by-product of this work is that users can interact with the chatbot without typing any commands: the hand signs are first translated into text, which is then passed to the chatbot as input. Based on this text, the chatbot returns an appropriate response to the user's question. The local visual descriptors are extracted using a Convolutional Neural Network (CNN). Upon completion, the recognition model is integrated with a chatbot built using the NLTK Python library.
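The core of the proposed pipeline is the CNN's extraction of local visual descriptors, which is built on the 2D convolution operation. As a minimal illustrative sketch (in plain NumPy rather than the TensorFlow framework used in the paper, and with a hypothetical edge-detecting kernel standing in for learned filters), a single convolution over a toy image looks like this:

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2D convolution (cross-correlation): slide the kernel over
    the image and take the sum of element-wise products at each position.
    This is the basic operation a CNN layer applies with learned kernels."""
    kh, kw = kernel.shape
    ih, iw = image.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for y in range(out.shape[0]):
        for x in range(out.shape[1]):
            out[y, x] = np.sum(image[y:y + kh, x:x + kw] * kernel)
    return out

# Toy 6x6 grayscale image: dark left half, bright right half
# (a stand-in for a hand-region crop; real inputs would be camera frames).
image = np.zeros((6, 6))
image[:, 3:] = 1.0

# A hand-crafted vertical-edge kernel; in a trained CNN such kernels
# are learned from labeled sign-language images.
kernel = np.array([[1.0, -1.0],
                   [1.0, -1.0]])

features = conv2d(image, kernel)   # responds strongly at the edge column
```

In a full TensorFlow model, many such kernels are learned per layer and stacked with pooling and nonlinearities; the resulting descriptor vector is fed to a classifier that maps each frame to a sign label before text is handed to the chatbot.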
Keywords: Convolutional Neural Network (CNN), Chatbot, Sign Language, TensorFlow, Classifier.