AI-Based Sign Language Recognition of Words Using Sensors
According to the Indian Census, 18 million people in India are deaf or speech-impaired. Deaf and mute people communicate among themselves through gestures drawn from various sign languages. Communication between deaf and mute people and hearing people is difficult, because hearing people often cannot interpret the meaning of these gestures. To mitigate this problem, an automatic sign language recognition system is proposed. In this project, a smart sign language interpretation system is designed to recognize Indian Sign Language by mounting flex, pressure, and gyroscope sensors on a glove to detect the gestures made. The continuous stream of sensor values collected for the different gestures made by deaf and mute users is fed to a feed-forward neural network for classification and recognition of sign words. The system aims to bridge the communication gap between deaf and mute people and the hearing population.
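The pipeline described above, glove sensor readings fed to a feed-forward neural network for classification, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the feature layout (five flex values, one pressure value, three gyroscope axes), the layer sizes, the number of gesture classes, and the randomly initialised weights standing in for a trained model are all assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical feature vector per gesture:
# 5 flex sensors + 1 pressure sensor + 3 gyroscope axes = 9 values
N_FEATURES = 9
N_HIDDEN = 16
N_CLASSES = 4  # e.g. four sign words; labels are placeholders

# Randomly initialised weights stand in for a trained model
W1 = rng.normal(0.0, 0.5, (N_FEATURES, N_HIDDEN))
b1 = np.zeros(N_HIDDEN)
W2 = rng.normal(0.0, 0.5, (N_HIDDEN, N_CLASSES))
b2 = np.zeros(N_CLASSES)

def forward(x):
    """One forward pass: sensor vector -> class probabilities."""
    h = np.tanh(x @ W1 + b1)             # hidden layer
    logits = h @ W2 + b2                 # output layer
    exp = np.exp(logits - logits.max())  # numerically stable softmax
    return exp / exp.sum()

# A single synthetic glove reading (normalised sensor values)
reading = rng.uniform(0.0, 1.0, N_FEATURES)
probs = forward(reading)
predicted = int(np.argmax(probs))
print("predicted class:", predicted)
```

In a full system, the weights would be learned by backpropagation from labelled gesture recordings, and a stream of readings would be segmented into gestures before classification.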