Abstract: Sign language is a visual form of communication that uses hand gestures to express everyday messages. Unlike other sign languages, the interpretation of ISL (International Sign Language) has received comparatively little attention from researchers. In this paper we present a translation system designed specifically to interpret the alphabet of English sign language. The system analyses images of the hands, allowing users to interact with it naturally. By eliminating the need for an interpreter, it offers individuals the opportunity to communicate with deaf people seamlessly. Our goal is to develop systems and methods for recognizing and translating sign language.
The first step of the system is to build a database of English Sign Language gestures. Hand segmentation plays a key role in achieving a high hand gesture recognition rate in any recognition system; by improving the quality of the segmented output, recognition rates can be raised significantly. Accordingly, the proposed system incorporates a robust hand segmentation and tracking algorithm to achieve high recognition accuracy. We used a collection of samples to recognize 43 isolated words from Standard English sign language.
In the proposed system we aim to recognize elements of sign language and translate them into English text, as well as the reverse.
Keywords: Sign Language, Machine Learning, Convolutional Neural Network, Gesture Recognition.
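To make the pipeline described above concrete, the following is a minimal illustrative sketch, not the paper's actual implementation: it assumes HSV skin-colour thresholding (via OpenCV) for hand segmentation and a small Keras convolutional network classifying the 43 isolated sign classes. The file name, colour bounds, image size, and network architecture are all assumptions for illustration only.

```python
# Hedged sketch of a hand-segmentation + CNN classification pipeline.
# The paper does not specify its exact segmentation method or network layout;
# everything below (thresholds, input size, layers, file names) is assumed.
import cv2
import numpy as np
from tensorflow.keras import layers, models


def segment_hand(frame_bgr):
    """Return a binary mask of skin-coloured pixels (assumed hand region)."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    # Assumed skin-tone bounds; a real system would calibrate these per dataset.
    lower, upper = np.array([0, 30, 60]), np.array([20, 150, 255])
    mask = cv2.inRange(hsv, lower, upper)
    # Morphological opening removes small speckles from the mask.
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    return mask


def build_classifier(num_classes=43, input_shape=(64, 64, 1)):
    """Small CNN mapping a segmented hand image to one of the sign classes."""
    return models.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dense(num_classes, activation="softmax"),
    ])


if __name__ == "__main__":
    # Example usage: segment one frame and classify it with an (untrained) model.
    frame = cv2.imread("sample_sign.jpg")  # hypothetical input image
    mask = segment_hand(frame)
    hand = cv2.resize(mask, (64, 64)).astype("float32") / 255.0
    model = build_classifier()
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
    probs = model.predict(hand[None, ..., None])
    print("Predicted class:", int(probs.argmax()))
```

In practice the classifier would be trained on the segmented samples from the English Sign Language database before prediction; the sketch only shows how the segmentation output feeds the network.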