AI-Powered Deaf Companion System for Inclusive Communication between Deaf and Hearing Individuals
Abstract
Sign language is the primary means of communication for people with speech and hearing impairments, and Bengali Sign Language (BdSL) is especially challenging owing to its large vocabulary, alphabet, and diverse expressions. Conventional remedies, such as mastering BdSL or hiring an interpreter, tend to be expensive and hard to access. Machine translation of sign language offers a more accessible alternative, and deep learning, particularly computer vision, holds considerable promise here. To this end, we developed KUNet, a CNN-based classifier optimized with a Genetic Algorithm (GA) to classify BdSL gestures.
KUNet attained 99.11% accuracy on the KU-BdSL dataset, surpassing several state-of-the-art models. Explainable AI (XAI) techniques were also applied to interpret the model's decision process, increasing its transparency. This study aims to develop a BdSL machine translator that benefits the non-verbal and hearing-impaired community by enabling easier communication and reducing dependence on expensive interpreters.
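To illustrate the GA-based optimization mentioned above, the sketch below evolves CNN hyperparameters with a simple genetic algorithm. This is a minimal, self-contained illustration, not the paper's actual method: the search space (`filters`, `kernel`, `lr_exp`) and the surrogate fitness function are hypothetical assumptions standing in for training and validating KUNet on KU-BdSL.

```python
import random

# Hypothetical hyperparameter search space for a CNN classifier.
# Names and ranges are illustrative assumptions, not the paper's values.
SPACE = {
    "filters": [16, 32, 64, 128],   # convolution filter count
    "kernel": [3, 5, 7],            # convolution kernel size
    "lr_exp": [-4, -3, -2],         # learning rate = 10 ** lr_exp
}

def random_genome():
    """Sample one candidate hyperparameter configuration."""
    return {k: random.choice(v) for k, v in SPACE.items()}

def fitness(g):
    """Stand-in for validation accuracy. In a real pipeline this would
    train and evaluate the CNN; here a toy surrogate rewards filters=64,
    kernel=5, and learning rate 1e-3 (an arbitrary assumed optimum)."""
    return (-abs(g["filters"] - 64) / 64
            - abs(g["kernel"] - 5) / 5
            - abs(g["lr_exp"] + 3))

def crossover(a, b):
    """Uniform crossover: each gene comes from either parent."""
    return {k: random.choice([a[k], b[k]]) for k in SPACE}

def mutate(g, rate=0.2):
    """Resample each gene with probability `rate`."""
    return {k: (random.choice(SPACE[k]) if random.random() < rate else v)
            for k, v in g.items()}

def evolve(pop_size=20, generations=30, seed=0):
    """Truncation selection + crossover + mutation over fixed generations."""
    random.seed(seed)
    pop = [random_genome() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]  # keep the fitter half
        children = [mutate(crossover(random.choice(parents),
                                     random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print(best)
```

In practice, `fitness` would be replaced by a training-and-validation run of the CNN for each genome, which makes population size and generation count the dominant cost factors of the search.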

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.