AI-Powered Deaf Companion System for Inclusive Communication between Deaf and Hearing Individuals

  • I. Janani PSNA College of Engineering and Technology, Dindigul, India
  • S. Brindhashree PSNA College of Engineering and Technology, Dindigul, India
  • M. Boomika PSNA College of Engineering and Technology, Dindigul, India
  • M. Sahana Preethi Assistant Professor, Department of Information Technology, PSNA College of Engineering and Technology, Dindigul, India
Keywords: Bengali Sign Language (BdSL), Classification, Computer Vision, Deep Learning, Machine Learning, Sign Language Recognition, Optical Flow

Abstract

Sign language is the primary means of communication for people with speech and hearing impairments, and Bengali Sign Language (BdSL) is particularly challenging given its large vocabulary, alphabet, and diverse expressions. Conventional remedies, such as mastering BdSL or hiring an interpreter, tend to be expensive and hard to access. Machine translation of sign language offers a more accessible alternative, and deep learning, particularly computer vision, holds considerable promise here. To address this, we developed KUNet, a CNN-based classifier optimized with a Genetic Algorithm (GA) to classify BdSL gestures.
KUNet attained 99.11% accuracy on the KU-BdSL dataset, surpassing several state-of-the-art models. Explainable AI (XAI) techniques were also applied to expose the model's decision process, increasing its transparency. This study aims to develop a BdSL machine translator that benefits the non-verbal and hearing-impaired community by enabling easier communication and reducing dependence on costly interpreters.
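The abstract describes KUNet as a CNN classifier whose hyperparameters are tuned with a Genetic Algorithm, but it does not specify the GA's design. As a minimal sketch of that idea, the snippet below evolves a small population of hypothetical CNN hyperparameter settings (`filters`, `kernel`, `lr_exp` are illustrative names, not the paper's) with truncation selection, uniform crossover, and random mutation; the `fitness` function is a toy stand-in for the validation accuracy a trained model would report.

```python
import random

random.seed(0)

# Hypothetical search space; the paper does not list which
# hyperparameters KUNet's GA actually tunes.
SPACE = {
    "filters": [16, 32, 64, 128],   # conv filters per layer
    "kernel": [3, 5, 7],            # kernel size
    "lr_exp": [-4, -3, -2],         # learning rate = 10**lr_exp
}

def random_individual():
    """Sample one hyperparameter setting uniformly from the space."""
    return {k: random.choice(v) for k, v in SPACE.items()}

def fitness(ind):
    """Toy stand-in for validation accuracy: peaks at
    filters=64, kernel=5, lr_exp=-3; higher is better."""
    return -((ind["filters"] - 64) ** 2 / 1000
             + (ind["kernel"] - 5) ** 2
             + (ind["lr_exp"] + 3) ** 2)

def crossover(a, b):
    """Uniform crossover: each gene comes from either parent."""
    return {k: random.choice([a[k], b[k]]) for k in SPACE}

def mutate(ind, rate=0.2):
    """Resample each gene with probability `rate`."""
    return {k: (random.choice(SPACE[k]) if random.random() < rate else v)
            for k, v in ind.items()}

def evolve(pop_size=20, generations=15):
    pop = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]          # truncation selection
        children = [mutate(crossover(random.choice(parents),
                                     random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print(best)
```

In a real pipeline, `fitness` would train and validate a candidate CNN on the KU-BdSL training split, which makes each evaluation expensive; small populations and early stopping are common mitigations.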

Published
2025-07-10