Sign Language Video Call Application
A real-time video call app that recognizes Indian Sign Language using both hands, enabling seamless communication for the hearing-impaired community.
Description
This project aims to build a real-time Indian Sign Language (ISL) recognition system capable of interpreting hand gestures that involve both hands. Using a custom-built dataset with one class per letter from 'A' to 'Z', the system captures live hand images via a camera, processes them through a deep learning model, and predicts the corresponding sign.
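The dataset-collection step described above can be sketched roughly as follows. This is a minimal sketch, not the project's actual tooling: the directory layout, key bindings, and use of OpenCV are all assumptions.

```python
import os
import string

# One class per ISL letter, 'A' through 'Z', as described above.
LABELS = list(string.ascii_uppercase)

def make_class_dirs(root):
    """Create dataset/<letter>/ folders and return a label -> path mapping."""
    paths = {}
    for label in LABELS:
        path = os.path.join(root, label)
        os.makedirs(path, exist_ok=True)
        paths[label] = path
    return paths

if __name__ == "__main__":
    # Live capture needs OpenCV. Assumed workflow: press a letter key to
    # save the current frame into that letter's folder; Esc quits.
    import cv2

    dirs = make_class_dirs("dataset")
    counts = {label: 0 for label in LABELS}
    cap = cv2.VideoCapture(0)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        cv2.imshow("capture", frame)
        key = cv2.waitKey(1) & 0xFF
        if key == 27:  # Esc
            break
        ch = chr(key).upper() if key < 128 else ""
        if ch in dirs:
            fname = os.path.join(dirs[ch], f"{counts[ch]}.jpg")
            cv2.imwrite(fname, frame)
            counts[ch] += 1
    cap.release()
    cv2.destroyAllWindows()
```

Saving raw frames per class keeps the dataset simple to organize; a hand-cropping step (e.g. via a hand detector) before saving would likely improve downstream training.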
Key features include:
Dual-hand gesture recognition, supporting the more complex two-handed signs used in ISL.
Real-time prediction using a webcam or mobile camera.
Custom dataset creation for ISL, ensuring model training on region-specific signs.
Potential integration into educational tools, communication aids for the hearing-impaired, and cross-language communication systems.
The system is designed for scalability and can be expanded in the future to include numbers, dynamic gestures, or full-word recognition.
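At inference time the pipeline described above is: grab a frame, preprocess it, run the classifier, and map the highest-scoring class back to a letter. A minimal sketch of that loop follows; the model filename, input size, normalization, and use of Keras are assumptions, not the project's confirmed stack.

```python
import string

# Class labels in the same order the model was trained on (assumed A-Z).
LABELS = list(string.ascii_uppercase)

def top_prediction(probs, labels=LABELS):
    """Return (label, confidence) for the highest-scoring class."""
    if len(probs) != len(labels):
        raise ValueError("probability vector does not match label count")
    best = max(range(len(probs)), key=lambda i: probs[i])
    return labels[best], probs[best]

if __name__ == "__main__":
    # Live loop: these imports and the model file are only needed for
    # real-time capture and are assumptions about the setup.
    import cv2
    import numpy as np
    from tensorflow.keras.models import load_model

    model = load_model("isl_model.h5")  # hypothetical trained classifier
    cap = cv2.VideoCapture(0)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # Assumed preprocessing: resize to the model's input size, scale to [0, 1].
        img = cv2.resize(frame, (64, 64)).astype("float32") / 255.0
        probs = model.predict(np.expand_dims(img, 0), verbose=0)[0]
        label, conf = top_prediction(list(probs))
        cv2.putText(frame, f"{label} ({conf:.2f})", (10, 30),
                    cv2.FONT_HERSHEY_SIMPLEX, 1, (0, 255, 0), 2)
        cv2.imshow("ISL recognition", frame)
        if cv2.waitKey(1) & 0xFF == 27:  # Esc quits
            break
    cap.release()
    cv2.destroyAllWindows()
```

Keeping `top_prediction` as a pure function makes the label-mapping logic easy to test independently of the camera and model.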
Hackathon Progress
We built a working prototype that recognizes two-hand ISL signs in real time using a custom dataset. The model is integrated into a video call interface, enabling live sign translation. UI and backend integration are in progress, with scalability and additional gestures planned next.