Authors:
Ledon Jay B. Jordan; Javidec D. Monsion; Kane Joy O. Urbayo; Hernan Jr. E. Trillano
Volume/Issue:
Volume 10, Issue 3 (March 2025)
Google Scholar:
https://tinyurl.com/bdz6v4pf
Scribd:
https://tinyurl.com/db9zkw8k
DOI:
https://doi.org/10.38124/ijisrt/25mar1295
Abstract:
Communication is fundamental to human interaction, enabling individuals to express thoughts, emotions, and ideas. However, deaf individuals face unique challenges that call for alternative methods of bridging the gap between the hearing and non-hearing communities, and despite technological advancements, communication barriers persist within the deaf community. This study aimed to address these challenges by developing the Speech Interpretation and Gesture Notation (S.I.G.N.) application, a communication tool designed to help deaf and hearing individuals understand each other. The application provides real-time speech and gesture translation, facilitating more accessible interactions in a variety of situations. Development followed the Incremental Agile Model, allowing each feature of the S.I.G.N. application to be built, tested, and refined iteratively based on ongoing feedback. The application integrates OpenCV and MediaPipe for sign language recognition to support accurate translation. Results indicated that the S.I.G.N. application is both practical and feasible for enhancing interactions between deaf or mute individuals and hearing individuals, offering a modern and inclusive solution to communication challenges. Implementing such a system is therefore essential to improving accessibility and inclusivity for all users.
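
The paper does not publish its source code, but the gesture-to-text direction it describes can be illustrated with a short sketch. The example below assumes the opencv-python and mediapipe packages; the classify_gesture() function is a hypothetical placeholder for the study's actual recognition model, which the abstract does not detail.

```python
# Minimal sketch: hand-landmark detection with OpenCV + MediaPipe, the kind
# of front end the S.I.G.N. application builds on. classify_gesture() is a
# hypothetical stand-in for a trained sign classifier.
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands
mp_draw = mp.solutions.drawing_utils

def classify_gesture(landmarks):
    # Placeholder: a real system would feed the 21 (x, y, z) hand landmarks
    # into a trained model and return a sign label.
    return "?"

cap = cv2.VideoCapture(0)
with mp_hands.Hands(max_num_hands=2, min_detection_confidence=0.7) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV captures frames in BGR.
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            for hand in results.multi_hand_landmarks:
                mp_draw.draw_landmarks(frame, hand, mp_hands.HAND_CONNECTIONS)
                label = classify_gesture(hand.landmark)
                cv2.putText(frame, label, (10, 30),
                            cv2.FONT_HERSHEY_SIMPLEX, 1, (0, 255, 0), 2)
        cv2.imshow("S.I.G.N. sketch", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
cap.release()
cv2.destroyAllWindows()
```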
The System Usability Scale (SUS) evaluation yielded an average score of 85.75, indicating that the system was highly usable and effective. Both deaf or mute respondents and hearing users provided positive feedback, confirming that the system functioned as intended: it facilitated smooth communication by converting gestures to text and speech to gestures. While the system was generally easy to use, some respondents suggested that improving real-time translation and expanding the sign language database could further enhance the user experience. Overall, the S.I.G.N. application met its goal of bridging communication gaps and promoting better interaction between deaf and hearing individuals.
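
For context, the reported average of 85.75 follows the standard SUS scoring procedure: each odd-numbered item contributes (response - 1), each even-numbered item contributes (5 - response), and the sum is multiplied by 2.5 to map onto a 0-100 scale. The sketch below shows that computation; the sample responses are illustrative and are not data from the study.

```python
# Standard SUS scoring: odd items contribute (r - 1), even items (5 - r),
# summed and scaled by 2.5 to a 0-100 range.
def sus_score(responses):
    assert len(responses) == 10, "SUS has exactly ten items"
    total = 0
    for i, r in enumerate(responses, start=1):
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

# Example: one respondent's hypothetical 1-5 answers to the ten SUS items.
print(sus_score([5, 2, 4, 1, 5, 2, 5, 1, 4, 2]))  # -> 87.5

# An overall result such as 85.75 is the mean of per-respondent scores.
```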
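The speech-to-gesture direction can likewise be sketched as speech-to-text followed by a lookup from recognized words to stored sign animations. The example below uses the SpeechRecognition package's Google Web Speech backend; the gesture_clips table is a hypothetical stand-in for the application's sign database, whose contents the paper does not list.

```python
# Minimal sketch of the speech-to-gesture direction: transcribe microphone
# audio, then map recognized words to stored sign-animation clips.
# gesture_clips is a hypothetical placeholder for the app's sign database.
import speech_recognition as sr

gesture_clips = {
    "hello": "clips/hello.mp4",
    "thank": "clips/thank_you.mp4",
    "you": "clips/you.mp4",
}

recognizer = sr.Recognizer()
with sr.Microphone() as source:
    recognizer.adjust_for_ambient_noise(source)
    audio = recognizer.listen(source)

try:
    text = recognizer.recognize_google(audio)  # Google Web Speech API
    for word in text.lower().split():
        clip = gesture_clips.get(word)
        print(word, "->", clip if clip else "(no sign clip found)")
except sr.UnknownValueError:
    print("Speech was not understood.")
```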
Keywords:
Sign Language, Gesture-to-Text, Speech-to-Gesture, Accessibility, Communication System.