Design and Implementation of a Speech-to-Sign Robotic Arm for Deaf Communication
Keywords: Assistive Robotics, Sign Language, Robotic Hand, Embedded Systems, Human–Robot Interaction

Abstract
Sign language is a critical communication tool for people who are deaf or hard of hearing, but it poses significant barriers when the other parties in a conversation do not know it. These disparities have become a target of recent advances in assistive technology; however, most available solutions rely on screen-based avatars or computationally expensive vision systems. In this paper, we design and implement a low-cost robotic hand that expresses sign language in physical form using an on-board control unit. The proposed system covers the English finger-spelling alphabet and combines an Arduino microcontroller, servo motor actuation, and a 3D-printed mechanical structure to perform character-level sign language gestures. A modular architecture keeps the design simple, cost-effective, and flexible. Experimental assessment is provided through qualitative visual analysis of the performed gestures, showing stable, well-defined finger postures that can serve as a reference for assistive and educational purposes. Although the current prototype is limited to static gestures, the results demonstrate that a physical robotic hand can practically support sign language communication and can form the basis of a future dynamic, intelligent sign language translation system.
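The character-level approach described above amounts to a lookup from each finger-spelled letter to a fixed set of servo targets. The following is a minimal host-side sketch of such a posture table, assuming a five-servo hand (thumb through pinky) and a 0°–180° extended-to-curled convention; the letter-to-angle values are hypothetical placeholders, not the calibrated postures of the actual prototype.

```cpp
#include <array>
#include <map>

// Five servo target angles in degrees, ordered thumb..pinky.
// Assumed convention: 0 = fully extended, 180 = fully curled.
using Posture = std::array<int, 5>;

// Hypothetical static posture table for a few finger-spelled letters.
const std::map<char, Posture> kPostures = {
    {'A', {0, 180, 180, 180, 180}},  // thumb out, four fingers curled
    {'B', {180, 0, 0, 0, 0}},        // thumb tucked, fingers extended
    {'L', {0, 0, 180, 180, 180}},    // thumb and index extended
};

// Return the servo targets for a letter, or a neutral open hand
// when the letter has no entry in the table.
Posture postureFor(char letter) {
    auto it = kPostures.find(letter);
    if (it == kPostures.end()) {
        return {0, 0, 0, 0, 0};
    }
    return it->second;
}
```

On the Arduino itself, each returned angle would simply be written to the corresponding servo channel; keeping the table as plain data is what makes the architecture modular, since new letters are added without touching the control loop.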