American Sign Language Translation from Deaf-Mute People Based on Novel System

Keywords: American Sign Language, Arduino Nano, Google Colab, neural network, Python, slide potentiometer

Authors

  • Batool A. Sahm
    pgs.batool.abdullah@uobasrah.edu.iq
    Department of Computer Engineering, College of Engineering, University of Basrah, Al Hartha, Basra 61009, Iraq
  • Hassanin Al-Fahaam Department of Computer Engineering, College of Engineering, University of Basrah, Al Hartha, Basra 61009, Iraq
  • Abbas A. Jasim College of Oil and Gas Engineering, Basrah University for Oil and Gas, Al Quma, Basra 61016, Iraq
April 30, 2024

This paper presents a system that translates gestures of the American Sign Language alphabet using an instrumented wearable glove. The system represents an attempt to employ slide potentiometers in sign language translation. The hardware part consists of five slide potentiometers and two force-sensitive resistors, positioned on a glove at locations determined by an analysis of the American Sign Language (ASL) letters. The software part uses a neural network built and trained in Google Colab with Python as the programming language. The performance of the system was evaluated on three data sets with different numbers of samples, and the letters corresponding to the performed gestures are displayed on a computer screen in real time.
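
To illustrate the software side described in the abstract, the sketch below shows how a neural network of this kind could be defined and used in Python with Keras. The layer sizes, activations, training settings, and the predict_letter helper are assumptions for illustration only, not the authors' actual configuration.

    # Minimal sketch (assumed architecture, not the authors' code) mapping the
    # glove's seven sensor readings (five slide potentiometers and two
    # force-sensitive resistors) to the 26 ASL alphabet letters.
    import numpy as np
    from tensorflow import keras

    NUM_SENSORS = 7   # 5 slide potentiometers + 2 force-sensitive resistors
    NUM_LETTERS = 26  # ASL alphabet classes

    model = keras.Sequential([
        keras.layers.Input(shape=(NUM_SENSORS,)),
        keras.layers.Dense(64, activation="relu"),
        keras.layers.Dense(64, activation="relu"),
        keras.layers.Dense(NUM_LETTERS, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])

    # X_train: (n_samples, 7) normalized sensor readings; y_train: labels 0-25.
    # model.fit(X_train, y_train, epochs=100, validation_split=0.2)

    def predict_letter(reading):
        """Classify one reading (7 sensor values scaled to [0, 1]) as a letter."""
        x = np.asarray(reading, dtype=np.float32).reshape(1, -1)
        probs = model.predict(x, verbose=0)
        return chr(ord("A") + int(np.argmax(probs)))

In a real-time setup of this kind, readings streamed from the Arduino Nano over a serial connection would be scaled, passed to predict_letter, and the returned character shown on the computer screen.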