Sign Language recognition using image-based hand gesture recognition techniques



EOI: 10.11242/viva-tech.01.04.183




Citation

Prasad Suhas Tandel, Prof. Sonia Dubey, "Sign Language recognition using image-based hand gesture recognition techniques", VIVA-IJRI, Volume 1, Issue 4, Article 183, pp. 1-6, 2021. Published by Computer Engineering Department, VIVA Institute of Technology, Virar, India.

Abstract

Gesture is one of the most commonly used forms of sign language in non-verbal communication. It is used chiefly by deaf and mute people, who have difficulty hearing or speaking, to communicate among themselves or with ordinary people. Various sign-language systems have been developed around the world, but they are neither flexible nor affordable for end users. This paper therefore presents software for a system that can automatically recognize sign language, helping deaf and mute people communicate more effectively with each other or with ordinary people. Pattern recognition and hand-gesture recognition are growing fields of research. Hand gestures, being an integral part of non-verbal communication, play a major role in our daily lives. A hand-gesture recognition system offers a new, natural and easy-to-use way of communicating with a computer that feels familiar to humans. Exploiting the common structure of the human hand, with four fingers and one thumb, the software aims to provide a real-time hand recognition system based on extracting structural features such as hand position, mass centroid, finger positions, and whether the thumb and fingers are raised or folded.
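As a rough illustration of the structural-feature pipeline described in the abstract, the snippet below (not taken from the paper; a minimal OpenCV/Python sketch with illustrative skin-colour thresholds, assuming an OpenCV 4 installation) segments the hand, computes the mass centroid from image moments, and estimates the number of raised fingers from convexity defects between fingertips.

```python
import cv2
import numpy as np

def analyze_hand(frame):
    """Return (centroid, raised_finger_count) for the largest skin-coloured blob."""
    # Skin segmentation in HSV; threshold values are illustrative, not from the paper.
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, (0, 30, 60), (20, 150, 255))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

    # The largest external contour is assumed to be the hand (OpenCV >= 4 signature).
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None, 0
    hand = max(contours, key=cv2.contourArea)

    # Mass centroid from image moments.
    m = cv2.moments(hand)
    if m["m00"] == 0:
        return None, 0
    centroid = (int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"]))

    # Convexity defects: a narrow, deep valley between two hull points indicates
    # the gap between two raised fingers.
    hull = cv2.convexHull(hand, returnPoints=False)
    defects = cv2.convexityDefects(hand, hull)
    gaps = 0
    if defects is not None:
        for s, e, f, _ in defects[:, 0]:
            start, end, far = hand[s][0], hand[e][0], hand[f][0]
            a = np.linalg.norm(end - start)
            b = np.linalg.norm(far - start)
            c = np.linalg.norm(end - far)
            angle = np.arccos((b**2 + c**2 - a**2) / (2 * b * c + 1e-6))
            if angle < np.pi / 2:
                gaps += 1
    return centroid, gaps + 1 if gaps else 0

# Hypothetical usage on a webcam frame:
# cap = cv2.VideoCapture(0)
# ok, frame = cap.read()
# if ok:
#     print(analyze_hand(frame))
```

Classifying whether each individual finger or the thumb is raised or folded, as the abstract describes, would then follow from the relative positions of the fingertip points and the centroid; the sketch above covers only the feature-extraction step.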

Keywords

Communication System, gesture, hand recognition, pattern, sign language

References

  1. Christopher Lee and Yangsheng Xu, "Online, interactive learning of gestures for human robot interfaces", The Robotics Institute, Carnegie Mellon University, Pittsburgh, Pennsylvania, USA, 1996.
  2. Richard Watson, "Gesture Recognition Techniques", Technical Report TCD-CS-93-11, Department of Computer Science, Trinity College, Dublin, July 1993.
  3. Ray Lockton, "Hand Gesture Recognition Using Computer Vision", 4th Year Project Report, Balliol College, Department of Engineering Science, Oxford University, 2000.
  4. P. T. Hai, H. C. Thinh, B. Van Phuc and H. H. Kha, "Automatic feature extraction for Vietnamese sign language recognition using support vector machine", 2018 2nd International Conference on Recent Advances in Signal Processing, Telecommunications & Computing (SigTelCom), 2018.
  5. Purva C. Badhe, Vaishali Kulkarni, "Indian Sign Language Translator using Gesture Recognition Algorithm", 2015 IEEE International Conference on Computer Graphics, Vision and Information Security (CGVIS).
  6. P. Gajalaxmi, T. SreeSharmila, "Sign Language Recognition for Invariant Features Based on Multiclass Support Vector Machine with BeamECOC Optimization", IEEE International Conference on Power, Control, Signals and Instrumentation Engineering (ICPCSI), 2017.
  7. Etsuko Ueda, Yoshio Matsumoto, Masakazu Imai, Tsukasa Ogasawara, "Hand Pose Estimation for Vision-Based Human Interface", IEEE Transactions on Industrial Electronics, Vol. 50, No. 4, pp. 676-684, 2003.
  8. Proceedings of the International MultiConference of Engineers and Computer Scientists 2009, Vol. I, IMECS 2009, Hong Kong, March 18-20, 2009.
  9. Rashmi D. Kyatanavar, P. R. Futane, "Comparative Study of Sign Language Recognition Systems", Department of Computer Engineering, Sinhgad College of Engineering, Pune, India; International Journal of Scientific and Research Publications, Volume 2, Issue 6, June 2012, ISSN 2250-3153.
  10. Md Azher Uddin, Shayhan Ameen Chowdhury, "Hand Sign Language Recognition for Bangla Alphabet using Support Vector Machines", 2016 International Conference on Innovations in Science, Engineering and Technology (ICISET).
  11. Muhammad Aminur Rahaman, Mahmood Jasim, Md. Haider Ali and Md. Hasanuzzaman, "Real-Time Computer Vision based Bengali Sign Language Recognition", 2014 17th International Conference on Computer and Information Technology (ICCIT).
  12. T. Starner, J. Weaver and A. Pentland, "Real-time American sign language recognition using desk and wearable computer based video", IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 20, pp. 1371-1375, 1998.
  13. T. Starner and A. Pentland, "Real-time American sign language recognition from video using hidden Markov models", in Motion-Based Recognition, Springer, pp. 227-243, 1997.
  14. Rafael C. Gonzalez and Richard E. Woods, "Digital Image Processing" (2nd Edition), January 15, 2002, ISBN-10: 0201180758, ISBN-13: 978-0201180756.