ABSTRACT
Communication plays a vital role in our lives, and disabled people (deaf, dumb, blind, etc.) face many problems when
they try to communicate with others. Sign language is one of the most commonly used media for establishing
communication between disabled people and others. This paper describes an aiding device for deaf, dumb and physically
challenged people. The user wears a glove fitted with flex sensors whose resistance changes with each gesture. Each
flex sensor produces a voltage change, which the Raspberry Pi processes to display the code corresponding to the
gesture on an LCD and to play the matching sound code through a speaker. The device gains further versatility by also
establishing communication via GSM.
Keywords: Sign language, microcontroller, calling, communication
1. INTRODUCTION
Deaf and dumb people communicate with each other by means of sign language. A sign language gesture is a
predefined movement of the fingers and palm of the hands that forms a particular sign. First, the user forms a gesture;
recognizing the gesture once it has been captured is then challenging, especially in a continuous stream. The problems
faced by deaf and dumb people fall into two categories: communicating with other people, and interacting with
technology.
The device described in this paper aims to solve the above-mentioned problems of deaf and dumb people. Gestures that
the user forms in American Sign Language are converted into speech, and the speech is transmitted through an
interfaced mobile phone, thus enabling deaf and dumb people to use one. With mobile phones cheap and widely
available, the device can be made easily accessible to anyone, allowing deaf and dumb people to communicate easily
with other people, both in person and over the phone.
2. LITERATURE SURVEY
2.1. Previous Work
The earlier work of Ankit P. Parmar and Dr. Nehal G. Chitaliya [1] on a gesture recognition system for Indian sign
language on a smartphone is quite interesting. They use an Android-based smartphone application as a sign language
interpreter: the phone's camera module captures the hand movement, and the captured image is compared with
predefined database images. If a match is found, the corresponding text is displayed, establishing communication; if
not, the process is repeated until a match is found. However, the whole pipeline from image capture to image
processing is lengthy and not always accurate. Accuracy drops when the same gesture can have two meanings, or when
the flexibility of the hand produces a gesture the user did not intend to convey. The effort of image processing can
therefore be reduced by using sensors to capture real-time data, which is the approach taken in this paper.
The work of Supriya Shevate, Nikita Chorage and Sidhee Walunj [2] on a gesture-based vocalizer for the deaf and
dumb makes use of sensors: an accelerometer for tilt detection, a flex sensor for bend detection, a microcontroller, an
ADC and speech synthesis. A wireless data glove carries a flex sensor on each finger with the microcontroller mounted
on it; the flex sensor and accelerometer readings are fed to the microcontroller, processed, and displayed on an LCD.
This design has certain limitations: the accelerometer measures only linear orientation, which reduces the accuracy of
the output. To increase accuracy, the module described in this paper uses the MPU6050, which measures angular as
well as linear orientation.
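As a concrete illustration of this kind of sensing step (not taken from either paper), the bend resistance of a flex sensor can be recovered from an ADC count when the sensor is wired as one leg of a voltage divider. The supply voltage, fixed-resistor value and 10-bit ADC below are assumptions; the Raspberry Pi has no built-in ADC, so an external chip such as an MCP3008 would be needed in practice.

```python
# Illustrative sketch: recover a flex sensor's resistance from an ADC count,
# assuming the sensor is the upper leg of a voltage divider with a 10 kOhm
# fixed resistor, a 3.3 V supply, and a 10-bit ADC reading the midpoint.

V_SUPPLY = 3.3      # divider supply voltage (volts) - assumed
R_FIXED = 10_000.0  # fixed resistor in the divider (ohms) - assumed
ADC_MAX = 1023      # full-scale count of a 10-bit ADC

def adc_to_voltage(count: int) -> float:
    """Convert a raw ADC count to the voltage across the fixed resistor."""
    return V_SUPPLY * count / ADC_MAX

def flex_resistance(count: int) -> float:
    """Solve the voltage-divider equation for the flex sensor's resistance."""
    v_out = adc_to_voltage(count)
    # V_out = V_supply * R_fixed / (R_flex + R_fixed)  =>
    # R_flex = R_fixed * (V_supply - V_out) / V_out
    return R_FIXED * (V_SUPPLY - v_out) / v_out
```

Bending the finger raises the sensor's resistance, which lowers the ADC count, so each gesture maps to a distinct pattern of counts across the five fingers.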
None of the work described above provides a way to communicate with people who are far away, whereas the module
described here uses GSM, which lets disabled users communicate over long distances.
3. RELEVANCE
Existing modules use image processing to capture the gesture and match the segmented picture against a database,
which is a slow process and not very accurate. Other modules use flex sensors; these work well but give varying
readings for equivalent gestures. The module described in this paper uses flex sensors together with an accelerometer
and gyroscope, which yield proper orientation and gesture values, and it uses GSM to communicate with people living
far away.
4. DEVICE WORKING
All the components required for the module are attached to a glove. The Raspberry Pi which is mounted on glove is
interfaced with remaining components. When the user makes a gesture, the orientation of palm is judged by MPU6050
and resistance value equivalent to finger bending is recorded and sent for further processing. The values are compared
with the value which is already stored in a database and response related to it is either shown on LCD or hear through
speaker. Further this response will be sent over GSM (using SIM 900) so that disabled (dumb, deaf and blind) can
speak to people staying far away.
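The comparison step described above can be sketched as a nearest-neighbour lookup against the stored gesture database. The labels, template vectors and threshold below are illustrative assumptions, not the paper's actual database:

```python
# Hedged sketch: classify a sensor reading (five flex values plus roll and
# pitch from the MPU6050) by finding the closest stored gesture template.
import math

# gesture name -> reference vector [flex1..flex5, roll_deg, pitch_deg]
# (values are made-up placeholders for illustration)
GESTURE_DB = {
    "HELLO": [0.1, 0.1, 0.1, 0.1, 0.1,   0.0,  0.0],
    "YES":   [0.9, 0.9, 0.9, 0.9, 0.2,   0.0, 45.0],
    "NO":    [0.9, 0.2, 0.2, 0.9, 0.9, -90.0,  0.0],
}
MATCH_THRESHOLD = 30.0  # reject readings too far from every template

def distance(a, b):
    """Euclidean distance between two sensor vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def classify(reading):
    """Return the nearest gesture label, or None if nothing is close enough."""
    label, best = min(GESTURE_DB.items(), key=lambda kv: distance(reading, kv[1]))
    return label if distance(reading, best) <= MATCH_THRESHOLD else None
```

Once a label is found, the same string can be shown on the LCD, passed to the speech output, or forwarded over GSM.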
6. FLOW MODEL
7. HARDWARE DESIGN
i) Flex sensor- a resistance-based sensor whose resistance varies with finger bending.
ii) SIM900- a wireless GSM module used to communicate with people far away.
iii) Keypad- an input device used to type a set of numbers.
iv) LCD- a display device that shows the output response.
v) MPU6050- a combined accelerometer and gyroscope that measures hand orientation.
vi) Raspberry Pi- a small single-board computer used to interface the other components and process the data stored in
the database.
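The SIM900 is driven with standard AT commands over its serial UART. The sketch below only builds the byte sequences for sending an SMS in text mode; actually transmitting them would need a serial link from the Pi (e.g. via pyserial), and the phone number shown is a placeholder:

```python
# Sketch of the standard SIM900 text-mode SMS dialogue. Each entry is one
# write to the module's UART; the module answers OK / '>' between steps.

CTRL_Z = b"\x1a"  # terminates the message body in SMS text mode

def sms_command_sequence(number: str, text: str) -> list:
    """Build the AT-command byte sequence for sending one SMS."""
    return [
        b"AT\r",                                  # sanity check, expect OK
        b"AT+CMGF=1\r",                           # select SMS text mode
        b'AT+CMGS="' + number.encode() + b'"\r',  # start message, expect '>'
        text.encode() + CTRL_Z,                   # body; Ctrl+Z sends it
    ]
```

In the device, the `text` argument would be the response string matched from the gesture database, so a far-away recipient receives the same message that is shown on the LCD.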
8. RESULT ANALYSIS
The readings of the MPU6050, showing the orientation of the hand in terms of roll and pitch angles, are as follows:
Table 1. MPU6050 roll and pitch readings
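Roll and pitch can be derived from the MPU6050's raw accelerometer axes in the usual way, assuming gravity is the only acceleration acting on the glove. This is a generic formulation, not necessarily the exact computation behind Table 1, and axis conventions vary with how the sensor is mounted:

```python
# Common tilt-from-gravity formulas for a 3-axis accelerometer such as the
# one inside the MPU6050. Inputs can be in any consistent unit (g or raw
# counts), since only ratios of the axes matter.
import math

def roll_pitch_deg(ax: float, ay: float, az: float):
    """Roll and pitch, in degrees, from accelerometer readings."""
    roll = math.degrees(math.atan2(ay, az))
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    return roll, pitch

# A flat, palm-down hand (gravity entirely on the z axis) gives
# roll = pitch = 0; rotating the wrist moves gravity onto the y axis
# and the roll angle grows toward 90 degrees.
```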
9. FUTURE SCOPE
The device helps dumb and deaf people in marketplaces and the public sector to communicate with others. It can also
play a vital role in robotics, medicine and musical instruments by replacing physical buttons and switches with hand
gestures.
10. CONCLUSION
This system will be able to provide a calling facility to dumb people and will also serve as an interpreter. Moreover, it
will reduce the communication gap between common people and physically challenged people.
REFERENCES
[1]. Channaiah Chandana K, Nikhita K, Nikitha P, Bhavani N K, Sudeep J, "Hand Gestures Recognition System for
Deaf, Dumb and Blind People", International Journal of Innovative Research in Computer and Communication
Engineering (IJIRCCE), Vol-5, Issue-5, May 2017, ISSN: 2320-9801, 2320-9798.
[2]. Supriya Shevate, Nikita Chorage, Siddhee Walunj, Moresh M. Mukhedkar, "Gesture based Vocalizer for Deaf and
Dumb", International Journal of Advanced Research in Computer and Communication Engineering (IJARCCE),
Vol-5, Issue-3, March 2016, ISSN: 2278-1021, 2319-5940.
[3]. Sruthi Dinesh, “Talking Glove-A Boon for the Deaf, Dumb and Physically Challenged”, International Journal of
Advanced Research in Electronics and Communication Engineering (IJARECE), Vol-4, Issue-5, May 2015, ISSN:
2278-909x.
[4]. Anju V. Sathyan, Bindu V, "Aiding System for Deaf and Dumb Person Based on Speech and Image Processing
Algorithm", International Journal of Advance Engineering and Research Development (IJAERD), Vol-1, Issue-5,
May 2014, ISSN: 2348-4470, 2348-6406.
[5]. Vikram Sharma M, Vinay Kumar N, Shruti C Masaguppi, Suma MN, D R Ambika, “Virtual Talk for Deaf, Mute
and Normal Humans”, Texas Instruments India Educator’s Conference, Vol-2, Issue-4, 2013, ISSN: 4523-2257.
[6]. Michela Borghetti, Emilio Sardini, and Mauro Serpelloni, “Sensorized Glove for Measuring Hand Finger Flexion
for Rehabilitation Purposes”, IEEE Transactions On Instrumentation And Measurement, Vol. 62, No. 12,
December 2013.
[7]. Prashanth Suresh, Niraj Vasudevan, Nilesh Ananthanarayanan, "Computer-aided Interpreter for Hearing and
Speech Impaired", Fourth International Conference on Computational Intelligence, Communication Systems and
Networks, 2012, pp. 364-367.
[8]. Supawadee Saengsri, Vit Niennattrakul, and Chotirat Ann Ratanamahatana, "TFRS: Thai Finger-Spelling Sign
Language Recognition System", IEEE, March 2012, pp. 457-462.
[9]. Kunal Kadam, Rucha Ganu, Ankita Bhosekar, Prof. S. D. Joshi, "American Sign Language Interpreter", IEEE
Fourth International Conference on Technology for Education, July 2012.
[10]. Laura Dipietro, Angelo M. Sabatini, Senior Member, IEEE, and Paolo Dario, Fellow, IEEE, “A Survey of Glove-
based Systems and their Applications”, IEEE Transactions on Systems, Man and Cybernetics,-Part C: Applications
and Reviews, Vol.38, No.4, July 2008.
[11]. S. Sidney Fels and Geoffrey E. Hinton, "Glove-TalkII: A Neural-Network Interface which Maps Gestures to
Parallel Formant Speech Synthesizer Controls", IEEE Transactions on Neural Networks, Vol. 9, No. 1, January 1998.
[12]. Rung-Huei Liang, Ming Ouhyoung, "A Real-time Continuous Gesture Recognition System for Sign Language",
IEEE International Conference on Automatic Face and Gesture Recognition, pp. 558-567, Japan, 1998.