The International Journal of Engineering
And Science (IJES)
||Volume|| 1 ||Issue|| 2 ||Pages|| 19-25 ||2012||
 ISSN: 2319 – 1813 ISBN: 2319 – 1805

                                   HAND SIGN INTERPRETER
               1,Ajinkya Raut, 2,Vineeta Singh, 3,Vikrant Rajput, 4,Ruchika Mahale
                                          1,2,3,4, PES MCOE, PUNE




--------------------------------------------------------------- Abstract------------------------------------------------------------
For many deaf people, sign language is the principal means of communication, and normal people around
them often face problems while communicating. This paper presents a system prototype that is able to automatically
recognize sign language, helping normal people communicate more effectively with speech-impaired people.
The system recognizes hand signs with the help of a specially designed glove; the recognized gestures are
translated into a voice signal in real time. The system thus reduces the communication gap between normal people
and the speech impaired.
Keywords—speech impaired, hand signs, sensor glove, easy communication.

-----------------------------------------------------------------------------------------------------------------
Date of Submission: 16 November 2012                                               Date of Publication: 5 December 2012
----------------------------------------------------------------------------------------------------------------

1. INTRODUCTION
         ‘Sign Language Interpreter’ is a recognition system for the vocally disabled. We have adopted a glove-
based technique, as it is more practical for gesture recognition: a specially designed sensor glove produces a
signal corresponding to each hand sign. Because the performance of the glove is not affected by light, electric or
magnetic fields, or other disturbances, the data it generates is accurate. The processor analyses the signal and
fetches the corresponding audio signal from the memory IC, which is fed to the amplifier, and the speaker
generates the relevant sound. As the system uses low-cost, easily available sensors and ICs, it is very
cost-effective.

2. LITERATURE SURVEY
        In recent years there has been tremendous research on hand sign recognition. Gesture recognition
technology is divided into two categories:

    Vision-based methods
         In vision-based methods, a computer camera is the input device for observing the information of hands or
fingers. Vision-based methods require only a camera, thus realizing a natural interaction between humans
and computers without the use of any extra devices. These systems tend to complement biological vision by
describing artificial vision systems that are implemented in software and/or hardware. This poses a challenging
problem, as these systems need to be background invariant, lighting insensitive, and person and camera independent
to achieve real-time performance. Moreover, such systems must be optimized to meet the requirements,
including accuracy and robustness. [5]

    Glove-based methods
         In glove-based systems, data gloves are used which can capture the accurate positions of hand gestures,
as these positions are directly measured. Data-glove based methods use sensor devices for digitizing hand and
finger motions into multi-parametric data. The extra sensors make it easy to collect hand configuration and
movement; however, the devices are quite expensive and cumbersome for the user.
Some of the earlier gesture recognition systems attempted to identify gestures using glove-based devices that
would measure the position and joint angles of the hand. However, these devices are very cumbersome and
usually have many cables connected to a computer. This has brought forth the motivation of using non-intrusive,
vision-based approaches for recognizing gestures. In addition, the sensors used for sign language detection
and gesture recognition in the systems available in the market are quite costly.



          In computer recognition of spoken language, speech data is captured using a microphone connected to
an ADC. Similarly a data-capturing device is also required in order to recognize sign language; in this case
measuring the position and movement of the signer's hands [1]. Two broad categories of input hardware have
been used for recognition of hand gestures – glove-based devices such as those used by Kramer et al (1989) [4]
and Fels et al (1993) [2], and camera-based systems as used by Holden (1993). The latter approach has some
benefits, particularly as it does not require specialized hardware, but this is offset by the complexity of the
computer vision problems faced in extracting the necessary data about the hands from a visual image. Therefore
for this research glove-based input was used, as this allowed the research effort to be focused on the area of sign
recognition rather than that of data capturing. The specific input devices used in developing SLARTI were a
Cyber Glove, and a Polhemus Iso Trak. The Cyber Glove measures the degree of flexing of the various joints of
the hand and wrist. The version of the Cyber Glove used for this research provides 18 sensors. The Polhemus
allows tracking of the spatial position and orientation of the hand with respect to the fixed electro-magnetic
source [1]. Linguistic analysis of sign language has revealed that signs can be described in terms of four basic
manual features, which may be modified in meaning by more subtle factors such as body language and facial
expression [3]. The hand shape defines the configuration of the joints of the hand. Orientation specifies the
direction in which the hand and fingers are pointing, whilst the place of articulation is the location of the hand
relative to the body. The most complex feature is motion, which consists of a change over time of any
combination of the other three. The task of transforming a stream of input data directly into a classification of
the sign is an extremely difficult task. The approach taken within SLARTI was to initially process the input
data so as to produce a description of this sequence in terms of the four features discussed above. The sign is
classified on this basis. The SLARTI system consists of four separate feature-extraction neural networks each
trained specifically to recognize one of the features of the sign.

         This approach of decomposing the problem and applying a modular structure of networks has a number
of benefits. First, as demonstrated on the task of speech recognition by Waibel et al. (1989), it allows the use of
several smaller networks rather than one massive network and thereby reduces the amount of training time and
data required. It may also result in superior classification accuracy. Second, it produces a system which can
more easily be extended. The features recognized by the feature-extraction networks are expressive enough to
describe an extremely large number of signs, not all of which may be recognized by the final classifier. If the
vocabulary of the system is to be extended then only the final classifier will require modification. This greatly
reduces the costs involved in performing such expansion of the system, and makes it practical to tailor the
vocabulary of the system to a particular user.

Projects available in the market:
        In the recent era there has been tremendous development in the work of sign language interpretation, and
many systems have been designed for gesture recognition. Following the two categories above, many systems are
now available in the market.
1. Vision-based
2. Glove-based

Comparison:
         In the case of vision-based gesture recognition systems, a great deal of digital signal processing and
extensive programming are required, because of which the response of the system is quite slow. Electric and
magnetic fields and other disturbances may also affect the performance of the system. The glove-based sign
language recognition systems available in the market require a lot of hardware, with large numbers of cables
connected to computers for monitoring the data; hence those systems require a lot of space and are not handy.
In our project, the flex sensors we use are of low cost and the ARM processor is very compact, so the space
required for our system is much less than for the other projects available in the market.

         Thus the system is portable, and its performance is not affected by disturbances. Since the hand signs are
converted into the corresponding speech signal, the system is a proper means of effective communication. As the
hardware required for the design is low cost, the overall cost of the system is less than that of the other systems
available in the market, and the system is flexible enough for a user to add, modify or delete a hand sign.





3. RESEARCH ELABORATIONS:
SYSTEM HARDWARE:
          The system hardware consists of an accelerometer and Flex sensors attached to the glove, which are the
input devices. The signals generated by these two sensors are given to a voltage divider and then to the
processor, the LPC2138, which acts as the heart of the system. Flex sensors, made of a resistive carbon element,
change their resistance according to the bending of the sensor (finger); five Flex sensors are mounted on the five
fingers of the hand. The accelerometer works on the concept of gravity and serves the purpose of sensing
direction. The output of a Flex sensor is a varying resistance, which is converted into a voltage signal using a
voltage divider bias circuit. The accelerometer output and the voltage signals are then provided to the inbuilt
ADC of the processor, and the received output from the sensors is continuously monitored. The processor is
interfaced with a memory IC which acts as an input/output device: various audio files are recorded and stored in
the memory IC for the different hand signs, so the processor acts as master and the external memory acts as
slave. Depending on the output voltage from the Flex sensors and the accelerometer, the processor checks the
ADC count in the lookup table, and if an audio signal corresponding to the ADC count is found, it is fetched
from the memory IC (external memory). The fetched data, which is in digital form, is converted into an analog
signal and provided to the amplifier and thus to the speaker. An LCD is used to display the corresponding output
and thus serves an important role for cross-verification. The output of the processor is too weak to drive the
loudspeaker, so an audio amplifier comes into the picture; the signal taken from the processor and applied to the
audio amplifier is an analog signal, produced by the DAC inbuilt in the processor. The speaker, at the end of the
system, plays the corresponding audio file. Using only a single glove restricts the system to the recognition of
one-handed signs (and hence eliminates the possibility of recognizing the Auslan manual alphabet, which is
two-handed), but it is envisaged that the techniques used in developing the system could be extended to
two-handed signs if appropriate input hardware were available.
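
To make this data flow concrete, the following C sketch models one pass of the recognition loop: sample the flex channels, match the counts against a lookup table, and play the matching clip. The table values, the simulated sensor readings and the helper routines are illustrative placeholders rather than the actual firmware of the prototype.

#include <stdio.h>
#include <stdint.h>
#include <stdbool.h>

#define NUM_FLEX 5                       /* one flex sensor per finger */

/* Lookup-table entry: a [lo, hi] ADC-count window per finger and the address
 * of the matching audio clip in the external memory IC (values illustrative). */
typedef struct {
    const char *name;
    uint16_t    lo[NUM_FLEX], hi[NUM_FLEX];
    uint32_t    clip_address;
} sign_entry_t;

static const sign_entry_t lookup[] = {
    { "HELLO", {800,   0,   0,   0, 800}, {1023,  560,  650,  450, 1023}, 0x0000 },
    { "A",     {  0, 900,   0,   0,   0}, { 560, 1023,  600,  450,  560}, 0x4000 },
};

/* Stand-ins for the real drivers (on-chip ADC, SPI dataflash, DAC + LM386);
 * here they simply simulate a fixed hand sign and print the action taken.   */
static uint16_t adc_read_flex(int finger)
{
    static const uint16_t simulated[NUM_FLEX] = {900, 300, 300, 200, 900};
    return simulated[finger];
}

static void play_clip(const char *name, uint32_t address)
{
    printf("playing clip for sign \"%s\" from address 0x%04lX\n",
           name, (unsigned long)address);
}

int main(void)
{
    uint16_t count[NUM_FLEX];

    /* 1. Sample every flex sensor through its voltage divider and the ADC.  */
    for (int i = 0; i < NUM_FLEX; i++)
        count[i] = adc_read_flex(i);

    /* 2. Compare the counts against the lookup table.                       */
    for (size_t s = 0; s < sizeof lookup / sizeof lookup[0]; s++) {
        bool match = true;
        for (int i = 0; i < NUM_FLEX && match; i++)
            match = (count[i] >= lookup[s].lo[i] && count[i] <= lookup[s].hi[i]);

        /* 3. On a match, fetch the corresponding audio file and play it.    */
        if (match) {
            play_clip(lookup[s].name, lookup[s].clip_address);
            return 0;
        }
    }
    printf("no matching hand sign\n");
    return 0;
}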




                                    Fig. 1 Block diagram of system

Flex Sensors:
         The output of the Flex sensor is a variable resistance: as the fingers are bent, the resistance of the Flex
sensor changes. This property is shown below; the short sketch after the electrical specifications estimates the
ADC counts that result.




                                   Working of Flex sensor







                          Fig. 2 Voltage divider for flex sensors

Electrical Specifications:
1. Flat resistance: 25 kΩ
2. Tolerance: 30%
3. Bend resistance range: 45 kΩ to 125 kΩ
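
As a rough worked example, the expected ADC counts at the ends of the bend range can be estimated from the divider relation Vout = Vcc x Rflex / (Rflex + Rfixed), taking the flex sensor as the lower leg of the divider. The 3.3 V supply, the 10 kΩ fixed resistor and the 10-bit ADC span are assumptions made only for illustration; the 25 kΩ flat and 45 kΩ to 125 kΩ bend resistances come from the specifications above.

#include <stdio.h>

/* Divider: Vcc -- Rfixed -- (node, read by the ADC) -- Rflex -- GND.
 * The node voltage rises as the flex sensor resistance rises with bending. */
static double divider_voltage(double vcc, double r_flex, double r_fixed)
{
    return vcc * r_flex / (r_flex + r_fixed);
}

int main(void)
{
    const double vcc     = 3.3;     /* assumed supply / ADC reference (V)   */
    const double r_fixed = 10e3;    /* assumed fixed divider resistor (ohm) */

    const double r[3]     = { 25e3, 45e3, 125e3 };        /* from the specs */
    const char  *label[3] = { "flat (25k)", "slight bend (45k)", "full bend (125k)" };

    for (int i = 0; i < 3; i++) {
        double v     = divider_voltage(vcc, r[i], r_fixed);
        int    count = (int)(v / vcc * 1023.0 + 0.5);    /* 10-bit ADC count */
        printf("%-18s Vout = %.2f V, ADC count ~ %d\n", label[i], v, count);
    }
    return 0;
}

With these particular component values, a flat finger reads roughly 730 counts and a fully bent one roughly 950; the divider values actually used in the prototype are not stated in the paper.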

Accelerometer:
         A 3-axis accelerometer is used in the system. The accelerometer works on the principle of gravitational
force: its output changes according to the inclination, in response to the movement of the mass inside. To
maintain simplicity in the prototype, we consider only a two-dimensional model. The accelerometer provides
temperature compensation and a g-select option which allows selection among 4 different sensitivities, and it can
be directly interfaced to the ADC. Table I lists the options; the sketch after the table converts an ADC count back
to acceleration.

                                        Table I: Gravity Selection Table

      g-select 2                   g-select 1                     g-range               Sensitivity
         0                            0                             1.5g                800mV/g
         0                            1                              2g                 600mV/g
         1                            0                              4g                 300mV/g
         1                            1                              6g                 200mV/g
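
Table I translates directly into a conversion from ADC count to acceleration. The short sketch below assumes a 3.3 V reference, a 10-bit converter and a zero-g output at half the supply; these three figures are assumptions for illustration, and only the sensitivities come from the table.

#include <stdio.h>

/* Sensitivity in mV/g for the four g-select settings of Table I. */
static const int sensitivity_mv_per_g[4] = { 800, 600, 300, 200 };

/* Convert a raw 10-bit ADC count into acceleration in g for a given
 * g-select setting (0..3), assuming zero g sits at half the supply. */
static double count_to_g(int count, int g_select)
{
    const double vref_mv   = 3300.0;          /* assumed ADC reference */
    const double offset_mv = vref_mv / 2.0;   /* assumed zero-g output */
    double out_mv = count * vref_mv / 1023.0;
    return (out_mv - offset_mv) / sensitivity_mv_per_g[g_select];
}

int main(void)
{
    /* Example: a count of 760 on one axis with the 1.5 g range selected. */
    printf("count 760 at 1.5 g range -> %.2f g\n", count_to_g(760, 0));
    return 0;
}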


LPC2138:
      It belongs to the ARM7TDMI family. LPC stands for low power consumption.
 Features:
 • 32-bit RISC processor.
 • Hardware debug technology.
 • Load-store architecture.
 • 512 KB flash memory.
 • 32 KB RAM.

ADC (Analog to Digital Convertor):
        It uses a 10-bit successive approximation technique. Input multiplexing among the 6 channels of ADC0
and the 8 channels of ADC1 can be done. The 10-bit conversion time is approximately 2.44 µs.

DAC (Digital to Analog Convertor):
      The 10-bit DAC uses a resistor-string architecture and produces a buffered output.

UART (Universal Asynchronous Receiver Transmitter):
         The Universal Asynchronous Receiver/Transmitter (UART) takes bytes of data from the LPC2138 and
transmits the individual bits sequentially; they are received at the other side, i.e. the AT45db161. It has FIFOs
capable of receiving and transmitting 16 bytes of data and a built-in fractional baud rate generator with
autobauding capability, and it provides a mechanism that enables software and hardware flow control.
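
As a note on the baud rate generator, a 16x-oversampling UART of this kind derives its baud rate from an integer divisor of the peripheral clock, baud = PCLK / (16 x divisor), with the fractional generator refining the remainder. The host-side calculation below uses an assumed 12 MHz peripheral clock and 9600 baud; it only computes the divisor latch values and does not touch the LPC2138 registers.

#include <stdio.h>

int main(void)
{
    const unsigned long pclk = 12000000UL;   /* assumed peripheral clock (Hz) */
    const unsigned long baud = 9600UL;       /* desired baud rate             */

    /* Classic 16x UART relation: baud = PCLK / (16 * divisor). */
    unsigned long divisor = pclk / (16UL * baud);
    double        actual  = (double)pclk / (16.0 * divisor);

    printf("divisor = %lu (high byte %lu, low byte %lu), actual baud ~ %.1f\n",
           divisor, divisor >> 8, divisor & 0xFF, actual);
    return 0;
}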


SPI (Serial Peripheral Interface):
         SPI is a full-duplex serial interface. It can handle multiple masters and slaves connected to a given bus, but
only a single master and a single slave can communicate on the interface during a given data transfer. During a
transfer the master always sends 8 to 16 bits of data to the slave, and the slave always sends a byte of data back to
the master. The maximum bit rate is one eighth of the input clock rate.
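
The full-duplex behaviour can be illustrated by modelling one 8-bit exchange in software: on every clock, one bit leaves the master on MOSI while one bit arrives from the slave on MISO. The peripheral clock figure and the echoed slave byte below are arbitrary illustrations; the real SPI block performs this shifting in hardware.

#include <stdio.h>
#include <stdint.h>

/* Software model of one full-duplex 8-bit SPI exchange (mode 0, MSB first). */
static uint8_t spi_exchange(uint8_t mosi_byte, uint8_t slave_reply)
{
    uint8_t miso_byte = 0;
    for (int bit = 7; bit >= 0; bit--) {
        int mosi = (mosi_byte   >> bit) & 1;   /* master drives MOSI          */
        int miso = (slave_reply >> bit) & 1;   /* slave drives MISO           */
        (void)mosi;                            /* a real slave would latch it */
        miso_byte = (uint8_t)((miso_byte << 1) | miso);
    }
    return miso_byte;
}

int main(void)
{
    const unsigned long pclk = 12000000UL;     /* assumed peripheral clock    */
    printf("max SPI bit rate ~ %lu bit/s (PCLK / 8)\n", pclk / 8);

    uint8_t reply = spi_exchange(0xA5 /* example command */, 0x3C);
    printf("received 0x%02X while sending 0xA5\n", reply);
    return 0;
}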


AT45db161:
         The memory IC is an Electrically Erasable Programmable Read Only Memory (EEPROM) of the kind used
in portable devices. It provides a rapid serial peripheral interface with a maximum clock frequency of 66 MHz, a
page program operation, and a user-configurable page size. The presence of various erase options makes the
process convenient.

LM386:
         The LM386 is a power amplifier designed for use in low-voltage consumer applications. It is a versatile
amplifier which can be configured for different gains: the gain is 20 by default but can be changed by changing
the components between pin 1 and pin 8.

Software Design:
         Getting the analog input from the sensors and converting it into digital form is one of the crucial parts;
different registers, ADCR, ADGDR, ADDR0 and DACR, are used to control and monitor this process. A
register-level sketch follows the list below.

1.   ADCR (A/D Control Register): The ADCR register must be written to select the operating mode before
     A/D conversion can occur.
2.   ADGDR (A/D Global Data Register): This register contains the ADC’s DONE bit and the result (count) of the
     most recent A/D conversion.

3.   ADDR0 (A/D Channel 0 Data Register): This register contains the result of the most recent conversion
     completed on channel 0.

   After the audio signal has been fetched and is to be sent to the speaker, the DACR comes into the picture.
4. DACR (D/A Control Register): This read/write register includes the digital value to be converted to analog,
   and a bit that trades off performance vs. power. Bits 5:0 are reserved for future, higher-resolution D/A
   converters.
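
A minimal register-level sequence tying these registers together could look like the sketch below. The addresses and bit positions follow the common LPC213x layout (channel-select, clock-divider, power and start bits in the control register; DONE bit and 10-bit result field in the global data register; 10-bit value field in the DACR) and are assumptions to be verified against the user manual of the actual part before use.

#include <stdint.h>

/* Memory-mapped registers of the on-chip ADC0 and DAC (assumed LPC213x map;
 * check the addresses and bit fields against the device user manual).       */
#define ADCR   (*(volatile uint32_t *)0xE0034000)   /* A/D control           */
#define ADGDR  (*(volatile uint32_t *)0xE0034004)   /* A/D global data       */
#define DACR   (*(volatile uint32_t *)0xE006C000)   /* D/A control           */

/* Read one 10-bit sample from ADC0 channel `ch` using a software start. */
static uint16_t adc_read(unsigned ch)
{
    ADCR = (1u << ch)          /* SEL: select the channel                    */
         | (3u << 8)           /* CLKDIV: divide PCLK down for the ADC clock */
         | (1u << 21)          /* PDN: ADC operational                       */
         | (1u << 24);         /* START: start the conversion now            */

    while (!(ADGDR & (1u << 31)))      /* wait for the DONE bit              */
        ;

    return (uint16_t)((ADGDR >> 6) & 0x3FF);   /* 10-bit result field        */
}

/* Write one 10-bit sample to the DAC output pin. */
static void dac_write(uint16_t value)
{
    DACR = ((uint32_t)(value & 0x3FF) << 6);   /* VALUE field, BIAS = 0      */
}

/* Example: pass one sample from ADC0 channel 1 straight to the DAC output. */
void adc_to_dac_once(void)
{
    dac_write(adc_read(1));
}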

 Writing a file in the memory IC:
          For the output to be audio data, the audio file must first be stored in the memory IC. For this purpose the
SPI (serial peripheral interface) and UART (universal asynchronous receiver/transmitter) are used, and the data is
transferred in the form of pages. The code first initializes the local buffers and the SPI. Two local buffers,
buffer 1 and buffer 2, of 528 bytes each, are used for sending and receiving the data serially. The data file is first
taken into local buffer 1; as soon as it is full, the data is taken into local buffer 2 while the contents of local
buffer 1 are sent to a specific location in the memory IC. As soon as buffer 2 is full, the roles switch back to
buffer 1, and the process continues in the same way until the whole file has been written into the memory IC.
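
The ping-pong use of the two buffers can be summarised with the following sketch, which splits a file into 528-byte pages and alternates between buffer 1 and buffer 2. The dataflash primitives here are placeholder stubs that only print what a real driver, issuing the corresponding SPI opcodes, would do.

#include <stdio.h>
#include <stdint.h>
#include <stddef.h>

#define PAGE_SIZE 528u                 /* AT45DB161 buffer/page size (bytes) */

/* Placeholder dataflash primitives; a real driver would issue the buffer
 * write and buffer-to-main-memory program opcodes over SPI.                 */
static void df_fill_buffer(int buf, const uint8_t *data, size_t len)
{
    (void)data;
    printf("fill buffer %d with %zu bytes\n", buf, len);
}

static void df_program_buffer_to_page(int buf, unsigned page)
{
    printf("program buffer %d into page %u\n", buf, page);
}

/* Stream a file into the dataflash starting at `first_page`, alternating
 * between the two buffers so one can be programmed while the other fills.   */
static void df_write_file(const uint8_t *data, size_t len, unsigned first_page)
{
    int      buf  = 1;
    unsigned page = first_page;

    while (len > 0) {
        size_t chunk = (len > PAGE_SIZE) ? PAGE_SIZE : len;
        df_fill_buffer(buf, data, chunk);
        df_program_buffer_to_page(buf, page++);
        buf   = (buf == 1) ? 2 : 1;            /* ping-pong the buffers      */
        data += chunk;
        len  -= chunk;
    }
}

int main(void)
{
    static uint8_t audio[1500];                /* dummy audio data           */
    df_write_file(audio, sizeof audio, 64);    /* writes pages 64, 65, 66    */
    return 0;
}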

Reading a file from a memory IC:
          While reading a file from the memory, on the other hand, an address is used to fetch the file from its
location. The same process is repeated in reverse, with the data transferred from the memory IC to the
processor.

 Result And Discussion:
 Application:
Being an application-based project, its purpose is to remove the communication barrier between speech-impaired
and normal people.





  Future Modification
  1. With a little modification, these hand signs can be used for operating vehicles. Operations like taking turns
      and starting or stopping the vehicle can be implemented efficiently.
  2. The system can be made wireless, which would minimize the complexity of use. Without wiring around the
      user, it would be a further step towards commercialization of the product.

  4. Result
            The output is an audio signal that depends on the ADC count; which hand sign is fetched depends on
  these counts. The hand signs used in the prototype do not belong to a specific language; rather, they are chosen
  so that the user can modify them as required. Every bending of a sensor (finger) produces a unique ADC count,
  so a different hand sign produces a different ADC count. The processor uses these ADC counts to decide which
  audio signal should be fetched and played, so different hand signs produce different outputs. Using this concept
  of ADC counts, more hand signs can be added as per the user, and accuracy can be improved with small changes
  to the ADC count thresholds (see the classification sketch after Table II). The system's accuracy, user
  configurability, portability, and immunity to noise and environmental disturbances make it a better choice over
  the other products available in the market.




                                            Fig. 3 Hand signs used in the system

                                       Table II: Count and Selection Table

              A       U       O       P       Q       HWR     HELLO   LEFT    RIGHT   FORWARD
ADC1(2)      <560    <560    <560    <560    >800    <560    >800     -       -       -
ADC1(3)      >900    <650    <650    <650    <650    <650    <650     -       -       -
ADC1(4)      <600    >850    <600    <600    <600    >850    <600     -       -       -
ADC1(5)      <450    <450    >700    <450    <450    >700    <450     -       -       -
ADC1(6)      <560    <560    <560    >800    <560    <560    >800     -       -       -
ADC(1)        -       -       -       -       -       -       -      <350    >650     -
ADC(2)        -       -       -       -       -       -       -       -       -      <350
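
The thresholds of Table II map directly onto a simple classification routine. The sketch below encodes the seven flex-sensor signs exactly as tabulated (each channel is expected to be either above its high threshold or below its low one) and omits the accelerometer-based LEFT, RIGHT and FORWARD signs for brevity; it illustrates the lookup logic rather than reproducing the prototype firmware.

#include <stdio.h>
#include <stddef.h>

/* Per-channel thresholds from Table II, channels ADC1(2)..ADC1(6). */
static const int low_limit[5]  = {560, 650, 600, 450, 560};
static const int high_limit[5] = {800, 900, 850, 700, 800};

/* For each sign, a flag per channel: 1 = the count must exceed the high
 * threshold, 0 = the count must stay below the low threshold.               */
typedef struct {
    const char *name;
    int         high[5];
} flex_sign_t;

static const flex_sign_t signs[] = {
    { "A",     {0, 1, 0, 0, 0} },
    { "U",     {0, 0, 1, 0, 0} },
    { "O",     {0, 0, 0, 1, 0} },
    { "P",     {0, 0, 0, 0, 1} },
    { "Q",     {1, 0, 0, 0, 0} },
    { "HWR",   {0, 0, 1, 1, 0} },
    { "HELLO", {1, 0, 0, 0, 1} },
};

static const char *classify_flex(const int count[5])
{
    for (size_t s = 0; s < sizeof signs / sizeof signs[0]; s++) {
        int ok = 1;
        for (int ch = 0; ch < 5 && ok; ch++)
            ok = signs[s].high[ch] ? (count[ch] > high_limit[ch])
                                   : (count[ch] < low_limit[ch]);
        if (ok)
            return signs[s].name;
    }
    return "unknown";
}

int main(void)
{
    /* Example: a hand shape that drives only ADC1(3) above its threshold. */
    int sample[5] = {400, 950, 300, 200, 400};
    printf("classified as: %s\n", classify_flex(sample));   /* prints "A"  */
    return 0;
}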

  5. CONCLUSION
          In today’s digitized world, processing speeds have increased dramatically, with computers being
  advanced to levels where they can assist humans in complex tasks. Yet input technologies seem to cause a
  major bottleneck in performing some of these tasks, under-utilizing the available resources and restricting the
  expressiveness of application use. Hand gesture recognition comes to the rescue here. The system is designed in
  such a way that it helps communication between speech-impaired and normal people with the help of flex
  sensors and an accelerometer. We made a prototype in which we have stored 10 different signals as per the
  user’s hand signs; three of them are based entirely on the output of the accelerometer and seven on the output
  of the flex sensors. The accuracy of the system is sufficient to produce different outputs for different hand signs
  of the user. With different bending angles, different ADC counts and hence different hand signs can be made
  and stored accordingly.

The system is built in such a way that, with small changes, it can become more and more versatile. To continue
this momentum, it is clear that further research in the areas of feature extraction, classification methods, and
gesture representation is required to realize the ultimate goal of humans interfacing with machines on their own
natural terms.

References
[1]. Peter Vamplew, Recognition of Sign Language Gestures Using Neural Networks, Department of Computer
     Science, University of Tasmania.
[2]. S. Fels and G. Hinton (1993), Glove-Talk: A Neural Network Interface Between a Data-Glove and a Speech
     Synthesiser, IEEE Transactions on Neural Networks, 4(1), pp. 2-8.
[3]. T. Johnston (1989), Auslan: The Sign Language of the Australian Deaf Community, PhD thesis, Department
     of Linguistics, University of Sydney.
[4]. J. Kramer and L. Leifer (1989), The Talking Glove: A Speaking Aid for Nonvocal Deaf and Deaf-Blind
     Individuals, RESNA 12th Annual Conference, New Orleans, Louisiana.
[5]. Ravikiran J, Kavi Mahesh, Suhas Mahishi, Dheeraj R, Sudheender S, Nitin V Pujari, Finger Detection for
     Sign Language Recognition.



