Computer Vision Based Human-Computer
Interaction Using Color Detection Techniques
Chetan Dhule
Computer Science & Engineering Department
G.H Raisoni College Of Engineering
Nagpur,India
chetandhule123@gmail.com
Trupti Nagrare
Computer Science & Engineering Department
G.H Raisoni College Of Engineering
Nagpur,India
trupti.nagrare@raisoni.net
Abstract— Gesture-based human-computer interaction allows
people to control Windows applications by moving their
hands through the air, making computers and devices easier
to use. Existing solutions have relied on gesture recognition
algorithms that need special hardware and often involve
complicated setups limited to the research lab. The
algorithms used so far for gesture recognition are not
practical or responsive enough for real-world use, partly
because the image processing is applied to inadequate data.
Because existing methods are based on gesture recognition
algorithms, they require ANN training, which slows the
whole process and reduces accuracy. The method we propose
controls the motion of the mouse in Windows in real time
according to the motion of the hand and fingers, by
calculating the change in pixel values of RGB colors in a
video, without any ANN training, to obtain the exact
sequence of hand and finger motion.
Keywords- computer vision, gesture recognition, speech
recognition, human computer interaction
I. INTRODUCTION
Existing solutions have relied on gesture recognition
algorithms that need exotic hardware, often involving
elaborate setups limited to the research lab. Existing gesture
recognition algorithms are neither efficient nor practical
enough for real-world use, partly because the image
processing is applied to inadequate data. Because existing
methods are based on gesture recognition algorithms, they
require ANN training, which slows the whole process and
reduces accuracy. The main objective of the proposed
method is to control the motion of the mouse in Windows in
real time according to the motion of the hand and fingers, by
calculating the change in pixel values of RGB colors in a
video, without any ANN training, to obtain the exact
sequence of hand and finger motion.
II. PROBLEM DEFINITION
Unfortunately, most current approaches based on gesture
recognition have several shortcomings. Some require bulky
hardware: users need to wear multiple sensors and stand near
multiple calibrated cameras while gestures are processed.
Most cameras used for capturing images rely on color data,
so they are sensitive to environmental factors such as
dynamic backgrounds and lighting conditions. The
algorithms used to identify gestures from the data supplied
by hardware such as cameras have proven unreliable when
tested across many users. Most current approaches work by
using recognition algorithms. Since the time needed for the
computer to recognize a gesture is usually longer than the
time needed to display its result, there is always a lag, which
slows the application. Moreover, all of these approaches rely
on a specific, pre-fixed set of gestures. Finally, there is no
workspace or environment that allows users to freely use
gestures to complete tasks such as controlling mouse motion
and events, and that is easy to use.
III. OBJECTIVES
Existing solutions have relied on gesture recognition
algorithms that need exotic hardware, such as multiple
sensors worn on the hand in the form of gloves to track
mouse coordinates, and often require the user to stand near
multiple calibrated cameras in elaborate setups limited to the
research lab. The gesture recognition algorithms used are not
practical or efficient enough for real-world use, partly due to
the inadequate data on which the image processing is
applied. Because existing methods are based on gesture
recognition algorithms, they require ANN training, which
slows the whole process and reduces accuracy. The main
objective of the proposed method is to control the motion of
the mouse in Windows in real time according to the motion
of the hand and fingers, by calculating the change in pixel
values of RGB colors in a video, without any ANN training,
to obtain the exact sequence of hand and finger motion.
IV. LITERATURE REVIEW
The existing literature briefly explains the processing of
hand gestures. Earlier work by Freeman and Weissman [1]
2014 Fourth International Conference on Communication Systems and Network Technologies
978-1-4799-3070-8/14 $31.00 © 2014 IEEE
DOI 10.1109/CSNT.2014.192
allowed the user to control a television set by using a video
camera and computer-vision template-matching algorithms
to detect a user's hand from across a room. In this approach,
a user could show an open hand, and an on-screen hand icon
would appear that could be used to adjust various graphical
controls, such as a volume slider. To activate a slider, the
user had to cover the control for a fixed amount of time.
Users enjoyed this alternative to the physical remote control,
and the feedback of the on-screen hand was effective in
assisting them. However, to activate the different controls,
users had to hold their hand up for long periods, which is
tiring. This kind of user fatigue, known as "gorilla arm", is
common in gesture-based interfaces.
Other approaches work by using multiple cameras to detect
and track hand motion by producing a 3D image [2][4].
Because these systems use multiple cameras, they require a
careful installation process, as calibration parameters such as
the distance between the cameras are important in the
triangulation algorithms used. Since a large amount of video
data must be processed in real time, these algorithms prove
computationally expensive, and stereo matching typically
fails on scenes with little or no texture. Ultimately, such
systems cannot be used outside their special lab
environments. In [3], Pranav Mistry presented the SixthSense
wearable gestural interface, which used a camera and
projector worn on the user's chest to allow the user to zoom
in on projected maps (among other activities) with
two-handed gestures. For the camera to detect the user's
hand, the user had to wear brightly colored markers on their
index fingers and thumbs. The regular webcam worn by the
user was also sensitive to environmental conditions such as
bright sunlight or darkness, which made it difficult to
recognize the color markers. Wilson and Oliver [5]
aimed to create G Windows, a Minority Report-like
environment. By pointing with their hand and using voice
commands, the user was able to move an on-screen cursor of
a Microsoft Windows desktop and trigger actions like
"close" and "scroll" to affect the underlying application
windows. They concluded that users preferred interacting
with hand gestures over voice commands, and that desktop
workspaces designed for gesture interaction would be
beneficial in the future. As for online workspaces, several
commercial and academic web-based collaboration solutions
have existed for some time. However, interaction with other
users in these environments is usually limited to basic
sharing of media files, rather than full real-time collaboration
on entire web-based applications and their data between
users on distinctly deployed domains.
Cristian Gadea and Bogdan Ionescu [6] aimed to create
finger-based gesture control of a collaborative online
workspace, but the system needs continuous internet
connectivity, which is not always available in India. It
requires an online workspace called UC-IC; the application
runs within a web browser to determine the latest hand
gesture, but it is not always possible to provide high-speed
connectivity everywhere at all times. Besides this, it needs
training to recognize gestures, which slows down the system.
The methods in [7][8][9] are based on gesture recognition
algorithms and need ANN training, which makes the whole
process slow and reduces accuracy. Each time a gesture is to
be recognized, ANN training is needed, consuming much
time, so the system cannot match its output speed with the
exact motion of the mouse pointer.
V. SYSTEM ARCHITECTURE
In this system we use different preprocessing techniques,
with feature extraction as a tool for obtaining the pixel-based
values, or coordinates, of RGB colors by tracking the change
in pixel position of differently colored stickers attached to
the user's fingers in real time. The updated values are then
sent to the PC to track the motion of the mouse.
Figure 1: Block diagram of the different phases of the system.
A. Video Capturing: Continuous video is given as input
to the laptop by our system.
B. Image Processing: Image segmentation is done in two
phases:
1. Skin Detection Model: to detect the hand and fingers
in the image.
2. Approximate Median Model: for background
subtraction. It has been observed that using both
methods together produces a much better segmentation
for further processing.
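The approximate median model mentioned above can be sketched in a few lines. The following is a generic pure-Python illustration of the technique, not the paper's implementation: the function names, the one-step-per-frame update, and the threshold of 30 are assumptions for illustration. Frames are grayscale images stored as nested lists.

```python
def update_background(bg, frame):
    """Approximate median update: nudge each background pixel
    one step toward the current frame's value."""
    for y in range(len(bg)):
        for x in range(len(bg[y])):
            if frame[y][x] > bg[y][x]:
                bg[y][x] += 1
            elif frame[y][x] < bg[y][x]:
                bg[y][x] -= 1

def foreground_mask(bg, frame, threshold=30):
    """Pixels differing from the background estimate by more than
    `threshold` are treated as foreground (e.g. the moving hand)."""
    return [[abs(frame[y][x] - bg[y][x]) > threshold
             for x in range(len(frame[y]))]
            for y in range(len(frame))]
```

Because the background estimate moves only one step per frame, it converges toward the temporal median of the scene and is cheap enough to run at full frame rate.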
C. Pixel Extraction: In this phase we obtain the pixel
sequence from the image, without any ANN training, to get
the exact sequence of hand and finger motion.
D. Color Detection: In this phase we extract the positions
of RGB colors from the pixel sequence to detect the motion
of the hand and fingers by calculating the change in RGB
pixel values.
E. Controlling the Position of the Mouse Pointer: Signals
are sent to the system to control the mouse pointer's motion
and mouse events. An appropriate command is given to the
PC to move the mouse pointer according to the motion of
the user's fingers or hand.
VI. TECHNIQUES FOR PIXEL AND COLOR DETECTION
A. Video Capturing
1) Loading Drivers
A system may have multiple web cameras, and each needs a
camera driver. Every driver has a unique ID. The
"capGetDriverDescription" function returns the name and
the ID of a driver.
2) Capturing
To capture the camera view:
obj = capCreateCaptureWindow();
To start showing the camera view in a picture box in our
software: SendMessage(connect, obj);
B. Processing frames of video
We cannot process the video directly, so we need to convert
the video into images with the function: picture = hdcToPicture(obj);
Suppose the camera is 16 MP with fps = 45 (frames per
second); then we need to process 45 images per second.
To get the detailed RGB (red, green, blue) values of the
pixels, use the "getBitmapBits()" function.
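As an illustration of what the raw buffer from a call like getBitmapBits() contains, the sketch below unpacks a 24-bit bitmap byte buffer into a grid of (R, G, B) tuples. The function name is ours, and the B, G, R byte order (typical for Windows DIBs) is an assumption for illustration:

```python
def pixels_from_bitmap(buf, width, height):
    """Convert a raw 24-bit bitmap byte buffer into a
    height x width grid of (R, G, B) tuples. Windows DIBs
    typically store channels in B, G, R order, assumed here."""
    grid = []
    for y in range(height):
        row = []
        for x in range(width):
            i = (y * width + x) * 3  # 3 bytes per pixel
            b, g, r = buf[i], buf[i + 1], buf[i + 2]
            row.append((r, g, b))
        grid.append(row)
    return grid
```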
C. Getting Pixel Color:
Figure 2. Getting Pixel Color
D. Scanning
Figure 3. Scanning pixel-wise horizontally in the x direction,
then stepping vertically in the y direction
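The scanning order of Figure 3 amounts to a raster scan: traverse each row of pixels in the x direction, then step down one row in the y direction and repeat. A minimal sketch (the generator name is ours):

```python
def scan_pixels(grid):
    """Raster-scan a pixel grid: walk each row left to right
    (x direction), then move down one row (y direction)."""
    for y, row in enumerate(grid):
        for x, pixel in enumerate(row):
            yield x, y, pixel
```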
E. Algorithm for pixel and color detection
Figure 4: Algorithm for pixel and color detection
X – x-coordinate of the pixel in the image.
Y – y-coordinate of the pixel in the image.
R – red, G – green, B – blue.
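A hypothetical version of the pixel and color detection step of Figure 4, for a red marker: scan every pixel, collect the coordinates where the red channel dominates, and return their centroid. The function name and thresholds (150 and 80) are illustrative assumptions, not values from the paper:

```python
def find_marker(grid, min_red=150, max_other=80):
    """Return the centroid (x, y) of pixels whose red channel
    dominates (a red marker), or None if no such pixel exists.
    `grid` is a list of rows of (R, G, B) tuples."""
    xs, ys = [], []
    for y, row in enumerate(grid):
        for x, (r, g, b) in enumerate(row):
            if r >= min_red and g <= max_other and b <= max_other:
                xs.append(x)
                ys.append(y)
    if not xs:
        return None
    # Integer centroid of all matching pixels.
    return (sum(xs) // len(xs), sum(ys) // len(ys))
```

Tracking the centroid frame to frame gives the change in pixel position that drives the mouse, with no recognition or training step in the loop.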
VII. METHODOLOGY
A. Hand Position tracking and mouse control
Figure 5. Hand Position tracking and mouse control
The main aim of this module is to obtain user input
virtually: the user moves a finger in front of the camera's
capture area. This motion is captured by the camera and
processed by the system frame by frame. After processing,
the system obtains the finger coordinates, and once the
coordinates are calculated, it moves the cursor accordingly.
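Once the finger coordinates are known in camera space, moving the cursor reduces to scaling them to the desktop resolution. A minimal sketch of such a mapping (the function name and the clamping behavior are our assumptions):

```python
def to_screen(cam_x, cam_y, cam_w, cam_h, scr_w, scr_h):
    """Scale a fingertip coordinate from the camera frame
    (cam_w x cam_h) to the desktop resolution (scr_w x scr_h)
    so the cursor tracks the hand proportionally."""
    x = cam_x * scr_w // cam_w
    y = cam_y * scr_h // cam_h
    # Clamp to the visible desktop.
    return (min(max(x, 0), scr_w - 1), min(max(y, 0), scr_h - 1))
```

For example, the center of a 640x480 camera frame maps to the center of a 1920x1080 desktop.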
B. Laser Pointer Detection
Figure 6. Laser Pointer Detection
C. Hand Gesture Based Auto Image Grabbing (Virtual
Zoom In/Out)
Figure 7. Virtual zoom in/out
D. Camera Processing and image capturing:
Figure 8. Camera processing and image capturing
E. Virtual Sense for File Handling
This system makes use of virtual-sense technology to copy
a file from one system to another within a local area
network (LAN/Wi-Fi). The user makes a picking-up action
over the file to be copied, moves it toward the system where
the file should be copied, and then releases it over that
system.
VIII. RESULTS AND DISCUSSION
The software has provision to control all mouse clicking
events by using a color marker. After several experiments, it
was observed that a red marker is more effective than
markers of other colors.
Figure 9. Graphical user interface of the application.
Figure 10. Starting the camera.
Figure 11. Setting the marker color.
Figure 12. Controlling the motion and clicking events of the
mouse with the color marker set earlier.
IX. CONCLUSION
This project can be very useful for people who want to
control a computer without actually touching the system, or
without using a wireless mouse, which always needs a
surface to operate on. Accuracy is higher with a red marker
than with other color markers used individually. The
problem that changing lighting conditions pose for
color-based recognition has been addressed in this work by
providing a button to set the marker color at the start of the
application. There are still some problems with recognition
speed: the speed of controlling the mouse motion is not
100%, and needs to be improved for some of the gestures.
All mouse movements and key actions have already been
mapped and work well under the given circumstances. As
future scope, the application can be improved to work with
mobile phones and game consoles. Other modes of
human-computer interaction, such as voice recognition,
facial expressions, and eye gaze, can also be combined to
make the system more robust and flexible.
ACKNOWLEDGMENT
I want to thank all the subjects who participated in our
experiments, and my guide for her valuable guidance,
advice, and help during this project. Finally, I thank my
parents for their encouragement.
REFERENCES
[1] W. T. Freeman and C. D. Weissman, "Television Control by
Hand Gestures," in Proc. Int. Workshop on Automatic Face
and Gesture Recognition. IEEE Computer Society, 1995,
pp. 179-183.
[2] Z. Jun, Z. Fangwen, W. Jiaqi, Y. Zhengpeng, and C. Jinbo,
"3D Hand-Gesture Analysis Based on Multi-Criterion in
Multi-Camera Systems," in ICAL 2008: IEEE Int. Conf. on
Automation and Logistics. IEEE Computer Society,
September 2008, pp. 2342-2346.
[3] P. Mistry and P. Maes, "SixthSense: A Wearable Gestural
Interface," in ACM SIGGRAPH ASIA 2009 Sketches. New
York, NY, USA: ACM, 2009.
[4] A. Utsumi, T. Miyasato, and F. Kishino, "Multi-Camera
Hand Pose Recognition System Using Skeleton Image," in
RO-MAN '95: Proc. 4th IEEE Int. Workshop on Robot and
Human Communication. IEEE Computer Society, July 1995,
pp. 219-224.
[5] A. Wilson and N. Oliver, "GWindows: Robust Stereo Vision
for Gesture-Based Control of Windows," in ICMI '03: Proc.
5th Int. Conf. on Multimodal Interfaces. New York, NY,
USA: ACM, 2003, pp. 211-218.
[6] C. Gadea, B. Ionescu, D. Ionescu, S. Islam, and B. Solomon,
"Finger-Based Gesture Control of a Collaborative Online
Workspace," in 7th IEEE Int. Symposium on Applied
Computational Intelligence and Informatics, Timisoara,
Romania, May 24-26, 2012.
[7] M. Ganasekera, "Computer Vision Based Hand Movement
Capturing System," in 8th Int. Conf. on Computer Science &
Education (ICCSE 2013), Colombo, Sri Lanka, April 26-28,
2013.
[8] F. Lamberti, "Endowing Existing Desktop Applications with
Customizable Body Gesture-Based Interfaces," in IEEE Int.
Conf. on Consumer Electronics (ICCE), 2013.
[9] A. Agrawal, R. Raj, and S. Porwal, "Vision-Based
Multimodal Human-Computer Interaction Using Hand and
Head Gestures," in Proc. 2013 IEEE Conf. on Information
and Communication Technologies (ICT 2013).
[10] M. Turk and G. Robertson, "Perceptual User Interfaces,"
Communications of the ACM, vol. 43, no. 3, March 2000.
[11] Y. Wu and T. S. Huang, "Vision-Based Gesture Recognition:
A Review," Lecture Notes in Computer Science, vol. 1739,
pp. 103-115, 1999.
 
Real Time Head & Hand Tracking Using 2.5D Data
Real Time Head & Hand Tracking Using 2.5D Data Real Time Head & Hand Tracking Using 2.5D Data
Real Time Head & Hand Tracking Using 2.5D Data
 
VIRTUAL MOUSE USING OPENCV
VIRTUAL MOUSE USING OPENCVVIRTUAL MOUSE USING OPENCV
VIRTUAL MOUSE USING OPENCV
 
Research on Detecting Hand Gesture
Research on Detecting Hand GestureResearch on Detecting Hand Gesture
Research on Detecting Hand Gesture
 
Virtual Mouse Control Using Hand Gestures
Virtual Mouse Control Using Hand GesturesVirtual Mouse Control Using Hand Gestures
Virtual Mouse Control Using Hand Gestures
 
Smart Presentation Control by Hand Gestures Using Computer Vision and Google’...
Smart Presentation Control by Hand Gestures Using Computer Vision and Google’...Smart Presentation Control by Hand Gestures Using Computer Vision and Google’...
Smart Presentation Control by Hand Gestures Using Computer Vision and Google’...
 
Controlling Mouse Movements Using hand Gesture And X box 360
Controlling Mouse Movements Using hand Gesture And X box 360Controlling Mouse Movements Using hand Gesture And X box 360
Controlling Mouse Movements Using hand Gesture And X box 360
 
HAND GESTURE RECOGNITION.ppt (1).pptx
HAND GESTURE RECOGNITION.ppt (1).pptxHAND GESTURE RECOGNITION.ppt (1).pptx
HAND GESTURE RECOGNITION.ppt (1).pptx
 

Último

ONLINE FOOD ORDER SYSTEM PROJECT REPORT.pdf
ONLINE FOOD ORDER SYSTEM PROJECT REPORT.pdfONLINE FOOD ORDER SYSTEM PROJECT REPORT.pdf
ONLINE FOOD ORDER SYSTEM PROJECT REPORT.pdfKamal Acharya
 
MANUFACTURING PROCESS-II UNIT-2 LATHE MACHINE
MANUFACTURING PROCESS-II UNIT-2 LATHE MACHINEMANUFACTURING PROCESS-II UNIT-2 LATHE MACHINE
MANUFACTURING PROCESS-II UNIT-2 LATHE MACHINESIVASHANKAR N
 
KubeKraft presentation @CloudNativeHooghly
KubeKraft presentation @CloudNativeHooghlyKubeKraft presentation @CloudNativeHooghly
KubeKraft presentation @CloudNativeHooghlysanyuktamishra911
 
Introduction to IEEE STANDARDS and its different types.pptx
Introduction to IEEE STANDARDS and its different types.pptxIntroduction to IEEE STANDARDS and its different types.pptx
Introduction to IEEE STANDARDS and its different types.pptxupamatechverse
 
UNIT-III FMM. DIMENSIONAL ANALYSIS
UNIT-III FMM.        DIMENSIONAL ANALYSISUNIT-III FMM.        DIMENSIONAL ANALYSIS
UNIT-III FMM. DIMENSIONAL ANALYSISrknatarajan
 
Call Girls Service Nashik Vaishnavi 7001305949 Independent Escort Service Nashik
Call Girls Service Nashik Vaishnavi 7001305949 Independent Escort Service NashikCall Girls Service Nashik Vaishnavi 7001305949 Independent Escort Service Nashik
Call Girls Service Nashik Vaishnavi 7001305949 Independent Escort Service NashikCall Girls in Nagpur High Profile
 
result management system report for college project
result management system report for college projectresult management system report for college project
result management system report for college projectTonystark477637
 
(SHREYA) Chakan Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Esc...
(SHREYA) Chakan Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Esc...(SHREYA) Chakan Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Esc...
(SHREYA) Chakan Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Esc...ranjana rawat
 
Online banking management system project.pdf
Online banking management system project.pdfOnline banking management system project.pdf
Online banking management system project.pdfKamal Acharya
 
University management System project report..pdf
University management System project report..pdfUniversity management System project report..pdf
University management System project report..pdfKamal Acharya
 
UNIT - IV - Air Compressors and its Performance
UNIT - IV - Air Compressors and its PerformanceUNIT - IV - Air Compressors and its Performance
UNIT - IV - Air Compressors and its Performancesivaprakash250
 
CCS335 _ Neural Networks and Deep Learning Laboratory_Lab Complete Record
CCS335 _ Neural Networks and Deep Learning Laboratory_Lab Complete RecordCCS335 _ Neural Networks and Deep Learning Laboratory_Lab Complete Record
CCS335 _ Neural Networks and Deep Learning Laboratory_Lab Complete RecordAsst.prof M.Gokilavani
 
Java Programming :Event Handling(Types of Events)
Java Programming :Event Handling(Types of Events)Java Programming :Event Handling(Types of Events)
Java Programming :Event Handling(Types of Events)simmis5
 
Coefficient of Thermal Expansion and their Importance.pptx
Coefficient of Thermal Expansion and their Importance.pptxCoefficient of Thermal Expansion and their Importance.pptx
Coefficient of Thermal Expansion and their Importance.pptxAsutosh Ranjan
 
High Profile Call Girls Nagpur Meera Call 7001035870 Meet With Nagpur Escorts
High Profile Call Girls Nagpur Meera Call 7001035870 Meet With Nagpur EscortsHigh Profile Call Girls Nagpur Meera Call 7001035870 Meet With Nagpur Escorts
High Profile Call Girls Nagpur Meera Call 7001035870 Meet With Nagpur EscortsCall Girls in Nagpur High Profile
 
College Call Girls Nashik Nehal 7001305949 Independent Escort Service Nashik
College Call Girls Nashik Nehal 7001305949 Independent Escort Service NashikCollege Call Girls Nashik Nehal 7001305949 Independent Escort Service Nashik
College Call Girls Nashik Nehal 7001305949 Independent Escort Service NashikCall Girls in Nagpur High Profile
 
Porous Ceramics seminar and technical writing
Porous Ceramics seminar and technical writingPorous Ceramics seminar and technical writing
Porous Ceramics seminar and technical writingrakeshbaidya232001
 

Último (20)

ONLINE FOOD ORDER SYSTEM PROJECT REPORT.pdf
ONLINE FOOD ORDER SYSTEM PROJECT REPORT.pdfONLINE FOOD ORDER SYSTEM PROJECT REPORT.pdf
ONLINE FOOD ORDER SYSTEM PROJECT REPORT.pdf
 
MANUFACTURING PROCESS-II UNIT-2 LATHE MACHINE
MANUFACTURING PROCESS-II UNIT-2 LATHE MACHINEMANUFACTURING PROCESS-II UNIT-2 LATHE MACHINE
MANUFACTURING PROCESS-II UNIT-2 LATHE MACHINE
 
KubeKraft presentation @CloudNativeHooghly
KubeKraft presentation @CloudNativeHooghlyKubeKraft presentation @CloudNativeHooghly
KubeKraft presentation @CloudNativeHooghly
 
Introduction to IEEE STANDARDS and its different types.pptx
Introduction to IEEE STANDARDS and its different types.pptxIntroduction to IEEE STANDARDS and its different types.pptx
Introduction to IEEE STANDARDS and its different types.pptx
 
UNIT-III FMM. DIMENSIONAL ANALYSIS
UNIT-III FMM.        DIMENSIONAL ANALYSISUNIT-III FMM.        DIMENSIONAL ANALYSIS
UNIT-III FMM. DIMENSIONAL ANALYSIS
 
Call Girls Service Nashik Vaishnavi 7001305949 Independent Escort Service Nashik
Call Girls Service Nashik Vaishnavi 7001305949 Independent Escort Service NashikCall Girls Service Nashik Vaishnavi 7001305949 Independent Escort Service Nashik
Call Girls Service Nashik Vaishnavi 7001305949 Independent Escort Service Nashik
 
result management system report for college project
result management system report for college projectresult management system report for college project
result management system report for college project
 
(SHREYA) Chakan Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Esc...
(SHREYA) Chakan Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Esc...(SHREYA) Chakan Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Esc...
(SHREYA) Chakan Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Esc...
 
Online banking management system project.pdf
Online banking management system project.pdfOnline banking management system project.pdf
Online banking management system project.pdf
 
University management System project report..pdf
University management System project report..pdfUniversity management System project report..pdf
University management System project report..pdf
 
Water Industry Process Automation & Control Monthly - April 2024
Water Industry Process Automation & Control Monthly - April 2024Water Industry Process Automation & Control Monthly - April 2024
Water Industry Process Automation & Control Monthly - April 2024
 
UNIT - IV - Air Compressors and its Performance
UNIT - IV - Air Compressors and its PerformanceUNIT - IV - Air Compressors and its Performance
UNIT - IV - Air Compressors and its Performance
 
CCS335 _ Neural Networks and Deep Learning Laboratory_Lab Complete Record
CCS335 _ Neural Networks and Deep Learning Laboratory_Lab Complete RecordCCS335 _ Neural Networks and Deep Learning Laboratory_Lab Complete Record
CCS335 _ Neural Networks and Deep Learning Laboratory_Lab Complete Record
 
Java Programming :Event Handling(Types of Events)
Java Programming :Event Handling(Types of Events)Java Programming :Event Handling(Types of Events)
Java Programming :Event Handling(Types of Events)
 
Coefficient of Thermal Expansion and their Importance.pptx
Coefficient of Thermal Expansion and their Importance.pptxCoefficient of Thermal Expansion and their Importance.pptx
Coefficient of Thermal Expansion and their Importance.pptx
 
Roadmap to Membership of RICS - Pathways and Routes
Roadmap to Membership of RICS - Pathways and RoutesRoadmap to Membership of RICS - Pathways and Routes
Roadmap to Membership of RICS - Pathways and Routes
 
(INDIRA) Call Girl Aurangabad Call Now 8617697112 Aurangabad Escorts 24x7
(INDIRA) Call Girl Aurangabad Call Now 8617697112 Aurangabad Escorts 24x7(INDIRA) Call Girl Aurangabad Call Now 8617697112 Aurangabad Escorts 24x7
(INDIRA) Call Girl Aurangabad Call Now 8617697112 Aurangabad Escorts 24x7
 
High Profile Call Girls Nagpur Meera Call 7001035870 Meet With Nagpur Escorts
High Profile Call Girls Nagpur Meera Call 7001035870 Meet With Nagpur EscortsHigh Profile Call Girls Nagpur Meera Call 7001035870 Meet With Nagpur Escorts
High Profile Call Girls Nagpur Meera Call 7001035870 Meet With Nagpur Escorts
 
College Call Girls Nashik Nehal 7001305949 Independent Escort Service Nashik
College Call Girls Nashik Nehal 7001305949 Independent Escort Service NashikCollege Call Girls Nashik Nehal 7001305949 Independent Escort Service Nashik
College Call Girls Nashik Nehal 7001305949 Independent Escort Service Nashik
 
Porous Ceramics seminar and technical writing
Porous Ceramics seminar and technical writingPorous Ceramics seminar and technical writing
Porous Ceramics seminar and technical writing
 

The main objective of the proposed method is to control the motion of the mouse in Windows in real time according to the motion of the hand and fingers, by calculating the change in the RGB pixel values of colored markers in a video, without any ANN training, so that the exact sequence of hand and finger motion is obtained.

II. PROBLEM DEFINITION

Unfortunately, most current gesture-recognition approaches have several shortcomings. Some require bulky hardware: users must wear multiple sensors and stand near multiple calibrated cameras while their gestures are processed. Most cameras used for capture rely on color data, so they are sensitive to environmental factors such as dynamic backgrounds and changing lighting conditions. The algorithms that identify gestures from the data these cameras supply have proven unreliable when tested across many users. Because recognizing a gesture usually takes longer than displaying its result, there is always a lag that slows the application down. In addition, these systems rely on a specific, pre-fixed set of gestures. Finally, there is no easy-to-use workspace or environment that lets users freely use gestures to complete tasks such as controlling mouse motion and mouse events.

III. OBJECTIVES

Existing solutions have relied on gesture recognition algorithms that need exotic hardware, such as multiple sensors worn on the hand in the form of a glove to track mouse coordinates, and often require the user to stand near multiple calibrated cameras in elaborate setups limited to the research lab.
Existing gesture recognition algorithms are not practical or efficient enough for real-world use, partly because the image processing is applied to inadequate data. Moreover, because existing methods are based on gesture recognition, they need ANN training, which slows the whole process and reduces accuracy. The main objective of the method we propose is to control mouse motion in Windows in real time according to the motion of the hand and fingers, by calculating the change in the RGB pixel values of colored markers in a video, without any ANN training, so that the exact sequence of hand and finger motion is obtained.

2014 Fourth International Conference on Communication Systems and Network Technologies, 978-1-4799-3070-8/14 $31.00 © 2014 IEEE, DOI 10.1109/CSNT.2014.192

IV. LITERATURE REVIEW

The existing literature briefly explains the processing of hand gestures. Earlier work by Freeman and Weissman [1] allowed the user to control a television set using a video camera and computer-vision template matching to detect the user's hand from across a room. When the user showed an open hand, an on-screen hand icon appeared that could be used to adjust graphical controls such as a volume slider; to activate a control, the user had to cover it for a fixed amount of time. Users enjoyed this alternative to the physical remote control, and the feedback of the on-screen hand was effective in assisting them. However, users had to hold their hand up for long periods to activate the different controls, which is tiring; this kind of user fatigue, known as "gorilla arm", is common in gesture-based interfaces.

Other approaches use multiple cameras to detect and track hand motion by producing a 3D image [2][4]. Because these systems use multiple cameras, they require a careful installation process, since calibration parameters such as the distance between the cameras matter in the triangulation algorithms used. As a large amount of video data must be processed in real time, these algorithms are computationally expensive, and stereo matching typically fails on scenes with little or no texture. Ultimately, such systems cannot be used outside their special lab environments.

In [3], Pranav Mistry presented the SixthSense wearable gestural interface, which used a camera and projector worn on the user's chest to let the user zoom in on projected maps (among other activities) with two-handed gestures. For the camera to detect the user's hand, the user had to wear brightly colored markers on the index fingers and thumbs. The ordinary webcam worn by the user was also sensitive to environmental conditions such as bright sunlight or darkness, which made it difficult to recognize the color markers.
Wilson and Oliver [5] aimed to create GWindows, a Minority Report-like environment. By pointing with the hand and using voice commands, the user could move an on-screen cursor on a Microsoft Windows desktop and trigger actions such as "close" and "scroll" on the underlying application windows. They concluded that users preferred interacting with hand gestures over voice commands, and that desktop workspaces designed for gesture interaction would be worthwhile in the future.

Several commercial and academic web-based collaboration solutions have also existed for some time. However, interaction with other users in these environments is usually limited to basic sharing of media files, rather than full real-time collaboration on entire web-based applications and their data between users on distinctly deployed domains, as this paper proposes. Cristian Gadea and Bogdan Ionescu [6] aimed to create finger-based gesture control of a collaborative online workspace, but their system needs continuous Internet connectivity, which is not always available in India. It requires an online workspace called UC-IC and runs inside a web browser to determine the latest hand gesture, but all-time high-speed connectivity cannot be guaranteed everywhere. Besides this, it needs training to recognize gestures, which slows the system down.

The methods in [7][8][9] are based on gesture recognition algorithms that need ANN training, which slows the whole process and reduces accuracy: every attempt to recognize a gesture requires the trained network, and the added latency means the system cannot match the speed of the actual mouse-pointer motion.

V. SYSTEM ARCHITECTURE

This system uses several preprocessing techniques and feature extraction to recognize the pixel values and coordinates of RGB colors, tracking the change in pixel position of the colored stickers attached to the user's fingers in real time. The updated values are then sent to the PC to drive the motion of the mouse.

Figure 1: Block diagram of the different phases of the system.

A. Video Capturing: Continuous video is given as input to our system on the laptop.

B. Image Processing: Image segmentation is done in two phases:
1. Skin detection model: detects the hand and fingers in the image.
2. Approximate median model: subtracts the background.
It was observed that using both methods together gave much better segmentation for further processing.

C. Pixel Extraction: In this phase we obtain the pixel sequence from the image, without any ANN training, to get the exact sequence of motion of the hands and fingers.

D. Color Detection: In this phase we extract the positions of RGB colors from the pixel sequence to detect the motion of the hand and fingers by calculating the change in the RGB pixel values.
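The approximate median model named above can be sketched as follows. This is a minimal illustration rather than the authors' implementation; the function name and the threshold value are assumptions. The update rule nudges a running background estimate toward each new frame by one intensity step, so it converges to the temporal median, and pixels far from the estimate are marked as foreground (the moving hand).

```python
import numpy as np

def approx_median_update(background, frame, threshold=30):
    """One step of approximate-median background subtraction.

    The background estimate moves one gray level toward the current
    frame at every pixel. Returns the updated background and a boolean
    foreground mask of pixels that deviate strongly from it.
    """
    bg = background.astype(np.int16)
    fr = frame.astype(np.int16)
    bg += np.sign(fr - bg)              # nudge estimate toward the frame
    mask = np.abs(fr - bg) > threshold  # large deviation = foreground
    return bg.astype(np.uint8), mask

# Toy example: a flat background of 100 with a bright 2x2 "hand" patch.
bg = np.full((4, 4), 100, dtype=np.uint8)
frame = bg.copy()
frame[1:3, 1:3] = 200                   # moving-object pixels
bg, mask = approx_median_update(bg, frame)
print(mask.sum())                       # 4 foreground pixels
```

Running the update once per captured frame keeps the background current even as lighting drifts, which is why the paper pairs it with the skin-detection model.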
E. Controlling the Position of the Mouse Pointer: Signals are sent to the system to control mouse-pointer motion and mouse events, commanding the PC to move the mouse pointer according to the motion of the user's fingers or hand.

VI. TECHNIQUES FOR PIXEL AND COLOR DETECTION

A. Video Capturing
1) Loading drivers: A system may have multiple web cameras, each needing a camera driver with a unique ID. The "capGetDriverDescription" function returns the name and ID of each driver.
2) Capturing: To capture the camera view: obj = capCreateCaptureWindow(); to start showing the camera view in a picture box in our software: sendMessage(connect, obj);

B. Processing Frames of Video
The video cannot be processed directly, so each frame is converted to an image: picture = hdcToPicture(obj); For example, with a 16 MP camera at 45 fps (frames per second), 45 images must be processed every second. Detailed RGB (red, green, blue) pixel values are obtained with the "getBitmapBits()" function.

C. Getting Pixel Color
Figure 2. Getting pixel color.

D. Scanning
Figure 3. Scanning pixel-wise horizontally in the x direction, then returning vertically in the y direction.

E. Algorithm for Pixel and Color Detection
Figure 4: Algorithm for pixel and color detection. X: x-coordinate of the pixel in the image; Y: y-coordinate of the pixel in the image; R: red; G: green; B: blue.

VII. METHODOLOGY

A. Hand Position Tracking and Mouse Control
Figure 5. Hand position tracking and mouse control.
Capturing user input virtually is the main aim of this module: the user moves a finger in front of the camera's capture area, the camera captures this motion, and the system processes it frame by frame. After processing, the system computes the finger coordinates, and once the coordinates are calculated it updates the cursor position.

B. Laser Pointer Detection
Figure 6. Laser pointer detection.
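The scanning and color-detection steps can be sketched in Python as an illustrative reimplementation (not the paper's Win32 code; the function name and the channel margin are assumptions): scan the frame's pixels, keep those where one channel clearly dominates (a red marker here), and take their centroid as the marker position.

```python
import numpy as np

def find_marker(frame, margin=60):
    """Locate a red marker in an H x W x 3 RGB frame.

    A pixel belongs to the marker when its red channel exceeds both
    green and blue by `margin`. Returns the (x, y) centroid of those
    pixels, or None if no marker pixel is found.
    """
    r = frame[:, :, 0].astype(np.int16)
    g = frame[:, :, 1].astype(np.int16)
    b = frame[:, :, 2].astype(np.int16)
    mask = (r - g > margin) & (r - b > margin)
    ys, xs = np.nonzero(mask)           # row/column indices of marker pixels
    if xs.size == 0:
        return None
    return float(xs.mean()), float(ys.mean())

# Toy frame: gray background with a red 2x2 marker at rows 2-3, cols 4-5.
frame = np.full((8, 8, 3), 90, dtype=np.uint8)
frame[2:4, 4:6] = (220, 40, 40)
print(find_marker(frame))   # (4.5, 2.5)
```

Comparing the centroid across consecutive frames gives the change in pixel position that the paper uses to drive the mouse pointer.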
C. Hand Gesture Based Auto Image Grabbing (Virtual Zoom In/Out)
Figure 7. Virtual zoom in/out.

D. Camera Processing and Image Capturing
Figure 8. Camera processing and image capturing.

E. Virtual Sense for File Handling
This system uses virtual-sense technology to copy a file from one system to another within a local area network (LAN/Wi-Fi). The user makes a picking-up action over the file to be copied, moves it toward the system where the file should go, and releases it over that system.

VIII. RESULTS AND DISCUSSION

The software can control all mouse clicking events using a color marker. After several experiments, it was observed that a red marker is more effective than markers of other colors.

Figure 9. Graphical user interface of the application.
Figure 10. Start camera.
Figure 11. Set the marker color.
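The cursor control demonstrated above reduces to a coordinate mapping from camera space to screen space. A minimal sketch, with assumed frame and screen sizes (the function name is illustrative, not from the paper):

```python
def to_screen(px, py, frame_w, frame_h, screen_w, screen_h):
    """Map a marker position in the camera frame to screen coordinates.

    The x-axis is mirrored so that moving the hand to the right moves
    the cursor to the right (the webcam sees a mirror image of the user).
    """
    sx = (frame_w - 1 - px) * (screen_w - 1) / (frame_w - 1)
    sy = py * (screen_h - 1) / (frame_h - 1)
    return round(sx), round(sy)

# A marker at the center of a 640x480 frame lands at the screen center.
print(to_screen(319.5, 239.5, 640, 480, 1920, 1080))  # (960, 540)
```

On Windows, the resulting pair could then be passed to, e.g., `ctypes.windll.user32.SetCursorPos(sx, sy)`; the paper's implementation similarly sends mouse-motion signals to the PC.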
Figure 12. Control motion and clicking events of the mouse with the color marker set earlier.

IX. CONCLUSION

This project can be very useful for people who want to control a computer without touching the system, and without a wireless mouse, which always needs a surface to operate on. Accuracy is higher when a red marker is used than when other color markers are used individually. The problem that changing lighting conditions pose for color-based recognition has been addressed in this work by providing a button to set the marker color at the start of the application. There are still some problems with recognition speed: the speed of controlling the mouse motion is not 100%, and it needs to be improved for some of the gestures. All mouse movements and key actions have already been mapped and work well under the given circumstances. As future work, the application can be improved to work with mobile phones and game consoles, and other modes of human-computer interaction, such as voice recognition, facial expressions, and eye gaze, can be combined to make the system more robust and flexible.

ACKNOWLEDGMENT

I want to thank all subjects who participated in our experiments, my guide for her valuable guidance, advice, and help during this project, and finally my parents for their encouragement.

REFERENCES

[1] W. T. Freeman and C. D. Weissman, "Television Control by Hand Gestures", in Proc. Int. Workshop on Automatic Face and Gesture Recognition, IEEE Computer Society, 1995, pp. 179-183.
[2] Z. Jun, Z. Fangwen, W. Jiaqi, Y. Zhengpeng, and C. Jinbo, "3D Hand-Gesture Analysis Based on Multi-Criterion in Multi-Camera Systems", in ICAL 2008: IEEE Int. Conf. on Automation and Logistics, IEEE Computer Society, September 2008, pp. 2342-2346.
[3] P. Mistry and P. Maes, "SixthSense: A Wearable Gestural Interface", in ACM SIGGRAPH ASIA 2009 Sketches, New York, NY, USA: ACM, 2009.
[4] A. Utsumi, T. Miyasato, and F. Kishino, "Multi-Camera Hand Pose Recognition System Using Skeleton Image", in RO-MAN'95: Proc. 4th IEEE Int. Workshop on Robot and Human Communication, IEEE Computer Society, July 1995, pp. 219-224.
[5] A. Wilson and N. Oliver, "GWindows: Robust Stereo Vision for Gesture-Based Control of Windows", in ICMI'03: Proc. 5th Int. Conf. on Multimodal Interfaces, New York, NY, USA: ACM, 2003, pp. 211-218.
[6] C. Gadea, B. Ionescu, D. Ionescu, S. Islam, and B. Solomon (University of Ottawa, Mgestyk Technologies), "Finger-Based Gesture Control of a Collaborative Online Workspace", in 7th IEEE Int. Symposium on Applied Computational Intelligence and Informatics, May 24-26, 2012, Timisoara, Romania.
[7] M. Ganasekera, "Computer Vision Based Hand Movement Capturing System", in 8th Int. Conf. on Computer Science & Education (ICCSE 2013), April 26-28, 2013, Colombo, Sri Lanka.
[8] F. Lamberti, "Endowing Existing Desktop Applications with Customizable Body Gesture-Based Interfaces", in IEEE Int. Conf. on Consumer Electronics (ICCE), 978-1-4673-1363-6, 2013.
[9] A. Agrawal, R. Raj, and S. Porwal, "Vision-Based Multimodal Human-Computer Interaction Using Hand and Head Gestures", in Proc. 2013 IEEE Conf. on Information and Communication Technologies (ICT 2013).
[10] M. Turk and G. Robertson, "Perceptual User Interfaces", Communications of the ACM, vol. 43(3), March 2000.
[11] Y. Wu and T. S. Huang, "Vision-Based Gesture Recognition: A Review", Lecture Notes in Computer Science, vol. 1739, pp. 103-115, 1999.