
GESTURE CONTROLLED ROBOTIC ARM

1. Shivashish Bohra, Student, Department of ME, Sapthagiri College of Engineering, Bengaluru, India (shivashishsarvr@gmail.com)
2. Chaya K, Student, Department of EEE, Sapthagiri College of Engineering, Bengaluru, India (chayack1997@gmail.com)
3. Arpitha Srinivasa Murthy, Student, Department of EEE, Sapthagiri College of Engineering, Bengaluru, India (arpithasrinivas1997@gmail.com)
4. Apoorva K G, Student, Department of EEE, Sapthagiri College of Engineering, Bengaluru, India (apoorvakg1998@gmail.com)
5. Irene Jacob, Student, Department of EEE, Sapthagiri College of Engineering, Bengaluru, India (ireneiris0397@gmail.com)

Abstract—This paper focuses on the design and control of a robotic arm with human hand gestures using computer vision techniques. Computer vision is a field of deep learning that enables machines to see, identify, and process images. Using OpenCV (Python) for computer vision provides a powerful environment for learning, since it is easy to use compared to other techniques. In this system, a camera reads the movement of the hand and communicates with the computer, which uses the gestures as input to control devices.

Keywords—Robotic arm, hand gestures, computer vision, OpenCV, gesture recognition, control devices.

I. INTRODUCTION

Robotics is an inter-disciplinary branch of engineering which deals with the design, construction, operation, and use of robots, as well as the computer systems for their control, sensory feedback, and information processing.

Computer vision provides the basis for applications such as automated image analysis, which enables us to determine any object or activity in a given image. Some of the tasks performed using computer vision are recognizing simple geometric objects, analyzing printed or hand-written characters, and identifying human faces and hand gestures.

Efficient control of the robotic arm is achieved using gesture recognition techniques. The captured gesture images can be processed by computer vision techniques [1]. Gesture recognition using the Kinect sensor has been used in various fields, such as controlling wheelchair movements and picking and placing objects, with a microcontroller controlling the rotation of the wheels [2]. The Kinect sensor can trace skeletal movements to provide gestures for a robotic arm prototype, with Bluetooth used for wireless transmission of signals to the arm [3]. Gesture control is also used in remote rescue operations where no humans can go, and it has the potential to become part of massive rescue operations [4]. Gesture recognition involving counting of the fingertips in the gesture can be implemented [5]. Using a plain microcontroller to control the robotic arm is less efficient compared to an Arduino [6]. A robotic arm can be a great boon to the handicapped as a prosthetic arm [7]. Glove-based arms have been designed using neural networks to detect objects and perform pick-and-place operations [8], or to navigate the arm in space with appropriate gestures [9].

II. PROPOSED SYSTEM

We provide a system in which the user can navigate the wireless robot in its environment using various gesture commands.

A camera is used to capture real-time hand gestures and generate commands for the robot. The gesture is taken as input and processed using image processing.

The Arduino takes the signal derived from the camera as input and generates output signals. The output depends on the gesture: for every possible gesture input, a different output signal is generated. The servo motors take digital signals as input from the Arduino. Once a command signal is given to the robotic arm, it replicates the gesture.
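As an illustrative sketch of this command pipeline, the control loop below captures frames, classifies the gesture, and forwards a command byte. The gesture labels, command bytes, and the injected `classify`/`send` callables are assumptions for illustration, not taken from the paper.

```python
# Illustrative sketch of the proposed gesture-to-command loop.
# Gesture names and command bytes below are assumptions, not from the paper.

# Map a recognized gesture to a one-byte command for the Arduino.
GESTURE_COMMANDS = {
    "open_palm": b"O",    # e.g. open the gripper
    "fist": b"C",         # e.g. close the gripper
    "one_finger": b"L",   # e.g. rotate left
    "two_fingers": b"R",  # e.g. rotate right
}

def gesture_to_command(gesture):
    """Return the command byte for a gesture, or None if unrecognized."""
    return GESTURE_COMMANDS.get(gesture)

def control_loop(classify, send, camera_index=0):
    """Capture frames, classify each one, and send the matching command.

    `classify(frame) -> str` and `send(bytes)` are injected so the loop
    stays independent of the concrete recognizer and serial link.
    """
    import cv2  # third-party: pip install opencv-python
    cap = cv2.VideoCapture(camera_index)
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break  # camera disconnected or stream ended
            cmd = gesture_to_command(classify(frame))
            if cmd is not None:
                send(cmd)
    finally:
        cap.release()
```

Injecting `classify` and `send` keeps the loop testable without a camera or an Arduino attached.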
[Fig 1. Block diagram: gestures captured by the camera are processed by the computer and passed to the Arduino Uno (with power supply), which drives the servo motors of the robotic arm]

III. GESTURE RECOGNITION USING OPENCV AND PYTHON

1. Capturing frames

The input frames (RGB format) are captured by the camera and converted into a grayscale image, and the region of interest is extracted, as shown in Fig 3.1.

Fig 3.1. Input frame

2. Blur frame

For blurring the image using the Gaussian blur function, the grayscale image is convolved with a Gaussian filter, a low-pass filter that removes high-frequency components.

Fig 3.2. Blurred image

3. Thresholding process

Image thresholding is an important intermediary step in image-processing pipelines. Thresholding can help us remove lighter or darker regions and contours of images. Thresholding is the process of assigning pixel intensities to 0s and 1s based on a particular threshold level, so that our object of interest alone is captured from the image. Here, all pixels in the gray image with intensity greater than 225, which correspond to the background of the image, are set to 0 (black).

Fig 3.3. Thresholded image

4. Drawing contours, finding the convex hull and convexity defects

In computer vision, contours are drawn by scanning the binary image from left to right and from top to bottom to find the first contour pixel, and then scanning clockwise until the next pixel value becomes 1. The largest contour (the hand region with its fingertips) is extracted, and then the convex hull, a convex set enclosing the hand region, is drawn. The green line in Fig 3.4 is the convex hull and the blue dots are the defect points. Depending on the number of defect points, the gesture is identified.

Fig 3.4. Convex hull and convexity defects

5. Counting the fingers

The number of fingers is identified based on the distance between the centre and the tip of a finger (the radius).
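The five steps above can be sketched with OpenCV as follows. The threshold strategy (Otsu rather than the fixed 225), the blur kernel size, and the defect-depth filter are illustrative assumptions; the fingers-from-defects rule follows the common heuristic that n valleys between raised fingers imply n + 1 fingers.

```python
# Sketch of the five gesture-recognition steps using OpenCV.
# Kernel size, threshold strategy, and depth filter are assumptions.

def defects_to_finger_count(num_defects):
    """n convexity defects (valleys between fingers) imply n + 1 fingers;
    zero defects is treated as a single raised finger or a fist."""
    return num_defects + 1 if num_defects > 0 else 1

def count_fingers(frame_bgr):
    """Steps 1-5: grayscale, Gaussian blur, threshold, largest contour,
    convex hull with convexity defects, then count the fingers."""
    import cv2  # third-party: pip install opencv-python

    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)            # step 1
    blur = cv2.GaussianBlur(gray, (35, 35), 0)                    # step 2
    _, thresh = cv2.threshold(blur, 0, 255,
                              cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)  # step 3
    contours, _ = cv2.findContours(thresh, cv2.RETR_TREE,
                                   cv2.CHAIN_APPROX_SIMPLE)       # step 4
    if not contours:
        return 0
    hand = max(contours, key=cv2.contourArea)       # largest contour = hand
    hull = cv2.convexHull(hand, returnPoints=False)
    defects = cv2.convexityDefects(hand, hull)
    num_defects = 0
    if defects is not None:
        for i in range(defects.shape[0]):
            start, end, farthest, depth = defects[i, 0]
            # keep only deep defects -- the valleys between fingers
            # (depth is a fixed-point value in 1/256 pixel units)
            if depth > 10000:
                num_defects += 1
    return defects_to_finger_count(num_defects)                   # step 5
```

The OpenCV work is confined to `count_fingers`, so the counting rule itself can be exercised without a camera.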
IV. INTERFACING PYTHON WITH ARDUINO

Interfacing Python with the Arduino is done with pyFirmata, a prebuilt Python library package that can be installed to allow serial communication between a Python script on any computer and an Arduino.

V. HARDWARE IMPLEMENTATION

In this system, the camera reads the gesture and communicates with the computer, which uses the images as input. The image is processed by the system, and the processed data of the thresholded image is converted into angles of rotation of the servo motors by sending commands to the Arduino. The Arduino sends the required PWM to the servo motors for appropriate movement of the fingers connected to it.

VI. CONCLUSION

Gesture recognition helps computers begin to understand human body language, thus building a richer bridge between machines and humans than primitive text user interfaces or even GUIs (graphical user interfaces), which still limit the majority of input to keyboard and mouse. Controlling the robotic arm by interfacing the Arduino with Bluetooth could enable distant communication. Efficient operation in poor lighting conditions should also be accomplished, and sensors could be used to detect obstacles.

VII. FUTURE SCOPE

Robotic arms have a wide scope of development. In the near future, such arms will be able to perform every task humans can, and in a much better way. The brain-computer interface (BCI) is an emerging field of research; BCI can be used to acquire signals from the human brain and control the arm, so that the system works in the same way as a human arm. A person who has lost a hand in an accident can resume life as before with such an artificial arm. Robotic arms are versatile and can be implemented in an enormous number of ways.
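The pyFirmata interfacing and the servo's pulse-width behaviour described above can be sketched as follows. The serial port name, servo pin, and the gesture-to-angle mapping are illustrative assumptions; the pyFirmata calls themselves (`Arduino`, `get_pin` in servo mode, `write`) are the library's standard API, which requires the Arduino to be running the StandardFirmata sketch.

```python
# Sketch of Python-to-Arduino servo control via pyFirmata.
# Port name, pin number, and the angle mapping are assumptions.

def fingers_to_angle(fingers, max_fingers=5):
    """Map a counted finger gesture (0..5) to a servo angle (0..180 deg)."""
    fingers = max(0, min(fingers, max_fingers))  # clamp out-of-range counts
    return int(fingers * 180 / max_fingers)

def angle_to_pulse_us(angle):
    """A standard servo maps a 1 ms..2 ms pulse width to 0..180 degrees."""
    return int(1000 + (angle / 180) * 1000)

def drive_servo(angles, port="/dev/ttyACM0", pin=9):
    """Send a sequence of angles to a servo through an Arduino running
    the StandardFirmata sketch (hardware required)."""
    import time
    from pyfirmata import Arduino  # third-party: pip install pyfirmata
    board = Arduino(port)
    servo = board.get_pin('d:%d:s' % pin)  # digital pin in servo mode
    for angle in angles:
        servo.write(angle)   # pyFirmata expects the angle in degrees
        time.sleep(0.5)      # give the servo time to reach the position
    board.exit()
```

Keeping the angle arithmetic in pure functions means the mapping can be checked on any machine, while `drive_servo` is only run with the hardware attached.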

Fig 4.1. Variable pulse-width control of the servo motor

• The servo motor works by using pulse-width modulation for control.
• A standard servo motor can rotate from 0 degrees to 180 degrees.
• Varying the pulse width between 1 ms and 2 ms varies the servo position between 0° and 180°.

REFERENCES

1. Prutha Atre, Sahil Bhagat, Nevil Pooniwala, Payal Shah, "Efficient and Feasible Gesture Controlled Robotic Arm", IEEE ICICCS, 2018.
2. M. Ababneh, H. Sha'ban, D. Khader, H. Mahameed, M. AlQudimat, "Gesture Controlled Mobile Robotic Arm for Elderly and Wheelchair People Assistance Using Kinect Sensor", 15th International Multi-Conference on SSD, 2018.
3. Memon Md. Farhan Md. Fareed, Shaikh Bilal Ahmed Anees, "Gesture Based Wireless Single Armed Robot in Cartesian 3D Space Using Kinect", 5th International Conference on Communication Systems and Network Technologies, 2015.
4. Pantha Protim Sarker, Farihal Abedin, Farshina Nazrul Shimim, "R3Arm: Gesture Controlled Robotic Arm for Remote Rescue Operation", IEEE R10-HTC, 2017.
5. Ibrahim Baran Celik, Mehmet Kuntalp, "Development of a Robotic-Arm Controller by Using Hand Gesture Recognition", IEEE, 2012.
6. Ganesh Choudhary B, Chethan Ram B, "Real Time Robotic Arm Control Using Hand Gestures", 2nd International Conference on Machine Learning and Computing, 2010.
7. Nicholas Bonini, Nithya Iyer, David Kim, Katherine Mathison, Laurewellons, "Robotic Hand in Motion Using Arduino-Controlled Servos", New Jersey Governor's School of Engineering and Technology, 2014.
8. Asif Shahriyar Sushmit, Fariha Musharrat Haque, Md. Shahriar, Shaikh Al Mahmud Bhuiyan, M. A. Rashid Sarkar, "Design of a Gesture Controlled Robotic Gripper Arm Using Neural Networks", IEEE ICPCSI, 2017.
9. Harish Kumar Kaura, Vipul Honrao, Sayali Patil, Pravish Shetty, "Gesture Controlled Robot Using Image Processing", IJARAI, Vol. 2, No. 5, 2013.
