
Wearable Computing Device with Gestural

Augmentation using LabVIEW


Eliyaz Mahammad1, Madhu Nakirekanti2, G.Bhaskar Phani Ram3
1,2,3Assistant Professor

Department of ECE
Vardhaman College of Engineering
Hyderabad, India

Abstract—Our motivation comes from the analysis by Mr. Steve Mann of computational imaging and wearable computing, which brought out sixth sense technology. Wearable computing is the practice of developing, designing, or using very small wearable and sensor devices. This device lets us work with data through simple hand gestures; the recognition of gestures is the basic idea behind it. The main aim of our project is to bring the digital and natural worlds closer, virtually together. Our project uses a camera which captures the gestures, processes them on an RT processor (myRIO) using LabVIEW software, and performs the desired action. The applications of our device include:
i. A gestural camera that permits taking images by making a framing gesture with the fingers.
ii. Copying data from the physical world, such as a paragraph on a printed paper, with a hand gesture and pasting it onto a digital device, i.e., a gesture-based pick from a real paper and drop onto a digital computer screen, rather than typing.
iii. Replacement of the physical keyboard by a printed paper one.

Keywords— Wearable Computing, Gesture Recognition, LabVIEW, myRIO.

I. INTRODUCTION

The wearable computing device consists of an "always on" computer combined with a control interface and a head-mounted display, and it has become a practical possibility. Steve Mann, one of the first adopters and advocates of this form of computer use, arrived at three basic properties of wearables. The first is that the computer is worn, not carried, and becomes part of the user; the second is that it is easily controllable by the user, without necessarily involving conscious thought or effort; and the third is that it runs in real time, is constantly active, and can interact with the user at any time. Using these definitions it is possible to recognize early applications of wearable computing [1-3] retrospectively. These include the shoe-mounted roulette-wheel prediction system by Shannon and Thorp, realized in 1960, further developed and used by the 'Eudaemons' and commercialized by Keith and others by 1983. Keith also designed a belt-mounted camera that relayed each card, as it was dealt, to a satellite receiver in a pickup truck in the parking lot, where an associate read the video picture and signaled the player at the table with the information he required to play his hand.

Our project requires a simple camera that captures the gestures; these are processed by the RT processor, which runs code built with LabVIEW software, to perform the required action. LabVIEW, developed by National Instruments, is a visual programming language consisting of a system design platform and development environment. The RT processor used in this project is the myRIO from National Instruments: a real-time embedded evaluation board programmed in LabVIEW and used to construct applications that exploit its onboard microprocessor and FPGA.

II. APPLICATIONS

Wearable computers find various applications in different fields, such as the industrial, military and medical fields.

A. Industrial Applications
The accurate and quick availability of complex information to a person working in the field, or in a non-office workplace, has been an important consideration for many organizations since the establishment of computerized storage in the 1950's. While this could be achieved with handheld equipment, a large number of workers use one or both hands to carry out their assigned tasks and must maintain eye contact with their work. Wearable computers with a head-mounted display and a hands-free interface provide a good solution for these workers.

B. Military Applications
Law-enforcement agencies and military organizations quickly acknowledged the importance of wearable devices to a soldier. In hostile situations, sensor data that helps distinguish friendly from opposing forces, and guidance on possible approaches, can be accessed through wearable devices. Naturally much of this research is classified, but collaborations with non-military researchers can be found in the United States, the U.K., Singapore and Australia. The main intention is to merge the soldier and the technology into a combat-effective, cohesive system.

C. Medical Applications
The applications described above use position-sensing technology to assist with many kinds of tasks; knowing where the user is clearly supports several wearable designs. Wearable devices can also be designed to monitor the well-being and activity of the user. Wearables have the potential to monitor health, helping to improve performance (e.g. in sports), to prevent and detect sickness through diagnosis, and even to deliver treatment, although this typically involves some invasive procedure.

III. LITERATURE SURVEY

The development of this technology can be traced back to the 1990's. During this period Steve Mann, known as the father of sixth sense technology [4-6], developed a wearable computer. This technology consists of a light source
joined with a simple camera, and it evolved in 1997, but the name Sixth Sense was first published in 2001. He developed this technology as a simple camera with a neck-worn projector. Bryce Kellogy implemented an overall system, and it became an effective, innovative and very important step towards the development of gesture recognition for mankind. The next development of this technology was done by Pranav Mistry from India, a research assistant at the MIT Media Lab. Using this technology, Mistry implemented new applications and coined the term "Wear Ur World" (WUW). The earlier prototype had one limitation: it used a helmet with a big projector mounted on it, which caused a problem in that if the person projecting the data onto a wall suddenly turned to talk with someone, the information would be projected over that person. The new neck-worn pendant prototype avoids this limitation. The initial prototype had a normal web camera; Pranav Mistry wore the device around his neck, with four coloured markers (red, green, blue and yellow) placed on both hands. After an intensive analysis, the authors found that the term "wearable computer" has been defined in many ways. For example, as per the Massachusetts Institute of Technology wearable computing laboratory, a wearable PC is a device that is worn much like eyeglasses or clothing and interacts with the user depending on the scenario. With features like personal WANs, inconspicuous input devices and alert displays, the wearable PC would behave as an intelligent assistant. NASA used the term Body Wearable Computer (BWC) for a battery-powered computer system worn on the user's body, implemented for mobile and mainly hands-free operation, often using speech input and head-mounted displays.

Gesture computing is an innovation that enables hand movements and outward appearances to act as controls. Various gadgets are used nowadays for capturing pictures and storing them on mass-storage devices, such as digital cameras with memory cards, cell phones with built-in memory and memory cards, and so forth. This project proposes the implementation of input devices such as a keyboard, and of a paint application, with the help of hand gestures. Any application can be controlled with these hand gestures, which opens a path for human-computer interaction.

IV. IMPACT, RELEVANCE AND NOVELTY

The touchless sensing and gesture recognition market is expected to grow from USD 5.15 billion in 2014 to USD 23.55 billion by 2020. Most of the companies in this market include Intel (U.S.), PrimeSense (Israel), Cross Match Technologies, Inc. (U.S.), Microchip Technology Inc. (U.S.), Qualcomm Inc. (U.S.) and more. AR technology is one of the most rapidly emerging technologies nowadays.

Our experience with the LabVIEW environment (including software and hardware) makes us believe strongly that our project, with some upgrades, can work wonders. With this project it is possible to play games on a personal computer with gestures, any non-touch monitor can be turned into a touch monitor, and we can even replace the computer mouse and operate it with hand gestures. Companies working on AR technology use other software procedures to make their products, but working with LabVIEW makes AR technology more efficient. Our project eases the life of blind and physically handicapped people, as it uses image-processing techniques to control the devices interfaced with it. Every individual can adopt this technology and configure it for the problems they face; for example, as a product for the physically handicapped it will help them control the configured device with simple hand gestures. As a product it serves both the business-to-business and the business-to-consumer market; it is a self-selling product.

V. SIXTH SENSE DEVICE

Sixth sense is the capability to perceive the subtle dimension, or the hidden world. It additionally incorporates our capability to understand the fragile cause-and-effect relationships behind many events, which lie beyond the understanding of the intellect. We are able to recognize the visible world through the five physical senses (sight, smell, touch, taste and sound), our mind and our intellect. When it comes to the unseen or subtle world, we tend to feel it through the five subtle senses, the subtle mind and the subtle intellect, popularly referred to as our intuition. We use our five natural senses to take in information; that information helps us make decisions and choose the right actions. However, probably the most useful information, the kind which would help us make the right decision, is not naturally perceivable with our five senses: notably the data and knowledge that humanity has accumulated about everything, which is increasingly available online. Sixth sense technology bridges this gap, bringing intangible digital information out into the tangible world and allowing us to interact with that information via hand gestures.

The sixth sense technology uses the following devices:

A. Camera
It tracks the motion of the coloured markers worn at the fingertips and follows the user's hand gestures. The camera recognizes the images, people, gestures and photographs that a person makes with his hand, and then sends these records to the laptop or PC for processing. The camera serves as the fundamental interface between the computer and the physical world.

Fig. 1. Web cam

B. Coloured Marker
These colour markers are placed at the tips of the user's fingers. Marking the fingers with yellow, red, blue and green coloured tape lets the camera recognize the hand gestures made in the air. The movement of the colour markers at the fingertips is recognized, and several hand gestures are identified depending on the combination of colours present, using information preloaded in the laptop.
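The marker-based tracking described above is configured graphically in LabVIEW's Vision Assistant, so it cannot be reproduced in text. The following Python sketch is an illustrative stand-in, not the authors' code: it classifies each pixel against the four reference marker colours and locates each marker as the centroid of its matching pixels. The reference RGB values and the distance threshold are assumptions, since the paper only names the four colours.

```python
# Reference RGB values for the fingertip tapes (assumed, illustrative).
MARKERS = {
    "red":    (255, 0, 0),
    "green":  (0, 255, 0),
    "blue":   (0, 0, 255),
    "yellow": (255, 255, 0),
}

def closest_marker(pixel, max_dist=120):
    """Return the marker name whose reference colour is nearest to
    `pixel` (an (r, g, b) tuple), or None if nothing is close enough."""
    best, best_d = None, float("inf")
    for name, ref in MARKERS.items():
        d = sum((p - r) ** 2 for p, r in zip(pixel, ref)) ** 0.5
        if d < best_d:
            best, best_d = name, d
    return best if best_d <= max_dist else None

def marker_centroids(frame):
    """frame: 2-D list of (r, g, b) pixels. Returns {name: (row, col)}
    with the centroid of every detected marker colour."""
    sums = {}
    for r, row in enumerate(frame):
        for c, px in enumerate(row):
            name = closest_marker(px)
            if name:
                sr, sc, n = sums.get(name, (0, 0, 0))
                sums[name] = (sr + r, sc + c, n + 1)
    return {name: (sr / n, sc / n) for name, (sr, sc, n) in sums.items()}
```

A real implementation would normally work in HSV space for lighting robustness; nearest-colour matching in RGB is used here only to keep the sketch self-contained.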
Fig.2. Colour markers

C. Laptop or PC
The computer or laptop is used for implementing and running the code written in the simulation tools, executing the image-processing idea. The camera tracks the different hand gestures carried out with the colour markers, and the results are passed to the PC or laptop for the further operations of the various applications.

Fig. 3. Laptop

VI. TECHNOLOGIES BASED ON SIXTH SENSE TECHNOLOGY

Sixth Sense technology uses a spread of approaches to computation and makes the digital side of our human lives more interactive, intuitive and, above all, more natural. We should not consider it a single technique; it is a lot of complicated technology squeezed into a simple portable device. Once implemented, we get prompt, pertinent visual information projected onto any object we pick up or interact with. The technology primarily depends on gesture recognition, augmented reality [7-9] and computer vision algorithms.

A. Augmented Reality
Augmented reality [10-12] is a phrase for a live, direct or indirect view of a real-world environment whose elements are augmented by virtual, system-generated imagery.

B. Gesture Recognition
Gesture recognition provides a path for computers to begin understanding human body movements, building a bridge between people and machines that goes beyond graphical user interfaces, which still limit the majority of input to the keyboard and mouse. Gesture recognition lets human beings interface with the device and allows us to communicate naturally without the need for any mechanical instruments.

C. Computer Vision
The software tracks the user's gestures using computer-vision-based algorithms. Computer vision is the inverse of computer graphics: computer graphics produces image data from three-dimensional models, while computer vision often generates three-dimensional structures from image information.

VII. DESIGN AND IMPLEMENTATION

A. Block Diagram
The block diagram of the project, shown in Fig. 4, gives the functional flow of the project. The user performs specific actions in front of the camera using the colour markers; the camera grabs the images, and they are processed according to the code dumped in the myRIO. The camera connected to the myRIO is mounted onto the body like a pendant or can be worn attached to a cap.

Fig.4. Block Diagram (the user performs predefined gestures using colour markers; the camera recognizes the gestures and sends them to the myRIO for processing; the myRIO processes the session according to the code developed and dumped in the RT processor; the laptop wirelessly receives, displays and stores the data sent by the myRIO)

B. Components
The various components used in this project are discussed below.

1) Battery
Eight AA batteries of 1.25 V each, with a barrel-type connector, power the 14 W myRIO system with its webcam for about 1.4 hours.

2) Colour Markers
Four colour markers (red, yellow, green, blue) are used for colour detection; they are detected by the camera and processed by the vision assistant system.

3) NI myRIO
The NI myRIO hardware, with the code for image acquisition and image processing deployed on it, is used as an embedded platform that communicates with the laptop wirelessly. The myRIO is a real-time embedded evaluation board developed by National Instruments; its structure is shown in Fig. 5 below. It is used for developing applications that exploit its FPGA and microprocessor, and it requires LabVIEW. The National Instruments myRIO-1900 is a portable, reconfigurable input/output device used for designing control, mechatronics and robotic systems.
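The 1.4-hour runtime quoted for the battery pack can be sanity-checked with a short calculation. The cell capacity is not stated in the paper, so the sketch below assumes a typical NiMH AA capacity of about 2000 mAh; the other figures (8 cells, 1.25 V per cell, 14 W load) come from the text.

```python
# Back-of-the-envelope check of the stated ~1.4 h runtime.
cells = 8
cell_voltage = 1.25          # volts per NiMH cell (from the paper)
capacity_ah = 2.0            # amp-hours per cell (assumed, typical NiMH AA)
load_watts = 14.0            # myRIO + webcam draw (from the paper)

pack_voltage = cells * cell_voltage        # series pack: 8 x 1.25 V = 10 V
energy_wh = pack_voltage * capacity_ah     # stored energy: 20 Wh
runtime_h = energy_wh / load_watts         # 20 Wh / 14 W ~= 1.43 h

print(round(runtime_h, 2))
```

Under that capacity assumption the expected runtime is about 1.43 h, consistent with the 1.4 hours reported.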

Fig.5. NI myRIO

The NI myRIO-1900 provides audio and power output pins, analog inputs (AI), analog outputs (AO) and digital input/output (DIO) lines in a very compact embedded instrument. The NI myRIO-1900 connects to a laptop over wireless 802.11b and USB, as shown in Fig. 6 below.

Fig.6. NI myRIO Internal Block Diagram

4) Camera
A Logitech 5 V, 500 mA webcam, powered from the NI myRIO, is used for image acquisition.

5) Laptop
The laptop is used for storing and displaying the data processed by the myRIO, with which it communicates wirelessly.

6) Connection Diagram
A battery set with a barrel-type connector, connected to the myRIO adaptor port, supplies power to the myRIO and to the web camera connected to it. The web camera is connected to the USB device port of the myRIO. The laptop and the myRIO are connected wirelessly. The connection diagram is shown in Fig. 7.

Fig.7. Connection Diagram

VIII. RESULTS

A. Gestural Camera

Fig.8. Block Diagram of Gestural Camera

The camera detects the gestures performed in front of it using the colour markers and checks the condition that the distance between the coordinates of each of the two colour pairs, (blue and yellow) and (red and green), lies between -100 and 100 but is not zero. Image logging of the Vision Acquisition is enabled, so the image captured by the web camera connected to the myRIO is saved as a single image into the image log folder. The block diagram and front panel of the gestural camera are shown in Fig. 8 and Fig. 9 respectively. The image showing the detected colour templates is shown in Fig. 10.

Fig.9. Front Panel of Gestural Camera
Fig.10. Image showing the detected colour templates
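The capture condition above lives in the LabVIEW block diagram of Fig. 8, so the Python sketch below is only an assumed textual rendering of it: the signed coordinate difference within each fingertip pair must be inside the ±100 window but non-zero before a frame is logged. The function names and the interpretation of "distance" as a signed difference are assumptions.

```python
def frame_gesture(blue_x, yellow_x, red_x, green_x, limit=100):
    """Return True when both marker pairs are within +/-`limit` of
    each other but not exactly aligned (the framing gesture)."""
    d1 = blue_x - yellow_x   # signed distance, blue/yellow pair
    d2 = red_x - green_x     # signed distance, red/green pair
    return all(-limit < d < limit and d != 0 for d in (d1, d2))

def maybe_capture(markers, capture):
    """markers: dict of marker-name -> coordinate. Calls `capture()`
    (standing in for one logged Vision Acquisition frame) only when
    the framing gesture is present; returns its result, else None."""
    if frame_gesture(markers["blue"], markers["yellow"],
                     markers["red"], markers["green"]):
        return capture()
    return None
```

For example, markers at x-positions blue=120, yellow=80, red=200, green=260 give pair distances of 40 and -60, both inside the window, so a frame would be captured.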

B. Pick and Place into a Digital Screen
Picking and placing into the digital screen involves a gesture to pick an image from the natural world. The camera captures the image, and the selected part of the image is trimmed by IMAQ Extract 2. The LabVIEW icon for IMAQ Extract 2 is shown in Fig. 11.

Fig.11. IMAQ Extract 2

The selected part of the real-world paper is recognized from the coordinates of the colour markers and trimmed by the IMAQ Extract 2 palette. The coordinates are fed as the input to the optional rectangle, by which the selected part is trimmed. The Write Image function acquires the single image from the image output, and the image is written into memory when the two colours (red and yellow) mark the area to be picked. IMAQ Extract extracts the image from the real world by taking the ROI from the coordinates of those two templates. The block diagram and front panel of the pick-and-place feature are shown in Fig. 12 and Fig. 13 respectively.

Fig.12. Block Diagram of Pick and Place Feature
Fig.13. Front Panel of Pick and Place Feature

C. Replacement of the Physical Keyboard by a Printed One
The vision assistant detects the number templates and the colour template as shown in Fig. 14 below.

Fig.14. Image showing the overlays over the number templates and the red colour template

Here the user uses a keyboard-patterned paper. He shows the character to be typed with his finger, and the character is recognized by OCR, taking the ROI slightly above the colour template. The typed data is copied into a Word file programmatically. The image showing the OCR detecting all the characters in the keypad is shown in Fig. 15, and the front panel of the virtual keyboard is shown in Fig. 16.

Fig.15. The image showing OCR detecting all the characters in the keypad
Fig.16. Front Panel of Virtual Keyboard
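The ROI trimming done by IMAQ Extract 2 is a LabVIEW palette node, so the sketch below is a Python stand-in rather than the actual VI: it crops the rectangle whose opposite corners are the red and yellow fingertip centroids, which is the role the optional-rectangle input plays in the pick gesture described above.

```python
def extract_roi(frame, red_xy, yellow_xy):
    """frame: 2-D list (rows of pixels). red_xy / yellow_xy are
    (row, col) marker centroids marking opposite corners of the
    region to pick; returns the enclosed sub-image, inclusive."""
    (r1, c1), (r2, c2) = red_xy, yellow_xy
    top, bottom = sorted((r1, r2))     # corner order does not matter
    left, right = sorted((c1, c2))
    return [row[left:right + 1] for row in frame[top:bottom + 1]]
```

For example, on a 4x4 frame with markers at (0, 1) and (2, 3), the picked region spans rows 0-2 and columns 1-3, regardless of which hand holds which corner.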
CONCLUSION

This technology can be used as a replacement for the five senses for disabled people, and it gives simple control over machinery in industry. It offers different applications for different developers, depending on how the developer thinks and what he needs. Gesture movements, speech integrated circuits, and LabVIEW software and hardware have together made sixth sense technology an emerging innovation. It provides smooth access to information that may help us make crucial decisions. The masterstroke is that sixth sense technology detects the objects around us, lets us use that information in the way we need, and projects the data as well, all in the easiest of ways. Well-chosen awareness of this technology will drive its further development and usefulness, which in turn will aid in acquiring information and performing practically any kind of operation at any time, simply by using gestures and commands.

References
[1] D.S. Patil and M. Shahak Patil, "Sixth Sense Technology - A New Innovation", International Journal on Recent and Innovation Trends in Computing and Communication, vol. 2, issue 5, pp. 1200-1204, ISSN: 2321-8169.
[2] Aakansha Yeole, S. Chitra, V. Sriraman, Pooja Shinde and Priyanka Sadhwani, "Virtual Keyboard", International Journal of Science Engineering and Technology Research, vol. 3, issue 3, March 2014.
[3] T. Starner, "The Challenges of Wearable Computing", IEEE Micro, 21, pp. 54-67, July 2001.
[4] "Use your hand as a 3-D mouse, or, relative orientation from extended sequences of sparse point and line correspondences using the affine trifocal tensor", in Proceedings of the 5th European Conference on Computer Vision, 1998, pp. 141-157.
[5] O. Castillo and P. Melin, "A New Approach for Plant Monitoring using Type-2 Fuzzy Logic and Fractal Theory", International Journal of General Systems, Taylor and Francis, vol. 33, 2004, pp. 305-319.
[6] R.E. Bakay and P.R. Kennedy, in B. Siuru, "A Brain/Computer Interface", Electronics Now, March 1999, pp. 55-56.
[7] L. Bass, C. Kasabach, R. Martin, D. Siewiorek, A. Smailagic and J. Stivoric, "The Design of a Wearable Computer", Proceedings of CHI 97, 1997, pp. 139-146.
[8] N.W. Campbell, W.P. Mackeown, B.T. Thomas and T. Troscianko, "Automatic Interpretation of Outdoor Scenes", British Machine Vision Conference, September 1995.
[9] T.P. Caudell and D.W. Mizell, "Augmented Reality: An Application of Heads-Up Display Technology to Manual Manufacturing Processes", Proceedings, IEEE Hawaii International Conference on Systems Sciences, January 1992, pp. 659-669.
[10] E.A. Doyle (Boland), S.A. Weinzimer, A.T. Steffen, J.H. Ahern, M. Vincent and W.V. Tamborlane, "A Randomized, Prospective Trial Comparing the Efficacy of Continuous Subcutaneous Insulin Infusion with Multiple Daily Injection Using Insulin Glargine", Diabetes Care, 27, 2004, pp. 1554-1558.
[11] D. Duchamp, K.F. Steven and Q.M. Gerald Jr., "Software Technology for Wireless Mobile Computing", IEEE Network Magazine, 12(18), 1991, p. 218.
[12] S. Feiner, B. MacIntyre and D. Seligmann, "Knowledge-based augmented reality", Communications of the ACM, 36(7), 1993, pp. 53-62.
