
Design of a Laser Controlled Keyboard for Physically Challenged People

Ayan Paul¹, Pramit Dey², Dipanjan Saha³, Asoke Nath⁴
Department of Computer Science
St. Xavier's College (Autonomous)
Kolkata, India
¹ayanpaul_90@yahoo.com, ²pramitdey@yahoo.com, ³dipanjan.saha.me@yahoo.com, ⁴asokejoy1@gmail.com


Abstract- Human interaction with the computer is done mainly through the mouse and the keyboard. Recently Dey et al. developed a method in which the entire mouse operation can be performed using a laser beam. This allows the user to control the mouse from a remote place and may also help physically challenged people to operate the mouse with a laser pointer. In this paper the authors present a computer vision based system in which the keyboard is controlled using a laser; the keyboard need not be an actual one and can be made of paper or fabric, although the conventional keyboard layout is required. Any laser pointer can be used to control the keyboard, and all kinds of keyboard functions can be actuated with the present system. Projection keyboards have been implemented by IBM, but they are much costlier than the system built by the authors.
Keywords: laser; keyboard; MATLAB; C++; human computer interaction
1. INTRODUCTION
Due to the extensive growth of internet technologies, controlling a machine remotely has become one of the most interesting areas of research. Human computer interaction with a laser pointer has already been used to actuate the Windows mouse operations [1]. In this paper we present a system with which one can simulate the complete conventional keyboard with a laser pointer; the keyboard does not even need to be a real one, as a simple drawing of its layout is sufficient.






The keyboard image can be drawn on paper or fabric in the conventional manner and, most importantly, in a matrix format, and a webcam is used to capture the image of the keyboard layout. The laser pointer is used to illuminate the key that the user wants to press. The laser pointer is tracked by finding the brightest point in the image, and by comparing this position with the matrix the required key is identified and actuated. The setup includes only a camera and an ordinary laser pointer as the pointing device. The first system, the vision-based keyboard system, is designed for wearable computers and personal digital assistants (PDAs). The system needs a keyboard layout image that can be printed on any material, and the user enters a character by controlling the laser pointer. Alternatively, this keyboard system can easily be adapted for physically challenged people who have little or no control of their hands, making it difficult to use a standard mouse or keyboard. As a companion system, a vision-based mouse system has been developed for the same class of physically challenged people, allowing the user to control the mouse pointer. The main advantage of the developed system for physically challenged people is that it requires only a USB webcam and an ordinary laser pointer, so the overall cost is very low compared with similar products. In the present work MATLAB is used for the image processing and C++ is used for calling the interrupts for the respective keys.


2. SYSTEM ARCHITECTURE

The system is divided into a hardware part, which forms the keyboard interaction system, and a software part, which performs the image processing. The hardware consists of a laser pointer, a camera and a computer. The laser pointer is used to interact with the layout of the keyboard, the webcam captures the image and provides the position of the laser spot, and the computer acts as the processing unit. The software part consists of a colour setting module and a calibration module. The colour setting module locates the laser point in the input image: the frame is split into its RGB layers, the red and green layers are removed, and the remaining image is treated as a grayscale image. The calibration module then determines the position of the spot on the keyboard layout. The proposed hardware is an inexpensive setup, and any person in the room with a laser pointer can interact with the keyboard.
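As a minimal illustration of the colour setting module, the MATLAB sketch below keeps only the blue layer of an RGB frame. It is not the authors' actual code; the function name and the assumption that the webcam frame arrives as an M-by-N-by-3 uint8 array are ours.

    % Minimal sketch of the colour setting module (not the authors' code).
    % Assumption: rgbFrame is an M-by-N-by-3 uint8 RGB image from the webcam.
    function gray = colorSettingModule(rgbFrame)
        gray = rgbFrame(:, :, 3);   % discard the red and green layers, keep blue
    end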
3. DETECTION OF THE LASER SPOT BY THE CAMERA
3.1. Image Acquisition
The projected area is captured by a webcam at its default resolution; the frames may arrive in RGB, YUV or any other format. The intensity of the projected area is measured and checked to see whether it lies within the expected range. Frames are captured at an average rate of 30 frames per second. Each frame is treated as an image: the red layer of the RGB frame is removed and the blue layer is retained, which is then treated as a grayscale image from which the brightest spot is detected. A constant threshold B_thres is used; any pixel value exceeding this threshold is considered to belong to the laser point, and the coordinates of the corresponding pixels are taken as the cursor position.
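The brightest-spot search can be sketched in MATLAB as follows. This is only an illustration of the steps described above, not the authors' code; the function name and the example threshold value are our own choices.

    % Sketch of the brightest-spot detection.  gray is the blue-layer
    % grayscale image produced by the colour setting module; Bthres is
    % the constant intensity threshold.
    function [spotRow, spotCol, found] = detectLaserSpot(gray, Bthres)
        [brightest, idx] = max(gray(:));           % brightest pixel in the frame
        found = brightest > Bthres;                % below threshold: no laser spot
        [spotRow, spotCol] = ind2sub(size(gray), idx);
    end

    % Example usage:
    %   gray = colorSettingModule(frame);
    %   [r, c, hit] = detectLaserSpot(gray, 240);  % 240 is an assumed threshold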

3.2. Calibration
The image captured by the webcam is resized and a scaling factor (SF) is calculated. The scaling factor gives the ratio by which the detected pixel coordinates are rescaled before they are mapped onto the keyboard layout.
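The paper does not give an explicit formula for SF, so the MATLAB sketch below assumes it is simply the ratio between the resized working frame and the frame delivered by the webcam; the function name and the example frame size are illustrative.

    % Calibration sketch (assumption: SF is the ratio between the resized
    % working frame and the captured frame; the paper does not define it
    % explicitly).
    function [sfRow, sfCol] = scalingFactor(capturedFrame, workRows, workCols)
        [capRows, capCols, ~] = size(capturedFrame);
        sfRow = workRows / capRows;    % vertical scaling factor
        sfCol = workCols / capCols;    % horizontal scaling factor
    end

    % Example: rescale a detected spot position to the working frame.
    %   [sfRow, sfCol] = scalingFactor(frame, 480, 640);
    %   rowResized = round(r * sfRow);
    %   colResized = round(c * sfCol);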





Fig. 1. Resizing of the screen.
4. IMPLEMENTATION
Here we consider the layout of a standard QWERTY keyboard with all of its keys. The projected or paper keyboard is designed so that each key occupies a non-overlapping rectangular region, as shown in Fig. 1. The keyboard is divided into a grid in which each cell represents either a button or a blank space. Each cell is given a unique cell address, similar to the address of a cell in a 2-D array, with the first address starting from location 0 (zero). The paper, i.e. the captured region, is divided into a grid of 24 columns by 7 rows, and the address of each cell is determined by the formula (24*rowval + colval), where rowval is the row position in the range 0 to 6 and colval is the column position in the range 0 to 23. The cell addresses therefore range from 0 up to 167.
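As a sketch of how a calibrated spot position can be turned into a cell address, the MATLAB function below maps pixel coordinates to (rowval, colval) and applies the formula 24*rowval + colval. The pixel-to-cell mapping is our own illustrative assumption; only the address formula is taken from the text.

    % Sketch of the cell-address computation (the pixel-to-cell mapping is
    % an illustrative assumption; the address formula is given in the text).
    function addr = cellAddress(spotRow, spotCol, frameRows, frameCols)
        nRows = 7;                                         % grid rows
        nCols = 24;                                        % grid columns
        rowval = min(floor(spotRow / frameRows * nRows), nRows - 1);
        colval = min(floor(spotCol / frameCols * nCols), nCols - 1);
        addr = nCols * rowval + colval;                    % 0 .. 167
    end

    % Example: a spot at pixel (120, 320) of a 480-by-640 frame gives
    % rowval = 1, colval = 12, i.e. cell address 36.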

Table 1. Matrix address codes
Key    Cell address(es)
Esc 0
F1 2
F2 3
F3 4
F4 5
F5 6
F6 7
F7 8
F8 9
F9 10
F10 11
F11 12
F12 13
Power 16
Sleep 17
Wake Up 18
~, ` 48
1, ! 49
2, @ 50
3, # 51
4, $ 52
5, % 53
6, ^ 54
7, & 55
8, * 56
9, ( 57
0, ) 58
- , _ 59
+ ,= 60
| , \ 61
Backspace 62
Print Screen 64
Scroll Lock 65
Pause, Break 66
Num Lock 68
/ (Divide) 69
* (Multiply) 70
- (Subtraction) 71
Tab 72,73
Q 74
W 75
E 76
R 77
T 78
Y 79
U 80
I 81
O 82
P 83
{ , [ 84
} , ] 85
Enter/Return 86,109,110,143,167
Insert 88
Home 89
Page Up 90
Num 7 , Home 92
Num 8, Up arrow 93
Num 9, Pg Up 94
+ (Addition) 95,119
Caps Lock 96,97
A 98
S 99
D 100
F 101
G 102
H 103
J 104
K 105
L 106
: , ; 107
' , " 108
Delete 112
End 113
Page Down 114
Num4, Left arrow 116
Num5 117
Num6, Right arrow 118
Left Shift 120,121,122
Z 123
X 124
C 125
V 126
B 127
N 128
M 129
< , , 130
. , > 131
? , / 132
Right Shift 133,134
Up arrow 137

Num1, End 140
Num2, Down Arrow 141
Num3, Page Down 142
Left Control 144,145
Left Windows 146
Left ALT 147
Space Bar 148,149,150,151,152,153
Right ALT 154
Right Windows 155
Right Click short key 156
Right Control 157,158
Left Arrow 160
Down Arrow 161
Right Arrow 162
Num0, Insert 164,165
. , Delete 166

The value is determined from the position of the laser spot and mapped to the key to which it corresponds. Our program displays the button on which the laser is shining, and when the laser is switched off, the interrupt of the button on which the laser was last shining is called. Each laser on/off cycle therefore corresponds to one key hit, except for Left Shift, Right Shift, Left Control, Right Control, Left Alt and Right Alt. For these keys the hits alternate: the first hit represents a button press and the next hit represents a button release.
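The authors generate the actual key events from C++; purely to illustrate the press/release logic described above, the MATLAB sketch below uses java.awt.Robot (reachable through MATLAB's Java bridge) as a stand-in and covers only a small subset of the addresses in Table 1.

    % Illustrative press/release logic (not the authors' C++ code).
    % Ordinary keys fire a full key hit per laser on/off cycle; modifier
    % keys toggle between press and release on alternate hits.
    function simulateKey(addr)
        import java.awt.event.KeyEvent
        persistent robot shiftDown
        if isempty(robot)
            robot = java.awt.Robot();
            shiftDown = false;
        end
        switch addr
            case 98                              % cell 98 -> letter A (Table 1)
                robot.keyPress(KeyEvent.VK_A);
                robot.keyRelease(KeyEvent.VK_A);
            case {120, 121, 122}                 % Left Shift: toggle press/release
                if shiftDown
                    robot.keyRelease(KeyEvent.VK_SHIFT);
                else
                    robot.keyPress(KeyEvent.VK_SHIFT);
                end
                shiftDown = ~shiftDown;
            otherwise
                % the remaining Table 1 addresses would be handled here
        end
    end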






Fig. 2. Setup used for the paper keyboard.

Fig. 3. The laser spot is detected as the brightest point on the paper and the corresponding key action is simulated.

Fig. 4. Special characters are also available by simulating the Shift key along with the special-character button.

Fig. 5. The Num Lock key is switched on as the laser is brought onto the Num Lock button and switched off.

Fig. 6. Simulation of the Enter keys.

Fig. 7. Simulation of uppercase letters, either after switching on the Caps Lock key or while a Shift key is held down. Caps Lock is selected by switching the laser on and off above the button; afterwards all letters appear as capitals in the current Notepad window.

Fig. 8. Simulation of the key combination Ctrl+A.

Fig. 9. Simulation of the Tab key.

Fig. 10. Simulation of the Delete key: the letter D entered in the previous figure is deleted.

Fig. 11. Scanned picture of the paper keyboard used in the experiment.

Fig. 12. The Windows key operation is simulated as the laser spot is detected on the corresponding button.
5. CONCLUSION AND FUTURE SCOPE
When the webcam is used to capture the image of the keyboard layout, care should be taken that there are no reflections and no spot brighter than the laser spot; the intensity of the laser pointer pixels should be greater than that of any other pixel in the captured image.
In the future we would like to extend this project in a new direction, in which a remote machine is accessed over the network and the laser is used to control the keyboard of that remote machine rather than the machine to which the apparatus is connected. We have already started working on this extension.












6. REFERENCES

[1] P. Dey, A. Paul, D. Saha, S. Mukherjee and A. Nath, "Laser Beam Operated Windows Operation," accepted for publication in the IEEE conference CSNT-2012, Rajkot, India, May 11-13, 2012.

[2] J. Mankoff and G. D. Abowd, "Cirrin: A word-level unistroke keyboard for pen input," in Proceedings of the Symposium on User Interface Software and Technology (UIST '98), pages 213-214, 1998.

[3] J. Nielsen, Usability Engineering, Academic Press, Inc., 1993.

[4] J. L. Crowley, J. Coutaz and F. Bérard, "Perceptual user interfaces: things that see," Communications of the ACM, 2000.

[5] Z. Zhang, Y. Wu, Y. Shan and S. Shafer, "Visual panel: Virtual mouse, keyboard and 3D controller with an ordinary piece of paper," in Proceedings of the ACM Perceptual User Interfaces Workshop (PUI 2001), Florida, November 2001.

[6] O. Faruk Ozer, O. Ozun, V. Atalay and A. Enis Cetin, "Visgraph: Vision-based single stroke character recognition for wearable computing," IEEE Intelligent Systems and Applications, May-June 2001.

[7] C. Tomasi, A. Rafii and I. Torunoglu, "Full-size projection keyboard for handheld devices," Communications of the ACM, 2003.

[8] I. S. MacKenzie, "Fitts' law as a research and design tool in human-computer interaction," Human-Computer Interaction.