
Touching a remote object virtually

Areas: Perceptual Computing, Interactive Technologies
Author: Chetankumar G. Shetty

Disclaimer
The views expressed in this paper/article correspond to the views of the author (Mr. Chetankumar G. Shetty). The research matter articulated in this paper/article belongs entirely to the author. The paper/article is published at www.chetangshetty.com. Any unauthorized broadcasting in any form or by any means, electronic or mechanical, including photocopying, recording by any information storage and retrieval system, public performance, copying or re-recording, will constitute an infringement of copyright. Prior written permission must be obtained from the author, Mr. Chetankumar G. Shetty, for further work on the idea. He can be reached via e-mail: chetanshettyengineer@gmail.com.

Copyright 2013 by Chetankumar G. Shetty | All Rights Reserved.

Touching a remote object virtually


Chetankumar G. Shetty
chetanshettyengineer@gmail.com

Abstract
This innovation lets people touch objects present at a remote location virtually. Although the user actually touches a virtual copy of the real object, he or she feels as if touching the real object itself, experiencing the sensation of touch even though the object is far away. The innovation builds on two prominent technologies: Advanced Interactive Holographic Display Technology (AIHDT) and the Objects Information Rendering Box (OIRB). Together they show the possibility of touching and feeling any chosen object placed far from the point of contact.

Keywords: AIHDT, OIRB, Palm Tracking Mechanism, Ultrasound Transducer

1. Introduction
This technology requires Advanced Interactive Holographic Display Technology (AIHDT) [1] and an Objects Information Rendering Box (OIRB) as its main constituents. The AIHDT [1] is suitably modified: its IR-pass cameras are removed, and cameras are placed so as to track the user's palm. The OIRB contains eight webcams that capture pictures of any object placed within it and send them to the user's computer. The user (say, user A) chooses an object of which another user (say, user B), located at a remote site, should have a virtual perception. Once chosen, the object is placed inside the OIRB, whose cameras capture real-time object information, including images of all exterior sides of the object. These pictures are sent in real time to a computer with suitable software installed. Along with the pictures, the size, mass, texture and surface-thickness information of the object is also taken as input. The software processes the pictures and converts them into a distinct 3D object corresponding to the chosen real object (this can be achieved with a program that combines different reconstruction techniques). The 3D object and its related information are uploaded to the cloud space provided for user A, from which they are downloaded to user B's computer. The virtual object is then displayed at the viewer's side (i.e., user B's side) using AIHDT [1]. When user B tries to touch the object with his or her palm, the ultrasound transducers placed inside the AIHDT box [1] are activated for the regions corresponding to the palm, producing the perception of touching that particular object. In reality, the object is at a remote location with respect to user B, who can touch it from the top, left, bottom and right sides.
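The capture-to-display data flow described above can be sketched as follows. Every name here (the ObjectInfo schema, reconstruct_3d, upload_to_cloud, the cloud:// handle) is a hypothetical placeholder rather than a published API; the sketch only outlines how the eight OIRB views and the measured object properties move from user A's side to the cloud.

```python
from dataclasses import dataclass

@dataclass
class ObjectInfo:
    """Properties captured by the OIRB alongside the images (hypothetical schema)."""
    images: list                  # eight exterior views from the OIRB webcams
    size_mm: tuple                # (width, height, depth)
    mass_g: float
    texture: str
    surface_thickness_mm: float

def reconstruct_3d(info: ObjectInfo) -> dict:
    """Placeholder for the multi-view 3D reconstruction step at user A's PC."""
    return {"mesh": "reconstructed-from-%d-views" % len(info.images),
            "props": info}

def upload_to_cloud(vo: dict, user: str) -> str:
    """Placeholder: push the 3D virtual object to user A's cloud space and
    return a handle from which user B's PC can download it."""
    return "cloud://%s/vo" % user

# Data flow: OIRB capture -> 3D reconstruction -> cloud -> viewer download
info = ObjectInfo(images=["view%d.png" % i for i in range(8)],
                  size_mm=(100, 50, 30), mass_g=250.0,
                  texture="smooth", surface_thickness_mm=2.0)
handle = upload_to_cloud(reconstruct_3d(info), user="userA")
```

In a real system the reconstruction step would be the heavy part (multi-view stereo over the eight images); the sketch only fixes the shape of the data passed between stages.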
When the user touches the object from a given side, the palm is detected and traced with the help of cameras placed on the opposite side of the AIHDT box [1]. The position of the palm is tracked using the palm tracking mechanism; the program uses this mechanism to detect whether the person is touching the bottom, left, top or right side of the object, and accordingly the corresponding ultrasound transducer matrix house is activated. These transducers exert pressure on the user's palm based on pre-computed values derived from information about the real object, such as mass and surface thickness, supplied by the software.
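The paper does not specify how the pre-computed pressure values are derived from the object's properties. As a purely illustrative sketch, one could map mass and surface thickness linearly onto a normalized transducer drive level; the function name, the linear form, and both scaling constants below are assumptions, not values from the paper.

```python
def transducer_pressure(mass_g: float, surface_thickness_mm: float,
                        k_mass: float = 0.004, k_thickness: float = 0.05) -> float:
    """Illustrative linear model: heavier objects with thicker surfaces map to
    a stronger ultrasound output, clamped to the transducer's [0, 1] drive range.
    The coefficients are arbitrary placeholders for calibration constants."""
    level = k_mass * mass_g + k_thickness * surface_thickness_mm
    return min(max(level, 0.0), 1.0)
```

Any real mapping would have to be calibrated against perceived force on the palm; the clamp simply reflects that a transducer cannot be driven beyond its maximum output.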


Possible instances of events are as follows. If the user touches the left side of the object, the right ultrasound transducer matrix house is activated; if the right side, the left matrix house; if the bottom, the top matrix house; and if the top, the bottom matrix house. In every case, based on the relative position of the object and the palm, only the corresponding ultrasound transducers within that particular matrix are activated. When user B tries to touch both the top and bottom of the virtual object simultaneously, the ultrasound pressure originating from the top side of the AIHDT box [1] is blocked by the hand touching the top of the virtual object, so the hand held at the bottom senses no pressure; likewise, the hand at the bottom blocks the waves originating from the bottom side, so the top hand also feels no sensation. The same occlusion occurs when user B touches the left and right sides simultaneously: each hand blocks the ultrasound waves originating from its own side of the box, and neither hand senses any pressure.
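The four activation cases and the two-hand occlusion rule can be condensed into a small dispatch sketch. The function and the set-based representation are illustrative; in the real system the touched sides would come from the palm tracking mechanism, and activation would further select individual transducers within the chosen matrix house.

```python
# Touched side of the virtual object -> opposite transducer matrix house,
# since the pressure must arrive from across the object.
OPPOSITE = {"left": "right", "right": "left", "top": "bottom", "bottom": "top"}

def active_matrices(touched_sides):
    """Return the set of transducer matrix houses to drive for the given
    touched sides. Opposing pairs (top+bottom, left+right) cancel out:
    each palm blocks the ultrasound path to the other palm, so neither
    side's matrix produces a felt sensation."""
    sides = set(touched_sides)
    for a, b in (("top", "bottom"), ("left", "right")):
        if a in sides and b in sides:      # mutual occlusion by the two hands
            sides -= {a, b}
    return {OPPOSITE[s] for s in sides}
```

For example, touching only the left side activates the right matrix house, while touching top and bottom together activates nothing, matching the occlusion behaviour described above.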

Fig. 1. Representation of touching a remote object virtually.

The above figure defines the chronological order of events that occur while using this technology. When user A places the Real Object (RO) inside the Objects Information Rendering Box (OIRB), the OIRB collects all information related to the RO, such as texture, surface thickness, mass and several sets of pictures captured by the webcams inside the OIRB. The software installed on user A's PC converts this information into a distinct 3D Virtual Object (3D VO). The 3D VO is uploaded to the cloud space allocated for user A and subsequently downloaded to user B's PC, where it is displayed as a Holographic Virtual Object (HVO) inside the AIHDT box [1] at user B's end. Now, when user B tries to touch this HVO, he or she feels a tactile sensation equivalent to that created while

touching the RO. With further additions and modifications of the system, the pressure created by the transducers could be made to reproduce the feel of the real object on the palm more precisely.

REFERENCES
[1] Chetankumar G. Shetty (2012). Advanced Interactive Holographic Display Technology (AIHDT). International Journal of Applied Research & Studies, 1(2), 227.
