Disclaimer
The views expressed in this paper/article correspond to the views of the author (Mr. Chetankumar G. Shetty). The research matter articulated in this paper/article belongs entirely to the author. The paper/article is published at www.chetangshetty.com. Any unauthorized broadcasting in any form or by any means, electronic or mechanical, including photocopying, recording by any information storage and retrieval system, public performance, copying or re-recording, will constitute an infringement of copyright. Prior written permission must be obtained from the author, Mr. Chetankumar G. Shetty, for further work on the idea. He can be reached by e-mail: chetanshettyengineer@gmail.com.
1. Introduction
This technology requires Advanced Interactive Holographic Display Technology (AIHDT) [1] and an Objects Information Rendering Box (OIRB) as its main constituents. AIHDT [1] is suitably modified: the IR-pass cameras are removed, and cameras are placed so as to track the user's palm. The OIRB contains eight webcams which capture pictures of any object placed within it and send these pictures to the user's computer. The user (say user A) chooses an object of which he/she requires another user (say user B), located at a remote location, to have a virtual perception. Once the object is chosen, it is placed inside the OIRB, whose cameras capture real-time object information, including images of all the exterior sides of the object. These pictures are sent in real time to a computer with suitable software installed. Along with the pictures, the size, mass, texture and surface-thickness information of the object is also taken as input. The installed software processes these pictures and converts them into a distinct 3D object corresponding to the chosen real object (this can be achieved using a program that employs different reconstruction techniques). This 3D object, together with its related information, is uploaded to the cloud space provided for user A, from which it is downloaded at user B's computer. The virtual object is displayed at the viewer's side (i.e. at user B's side) using AIHDT [1]. When user B tries touching the object with his/her palm, the ultrasound transducers placed inside the AIHDT box [1] are activated for the regions corresponding to the palm, thereby producing the perception of touch for that particular object, even though in reality the object is present at a location remote from user B. User B can touch the object from the top, left, bottom and right sides.
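The capture-to-display pipeline described above can be sketched in code as follows. This is only an illustrative outline under stated assumptions: the names `ObjectInfo`, `build_virtual_object`, and the cloud upload/download stubs are hypothetical stand-ins for the actual OIRB software and cloud service, and no real 3D reconstruction is performed here.

```python
from dataclasses import dataclass


@dataclass
class ObjectInfo:
    """Hypothetical record of what the OIRB captures for one real object.

    In the described system, the eight webcams supply the images, while
    size, mass, texture and surface thickness are taken as input alongside.
    """
    images: list                # pictures from the eight OIRB webcams
    size_cm: tuple              # (width, height, depth)
    mass_g: float
    texture: str
    surface_thickness_mm: float


def build_virtual_object(info: ObjectInfo) -> dict:
    """Convert captured OIRB data into a 3D Virtual Object (3D VO) payload.

    The real software would run multi-view 3D reconstruction on the images;
    this sketch only bundles the data that user B's side needs.
    """
    if len(info.images) != 8:
        raise ValueError("the OIRB provides exactly eight webcam views")
    return {
        "views": info.images,   # stand-in for the reconstructed 3D object
        "size_cm": info.size_cm,
        "mass_g": info.mass_g,
        "texture": info.texture,
        "surface_thickness_mm": info.surface_thickness_mm,
    }


def upload_to_cloud(user: str, virtual_object: dict, cloud: dict) -> None:
    """Place the 3D VO in the cloud space allocated for the given user."""
    cloud[user] = virtual_object


def download_from_cloud(user: str, cloud: dict) -> dict:
    """User B's PC fetches the 3D VO from user A's cloud space."""
    return cloud[user]
```

For example, user A's side would construct an `ObjectInfo` from the eight captured frames and the measured properties, call `build_virtual_object`, and upload the result; user B's side then downloads it for display in the AIHDT box.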
When user B touches the object from a given side, the palm is detected and traced with the help of the cameras placed at the opposite side of the AIHDT box [1]. The position of the palm is tracked using the palm-tracking mechanism; the program uses this mechanism to detect whether the person is touching the bottom, left, top or right side of the object, and the corresponding ultrasound-transducer matrix housing is activated accordingly. These transducers exert pressure on the user's palm based on pre-computed values, derived by the software from information about the real object such as its mass, surface thickness, etc.
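The side-detection and transducer-activation step can be sketched as below. Both the side-classification rule and the pressure formula (including the calibration constant `k`) are assumptions made for illustration, not the actual computation used by the software.

```python
def detect_touched_side(palm_xy, obj_center, obj_half_w, obj_half_h):
    """Classify which side of the holographic object the palm touches.

    palm_xy and obj_center are (x, y) positions in the AIHDT box plane,
    as recovered by the palm-tracking cameras on the opposite wall.
    Assumed rule: pick the axis along which the palm's displacement,
    relative to the object's extent, is largest.
    """
    dx = palm_xy[0] - obj_center[0]
    dy = palm_xy[1] - obj_center[1]
    if abs(dx) / obj_half_w >= abs(dy) / obj_half_h:
        return "right" if dx >= 0 else "left"
    return "top" if dy >= 0 else "bottom"


def transducer_pressure(mass_g, surface_thickness_mm, k=0.01):
    """Pre-computed pressure setpoint for the active transducer matrix.

    Illustrative assumption: heavier objects with thicker surfaces feel
    firmer, so pressure scales with mass and surface thickness through a
    hypothetical calibration constant k.
    """
    return k * mass_g * surface_thickness_mm


def activate_matrix(palm_xy, obj_center, half_w, half_h, mass_g, thickness_mm):
    """Return which transducer matrix housing to drive, and at what level."""
    side = detect_touched_side(palm_xy, obj_center, half_w, half_h)
    return side, transducer_pressure(mass_g, thickness_mm)
```

A palm tracked at (5, 0) against an object centred at the origin with half-extents (2, 2) would activate the right-side matrix; the pressure value is then looked up for that side's transducer array.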
Fig. 1 Representation of touching a remote object virtually.

The figure above defines the chronological order of events that occur while using this technology. When user A keeps the Real Object (RO) inside the Object Information Rendering Box (OIRB), the OIRB collects all the information related to the RO, such as its texture, surface thickness, mass and several sets of pictures captured by the webcams inside the OIRB. The software installed on user A's PC converts this information into a distinct 3D Virtual Object (3D VO). This 3D VO is uploaded to the cloud space allocated for user A and subsequently downloaded at user B's PC, where it is displayed as a Holographic Virtual Object (HVO) inside the AIHDT box [1] located at user B's end. Now, when user B tries to touch this HVO, he/she feels a sensation of touch equivalent to the tactile sensation created while