
Mixing Virtuality and Reality –

Authoring Mixed Reality Applications with Blender


Paul Grimm
University of Applied Sciences Erfurt, Erfurt, Germany
grimm@fh-erfurt.de

Abstract

This paper presents an approach for how an extended version of Blender can support the development of content-rich Mixed Reality (MR) applications. Combining virtual and real worlds is complex and time-consuming, but many problems in building 3D geometries and in building MR applications are similar. Therefore, Blender is a natural candidate for being part of an MR authoring environment. Based on the requirements of MR applications, it is shown how Blender can be extended to fulfill these requirements (e.g. by integrating a tracking system and a video texture). A proof-of-concept implementation is presented, as well as an outlook on how MR technologies could be used to ease geometric modeling in Blender.

Introduction

Mixed Reality was defined by Milgram and Kishino as the merging of real and virtual worlds somewhere along the "virtuality continuum" which connects completely real environments to completely virtual ones [1]. In addition to the complex authoring process needed to build an interactive 3D application, several further tasks are required to integrate MR features into an application. Mainly four additional tasks are necessary. The first task is to integrate tracking devices (e.g. mechanical or vision-based) into the runtime framework in order to use the real world as an input device. This integration also includes providing techniques for stable calibration in order to adapt to changing hardware as well as changing conditions, e.g. different lighting conditions. Second, relationships between real and virtual objects have to be administrated; this includes describing geometric as well as logical dependencies between real and virtual objects. Third, merging of the real and the virtual world has to be enabled. Fourth, visual correctness should be reached; this includes correct occlusion between real and virtual objects, as well as virtual objects casting shadows into the real world and vice versa.

Existing technologies for MR application creation facilitate development on different levels of abstraction. These range from purely library-type technologies, which require programming skills, over script-based technologies, to out-of-the-box technologies, which allow development with dedicated visual editors. Examples are AMIRE [2][3], Arvika [4], DART [5], DWARF [6], Tinmith [7], Studierstube with APRIL [8] and MARS [9]. However, no single tool exists that can be used both while authoring the content and while running the application.

Concept

Blender's unique combination of a modeling tool with an integrated game engine as runtime environment allows its usage as an authoring tool as well as a runtime framework. To support the described authoring process of MR applications on a technical level, two Blender extensions, one for tracking and one for merging, are necessary. To fulfill the requirements of the first task (tracking), a marker-based tracking library was added to Blender. To allow merging of real video streams with the virtual world, a video texture was added. The second task (administration of relationships) and the fourth task (achieving visual correctness) are accomplished using Blender's visual logic brick editor as well as its Python integration. Both extensions (tracking and merging) are provided as Python modules written in C++. Compared to a pure C++ extension of Blender, this approach later allows easy access either from the Python scripting interface or even as a logic brick in the visual editor. For the development of the Python modules, CXX [10] was used.

Fig 1: Live video texture on a cube

The ARToolkit [11] is used for tracking and the frame grabber library [12] for video capturing. With this library, live video as well as video playback can be used. Both plug-ins are accessible through special Blender nodes. The position and orientation of a tracking node (called MRBTrackingNode) is controlled by the tracking extension. Each MRBTrackingNode can be used as parent node for one or more Blender nodes. Thus, it is possible to assign a geometry node to a marker. A property holding the filename of a marker specifies which marker is assigned to which MRBTrackingNode. The video is shown via an extended plane node (called MRBVideoPlane). It is displayed as a video texture on the plane and can be combined with arbitrary objects (see Fig 1). A property controls whether a live video stream or a recorded one is shown.
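Since the paper only states that the extensions are exposed as CXX-built Python modules, a minimal sketch of what calling the tracking module from Blender's scripting interface might look like is given below; the module name MRBTracking and all of its functions are illustrative assumptions, not the actual API:

    # Hypothetical usage sketch: the C++ tracking extension exposed as a
    # Python module. Module and function names are assumptions; the paper
    # only states that the extensions are built as Python modules via CXX.
    import MRBTracking

    # Initialize the marker-based tracker (ARToolkit underneath),
    # here with the id of the capture device.
    MRBTracking.init(0)

    # Query the pose of a marker by its pattern file name; such a
    # transform is what would drive the corresponding MRBTrackingNode.
    pose = MRBTracking.getMarkerPose("patt.hiro")
    if pose is not None:
        position, orientation = pose
        print "marker at", position  # Python 2.x, as shipped with Blender then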
Demonstration

This example shows how an MR application can be built using the presented extended Blender. Step by step, a model of the city of Frankfurt is created. It is based on a satellite image and is augmented using a tangible interface.

First, you have to model or import the geometries of the buildings and insert MRBTrackingNodes into the scene hierarchy. Second, you have to use the tracking nodes as parent nodes for the geometry nodes (see Fig 2). Third, you have to assign one marker to each MRBTrackingNode by specifying the filename of its marker file.

Fig 2: Hierarchy within a Blender MR scene
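The same setup could in principle be scripted instead of done in the GUI. The following sketch mirrors these steps with the Blender 2.4x Python API; the object names and the property name "marker" are assumptions, since the paper describes the steps as editor actions:

    # Illustrative sketch of the demonstration's scene setup via
    # Blender's Python API. Object and property names are assumptions.
    import Blender

    # Fetch the tracking node and a building geometry by name.
    tracker = Blender.Object.Get("MRBTrackingNode.001")
    building = Blender.Object.Get("Building.001")

    # Step 2: use the tracking node as parent of the geometry node.
    tracker.makeParent([building])

    # Step 3: assign a marker to the MRBTrackingNode via a property
    # holding the marker's pattern file name.
    tracker.addProperty("marker", "patt.building01", "STRING")

    Blender.Redraw()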

Whenever one special marker is visible, the texture of the nearest building should change. Therefore, you have to describe such dependencies in the application logic. The easiest way to accomplish this task is to use the logic brick editor of Blender (see bottom of Fig 3). In addition, it is also possible to use Python as a scripting language for more complex scenes. One marker is used to change the orientation of the model (see Fig 4).
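For the texture change, a Python controller wired into the logic bricks could look roughly like the following sketch; the sensor and actuator names are assumptions, while the calls follow the game-engine Python API of the Blender 2.x series:

    # Sketch of a Python controller attached to the logic bricks: when
    # the sensor fires (e.g. because the special marker became visible),
    # an actuator swapping the nearest building's texture is triggered.
    # Brick names are assumptions.
    import GameLogic

    cont = GameLogic.getCurrentController()

    # Sensor and actuator as named in the logic brick editor.
    marker_seen = cont.getSensor("markerVisible")
    swap_texture = cont.getActuator("swapTexture")

    if marker_seen.isPositive():
        GameLogic.addActiveActuator(swap_texture, True)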
Fig 3: Behavior description with Blender logic bricks

While editing an MR scene, it is possible to immediately preview and test the scene within the same authoring tool by starting the game engine with the 'p' key in Blender (see Fig 4). The blenderplayer can be used to deploy the result as a standalone application.

Fig 4: Interaction within an example demonstration

Conclusion

Based on the integration of a tracking system and of video textures, it was shown how Blender can be used to build content-rich MR applications.

As soon as hardware shaders become available within Blender, the video texture will be extended by a color-keying shader.

Currently, ARToolkit is used as the tracking technology. To allow the development of a broader range of applications, one of the next steps is to integrate alternative tracking technologies (e.g. OpenTracker [13]).

The vision is to also use the tracking data for the modeling itself (compare with [7]). In combination with new input devices, a very powerful and intuitive user interface could be built.

Acknowledgements

Many thanks to Bastian Birnbach, Clemens Sutor and Tobias Tost for their contributions.

References

[1] P. Milgram, F. Kishino: A Taxonomy of Mixed Reality Visual Displays, IEICE Transactions on Information Systems, Vol. E77-D, No. 12, pp. 1321-1329, December 1994.
[2] P. Grimm, M. Haller, V. Paelke, S. Reinhold, C. Reimann, J. Zauner: AMIRE - Authoring Mixed Reality, in R. Berry et al. (Eds.): Proceedings of the First IEEE International Augmented Reality Toolkit Workshop, Darmstadt, Germany, IEEE Catalog Number 02EX632, ISBN 0-7803-7680-3, 2002.
[3] AMIRE website, http://www.amire.net
[4] W. Friedrich (Ed.): ARVIKA, Publicis, Erlangen, 2004.
[5] M. Gandy, S. Dow, B. MacIntyre: Prototyping Applications with DART, The Designer's Augmented Reality Toolkit, Physical World Workshop at IEEE Pervasive Computing, 2004.
[6] M. Bauer, B. Bruegge, G. Klinker, A. MacWilliams, T. Reicher, S. Riss, C. Sandor, M. Wagner: Design of a Component-Based Augmented Reality Framework, Proceedings of ISAR 01, 2001.
[7] W. Piekarski, B. H. Thomas: An Object-Oriented Software Architecture for 3D Mixed Reality Applications, Proceedings of ISMAR 03, 2003.
[8] F. Ledermann, D. Schmalstieg: Tools and Techniques for MR Authoring, Proceedings of ISMAR 04, 2004.
[9] S. Güven, S. Feiner: Authoring 3D Hypermedia for Wearable Augmented and Virtual Reality, Proceedings of the ACM International Symposium on Wearable Computers, 2003.
[10] CXX Python-C++ Connection, http://cxx.sourceforge.net/
[11] H. Kato, M. Billinghurst: Marker Tracking and HMD Calibration for a Video-Based Augmented Reality Conferencing System, Proceedings of IWAR 99, 1999.
[12] Frame Grabber Library, http://sourceforge.net/projects/libframegrabber
[13] OpenTracker homepage, http://www.studierstube.org/opentracker/
