European Congress on Computational Methods in Applied Sciences and Engineering
ECCOMAS 2004
P. Neittaanmäki, T. Rossi, S. Korotov, E. Oñate, J. Périaux, and D. Knörzer (eds.)
Jyväskylä, 24–28 July 2004

VIRTUAL AND AUGMENTED REALITY DEVELOPMENTS FOR ENGINEERING APPLICATIONS

Ulrich Lang*, Uwe Wössner


* Centre for Applied Informatics, University of Cologne
Robert-Koch-Strasse 10, D-50931 Cologne, Germany
e-mail: lang@uni-koeln.de, web page: www.zaik.uni-koeln.de/people/lang

High Performance Computing Centre, University of Stuttgart
Allmandring 30a, D-70550 Stuttgart, Germany
e-mail: woessner@hlrs.de, web page: www.hlrs.de/people/woessner

Key words: visualization, virtual reality, augmented reality, virtual prototyping, simulation
steering.

Abstract. Computing centres such as the High Performance Computing Centre Stuttgart
(HLRS) or the computing centre of the Centre for Applied Informatics at the University of
Cologne (ZAIK/RRZK) support their users in performing complex simulation tasks which
often aim at optimizing new industrial products. Virtual and augmented reality techniques are
means to visualize virtual prototypes of future products and their behaviour. Starting from
scientific visualization, an overview of current developments combining real and virtual
representations of industrial prototypes is given, ranging from water power plants and car
climate layout to architectural optimizations. The integration of these concepts establishes
hybrid prototypes as a combination of virtual and augmented realities with physical
prototypes or mock-ups. Interaction concepts play a major role in providing improved and
intuitive access to such prototypes and their underlying behavioural models.


1 INTRODUCTION
Scientific visualization is a support technology that enables scientists and engineers to
understand complex relationships typically represented by large amounts of data. The
visualization process chain is a part of the overall simulation process chain. Its elements and
their interrelationships represent the characteristics of scientific visualization and its usage in
different application fields. By combining visualization techniques, datasets can be analyzed
and simulation models can be explored. In a certain class of engineering and scientific
domains, properties of an object need to be judged in relation to its shape. As such
properties are typically invisible, visual metaphors are chosen to represent them, overlaid on
or related to the shape of the object, thus allowing such judgments. Such visualizations are
then also used to communicate complex content and to support decision processes. Virtual
and augmented reality techniques not only improve this perception process but also
strengthen and accelerate group decision processes.
Data as an intermediate carrier of information cannot be immediately understood by humans.
Visualization, the process of converting data into visual representations, becomes increasingly
important as the data volumes produced by large computers or measurement devices rise.
With visualizations humans can recognize states, structures and behaviour of objects or
models. Visualizing properties as attributes of 3D objects enables humans to make use of their
evolutionarily established capabilities to discern structures at certain locations or to see spatial
transitions in structures. This accelerates the comprehension of complex structures or enables
it at all. The spatial recognition capabilities are complemented by further capabilities such as
the recognition of movements as well as of the relationships between movements. To make use
of these capabilities, dynamic content needs to be scaled to humanly perceivable time scales.
Combining our visual sense with further senses such as haptics strongly improves and
accelerates the perception process. While virtual prototypes introduce a machine model of a
future product, object or model, hybrid prototypes go one step further by combining physical
properties with virtual representations, enabling a multisensory experience.

2 THE VISUALIZATION PROCESS CHAIN


The visualization process chain, shown in figure 1, starts with the source process, which
either generates or reads data. Instead of a simulation it could also be a measurement process
or the reading of previously stored data. The filter process either selects or samples data,
corrects errors or produces derived data. It is used to extract spatial or data value domains.
In the mapping step, selected visualization methods convert data into abstract visual
representations. A multitude of visualization algorithms exists that implement
different types of mappings, each with its specific capabilities. In most cases the
mapping leads to a collection of geometric primitives such as triangle lists, line lists, point
clouds, etc. This can be combined with textures and material properties of surfaces for an
improved visual impression. Figure 2 shows an example visualization of a fluid flow field.
Particle paths visualize the velocity field of water flowing through a water power plant.
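The four steps of the chain can be sketched in a few lines of code. The following example uses the open-source VTK toolkit and its Python bindings purely as an illustration of the source, filter, mapping and rendering steps (the work described in this paper uses COVISE); the file name and seed parameters are hypothetical.

```python
# Minimal sketch of the source -> filter -> map -> render chain with VTK.
# The dataset is assumed to carry a velocity vector field; file name,
# seed position and propagation length are hypothetical example values.
import vtk

# Source: read a previously stored flow field on an unstructured grid
reader = vtk.vtkXMLUnstructuredGridReader()
reader.SetFileName("turbine_flow.vtu")  # hypothetical dataset

# Filter: seed points for particle traces near the turbine inlet
seeds = vtk.vtkPointSource()
seeds.SetCenter(0.0, 0.0, 0.0)
seeds.SetRadius(0.05)
seeds.SetNumberOfPoints(100)

# Mapping: integrate streamlines through the velocity field
tracer = vtk.vtkStreamTracer()
tracer.SetInputConnection(reader.GetOutputPort())
tracer.SetSourceConnection(seeds.GetOutputPort())
tracer.SetMaximumPropagation(5.0)
tracer.SetIntegrationDirectionToBoth()

mapper = vtk.vtkPolyDataMapper()
mapper.SetInputConnection(tracer.GetOutputPort())
actor = vtk.vtkActor()
actor.SetMapper(mapper)

# Rendering: convert the 3D scene into images and display them
renderer = vtk.vtkRenderer()
renderer.AddActor(actor)
window = vtk.vtkRenderWindow()
window.AddRenderer(renderer)
interactor = vtk.vtkRenderWindowInteractor()
interactor.SetRenderWindow(window)
window.Render()
interactor.Start()
```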


Figure 1: The visualization process chain.

Figure 2: Particle paths in a water turbine flow (IHS, University of Stuttgart).

In the rendering step the 3D scene content is converted into images which are then
displayed. During display, series of images can be viewed as animations showing time
dependent behaviour. In a virtual reality environment the rendering and display steps are
typically combined into one step to speed up processing and reduce reaction times to human
interactions. While 2D images have the deficiency that depth information is lost (see figure 2)
and can only be incompletely regenerated by the human visual system from hints such as
shading, such information is maintained in a virtual environment, giving humans the
capability to truly judge 3D structure and content.
Volume rendering as a special visualization method conceptually integrates the mapping
and rendering steps. Volume rendering bypasses the geometric representations between the
mapping and the rendering step. Its input is a scalar field defined on a three-dimensional grid
which is interpreted as a semitransparent medium. Via transfer functions for transparency and
colouring, a mapping of the scalar values in each volume element (voxel) is performed. These
semitransparent coloured voxels are then superimposed to form an image of the overall
volume.
Multiple algorithms exist to define transfer functions and to accumulate the voxels. The aims
are to detect subtle structures and to reduce processing time. Figure 3 shows a volume
rendering of the internal structures of a metallic engine block. The data has been acquired via
computed tomography.
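As a hedged illustration of how transfer functions drive volume rendering, the following sketch again uses VTK rather than the tools described here; the CT file name and the scalar ranges of the transfer functions are hypothetical example values.

```python
# Sketch of volume rendering with colour and opacity transfer functions.
import vtk

reader = vtk.vtkXMLImageDataReader()
reader.SetFileName("engine_ct.vti")  # hypothetical CT scan on a regular grid

# Transfer functions map each voxel's scalar value to colour and opacity
color = vtk.vtkColorTransferFunction()
color.AddRGBPoint(0.0, 0.0, 0.0, 0.0)      # low densities: dark
color.AddRGBPoint(500.0, 0.9, 0.6, 0.3)    # medium densities: metal-like tint
color.AddRGBPoint(1000.0, 1.0, 1.0, 1.0)   # high densities: bright

opacity = vtk.vtkPiecewiseFunction()
opacity.AddPoint(0.0, 0.0)      # air and noise fully transparent
opacity.AddPoint(300.0, 0.05)   # subtle internal structures faintly visible
opacity.AddPoint(1000.0, 0.8)   # dense material nearly opaque

prop = vtk.vtkVolumeProperty()
prop.SetColor(color)
prop.SetScalarOpacity(opacity)
prop.ShadeOn()

mapper = vtk.vtkSmartVolumeMapper()   # accumulates the semitransparent voxels
mapper.SetInputConnection(reader.GetOutputPort())

volume = vtk.vtkVolume()
volume.SetMapper(mapper)
volume.SetProperty(prop)

renderer = vtk.vtkRenderer()
renderer.AddVolume(volume)
window = vtk.vtkRenderWindow()
window.AddRenderer(renderer)
window.Render()
```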

Figure 3: Volume rendering of material structure in an engine (GE).

In figures 2 and 3 it is largely impossible to judge the distance of the particle paths
or volume elements from the observer. Using the stereoscopic display technique of a virtual
reality environment, the depth information is maintained and thus the 3D structure can be
analysed and understood much better. When the observer moves around the object, motion
parallax is added to further improve the impression.

2.1 Interaction and feedback


In the visualization process chain the interaction activities are separated into multiple
feedback loops. The innermost loop, feeding back into the rendering step, allows modifying the
observer position and orientation, thus enabling free roaming in the 3D scene. The
modification of camera parameters additionally allows zooming into specific details of the
scene. As many simulations produce time dependent results, it is of equal importance to
understand the dynamic behaviour of a system. Human perception requires that the
dynamic behaviour is shown on an appropriate time scale. To allow this, the scene content for
the different time steps needs to be kept in memory, enabling quick switching between them
and thus giving an observer the impression of a smooth change in structure or a smooth
movement of objects. An exploratory visualization process is characterized by a repetitive
display of the dynamic behaviour while changing viewing parameters. An interaction concept
similar to a video recorder is required to slow down the animation speed, step through time or
reverse the direction of animation time.
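A minimal, toolkit-independent sketch of such a video-recorder-like control over precomputed time steps could look as follows; the render_scene callback is a hypothetical placeholder for the actual rendering back end.

```python
# Conceptual sketch of video-recorder style playback of cached time steps.
import time

class TimeStepPlayer:
    def __init__(self, timesteps, fps=10.0):
        self.timesteps = timesteps      # precomputed scene content per time step
        self.index = 0
        self.fps = fps                  # playback speed, adjustable by the user
        self.direction = +1             # +1 forward, -1 reverse
        self.playing = False

    def play(self):
        self.playing = True

    def pause(self):
        self.playing = False

    def reverse(self):
        self.direction *= -1            # reverse the direction of animation time

    def step(self):
        self.index = (self.index + self.direction) % len(self.timesteps)

    def run(self, render_scene):
        while True:
            if self.playing:
                self.step()
            render_scene(self.timesteps[self.index])
            time.sleep(1.0 / self.fps)  # scale dynamics to a perceivable rate
```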
In the next outer feedback loop a user can interact with the parameters of a selected
mapper. Realistic process chains typically consist of multiple filters and mappers as can be
seen in figure 5. Typical interactions with mappers are e.g. the repositioning of a cutting
plane, the definition of new starting positions for particle traces in a flow field or the
definition of a new isovalue for an isosurface. Such interactions are applicable to the
3D visualization of figure 4. Thus a user can identify a specific region where a certain effect
appears or where unusual behaviour occurs.
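As a small illustration of this feedback into the mapping step, the following hedged sketch repositions a cutting plane with VTK, used here only as a stand-in for the actual tools; the file name and plane positions are hypothetical.

```python
# Sketch of interacting with a mapper parameter: moving a cutting plane.
import vtk

reader = vtk.vtkXMLUnstructuredGridReader()
reader.SetFileName("cabin_temperature.vtu")  # hypothetical simulation result

plane = vtk.vtkPlane()
plane.SetOrigin(0.0, 0.0, 0.0)
plane.SetNormal(1.0, 0.0, 0.0)

cutter = vtk.vtkCutter()                     # the mapper stage of this chain
cutter.SetCutFunction(plane)
cutter.SetInputConnection(reader.GetOutputPort())

def move_cut_plane(x, render_window):
    """User interaction: shift the plane; only the downstream part of the
    pipeline re-executes before the scene is redrawn."""
    plane.SetOrigin(x, 0.0, 0.0)
    cutter.Update()
    render_window.Render()
```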

Figure 4: Climate simulation in a car cabin (data courtesy DaimlerChrysler Research).

The next outer loop allows interactions with the filtering steps. Filter parameters enable the
selection of subdomains of a region together with the values defined on these subdomains. In a
search process for interesting effects, the location of such a subdomain is moved consecutively
across the computational domain.
Finally, simulation steering can be introduced via feedback into the input of the simulation
step. Here a user can see the simulation results change as the simulation evolves. With this
immediate feedback, the user can modify boundary conditions and, with a certain delay, see
how the behaviour of the simulated system changes.
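A conceptual sketch of such a steering loop is given below; the socket endpoint and the JSON wire format are purely hypothetical, since COVISE provides its own simulation coupling mechanisms that are not shown here.

```python
# Conceptual sketch of simulation steering: send a changed boundary condition
# to a running simulation and keep collecting results as they arrive.
import json
import socket

STEERING_HOST, STEERING_PORT = "simhost", 9999  # hypothetical endpoint

def send_boundary_condition(name, value):
    """Outermost feedback loop: modify a simulation input parameter."""
    with socket.create_connection((STEERING_HOST, STEERING_PORT)) as conn:
        conn.sendall(json.dumps({"set": {name: value}}).encode() + b"\n")

def receive_results(handle_timestep):
    """Collect result data as it appears and hand it to the visualization."""
    with socket.create_connection((STEERING_HOST, STEERING_PORT)) as conn:
        for line in conn.makefile():
            handle_timestep(json.loads(line))  # e.g. cache for repeated analysis

# Example: raise the inlet temperature and watch the system slowly react
# send_boundary_condition("inlet_temperature", 295.0)
```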
From the inner to the outer loops, the timing requirements for the reaction of the system
become less demanding. When moving through a scene or interacting with other animation
parameters, the system should ideally react within 1/30 of a second to give reality-like
behaviour in VR. This requires an image update rate of at least 30 frames/s, which limits the
complexity of the scene. Modifying mapper parameters can already take longer, especially if
they need to be applied to a whole sequence of time dependent data. In such a case a new
isosurface would e.g. have to be recalculated for all time steps.
Finally, large-scale simulations can take hours or even days to finish. Waiting to see the
modified behaviour of such a system after changing an input parameter is rather impractical.
Instead, the visualization system collects the result data as it appears over time and keeps it
for repeated analysis.

3 COVISE, A VISUALIZATION AND VR PACKAGE


Since the introduction of scientific visualization [1], multiple modular visualization packages
have been developed. A common approach is the description of the visualization task via a
dataflow network paradigm, which reflects the concept of the visualization process chain. A
visual program editor allows configuring the topological relationship of the processing steps
graphically. Exchanged data is depicted as the edges of a graph connecting the processing steps.
Figure 5 shows COVISE [2] as an example of such a package, which has been developed at
the High Performance Computing Centre Stuttgart (HLRS). Other packages with similar
characteristics for desktop usage, such as OpenDX [3], Khoros, AVS or NAG Explorer, also
apply the dataflow network paradigm. Most of these packages allow executing a visualization
process chain across multiple machines in a computer network.
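The following minimal sketch illustrates the dataflow network paradigm itself, with modules as nodes and data objects travelling along edges; it is a conceptual illustration only and not COVISE's actual module API.

```python
# Conceptual dataflow network: a module executes once its inputs are available,
# and its output travels along the outgoing edge to downstream modules.
class Module:
    def __init__(self, name, func, inputs=()):
        self.name, self.func, self.inputs = name, func, list(inputs)

    def execute(self, cache):
        args = [cache[m.name] for m in self.inputs]   # data on incoming edges
        cache[self.name] = self.func(*args)           # data on outgoing edge

def run_network(modules):
    """Execute modules in topological order (assumed to be given)."""
    cache = {}
    for m in modules:
        m.execute(cache)
    return cache

# Example network: ReadData -> ExtractSubdomain -> Isosurface -> Render
read = Module("ReadData", lambda: {"grid": "...", "values": "..."})
filt = Module("ExtractSubdomain", lambda d: d, inputs=[read])
mapp = Module("Isosurface", lambda d: ["triangles"], inputs=[filt])
rend = Module("Render", lambda geom: None, inputs=[mapp])
run_network([read, filt, mapp, rend])
```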

Figure 5: Screen snapshot of COVISE.

4 VIRTUAL REALITY TECHNIQUES FOR 3D VISUALIZATION


A virtual reality impression as described here is produced by a combination of
technologies that give a user the feeling of being immersed in a computer-generated scene. It is
important to cover a large viewing angle, as peripheral view is an essential element of
human perception for having the impression of being inside the virtual world. This can either
be achieved via a head mounted display (HMD) or a projection environment. An augmented
reality (AR) is established with an HMD when camera signals are overlaid with computer-
generated content. Although such devices are not very comfortable to wear, the potential of
this approach is impressive. On the other hand, group-oriented discussion processes are better
accomplished by setting up a CAVE-like environment as shown in figure 6, consisting of at
least three stereoscopic projection walls and a stereo projection floor. To further support an
immersive impression, the displayed world needs to react immediately to movements of the
observer and allow direct interaction with the scene content. A user should be able to grab
objects, move them around and perform other interactions that fit the displayed content
directly and intuitively. COVISE supports both display types in this respect. Figure 6 shows a
person interacting in a CAVE with GIS data of the Zurich area in Switzerland layered over
the terrain model. In figure 7, two scientists discuss the temperature distribution inside a car
cabin produced by a previous simulation. They can directly insert new particles into the air
flow and immediately see the paths they follow.
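As a rough, hedged sketch of such viewer-dependent rendering, the following loop drives the camera from a tracked head position each frame; the read_head_tracker function is a hypothetical placeholder, and a real CAVE additionally requires off-axis projections for each wall, which are omitted here.

```python
# Rough sketch: the camera follows the tracked head position so the virtual
# world reacts immediately to the observer. renderer and render_window are
# assumed to be VTK-style objects (vtkRenderer / vtkRenderWindow or similar).
def read_head_tracker():
    """Hypothetical stand-in for an optical or electromagnetic tracker."""
    return (0.0, 1.7, 2.5)  # head position in metres, CAVE coordinates

def tracking_loop(renderer, render_window):
    camera = renderer.GetActiveCamera()
    while True:                              # one iteration per rendered frame
        x, y, z = read_head_tracker()
        camera.SetPosition(x, y, z)          # eye point follows the observer
        camera.SetFocalPoint(x, y, z - 1.0)  # looking towards the front wall
        render_window.Render()               # aim for at least 30 frames/s
```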

Figure 6: Immersive virtual environment used for GIS terrain visualization.

Depending on the scenario, further sensory information can be very supportive. For a
medical specialist, force feedback is essential during the training of an operation. For an
architect or urban planner, the auditory information within a larger building or street strongly
improves the sensation of being there. In the real world objects can be moved with
constraints, doors can be opened, etc. Users expect objects to behave as they do in the real
world. Therefore time dependent, event driven animations of objects are an essential element
of a virtual environment. This e.g. allows calling an elevator by pushing a button, which
opens and closes its doors and carries users to different levels of a building.

Figure 7: Immersive virtual environment used for CFD visualization.

4.1 Usage of texturing in 3D modelling


Whereas for mechanical engineering mostly different colours on the model surfaces are
sufficient, architectural representations depend on the visual representation of surfaces like
concrete, wood or plaster. As the atmosphere of a visualization mainly depends on these
mappings, a focus has to be put on them. To apply textures as well as to reduce the number of
polygons to allow decent frame rates, modelling and animation tools like 3D Studio MAX are
used. As most of the CAD and modelling packages define their own proprietary file formats, a
common exchange format is required to pass the geometry on to the visualization package.
VRML97, which describes the 3D geometry and behaviour of models for the internet, has
evolved over the last years into this common file format. The COVISE renderer COVER
contains a VRML97 importer with extended capabilities to combine the imported geometry
and mappings with the visualization of measurement or simulation data. VRML97 supports
interaction and animation, which greatly assist users in immersing themselves in the scene.
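As a hedged sketch of how such an exported model can be brought into an interactive scene, the following example uses VTK's VRML importer for the textured geometry; COVER's own importer additionally interprets the behavioural VRML97 nodes, which is not shown here. The file name is hypothetical.

```python
# Sketch: load a VRML97 export (e.g. from a modelling tool) into a 3D scene.
import vtk

window = vtk.vtkRenderWindow()

importer = vtk.vtkVRMLImporter()
importer.SetFileName("showroom.wrl")   # hypothetical VRML97 export
importer.SetRenderWindow(window)
importer.Read()                        # textured geometry becomes part of the scene

renderer = importer.GetRenderer()      # further visualization actors can be added here
interactor = vtk.vtkRenderWindowInteractor()
interactor.SetRenderWindow(window)
window.Render()
interactor.Start()
```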

5 VIRTUAL PROTOTYPING ENVIRONMENT


Combining all the elements described above allows presenting engineering design and
development processes using the concept of virtual prototypes. Such virtual prototypes
integrate geometry and behaviour in a computer representation while allowing a user to
interact with them as if they were real. Engineers can change parameters of the air
conditioning of a future car as if they were sitting in the real car. With online simulation
coupling, the model reacts to changes of the dashboard opening.
Users need not only focus on the layout and optimization of products but can furthermore
concentrate on the development of usage concepts and profiles, for example of buildings or cars.
Layout concepts direct the focus of interest to certain areas of a technical product; they e.g.
avoid distracting a driver from the road while providing complementary information. In a
virtual environment it can be shown how to guide people through buildings or how to make
users feel comfortable there. Mostly, combined approaches are used, with maps and elements
representing design concepts as well as animations explaining the concept.

5.1 Virtual prototype for car climate layout


In the framework of the European Community funded project VISiT (Virtual intuitive
simulation testbed), multiple virtual prototyping scenarios from different European companies
have been implemented and evaluated. The climate optimization for future cars of
DaimlerChrysler was one such scenario. In a CAVE it became possible to enter a virtual car
cabin, to interact with the dashboard openings to change the amount and direction of inflowing
air as well as the temperature (see figure 8), and to insert new openings at the dashboard as
well as in the legroom and interact with them as well. Additionally, seats could be moved and
different types of drivers and passengers could be selected to provide realistic variations of the
climatisation conditions.

Figure 8: Virtual prototype of a car climate layout.

All modifications of and interactions with the model were immediately provided to
simulation codes running in the background. Within minutes, engineers could see the modified
behaviour of the virtual prototype and thus optimize it. Additionally, discussions among
specialists about the behaviour of the prototype are supported and therefore held with much
more intensity. To implement this concept, all simulation tools had to be hidden from the user
and were only accessible via a virtual reality user interface. The user interface as well as
the simulation and support tools were all implemented in COVISE as the software integration
platform.

5.2 Virtual auto house


DaimlerChrysler is also promoting the application of interactive visualizations in
architecture. To design a new generation of auto houses (car dealerships), virtual reality has
been used from the very beginning of developing the general building concept up to the final
projects. Team meetings involving many disciplines are held in the CAVE to discuss the
architecture and its impact at 1:1 scale. Architects, brand managers, sales specialists, event
designers, marketing specialists, artists, simulation experts (e.g. airflow, temperature) and even
potential customers discuss the project in its 1:1 representation. Figure 9, left, shows particle
paths of a climate simulation visualizing the air flow in a planned auto house. On the right, the
temperature distribution above the ground floor of the building is shown.

Figure 9: DaimlerChrysler auto house in a virtual environment.

This way of working is part of the communication concept "MarkenStudio" [4], in which
visualization plays a major role. It could be observed that as soon as there is "something to
look at" and ideas are being visualized, it is much easier to achieve a commonly agreed
meeting result. The participants are more willing to change their "point of view" and to
understand and accept the ideas of others. Additionally, virtual reality assists
in reaching a high degree of planning certainty at an early stage.

To allow different users appropriate interaction with the model, different interaction
methods have been implemented, such as a colour picker (changing the colour of walls, floors
or ceilings interactively), a texture picker (changing textures on the fly to judge the right
material), an exhibition designer (creating, placing and modifying exhibitions interactively) or
switching through variations. As understanding takes a certain time, it became apparent that
many ideas could be communicated much better with animations.


6 HYBRID PROTOTYPES
Hybrid prototypes combine physical representations of objects and physical feedback with
computer generated information to analyse the behaviour or properties of future products.
Grabbing an object, feeling its weight while moving it, or feeling force feedback gives much
more direct and intuitive access than a purely visual virtual reality. Figure 10 shows the
usage of an HMD with two video cameras attached to its front, delivering a stereo image
(see the left and right eye views on the background monitors). With proper registration it is
possible to overlay these video images with the visualization of a deep drawing process for
sheet metal. The determination of the proper position and orientation is based on the
recognition of known markers in the video image using the AR Toolkit software [5]. The
visualization and rendering are done with COVISE and COVER. When the scientist rotates or
moves the object, the time dependent visualization of the deforming metal sheet follows it.
This type of interaction is immediately obvious to practitioners from many different
application fields. Thus complex simulations can provide an immediate benefit for them.
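A conceptual sketch of the registration step follows. It uses OpenCV's generic PnP solver purely as an illustration of how a marker's image corners yield the camera pose; the AR Toolkit itself provides this through its own C API, and the corner coordinates, marker size and camera intrinsics below are hypothetical example values.

```python
# Registration sketch: recover the pose of a known square marker from its
# detected 2D image corners, then use it to align rendered data with the
# physical prototype. All numeric values are hypothetical examples.
import numpy as np
import cv2

MARKER_SIZE = 0.08  # marker edge length in metres (assumed)

# 3D corner coordinates of the marker in its own coordinate system
object_points = np.array([
    [-MARKER_SIZE / 2,  MARKER_SIZE / 2, 0.0],
    [ MARKER_SIZE / 2,  MARKER_SIZE / 2, 0.0],
    [ MARKER_SIZE / 2, -MARKER_SIZE / 2, 0.0],
    [-MARKER_SIZE / 2, -MARKER_SIZE / 2, 0.0],
], dtype=np.float64)

# 2D corners as detected in the camera image (example values, in pixels)
image_points = np.array([[310.0, 220.0], [400.0, 225.0],
                         [395.0, 315.0], [305.0, 310.0]], dtype=np.float64)

# Intrinsic camera parameters from a prior calibration (assumed values)
camera_matrix = np.array([[800.0, 0.0, 320.0],
                          [0.0, 800.0, 240.0],
                          [0.0, 0.0, 1.0]])
dist_coeffs = np.zeros(5)

ok, rvec, tvec = cv2.solvePnP(object_points, image_points,
                              camera_matrix, dist_coeffs)
# rvec/tvec give the marker pose relative to the camera; this transform
# aligns the rendered simulation data with the physical prototype.
print(ok, rvec.ravel(), tvec.ravel())
```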

Figure 10: AR visualization of a simulated metal forming process around a physical prototype.

Another combination of physical and virtual representations is used in the hybrid prototype
of figure 11. There, a car seat box with an interaction element positioned at the proper
distance from the dashboard opening is used. The seat box is placed in a CAVE, which
allows visualizing the air flow in the car cabin. This air flow is simulated based on the
orientation of the dashboard opening. The engineer using the set-up wears an HMD and thus
sees a combined view of the CAVE via the video cameras, overlaid with particle flows
added as computer-generated elements.


Figure 11: Physical car seat with interaction element of the dashboard air inlet.

CONCLUSIONS
Virtual and augmented reality techniques are still under development, and their improvements
during the last years have been very impressive. Many new application scenarios are possible,
but for each of them evaluations of usability and acceptance by humans are required. The
devices used are still far from an acceptable ergonomic state, but the shrinking sizes and
reduced weights promise an acceptance for certain applications that is currently not yet
reached. As the development of software techniques often takes longer than hardware
development cycles, it is important to start software developments early.

REFERENCES
[1] McCormick B., DeFanti T., Brown M., "Visualization in Scientific Computing", Computer
Graphics, Vol. 21, No. 6, pp. 1-14, 1987.
[2] Rantzau D., Frank K., Lang U., Rainer D., Wössner U., "COVISE in the CUBE: an
environment for analyzing large and complex simulation data", Proc. of the 2nd
Workshop on Immersive Projection Technology, 1998.
[3] OpenDX: http://www.opendx.org/
[4] Drosdol J., Kieferle J., Wierse A., Wössner U., "Interdisciplinary cooperation in the
development of customer-oriented brand architecture", Proc. of Trends in Landscape
Modeling, Dessau, pp. 264-269, 2003.
[5] AR Toolkit: http://www.hitl.washington.edu/research/shared_space/

