European Congress on Computational Methods in Applied Sciences and Engineering
ECCOMAS 2004
P. Neittaanmäki, T. Rossi, S. Korotov, E. Oñate, J. Périaux, and D. Knörzer (eds.)
Jyväskylä, 24-28 July 2004
Key words: visualization, virtual reality, augmented reality, virtual prototyping, simulation
steering.
Abstract. Computing centres such as the High Performance Computing Centre Stuttgart (HLRS) or the computing centre of the Centre for Applied Informatics at the University of Cologne (ZAIK/RRZK) support their users in performing complex simulation tasks, which often aim at optimizing new industrial products. Virtual and augmented reality techniques are means to visualize virtual prototypes of future products and their behaviour. Starting from scientific visualization, an overview of current developments combining real and virtual representations of industrial prototypes will be given, ranging from water power plants and car climate layout to architectural optimizations. The integration of these concepts establishes hybrid prototypes as a combination of virtual and augmented realities with physical prototypes or mock-ups. Interaction concepts play a major role in providing improved and intuitive access to such prototypes and their underlying behavioural models.
Ulrich Lang, Uwe Wößner
1 INTRODUCTION
Scientific visualization is a support technology that enables scientists and engineers to understand complex relationships typically represented by large amounts of data. The visualization process chain is part of the overall simulation process chain. Its elements and their interrelationships represent the characteristics of scientific visualization and its usage in different application fields. By combining visualization techniques, datasets can be analyzed and simulation models can be explored. In a certain class of engineering and scientific domains, the properties of an object need to be judged in relation to its shape. As such properties are typically invisible, visual metaphors are chosen to represent them overlaid on or related to the shape of the object, thus allowing such judgments. Such visualizations are then also used to communicate complex content and to support decision processes. Virtual and augmented reality techniques not only improve this perception process but also strengthen and accelerate group decision processes.
Data, as an intermediate carrier of information, cannot be immediately understood by humans. Visualization, the process of converting data into a visual representation, becomes increasingly important as the data volumes produced by large computers or measurement devices rise. With visualizations, humans can recognize states, structures and behaviour of objects or models. Visualizing properties as attributes of 3D objects enables humans to make use of their evolutionarily established capabilities to discern structures at certain locations or to see spatial transitions in structures. This accelerates the comprehension of complex structures, or enables it in the first place. The spatial recognition capabilities are complemented by further capabilities such as the recognition of movements and of the relationships between movements. To make use of these capabilities, dynamics in the content need to be scaled to humanly perceivable time scales. Combining our visual sense with further senses such as haptics strongly improves and accelerates the perception process. While virtual prototypes introduce a machine model of a future product, object or model, hybrid prototypes go one step further by combining physical properties with virtual representations, enabling a multisensory experience.
In the rendering step the 3D scene content is converted into images, which are then displayed. During display, series of images can be viewed as animations showing time-dependent behaviour. In a virtual reality environment the rendering and display steps are typically combined into one step to speed up processing and to reduce reaction times to human interactions. While 2D images have the deficiency that depth information is lost (see figure 2) and can only be incompletely regenerated by the human visual system from hints such as shading, such information is maintained in a virtual environment, giving humans the capability to truly judge 3D structure and content.
Volume rendering is a special visualization method that conceptually integrates the mapping and rendering steps, bypassing the geometric representation between them. Its input is a scalar field defined on a three-dimensional grid, which is interpreted as a semitransparent medium. Via transfer functions for transparency and colouring, a mapping of the scalar value in each volume element (voxel) is performed. These
semitransparent coloured voxels are then superimposed to form an image of the overall
volume.
Multiple algorithms exist to define transfer functions and to accumulate the voxels. The aims are to reveal subtle structures and to reduce processing time. Figure 3 shows a volume rendering of the internal structures of a metal engine block. The data were acquired via computed tomography.
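The transfer-function mapping and voxel accumulation described above can be sketched as front-to-back alpha compositing along a single viewing ray. The ramp-style transfer function below is a made-up example for illustration, not one used by the authors:

```python
import numpy as np

def transfer_function(scalar):
    """Map a scalar sample in [0, 1] to (rgb, alpha).
    Hypothetical ramp: low values transparent blue, high values opaque red."""
    s = float(np.clip(scalar, 0.0, 1.0))
    rgb = np.array([s, 0.2 * s, 1.0 - s])
    alpha = s ** 2          # emphasize dense material
    return rgb, alpha

def composite_ray(samples):
    """Front-to-back alpha compositing of scalar samples along one ray."""
    color = np.zeros(3)
    remaining = 1.0         # transparency accumulated so far
    for s in samples:
        c, a = transfer_function(s)
        color += remaining * a * c
        remaining *= (1.0 - a)
        if remaining < 1e-3:    # early ray termination
            break
    return color, 1.0 - remaining   # accumulated colour and opacity

# A ray passing through empty space, then a dense region:
rgb, opacity = composite_ray([0.0, 0.1, 0.9, 0.9, 0.2])
```

Real volume renderers apply this per ray (or per slice) over the whole voxel grid; the principle of weighting each voxel's colour by the transparency accumulated in front of it is the same.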
In figures 2 and 3 it is largely impossible to judge the distance of the particle paths or volume elements from the observer. With the stereoscopic display technique of a virtual reality environment, the depth information is maintained, and thus the 3D structure can be analysed and understood much better. When the observer moves around the object, motion parallax is added, further improving the depth impression.
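How a stereoscopic display encodes depth can be illustrated with a toy pinhole projection of a point onto a fixed screen plane from two eye positions. The geometry and the 6.5 cm interocular distance are illustrative assumptions, not the projection model of any particular VR system:

```python
import numpy as np

def project(point, eye, screen_z=0.0):
    """Project a 3D point onto the screen plane z = screen_z,
    as seen from the given eye position (simple pinhole model)."""
    direction = point - eye
    t = (screen_z - eye[2]) / direction[2]
    hit = eye + t * direction
    return hit[:2]

def stereo_disparity(point, head, iod=0.065):
    """Horizontal disparity between left- and right-eye projections.
    iod: interocular distance in metres (assumed 6.5 cm)."""
    left = head + np.array([-iod / 2, 0.0, 0.0])
    right = head + np.array([iod / 2, 0.0, 0.0])
    return project(point, left)[0] - project(point, right)[0]

head = np.array([0.0, 0.0, 2.0])   # viewer 2 m in front of the screen
near = stereo_disparity(np.array([0.0, 0.0, -0.5]), head)  # behind screen
far = stereo_disparity(np.array([0.0, 0.0, -5.0]), head)   # far behind screen
```

The disparity grows in magnitude with distance behind the screen and changes sign for points in front of it, which is exactly the cue the visual system fuses into a depth percept.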
Control similar to that of a video recorder is required to slow down the animation speed, step through time, or reverse the direction of animation time.
In the next outer feedback loop a user can interact with the parameters of a selected mapper. Realistic process chains typically consist of multiple filters and mappers, as can be seen in figure 5. Typical interactions with mappers are, e.g., the repositioning of a cutting plane, the definition of new starting positions for particle traces in a flow field, or the definition of a new isovalue for an isosurface. Such types of interaction are applicable to the 3D visualization of figure 4. Thus a user can identify a specific region where a certain effect appears or unusual behaviour occurs.
The next outer loop allows interactions with the filtering steps. Filter parameters make it possible to select subdomains of a region together with the values defined on them. In a search for interesting effects, the location of such a subdomain is consecutively moved across the computational domain.
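The nested feedback loops can be sketched as a minimal filter-to-mapper chain: changing a mapper parameter reruns only the mapper, while changing a filter parameter reruns the chain from the filter onward. The module names and the 1D crossing-detection stand-in for an isosurface are illustrative, not COVISE's API:

```python
import numpy as np

def crop_filter(field, lo, hi):
    """Filter step: select a subdomain [lo, hi) along the first axis."""
    return field[lo:hi]

def isosurface_mapper(field, isovalue):
    """Mapper step (1D stand-in): return indices of cells whose value
    crosses the isovalue, instead of real isosurface geometry."""
    above = field >= isovalue
    return np.nonzero(above[:-1] != above[1:])[0]

field = np.linspace(0.0, 1.0, 101)       # toy scalar field, field[i] = i/100
sub = crop_filter(field, 0, 101)         # outer loop: filter parameters
cells = isosurface_mapper(sub, 0.5)      # inner loop: mapper parameters
```

In a real pipeline each module caches its output, so a new isovalue touches only the mapper and the downstream rendering, which is what makes interactive parameter exploration affordable.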
Finally, simulation steering can be introduced via feedback into the input of the simulation step. Here a user can see the simulation results change as the simulation evolves. With this immediate feedback the user can modify boundary conditions and, with a certain delay, see how the behaviour of the simulated system changes.
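As a toy stand-in for the large simulation codes discussed here, the steering idea can be shown with an explicit 1D heat-diffusion loop in which a boundary condition is changed mid-run and the evolving field is recorded for visualization:

```python
import numpy as np

def diffuse_step(u, alpha=0.2):
    """One explicit time step of 1D heat diffusion (endpoints held fixed)."""
    un = u.copy()
    un[1:-1] = u[1:-1] + alpha * (u[2:] - 2.0 * u[1:-1] + u[:-2])
    return un

# Steering loop: the "user" raises the left boundary temperature halfway
# through the run and watches the simulated field adapt with some delay.
u = np.zeros(21)
history = []                              # snapshots for the visualization
for step in range(200):
    u = diffuse_step(u)
    u[0] = 1.0 if step >= 100 else 0.0    # steering event at step 100
    history.append(u.copy())
```

The point is the structure of the loop, not the physics: results stream into the visualization as they appear, and the steering input takes effect on the next iteration rather than requiring a restart.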
From the inner to the outer loops the timing requirements for the reaction of the system become less demanding. When moving through a scene or interacting with other animation parameters, the system should ideally react within 1/30 of a second to give lifelike behaviour in VR. This requires an image update rate of at least 30 frames/s, which limits the complexity of the scene. Modifying mapper parameters can already take longer, especially if
they need to be applied to a whole sequence of time-dependent data. In such a case a new isosurface would, for example, have to be recalculated for all time steps.
Finally, large-scale simulations can take hours or even days to finish. Waiting to see the modified behaviour of such a system after changing an input parameter would be impractical. Instead, the visualization system collects the result data as it arrives over time and keeps it for repeated analysis.
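The collect-and-replay idea amounts to caching each incoming time step so the user can re-analyse a slow run without waiting for the simulation again. A minimal sketch, with hypothetical names:

```python
class ResultCollector:
    """Accumulate (time, data) results as a long-running simulation
    produces them, and replay them in order for repeated analysis."""

    def __init__(self):
        self.timesteps = []

    def on_result(self, t, data):
        """Called whenever the simulation delivers a new time step."""
        self.timesteps.append((t, data))

    def replay(self):
        """Yield the cached time steps in arrival order."""
        for t, data in self.timesteps:
            yield t, data

collector = ResultCollector()
for i in range(3):                 # stand-in for results arriving over hours
    collector.on_result(float(i), i * i)
```

A production system would additionally spill old time steps to disk and index them, but the interface to the visualization side stays this simple.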
Depending on the scenario, further sensory information can be very supportive. For a medical specialist, force feedback is essential when training for an operation. For an architect or urban planner, auditory information within a larger building or street strongly improves the sensation of being there. In the real world objects can be moved with constraints, doors can be opened, etc. Users expect objects to behave as they do in the real world. Therefore time-dependent, event-driven animations of objects are an essential element
of a virtual environment. This allows, for example, calling an elevator by pushing a button; the elevator then opens and closes its doors and carries users to different levels of a building.
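Event-driven animation of this kind is naturally expressed as a time-ordered event queue: a user interaction (pushing the button) schedules a sequence of timed animation events. The scheduler, timings, and event names below are illustrative, not COVISE's API:

```python
import heapq

class EventQueue:
    """Time-ordered queue of animation events for a virtual environment."""

    def __init__(self):
        self._q = []

    def schedule(self, time, action):
        heapq.heappush(self._q, (time, action))

    def run_until(self, t_end, log):
        """Process all events scheduled up to t_end, in time order."""
        while self._q and self._q[0][0] <= t_end:
            t, action = heapq.heappop(self._q)
            log.append((t, action))   # a real system would animate here

def press_elevator_button(q, t, from_level, to_level):
    """Pushing the button triggers a fixed sequence of timed animations."""
    q.schedule(t + 0.0, "doors_open")
    q.schedule(t + 3.0, "doors_close")
    q.schedule(t + 4.0, f"move_{from_level}_to_{to_level}")
    q.schedule(t + 9.0, "doors_open")

q, log = EventQueue(), []
press_elevator_button(q, 0.0, 0, 3)
q.run_until(10.0, log)
```

Because every interaction just schedules events, the same mechanism handles doors, elevators, and any other constrained object behaviour the environment needs.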
Users can interact with them as if they were real. Engineers can change parameters of the air conditioning of a future car as if they were sitting in the real car. With online simulation coupling, the model reacts to changes of the dashboard opening.
Users can focus not only on the layout and optimization of products but can furthermore concentrate on the development of usage concepts and profiles, such as those of buildings or cars. Layout concepts direct the focus of interest to certain areas of a technical product. They can, for example, avoid distracting a driver from the road while providing complementary information. In a virtual environment it can be shown how to guide humans through buildings or how to make users feel comfortable there. Mostly, combined approaches are used, with maps and elements representing design concepts as well as animations explaining the concepts.
All modifications of and interactions with the model were immediately passed on to simulation codes running in the background. Within minutes, engineers could see the modified behaviour of the virtual prototype and thus optimize it. Additionally, discussions among specialists about the behaviour of the prototype are supported and therefore held with much more
intensity. To implement this concept, all simulation tools had to be hidden from the user and were only accessible via a virtual reality user interface. The user interface as well as the simulation and support tools were all implemented in COVISE as a software integration platform.
This way of working is part of the communication concept "MarkenStudio" [4], in which visualization plays a major role. It could be observed that as soon as there is "something to look at" and ideas are being visualized, it is much easier to achieve a commonly agreed meeting result. The participants are more willing to change their "point of view" and to understand and accept the ideas of others. Additionally, virtual reality helps reach a high degree of planning certainty at an early stage.
To allow different users appropriate interaction with the model, several interaction methods have been implemented: a colour picker (changing the colour of walls, floors and ceilings interactively), a texture picker (changing textures on the fly to judge the right material), an exhibition designer (creating, placing and modifying exhibits interactively), and switching through variants. As understanding takes a certain time, it became apparent that many ideas could be communicated much better with animations.
6 HYBRID PROTOTYPES
Hybrid prototypes combine physical representations of objects and physical feedback with computer-generated information to analyse the behaviour or properties of future products. Grabbing an object, feeling its weight, moving it, or sensing force feedback gives much more direct and intuitive access than a purely visual virtual reality. Figure 9 shows the use of an HMD with two video cameras attached to its front, delivering a stereo image (see the left- and right-eye views on the background monitors). With proper registration it is possible to overlay these video images with the visualization of a sheet-metal deep-drawing process. The proper position and orientation are determined by recognizing known markers in the video image using the ARToolKit software [5]. The visualization and rendering are done with COVISE and COVER. When the scientist rotates or moves the object, the time-dependent visualization of the deforming metal sheet follows it. This type of interaction is immediately obvious to practitioners from many application fields. Thus complex simulations can provide an immediate benefit for them.
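The registration step boils down to chaining homogeneous transforms: marker recognition yields the marker's pose in camera coordinates, and composing it with the object's pose relative to the marker places the virtual overlay. The poses below are made up for illustration; only the transform-chaining pattern reflects how marker-based AR overlays are positioned:

```python
import numpy as np

def make_transform(rotation_z_deg, translation):
    """Homogeneous 4x4 transform: rotation about z, then translation."""
    a = np.radians(rotation_z_deg)
    T = np.eye(4)
    T[:3, :3] = [[np.cos(a), -np.sin(a), 0.0],
                 [np.sin(a),  np.cos(a), 0.0],
                 [0.0,        0.0,       1.0]]
    T[:3, 3] = translation
    return T

# camera_T_marker would come from marker recognition (e.g. ARToolKit);
# here it is a made-up pose: marker rotated 90 deg, 1 m in front of the camera.
camera_T_marker = make_transform(90, [0.0, 0.0, -1.0])
marker_T_object = make_transform(0, [0.1, 0.0, 0.0])   # overlay offset on marker

# Chain the transforms so the virtual object follows the tracked marker.
camera_T_object = camera_T_marker @ marker_T_object

vertex = np.array([0.0, 0.0, 0.0, 1.0])     # object-space origin
in_camera = camera_T_object @ vertex
```

Re-estimating `camera_T_marker` every video frame is what makes the overlaid visualization follow the physical object as the user moves it.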
Another combination of physical and virtual representations is used in the hybrid prototype of figure 10. There, a car seat box with an interaction element positioned at the proper distance of the dashboard opening is used. This seat box is placed in a CAVE, which allows visualizing the air flow in the car cabin. The air flow is simulated based on the orientation of the dashboard opening. The engineer using the set-up wears an HMD and thus sees the CAVE view captured by the video cameras, overlaid with particle traces added as computer-generated elements.
Figure 10: Physical car seat with interaction element of dashboard air inlet.
CONCLUSIONS
Virtual and augmented reality techniques are still under development, and their improvement during recent years has been very impressive. Many new application scenarios are possible, but for each of them the usability and acceptance by humans must be evaluated. The devices used are still far from an ergonomically acceptable state, but shrinking sizes and reduced weight promise a level of acceptance for certain applications that is not yet reached. As the development of software techniques often takes longer than hardware development cycles, it is important to start software developments early.
REFERENCES
[1] McCormick B., DeFanti T., Brown M., Visualization in Scientific Computing. Computer
Graphics, 1987, Vol. 21, No. 6, pp. 1-14.
[2] Rantzau D., Frank K., Lang U., Rainer D., Wößner U., COVISE in the CUBE: an
environment for analyzing large and complex simulation data. Proc. of the 2nd
Workshop on Immersive Projection Technology, 1998.
[3] http://www.opendx.org/
[4] Drosdol J., Kieferle J., Wierse A., Wößner U., Interdisciplinary cooperation in the
development of customer-oriented brand architecture. Proc. of Trends in Landscape
Modeling, Dessau, 2003, pp. 264-269.
[5] AR Toolkit: http://www.hitl.washington.edu/research/shared_space/