Documenti di Didattica
Documenti di Professioni
Documenti di Cultura
Degree Session
March 2016
Università Iuav di Venezia
Department of Design and Planning in Complex Environments
Master's Degree Course in Product Design and Visual Communication
Curriculum: Visual and Multimedia Communications
Abstract (Italian)
Abstract (English)
Contents
1. Introduction
2. Scientific Visualization
3. Open Data
4. Art, Science and Design
Evaluating attractiveness
5. Case Studies
Kurzgesagt, In a Nutshell
6. A Design-centered evolution
7. Project: Sounds from the Sun
Application to data
Technical aspects
Testing process
Resources
Conclusions
Thanks
Colophon
1.
Introduction
Why investigate scientific visualization
…sible some extremely interesting works of data rendition, ranging from static visualization to real-time immersive interactive installations, covering the vast arc that goes from design to art.
This could turn out to be a great opportunity for Scientific visualization: it has the chance to learn from the great mass of projects involving open data. It could incorporate, for example, a more conscious and consistent design process, improving usability and readability for specialists; but it could also look to the more experimental and aesthetics-oriented works of art, which could be turned into powerful popularization instruments for the general public.
2.
Scientific
Visualization
What is Scientific
Visualization
Historic Scientific
Visualization
By its very definition and mechanics, SciViz is tightly bound to computers: it requires elaborating and translating into images ever-increasing amounts of data, or running complex simulations that may take up to years of calculation on huge supercomputers. It is therefore difficult to imagine Scientific visualization before the digital revolution of the 1980s. In fact, SciViz is usually considered to have been born in the early 1980s, becoming prominent at the end of the same decade, after a 1987 report released by SIGGRAPH for the National Science Foundation, Visualization in Scientific Computing. That report was crucial for the history of Scientific visualization, bringing the newborn field a great deal of attention, which translated into new funding, workshops, conferences and publications in the following years.
Even before the NSF paper there are some remarkable examples of visualizations, developed from the 1960s onward, and there are even more impressive examples of pre-digital Scientific visualization: in fact, the century from 1850 to 1950 gave a great push to the development of various data representation techniques. The most important, though quite singular, example is considered to be Maxwell's thermodynamic surface, a clay sculpture created in 1874 by the mathematician James Clerk Maxwell. He is considered the father of electromagnetism, but he was also active in other fields of physics and mathematics.
His surface was, in fact, a three-dimensional abstract plot of the thermodynamic states of a fictitious substance, similar to water, relating its volume, entropy and energy on the x, y and z axes, with isothermal and isopiestic lines traced on the surface. In Maxwell's own words, it allowed "the principal features of known substances [to] be represented on a convenient scale"4. Usually in thermodynamics, when the state of a fluid must be analyzed to extract a parameter like pressure or temperature, engineers and scientists have to rely on a number of different bi-dimensional diagrams, each one plotting just two parameters at a time, like pressure and volume, or temperature and entropy, and so on. Those diagrams can in fact be considered just various bi-dimensional projections of Maxwell's thermodynamic surface. While the practicality of this artifact is low, due to the obvious difficulty of extrapolating or reading single values from it, it offers an extreme degree of synthesis, allowing a novel and encompassing view of various phenomena of fluid thermodynamics, and thus a major insight value over traditional bi-dimensional graphs.
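The relation between the surface and the flat diagrams can be sketched numerically. In the toy model below the energy function is an illustrative, normalized monatomic-ideal-gas-like surface (an assumption, not Maxwell's actual data): temperature and pressure then emerge as the slopes of the surface along the entropy and volume axes, which is exactly the information the separate 2-D diagrams encode.

```python
import math

def energy(S, V):
    # Illustrative, normalized energy surface E(S, V) for a
    # monatomic-ideal-gas-like substance (not Maxwell's real data).
    return math.exp(2.0 * S / 3.0) / V ** (2.0 / 3.0)

def temperature(S, V, h=1e-6):
    # T = dE/dS at constant V: slope of the surface along the entropy axis.
    return (energy(S + h, V) - energy(S - h, V)) / (2 * h)

def pressure(S, V, h=1e-6):
    # p = -dE/dV at constant S: slope of the surface along the volume axis.
    return -(energy(S, V + h) - energy(S, V - h)) / (2 * h)
```

Sampling `temperature` or `pressure` along one axis reproduces one of the usual two-parameter diagrams, i.e. a projection of the surface.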
Going back two decades, another remarkable example must be taken into consideration, this time in the field of public health. In 1854 there was a cholera outbreak in the Soho district of London: it was a particularly unfortunate area of the city, since
…that it was necessary to improve the sanitary conditions in the country to increase the quality of life and life expectancy of both the Indian population and the British army.
With support from the Royal Sanitary Commission, she convinced the responsible minister to require every house owner to connect his building to the main drainage system, greatly enhancing sanitary conditions.
As in Snow's example, this case also shows how a proper and well-designed way to represent abstract data can help establish connections that are otherwise difficult to notice, and spread knowledge about problems that are hard to understand for anyone who is a stranger to a particular field of knowledge.
Scientific
Visualization,
computers and
computer graphics
7: http://www-pi.physics.uiowa.edu/~frank/
8: http://excelsior.biosci.ohio-state.edu/~carlson/history/ACCAD-overview/overview-old.html
9: http://excelsior.biosci.ohio-state.edu/~carlson/history/ACCAD-overview/overview-old.html
10: http://excelsior.biosci.ohio-state.edu/~carlson/history/ACCAD-overview/overview-old.html
…tems; artists, for their education in visual communication and their inventiveness in visual representation; and cognitive scientists and psychologists, to design better interfaces, try new concepts and steer research in new directions.13
Throughout the paper it is underlined many times how visualization in general is fundamental to human beings, and how many different kinds of information are better communicated visually: for example, when dealing with amounts of data too big for the brain to process in numerical form, or for structures like DNA, molecular models or other kinds of simulation. Interestingly, the authors note the need for interaction in Scientific visualization: at the time most elaborations ran on supercomputers via batch processing, so the user could only start the process and wait for the result, without knowing what the computer was doing. They theorized Interactive visual computing, where scientists "communicate with data by manipulating its visual representation during processing. The more sophisticated process of navigation allows scientists to steer, or dynamically modify, computations while they are occurring. These processes are invaluable tools for scientific discovery."14 This change should be implemented by developing high-quality visualization tools and giving them directly to researchers, via a proposed federally-funded initiative. The first step should be to abandon the supercomputing model in favor of "workstations, minicomputers and image computers […] that should be placed on the desks of each and every researcher", with specialized graphics processors that could be much more efficient, each of these machines costing between $5,000 and $100,000. Along with
3.
Open Data
The beginning of the Open Data era
Elinor Ostrom
15: Robert King Merton, The Normative
Structure of Science, The Sociology of Science:
Theoretical and Empirical Investigations
One of the first notable examples of the application of some Open Data principles dates from the 1950s: the International Council of Scientific Unions created the World Data Center system in preparation for the International Geophysical Year of 1957/58. Several World Data Centers were established, to minimize the risk of data loss and maximize data accessibility. However, since there was no Internet, it was more of a distributed storage system than an active data sharing platform.
The first truly Open Data-oriented project was the Human Genome Project, which is still the biggest collaborative biological project. Its planning started in 1984, development began in 1990, and it was completed in 2003. It was carried out by twenty universities in the United States, United Kingdom, Japan, France, Germany and China, under coordination and funding from the US government, with the ultimate goal of mapping the whole human genome, sequencing multiple variations for each gene.17
The birth of
public open data
…being the lack of interoperability and the heterogeneous demand. The first problem can be summarized as the lack of a universal Open Data standard: it is stated that data needs to be machine-readable, but not how, and so dozens of different standards and formats are used around the world. This concerns not just the file formats themselves, but also column subdivision and labeling, precision in number representation, and many more similar details that a developer must face when approaching different datasets.
Open Data theorists themselves, after their great initial success and the general social acceptance of their ideals, are also slightly changing their points of view: for example, Lawrence Lessig dissociated himself from the movement's ideas of radical transparency. Even Beth Noveck, who took part in the creation of the Open Data policies of the Obama administration, is no longer sure that Open Data is enough to improve the governance of public affairs. Other participants in the Sebastopol meeting even ended up supporting WikiLeaks.
These minor difficulties do not mean that Open Data should be abandoned: they just mean that it was a first step, and that we must proceed toward ever better models of data sharing. Tim Berners-Lee, for example, is proposing Linked Data.
Beth Noveck
The future:
Linked Data
19: https://www.w3.org/History/1989/proposal.html
20: https://joinup.ec.europa.eu/community/semic/news/understanding-linked-dataexample
Tim Berners-Lee
Tim Berners-Lee is an incredibly important figure, who influenced the very way we communicate and perceive information nowadays: in March 1989, while working at CERN, he defined and proposed a new information management system that became, in November of the same year, the World Wide Web that we all know, when he successfully implemented the first HTTP communication.19 He is currently director of the World Wide Web Consortium (W3C) and founder of the World Wide Web Foundation.
Starting from 2009, he worked with the British government to support their Open Data project, which took concrete form in the data.gov.uk portal.
Berners-Lee's research nowadays focuses on the future of Open Data: he proposes the Linked Data standard as a means to solve the problems that traditional Open Data is facing.
Linked Data must respond to four simple rules:
1. Use Uniform Resource Identifiers (URIs) as names for things, e.g. http://dbpedia.org/resource/Brussels can be used to refer to the city of Brussels.
2. Use HTTP URIs, so that people can look up those names.
3. When someone looks up a URI, provide useful information, using the standards (i.e. RDF, SPARQL).
4. Include links to other URIs, so more things can be discovered, e.g. from http://dbpedia.org/resource/Brussels a link is available to http://dbpedia.org/resource/Belgium.20
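The four rules can be sketched with a toy, in-memory stand-in for a Linked Data graph (the statements below are illustrative, not real DBpedia output): URIs name things (rules 1-2), "looking up" a URI yields useful statements about it (rule 3), and those statements link to further URIs that can be explored in turn (rule 4).

```python
# A minimal illustrative graph: each URI maps to a set of statements,
# some of which are links to other URIs (the data is invented).
GRAPH = {
    "http://dbpedia.org/resource/Brussels": {
        "label": "Brussels",
        "country": "http://dbpedia.org/resource/Belgium",
    },
    "http://dbpedia.org/resource/Belgium": {
        "label": "Belgium",
        "capital": "http://dbpedia.org/resource/Brussels",
    },
}

def look_up(uri):
    """Rule 3: dereferencing a URI returns useful information about it."""
    return GRAPH.get(uri, {})

def discover(uri):
    """Rule 4: follow links in the returned statements to find more URIs."""
    return [v for v in look_up(uri).values() if v.startswith("http://")]
```

In a real Linked Data setting the lookup would be an HTTP request returning RDF, and `discover` would be a SPARQL query or a traversal of the returned triples.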
To clarify and rate how good and useful Linked Open Data is considered, a rating system has been developed:
1 Star: data is available on the Web with an open license.
2 Stars: data is in a machine-readable format, like an Excel file instead of a scanned image.
3 Stars: data is in a non-proprietary format, like .csv instead of Excel.
4 Stars: data uses a W3C open standard to identify things, so that others can point to them.
5 Stars: data is linked to other data, providing context.
21: https://www.w3.org/DesignIssues/LinkedData.html
22: https://www.ted.com/talks/tim_berners_lee_on_the_next_web/transcript?language=en#t-485180
23: https://www.ted.com/talks/tim_berners_lee_on_the_next_web/transcript?language=en#t-485180
24: http://www.geonames.org/
25: http://umbel.org/resources/news/umbel-v1-20-released/
4.
Art, Science and Design
Separation of science and aesthetics
…used to generate a visualization, mainly its reliability and robustness; but it also takes a wider sense, including how efficient the code is, whether it can take different kinds of data, or tolerate errors. In an academic environment it is expected that the algorithm behind a visualization is clearly explained and fulfills some elementary stability and reliability requirements. It can be evaluated with both mathematical and empirical methods.
Attractiveness refers to the aesthetic quality of a visualization, relating primarily to its appeal or beauty, but it can be extended to consider other aspects like originality and novelty, and can also encompass the whole matter of user experience. The problem with this principle is that, at least traditionally, there is no recognized way to evaluate it objectively, and therefore academic research in scientific visualization does not have any expectation in that regard26.
This kind of reasoning has a fundamental flaw, though: while it is obvious that beauty cannot be measured, various studies have proved that, when dealing for example with computer interfaces, aesthetic quality can support utility, improving users' performance and satisfaction27.
Evaluating
attractiveness
28: M. Kurosu and K. Kashimura, Apparent
Usability vs. Inherent Usability: Experimental
Analysis on the Determinants of the Apparent
Usability. Conference on Human factors in
Computing Systems (CHI95)
While each and every project has its own equilibrium of art, design and science, nowadays it can nevertheless be clearly distinguished whether a work is addressed toward aesthetic value or scientific clarity; this creates a sort of buffer zone that sits right in between the two and is currently very sparsely populated, if at all. Design could move into this zone, somehow bringing those two directions together and blurring the line, to create a kind of visualization that fuses them, maintaining the readability of the data while also bringing in aesthetic value. This could improve the experience of experts, creating better tools and practices to handle and work with data, while also creating new artworks that popularize science among the public without giving up readability and instructional value.
5.
Case Studies
Case studies selection
There is currently a notable number of data-based projects that cross paths with art, design and science, using both more traditional forms of expression (graphics, video, sculpture) and new, non-conventional media (sound, interaction, installations). What follows is a selection of works in different fields, including some institutional projects created directly by research centers and agencies, and some works that do not directly involve data but act as means of science popularization, with a very good response from audiences. Each project has tags to distinguish its field and features:
1 - R. Ikeda, Supersymmetry - Art, Installation, Particle physics, Music, Projections
2 - N. Miebach, Weather Scores - Weather data, Sculpture, Music, Art
3 - A. Parker, Worlds - Video, Planetary Science, Design
4 - eCloud - Weather data, Installation, Art
5 - Living Light - Weather data, Installation, Design
6 - L. Pagliei, Les Invisibles - Music, Particle physics, Art
7 - J. Park, LHC CERN Data Audio Generator - Music, Particle physics, Application, Design, Science
8 - J. W. Tulp, Goldilocks - Planetary Science, Web, Design
9 - In a Nutshell - Web, Video, Animation, Design
10 - NASA Scientific Visualization Studio - Weather data, Planetary science, Science, Map
11 - NASA Hyperwall Visualizations - Planetary science, Science, Weather Data
[Figure: tag recurrency across the case studies — Science, Weather Data, Planetary Science, Animation, Design, Art, Music, Map, Installation, Particle Physics, Video, Projections, Sculpture, Application]
[Figure: case studies mapped between the Design, Art and Science poles]
Ryoji Ikeda
Supersymmetry
Audio visual installation, 40 projectors, 40 computers w/ monitors, speakers, 2014
35: http://www.ryojiikeda.com/biography/
Ryoji Ikeda during a performance
Ryoji Ikeda is a Japanese sound and visual artist whose musical interest lies mainly in the rawest states of sound, like fundamental tones and various kinds of noise, which he often uses to play with the limits of human hearing, with extremely low or high-pitched sounds. His visual aesthetics is based on similar principles, often visualizing the sounds he produces with massive, abstract, geometric black and white projections. He has often used data, mathematics and physics concepts as sources for the composition of his works, and is considered a pioneer of minimal electronic music35.
His project Supersymmetry was developed during his residency at CERN in Geneva in 2014 and 2015, where he had the chance to cooperate with the researchers working on the supersymmetry theory: at the time they were running experiments at the LHC particle accelerator to find some yet-undiscovered fundamental particles, like the Higgs boson, that would prove the validity of the Standard Model. Even if this work does not include data directly from the LHC experiments, it tries to describe in an artistic and experiential way what these physics theories are and imply, so it is more a conceptual project than an actual data visualization system.
The installation itself consists of two sections, named "experiment" and "experience": the first one revolves around the observation of natural phenomena, exemplified by a number of small pellets of different materials moving on three lit surfaces that constantly change their inclination. The pellets move following the angle of the surfaces, colliding with each other. A laser scanning system detects the position of each particle and sends it to the second section, "experience", where sound and images are produced according to the movements and collisions detected between the pellets. This creates a kind of detachment in the user's perception: while they try to follow and understand at least one of the events, hundreds of other events are happening each second around them, in a way simulating the enormous amount of data and complexity that particle physics researchers have to face with each experiment36.
36: http://special.ycam.jp/supersymmetry/en/work/index.html
Supersymmetry (experience)
Supersymmetry (experiment)
Nathalie Miebach
Weather Scores
Sculpture, music
37: http://nathaliemiebach.com/statement.html
Nathalie Miebach
Alex H. Parker
Worlds
Animation, Music
38: http://www.alexharrisonparker.com/
39: http://www.alexharrisonparker.com/
datavisualization/
Worlds
Painted Stone
Aaron Koblin,
Nik Hafermaas,
Dan Goods
eCLOUD
Dynamic sculpture, 2010
40: http://www.ecloudproject.com/tech.html
41: http://www.ecloudproject.com
Soo-in Yang
David Benjamin
Living Light
Public art installation, 2009
42: http://www.interactivearchitecture.org/
living-light-2.html
Living Light is a functional art installation located in Seoul, created to inform citizens about the air quality in the city. Its canopy has a city map etched on its glass surface, divided into 27 blocks representing the different areas of the city where the air monitoring stations are situated. When an improvement in the air quality of a district is detected, the corresponding block of the system lights up; in addition, the system automatically updates every 15 minutes, lighting up all the blocks in order, from the one with the best air quality to the worst one. The structure also responds to text messages, blinking and then sending the data about the specific location the user requests. Living Light is fully integrated with the already existing air-quality monitoring system active in Seoul, receiving data from its sensor network42.
This is a notable example of public art: its beautiful design became an attraction for the city itself, and it is also very useful to the population, informing and raising awareness about the condition of air and pollution in the city.
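The canopy's two behaviors described above can be sketched in a few lines (district names, readings and the "lower is cleaner" scale are invented for illustration; the real installation's control logic is not documented here):

```python
def light_order(readings):
    """15-minute cycle: return blocks ordered from best to worst air
    quality (in this toy model a lower reading means cleaner air)."""
    return [district for district, _ in sorted(readings.items(), key=lambda kv: kv[1])]

def improved(previous, current):
    """Event behavior: districts whose reading got better since the
    last update, i.e. the blocks that should light up immediately."""
    return [d for d in current if current[d] < previous.get(d, float("inf"))]

# Hypothetical readings for three districts.
now = {"Jongno": 41, "Gangnam": 77, "Mapo": 58}
```

Calling `light_order(now)` yields the best-to-worst lighting sequence, while `improved` picks out the blocks to flash between cycles.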
Lorenzo Pagliei
Les Invisibles
Sound Installation, 2001 / 2015
43: http://www.lorenzopagliei.com/site/bio.html
44: Personal interview with Lorenzo Pagliei,
February 2016
Lorenzo Pagliei
JeongHo Park
LHC CERN Data
Audio Generator
Data analysis Software
45: http://jeonghopark.de/z
46: https://github.com/jeonghopark/lhcCernMac
47: http://tulpinteractive.com/goldilocks/
The Goldilocks name comes from the homonymous effect, which states that some phenomena must fall within certain margins rather than reaching extremes: this principle is applied in astrobiology to determine when a planet could theoretically support life, since habitability depends on a series of parameters that must fall within certain boundaries47.
This interactive visualization displays all the known exoplanets (1,942 confirmed as of October 9th, 2015) and their host stars. The size and orbital parameters of the planets are based on the observed data, as are star size, color and position.
Alongside the main visualization, which displays all planets around their stars in their respective positions, there are five other, more data-oriented visualizations, focused on: the distance of each planet from its star, highlighting the ones in their system's habitable zone; the Earth Similarity Index; atmosphere and composition compared to Earth; temperature and mass; and the temperature of their stars and the amount of energy the planets receive from them.
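The habitable-zone test at the heart of such a visualization reduces to a range check scaled by stellar luminosity. The boundary constants below are rough illustrative values for a Sun-like star, not the ones Tulp's project actually uses:

```python
import math

# Rough illustrative habitable-zone boundaries, in AU, for a star of
# one solar luminosity (assumed values, not Tulp's actual parameters).
INNER_AU = 0.95
OUTER_AU = 1.37

def in_habitable_zone(orbit_au, luminosity_suns):
    # Received flux falls with distance squared, so the zone's radii
    # scale with the square root of the star's luminosity.
    scale = math.sqrt(luminosity_suns)
    return INNER_AU * scale <= orbit_au <= OUTER_AU * scale
```

With these numbers Earth (1 AU, 1 solar luminosity) falls inside the zone, while a Mercury-like orbit does not.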
Kurzgesagt, In a Nutshell
YouTube channel
48: http://kurzgesagt.org/profile/
49: https://www.youtube.com/user/Kurzgesagt
NASA Scientific Visualization Studio
50: https://svs.gsfc.nasa.gov/cgi-bin/details.cgi?aid=12094
Space Weather: this animation shows the hypothetical voyage of a photon of light from the center of the Sun to Earth's atmosphere. The video also displays various phenomena encountered on such a trip: nuclear fusion and ignition processes inside the Sun's core; loops, flares, prominences and coronal mass ejections on the surface; it then follows the solar particles until they reach Earth's magnetic field, causing auroras53.
52: https://svs.gsfc.nasa.gov/cgi-bin/details.cgi?aid=4404
53: https://svs.gsfc.nasa.gov/Gallery/SpaceWeather.html
NASA Hyperwall Visualizations
NASA's Hyperwall
NOAA Coral Reef Watch 2015: a series of animated data visualizations showing the real-time conditions of reef environments, helping to identify where corals are losing their symbiotic algae, thus losing their color and bleaching: a condition that heavily endangers the survival of entire colonies. Data is taken from a variety of satellites, including the polar orbiters Suomi-NPP/VIIRS and MetOp-B/AVHRR, and the geostationary satellites MSG-3, MTSAT-2, GOES-East and GOES-West. The data itself consists mainly of temperatures, from which various other indexes are calculated, according to average or instantaneous values.
6.
A Design-centered evolution
Design as the unifying force
Analyzing projects from both scientific visualization and data-driven art, and looking at their history and at some case studies, it emerges that, while these disciplines have undergone a great evolution and reached very remarkable levels of advancement, there is still room for notable improvement, especially if we consider the role that design methodology could have in their future.
Nowadays, in fact, SciViz and data art projects run in two different and almost parallel directions, with almost no contact points; but both could benefit a lot from some kind of interaction. Design could be the force driving them to cooperate, being almost equidistant from both, yet still close enough to interact with each. Recognizing the potential of design for these fields, and involving more designers in the development processes of scientific visualizations and data art, would bring a more reliable design methodology. For scientific visualization, that would improve important aesthetic qualities like the overall experience and the design of interfaces, easing the work of researchers and allowing the creation of better popularizing, publicly distributed material; at the same time, data-oriented art projects could rely on a more solid work methodology and on good technological know-how.
There are various examples of designers and researchers endorsing the potential of art: to mention one, in the report made for a project experimenting with visualization techniques for Virtual Reality, Daniel Keefe and his colleagues concluded that artists "[have] key roles in working closely with scientists to design novel visual techniques for exploring data and testing hypotheses", and strongly suggested an interdisciplinary approach to visualization, "teaching art to computer scientists, computer science to artists".
55: https://virtualrealityreporter.com/virtualreality-surgical-medical-application
…topic.
The value of new media for researchers and professionals must also be considered: in the medical field, for example, the use of virtual reality has already proven its outstanding value both for surgical training and as assistance during surgeries54, and some theorize that technologies such as Microsoft Handpose, an extremely precise hand-tracking system, will be paired with already-existing robotic precision surgical arms for use during operations, allowing much more accurate actions and preventing human mistakes55.
The use of sound is also proving its usefulness in astrophysics research: NASA is using data sonification, through a software called Xsonify, to translate observations made in various fields into sound. For example, they translated x-ray observations, radio signals from Jupiter and micrometeorite impacts on the Voyager II probe56. The project was born to help visually impaired researchers, but it has been found useful also for normally sighted individuals: x-ray observation data, for example, was usually visualized, but that method proved scarcely effective due to the high level of noise and the difficulty of visually identifying events and patterns. Moving to a sound-based system proved a better way to pinpoint different kinds of events, which present themselves as different timbres, or as patterns audible as rhythms57.
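The core of such a sonification can be sketched very simply: each data sample is mapped onto an audible frequency, so spikes in the data become easy-to-hear jumps in pitch. The mapping and range below are assumptions in the spirit of Xsonify, not NASA's actual algorithm:

```python
def to_frequencies(samples, lo_hz=220.0, hi_hz=880.0):
    """Map each data sample linearly onto an audible pitch range
    (illustrative mapping; real sonification tools offer many scales)."""
    mn, mx = min(samples), max(samples)
    span = (mx - mn) or 1.0          # avoid division by zero on flat data
    return [lo_hz + (s - mn) / span * (hi_hz - lo_hz) for s in samples]
```

Feeding the resulting frequencies to any tone generator plays the dataset back; an outlier sample then stands out as a sudden leap in pitch, exactly the kind of event that is hard to spot in a noisy plot.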
Coverage of
contemporary
and popular
science topics
Usage of datasets
and real-time data
sources
This is of course the most important requirement for a project aimed at pure research and investigation purposes, which would be useless if the data were not clearly readable and understandable. Data can be made accessible even in an indirect way: an easy expedient that a lot of works use is to present a general overview of all the involved data on the main screen, while allowing the user to select one or more data points to access their details.
This point may also be important for projects directed at the general public, allowing a more precise understanding of the presented phenomena and maybe also giving the piece an instructional value. If the data is difficult to understand, perhaps because of its unit of measure or its very wide orders of magnitude, a number of devices can be used to overcome the problem, for example translating the raw numbers through some metaphor related to more understandable measures, or to everyday objects. Instead of using magnitude for the brightness of a star, the value can be related to the brightness of a far-away light bulb; or power consumption/absorption can be related to the amount used by home appliances.
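The light-bulb metaphor can be made concrete with a small conversion: turn a star's apparent magnitude into "a 60 W bulb seen from d metres". All constants here are rough assumptions for illustration (an idealized bulb radiating uniformly, and an approximate zero-magnitude flux), not precise photometry:

```python
import math

F0 = 2.5e-8         # W/m^2: rough flux of a magnitude-0 star (assumed value)
BULB_WATTS = 60.0   # total radiated power of an idealized bulb

def bulb_distance_m(magnitude):
    """Distance at which an ideal 60 W bulb delivers the same flux as
    a star of the given apparent magnitude (illustrative metaphor)."""
    flux = F0 * 10 ** (-magnitude / 2.5)            # magnitude -> flux
    return math.sqrt(BULB_WATTS / (4 * math.pi * flux))  # inverse-square law
```

With these assumptions a bright magnitude-0 star corresponds to a bulb roughly a dozen kilometres away, and each 5 magnitudes fainter multiplies that distance by 10, which is a far more graspable comparison than the logarithmic magnitude scale itself.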
Application of
design reasoning
and verifiable
methodology
…phenomenon like global warming, it should be considered in a slightly different way. Artists working in this particular sub-field should be aware that the value of their work can lie also in the research and thinking behind it, and in the rationale of their decisions; and that their work has the possibility to ease scientific research, if they invent a valuable data rendition technique.
There have been some cases of good documentation of design processes in recent years: for example, Daniel F. Keefe et al. explain, in an IEEE Computer Graphics and Applications article, the process and results of a project that involved both professors and students in developing new visualization techniques in a virtual reality environment. This kind of documentation can be extremely valuable: VR as a working technology is still in its early years, so there is no consistent literature or body of examples about its possible uses in design and visualization; in fact, Keefe and his colleagues also highlighted the complete lack of design tools for rapid sketching and prototyping58. Byron and Wattenberg likewise explained in detail, in a paper, the design rationale behind their Streamgraph, a new kind of stacked graph, from the algorithmic level down to every visual choice they made59, creating a collection of generalizable knowledge that can benefit other designers and developers who want to take a similar approach.
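The algorithmic core of a stacked graph is the choice of baseline on which the layers sit. The sketch below uses the simple symmetric baseline (the ThemeRiver-style g0 = -sum/2), as a stand-in for Byron and Wattenberg's full wiggle-minimizing baseline, which is more involved:

```python
def stack_layers(layers):
    """layers: equal-length lists of non-negative values, one per series.
    Returns boundary curves g0..gn: layer i is drawn between g[i] and
    g[i+1]. The baseline g0 = -(total/2) centers the stack vertically,
    giving the river-like silhouette of a streamgraph."""
    n = len(layers[0])
    baseline = [-0.5 * sum(layer[t] for layer in layers) for t in range(n)]
    curves = [baseline]
    for layer in layers:
        curves.append([curves[-1][t] + layer[t] for t in range(n)])
    return curves
```

Swapping in a different baseline (e.g. all zeros for a classic stacked area chart) changes only the first curve, which is precisely the kind of design decision Byron and Wattenberg document and motivate in their paper.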
Together with the current approach of interaction design, where aesthetics is an integral part of functionality, "with pleasure a criterion for design equal to efficiency or usability"60, these and other examples show and explain the potential of integrating design thinking in the research process, how aesthetics is valuable to any visualization, and how it can nonetheless be estimated and evaluated.
7.
Project: Sounds from the Sun
Sound and
psychoacoustics
Vision is widely considered the most important and defining sense of human beings: it is indeed true that the ability to see colors, for example, gave a great evolutionary advantage to our ancestors; and it is also one of the rarest and most powerful features among mammals, reserved almost exclusively to humans and some great apes.
At the same time, the human hearing system is considered less evolved and refined than that of other mammal species: most of them have a wider range of audible frequencies, are more sensitive, and can better localize the source of sounds around them. The hearing system of humans was, and still is, nevertheless fundamental for the survival of our species, being the most effective way to detect and localize possible dangers: a sudden noise startles us and triggers, in the brain and in the body, a chain reaction that gets us ready either to fight or to flee.
The effectiveness of this system is due to a series of different reasons:
- Hearing is much quicker than seeing. Vision is a slow chemical process: molecules in the retina change their characteristics when struck by light, and this change is transformed into electrical signals that the brain has to elaborate deeply to turn them into the images we see. This process is considered to take between 1/30 and 1/20 of a second, which translates into the general knowledge that we can see no more than 20-30 different pictures per second. We can express this as a value with a unit of measure: 30 Hz. Hearing, meanwhile, is based on a quick mechanical process, with the hearing ossicles vibrating and exciting a nerve connected to the brain. As a result this process is much quicker, allowing mankind to process sound with a resolution 1,000 times greater, around a maximum value of 20,000 Hz. In other words, the shortest event that the eyes can perceive has to last at least 50 milliseconds, while the shortest hearable event can
101
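The arithmetic behind these figures is simple: the shortest perceivable event is the reciprocal of the temporal resolution. A quick check of the values quoted above (taking the 20 Hz bound for vision, the value that yields the 50 ms and "1,000 times" figures):

```python
# Temporal resolution bounds quoted in the text.
vision_hz = 20        # eyes: at most ~20-30 distinct images per second
hearing_hz = 20_000   # ears: audible frequencies reach ~20,000 Hz

# Shortest perceivable event = 1 / temporal resolution.
shortest_visible_ms = 1000 / vision_hz     # 50 ms
shortest_audible_ms = 1000 / hearing_hz    # 0.05 ms
resolution_ratio = hearing_hz / vision_hz  # hearing is 1,000 times finer
```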
Applications to data
102
These peculiarities, together with others of minor relevance, can have some interesting consequences. For example, the extreme precision of the hearing system enables us to easily detect both established patterns and irregularities in them: we instinctively detect and recognize temporal patterns as rhythms, often following them with some kind of body movement. In the same manner we notice when something breaks out of an established pattern, breaking a rhythm.
The same mechanism applies in the frequency domain: we instantly distinguish a harmonic sound (where all component frequencies are multiples of the same fundamental) from any kind of non-organized sound, which we call noise. If a harmonic context is established, like an orchestra playing a classical piece, any sound incoherent with that background pops out, for example a badly tuned instrument.
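This distinction can be sketched numerically: the partials of a harmonic sound are integer multiples of its fundamental, so a single mistuned partial breaks the relation. A minimal illustration (the function name and the tolerance value are arbitrary choices for this sketch, not part of any audio library):

```python
def is_harmonic(partials, tolerance=0.01):
    """Return True when every partial is (nearly) an integer multiple
    of the lowest partial, i.e. the sound has a harmonic spectrum."""
    fundamental = min(partials)
    for freq in partials:
        ratio = freq / fundamental
        if abs(ratio - round(ratio)) > tolerance:
            return False
    return True

# An A2 tone with exact overtones reads as harmonic...
print(is_harmonic([110, 220, 330, 440]))  # True
# ...while one mistuned partial (the "badly tuned instrument") breaks it.
print(is_harmonic([110, 220, 337, 440]))  # False
```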
This kind of ability has, in fact, already proven its value for data investigation: NASA is using it with its Xsonify software to translate observations made in various fields into sound. For example, they translated x-ray observations, radio signals from Jupiter and micrometeorite impacts on the Voyager II probe. The project was born to help visually impaired researchers, but it has been found useful also for sighted ones: x-ray observation data, for instance, was usually visualized, but that method proved barely effective because of the high level of noise and the difficulty of visually identifying events and patterns. In this specific case, the ability of our hearing system to ignore background noise made it easier for researchers to isolate interesting events, while temporal patterns in the data that correspond to repeating or cyclic phenomena became easier to find, and irregularities that are much harder to identify with traditional visualization could be spotted quickly.
the Solar
Wind and its
consequences
Above: a mid-level solar flare emitted on Jan. 12, 2015, pictured by NASA's Solar Dynamics Observatory
103
Solar Wind
Observation
104
105
Sound rendition of
Solar wind data
106
- the plasma flux is rendered by a noisy, continuous sound, where the temperature is mapped to the timbre of the sound, the density to the width of the noise band and the speed to its central frequency;
- the magnetic field is rendered by a harmonic sound similar to a chord, where the total magnitude sets the fundamental pitch and each component of the magnetic field controls the volume and beating cycle of one harmonic.
These choices are not casual, and were made for different reasons. First of all, they are metaphorical: the sounds mimic the nature of the phenomena they replicate, and each sound parameter is mapped taking into consideration how well it represents its physical counterpart.
Another aspect taken into consideration is whether the sounds can be identified even when played together: in this case the sharp attacks of the particles are always clearly distinguishable from the other continuous sounds, while the other two differ in two respects. They occupy different registers of the audible spectrum, and the harmonic nature of the magnetic field rendition stands out from the noisy sound generated by the solar wind plasma.
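The mapping described above was realized in Max/MSP; the following Python sketch only illustrates its logic. All numeric ranges and the linear scaling are illustrative assumptions, not the actual patch parameters:

```python
def scale(value, in_min, in_max, out_min, out_max):
    """Linearly map a value from one range onto another, clamped."""
    t = (value - in_min) / (in_max - in_min)
    t = max(0.0, min(1.0, t))
    return out_min + t * (out_max - out_min)

def map_plasma(temperature_k, density_cm3, speed_kms):
    """Plasma flux -> continuous noisy sound (all ranges are illustrative)."""
    return {
        "timbre": scale(temperature_k, 1e4, 1e6, 0.0, 1.0),  # noise brightness
        "noise_bandwidth_hz": scale(density_cm3, 0.1, 50.0, 50.0, 2000.0),
        "center_freq_hz": scale(speed_kms, 250.0, 800.0, 200.0, 1200.0),
    }

def map_magnetic_field(bx_nt, by_nt, bz_nt):
    """Magnetic field -> harmonic chord: |B| sets the fundamental pitch,
    each field component controls the volume of one harmonic."""
    magnitude = (bx_nt**2 + by_nt**2 + bz_nt**2) ** 0.5
    return {
        "fundamental_hz": scale(magnitude, 0.0, 30.0, 80.0, 400.0),
        "harmonic_volumes": [scale(abs(c), 0.0, 30.0, 0.0, 1.0)
                             for c in (bx_nt, by_nt, bz_nt)],
    }
```

The clamped linear scaling keeps out-of-range readings from producing frequencies or volumes outside the intended registers, which matters when the two renditions must stay distinguishable.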
107
Technical aspects
108
Software flow chart:
- Data on SWPC website
- Wait 30 seconds
- Download on local computer
- Processing: conversion to positional notation, labeling, OSC send
- Values to sound parameters mapping
- Max/MSP: sound synthesis, spatialization
- Output
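In the pipeline above, the "OSC send" step passes the mapped values to Max/MSP as Open Sound Control messages over UDP. The sketch below shows how a single float can be packed into an OSC 1.0 message using only the Python standard library; the address pattern /solarwind/speed and the port number are hypothetical examples, not the project's actual settings:

```python
import socket
import struct

def osc_pad(data: bytes) -> bytes:
    """Null-terminate and pad to the next multiple of 4 bytes (OSC 1.0)."""
    return data + b"\x00" * (4 - len(data) % 4)

def osc_message(address: str, value: float) -> bytes:
    """Encode an OSC message carrying a single 32-bit big-endian float."""
    return (osc_pad(address.encode("ascii"))  # address pattern
            + osc_pad(b",f")                  # type tag string: one float
            + struct.pack(">f", value))       # the payload itself

def send_to_max(value: float, host: str = "127.0.0.1", port: int = 7400) -> None:
    """Send the value to a Max/MSP [udpreceive] object (port is an assumption)."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(osc_message("/solarwind/speed", value), (host, port))
```

On the Max/MSP side, a [udpreceive] object listening on the same port would unpack the message and route the value to the synthesis parameters.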
109
Testing
Process
110
111
References
Conclusions
Thanks
Colophon
Bibliography
RONALD BAECKER, Sorting Out Sorting: A Case Study of Software Visualization for Teaching Computer Science, in Software Visualization: Programming as a Multimedia Experience, MIT Press, 1998, pp. 369 - 381
LEE BYRON, MARTIN WATTENBERG, Stacked Graphs - Geometry & Aesthetics, IEEE Transactions on Visualization and Computer Graphics, Vol 14, Issue 6, 2008, pp. 1245 - 1252
ROBERT M. CANDEY, ANTON M. SCHERTENLEIB, WANDA L. DIAZ MERCED, Xsonify Sonification Tool for Space Physics, 12th International Conference on Auditory Display, London, UK, June 20-23, 2006
CHARLOTTE HESS, ELINOR OSTROM, Ideas, Artifacts, and Facilities: Information as a Common-Pool Resource, Law and Contemporary Problems, Vol 66, Winter/Spring 2003, pp. 111 - 145
DANIEL F. KEEFE, DAVID B. KARELITZ, EILEEN L. VOTE, DAVID H. LAIDLAW, Artistic Collaboration in Designing VR Visualizations, IEEE Computer Graphics and Applications, March/April 2005, pp. 18 - 23
DANIEL F. KEEFE, DANIEL ACEVEDO, JADRIAN MILES, FRITZ DRURY, SHARON M. SWARTZ, DAVID H. LAIDLAW, Scientific Sketching for Collaborative VR Visualization Design, IEEE Transactions on Visualization and Computer Graphics, Vol 16, n. 4, July/August 2008, pp. 1 - 13
ANDREA LAU, ANDREW VANDE MOERE, Towards a Model of Information Aesthetics in Information Visualization, Information Visualization IV, 2007
J. A. LEWIS, E. E. ZAJAC, A Two-Gyro, Gravity-Gradient Satellite Attitude Control System, The Bell System Technical Journal, November 1964, pp. 2705 - 2765
LIPYEOW LIM, HAIXUN WANG, MIN WANG, Semantic Queries by Example, 2003
BRUCE H. MCCORMICK, THOMAS A. DEFANTI, MAXINE D. BROWN, Visualization in Scientific Computing, Computer Graphics, ACM SIGGRAPH, July 1987
DAVID CHEK LING NGO, LIAN SENG TEO, JOHN G. BYRNE, Modelling interface aesthetics, Information Sciences, 152, 2003, pp. 25 - 46
114
115
Sitography
AT&T Archives, Simulation of a Two-Gyro Gravity-Gradient Attitude Control System - http://webcache.googleusercontent.com/search?q=cache:wlCuGlowlpgJ:techchannel.att.com/play-video.cfm/2012/7/18/AT%26T-Archives-First-Computer-Generated-GraphicsFilm+&cd=1&hl=en&ct=clnk&gl=it
Ronald M. Baecker, Sorting Out Sorting Video - https://www.youtube.com/watch?v=SJwEwA5gOkM
BBC Data Art, A Quick Illustrated History of Visualisation - http://data-art.net/resources/history_of_vis.php
Tim Berners-Lee, The next Web of open, linked data - http://www.dailymotion.com/video/x8omty_tim-berners-lee-the-next-web-of-ope_tech#.UVWqxhnGrJw
Tim Berners-Lee, Linked Data - https://www.w3.org/DesignIssues/LinkedData.html
Jorge Camoes, Infographics vs. Data Visualization - http://www.excelcharts.com/blog/infographics-data-visualization/
Wayne Carlson, A Critical History of Computer Graphics and Animation, Section 18: Scientific Visualization - https://design.osu.edu/carlson/history/lesson18.html
Simon Chignard, A brief history of Open Data - http://www.paristechreview.com/2013/03/29/brief-history-open-data/
Centre des Arts, Invisible et Insaisissable - http://www.cda95.fr/content/invisible-etinsaisissable
Michael Friendly, Milestones in the History of Scientific Visualization - http://www.datavis.ca/papers/vita/Friendly08aaas.html
Marieke Guy, Open Access to Research Data Timeline - http://access.okfn.org/2015/04/30/open-access-to-research-data-timeline/
Jeremy Norman, Edward Zajac Produces the First Computer-Animated Film (1963) - http://www.historyofinformation.com/expanded.php?id=1002
Barack Obama, Transparency and Open Government Memorandum - https://www.whitehouse.gov/the_press_office/TransparencyandOpenGovernment
116
117
Conclusions
common in the next years, involving artists and designers not only in the finishing phases of visualization projects, but also in the concept and development stages. This could bring great positive effects both for researchers, who could gain better instruments, involving new technologies and approaches, to investigate their fields, and for the general public, which could gain access, for example, to pieces of public art with an informative value, supported by strong scientific principles.
The project I developed demonstrates how a new and different approach, revolving around the conscious use of sound, can be an effective and useful way to represent real-time data, even in a mission-critical environment.
To conclude, it has been a difficult but satisfying work, which required broad research and the conceptual effort of bringing together many different fields; I hope it will at least contribute to opening the discussion on how to reconsider the respective roles of science, art and design, how they could interact for us and how they could positively affect our lives in the future.
119
120
Thanks
121
Colophon
New Media for Scientific Data Visualization
IUAV University of Venice
Design and Planning in Complex Environments Department
Master Degree in Product and Visual Communication Design
Academic Year 2014/2015
Supervisor:
Marco Ferrari
Typefaces:
Aleo, Alessio Lasio
Bebas Neue, Ryoichi Tsunekawa
Software:
Adobe InDesign CC
Adobe Illustrator CC
Adobe Photoshop CC
Processing 3.0
Nodebox 3.0
Cycling 74 Max 7
122