
3D Adaptive Music in the Virtual World

How should music behave in VR?

Content creators can use existing spatial audio technology to create interactive, user-determined musical environments that complement visual narratives, guide user perception, and shift emotional character in real time.
Several types of adaptive music are presently used to modify music in real time:

Horizontal Adaptive Music: musical arrangements are re-sequenced by game events

Vertical Adaptive Music: layers (instruments) are added or removed to modify the score

Algorithmic Music: a score is generated on the fly
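A minimal sketch of how the first two types might be driven from game state; the layer names, segment names, and intensity thresholds are illustrative assumptions, not drawn from any particular engine:

```python
from dataclasses import dataclass, field

@dataclass
class AdaptiveScore:
    # Vertical: named stems faded in or out to thicken or thin the score.
    layers: dict = field(default_factory=lambda: {
        "pads": 0.0, "strings": 0.0, "percussion": 0.0})
    # Horizontal: pre-composed segments re-sequenced by game events.
    segments: dict = field(default_factory=lambda: {
        "explore": ["intro", "wander_loop"],
        "combat": ["stinger", "combat_loop"]})
    current_segment: str = "explore"

    def on_event(self, event: str) -> None:
        """Horizontal adaptivity: a game event re-sequences the arrangement."""
        if event in self.segments:
            self.current_segment = event

    def set_intensity(self, intensity: float) -> None:
        """Vertical adaptivity: layers are added or removed as a game
        intensity variable (0..1) crosses assumed thresholds."""
        self.layers["pads"] = 1.0
        self.layers["strings"] = 1.0 if intensity > 0.3 else 0.0
        self.layers["percussion"] = 1.0 if intensity > 0.7 else 0.0

score = AdaptiveScore()
score.on_event("combat")       # re-sequence horizontally
score.set_intensity(0.8)       # add layers vertically
print(score.current_segment, score.layers)
```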

Music provides emotional subtext in a visual experience.

In VR, the visual experience is determined by the user's movement.

Music needs to be emotionally flexible and react to movement in real time.
Music and Visuals

Film:
• The director determines perspective, framing, and movement
• Directors carefully coordinate their visuals with the musical score

VR:
• 1st person perspective
• Framing and movement are determined by the user
• The visual experience is unpredictable
Conventional Workflow

Elements of music are combined before implementation in the virtual environment.

3D Spatial Composing Workflow

The user hears a unique score appropriate to their visual experience.

User movements add or remove key elements of the score in real time (see the sketch below).

“Shell” elements of the music are present throughout a given virtual space.

[Diagram: adjacent virtual spaces with contrasting characters, labeled “Dark” and “Happy”]
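A sketch of how “shell” and movement-keyed elements might coexist, assuming hypothetical trigger volumes and element names:

```python
import math

# "Shell" elements are always audible within this virtual space.
SHELL = {"drone_bed", "ambient_texture"}

# Key elements toggle when the user enters an (assumed) trigger volume.
TRIGGERS = {
    "choir": {"center": (10.0, 0.0, 4.0), "radius": 3.0},
    "bells": {"center": (2.0, 0.0, -6.0), "radius": 2.5},
}

def active_elements(user_pos):
    """Return the set of score elements audible at the user's position."""
    audible = set(SHELL)
    for name, volume in TRIGGERS.items():
        if math.dist(user_pos, volume["center"]) <= volume["radius"]:
            audible.add(name)
    return audible

# Standing near the first trigger adds "choir" to the ever-present shell.
print(active_elements((9.0, 0.0, 4.0)))
```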
Composition in Immersive Worlds

• The role of the composer in VR is to organize the musical experience across space and time, aligning the user's perspective with appropriate musical structures.

• There is a deep academic and artistic tradition in music that can be drawn upon to compose across the dimensions of interaction that VR provides.

• With the present level of control that spatial audio allows, musical ideas can be “turned sideways” into the virtual space.

[Diagram: musical elements placed at points A, B, C, and D in the virtual space]
Using spatial audio to modify music

• Throughout composition and production, the musical elements are conceptualized, grouped, and produced along the lines of their musical/emotional role.

• Instruments, in particular polyphonic instruments, are broken down into musical elements: harmonic role, etc.

• Variations of particular elements are recorded, produced, and implemented in the experience.

• These elements are toggled and manipulated inside a spatial audio system, within the limits of the given format (see the sketch below).
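One way the grouping might be organized in data; the roles, instruments, and variation files are hypothetical placeholders:

```python
# Elements grouped by musical/emotional role, each with recorded variations
# that can be swapped at runtime.
ELEMENTS = {
    "harmonic_role": {
        "cello_pad": ["cello_pad_warm.wav", "cello_pad_tense.wav"],
    },
    "rhythmic_role": {
        "hand_drums": ["hand_drums_soft.wav", "hand_drums_driving.wav"],
    },
}

def select_variation(role, element, mood):
    """Pick the recorded variation of an element matching the current mood."""
    variations = ELEMENTS[role][element]
    return variations[1] if mood == "tense" else variations[0]

print(select_variation("harmonic_role", "cello_pad", "tense"))
```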
Spatial VR music should:

• Support multiple perspectives with multiple emotional subtexts simultaneously

• Guide the user's attention and movement through significant events in the visual environment

• Not affect immersion – significant, but not distracting
Using spatial audio to modify music

• In this example, A and B would be heard under all circumstances, while C and D would be heard only when focusing on points of emotional value (see the sketch below).

• The yellow notes at C and D are affected by the user's movement (added, removed, or otherwise manipulated).

• The interaction of these elements changes the character of the score.

• Emotional relevance is applied to multiple directions at a single moment. These conceptions are repeated across time and across virtual space to lay out the initial state: the musical environment.

• Principles of adaptive music – loops, layers, sequencing – are also applied.

[Diagram: elements A and B always audible; elements C and D gated by the user's focus]
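A sketch of the focus gating described above: C and D become audible only when the user's gaze aligns with their direction. The threshold and vector math are assumptions; a real system would also smooth the gain to avoid clicks.

```python
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def focus_gain(head_forward, user_pos, source_pos, threshold=0.8):
    """Gate a directional element by gaze: audible only when the dot product
    of the view direction and the direction to the source is high enough."""
    to_source = normalize(tuple(s - u for s, u in zip(source_pos, user_pos)))
    alignment = sum(f * d for f, d in zip(normalize(head_forward), to_source))
    return 1.0 if alignment >= threshold else 0.0

# Looking almost directly at C: the element is heard. A and B bypass this gate.
print(focus_gain((0.0, 0.0, 1.0), (0.0, 0.0, 0.0), (0.5, 0.0, 5.0)))
```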
Recording and producing a 3D spatial composition

A composition can be visualized as a hierarchical musical network: master elements and variable elements.

In this example, blue tracks are the foundation and are present throughout a given area.

Red tracks are controlled by in-game calls and variables, and react to user action and movement (see the sketch below).
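A sketch of that control surface, loosely modeled on the call/variable (RTPC-style) pattern common in game audio middleware; the track names, calls, and mappings are all illustrative:

```python
class SpatialScoreController:
    def __init__(self):
        # Blue (master) tracks always play; red (variable) tracks start silent.
        self.track_gains = {"blue_bed": 1.0, "red_motif": 0.0, "red_pulse": 0.0}
        self.bindings = {}  # variable name -> (track, mapping function)

    def bind(self, variable, track, mapping):
        self.bindings[variable] = (track, mapping)

    def set_variable(self, variable, value):
        """Called by the game as continuous variables change
        (proximity, velocity, progress, ...)."""
        track, mapping = self.bindings[variable]
        self.track_gains[track] = mapping(value)

    def on_call(self, call):
        """Discrete in-game calls toggle variable tracks directly."""
        if call == "objective_reached":
            self.track_gains["red_motif"] = 1.0

ctrl = SpatialScoreController()
ctrl.bind("player_speed", "red_pulse", lambda v: min(v / 5.0, 1.0))
ctrl.set_variable("player_speed", 4.0)
ctrl.on_call("objective_reached")
print(ctrl.track_gains)
```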
In the final 3D Spatial Composition:

The user is moving through a flexible field of interactive music.

Overlapping elements determine the character of the score at any given point.

Transitions between environments are analog (continuous rather than abrupt), maintaining immersion (see the sketch below).

Manifest musical structures represent the user's location, perspective, and progress.
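One common way to realize such an analog transition is an equal-power crossfade driven by the user's position in the overlap between two environments; the zone boundaries here are assumptions:

```python
import math

def crossfade_gains(t):
    """Equal-power crossfade between two environment scores, t in [0, 1].
    Gains change continuously, so the transition has no audible seam."""
    t = min(max(t, 0.0), 1.0)
    return math.cos(t * math.pi / 2), math.sin(t * math.pi / 2)

def blend_amount(user_x, zone_a_edge=0.0, zone_b_edge=10.0):
    """Map the user's position in the overlap region to a blend amount."""
    return (user_x - zone_a_edge) / (zone_b_edge - zone_a_edge)

# Three quarters of the way across the overlap, environment B dominates.
gain_a, gain_b = crossfade_gains(blend_amount(7.5))
print(round(gain_a, 3), round(gain_b, 3))
```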
Notation and implementation

The emotional and narrative structure of the environment is illustrated by the content creators and composer.

The audio engine is configured to add, remove, or modify elements of the score to match that illustration.

A simplified harmonic example:

At i, the object C# mixes with track A to create a comfortable major third.

At ii, there is silence.

At iii, the tracks G, B, and D combine to create a major triad, or ‘happy’ harmony.

A is unmoving, while C# is an object that can be moved by the user to any location in the field.

A user holding object C# at i will hear C# playing along with A.

As the user moves C# to other locations in the field, there will be pleasing harmony in some areas and dissonant harmony in others (see the worked sketch below).

If tempo changes are desired in the field, polyrhythms of various types can be used to transition between tempos.
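A worked version of the harmonic example. Pitch classes are mapped to semitones so the interval between the fixed track A and the movable object C# can be classified wherever the user carries it; the consonance set is a conventional simplification:

```python
SEMITONE = {"C": 0, "C#": 1, "D": 2, "D#": 3, "E": 4, "F": 5,
            "F#": 6, "G": 7, "G#": 8, "A": 9, "A#": 10, "B": 11}
# Unison, thirds, fourth, fifth, and sixths treated as pleasing.
CONSONANT = {0, 3, 4, 5, 7, 8, 9}

def interval(fixed, moved):
    return (SEMITONE[moved] - SEMITONE[fixed]) % 12

def describe(fixed, moved):
    semis = interval(fixed, moved)
    quality = "pleasing" if semis in CONSONANT else "dissonant"
    return f"{fixed}+{moved}: {semis} semitones ({quality})"

print(describe("A", "C#"))  # 4 semitones: the comfortable major third at i
print(describe("A", "A#"))  # 1 semitone: a dissonant area of the field

# Tempo transition via polyrhythm: tempos related by a simple ratio share a
# pulse, e.g. three beats at the new tempo in the time of two old ones.
print(90 * 3 / 2)  # 90 BPM moves to 135 BPM through a 3:2 relation
```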
Narrative Function Example

• A user is instructed through musical language to interact with object A and then to exit through a particular doorway, location D.

• Deviation from that path – approaching E – will create negative emotional harmonies in the score (a sketch follows).
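The narrative example rendered as a small state machine; the cue names and any behavior beyond A, D, and E are hypothetical:

```python
class NarrativeScore:
    def __init__(self):
        self.state = "seek_A"
        self.cues = []

    def on_interact(self, obj):
        # Interacting with A advances the narrative toward doorway D.
        if self.state == "seek_A" and obj == "A":
            self.state = "seek_D"
            self.cues.append("resolving_cue_toward_D")  # inviting harmony at D

    def on_approach(self, location):
        # Deviation toward E is scored with negative emotional harmony.
        if location == "E":
            self.cues.append("negative_harmony_layer")
        elif location == "D" and self.state == "seek_D":
            self.cues.append("arrival_resolution")

score = NarrativeScore()
score.on_interact("A")
score.on_approach("E")   # the user strays; the score discourages this
score.on_approach("D")   # back on path; the score resolves
print(score.cues)
```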

