
Student Name : Simon Edmonds

Student Number : 20452888

Course : MA/MFA Creative Music Practice

Module : G106834 - Musicology and Theoretical Perspectives

SENSE AND PERCEPTION OF SOUND

As we pass through life our senses are our connection to the world, but they are only sensors: what we perceive through them is the result of a complex interplay between one or more sensory inputs, past experience and our ability to think.

If, for a moment, we consider ourselves as computers or robots, then the brain is the central processing unit. On its own it is capable of nothing, but on top of it runs an operating system that keeps the body alive and manages all of the minutiae of day-to-day living, and above that runs a piece of very sophisticated software that is our ‘self’, our consciousness.

Calling it a piece of software may seem to denigrate our consciousness, but it is a way of visualising or contextualising its place in relation to our physical being and the outside world. Suffice it to say that there is no evidence to suggest that we will ever be able to reproduce the functionality of this so-called software outside the human body and mind.

In the ‘ordinary’ world, ‘ordinary’ people go through life taking all of this for granted. In this paper we look at the ability of ordinary people to become extraordinary by understanding their perception of the world about them and by extending that perception to overcome the limitations that present themselves. In this light, we consider how an understanding of our perceptual abilities can assist us when designing for music creation and listening, as well as when creating the music itself.

A useful approach in this field of study is to look at anomalies. There are many cases where apparent
disabilities can shed some light on how normal perception works and how it can be extended.

HEARING IN THE ORDINARY WORLD

Get to the cinema early, when no one else is there and they haven’t started rolling the adverts. Sit in your seat
with your eyes closed. You know that you are in a cinema, there’s no doubt, there’s nowhere else that sounds
like that, that sounds like a warm blanket of nothing.

Whether your eyes are open or shut, your perception of sound informs your consciousness of the space that
you are in.

Next time you walk past an old church, take the time to go in. Walk down the aisle, take a pew and sit quietly for a moment. What appears to be silence is not silence at all, but that doesn’t detract from the sense of peace and tranquillity.

We tend to think that we are constantly at the cutting edge of understanding, yet if you sit in Salisbury Cathedral and consider the impact of the building on your consciousness, it is hard to believe that it was completed over 750 years ago. It seems unlikely that the architect responsible didn’t understand that his design would affect people both outside and inside the building in the way that it does.

In today’s world of constant noise there are few places where you can go and sit in peace and quiet, where the
sound environment is steady and ‘clean’, free from the incidentals and accidentals of the everyday 21st century
drone. Vehicles, phones, shouts, bangs and crashes create audio stress points that jar our psyche and fracture
peace. Even in the great outdoors it is nigh on impossible to escape the constant background drone of road
and air traffic.

Whilst our visual sense appears to be dominant, our perception of space and place through the subliminal
interpretation of the acoustic environment informs us constantly. We hear but we tend not to listen until
something appears to break through, to break the trend of the expected. Sound constantly feeds directly to
our consciousness – in fact it continues beyond, to our subconscious and unconscious perception. Even as we sleep, our brains continue to monitor the outside world through sound. Like sleeping with one eye open, this evolutionary capability serves as our most basic protection.

SOUND VERSUS VISION

In ‘ordinary’ people, the breadth and depth of auditory perception is extraordinary. We can discern pitch
(frequency) from around 16 Hz to 16 kHz, which is 10 octaves. In comparison, visible light spans only one octave,
from 380 THz (red) to 770 THz (violet).

We can, typically, discern a change in pitch of around five hundredths of a tone, which gives us a frequency palette of around 1,200 pitches (10 octaves, 60 whole tones, 1,200 twentieths of a tone), but we can only distinguish about 150 colour hues. When it comes to dynamic range, we are 10,000 times more sensitive to sound than to light (130 dB versus 90 dB).
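The arithmetic behind these comparisons can be checked in a few lines of Python. This is only a sketch, and it assumes a ‘tone’ means a whole tone of two semitones, six to the octave:

```python
import math

# An octave is a doubling of frequency, so the octave span of a range
# is the base-2 logarithm of the ratio of its endpoints.
def octaves(f_low, f_high):
    return math.log2(f_high / f_low)

print(octaves(16, 16_000))      # hearing: ~10 octaves
print(octaves(380e12, 770e12))  # visible light: ~1 octave

# Pitch palette, assuming 6 whole tones per octave and a discernible
# step of 1/20 of a tone:
print(10 * 6 * 20)              # 1200 distinguishable pitch steps

# Dynamic range: every 10 dB is a tenfold ratio of power.
print(10 ** ((130 - 90) / 10))  # hearing vs vision: 10000.0
```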

When it comes to time, again there is a huge difference. We can look at a still image, frozen in time, but sound without time cannot exist. Due to persistence of vision, we perceive an image changing 24 times a second (roughly 42 ms per frame) as continuous movement, yet our sensitivity to rhythm is such that we can sense a change in timing as small as a few milliseconds.

ADAPTING PERCEPTION – SEEING THE WORLD UPSIDE DOWN

We know that the mechanics of our eyes mean that what we see is projected onto our retinas upside down
and our brains reverse the image as a post-processing function. In the 1890s George Stratton (Stratton, 1896) experimented with glasses that turned everything upside down and reported that after 4 days he was perceiving things to be the right way up. After 8 days, he removed the glasses and his ‘normal’ vision was now inverted. This experiment was part of Stratton’s research into what he called perceptual adaptation, supporting the idea that the senses and the brain are interlinked to present a composite ‘reality’ to our consciousness.

This adaptation process can be seen in the way in which we learn. Consider a child learning to ride a bicycle. If
you were to try and program a robot to ride a bicycle, the task would be considerable but a child quickly learns
that turning the handlebars towards the direction that he or she is falling enables balance to be regained. Once
this process is learned it appears to be forgotten just as quickly from the conscious mind as it becomes an
instinctive response to stay upright. The learned behaviour effectively becomes another entry added to the
lexicon of perceptual adaptation in our brain.

Subsequently trying to ride a circus bicycle where the handlebars are connected through a gearing system to
reverse the way that they operate is nigh on impossible but with practice it can be mastered – at the risk of
then being unable to ride a normal bicycle.

EXTENDING PERCEPTION – SEEING THROUGH YOUR EARS


The ability of some blind people to employ echolocation to ‘see’ was first documented in the case of James
Holman (1786-1857). A naval officer, he lost his sight at the age of 25 and, having been invalided out of the
navy, he set out to travel the world. He navigated by the sound created by tapping his cane (Roberts, 2006).

Our ability to perceive our physical environment through auditory cues has been applied and extended by
people like Daniel Kish (Kish), a blind American, who has developed his own audio perception to a workable
echolocation capability and now teaches this technique to other blind people including Lucas Murray, the
youngster from Dorset recently featured on the BBC (BBC, 2009). The extent of Daniel Kish’s capabilities, and those of some of his friends and students, is shown by a video clip of them riding their mountain bikes on roads and trails (Industry).

It has been thought that people who lose a sense such as sight automatically compensate through the
augmentation of alternative senses such as hearing but there is no evidence that this is an automatic process.
Instead, the evidence shows that we all have the capability to extend our perception through our senses by
learning. The fact that Daniel Kish can train people to echolocate is just one example of this phenomenon.
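The physics underlying echolocation is simple, even if the perceptual skill is not. A minimal sketch, assuming sound travels at roughly 343 m/s in air: an obstacle’s distance is half the round trip that the echo makes.

```python
# Distance to an obstacle from the delay of its echo: the click travels
# out and back, so distance = speed_of_sound * delay / 2.
SPEED_OF_SOUND = 343.0  # m/s in air at about 20 degrees C

def distance_from_echo(delay_s):
    return SPEED_OF_SOUND * delay_s / 2

print(distance_from_echo(1.0))      # 171.5 m for a one-second echo
print(distance_from_echo(0.01166))  # ~2 m: the delay a wall about
                                    # 2 metres away would produce
```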

EXTENDING PERCEPTION – HEARING THROUGH YOUR BODY

Evelyn Glennie became profoundly deaf in her early teens yet, due to her own determination and the
intelligent support of family and teachers, she learnt to extend her perception of sound through her whole
body. That in itself is extraordinary enough but through this she has developed to become one of the world’s
leading exponents of percussion music across all genres from the classical repertoire to modern and
collaborative improvisation.

In the documentary film ‘Touch the Sound’ by Thomas Riedelsheimer (Riedelsheimer, 2004), we see Evelyn
exploring the nature of sound, performing on her own and improvising with others. Watching her play, it becomes evident that she must be ‘hearing’ as clearly as, if not more clearly than, any person who has no hearing disability.

As with Daniel Kish, she has clearly learnt to extend her perception to levels beyond that which we consider to
be normal. Also, as with Daniel, she has taken the time to teach this process to others, reinforcing the theory that this is not a unique ability gifted to some but an innate ability that we all possess yet few explore or exploit.

Because of the process that Evelyn has put herself through, she has gained a unique understanding of what it means to listen, which clearly informs her practice as a musician, composer and artist, and, through her work, we can perhaps learn something of this.

Whilst it seems extraordinary to hear through your body, it doesn’t take much to realise that we all do it to a
degree, all of the time. Perhaps the most obvious example is why we find it hard to recognise our own voices
when listening to a recording. We don’t hear ourselves the same way that other people hear us because we
hear all the additional information that resonates through our bodies as we speak or sing.

Another simple example can be demonstrated with a tuning fork. Strike the tuning fork and hold it in the air and you don’t hear much sound. Now strike the fork and press its base against your forehead or your jaw and you can hear the tone clearly as its resonance transmits through your skull.

TECHNOLOGY AS A POTENTIAL BARRIER TO PERCEPTION


In one example, Evelyn Glennie, teaching a teenage girl to start listening with her body, has her remove her hearing aids because they would interfere with the process.

Peter Meijer, a Dutch scientist, has designed a system called vOICe to help blind people perceive their surroundings through sound. The output of a head-mounted video camera is analysed by software running on a laptop carried in a backpack; audio tones are generated in response to visual cues derived from the video and fed to the subject through headphones. The subject learns to interpret the sounds and, through this, to visualise the external environment. (University)
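The broad idea of such a system can be sketched in a few lines, though this toy mapping is an illustration and not the actual vOICe algorithm: scan the image column by column, give each row of the column its own pitch (rising with row index), and let each pixel’s brightness set that pitch’s loudness.

```python
import numpy as np

# Toy image-to-sound mapping in the spirit of camera-sonification
# systems (illustrative only): each row index of a column is assigned a
# pitch, spread logarithmically across an audible range, and each
# pixel's brightness (0..1) scales the loudness of its pitch.
def sonify_column(column, f_low=200.0, f_high=8000.0,
                  duration=0.02, sample_rate=22050):
    t = np.arange(int(duration * sample_rate)) / sample_rate
    n = len(column)
    freqs = f_low * (f_high / f_low) ** (np.arange(n) / max(n - 1, 1))
    out = np.zeros_like(t)
    for brightness, f in zip(column, freqs):
        out += brightness * np.sin(2 * np.pi * f * t)
    return out / max(n, 1)  # normalise so many bright rows don't clip

# A column with a single bright pixel produces a single audible tone.
column = np.zeros(16)
column[8] = 1.0
samples = sonify_column(column)
```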

Daniel Kish, on the other hand, makes clicking sounds with his mouth and listens to the echoes.

This doesn’t necessarily mean that the vOICe system is a waste of time, it merely highlights that technology
doesn’t always provide the answer and that the more we understand the extent of our own capabilities and
limitations, the better we can design our environment. This applies across the board from aids for disability
through to the creation of art.

If we consider the concept of being able to perceive sound through the body, what does that mean for the use of headphones as a playback technology? Clearly headphones will significantly reduce our ability to hear through the body, because they reduce the number of sensory inputs.

When thinking about the design of electronic audio equipment generally, we tend to work to a maximum
audible frequency range of around 20 Hz to 20 kHz. Whilst it is clear that our bodies will respond to
frequencies below 20 Hz in the form of vibration, is it also possible that we can detect frequencies above the
‘normal’ hearing range through our bodies?

Ever since we developed the ability to record and replay live acoustic performance, we have struggled to achieve ‘realism’ in the replay of those recordings. It is likely that one of the reasons for this lack of realism is that we focus on hearing rather than on the perception of sound. It may even be that the goal of
attaining true realism is impossible due to the extended nature of our perception of sound that is inherent in
all of us.

When MIDI was introduced, companies like Steinberg, C-Lab (now Apple Logic) and Voyetra produced the first sequencers. One of the first issues discovered was that our sensitivity to rhythmic timing was much greater than anticipated, and sequencers were updated to provide greater timing resolution. Similarly, Bösendorfer, developing their first MIDI-controlled concert grand piano, found that seven-bit resolution for key velocity was insufficient to capture the subtle nuances of concert pianists, so they increased it to fourteen-bit resolution.
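MIDI’s seven-bit data bytes allow 128 velocity levels; a fourteen-bit value allows 16,384. The conventional way to carry a fourteen-bit value over MIDI, which the standard itself uses for pitch bend, is to split it across two seven-bit bytes. A sketch of that convention (Bösendorfer’s exact protocol is not documented here):

```python
# A 14-bit value (0..16383) split into two 7-bit MIDI data bytes
# (LSB and MSB), as standard MIDI does for pitch bend.
def to_14bit_bytes(value):
    assert 0 <= value <= 0x3FFF
    lsb = value & 0x7F          # low 7 bits
    msb = (value >> 7) & 0x7F   # high 7 bits
    return lsb, msb

def from_14bit_bytes(lsb, msb):
    return (msb << 7) | lsb

lsb, msb = to_14bit_bytes(12000)
print(lsb, msb)                     # 96 93
print(from_14bit_bytes(lsb, msb))   # round-trips to 12000
```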

The next problem was the discovery that we were very sensitive to the lack of variation in rhythmic timing.
Even with very fine time granularity, programmed music still tended to have a very mechanistic ‘feel’. Michael
Stewart, a successful record producer, invented the Kahler ‘Human Clock’, a device designed to introduce
subtle random variations in the MIDI timing in order to simulate the normal human influence.
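The idea described above can be sketched as a simple function that nudges each quantised note time by a small random offset. The jitter range here is illustrative, not the Human Clock’s actual behaviour:

```python
import random

# 'Humanise' a list of quantised note-on times (in ms) by adding a
# small random offset to each, so the rhythm is no longer mechanically
# exact. max_jitter_ms is an illustrative value, not a measured one.
def humanise(times_ms, max_jitter_ms=4.0, seed=None):
    rng = random.Random(seed)
    return [t + rng.uniform(-max_jitter_ms, max_jitter_ms)
            for t in times_ms]

grid = [0.0, 250.0, 500.0, 750.0]  # strict eighth notes at 120 bpm
print(humanise(grid, seed=1))      # each value within +/- 4 ms of grid
```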

CONCLUSION

Having established that, in general, our perception of sound is underdeveloped, how can we extend that perception in a useful and practical way in order to inform our practice?

There is a lot of material in books and on the internet that can extend our knowledge of what is possible but, at the end of the day, the most valuable research subject is ourselves.

R. Murray Schafer (Schafer, 1977, p. 208) states that the first task of the acoustic designer is to learn how to listen, and describes exercises in what he calls ‘ear cleaning’, the most important of which is learning to respect silence. There are, undoubtedly, many other techniques that we can employ to train ourselves further.

The extreme examples of Daniel Kish and Evelyn Glennie referred to in this paper may seem daunting but, on
reflection, we will discover that, not only are we all capable of seeing with sound and hearing through our
bodies, we are all doing it to a greater or lesser degree already. The crucial task is to understand how and
where we do this and to trust in our ability to extend these capabilities further.

REFERENCES

BBC. (2009, October 5th). BBC News. Retrieved from BBC News:
http://news.bbc.co.uk/1/hi/england/dorset/8291573.stm

Industry, C. M. (n.d.). Seeing with Sound. Retrieved from www.msichicago.org: http://www.msichicago.org/whats-here/exhibits/you/the-exhibit/your-movement/stay-active/seeing-with-sound/

Kish, D. (n.d.). Home Page. Retrieved from World Access for the Blind: http://worldaccessfortheblind.org

Riedelsheimer, T. (Director). (2004). Touch the Sound [Motion Picture].

Roberts, J. (2006). A Sense of the World. New York: Harper Collins.

Schafer, R. M. (1977). The Tuning of the World. New York: Knopf.

Stratton, G. M. (1896). Some preliminary experiments on vision without inversion of the retinal image. Psychological Review, 3(6), 611–617.

University, H. (n.d.). Sight through Sound. Retrieved from YouTube: http://www.youtube.com/watch?v=U_TeRKieD0I
