Neurological Bases
Pitch
For each pitch there is a corresponding part of the tonotopically organized basilar membrane in the inner ear which responds to the sound and sends a signal to the auditory cortex. Sections of cells in the cortex are responsive to certain frequencies, ranging from very low to very high pitches.[1] This organization may not be stable, and the specific cells that are responsive to different pitches may change over days or months.[2]
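The place–frequency mapping implied by this tonotopic organization is commonly approximated by Greenwood's cochlear function. As an illustrative aside (the function and its constants are standard fits for the human cochlea, not part of the studies cited here), a minimal sketch:

```python
import math

def greenwood_frequency(x):
    """Greenwood's human cochlear map: the characteristic frequency (Hz)
    at fractional distance x along the basilar membrane (0 = apex, 1 = base).
    A, a and k are the standard human-fit constants."""
    A, a, k = 165.4, 2.1, 0.88
    return A * (10 ** (a * x) - k)

# Low frequencies map near the apex, high frequencies near the base.
print(round(greenwood_frequency(0.1)))   # roughly 123 Hz
print(round(greenwood_frequency(0.9)))   # well into the kHz range
```

The exponential form means each octave occupies a roughly constant length of membrane, which is consistent with the broad low-to-high frequency gradient described above.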
The primary auditory cortex is one of the main areas associated with superior pitch resolution.
Many neuroimaging studies have found evidence of the importance of right secondary auditory regions in aspects of musical pitch processing, such as melody.[5] Many of these studies, such as one by Patterson, Uppenkamp, Johnsrude and Griffiths (2002), also find evidence of a hierarchy of pitch processing. In an fMRI study, Patterson et al. (2002) used spectrally matched sounds that produced no pitch, fixed pitch or melody, and found that all conditions activated Heschl's gyrus (HG) and the planum temporale (PT). Sounds with pitch activated more of these regions than sounds without. When a melody was produced, activation spread to the superior temporal gyrus (STG) and planum polare (PP). These results support the existence of a pitch-processing hierarchy.
Rhythm
The belt and parabelt areas of the right hemisphere are involved in processing rhythm. When individuals are preparing to tap out a rhythm of regular intervals (1:2 or 1:3), the left frontal cortex, left parietal cortex, and right cerebellum are all activated. With more difficult rhythms, such as 1:2.5, more areas in the cerebral cortex and cerebellum are involved.[6] EEG recordings have also shown a relationship between brain electrical activity and rhythm perception. Snyder and Large (2005) examined rhythm perception in human subjects, finding that activity in the gamma band (20–60 Hz) corresponds to the 'beats' in a simple rhythm. Snyder and Large found two types of gamma activity: evoked gamma activity and induced gamma activity. Evoked gamma activity was found after the onset of each tone in the rhythm; this activity was phase-locked (its peaks and troughs were directly related to the exact onset of the tone) and did not appear when a gap (missed beat) was present in the rhythm. Induced gamma activity, which was not phase-locked, was also found to correspond with each beat. However, induced gamma activity did not subside when a gap was present in the rhythm, indicating that induced gamma activity may serve as a sort of internal metronome independent of auditory input.
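The evoked/induced distinction is, in analysis terms, a matter of operation order: phase-locked activity survives averaging across trials, while non-phase-locked activity cancels out and must instead be measured as averaged single-trial power. A toy simulation of this principle (illustrative only, not Snyder and Large's actual pipeline):

```python
import numpy as np

rng = np.random.default_rng(0)
fs, f = 500, 40                      # sampling rate (Hz), gamma frequency (Hz)
t = np.arange(0, 0.25, 1 / fs)      # one 250 ms epoch around a tone onset
n_trials = 100

# Evoked activity: identical phase on every trial (phase-locked to tone onset).
evoked_trials = np.array([np.sin(2 * np.pi * f * t) for _ in range(n_trials)])
# Induced activity: random phase on every trial (not phase-locked).
induced_trials = np.array(
    [np.sin(2 * np.pi * f * t + rng.uniform(0, 2 * np.pi))
     for _ in range(n_trials)]
)

# Averaging across trials preserves phase-locked (evoked) activity...
print(np.abs(np.mean(evoked_trials, axis=0)).max())   # near 1.0
# ...but largely cancels non-phase-locked (induced) activity,
print(np.abs(np.mean(induced_trials, axis=0)).max())  # near 0.1
# which is why induced gamma is instead quantified as averaged per-trial power.
print(np.mean(induced_trials ** 2))                   # near 0.5
```

The same logic explains why an internal-metronome signal that is not locked to auditory input would show up only in the induced, not the evoked, measure.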
Tonality
The right auditory cortex is primarily involved in perceiving pitch, and in parts of harmony, melody and rhythm.[6] One study by Peter Janata found tonally sensitive areas in the medial prefrontal cortex, the cerebellum, the superior temporal sulci of both hemispheres, and the superior temporal gyri (with a skew towards the right hemisphere).
Similarities
Studies have shown that the human brain has an implicit musical ability.[19][20] Koelsch, Gunter, Friederici and Schoger (2000) investigated the influence of preceding musical context, task relevance of unexpected chords and the degree of probability of violation on music processing in both musicians and non-musicians.[19] Findings showed that the human brain unintentionally extrapolates expectations about impending auditory input. Even in non-musicians, the extrapolated expectations are consistent with music theory. The ability to process information musically supports the idea of an implicit musical ability in the human brain. In a follow-up study, Koelsch, Schroger and Gunter (2002) investigated whether the ERAN and N5 could be evoked preattentively in non-musicians.[20] Findings showed that both the ERAN and N5 can be elicited even in a situation where the musical stimulus is ignored by the listener, indicating that there is a highly differentiated preattentive musicality in the human brain.
Gender Differences
Minor neurological differences with regard to hemispheric processing exist between the brains of males and females. Koelsch, Maess, Grossmann and Friederici (2003) investigated music processing through EEG and ERPs and discovered gender differences.[21] Findings showed that females process music information bilaterally whereas males process music with a right-hemispheric predominance. However, the early negativity of males was also present over the left hemisphere, indicating that males do not exclusively utilize the right hemisphere for musical information processing. In a follow-up study, Koelsch, Grossman, Gunter, Hahne, Schroger and Friederici (2003) found that boys show lateralization of the early anterior negativity in the left hemisphere but found a bilateral effect in girls.[22] This indicates a developmental effect, as the early negativity is lateralized in the right hemisphere in men but in the left hemisphere in boys.
Handedness Differences
It has been found that subjects who are lefthanded, particularly those who are also ambidextrous, perform better than righthanders on short-term memory for pitch.[23][24] It was hypothesized that this handedness advantage is due to lefthanders having more duplication of storage across the two hemispheres than righthanders. Other work has shown that there are pronounced differences between righthanders and lefthanders (on a statistical basis) in how musical patterns are perceived when sounds come from different regions of space. This has been found, for example, in the octave illusion[25][26] and the scale illusion.[27][28]
Musical Imagery
Musical imagery refers to the experience of replaying music by imagining it inside the head.[29] Musicians show a superior ability for musical imagery due to intense musical training.[30] Utilizing magnetoencephalography (MEG), Herholz, Lappe, Knief and Pantev (2008) examined differences in the neural processing of a musical imagery task with familiar melodies in musicians and non-musicians. Specifically, the study examined whether the mismatch negativity (MMN) can be based solely on imagery of sounds. The task involved participants listening to the beginning of a melody, continuing the melody in their heads, and finally hearing a correct or incorrect tone as a further continuation of the melody. The imagery of these melodies was strong enough to obtain an early preattentive brain response to unanticipated violations of the imagined melodies in the musicians. These results indicate that similar neural correlates are relied upon for trained musicians' imagery and perception. Additionally, the findings suggest that modification of the imagery mismatch negativity (iMMN) through intense musical training results in a superior ability for imagery and preattentive processing of music. Perceptual musical processes and musical imagery may share a neural substrate in the brain. A PET study conducted by Zatorre, Halpern, Perry, Meyer and Evans (1996) investigated cerebral blood flow (CBF) changes related to auditory imagery and perceptual tasks.[31] These tasks examined the involvement of particular anatomical regions as well as functional commonalities between perceptual processes and imagery. Similar patterns of CBF changes provided evidence supporting the notion that imagery processes share a substantial neural substrate with related perceptual processes.
Bilateral neural activity in the secondary auditory cortex was associated with both perceiving and imagining songs. This implies that within the secondary auditory cortex, processes underlie the phenomenological impression of imagined sounds. The supplementary motor area (SMA) was active in both imagery and perceptual tasks suggesting covert vocalization as an element of musical imagery. CBF increases in the inferior frontal polar cortex and right thalamus suggest that these regions may be related to retrieval and/or generation of auditory information from memory.
Absolute Pitch
Absolute pitch (AP) is defined as the ability to identify the pitch of a musical tone or to produce a musical tone at a given pitch without the use of an external reference pitch.[32] Neuroscientific research has not discovered a distinct activation pattern common to possessors of AP. Zatorre, Perry, Beckett, Westbury and Evans (1998) examined the neural foundations of AP using functional and structural brain imaging techniques.[33] Positron emission tomography (PET) was utilized to measure cerebral blood flow (CBF) in musicians possessing AP and musicians lacking AP. When presented with musical tones, similar patterns of increased CBF in auditory cortical areas emerged in both groups. AP possessors and non-AP subjects demonstrated similar patterns of left dorsolateral frontal activity when they performed relative pitch judgments. However, in non-AP subjects activation in the right inferior frontal cortex was present, whereas AP possessors showed no such activity. This finding suggests that musicians with AP do not need access to working-memory devices for such tasks. These findings imply that there is no specific regional activation pattern unique to AP. Rather, the availability of specific processing mechanisms and task demands determine the recruited neural areas.
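The labeling task itself is computationally trivial, which underscores that the interesting question about AP is neural rather than algorithmic. A small sketch mapping a frequency to the nearest equal-tempered note name (the A4 = 440 Hz tuning reference is a convention of the sketch, not of the study cited above):

```python
import math

NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def name_pitch(freq_hz):
    """Return the nearest equal-tempered note name for a frequency,
    using the standard MIDI convention (A4 = 440 Hz = note number 69)."""
    midi = round(69 + 12 * math.log2(freq_hz / 440.0))
    return NOTE_NAMES[midi % 12] + str(midi // 12 - 1)

print(name_pitch(440.0))   # A4
print(name_pitch(261.63))  # C4 (middle C)
```

An AP possessor performs this mapping perceptually, with no reference tone; a non-AP listener can typically only judge intervals between tones, which is the relative pitch judgment contrasted in the study.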
Emotion
Emotions induced by music activate similar frontal brain regions to emotions elicited by other stimuli.[34] Schmidt and Trainor (2001) discovered that the valence (i.e. positive vs. negative) of musical segments was distinguished by patterns of frontal EEG activity.[35] Joyful and happy musical segments were associated with increases in left frontal EEG activity, whereas fearful and sad musical segments were associated with increases in right frontal EEG activity. Additionally, the intensity of emotions was differentiated by the pattern of overall frontal EEG activity: overall frontal region activity increased as affective musical stimuli became more intense.[35] Music is able to create an incredibly pleasurable experience that can be described as chills.[36] Blood and Zatorre (2001) used PET to measure changes in cerebral blood flow while participants listened to music that they knew gave them chills or some sort of intensely pleasant emotional response. They found that as these chills increase, many changes in cerebral blood flow are seen in brain regions such as the amygdala, orbitofrontal cortex, ventral striatum, midbrain, and ventral medial prefrontal cortex. Many of these areas appear to be linked to reward, motivation, emotion and arousal, and are also activated in other pleasurable situations.[36] The nucleus accumbens (part of the striatum) is involved both in music-related emotions and in rhythmic timing. When unpleasant melodies are played, the posterior cingulate cortex activates, which indicates a sense of conflict or emotional pain.[6] The right hemisphere has also been found to be correlated with emotion, and can also activate areas in the cingulate in times of emotional pain, specifically social rejection (Eisenberger). This evidence, along with observations, has led many musical theorists, philosophers and neuroscientists to link emotion with tonality.
This seems almost intuitive, because the tones in music resemble the tones in human speech, which indicate emotional content. The vowels in the phonemes of a song are elongated for dramatic effect, and it seems as though musical tones are simply exaggerations of normal verbal tonality.
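Frontal EEG valence effects of the kind Schmidt and Trainor report are commonly quantified with an asymmetry index over homologous electrode pairs. A minimal sketch of one conventional form of that metric, ln(right alpha power) − ln(left alpha power); this is a standard measure in the frontal-asymmetry literature, not necessarily the exact computation used in the study:

```python
import math

def frontal_asymmetry(left_power, right_power):
    """Frontal EEG asymmetry index ln(right) - ln(left), computed from
    alpha-band power at homologous frontal electrodes (e.g. F3/F4).
    Because alpha power is inversely related to cortical activity, a
    POSITIVE value indicates relatively greater LEFT frontal activity,
    the pattern associated with positive valence."""
    return math.log(right_power) - math.log(left_power)

# Hypothetical alpha-power values for a happy musical segment:
print(frontal_asymmetry(left_power=4.0, right_power=6.0) > 0)  # True
```

The log-ratio form makes the index symmetric around zero and insensitive to overall power scaling, which is why it is widely preferred over a raw power difference.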
Memory
Neuropsychology of Musical Memory
Musical memory involves both explicit and implicit memory systems.[37] Explicit musical memory is further differentiated into episodic memory (the where, when and what of the musical experience) and semantic memory (memory for musical knowledge, including facts and emotional concepts). Implicit memory centers on the how of music and involves automatic processes such as procedural memory and motor skill learning; in other words, skills critical for playing an instrument. Samson and Baird (2009) found that the ability of musicians with Alzheimer's disease to play an instrument (implicit procedural memory) may be preserved.
Development
Musical four-year-olds have been found to have greater left-hemisphere intrahemispheric coherence than non-musical ones.[41] Musicians have been found to have more developed anterior portions of the corpus callosum in a study by Cowell et al. in 1992. This was confirmed by a study by Schlaug et al. in 1995, who found that classical musicians between the ages of 21 and 36 have significantly greater anterior corpora callosa than non-musical controls. Schlaug also found a strong correlation between musical exposure before the age of seven and a large increase in the size of the corpus callosum.[41] These fibers join the left and right hemispheres and indicate increased relaying between both sides of the brain. This suggests a merging of the spatial, emotional and tonal processing of the right brain with the linguistic processing of the left brain. This extensive relaying across many different areas of the brain might contribute to music's ability to aid memory function.
Impairment
Focal Hand Dystonia
Focal hand dystonia is a task-related movement disorder associated with occupational activities that require repetitive hand movements.[42] Focal hand dystonia is associated with abnormal processing in the premotor and primary sensorimotor cortices. An fMRI study examined five guitarists with focal hand dystonia.[43] The study reproduced task-specific hand dystonia by having guitarists use a real guitar neck inside the scanner as well as performing a guitar exercise to trigger abnormal hand movement. The dystonic guitarists showed significantly more activation of the contralateral primary sensorimotor cortex as well as a bilateral underactivation of premotor areas. This activation pattern represents abnormal recruitment of the cortical areas involved in motor control. Even in professional musicians, widespread bilateral cortical region involvement is necessary to produce complex hand movements such as scales and arpeggios. The abnormal shift from premotor to primary sensorimotor activation directly correlates with guitar-induced hand dystonia.
Music Agnosia
Music agnosia, an auditory agnosia, is a syndrome of selective impairment in music recognition.[44] Three cases of music agnosia were examined by Dalla Bella and Peretz (1999): C.N., G.L., and I.R. All three of these patients suffered bilateral damage to the auditory cortex which resulted in musical difficulties while speech understanding remained intact. Their impairment is specific to the recognition of once-familiar melodies. They are spared in recognizing environmental sounds and in recognizing lyrics. Peretz (1996) has studied C.N.'s music agnosia further and reports an initial impairment of pitch processing with spared temporal processing.[45] C.N. later recovered in pitch-processing abilities but remained impaired in tune recognition and familiarity judgments. Musical agnosias may be categorized based on the process which is impaired in the individual.[46] Apperceptive music agnosia involves an impairment at the level of perceptual analysis, an inability to encode musical information correctly. Associative music agnosia reflects an impaired representational system which disrupts music recognition. Many of the cases of music agnosia have resulted from surgery involving the middle cerebral artery. Patient studies have amassed a large amount of evidence demonstrating that the left side of the brain is more suitable for holding long-term memory representations of music and that the right side is important for controlling access to these representations. Associative music agnosias tend to be produced by damage to the left hemisphere, while apperceptive music agnosia reflects damage to the right hemisphere.
Congenital Amusia
Congenital amusia, otherwise known as tone deafness, is a term for lifelong musical problems which are not attributable to mental retardation, lack of exposure to music, deafness, or brain damage after birth.[47] Amusic brains have been found in fMRI studies to have less white matter and thicker cortex than controls in the right inferior frontal cortex. These differences suggest abnormal neuronal development in the auditory cortex and inferior frontal gyrus, two areas which are important in musical-pitch processing. Studies on those with amusia suggest that different processes are involved in speech tonality and musical tonality. Congenital amusics lack the ability to distinguish between pitches and so are, for example, unmoved by dissonance or by a wrong key being played on a piano. They also cannot be taught to remember a melody or to recite a song; however, they are still capable of hearing the intonation of speech, for example distinguishing between "You speak French." and "You speak French?" when spoken.
Amygdala damage
Damage to the amygdala causes selective emotional impairments in music recognition. Gosselin, Peretz, Johnsen and Adolphs (2007) studied S.M., a patient with bilateral damage of the amygdala with the rest of the temporal lobe undamaged, and found that S.M. was impaired in the recognition of scary and sad music.[48] S.M.'s perception of happy music was normal, as was her ability to use cues such as tempo to distinguish between happy and sad music. It appears that damage specific to the amygdala can selectively impair recognition of scary music.
Auditory Arrhythmia
Arrhythmia in the auditory modality is defined as a disturbance of rhythmic sense, and includes deficits such as the inability to rhythmically perform music, the inability to keep time to music and the inability to discriminate between or reproduce rhythmic patterns.[50] A study investigating the elements of rhythmic function examined patient H.J., who acquired arrhythmia after sustaining a right temporoparietal infarct.[50] Damage to this region impaired H.J.'s central timing system, which is essentially the basis of his global rhythmic impairment. H.J. was unable to generate steady pulses in a tapping task. These findings suggest that keeping a musical beat relies on functioning in the right temporal auditory cortex.
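Steadiness in a tapping task of the kind H.J. failed is often summarized by the variability of inter-tap intervals. A minimal sketch (the coefficient-of-variation metric is a common convention in the timing literature, and the sample tap times are invented for illustration, not from the study):

```python
import statistics

def tapping_cv(tap_times):
    """Coefficient of variation of inter-tap intervals: a simple index of
    how steadily a subject taps. Higher values mean a less stable pulse."""
    intervals = [b - a for a, b in zip(tap_times, tap_times[1:])]
    return statistics.stdev(intervals) / statistics.mean(intervals)

steady = [0.0, 0.50, 1.01, 1.49, 2.00, 2.51]     # near-isochronous taps
erratic = [0.0, 0.35, 1.10, 1.42, 2.30, 2.55]    # unstable timing
print(tapping_cv(steady) < tapping_cv(erratic))  # True
```

A patient with an impaired central timing system would show an elevated coefficient of variation even at comfortable tapping rates, whereas healthy subjects typically stay within a few percent.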
References
[1] Arlinger, S.; Elberling, C.; Bak, C.; Kofoed, B.; Lebech, J.; Saermark, K. (1982). "Cortical magnetic fields evoked by frequency glides of a continuous tone". EEG & Clinical Neurophysiology 54 (6): 642–653. doi:10.1016/0013-4694(82)90118-3.
[2] Janata, P.; Birk, J.; Van Horn, J.; Leman, M.; Tillmann, B.; Bharucha, J. (2002). "The cortical topography of tonal structures underlying Western music". Science 298 (5601): 2167–70. doi:10.1126/science.1076262. PMID 12481131.
[3] Brattico, E.; Tervaniemi, M.; Naatanen, R.; Peretz, I. (2006). "Musical scale properties are automatically processed in the human auditory cortex". Brain Research 1117 (1): 162–174. doi:10.1016/j.brainres.2006.08.023. PMID 16963000.
[4] Hyde, K. L.; Peretz, I.; Zatorre, R. J. (2008). "Evidence for the role of the right auditory cortex in fine pitch resolution". Neuropsychologia 46 (2): 632–639. doi:10.1016/j.neuropsychologia.2007.09.004. PMID 17959204.
[5] Patterson, R. D.; Uppenkamp, S.; Johnsrude, I. S.; Griffiths, T. D. (2002). "The processing of temporal pitch and melody information in auditory cortex". Neuron 36 (4): 767–776. doi:10.1016/S0896-6273(02)01060-7. PMID 12441063.
[6] Tramo, M. J. (2001). "Biology and music. Music of the hemispheres". Science 291 (5501): 54–56. doi:10.1126/science.1056899. PMID 11192009.
[7] Brown, S.; Martinez, M. J.; Parsons, L. M. (2006). "Music and language side by side in the brain: a PET study of the generation of melodies and sentences". European Journal of Neuroscience 23 (10): 2791–2803. doi:10.1111/j.1460-9568.2006.04785.x. PMID 16817882.
[8] Jentschke, S.; Koelsch, S.; Sallat, S.; Friederici, A. D. (2008). "Children with specific language impairment also show impairment of music-syntactic processing". Journal of Cognitive Neuroscience 20 (11): 1940–1951. doi:10.1162/jocn.2008.20135. PMID 18416683.
[9] Stewart, L.; Walsh, V.; Frith, U.; Rothwell, J. (2001). "Transcranial magnetic stimulation produces speech arrest but not song arrest". Annals of the New York Academy of Sciences 930: 433–35. doi:10.1111/j.1749-6632.2001.tb05762.x.
[10] Koelsch, S.; Gunter, T.; Cramon, D.; Zysset, S.; Lohmann, G.; Friederici, A. (2002). "Bach Speaks: A Cortical Language-Network Serves the Processing of Music". NeuroImage 17 (2): 956–966. doi:10.1006/nimg.2002.1154. PMID 12377169.
[11] Brown, S.; Martinez, M.; Parsons, L. (2006). "Music and language side by side in the brain: a PET study of the generation of melodies and sentences". European Journal of Neuroscience 23 (10): 2791–2803. doi:10.1111/j.1460-9568.2006.04785.x. PMID 16817882.
[12] Daltrozzo, J.; Schön, D. (2009). "Conceptual processing in music as revealed by N400 effects on words and musical targets". Journal of Cognitive Neuroscience 21 (10): 1882–1892. PDF Document (http://daltrozzo.net78.net/papers/Daltrozzo_Schon_2009a.pdf)
[13] Deutsch, D.; Henthorn, T.; Marvin, E.; Xu, H.-S. (2006). "Absolute pitch among American and Chinese conservatory students: Prevalence differences, and evidence for a speech-related critical period". Journal of the Acoustical Society of America 119 (2): 719–722. doi:10.1121/1.2151799. PMID 16521731. PDF Document (http://philomel.com/pdf/JASA-2006_119_719-722.pdf)
[14] Deutsch, D.; Dooley, K.; Henthorn, T.; Head, B. (2009). "Absolute pitch among students in an American music conservatory: Association with tone language fluency". Journal of the Acoustical Society of America 125 (4): 2398–2403. doi:10.1121/1.3081389. PMID 19354413. PDF Document (http://philomel.com/pdf/JASA-2009_125_2398-2403.pdf)
[15] Gaser, C.; Schlaug, G. (2003). "Brain structures differ between musicians and non-musicians". The Journal of Neuroscience 23 (27): 9240–9245. PMID 14534258.
[16] Croom, A. M. (2012). "Music, neuroscience, and the psychology of wellbeing: a précis". Frontiers in Theoretical and Philosophical Psychology 2 (393): 1–15. doi:10.3389/fpsyg.2011.00393.
[17] Krings, T.; Topper, R.; Foltys, H.; Erberich, S.; Sparing, R.; Willmes, K.; Thron, A. (1999). "Cortical activation patterns during complex motor tasks in piano players and control subjects. A functional magnetic resonance imaging study". Neuroscience Letters 278 (3): 189–193. doi:10.1016/S0304-3940(99)00930-1. PMID 10653025.
[18] Koeneke, S.; Lutz, K.; Wustenberg, T.; Jancke, L. (2004). "Long-term training affects cerebellar processing in skilled keyboard players". Neuroreport 15 (8): 1279–1282. doi:10.1097/01.wnr.0000127463.10147.e7. PMID 15167549.
[19] Koelsch, S.; Gunter, T.; Friederici, A. D.; Schoger, E. (2000). "Brain indices of music processing: "nonmusicians" are musical". Journal of Cognitive Neuroscience 12 (3): 520–541. doi:10.1162/089892900562183. PMID 10931776.
[20] Koelsch, S.; Schroger, E.; Gunter, T. (2002). "Music matters: preattentive musicality of the human brain". Psychophysiology 39 (1): 38–48. doi:10.1111/1469-8986.3910038. PMID 12206294.
[21] Koelsch, S.; Maess, B.; Grossmann, T.; Friederici, A. D. (2003). "Electric brain responses reveal gender differences in music processing". Neuroreport 14 (5): 709–713. doi:10.1097/00001756-200304150-00010. PMID 12692468.
[22] Koelsch, S.; Grossmann, T.; Gunter, T. C.; Hahne, A.; Schroger, E.; Friederici, A. D. (2003). "Children processing music: electric brain responses reveal musical competence and gender differences". Journal of Cognitive Neuroscience 15 (5): 683–693. doi:10.1162/jocn.2003.15.5.683. PMID 12965042.
[23] Deutsch, D. (1978). "Pitch memory: An advantage for the lefthanded". Science 199 (4328): 559–560. doi:10.1126/science.622558. PMID 622558. PDF Document (http://philomel.com/pdf/Science-1978_199_559-560.pdf)
[24] Deutsch, D. (1980). "Handedness and memory for tonal pitch". In J. Herron (Ed.), Neuropsychology of Lefthandedness: 263–271. PDF Document (http://philomel.com/pdf/Ch-Handedness-1980.pdf)
[25] Deutsch, D. (1974). "An auditory illusion". Nature 251 (5473): 307–309. doi:10.1038/251307a0. PMID 4427654. PDF Document (http://philomel.com/pdf/Nature-1974_251_307-309.pdf)
Further reading
The Neurosciences and Music II: From Perception to Performance. New York, N.Y.: New York Academy of Sciences. 2005. ISBN 1573316105.
External links
Croom, Adam M. (2012). "Music, Neuroscience, and the Psychology of Wellbeing: A Précis". Frontiers in Theoretical and Philosophical Psychology, 2, 393, 1–15 (http://www.frontiersin.org/theoretical_and_philosophical_psychology/10.3389/fpsyg.2011.00393/abstract).
Deutsch, Diana (July 29, 2010). "Speaking in Tones: Music and Language Partner in the Brain" (http://www.scientificamerican.com/article.cfm?id=speaking-in-tones-jul10). Scientific American Mind.
Weinberger, Norman M. (October 25, 2004). "Music and the Brain" (http://www.sciam.com/article.cfm?chanID=sa006&articleID=0007D716-71A1-1179-AF8683414B7F0000). Scientific American.
Connection Science (2009). Special Issue "Music, Brain, & Cognition" 21 (2–3). (http://prod.informaworld.com/smpp/title~db=all~content=g911586331)
License
Creative Commons Attribution-Share Alike 3.0 Unported (http://creativecommons.org/licenses/by-sa/3.0/)