
ABSTRACT

SUPRAMODAL REPRESENTATION OF OBJECTS AND ACTIONS IN THE HUMAN
INFERIOR TEMPORAL AND VENTRAL PREMOTOR CORTEX
Ferdinand Binkofski1,2, Giovanni Buccino3, Karl Zilles4 and Gereon R. Fink4,5
(1Dept. of Neurology, University Hospital Schleswig-Holstein, Campus Lübeck, Germany; 2Neuroimaging Center
Hamburg, Kiel, Lübeck, Germany; 3Institute of Physiology, University of Parma, Italy; 4Institute of Medicine, Research
Center Jülich, Germany; 5Dept. of Neurology, RWTH Aachen, Germany)

Cortex, (2004) 40, 159-161

INTRODUCTION
Objects can be recognized not only by direct
vision, but also by tactile exploration or by a
characteristic sound they produce. Although the
modalities of object presentation (and therefore
also the primary stages of information processing)
may differ, one can expect that supramodal neural
processing related to object recognition also exists
(Mesulam, 1998). When the perceived objects are
tools, their perception may also trigger neural
processes related to potential actions. Again, these
processes are likely to occur independently of the
modality in which the objects are presented (tactile,
acoustic, visual).
The first goal of the present fMRI study was to
identify the supramodal neural mechanisms that
allow for modality independent object recognition.
Since an area around the fusiform gyrus, mainly
the lateral occipital complex (LOC, Malach et al.,
1995), has been recognised as the structure
involved in object feature processing for object
recognition (Kanwisher et al., 1997; Grill-Spector
et al., 1998) and lesions of the inferotemporal
cortex are associated with deficits in object
recognition (Damasio et al., 2000), we
hypothesized that a putative supramodal object
recognition area might be localized in this same
region. In addition, when objects are tools we
expected to identify the cerebral structures engaged
in triggering object related actions.
METHODS
In the first experiment, nine healthy male
subjects (aged 22-38) were asked: (i) to look at
objects presented on a screen (C1a), (ii) to
manipulate objects that were placed into their
hands by one of the experimenters (C1b), (iii) to
listen to a characteristic sound made by a specific
object (C1c). Between conditions, modality related
specific baseline conditions (to view an indifferent
screen, to hold an indifferent sphere, and to listen
to white noise, respectively) were implemented. In
the second experiment, eight healthy male subjects
(aged 21-32) were asked to perform characteristic
pantomimic hand movements related to tools with
the right hand. As in experiment 1, the tools were
presented (i) visually (C2a), (ii) tactilely (C2b), or
(iii) acoustically (C2c). In addition, in a fourth
condition the tools were presented (iv) in all three
modalities simultaneously (i.e., subjects saw the
object on a screen, touched it and heard the
object's characteristic sound; C2d). The same
baseline conditions as in the first experiment were
used. Twenty-four different objects were presented
in each modality.
fMRI measurements were performed on a
SIEMENS VISION 1.5 T scanner utilising standard
EPI sequences. Block designs were used with a
pseudo-randomised order of conditions. Data
analysis was performed using SPM99
(www.fil.ion.ucl.ac.uk/spm) with a standard box-car
reference function. Categorical comparisons
between the conditions were employed at a
significance level of p < 0.05, corrected for
multiple comparisons, with an extent threshold of
10 voxels.
RESULTS
Experiment 1
Unimodal object recognition
All objects were recognized by all subjects in
each modality. Significant modality-associated (i.e.,
visual, tactile, auditory) activations in the
respective primary, secondary, and associated
higher order cortices were observed in all
stimulus conditions relative to the modality-specific
baselines.
Visual object recognition (C1a) activated the
visual cortex bilaterally and bilateral extrastriate
areas with two distinct local maxima in the
posterior and lateral part of the fusiform gyrus on
the left and only one local maximum in the
posterior fusiform gyrus on the right side.
Auditory object recognition (C1c) activated the
primary auditory cortex bilaterally, an area in the
right mid-temporal gyrus, the posterior fusiform
gyrus bilaterally and the fronto-opercular cortex
bilaterally.

Tactile object recognition (C1b) activated the
sensorimotor cortex bilaterally, the secondary
somatosensory cortex bilaterally, the right
cerebellar hemisphere and the vermis cerebelli.
Additionally, the right fronto-opercular region, the
supplementary motor area and the left posterior
fusiform gyrus were activated.

Neural activity common to all three modalities

A conjunction analysis (Price and Friston, 1997)
revealed significant activation common to all three
modalities (i.e., vision, somatosensation and
audition) when conveying object recognition
(relative to the respective modality-specific
baseline conditions) that was most prominent
around the bilateral posterior fusiform gyrus
(Talairach coordinates: x = 48, y = -64, z = -20;
x = -48, y = -64, z = -20). Significant activations
were also found in the fronto-opercular regions
bilaterally (Fig. 1A).

Fig. 1 Projection onto the MNI template of A) the results of the conjunction analysis between the visual, tactile and acoustic object
recognition conditions (Experiment 1) and B) the subtraction of the sum of the unimodal object presentation conditions from the
pantomimes related to objects presented in all three modalities (acoustic, visual, tactile) simultaneously (Experiment 2).

Experiment 2

Polymodal pantomimic representations of objects

The performance of pantomimes related to
objects (tools) presented in three different
modalities (visual, C2a; tactile, C2b; acoustic,
C2c) elicited neural activations in practically the
same modality-related areas as in the previous
experiment, plus additional modality-independent
activation of the primary sensorimotor cortex, the
supplementary motor area and the cerebellum.

Neural activation related to polymodal vs. unimodal
coding of object (tool) related actions [C2d >
(C2a + C2b + C2c)]

The comparison of the neural activity related to
the performance of pantomimes related to objects
(tools) presented in all three modalities
simultaneously vs. the sum of the neural activities
elicited by movements related to object (tool)
presentation in each individual modality revealed
significant differential activation of the ventral
premotor cortex bilaterally (Fig. 1B). No further
differential activations were observed.

CONCLUSIONS
The data indicate that the posterior fusiform
gyrus contains a supramodal object recognition
area, which is probably part of LOC (Malach et al.,
1995). This area seems to allow for object
recognition irrespective of the modality in which
the objects are presented. Recent neuroimaging studies
have already demonstrated that the inferior
temporal cortex participates in object recognition
processes in modalities other than vision (Amedi et
al., 2001; De Volder et al., 2001). Since all objects
used in the first study were potential tools (ball,
scissors, hammer, etc.), the premotor activations
observed in the first experiment might be related to
a supramodal pragmatic coding of potential actions
on these objects. Recently, neurons coding actions
in different modalities have been found in the
ventral premotor cortex of the macaque (Kohler et
al., 2002). This new finding, together with the
results of the second experiment, supports the idea of
supramodal coding of actions in the human
premotor cortex.
Acknowledgments. The study was supported by the
Deutsche Forschungsgemeinschaft (DFG).
REFERENCES
AMEDI A, MALACH R, HENDLER T, PELED S and ZOHARY E. Visuo-haptic object-related activation in the ventral visual pathway.
Nature Neuroscience, 4: 324-330, 2001.
DAMASIO AR, TRANEL D and RIZZO M. Disorders of complex
visual processing. In M-M Mesulam (Ed) Principles of
Cognitive and Behavioral Neurology, Oxford University
Press, 2000, pp. 332-372.
DE VOLDER A, TOYAMA H, KIMURA Y, KIYOSAWA M, NAKANO H,
VANLIERDE A, WANET-DEFALQUE MC, MISHINA M, ODA K,
ISHIWATA K and SENDA M. Auditory triggered mental imagery
of shape involves visual association areas in early blind
humans. Neuroimage, 14: 129-139, 2001.
GRILL-SPECTOR K, KUSHNIR T, EDELMAN S, AVIDAN G, ITZCHAK Y
and MALACH R. Cue invariant activation in object-related
areas of the human occipital lobe. Neuron, 21: 191-202, 1998.
KANWISHER N, WOODS RP, IACOBONI M and MAZZIOTTA JC. A locus
in human extrastriate cortex for visual shape analysis. Journal
of Cognitive Neuroscience, 9: 133-142, 1997.
KOHLER E, KEYSERS C, UMILTÀ MA, FOGASSI L, GALLESE V and
RIZZOLATTI G. Hearing sounds, understanding actions: Action
representation in mirror neurons. Science, 297: 846-848,
2002.
MALACH R, REPPAS JB, BENSON RR, KWONG KK, JIANG H,
KENNEDY WA, LEDDEN PJ, BRADY TJ, ROSEN BR and
TOOTELL RBH. Object-related activity revealed by functional
magnetic resonance imaging in human occipital cortex.
Proceedings of the National Academy of Sciences, USA, 92:
8135-8139, 1995.
MESULAM M-M. From sensation to cognition. Brain, 121: 1013-1052, 1998.
PRICE CJ and FRISTON KJ. Cognitive conjunction: a new approach
to brain activation experiments. Neuroimage, 5: 261-270,
1997.
TALAIRACH J and TOURNOUX P. Co-planar Stereotaxic Atlas of the
Human Brain. New York: Thieme, 1988.
Ferdinand Binkofski, Department of Neurology, University Hospital Schleswig-Holstein, Campus Lübeck, Ratzeburger Allee 160, 23538 Lübeck, Germany. E-mail:
binkofski.f@neuro.mu-luebeck.de
