
Book Reviews

Computational lexical semantics
Patrick Saint-Dizier and Evelyne Viegas (editors)
(Institut de Recherche en Informatique de Toulouse, CNRS and Brandeis University)
Cambridge University Press (Studies in natural language processing, edited by Branimir K. Boguraev), 1995, ix + 447 pp.
Hardbound, ISBN 0-521-44410-1, $69.95
Reviewed by
Paul Deane
Dataware Technologies
1. Overview
The last decade has seen a striking expansion of interest in the lexicon and in lexical
semantics. Within theoretical linguistics, this trend can be measured by the increasing
interest generative grammarians have displayed toward such issues as lexical conceptual
structure and argument structure and by the increased appeal of cognitive
semantics. Within computational linguistics, the same period has seen a burgeoning
of interest in the construction of semantically realistic lexicons and their integration
with larger Natural Language Processing (NLP) systems. Computational lexical semantics
supplies a fascinating snapshot of the state of the art in the early 1990s. The articles
date from late 1991 or early 1992, reflecting the immediate impact of Pustejovsky's theory
of the generative lexicon (1991), but lacking references to George Miller's WordNet
system (1990). The book is organized into six sections:
I. Psycholinguistics for lexical semantics;
II. Foundational issues in lexical semantics;
III. Lexical databases;
IV. Lexical semantics and artificial intelligence;
V. Applications;
VI. Computer models for lexical semantics.
The book provides wide coverage. A variety of problems are addressed, though the
focus is upon polysemy, disambiguation, and co-composition; that is, processes by
which word meanings are dynamically derived and contextually modulated. There
are gaps in coverage, however, which appear to reflect the early state of work in the
field. Polysemy generally falls into one of three categories: metonymy (associations
between concepts in the same domain), metaphor (mappings across conceptual domains),
and various patterns of semantic variation that are often lumped together and
labeled as prototype effects. But this book (and, it would appear, computational approaches
generally) concentrates almost exclusively on explicating metonymy (which
is, in fact, the least problematic of the three and thus the easiest to formalize). Similarly,
the articles in the book are strongest when dealing with logical aspects of meaning.
Very little attention is devoted to dealing with less easily formalized aspects of lexical
knowledge, such as the role of attention and frequency of use in disambiguation processes.
These, of course, are not criticisms of the book, but a reflection of its focus on
problems that seem tractable using well-understood tools and techniques.
2. Contents of the book
The articles may be divided under headings somewhat different from those used by
the editors. While the editors organized the information by field and task, the actual
research is heavily weighted toward a few key issues, in particular polysemy and
lexical disambiguation.
General articles. 'An introduction to lexical semantics from a linguistic and a psycholinguistic
perspective', by Patrick Saint-Dizier and Evelyne Viegas; 'Polysemy and related
phenomena from a cognitive linguistic viewpoint', by D. A. Cruse; 'Mental lexicon and
machine lexicon: Which properties are shared by machine and mental word representations?
Which are not?' by Jean-François Le Ny.
Saint-Dizier and Viegas provide a brief review of lexical semantics covering classic
concepts in several key frameworks, including Jackendoff's lexical conceptual structure,
Pustejovsky's generative lexicon, and Mel'čuk's Explanatory Combinatorial Dictionary
(1988). Cruse focuses on providing a linguistic explication of the complexities
that make polysemy difficult to handle, including its partial productivity, lack of clear
boundaries, and context-sensitivity. Le Ny provides a useful checklist of the properties
of the mental lexicon that are and are not paralleled in current NLP lexicons. Critically,
Le Ny notes, such basic psycholinguistic properties as activation (and the related
properties of activability and pre-activation) are not built into most NLP models of
the lexicon. The concerns raised by Le Ny and Cruse are very important, particularly
since they do not yet seem to have been fully addressed in current NLP work.
Polysemy in general. 'Word meaning between lexical and conceptual structure', by Peter
Gerstl; 'Lexical semantics and terminological knowledge representation', by Gerrit
Burkert; 'A preliminary lexical and conceptual analysis of break: A computational perspective',
by Martha Palmer and Alain Polguère. Of closely related interest: 'Inheriting
polysemy', by Adam Kilgarriff.
Computational lexical semantics implies the construction of very large lexical
databases that go beyond traditional NLP lexicons (or most AI knowledge bases, for
that matter) by supporting highly flexible, dynamic interpretative processing. Gerstl's
article explores what information is required to support dynamic lexical interpretation.
After a detailed literature review, he argues programmatically for an analysis in
which word meanings are composed of a set of interacting 'factors', some of which constrain
and others of which expand the potential for interpretation. Burkert's (equally
programmatic) article focuses on aspects of lexical meaning amenable to traditional
knowledge representation formalisms (i.e., term subsumption languages). By contrast,
Palmer and Polguère's article is extremely data-oriented. They focus upon the word
break, arguing that it is best analyzed as a hierarchy of sense components, each of
which entails specific constraints on the word's overall syntactic and semantic structure.
Adam Kilgarriff's article is extremely interesting because it shows that at least
some types of variation in word meaning can be stored in a semantic net and inherited.
It should be noted, however, that Kilgarriff's article focuses on metonymy:
other types of polysemy are left largely unaddressed. This emphasis on accounting
for metonymy is typical of the current state of NLP work on polysemy, as may be
seen in the articles discussed below.
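As a toy illustration of the kind of inheritance involved (a sketch of the general idea only, not Kilgarriff's formalism), one might state a metonymic alternation once on a general node of a semantic net and let more specific nodes inherit it; the node names and alternation labels below are invented:

```python
# Hypothetical sketch: a metonymic sense alternation (animal -> meat) is stated
# once on the 'animal' node and inherited down the ISA chain, so specific words
# acquire both readings without separate lexical stipulation.

class Node:
    def __init__(self, name, parent=None, alternations=()):
        self.name = name
        self.parent = parent
        self.local_alternations = list(alternations)

    def alternations(self):
        """Collect sense alternations from this node and all of its ancestors."""
        inherited = self.parent.alternations() if self.parent else []
        return self.local_alternations + inherited

    def senses(self):
        """Base sense plus any readings licensed by inherited alternations."""
        return [f"{self.name} (count noun: the individual)"] + [
            f"{self.name} ({alt})" for alt in self.alternations()
        ]

animal  = Node("animal", alternations=["mass noun: meat of the animal"])
chicken = Node("chicken", parent=animal)
lamb    = Node("lamb", parent=animal)

if __name__ == "__main__":
    print(chicken.senses())
    # ['chicken (count noun: the individual)', 'chicken (mass noun: meat of the animal)']
    print(lamb.senses())
```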
Metonymy, the generative lexicon, and related issues. 'Linguistic constraints on type coercion',
by James Pustejovsky; 'From lexical semantics to text analysis', by Sabine Bergler;
'Lexical functions, generative lexicons, and the world', by Dirk Heylen; 'Semantic features
in a generic lexicon', by Gabriel G. Bès and Alain LeComte. A closely related
application of generative lexicon theory: 'The representation of group denoting nouns
in a lexical knowledge base', by Ann Copestake. Of related interest: 'A lexical semantic
solution to the divergence problem in machine translation', by Bonnie J. Dorr.
Much of the semantic variability of natural language is due to the influence of
logical metonymy, whereby word meanings shift (following conceptually natural associations)
during the process of semantic composition. Pustejovsky's theory of the generative
lexicon integrates a linguistically motivated view of metonymy with a rigorous
formal semantics through the mechanism of type coercion, making the lexicon a
generative device capable of deriving contextually appropriate meanings 'on
the fly'. Pustejovsky's article outlines the basic theory; Sabine Bergler discusses its applications
to text comprehension. Dirk Heylen draws out the close parallels between Pustejovsky's
generative lexicon and Mel'čuk's Explanatory Combinatorial Dictionary. Bès and LeComte
seek to define a metalanguage for describing proposals about word meaning, proposing
mechanisms parallel to Pustejovsky's, though purposely more generic.
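As a rough illustration of the general idea (only a sketch, not Pustejovsky's formal system), the fragment below shows how a qualia-style entry might let an entity-denoting noun supply an event reading when an event-selecting verb demands one; the entry fields, verb list, and example words are all hypothetical:

```python
# Minimal, hypothetical sketch of qualia-based type coercion: when a verb that
# selects for an event combines with an entity-denoting noun, an event reading
# is recovered from the noun's telic role (its purpose).

QUALIA = {
    # telic = purpose of the object; agentive = how it comes into being
    "book": {"type": "entity", "telic": "read", "agentive": "write"},
    "cake": {"type": "entity", "telic": "eat",  "agentive": "bake"},
}

EVENT_SELECTING_VERBS = {"begin", "finish", "enjoy"}

def coerce(verb: str, noun: str) -> str:
    """Return a paraphrase of 'verb the noun', coercing the noun to an
    event reading when the verb selects for an event."""
    entry = QUALIA.get(noun)
    if verb in EVENT_SELECTING_VERBS and entry and entry["type"] == "entity":
        # Type mismatch: the verb wants an event, the noun denotes an entity.
        # Recover an event from the noun's telic quale.
        return f"{verb} {entry['telic']}ing the {noun}"
    return f"{verb} the {noun}"

if __name__ == "__main__":
    print(coerce("begin", "book"))  # -> "begin reading the book"
    print(coerce("enjoy", "cake"))  # -> "enjoy eating the cake"
    print(coerce("drop", "book"))   # -> "drop the book" (no coercion needed)
```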
Copestake's article is quite interesting, as it has a strong theoretical base but addresses
many of the practical problems involved in constructing a large lexical semantic
database. Copestake's system implements a version of Pustejovsky's generative
lexicon in a unification-based lexical knowledge representation. She develops a type-coercion
analysis of the polysemy and related syntactic properties of group nouns,
and illustrates how the relevant lexical entries can be automatically acquired from a
machine-readable dictionary. But, as might be expected, there are significant difficulties,
including problems in defining default inheritance relationships and the absence
of fully consistent cues for group noun membership in the underlying dictionary data.
Dorr's article focuses on machine translation but shares with the articles listed
above a deep concern for providing an adequate linguistic underpinning for NLP
work. She argues that the use of Jackendoff's lexical conceptual structures provides a
useful solution to the problem of divergence in machine translation. The argument is
simple: since lexical conceptual structures (a) are closely tied to syntactic structure, and
(b) are nonetheless deep semantic representations, they are well suited to provide an
interlingua. Independently needed, language-specific mapping mechanisms can then
be exploited to account for divergent syntactic expressions of the common
interlingua.
Theoretically motivated work using Jackendoff's conceptual semantics and/or
Pustejovsky's generative lexicon forms the cutting edge of current NLP work. The articles
listed above provide a good picture of early advances along this line of research.
It may be too early, however, to judge how far the generative lexicon approach can be
taken, until large-scale lexicons based upon its principles have been constructed. But
there can be no doubt that these approaches mark a significant advance over earlier
computational models of the lexicon.
Disambiguation, defaults, and logical approaches to NLP. 'Large neural networks for the resolution
of lexical ambiguity', by Jean Véronis and Nancy Ide; 'Blocking', by Ted Briscoe,
Ann Copestake, and Alex Lascarides; 'A non-monotonic approach to lexical semantics',
by Daniel Kayser and Hocine Abir. Of related interest: 'Introducing Lexlog', by
Jacques Jayez; 'Constraint propagation techniques for lexical semantics descriptions',
by Patrick Saint-Dizier.
Word-sense disambiguation is a fundamental and potentially intractable problem.
Véronis and Ide argue that it can be accomplished by a neural network without prior
linguistic analysis. They construct a network in which words are associated with the
words in their (machine-readable dictionary) definitions. On the data sets upon which
they report, they achieve excellent disambiguation, but their method would seem
to have severe limitations: it is not at all clear that it would scale up well, both for
computational reasons (the size of the required neural network) and for theoretical reasons
(the potential for interference from crosstalk among definitions as the number of words
in the network increases).
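Very roughly, the flavor of the approach as described here can be conveyed with a toy spreading-activation network built from dictionary definitions (an illustrative sketch only, not Véronis and Ide's network; the mini-definitions are invented):

```python
# Toy spreading-activation sketch of the word/definition network idea: sense
# nodes link to the words of their (invented) mini-definitions; activation
# spreading from the context words settles on one sense of the ambiguous word.

from collections import defaultdict

TOY_SENSES = {
    # sense node -> words of its mini-definition
    "bank/finance": {"institution", "money", "loan", "deposit"},
    "bank/river":   {"sloping", "land", "river", "water"},
}

def disambiguate(context_words, iterations=3, decay=0.5):
    """Spread activation from context words into sense nodes and back,
    then return the most strongly activated sense."""
    activation = defaultdict(float)
    for w in context_words:          # clamp the observed context words on
        activation[w] = 1.0
    for _ in range(iterations):
        new = defaultdict(float, activation)
        for sense, definition in TOY_SENSES.items():
            # a sense receives activation from its definition words ...
            incoming = sum(activation[w] for w in definition)
            new[sense] += decay * incoming
            # ... and feeds some back, reinforcing the winning cluster
            for w in definition:
                new[w] += decay * activation[sense]
        activation = new
    return max(TOY_SENSES, key=lambda s: activation[s])

if __name__ == "__main__":
    print(disambiguate({"deposit", "money", "account"}))  # -> bank/finance
    print(disambiguate({"river", "water", "fishing"}))    # -> bank/river
```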
Default logic is an obvious alternative to connectionist techniques of lexical disambiguation.
Briscoe et al.'s article deals with the important issue of lexical blocking: the
prevention of one form or meaning from occurring because a competing element already
exists. In principle, the resolution principles that Briscoe et al. employ apply
equally well to formal and semantic ambiguity. Kayser and Abir argue that disambiguation
can be formally modeled by using default logic to set preferences for one
meaning over another. The articles by Jayez and Saint-Dizier concern implementations
of logic programming-based NLP systems; while these are not directly concerned with default
logic, many of the same technical issues arise, such as the propagation of inherited
information and the resolution of multiple constraints in a complex knowledge base.
These articles address one of the thorniest problem types in lexical semantics.
However, default logic has an apparent weakness when applied to language: its failure
to account for the effects of analogy, habituation, and other essentially cognitive
factors. For example, Kayser and Abir are forced to postulate a 'strength' factor (in
effect, an impressionistic measure of relative psychological dominance, or capacity to
attract attention in a neutral context) to induce the default logic to choose a single
(likely) word meaning from the choices that remain after clearly inappropriate meanings
have been eliminated. It remains to be seen whether an account can be developed
that maintains the strengths of a formal logic while incorporating realistic theories of
dominance, attention, and other extra-logical aspects of human cognition.
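To make the role of such a 'strength' factor concrete, here is a small hypothetical sketch: hard contextual constraints first eliminate incompatible senses, and a numeric strength then picks a default among the survivors (the sense inventory, features, and weights are invented, not Kayser and Abir's system):

```python
# Hypothetical sketch of default-style sense selection with a 'strength' factor:
# hard constraints prune senses incompatible with the context, and the
# highest-strength surviving sense is chosen as the default reading.

from dataclasses import dataclass

@dataclass
class Sense:
    label: str
    incompatible_with: frozenset  # context features that rule the sense out
    strength: float               # impressionistic dominance score

SENSES_OF_BAT = [
    Sense("bat/animal",    frozenset({"sports"}), strength=0.4),
    Sense("bat/implement", frozenset({"caves"}),  strength=0.6),
]

def choose_sense(senses, context_features):
    """Eliminate senses ruled out by the context, then pick the strongest
    remaining sense as the default reading."""
    surviving = [s for s in senses
                 if not (s.incompatible_with & context_features)]
    if not surviving:
        return None
    return max(surviving, key=lambda s: s.strength)

if __name__ == "__main__":
    # Neutral context: nothing is eliminated, so strength decides.
    print(choose_sense(SENSES_OF_BAT, set()).label)       # -> bat/implement
    # 'caves' rules out the implement sense, leaving the animal reading.
    print(choose_sense(SENSES_OF_BAT, {"caves"}).label)   # -> bat/animal
```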
Structural analysis. 'Lexical semantics: Dictionary or encyclopedia', by Marc Cavazza
and Pierre Zweigenbaum; 'Lexical functions of the Explanatory Combinatorial Dictionary
for lexicalization in text generation', by Margarita Alonso Ramos, Agnès Tutin,
and Guy Lapalme.
Cavazza and Zweigenbaum describe a classically structuralist technique for analyzing
word meaning into contrasting semantic components. Since they apply the
technique to medical texts, where the key words tend to be highly terminologized, the
results are quite good. Keeping a strict separation between domain prototypes and
word definitions, they focus on demonstrating how an NLP system can infer prototypes
from partial, phrasal descriptions. Alonso Ramos et al. describe the use of Mel'čuk's
lexical functions to generate anaphor matches, paraphrases, and the like in a text
generation system.
3. Conclusions
One of the striking facts about this book is the way in which it illustrates a general
trend: the growing recognition (throughout linguistics and its allied fields) of the im-