BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:An efficiency theory of complexity and related phenomena - Profess
 or John Hawkins\, University of Cambridge\, RCEAL and University of Califo
 rnia\, Davis.
DTSTART:20071127T160000Z
DTEND:20071127T173000Z
UID:TALK8364@talks.cam.ac.uk
CONTACT:Napoleon Katsos
DESCRIPTION:In this talk I discuss the relationship between efficiency and
  complexity in language structure and language use.  The notion of complex
 ity has received more attention than efficiency and there have been seriou
 s attempts to define metrics for it.  I argue that these metrics are curre
 ntly of limited value.  They do not succeed in defining the structural pre
 ferences of language use and of grammars that they are designed to predict
 .  Instead I argue that efficiency is the primary concept that we should b
 e defining\, I outline some of the different ways in which efficiency is a
 chieved\, and I link degrees of complexity in performance and grammars to 
 this larger theory of efficiency.\n\nTheories of linguistic complexity sha
 re the guiding intuition that "more [structural units/rules/representation
 s] means more complexity".  This intuition has proved hard to define.  See
 \, for example\, the lively debate in the papers of Linguistic Typology (2
 001) Vol.5-2/3 responding to McWhorter's (2001) analysis of creoles as "th
 e ... simplest" linguistic systems.  The discussion went to the heart of t
 he fundamental questions: what exactly is complexity\, and how do we defin
 e it?\n\nSome problems include:\nTrade-offs:  simplicity in one part of th
 e grammar often results in complexity in another\;  several illustrations 
 will be given.\n\nOverall complexity:  the trade-offs make it difficult to
  give an overall assessment of complexity\, resulting in unresolvable deba
 tes over whether some grammars are more complex than others\, when there i
 s no clear metric of overall complexity for deciding the matter. \n\nDefin
 ing grammatical properties:  the (smaller) structural units of grammars ar
 e often clearly definable\, but the rules and representations are anything
 but\, and theories differ over whether they assume "simplicity" in their su
 rface syntactic structures (see e.g. Culicover & Jackendoff 2005)\, or in 
 derivational principles (as in Chomsky 1995)\, making quantification of co
 mplexity difficult in the absence of agreement over what to quantify.\n\nD
 efining complexity itself:  should our definition be stated in terms of ru
 les or principles that generate the structures of each grammatical area (i
 .e. in terms of the "length" of the description or grammar\, as discussed 
 most recently and extensively in Dahl 2004)\, or in terms of the structure
 s themselves (the outputs of the grammar)?  Definitions of this latter kin
 d are inherent in metrics such as Miller & Chomsky's (1963)\, which uses non-term
 inal to terminal node ratios\, and in Frazier's (1985)\, Hawkins' (1994/20
 04) and Gibson's (1998) metrics.  Do these rule-based and structure-based 
 definitions give the same complexity ranking or not?\n\nI argue that we ca
 n solve some of these problems if metrics of complexity are embedded in a 
 larger theory of efficiency.  Efficiency relates to the basic function of 
 language\, which is to communicate information from the speaker (S) to the
  hearer (H).  I propose the following definition:\n\nCommunication is effi
 cient when the message intended by S is delivered to H  in rapid time and 
 with minimal processing effort\;\n\nand the following hypothesis:\n\nActs 
 of communication between S and H are generally optimally efficient\;   tho
 se that are not occur in proportion to their degree of  efficiency.\n\nCom
 plexity metrics\, by contrast\, are defined on the grammar and structure o
 f language.  An important component of efficiency often involves structura
 l and grammatical simplicity.  But sometimes efficiency results in greater
  complexity.  And it also involves additional factors that determine the s
 peaker's structural selections\, leading to the observed preferences of pe
 rformance\, including:\n\nspeed in delivering linguistic properties in on-
 line processing\;\n\nfine-tuning structural selections to (i) frequency of
  occurrence and (ii) accessibility\;\n\nfew on-line errors or garden paths
 .\n\nThese factors interact\, sometimes reinforcing\, sometimes opposing one
  another.  In Hawkins (1994\, 2004) I have presented evidence that grammat
 ical conventions across languages conventionalize these performance factor
 s and reveal a similar interaction and competition between them.  Comparin
 g grammars in terms of efficiency\, rather than complexity alone\, gives u
 s a more complete picture of the forces that have shaped grammars and of t
 he resulting variation (including creoles).  Cross-linguistic variation pa
 tterns also provide quantitative data that can be used to determine the re
 lative strength of different factors and the manner of their interaction w
 ith one another. \n\nReferences\n\nChomsky\, N. (1995). The Minimalist Pro
 gram. Cambridge\, Mass.: MIT Press.\n\nCulicover\, P.W. & Jackendoff\, R. 
 (2005). Simpler Syntax. Oxford: Oxford University Press.\n\nDahl\, Ö. (20
 04). The Growth and Maintenance of Linguistic Complexity. Amsterdam: John 
 Benjamins.\n\nFrazier\, L. (1985). Syntactic complexity. In: D. Dowty\, L.
  Karttunen & A. Zwicky\, eds.\, Natural Language Parsing. Cambridge: Cambr
 idge University Press.\n\nGibson\, E. (1998). Linguistic complexity: Local
 ity of syntactic dependencies. Cognition  68:1-76.\n\nHawkins\, J.A. (1994
 ). A Performance Theory of Order and Constituency. Cambridge: Cambridge Un
 iversity Press.\n\nHawkins\, J.A. (2004). Efficiency and Complexity in Gra
 mmars. Oxford: Oxford University Press.\n\nMcWhorter\, J. (2001). The worl
 d's simplest grammars are creole grammars. Linguistic Typology 5: 125-166.
LOCATION:GR-06/07\, English Faculty Building
END:VEVENT
END:VCALENDAR
