Theories and Models of Language Change
Session 7: Models III – Emergence of Grammar
Roland Mühlenbernd
2014/12/03

Outline: Introduction: The Evolutionary Approach · The Iterated Learning Model · Conclusion · Homeworks
Review: Universal Darwinism

Mechanisms of universal evolution:
1. variation: continuing abundance of different elements
2. selection: number/probability of copies of elements, depending on the interaction between element features and environmental features
3. replication: reproduction/copying of elements

What is the role of grammar in an evolutionary model of language change?
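The three mechanisms above can be sketched as a generic evolutionary loop. Everything concrete below — bit-string elements, the fitness function, population size, mutation rate — is an illustrative assumption, not part of any model discussed in the session:

```python
import random

random.seed(0)

def fitness(element):
    # selection: an element's copying probability depends on how well
    # its features match the "environment" (here, simply: many 1s)
    return sum(element)

def evolve(pop_size=50, length=8, generations=30):
    population = [[random.randint(0, 1) for _ in range(length)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        # replication: elements are copied in proportion to fitness
        weights = [fitness(e) + 1 for e in population]
        population = [random.choices(population, weights=weights)[0][:]
                      for _ in range(pop_size)]
        # variation: copying is imperfect (occasional bit-flip mutation)
        for e in population:
            if random.random() < 0.1:
                i = random.randrange(length)
                e[i] = 1 - e[i]
    return population

pop = evolve()
# mean fitness rises well above the initial average of about 4
print(sum(fitness(e) for e in pop) / len(pop))
```

Selection makes high-fitness elements replicate more often while mutation keeps supplying variation, so the population's mean fitness rises over generations.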
Language Change – Broad and Narrow Sense

[slide content not recoverable from extraction]
The Emergence of Linguistic Structure

“The most basic principle guiding [language] design is not communicative utility but reproduction – theirs and ours... Languages are social and cultural entities that have evolved with respect to the forces of selection imposed by human users. The structure of a language is under intense selection because in its reproduction from generation to generation, it must pass through a narrow bottleneck: children’s minds.” (Deacon, 1997: 110)
Language Emergence on 3 Adaptive Systems

Language is a result of three complex adaptive systems:
◮ biological evolution (phylogeny)
◮ individual learning (ontogeny)
◮ language change/cultural evolution (glossogeny)
The Iterated Learning Model

Two forms of representation:
◮ I-language: Internal representation as a pattern of neural connectivity (more abstractly: a grammar)
◮ E-language: External representation as an actual set of utterances (all possible grammatical expressions)

Two forms of interaction:
◮ language use: I-language → E-language
◮ language learning: E-language → I-language
Exercise 1 (Kirby & Hurford 2002)

The Iterated Learning Model has the following four basic components (correct answers checked):
◮ a learning bottleneck
◮ a homogeneous population structure
◮ a meaning space ✓
◮ one or more language-using agents ✓
◮ a set of stable languages
◮ a signal space ✓
◮ one or more language-learning agents ✓
◮ one or more language-imitating agents
A Simple ILM

Example: the learner as a neural network.

Meanings:
◮ male/female
◮ related/unrelated
◮ older generation
◮ younger generation

Signals:
◮ p/m
◮ u/a
◮ t/d
◮ a/o

Speaker production: s_desired = argmax_s C(m | s)
A Simple ILM

1. initial population: two randomly initialized networks, one for the speaker and one for the hearer
2. a certain number of randomly chosen meanings from 00000000 to 11111111 (n of 256)
3. the speaker produces a signal for each of these meanings
4. the hearer learns by back-propagation error learning (minimizing an error function)
5. remove the speaker; the hearer becomes the new speaker; a new hearer is added (with random weights)
6. repeat the cycle
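The six steps above can be sketched as follows. The real model uses feed-forward networks trained by back-propagation; as a simplifying assumption, the "network" here is reduced to a weight matrix C[m][s], with argmax production and simple strengthening of observed pairs standing in for the learning step. The signal-space size is likewise an illustrative assumption:

```python
import random

random.seed(1)
N_MEANINGS = 256   # meanings 00000000 .. 11111111 (as on the slide)
N_SIGNALS = 16     # illustrative assumption for the signal space

def new_agent():
    # step 1/5: an agent starts with random "connection weights"
    return [[random.random() for _ in range(N_SIGNALS)]
            for _ in range(N_MEANINGS)]

def produce(agent, m):
    # speaker production: s = argmax_s C(m | s)
    row = agent[m]
    return row.index(max(row))

def learn(agent, m, s, rate=0.5):
    # stand-in for back-propagation: strengthen the observed pair
    agent[m][s] += rate

def iterate(generations=10, training_size=50):
    speaker = new_agent()
    for _ in range(generations):
        hearer = new_agent()                   # new random hearer
        for _ in range(training_size):
            m = random.randrange(N_MEANINGS)   # step 2: random meaning
            s = produce(speaker, m)            # step 3: speaker produces
            learn(hearer, m, s)                # step 4: hearer learns
        speaker = hearer                       # step 5: hearer -> speaker
    return speaker                             # step 6: cycle repeated

final = iterate()
```

Because each hearer sees only `training_size` of the 256 meanings, the transmission bottleneck of the model is built into the loop.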
Exercise 2 (Kirby & Hurford 2002)

The simple ILM version (as presented on pages 125/126) produces three types of behavior, depending on the size of the training set. Allocate the following training-set sizes to the type of language that emerges:

◮ a very small learning set (20 random meanings) ⇔ inexpressive and unstable language
◮ a very large learning set (2000 random meanings) ⇔ completely expressive, unstructured language
◮ a medium-sized learning set (50 random meanings) ⇔ expressive and highly structured language
A Simple ILM: Results

Structured language:
◮ p/m → +og/-og (older generation)
◮ u/a → +yg/-yg (younger generation)
◮ t/d → related/unrelated
◮ a/o → female/male

Unstructured language:
◮ pato → (grand)father/uncle
◮ muda → (grand)mother/aunt
◮ pata → young man
◮ ...

[Figure: speaker/learner difference – proportion of covered meaning space]
Compositionality & Recursion

◮ Compositionality: a compositional signaling system is one in which the meaning of a signal is some function of the meanings of the parts of that signal and the way in which they are put together
◮ But: the sentence-meaning mapping is not only compositional, but recursive
◮ Digital infinity: potentially infinite use of finite means, by constructing syntactic structures that contain structures of the same type
◮ Note: the simple ILM produced a compositional (but not recursive) language for the medium-sized learning set
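A toy illustration of what a compositional signaling system looks like, using the four-position kinship signals from the structured result earlier in the session. The feature-to-position mapping follows that slide; the code itself is an illustrative sketch, not the model's implementation:

```python
# Each signal position encodes one meaning feature independently,
# so the meaning of the whole signal is a function of the meanings
# of its parts -- the definition of compositionality above.
GENERATION = {"p": "+og", "m": "-og"}        # older generation or not
AGE_GROUP  = {"u": "+yg", "a": "-yg"}        # younger generation or not
RELATION   = {"t": "related", "d": "unrelated"}
SEX        = {"a": "female", "o": "male"}

def meaning(signal):
    # compositional: decode each part, then combine
    c1, c2, c3, c4 = signal
    return (GENERATION[c1], AGE_GROUP[c2], RELATION[c3], SEX[c4])

print(meaning("puto"))  # ('+og', '+yg', 'related', 'male')
```

A holistic (unstructured) language, by contrast, would need one arbitrary stored entry per meaning, like the pato/muda examples on the results slide.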
ILM for Recursive Compositionality

Extended model:
◮ meaning space: a simple variant of predicate logic
◮ signal space: strings of characters
◮ example: loves(mary,john) ↔ “marylovesjohn”
◮ the learning method enables the learning/parsing of rules
◮ the production mechanism includes innovation

[Results figure not recoverable from extraction]
Size & Expressivity of Grammars

[figure not recoverable from extraction]
Some Notes

◮ the resulting structures of the ILM are eventually stable, but languages are always changing
◮ learning is not the only mechanism at work in the generational transmission of language
◮ speaker selection might also play an important role
◮ in the following model:
  ◮ the principle of least effort is assumed (speaker economy): speakers use the shortest of multiple alternatives
  ◮ imperfect production is integrated: random string dropping
Frequency and Irregularity

Frequency often correlates with irregularity:
◮ top 10 verbs (English): be, have, do, say, make, go, take, come, see, get

Changed model:
◮ meaning space: (objects with) two properties a and b
◮ meaning probability defined by index: if i > j, then p(a_i) < p(a_j)
◮ signal space: strings of characters

Result: example of an emerged language [figure not recoverable from extraction]

Note: Hurford (2000) simulates an ILM with a meaning space that combines predicate logic with frequency.
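The index-based meaning probabilities of the changed model can be sketched as follows. The concrete 1/(i+1) weighting is an illustrative assumption — the slide only requires that p(a_i) < p(a_j) whenever i > j:

```python
import random

random.seed(0)
N = 10
# p(a_0) > p(a_1) > ... > p(a_9), satisfying the slide's condition
weights = [1.0 / (i + 1) for i in range(N)]

def sample_meaning():
    return random.choices(range(N), weights=weights)[0]

counts = [0] * N
for _ in range(10_000):
    counts[sample_meaning()] += 1

# Low-index (frequent) meanings dominate usage, so they are the ones
# most exposed to imperfect production (random string dropping) and,
# under speaker economy, the ones whose shortened, irregular forms
# can survive transmission.
print(counts[0] > counts[5] > counts[9])  # True
```

This skewed usage distribution is what lets the highly frequent meaning on the next slide retain an idiosyncratic form while the rest of the language stays compositional.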
Exercise 3 (Kirby & Hurford 2002)

Fill the gaps in the following quote from “The Emergence of Linguistic Structure”:

“In this simulation, an iterated learning model was implemented, with a population of agents starting with no language at all, and over time a language emerged in the community in which there were completely general compositional rules for expressing a range of meanings represented as formulae in predicate logic. A variation on the basic simulation was then implemented in which one particular meaning was used with vastly greater frequency than any of the other available meanings. This inflated frequency held throughout the simulated history of the community. The result was that, as before, a language emerged in the population characterized by general compositional rules, but in addition, all speakers also had one special idiosyncratic stored fact pertaining to the highly frequent meaning.”
Social Transmission favors Generalization

Note:
◮ the strength or coverage of a generalization correlates with the survival potential of meaning-form pairs
◮ linguistic generalization is favored by social transmission (iterated learning)
◮ the infant drive for internal generalization might be the prime mover in causing regularities in languages
◮ creatures with no such drive at all would produce no historical E-languages with persistent regular patterns
Exercise 4 (Kirby & Hurford 2002)

Kirby & Hurford discuss the force of generalization in human language evolution and mention that it is important to distinguish between
a) the evolutionary source and
b) the reason for the historical persistence
of generalization. Accordingly, complete the following expressions appropriately:

(a) The evolutionary source of generalization...
(b) The reason for the historical persistence of generalization...

◮ ...is the child’s innate capacity to generalize → (a)
◮ ...is the human language faculty in the narrow sense, particularly recursion (Chomsky, Hauser, Fitch)
◮ ...is the inherent advantage of general patterns to be propagated across generations → (b)