Induction and interaction in the evolution of language and conceptual structure
Jon W. Carr
Kinship terms are simple and informative
Kemp & Regier (2012): kinship systems (e.g. English, Northern Paiute) plotted in simplicity–informativeness space [figure axes: Informative, Simple]
Pressures operating in simplicity–informativeness space
[figure: the optimal frontier in the simplicity–informativeness plane; axes: Informative, Simple]
Pressures operating in simplicity–informativeness space
Learning exerts pressure for simplicity
[figure: a degenerate language from Kirby, Cornish, & Smith (2008) — signals tuge (×9), tupim (×3), miniku (×3), tupin (×3), poi (×9) — sitting at the simple end of the optimal frontier]
Pressures operating in simplicity–informativeness space
Communication exerts pressure for informativeness
[figure: a holistic language from Kirby, Tamariz, Cornish, & Smith (2015) — pihino, nemone, piga, kawake, kapa, gakho, wuwele, nepi, newhomo, kamone, gaku, hokako — sitting at the informative end of the optimal frontier]
Pressures operating in simplicity–informativeness space
Learning + Communication exerts pressure for simplicity and informativeness
[figure: a compositional language from Kirby, Tamariz, Cornish, & Smith (2015) — egewawu, egewawa, egewuwu, ege, mega, megawawa, megawuwu, wulagi, gamenewawu, gamenewawa, gamenewuwu, gamene — sitting near the optimal frontier, both simple and informative]
Semantic category systems: Compactness
Simplicity and informativeness of semantic category systems
[figure: Simplicity and Informativeness scores for Compact vs. Random category systems]
Hallmark features of simple and informative category systems
Simplicity (pressure from induction) favours: few categories; compactness
Informativeness (pressure from interaction) favours: many categories; compactness
Are semantic categories compact because of simplicity or because of informativeness?
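The compact-vs-random contrast can be made concrete with toy metrics. The sketch below is my own minimal illustration (not the measures used by Kemp & Regier or in this talk): it scores a "compact" and a "random" two-category partition of a 4×4 meaning grid on a boundary-length proxy for complexity and on communicative cost, taken as the expected surprisal of the intended meaning when the listener guesses uniformly within the named category.

```python
import math

# Hypothetical 4x4 meaning grid; both metric definitions below are
# illustrative stand-ins, not the talk's actual measures.
GRID = [(x, y) for x in range(4) for y in range(4)]

compact = {p: (0 if p[0] < 2 else 1) for p in GRID}   # left half vs right half
random_sys = {p: (p[0] + p[1]) % 2 for p in GRID}     # checkerboard

def boundary_length(system):
    # Count neighbouring cell pairs in different categories:
    # compact systems have short category boundaries.
    edges = 0
    for (x, y) in GRID:
        for nbr in ((x + 1, y), (x, y + 1)):
            if nbr in system and system[(x, y)] != system[nbr]:
                edges += 1
    return edges

def communicative_cost(system):
    # Expected surprisal of the intended meaning given only its category,
    # assuming the listener guesses uniformly within that category.
    sizes = {}
    for cat in system.values():
        sizes[cat] = sizes.get(cat, 0) + 1
    return sum(math.log2(sizes[system[m]]) for m in GRID) / len(GRID)

print(boundary_length(compact), boundary_length(random_sys))        # 4 vs 24
print(communicative_cost(compact), communicative_cost(random_sys))  # equal here
```

With equal category sizes, this size-based cost comes out identical for both systems, so under this crude listener model compactness registers only as a simplicity difference; a similarity-sensitive listener model would be needed for compactness to pay off communicatively, which is precisely why the confound is hard to pull apart.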
Part 2
Can iterated learning give rise to informative languages? Carstensen, Xu, Smith, & Regier (2015)
Modelling a Bayesian learner
Simplicity bias (S) vs. Informativeness bias (I)
Bayesian inference
A language L = {…}
Data D = [⟨m₁, s₁⟩, ⟨m₂, s₂⟩, ⟨m₃, s₃⟩, …, ⟨mₙ, sₙ⟩]
likelihood(D | L) = ∏⟨m,s⟩∈D P(s | L, m)
Simplicity prior: prior_S(L) ∝ 2^(−complexity(L))
Informativeness prior: prior_I(L) ∝ 2^(−cost(L))
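The inference above can be sketched in a few lines. This is a toy illustration only: the hypothesis space, noise model, and complexity measure are my own stand-ins, not the talk's model.

```python
# Toy Bayesian learner with a simplicity prior (all names illustrative).
MEANINGS = ["m1", "m2", "m3", "m4"]

def complexity(language):
    # Crude description-length proxy: distinct signals plus total signal length.
    return len(set(language.values())) + sum(len(s) for s in language.values())

def likelihood(data, language, noise=0.05):
    # likelihood(D | L) = product over <m, s> in D of P(s | L, m)
    p = 1.0
    for m, s in data:
        p *= (1 - noise) if language[m] == s else noise
    return p

def posterior(data, hypotheses):
    # posterior(L) ∝ likelihood(D | L) × 2^(−complexity(L))   (prior S)
    scores = {name: likelihood(data, L) * 2.0 ** -complexity(L)
              for name, L in hypotheses.items()}
    z = sum(scores.values())
    return {name: s / z for name, s in scores.items()}

hypotheses = {
    "degenerate": {m: "tuge" for m in MEANINGS},
    "distinct":   {"m1": "tuge", "m2": "tupim", "m3": "poi", "m4": "miniku"},
}
data = [("m1", "tuge"), ("m2", "tuge")]   # sparse, ambiguous evidence
post = posterior(data, hypotheses)
print(post)
```

On sparse data like this, the simplicity prior dominates and the learner strongly favours the degenerate language, which is the dynamic the following slides play out over generations.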
Bayesian iterated learning under a simplicity prior (S → S → S)
Bayesian iterated learning under an informativeness prior (I → I → I)
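The chain dynamic itself can be sketched as follows. This is my own self-contained toy (not the simulations reported here): each generation infers a MAP language from sparse data under a simplicity prior, then produces the next generation's data.

```python
import random

random.seed(0)

# Toy iterated-learning chain; all parameters and names are illustrative.
MEANINGS = [0, 1, 2, 3]
SIGNALS = ["tuge", "poi"]

# A language assigns one of two signals to each meaning: 2^4 = 16 hypotheses.
HYPOTHESES = [tuple(SIGNALS[(b >> i) & 1] for i in range(4)) for b in range(16)]

def prior(L):
    return 2.0 ** -len(set(L))         # fewer distinct signals = simpler

def likelihood(data, L, noise=0.1):
    p = 1.0
    for m, s in data:
        p *= (1 - noise) if L[m] == s else noise
    return p

def learn(data):
    # MAP hypothesis under posterior ∝ likelihood × simplicity prior
    return max(HYPOTHESES, key=lambda L: likelihood(data, L) * prior(L))

def produce(L, n=3):
    # Transmission bottleneck: only a few meanings are ever labelled.
    return [(m, L[m]) for m in random.sample(MEANINGS, n)]

L = HYPOTHESES[6]                      # start from a non-degenerate language
for generation in range(10):
    L = learn(produce(L))
print(L, "distinct signals:", len(set(L)))
```

Note that a degenerate language is an absorbing state of this chain: once every meaning shares a signal, all transmitted data is consistent with it and the simplicity prior keeps it on top, mirroring the model's drift toward the simple corner.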
Model results
Experimental stimuli
Dimensions: Angle, Size
Iterated learning with humans
Converged-on category systems: 1 category (2/12), 2 categories (1/12), 3 categories (8/12), 4 categories (1/12)
Model results under best-fit parameters
Hallmark features of simple and informative category systems
Simplicity (pressure from induction) favours: few categories; compactness
Informativeness (pressure from interaction) favours: many categories; compactness
Part 3
A pressure for informativeness prevents degeneration
Kirby, Cornish, & Smith (2008)
Experiment 1 (iterated learning) — degenerate language:
tuge, tuge, tuge; tuge, tuge, tuge; tuge, tuge, tuge; tupim, tupim, tupim; miniku, miniku, miniku; tupin, tupin, tupin; poi, poi, poi; poi, poi, poi; poi, poi, poi
Experiment 2 (iterated learning with an informativeness pressure) — structured language:
n-ere-ki, l-ere-ki, renana; n-ehe-ki, l-aho-ki, r-ene-ki; n-eke-ki, l-ake-ki, r-ahe-ki; n-ere-plo, l-ane-plo, r-e-plo; n-eho-plo, l-aho-plo, r-eho-plo; n-eki-plo, l-aki-plo, r-aho-plo; n-e-pilu, l-ane-pilu, r-e-pilu; n-eho-pilu, l-aho-pilu, r-eho-pilu; n-eki-pilu, l-aki-pilu, r-aho-pilu
Continuous, open-ended stimulus space
Transmission design
Iterated learning: Dynamic set 0 → Generation 1 → Dynamic set 1 → Generation 2 → Dynamic set 2 → Generation 3 → Dynamic set 3, with a Static set at each generation
Iterated learning with communicative interaction: the same chain structure (Dynamic sets 0–3, a Static set at each of Generations 1–3), with communicative interaction added
Iterated learning gives rise to discrete categories
[figure: over generations, labels come to pick out discrete regions of the continuous stimulus space — e.g. fama, pama, fod, muaki, kazizui, kazizizui]
Communicative interaction gives rise to sublexical structure
[figure: Complexity and Communicative cost by generation (0–10); Iterated learning (Chains A–D) vs. Iterated learning with interaction (Chains I–L)]
Conclusions
• Languages are shaped by competing pressures from induction and interaction
• The human inductive bias is best characterized by a preference for simplicity
• Iterated learning therefore gives rise to simple, inexpressive category systems with compact structure
  – Side-note: compact structure also happens to be a feature of informative systems, which obscures the mechanism at work
• But the presence of communicative interaction prevents this process from getting out of hand by permitting the emergence of higher-level forms of linguistic structure
• The framework developed in the CLE (which, by the way, has many parallels with a body of work from Regier and colleagues) is resilient to more realistic assumptions about meaning
Thanks!