Induction and interaction in the evolution of language and conceptual structure
Jon W. Carr


  1. Induction and interaction in the evolution of language and conceptual structure Jon W. Carr

  2. Kinship terms are simple and informative (Kemp & Regier, 2012) [figure: kinship category systems of English and Northern Paiute]

  3. Kinship terms are simple and informative (Kemp & Regier, 2012) [figure: English and Northern Paiute kinship systems plotted in the space of possible systems, with axes Simple and Informative]

  4. Pressures operating in simplicity–informativeness space [figure: the space of possible languages, with axes Simple and Informative and an optimal frontier]

  5. Pressures operating in simplicity–informativeness space: learning exerts pressure for simplicity (Kirby, Cornish, & Smith, 2008) [figure: a degenerate language (tuge, tuge, tuge, ..., tupim, miniku, tupin, poi, poi, poi) plotted toward the simple end of the optimal frontier]

  6. Pressures operating in simplicity–informativeness space: communication exerts pressure for informativeness (Kirby, Tamariz, Cornish, & Smith, 2015) [figure: a holistic language (pihino, nemone, piga, kawake, kapa, gakho, wuwele, nepi, newhomo, kamone, gaku, hokako) plotted toward the informative end of the optimal frontier]

  7. Pressures operating in simplicity–informativeness space: learning + communication exert pressure for simplicity and informativeness (Kirby, Tamariz, Cornish, & Smith, 2015) [figure: a compositional language (egewawu, egewawa, egewuwu, ege, mega, megawawa, megawuwu, wulagi, gamenewawu, gamenewawa, gamenewuwu, gamene) plotted near the optimal frontier]

  8-9. Semantic category systems

  10. Semantic category systems: compactness

  11-12. Simplicity and informativeness of semantic category systems [plots: simplicity and informativeness of compact versus random category systems]
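To make the two axes concrete, here is an illustrative sketch that scores a compact and a random partition of a small meaning grid: simplicity is proxied by the number of category boundaries between adjacent cells (fewer boundaries = more compressible), and informativeness by the expected reconstruction error when a hearer guesses a cell from the named category. The grid size, the boundary-counting proxy, and the cost measure are my own assumptions, not the measures used in the talk.

```python
import random

random.seed(1)
# An 8x8 meaning grid partitioned into 4 categories in two ways:
# "compact" = contiguous quadrants; "rand" = each cell labelled at random.
SIZE = 8
compact = {(x, y): (x >= SIZE // 2) * 2 + (y >= SIZE // 2)
           for x in range(SIZE) for y in range(SIZE)}
rand = {cell: random.randrange(4) for cell in compact}

def simplicity_proxy(system):
    # Count category boundaries between horizontally/vertically adjacent
    # cells: fewer boundaries means a shorter description, i.e. simpler.
    boundaries = 0
    for (x, y), c in system.items():
        for nbr in [(x + 1, y), (x, y + 1)]:
            if nbr in system and system[nbr] != c:
                boundaries += 1
    return boundaries

def communicative_cost(system):
    # Expected reconstruction error: the hearer picks a uniformly random
    # cell from the named category; cost = mean squared distance to the
    # speaker's intended cell, averaged over all intended cells.
    cells_by_cat = {}
    for cell, c in system.items():
        cells_by_cat.setdefault(c, []).append(cell)
    total = 0.0
    for (x, y), c in system.items():
        members = cells_by_cat[c]
        total += sum((x - u) ** 2 + (y - v) ** 2
                     for u, v in members) / len(members)
    return total / len(system)
```

On this toy measure a compact system scores better on both axes than a random system with the same number of categories, matching the contrast the slides draw.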

  13. Hallmark features of simple and informative category systems
  • The simplicity pressure from induction favours few categories and compactness
  • The informativeness pressure from interaction favours many categories and compactness

  14. Hallmark features of simple and informative category systems: are semantic categories compact because of simplicity or because of informativeness?

  15. Part 2

  16-18. Can iterated learning give rise to informative languages? Carstensen, Xu, Smith, & Regier (2015)

  19. Modelling a Bayesian learner: a simplicity bias (S) versus an informativeness bias (I)

  20-24. Bayesian inference
  A language L = { ··· } is a category system; the data D = [⟨m₁, s₁⟩, ⟨m₂, s₂⟩, ⟨m₃, s₃⟩, ..., ⟨mₙ, sₙ⟩] is a sequence of meaning–signal pairs.
  likelihood(D | L) = ∏_{⟨m,s⟩ ∈ D} P(s | L, m)
  Simplicity prior (S): prior(L) ∝ 2^−complexity(L)
  Informativeness prior (I): prior(L) ∝ 2^−cost(L)
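As an illustration of how these pieces combine, here is a minimal sketch of the posterior computation under the simplicity prior, assuming a toy space of four meanings and two signals. The noise parameter, the complexity proxy (number of distinct signals used), and all names here are illustrative choices of mine, not the hypothesis space or priors of the actual model.

```python
import itertools

# Toy meaning space: a "language" maps each meaning to a signal.
MEANINGS = ["m1", "m2", "m3", "m4"]
SIGNALS = ["a", "b"]

def complexity(language):
    # Illustrative proxy: number of distinct signals used (fewer = simpler).
    return len(set(language.values()))

def likelihood(data, language, noise=0.05):
    # P(D | L) = product over observed <meaning, signal> pairs of P(s | L, m),
    # with a small probability of producing a wrong signal.
    p = 1.0
    for m, s in data:
        p *= (1 - noise) if language[m] == s else noise
    return p

def posterior(data):
    # Enumerate all languages; prior(L) ∝ 2^-complexity(L) (simplicity bias),
    # then normalize likelihood * prior over the hypothesis space.
    hypotheses = [dict(zip(MEANINGS, sigs))
                  for sigs in itertools.product(SIGNALS, repeat=len(MEANINGS))]
    scores = {tuple(sorted(h.items())):
              likelihood(data, h) * 2 ** -complexity(h)
              for h in hypotheses}
    z = sum(scores.values())
    return {h: s / z for h, s in scores.items()}

data = [("m1", "a"), ("m2", "a"), ("m3", "b")]
post = posterior(data)
best = max(post, key=post.get)  # MAP language, as a sorted tuple of pairs
```

With enough consistent data the likelihood dominates, but with sparse data the 2^−complexity(L) prior pulls the learner toward languages that reuse signals.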

  25-30. Bayesian iterated learning under a simplicity prior

  31-32. Bayesian iterated learning under an informativeness prior
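A hedged sketch of how such a transmission chain can be simulated: at each generation, a learner samples a language from the posterior (here, under a simplicity prior) given data produced by the previous generation's language. The hypothesis space, noise level, data size, and sampling-learner strategy are illustrative assumptions, not the talk's actual model.

```python
import itertools
import random

random.seed(0)
MEANINGS = ["m1", "m2", "m3", "m4"]
SIGNALS = ["a", "b"]
HYPOTHESES = [dict(zip(MEANINGS, sigs))
              for sigs in itertools.product(SIGNALS, repeat=len(MEANINGS))]

def prior(language):
    # Simplicity bias: fewer distinct signals means a higher prior.
    return 2 ** -len(set(language.values()))

def likelihood(data, language, noise=0.05):
    p = 1.0
    for m, s in data:
        p *= (1 - noise) if language[m] == s else noise
    return p

def learn(data):
    # A "sampler" learner: draw one language from the posterior.
    weights = [likelihood(data, h) * prior(h) for h in HYPOTHESES]
    return random.choices(HYPOTHESES, weights=weights, k=1)[0]

def produce(language, n=6):
    # Produce n <meaning, signal> pairs as training data for the next agent.
    return [(m, language[m]) for m in random.choices(MEANINGS, k=n)]

# Run a 10-generation chain seeded with a maximally distinctive language.
language = dict(zip(MEANINGS, ["a", "b", "a", "b"]))
for generation in range(10):
    language = learn(produce(language))
```

Over repeated transmission through this bottleneck, the simplicity prior tends to pull chains toward languages that reuse signals; swapping in an informativeness prior (2^−cost(L)) would push the other way, which is the contrast the two sets of slides illustrate.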

  33. Model results

  34. Experimental stimuli, varying along two dimensions: angle and size

  35-40. Iterated learning with humans

  41. Converged-on category systems: 1 category (2/12), 2 categories (1/12), 3 categories (8/12), 4 categories (1/12)

  42. Model results under best-fit parameters

  43. Can iterated learning give rise to informative languages? Carstensen, Xu, Smith, & Regier (2015)

  44. Model results under best-fit parameters

  45. Hallmark features of simple and informative category systems
  • The simplicity pressure from induction favours few categories and compactness
  • The informativeness pressure from interaction favours many categories and compactness

  46. Part 3

  47. A pressure for informativeness prevents degeneration (Kirby, Cornish, & Smith, 2008)
  • Experiment 1, iterated learning alone: a degenerate language (tuge, tuge, tuge, tuge, tuge, tuge, tuge, tuge, tuge, tupim, tupim, tupim, miniku, miniku, miniku, tupin, tupin, tupin, poi, poi, poi, poi, poi, poi, poi, poi, poi)
  • Experiment 2, iterated learning with an informativeness pressure: a structured language (n-ere-ki, l-ere-ki, renana, n-ehe-ki, l-aho-ki, r-ene-ki, n-eke-ki, l-ake-ki, r-ahe-ki, n-ere-plo, l-ane-plo, r-e-plo, n-eho-plo, l-aho-plo, r-eho-plo, n-eki-plo, l-aki-plo, r-aho-plo, n-e-pilu, l-ane-pilu, r-e-pilu, n-eho-pilu, l-aho-pilu, r-eho-pilu, n-eki-pilu, l-aki-pilu, r-aho-pilu)

  48. Continuous, open-ended stimulus space

  49-51. Transmission design
  • Iterated learning: dynamic set 0 is learned by generation 1, which produces dynamic set 1 for generation 2, which produces dynamic set 2 for generation 3, which produces dynamic set 3; every generation is also tested on the same static set
  • Iterated learning with communicative interaction: the same transmission design, with communicative interaction added within each generation

  52-63. Iterated learning gives rise to discrete categories [figure: the continuous stimulus space is gradually carved into discrete labelled regions, e.g. fama, fod, muaki, kazizui, kazizizui]

  64. Communicative interaction gives rise to sublexical structure

  65. Communicative interaction gives rise to sublexical structure [figure: complexity (0-500) and communicative cost (0-6) plotted over generations 0-10, for iterated learning (chains A-D) versus iterated learning with interaction (chains I-L)]

  66. Conclusions

  67. Conclusions
  • Languages are shaped by competing pressures from induction and interaction
  • The human inductive bias is best characterized by a preference for simplicity
  • Therefore, iterated learning gives rise to simple, inexpressive category systems with compact structure (side-note: compact structure also happens to be a feature of informativeness, obscuring the mechanism)
  • But the presence of communicative interaction prevents this process from getting out of hand by permitting the emergence of higher-level forms of linguistic structure
  • The framework developed in the CLE (which has many parallels with a body of work from Regier and colleagues) is resilient to more realistic assumptions about meaning

  68. Thanks!
