A modular theory of pronouns and binding


  1. A modular theory of pronouns and binding. Simon Charlow (Rutgers). LENLS 14, University of Tsukuba, Tokyo, November 14, 2017.

  2. Overview. Today: a brief on the power of abstraction and modularity in semantic theorizing, with a focus on pronouns and the grammatical mechanisms for dealing with them. Semanticists tend to respond to things beyond the Fregean pale by lexically and compositionally generalizing to the worst case. One-size-fits-all. Functional programmers instead look for repeated patterns, and abstract those out as separate, modular pieces (functions). When we do semantics, this strategy has conceptual and especially empirical virtues.

  3. The standard theory, and its discontents

  4. A baseline semantic theory
     Meanings are individuals, propositions, or functions from meanings to meanings:
         τ ::= e | t | τ → τ        (e.g. e → t, (e → t) → t, ...)
     Binary-branching nodes are interpreted via (type-driven) functional application:
         ⟦α β⟧ := ⟦α⟧ ⟦β⟧ or ⟦β⟧ ⟦α⟧, whichever is defined
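
[For readers who find code helpful, here is a minimal Haskell sketch of the baseline picture. The type names (E, T) and the toy lexicon (john, mary, saw) are illustrative assumptions, not part of the slides.]

    -- Baseline: meanings are individuals (e), truth values (t), or functions between meanings.
    type E = String        -- toy stand-in for the domain of individuals
    type T = Bool

    john, mary :: E
    john = "John"
    mary = "Mary"

    saw :: E -> E -> T     -- type e -> e -> t; takes the object first, then the subject
    saw obj subj = (subj, obj) `elem` [("John", "Mary")]

    -- Type-driven functional application: a binary-branching node denotes whichever
    -- of [[a]] [[b]] / [[b]] [[a]] is well-typed; here [[John [saw Mary]]] = saw mary john.
    johnSawMary :: T
    johnSawMary = saw mary john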

  5. Pronouns and binding
     This picture is awesome. But a lot of important stuff doesn’t fit neatly in it. Our focus today: (free and bound) pronouns — how are they valued, and what ramifications does the need to value them have for the rest of the grammar?
     1. John saw herᵢ.
     2. Every philosopherᵢ thinks theyᵢ’re a genius.

  6. Standardly: extending the baseline theory with assignments
     Denotations uniformly depend on assignments (ways of valuing free variables):
         τ° ::= e | t | τ° → τ°
         τ  ::= g → τ°              (e.g. g → e → t, g → (e → t) → t, ...)
     Interpret binary combination via assignment-sensitive functional application:
         ⟦α β⟧ := λg. ⟦α⟧ g (⟦β⟧ g) or λg. ⟦β⟧ g (⟦α⟧ g), whichever is defined
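
[A Haskell rendering of this regime, continuing the sketch above; the assignment type G and the lifted entries are again illustrative assumptions. Every denotation now takes an assignment, and the composition rule threads that assignment into both daughters.]

    -- Assignments value indices with individuals.
    type G = Int -> E

    -- Every meaning is now of type g -> a, even when it ignores g.
    her0 :: G -> E
    her0 g = g 0           -- a pronoun genuinely consults the assignment

    johnG :: G -> E
    johnG _ = john         -- a name depends on it only trivially

    sawG :: G -> (E -> E -> T)
    sawG _ = saw           -- likewise for the verb

    -- Assignment-sensitive functional application, hard-wired into every branch:
    appG :: (G -> a -> b) -> (G -> a) -> (G -> b)
    appG m n = \g -> m g (n g)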


  8. Sample derivation: John saw her₀
     John saw her₀ ⇒ λg. saw g₀ j : g → t
     ├─ John ⇒ λg. j : g → e
     └─ saw her₀ ⇒ λg. saw g₀ : g → e → t
        ├─ saw ⇒ λg. saw : g → e → e → t
        └─ her₀ ⇒ λg. g₀ : g → e
     [Apply the result to a contextually furnished assignment to get a proposition.]
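
[The derivation above, rendered with the assumed sketch (appG, sawG, her0, johnG as defined earlier).]

    -- [[saw her_0]]        = \g -> saw (g 0)       : g -> e -> t
    -- [[John [saw her_0]]] = \g -> saw (g 0) john  : g -> t
    johnSawHer0 :: G -> T
    johnSawHer0 = appG (appG sawG her0) johnG

    -- Apply to a contextually furnished assignment to get a proposition:
    prop :: T
    prop = johnSawHer0 (\_ -> mary)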


  11. Complicating the lexicon: Non-pronominals
      [Same derivation tree as above.] Note the lexical entries the standard theory forces on items that never consult the assignment: John ⇒ λg. j : g → e and saw ⇒ λg. saw : g → e → e → t.


  13. Complicating the grammar: Abstraction
      Λ₀ (t₀ left) ⇒ λg. f g (left g₀) = λg. λx. left x : g → e → t
      ├─ Λ₀ ⇒ f : g → t → e → t
      └─ t₀ left ⇒ λg. left g₀ : g → t
      No f works in the general case... The grammar wants to interpret both branches at the same assignment, but the right node must be interpreted at a shifted assignment:
          ⟦Λᵢ α⟧ := λg. λx. ⟦α⟧ g^(i→x)        [extending ⟦·⟧ with a syncategorematic rule]
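
[A sketch of the same point in the Haskell setting; modify and lam are hypothetical helper names. The abstraction rule must evaluate the binder's sister at a shifted assignment, so it cannot be the meaning of any item f combined by the general application rule; it has to be keyed to the syntax and its index.]

    -- Shift an assignment so that index i is mapped to x.
    modify :: Int -> E -> G -> G
    modify i x g = \j -> if j == i then x else g j

    -- The syncategorematic rule: [[Lambda_i alpha]] := \g -> \x -> [[alpha]] at g^(i -> x).
    -- Note that lam needs the index i and direct access to its sister's meaning m;
    -- no assignment-insensitive f :: G -> T -> E -> T could deliver this.
    lam :: Int -> (G -> a) -> (G -> E -> a)
    lam i m = \g x -> m (modify i x g)

    left :: E -> T
    left = const True      -- toy predicate

    -- [[Lambda_0 [t_0 left]]] = \g x -> left x
    binderExample :: G -> E -> T
    binderExample = lam 0 (\g -> left (g 0))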


  15. Under-generation: (binding) reconstruction
      It is well known that (quantificational) binding does not require surface c-command (e.g., Sternefeld 1998, 2001, Barker 2012):
      1. Which of theirᵢ relatives does everyoneᵢ like __?
      2. Hisᵢ mom, every boyᵢ likes __.
      3. Theirᵢ advisor seems to every Ph.D. studentᵢ to be a genius.
      4. Unless heᵢ’s been a bandit, no manᵢ can be an officer.
      But Predicate Abstraction passes modified assignments down the tree, and so binding invariably requires (LF) c-command. Scoping the quantifier over the pronoun restores LF c-command, but should trigger a Weak Crossover violation:
      5. *Whoᵢ does hisᵢ mother like __?
      6. *Hisᵢ superior reprimanded no officerᵢ.


  17. Under-generation: paycheck pronouns
      Simple pronouns anaphoric to expressions containing pronouns can receive “sloppy” readings (e.g., Cooper 1979, Engdahl 1986, Jacobson 2000):
      1. Johnᵢ deposited [hisᵢ paycheck]ⱼ, but Billₖ spent itⱼ.
      2. Every semanticistᵢ deposited [theirᵢ paycheck]ⱼ. Every philosopherₖ spent itⱼ.
      These are unaccounted for on the standard picture. There are two (related) issues:
      a. The paycheck pronoun’s meaning is different from that of the thing it’s anaphoric to.
      b. How does the k-indexed binder “bind into” something bearing a different index?


  19. Roadmap
      The theoretical baggage associated with the standard account is straightforward and cheap to dispense with, via something called an applicative functor. The empirical baggage seems to require an additional piece for dealing with higher-order meanings. This upgrades the applicative functor into a monad.
      ◮ Time permitting, I’ll deflate monads a bit, at least for pronouns. :)
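
[For the functional programmers in the audience: assignment-dependence g → a is Haskell's reader functor, whose Applicative and Monad instances ship with base. On that reading (an assumption about how the talk's pieces line up, though ρ and ⊛ match pure and (<*>) definitionally), the extra piece for higher-order meanings corresponds to join, or equivalently (>>=).]

    import Control.Monad (join)

    -- For the reader functor ((->) g):
    --   pure  :: a -> (g -> a)                           -- cf. rho, defined below
    --   (<*>) :: (g -> a -> b) -> (g -> a) -> (g -> b)   -- cf. the on-demand application operator
    --   join  :: (g -> (g -> a)) -> (g -> a)             -- collapses a higher-order meaning

    -- join for readers just feeds the same assignment in twice:
    flatten :: (g -> g -> a) -> (g -> a)
    flatten = join         -- i.e. \m g -> m g g at this type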

  20. Getting modular


  22. Abstracting out and modularizing the standard account’s key parts
      In lieu of treating everything as trivially dependent on an assignment, invoke a function ρ which turns any x into a constant function from assignments into x:
          ρ := λx. λg. x        (type: a → g → a)
      Instead of taking on the assignment-sensitive ⟦·⟧ wholesale, we’ll help ourselves to a function ⊛ which allows us to perform assignment-friendly function application on demand:
          ⊛ := λm. λn. λg. m g (n g)        (type: (g → a → b) → (g → a) → g → b)
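
[The same two pieces in Haskell, continuing the running sketch; the operator is written with the slide's own symbol, which GHC accepts as a Unicode operator name. These are exactly pure and (<*>) for the reader applicative, specialized here to assignments G.]

    -- rho makes any meaning trivially assignment-dependent (pure/const for readers):
    rho :: a -> (G -> a)
    rho x = \_ -> x

    -- Assignment-friendly application, invoked only on demand
    -- ((<*>) for readers; the very combinator the standard theory hard-wires at every branch):
    infixl 4 ⊛
    (⊛) :: (G -> a -> b) -> (G -> a) -> (G -> b)
    m ⊛ n = \g -> m g (n g)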

  23. Sample derivations: she₀ spoke
      she₀ spoke ⇒ λg. spoke g₀ : g → t
      ├─ she₀ ⇒ λg. g₀ : g → e
      └─ ⊛ (ρ spoke) ⇒ λn. λg. spoke (n g) : (g → e) → g → t
         ├─ ⊛
         └─ ρ spoke ⇒ λg. spoke : g → e → t
            ├─ ρ
            └─ spoke ⇒ spoke : e → t


  27. Sample derivations: she₀ spoke; John saw her₀
      she₀ spoke ⇒ λg. spoke g₀ : g → t
      ├─ she₀ ⇒ λg. g₀ : g → e
      └─ ⊛ (ρ spoke) ⇒ λn. λg. spoke (n g) : (g → e) → g → t
         ├─ ⊛
         └─ ρ spoke ⇒ λg. spoke : g → e → t
            ├─ ρ
            └─ spoke ⇒ spoke : e → t
      John saw her₀ ⇒ λg. saw g₀ j : g → t
      ├─ ρ John ⇒ λg. j : g → e
      │  ├─ ρ
      │  └─ John ⇒ j : e
      └─ ⊛ (⊛ (ρ saw) her₀) ⇒ λn. λg. saw g₀ (n g) : (g → e) → g → t
         ├─ ⊛
         └─ ⊛ (ρ saw) her₀ ⇒ λg. saw g₀ : g → e → t
            ├─ ⊛ (ρ saw) ⇒ λn. λg. saw (n g) : (g → e) → g → e → t
            │  ├─ ⊛
            │  └─ ρ saw ⇒ λg. saw : g → e → e → t
            │     ├─ ρ
            │     └─ saw ⇒ saw : e → e → t
            └─ her₀ ⇒ λg. g₀ : g → e
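
[The derivations above in the assumed Haskell sketch (rho, (⊛), her0, john, saw, G, E, T as defined earlier): non-pronominal material is lifted by rho, and ⊛ is inserted just where an assignment-dependent argument needs to be passed through.]

    spoke :: E -> T
    spoke = const True     -- toy predicate

    she0 :: G -> E
    she0 g = g 0

    -- she_0 spoke  ==>  \g -> spoke (g 0)
    sheSpoke :: G -> T
    sheSpoke = rho spoke ⊛ she0

    -- John saw her_0  ==>  \g -> saw (g 0) john
    johnSawHerModular :: G -> T
    johnSawHerModular = (rho saw ⊛ her0) ⊛ rho john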

