
Non-Classical Logics for Natural Language: Introduction to Substructural Logics
Raffaella Bernardi
KRDB, Free University of Bozen-Bolzano
e-mail: bernardi@inf.unibz.it

Contents: 1. Course Overview . . .



2.2. Long-distance Dependencies

Interdependent constituents need not be juxtaposed, but may form long-distance dependencies, manifested by gaps:

◮ What cities does Ryanair service [ . . . ]?

The constituent "what cities" depends on the verb "service", but sits at the front of the sentence rather than in object position. Such distances can be large:

◮ Which flight do you want me to book [ . . . ]?
◮ Which flight do you want me to have the travel agent book [ . . . ]?
◮ Which flight do you want me to have the travel agent near my office book [ . . . ]?


2.3. Relative Pronouns and Coordination

◮ Relative pronouns (e.g. who, which) function as, for instance, the subject or the object of the verb embedded in the relative clause (rc):
  ⊲ [[the [student [who [ . . . ] knows Sara] rc ] n ] np [left] v ] s
  ⊲ [[the [book [which Sara wrote [ . . . ]] rc ] n ] np [is interesting] v ] s

◮ Coordination: expressions of the same syntactic category can be coordinated via "and", "or", "but" to form more complex phrases of the same category. For instance, a coordinated verb phrase can consist of two other verb phrases separated by a conjunction:
  ⊲ There are no flights [[leaving Denver] vp and [arriving in San Francisco] vp ] vp

  Here the conjoined expressions belong to a traditional constituent class, vp. However, we could also have
  ⊲ I [[[want to try to write [ . . . ]] and [hope to see produced [ . . . ]]] [the movie] np ] vp

  Again, the interdependent constituents are disconnected from each other.


2.4. Ambiguity

◮ Lexical ambiguity: a single word can have more than one syntactic category; for example, "smoke" can be a noun or a verb, and "her" can be a pronoun or a possessive determiner.

◮ Structural ambiguity: a single sequence of words can have more than one valid tree structure; for example, which are the possible structures for "old men and women"? (See the parsing sketch after this slide.)
  (a) [[old men] and women], or
  (b) [old [men and women]].

◮ Mismatch between syntax and semantics (quantifier phrases: non-local scope construals): [Alice [thinks [someone left] s ] vp ] s
  (a1) Think(alice, ∃x(left(x)))
  (a2) ∃x(Think(alice, left(x)))
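To make the structural ambiguity concrete, here is a minimal sketch (not from the slides) that enumerates both trees for "old men and women" with NLTK's chart parser; the grammar fragment is my own illustration and assumes NLTK is installed.

```python
import nltk

# An illustrative noun fragment: adjectives modify nouns, and nouns coordinate.
grammar = nltk.CFG.fromstring("""
    N    -> ADJ N | N CONJ N | 'men' | 'women'
    ADJ  -> 'old'
    CONJ -> 'and'
""")

parser = nltk.ChartParser(grammar)
for tree in parser.parse("old men and women".split()):
    print(tree)   # one tree per reading: [[old men] and women] and [old [men and women]]
```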


3. Formal Linguistics

Given a linguistic input, we want to use a formal device to:

◮ recognize whether it is grammatical;
◮ give its syntactic structure;
◮ build its meaning representation.

We look at natural language as a formal language and use formal grammars to achieve these goals.


3.1. Chomsky Hierarchy of Languages

[figure: The Chomsky Hierarchy]


3.2. Where do Natural Languages fit?

The crucial information needed to answer this question is which kinds of dependencies are found in natural languages.

◮ Chomsky (1956, 1957) showed that natural languages are not regular languages (examples).
◮ Are natural languages context-free?
  1. Chomsky 1957: conjecture that natural languages are not context-free.
  2. Sixties and seventies: many attempts to prove this conjecture.
  3. Pullum and Gazdar 1982:
     ⊲ all these attempts have failed;
     ⊲ for all we know, natural languages (conceived as string sets) might be context-free.
  4. Huybregts 1984, Shieber 1985: proof that Swiss German is not context-free (a small sketch of the crossed-dependency pattern follows below).
  5. Joshi 1985: natural languages are mildly context-sensitive languages.
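The Swiss German argument rests on crossed (cross-serial) dependencies: after intersection with a regular language, the relevant sentences reduce to a string pattern of the form a^m b^n c^m d^n, which no context-free grammar can generate, even though the crossed counts are trivial to check directly. A minimal illustration of that pattern (my own sketch, not from the slides):

```python
import re

def crossed(s: str) -> bool:
    """Recognize the non-context-free crossed pattern a^m b^n c^m d^n (m, n >= 1)."""
    match = re.fullmatch(r"(a+)(b+)(c+)(d+)", s)
    return bool(match) and \
        len(match.group(1)) == len(match.group(3)) and \
        len(match.group(2)) == len(match.group(4))

print(crossed("aabbbccddd"))   # True:  2 a's match 2 c's, 3 b's match 3 d's
print(crossed("aabbbcccddd"))  # False: 2 a's but 3 c's
```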


3.3. FG for Natural Languages

We now look at how CFGs have been applied to natural language. To this end, it is convenient to set apart the rules rewriting a non-terminal to a terminal symbol, which define the lexical entries (the lexicon).

◮ Terminal: the terminal symbols are words (e.g. sara, dress, . . . ).
◮ Non-terminal: the non-terminal symbols are syntactic categories (CAT) (e.g. n, det, np, vp, . . . ).
◮ Start symbol: the start symbol is s and stands for sentence.

The production rules are divided into:

◮ Lexicon: e.g. np → sara; these form the set LEX.
◮ Grammatical rules: rules of the type s → np vp.

Well-known formal grammars of this kind are Phrase Structure Grammars (PSG).


3.3.1. PSG: English Toy Fragment

We consider a small fragment of English defined by the grammar G = ⟨LEX, Rules⟩, with vocabulary Σ and categories CAT.

◮ LEX ⊆ Σ × CAT
  ⊲ Σ = {Sara, dress, wears, the, new},
  ⊲ CAT = {det, n, np, s, v, vp, adj},
  ⊲ LEX = {np → Sara, det → the, n → dress, adj → new, v → wears}
◮ Rules = {s → np vp, np → det n, vp → v np, n → adj n}

Among the derivations licensed by the grammar are:

◮ det →* the, because this production is in the lexicon, and
◮ s →* Sara wears the new dress, obtained by repeated application of the rules; hence this sentence is in the language L(G).
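As a sanity check on the toy fragment, here is a minimal, self-contained sketch (my own, not from the slides) of a CKY-style recognizer for exactly this grammar; it works directly because the Rules are binary and the lexical productions are unary.

```python
from itertools import product

LEX = {"Sara": {"np"}, "the": {"det"}, "dress": {"n"}, "new": {"adj"}, "wears": {"v"}}
RULES = {("np", "vp"): "s", ("det", "n"): "np", ("v", "np"): "vp", ("adj", "n"): "n"}

def recognize(words, goal="s"):
    n = len(words)
    # chart[i][j] = set of categories spanning words[i:j]
    chart = [[set() for _ in range(n + 1)] for _ in range(n + 1)]
    for i, w in enumerate(words):
        chart[i][i + 1] = set(LEX.get(w, ()))
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):
                for left, right in product(chart[i][k], chart[k][j]):
                    if (left, right) in RULES:
                        chart[i][j].add(RULES[(left, right)])
    return goal in chart[0][n]

print(recognize("Sara wears the new dress".split()))   # True
print(recognize("Sara the wears dress new".split()))   # False
```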


Example derivation: starting from the lexical productions (adj → new, n → dress, det → the, v → wears, np → Sara) and repeatedly applying the rules n → adj n, np → det n, vp → v np and s → np vp, we build the parse tree bottom-up.

[figure: step-by-step construction of the parse tree for "Sara wears the new dress"]

[Sara [wears [the [new dress] n ] np ] vp ] s



3.3.3. PSG: Advantages and Disadvantages

Advantages

◮ PSG deals with phrase structures represented as trees.
◮ Trees preserve aspects of the compositional (constituent) structure.

Disadvantages

◮ We are not capturing any general property of natural language assembly.
◮ Hence, to extend the grammar we have to keep adding rules each time we add a word of a new category.
◮ It is difficult to tightly connect these (syntactic) rewriting rules with semantic rules so as to obtain meaning representations.
◮ PSGs as such do not handle long-distance dependencies, since there is no connection among categories occurring in different rewriting rules.


4. Parsing as deduction

We look for a logic that properly models the natural language syntax-semantics interface.

◮ We consider syntactic categories to be logical formulas.
◮ As such, they can be atomic or complex (not just plain A, B, a, b, etc.).
◮ They are related by the derivability relation (⇒), e.g.

  np_pl ⇒ np

  all expressions that are plural noun phrases (np_pl) are also (under-specified) np.
◮ Recognizing that a structure is of a certain category reduces to proving that the formulas corresponding to the structure and to the category stand in the derivability relation, Γ ⇒ A:

  CAT_sara, CAT_wears, CAT_the, CAT_new, CAT_dress ⇒ s ?

The slogan is: "Parsing as deduction"
The question is: which logic do we need?
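As a deliberately simplified illustration of the slogan (my own sketch, not the logic developed in this course): assign each word a formula and check whether the sequence of formulas derives s using only the two application rules of an AB categorial grammar, A/B, B ⇒ A and B, B\A ⇒ A. The substructural logics introduced later are richer than this, but the shape of the problem, Γ ⇒ A, is the same.

```python
# Categories as formulas: atoms are strings; ("/", A, B) encodes A/B (looks for
# a B to its right), ("\\", B, A) encodes B\A (looks for a B to its left).
NP, N, S = "np", "n", "s"

def fw(res, arg): return ("/", res, arg)    # res/arg
def bw(arg, res): return ("\\", arg, res)   # arg\res

LEX = {
    "Sara":  NP,
    "wears": fw(bw(NP, S), NP),   # (np\s)/np : transitive verb
    "the":   fw(NP, N),           # np/n      : determiner
    "new":   fw(N, N),            # n/n       : adjective
    "dress": N,
}

def derives(cats, goal):
    """Backtracking proof search using only forward and backward application."""
    if cats == [goal]:
        return True
    for i in range(len(cats) - 1):
        left, right = cats[i], cats[i + 1]
        # forward application:  A/B, B  =>  A
        if isinstance(left, tuple) and left[0] == "/" and left[2] == right:
            if derives(cats[:i] + [left[1]] + cats[i + 2:], goal):
                return True
        # backward application:  B, B\A  =>  A
        if isinstance(right, tuple) and right[0] == "\\" and right[1] == left:
            if derives(cats[:i] + [right[2]] + cats[i + 2:], goal):
                return True
    return False

print(derives([LEX[w] for w in "Sara wears the new dress".split()], S))  # True
print(derives([LEX[w] for w in "Sara dress wears the new".split()], S))  # False
```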


5. Summing up

◮ We need a (logical) grammar able
  ⊲ to both compose (assembly) and decompose (extraction) linguistic structures;
  ⊲ to account for both local dependencies and long-distance dependencies.
◮ The logical grammar should
  ⊲ be at least context-free, and in fact somewhat more: mildly context-sensitive;
  ⊲ be computationally appealing (polynomial);
  ⊲ be tightly related to meaning-representation assembly;
  ⊲ capture the core of natural languages;
  ⊲ capture natural language diversity.

6. Classical Logic vs. Non-Classical Logics
