Malapropism Detection: Results and Error Analysis False Positive Among the largest OTC issues, Farmers Group, which expects B.A.T. Industries to launch a hostile tenter [=tender] offer for it, jumped to 62 yesterday. “tenter” was placed in a chain via the path: tenter [isa] framework/frame [includes] handbarrow [has part] handle/grip/hold [includes] stock ⇒ wrong WSD of “stock” (only “tender” could have disambiguated) Caroline Sporleder csporled@coli.uni-sb.de Computational Models of Discourse
Malapropism Detection: Results and Error Analysis Potential False Negative QVC Network, a 24-hour home television shopping issue, said yesterday it expects fiscal 1989 sales of $170 million to $200 million. . . “television” doesn’t fit any chain but also has no variants which fit thus it’s not flagged Caroline Sporleder csporled@coli.uni-sb.de Computational Models of Discourse
Malapropism Detection: Results and Error Analysis False Negative And while institutions until the past month or so stayed away from the smallest issues for fear they would get stuck in an illiquid stock, . . . “fear” doesn’t fit any chain; orthographically close words: “gear”, “pear”, “year”; for all of these variants chains were found (e.g., “pear”-“Lotus”); thus “fear” was wrongly flagged as a potential error Caroline Sporleder csporled@coli.uni-sb.de Computational Models of Discourse
Malapropism Detection: Results and Error Analysis
words in corpus: 322,645
number of words in chains: 109,407
  number of non-malapropisms: 107,998
  number of malapropisms: 1,409
atomic chains: 8,014
  atomic chains that contained a malapropism: 442
  atomic chains that did not contain a malapropism: 7,572
  performance factor: 4.47
number of potential errors flagged: 3,167
  true alarms: 397
  false alarms: 2,770
  performance factor: 2.46
overall performance factor: 11.0
number of perfectly detected and corrected malapropisms: 349
malapropisms were 4.47 times more likely to be placed in an atomic chain than non-malapropisms
malapropisms in atomic chains were 2.46 times more likely to result in alarms than non-malapropisms in atomic chains
in general, malapropisms were 11 times more likely to be flagged as potential errors than other words
Caroline Sporleder csporled@coli.uni-sb.de Computational Models of Discourse
Malapropism Detection: Results and Error Analysis . . . or in other words error detection precision: 12.54% error detection recall: 28.18% error detection f-score: 17.36% correction accuracy for correctly detected errors: 87% Caroline Sporleder csporled@coli.uni-sb.de Computational Models of Discourse
Another Application: Idiom Detection Problem: spill the beans ( ≈ reveal a secret) Scott then published a book entitled Off Whitehall, which supposedly spilled the beans on the Blair/Brown feud. spill the beans Somehow I always end up spilling the beans all over the floor and looking foolish when the clerk comes to sweep them up. NLP systems need to be able to recognise idioms to assign correct analyses often an expression can have literal as well as non-literal meaning ⇒ need to disambiguate in context Caroline Sporleder csporled@coli.uni-sb.de Computational Models of Discourse
Another Application: Idiom Detection Possible solution: . . . could label a lot of training data, define a feature set for the task and then use supervised machine learning to train a classifier But training data is expensive to label. An unsupervised approach: check whether component words of the idiom (e.g., “spill” or “bean”) occur in a (non-atomic) lexical chain; if no lexical chain can be found in which the idiom participates, predict non-literal usage Caroline Sporleder csporled@coli.uni-sb.de Computational Models of Discourse
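A minimal sketch of this unsupervised decision rule, assuming some relatedness function over word pairs (higher = more related) and a chaining threshold; the greedy one-link check, the threshold value and all names are illustrative simplifications, not the exact chaining algorithm:

```python
from typing import Callable, List

def literal_usage(idiom_words: List[str],
                  context_words: List[str],
                  relatedness: Callable[[str, str], float],
                  threshold: float = 0.5) -> bool:
    """Predict literal usage if any idiom component word can be linked
    into a lexical chain with the surrounding context.

    Illustrative greedy approximation: a component word counts as
    'chained' as soon as it is sufficiently related to at least one
    content word in the context.
    """
    for comp in idiom_words:
        if any(relatedness(comp, w) >= threshold for w in context_words):
            return True   # component participates in a chain -> literal
    return False          # no chain found -> predict non-literal

# Hypothetical usage:
# literal_usage(["spill", "bean"],
#               ["floor", "clerk", "sweep", "shop"],
#               relatedness=my_web_relatedness)
```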
Idiom Detection: Examples spill the beans Somehow I always end up spilling the beans all over the floor and looking foolish when the clerk comes to sweep them up. spill the beans ( ≈ reveal a secret) Scott then published a book entitled Off Whitehall, which supposedly spilled the beans on the Blair/Brown feud. Caroline Sporleder csporled@coli.uni-sb.de Computational Models of Discourse
What semantic relations do we need to model? play with fire Grilling outdoors is much more than just another dry-heat cooking method. It’s the chance to play with fire, satisfying a primal urge to stir around in coals. drop the ball When Rooney collided with the goalkeeper, causing him to drop the ball, Kevin Campbell followed in. Caroline Sporleder csporled@coli.uni-sb.de Computational Models of Discourse
What semantic relations do we need to model? Relations found: relations between non-nouns (“spill” – “sweep up”) relations across parts-of-speech (“cooking” – “fire”) ’fuzzy’ relations (“fire” – “coals”) world knowledge (“Wayne Rooney” – “ball”) Relatedness measure: WordNet-based measures not suitable compiled corpora smallish and out-of-date instead compute similarity from web counts Caroline Sporleder csporled@coli.uni-sb.de Computational Models of Discourse
Web-based Relatedness Measure Normalised Google Distance (NGD, Cilibrasi & Vitanyi 2007):

NGD(x, y) = (max{log f(x), log f(y)} − log f(x, y)) / (log M − min{log f(x), log f(y)})   (1)

for two terms x and y, where f(·) is the page count returned for a query and M is the number of indexed pages. M is estimated by querying for “the”. Yahoo is used rather than Google (more stable counts); query for all combinations of word forms. Caroline Sporleder csporled@coli.uni-sb.de Computational Models of Discourse
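A small sketch of how the measure can be computed, assuming a hypothetical `hits()` function that returns page counts for a single term or a pair of terms; the search backend, the value of M, and the aggregation over word forms by taking the minimum distance are assumptions, not details from the slide:

```python
import math
from itertools import product

def ngd(x, y, hits, M=1e10):
    """Normalised Google Distance (Cilibrasi & Vitanyi 2007).

    hits(terms) should return the number of indexed pages containing all
    terms in the list; M is the estimated total number of indexed pages
    (e.g. approximated by the count for a very frequent word like "the").
    """
    fx, fy, fxy = hits([x]), hits([y]), hits([x, y])
    if min(fx, fy, fxy) <= 0:
        return float("inf")   # unseen term or pair: treat as maximally distant
    lx, ly, lxy = math.log(fx), math.log(fy), math.log(fxy)
    return (max(lx, ly) - lxy) / (math.log(M) - min(lx, ly))

def min_ngd_over_wordforms(forms_x, forms_y, hits, M=1e10):
    """Query all combinations of word forms (as on the slide) and keep the
    smallest distance, i.e. the most related pair of forms."""
    return min(ngd(fx, fy, hits, M) for fx, fy in product(forms_x, forms_y))
```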
Modelling Cohesion Lexical Chains one free parameter: relatedness threshold ⇒ need annotated data to optimise! performance very sensitive to parameter and chaining algorithm Cohesion Graph model cohesion as a graph structure: nodes are content words, edges encode degree of relatedness between pairs of words compute how average relatedness changes if idiom is excluded from the graph: if it increases predict non-literal usage Caroline Sporleder csporled@coli.uni-sb.de Computational Models of Discourse
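A minimal sketch of the cohesion-graph classifier, assuming a relatedness score where higher means more related (a web-based distance such as NGD would first have to be turned into a similarity, e.g. by negating or inverting it); function names and details are illustrative:

```python
from itertools import combinations
from typing import Callable, List

def avg_relatedness(words: List[str], rel: Callable[[str, str], float]) -> float:
    """Average edge weight of the fully connected cohesion graph over `words`."""
    pairs = list(combinations(words, 2))
    if not pairs:
        return 0.0   # degenerate graph with fewer than two nodes
    return sum(rel(a, b) for a, b in pairs) / len(pairs)

def predict_non_literal(idiom_words: List[str],
                        context_words: List[str],
                        rel: Callable[[str, str], float]) -> bool:
    """Predict non-literal usage if excluding the idiom's component words
    from the graph increases the average relatedness, i.e. the idiom
    does not cohere with its context."""
    with_idiom = avg_relatedness(context_words + idiom_words, rel)
    without_idiom = avg_relatedness(context_words, rel)
    return without_idiom > with_idiom
```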
Data 17 idioms (mainly V+NP and V+PP) with literal and non-literal sense occurrences extracted from a Gigaword corpus (3964 instances) five paragraphs context manually labelled as “literal” (862 instances) or “non-literal” (3102 instances) Caroline Sporleder csporled@coli.uni-sb.de Computational Models of Discourse
Data
expression                         literal   non-literal    all
back the wrong horse                     0            25     25
bite off more than one can chew          2           142    144
bite one’s tongue                       16           150    166
blow one’s own trumpet                   0             9      9
bounce off the wall*                    39             7     46
break the ice                           20           521    541
drop the ball*                         688           215    903
get one’s feet wet                      17           140    157
pass the buck                            7           255    262
play with fire                          34           532    566
pull the trigger*                       11             4     15
rock the boat                            8           470    478
set in stone                             9           272    281
spill the beans                          3           172    175
sweep under the carpet                   0             9      9
swim against the tide                    1           125    126
tear one’s hair out                      7            54     61
all                                    862          3102   3964
Caroline Sporleder csporled@coli.uni-sb.de Computational Models of Discourse
Results
              B_Maj    B_Rep    Graph    LC_d     LC_o     Super
Acc           78.25    79.06    79.61    80.50    80.42    95.69
lit. Prec         -    70.00    52.21    62.26    53.89    84.62
lit. Rec          -     5.96    67.87    26.21    69.03    96.45
lit. F(β=1)       -    10.98    59.02    36.90    60.53    90.15
B_Maj: majority baseline, i.e., “non-literal”
B_Rep: predict “literal” if an idiom component word is repeated in the context
Graph: cohesion graph
LC_d: lexical chains optimised on development set
LC_o: lexical chains optimised globally by oracle (upper bound for lexical chains)
Super: supervised classifier (k-nn) using word overlap (leave-one-out)
Caroline Sporleder csporled@coli.uni-sb.de Computational Models of Discourse
Limitations of the Cohesion-Based Approach Literal Use without Lexical Chain Chinamasa compared McGown’s attitude to morphine to a child’s attitude to playing with fire – a lack of concern over the risks involved. Non-Literal Use with Lexical Chain Saying that the Americans were ”playing with fire” the official press speculated that the ”gunpowder barrel” which is Taiwan might well ”explode” if Washington and Taipei do not put a stop to their ”incendiary gesticulations.” ⇒ Both cases are relatively rare Caroline Sporleder csporled@coli.uni-sb.de Computational Models of Discourse
Centering Theory Caroline Sporleder csporled@coli.uni-sb.de Computational Models of Discourse
Referring Expressions: Terminology Discourse model a representation of discourse meaning discourse entities (usually realised by NPs) properties of the discourse entities relations between discourse entities Referring Expression an expression that a speaker uses to refer to an entity. Referent the entity which is referred to by the referring expression. Reference the process in which the speaker uses a referring expression to refer to a discourse entity. Caroline Sporleder csporled@coli.uni-sb.de Computational Models of Discourse
Discourse Model and Reference Dynamics of the discourse model referring expressions change the discourse model introduction of new discourse entities creation of links to “old” entities Caroline Sporleder csporled@coli.uni-sb.de Computational Models of Discourse
Discourse Model and Reference Reference and linguistic form: the linguistic form reflects the current state of the discourse context. Typically: new discourse entities are introduced by indefinite NPs; old discourse entities are referred to with definite NPs or pronouns ⇒ I saw a cat. The cat/It was black. But: Peter went over to the house. The door was wide open. He is going to the States for a year. (A to B when C walks by) Caroline Sporleder csporled@coli.uni-sb.de Computational Models of Discourse
Centering Theory (Grosz, Joshi, Weinstein, 1995) Aim: modelling the local coherence of a discourse segment. Why are some texts perceived as more coherent than others? Hypothesis: different types of referring expressions are associated with different inference loads; badly chosen referring expressions lead to a high inference load ⇒ the discourse is perceived as incoherent Caroline Sporleder csporled@coli.uni-sb.de Computational Models of Discourse
Differences in Coherence: Example John went to his favorite music store to buy a piano. It was a store John had frequented for many years. He was excited that he could finally buy a piano. It was closing just as John arrived. Caroline Sporleder csporled@coli.uni-sb.de Computational Models of Discourse
Differences in Coherence: Example John went to his favorite music store to buy a piano. He had frequented the store for many years. He was excited that he could finally buy a piano. He arrived just as the store was closing for the day. ⇒ coherence has something to do with local focus (attentional state) ⇒ Too many focus shifts make a text incoherent (cognitive processing of the text becomes more difficult) Caroline Sporleder csporled@coli.uni-sb.de Computational Models of Discourse
Focus structure and pronoun interpretation 1 Terry really goofs sometimes. 2 Yesterday was a beautiful day and he was excited about trying his new sailboat. 3 He wanted Tony to join him on a sailing expedition. 4 He called him at 6 am. 5 He was sick and furious at being woken up so early. “he” in (5)=Tony Discourse is perceived as incoherent because pronoun is supposed to refer to focal entity (=Terry). The fact that it refers to Tony here leads to a higher inference load. Caroline Sporleder csporled@coli.uni-sb.de Computational Models of Discourse
Focus structure and pronoun interpretation 1 Terry really goofs sometimes. 2 Yesterday was a beautiful day and he was excited about trying his new sailboat. 3 He wanted Tony to join him on a sailing expedition. 4 He called him at 6 am. 5 Tony was sick and furious at being woken up so early. 6 He told Terry to get lost and hung up. 7 Of course, Terry hadn’t intended to upset Tony. Caroline Sporleder csporled@coli.uni-sb.de Computational Models of Discourse
Modelling Focus in Centering Theory
Every utterance U_n has a backward-looking center C_b, which connects U_n with the preceding utterance U_{n-1}. For discourse-initial utterances C_b is undefined.
Each utterance U_n also has a partially ordered set of forward-looking centers C_f, which form a potential link with the following utterance U_{n+1}.
The partial order of C_f is determined, among other things, by the grammatical role of the referring expression, i.e., Subject ≺ Object ≺ Others (subject before object before others).
The highest-ranking element in the C_f of an utterance is the preferred center C_p.
The C_b of an utterance U_n is the highest-ranking element in the C_f of U_{n-1} that is realised in U_n.
Caroline Sporleder csporled@coli.uni-sb.de Computational Models of Discourse
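A small sketch of how C_p and C_b can be read off the ranked C_f lists, assuming every utterance is given as a list of (already resolved) entities ordered by grammatical role; this is only an illustration of the definitions above, not a full centering implementation:

```python
from typing import List, Optional

def preferred_center(cf: List[str]) -> Optional[str]:
    """C_p: the highest-ranked element of the utterance's own C_f list."""
    return cf[0] if cf else None

def backward_center(cf_current: List[str],
                    cf_previous: Optional[List[str]]) -> Optional[str]:
    """C_b of U_n: the highest-ranked element of C_f(U_{n-1}) that is
    realised in U_n; undefined (None) for discourse-initial utterances."""
    if not cf_previous:
        return None                    # discourse-initial: C_b undefined
    realised = set(cf_current)
    for entity in cf_previous:         # cf_previous is already ranked
        if entity in realised:
            return entity
    return None

# Hypothetical usage (pronouns already mapped to their referents):
# u1 = ["John", "problems", "holidays"]
# u2 = ["John", "anybody", "duties"]   # "he" resolved to John
# backward_center(u2, u1)  ->  "John"
```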
Example
John has many problems with organising his holidays.
  C_b = {undef}, C_f = {John, problems, holidays}, C_p = {John}
He cannot find anybody to take over his duties.
  C_b = {he=John}, C_f = {he=John, anybody, duties}, C_p = {he=John}
Yesterday he phoned Mike to make a plan.
  C_b = {he=John}, C_f = {he=John, Mike, plan}, C_p = {he=John}
Mike has annoyed him very much recently.
  C_b = {him=John}, C_f = {Mike, him=John}, C_p = {Mike}
He phoned John at 5 o’clock in the morning last Friday.
  C_b = {he=Mike}, C_f = {he=Mike, John, Friday, 5 o’clock}, C_p = {he=Mike}
Caroline Sporleder csporled@coli.uni-sb.de Computational Models of Discourse
Center Transitions
different types of center transitions are possible, depending on whether the C_b continues or not
the chosen center transitions determine the coherence of a text

                              C_b(U_n) = C_b(U_{n-1})         C_b(U_n) ≠ C_b(U_{n-1})
                              or C_b(U_{n-1}) undefined
C_b(U_n) = C_p(U_n)           Continue                        Smooth-Shift
C_b(U_n) ≠ C_p(U_n)           Retain                          Rough-Shift

Preferred order for transitions: Continue > Retain > Smooth-Shift > Rough-Shift
Caroline Sporleder csporled@coli.uni-sb.de Computational Models of Discourse
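A minimal sketch of the transition classification as given in the table above, where an undefined C_b(U_{n-1}) is treated like the "equal" column; names are illustrative, and C_b / C_p are assumed to be computed as in the earlier sketch:

```python
def classify_transition(cb_current, cb_previous, cp_current) -> str:
    """Classify the centering transition between U_{n-1} and U_n.

    cb_previous is None when C_b(U_{n-1}) is undefined.
    """
    cb_continues = (cb_previous is None) or (cb_current == cb_previous)
    if cb_current == cp_current:
        return "Continue" if cb_continues else "Smooth-Shift"
    return "Retain" if cb_continues else "Rough-Shift"

# Preference order from the slide, usable to score a segment's coherence.
TRANSITION_RANK = {"Continue": 0, "Retain": 1, "Smooth-Shift": 2, "Rough-Shift": 3}
```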
Center Transitions: Example
John has many problems with organising his holidays.
  C_b = {undef}, C_f = {John, problems, holidays}, C_p = {John}
He cannot find anybody to take over his duties.
  C_b = {he=John}, C_f = {he=John, anybody, duties}, C_p = {he=John}
  Transition: Continue
Yesterday he phoned Mike to make a plan.
Caroline Sporleder csporled@coli.uni-sb.de Computational Models of Discourse