Unweighted example

Example
A = ({ℓ, r, ⊤, ⊥}, Σ, {⊤}, δ) with Σ = {σ^(2), α^(0), β^(0)}

[transition diagrams omitted]
◮ ⊤ is reached on ℓ or r in the left or right subtree
◮ ⊥ can accept any tree
◮ ℓ and r accept α and β and propagate

Recognizability and Weighted Tree Transducers · A. Maletti
Unweighted example

Example
Input tree: [diagram omitted]

Accepting runs: [diagrams omitted; each accepting run labels the root ⊤, uses ℓ (resp. r) to track an occurrence of α in the left subtree (resp. β in the right subtree), and labels all remaining nodes ⊥]

Recognized language
A = {σ(t1, t2) : |t1|_α ≠ 0 or |t2|_β ≠ 0} = {σ(t1, t2) : |t1|_α + |t2|_β ≠ 0}
Weighted example

Example
A = ({ℓ, r, ⊤, ⊥}, Σ, F, δ) over the field (ℝ, +, ·, 0, 1) of reals
F(⊤) = 1 and F(q) = 0 otherwise
Σ = {σ^(2), α^(0), β^(0)}

[transition diagrams omitted; every transition carries weight 1, except β → r, which carries weight −1]
Weighted example

Example
Input tree: [diagram omitted]

Non-zero weighted runs: [diagrams omitted; each run tracking an α in the left subtree has weight 1, and each run tracking a β in the right subtree has weight −1]

Recognized weighted language
A(σ(t1, t2)) = |t1|_α − |t2|_β

Note
The support supp(A) = {σ(t1, t2) : |t1|_α ≠ |t2|_β} (i.e., the language of trees with non-zero weight) is not recognizable!
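The weighted language above can be checked mechanically by summing run weights bottom-up. The sketch below is a generic WTA evaluator; the concrete transition table is an assumption reconstructed from the slides (state names `l`, `r`, `top`, `bot`, all weights 1 except β → r with weight −1), not the author's exact automaton.

```python
from itertools import product

# Assumed transition table: (symbol, tuple of child states) -> list of
# (target state, weight). All weights 1 except beta -> r, which is -1.
DELTA = {
    ("alpha", ()): [("l", 1), ("bot", 1)],
    ("beta", ()): [("r", -1), ("bot", 1)],
    ("sigma", ("bot", "bot")): [("bot", 1)],
    ("sigma", ("l", "bot")): [("l", 1), ("top", 1)],  # top: alpha found in t1
    ("sigma", ("bot", "l")): [("l", 1)],
    ("sigma", ("r", "bot")): [("r", 1)],
    ("sigma", ("bot", "r")): [("r", 1), ("top", 1)],  # top: beta found in t2
}
FINAL = {"top": 1}  # F(top) = 1, all other final weights are 0

def run_weights(tree):
    """Map each state q to the sum of weights of all runs on `tree` ending in q."""
    sym, children = tree[0], tree[1:]
    child_vals = [run_weights(c) for c in children]
    vals = {}
    for combo in product(*(cv.keys() for cv in child_vals)):
        w = 1
        for cv, q in zip(child_vals, combo):
            w *= cv[q]
        for target, tw in DELTA.get((sym, combo), []):
            vals[target] = vals.get(target, 0) + w * tw
    return vals

def A(tree):
    """A(t) = sum over states q of F(q) times the total run weight into q."""
    return sum(FINAL.get(q, 0) * w for q, w in run_weights(tree).items())
```

On σ(t1, t2) the runs into ⊤ contribute |t1|_α · 1 + |t2|_β · (−1), so the evaluator returns |t1|_α − |t2|_β.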
Contents
1. Motivation
2. Recognizable Weighted Tree Language
3. Weighted Extended Top-down Tree Transducer
4. Preservation of Recognizability
5. Nonpreservation of Recognizability
Syntax

Definition (Arnold, Dauchet 1976; Graehl, Knight 2004)
A weighted extended top-down tree transducer (WXTT) is M = (Q, Σ, Δ, I, R) with finitely many rules of the form

q(l) → r with weight c [diagram omitted: the left-hand side l is a Σ-tree over variables x1, ..., xk; the right-hand side r is a Δ-tree whose leaves may carry states applied to variables, such as q′(x_i) and p(x_j)]

with states q, q′, p ∈ Q and variable indices i, j ∈ {1, ..., k}

[Arnold, Dauchet: Bi-transductions de forêts. Proc. ICALP 1976]
[Graehl, Knight: Training tree transducers. Proc. NAACL 2004]
Syntax

Definition (Rounds 1970, Thatcher 1970)
A WXTT is a weighted top-down tree transducer (WTT) if the left-hand side of every rule contains exactly one input symbol, i.e., all rules have the form q(σ(x1, ..., xk)) → r with weight c [diagram omitted]

[Rounds: Mappings and grammars on trees. Math. Syst. Theory, 1970]
[Thatcher: Generalized sequential machine maps. J. Comput. Syst. Sci., 1970]
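The only difference between the two models is the shape of the rule's left-hand side. A minimal sketch of that distinction, using nested tuples for patterns and ("x", i) for variables (a hypothetical encoding, not the author's):

```python
def is_shallow(lhs_pattern):
    """A WXTT rule is a plain WTT rule iff its left-hand side is a single
    input symbol applied directly to variables: q(sigma(x1, ..., xk))."""
    symbol, children = lhs_pattern[0], lhs_pattern[1:]
    return symbol != "x" and all(child[0] == "x" for child in children)

# WTT-style left-hand side: sigma(x1, x2)
shallow = ("sigma", ("x", 1), ("x", 2))
# extended (WXTT-only) left-hand side: sigma(sigma(x1, x2), x3)
deep = ("sigma", ("sigma", ("x", 1), ("x", 2)), ("x", 3))
```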
Semantics

Example
States {q_S, q_V, q_NP}, of which only q_S has non-zero initial weight

[rule diagrams omitted: three rules with weights 0.4, 1, and 1; the first rewrites an S-rooted input to S′ and copies one of its subtrees, the other two rewrite VP-rooted inputs, each keeping only one of the two variables]

Derivation: [diagram omitted; the three rules apply with weights 0.4, 1, and 1]
Semantics

Definition
Computed transformation (t ∈ T_Σ and u ∈ T_Δ):

M(t, u) = Σ_{q ∈ Q} Σ I(q) · c1 · ... · cn

where the inner sum ranges over all left-most derivations q(t) ⇒^(c1) ··· ⇒^(cn) u
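The derivation sum can be computed recursively. The rules below are hypothetical (a child-swapping rule and a probabilistic leaf rule), chosen only to exercise the definition; right-hand-side templates use ("var", i, q) for state q applied to the i-th input subtree.

```python
# Toy WXTT (assumed rules, not the ones from the slides).
# RULES maps (state, input symbol) to a list of (rhs template, weight);
# a template leaf ("var", i, q) means "derive the i-th input subtree in state q".
RULES = {
    ("q", "sigma"): [(("sigma", ("var", 2, "q"), ("var", 1, "q")), 1.0)],  # swap subtrees
    ("q", "alpha"): [(("alpha",), 0.5), (("beta",), 0.5)],
}
I = {"q": 1.0}  # initial weights

def apply_state(q, t):
    """All outputs derivable from q(t), with their summed derivation weights."""
    out = {}
    for rhs, c in RULES.get((q, t[0]), []):
        for u, w in instantiate(rhs, t).items():
            out[u] = out.get(u, 0.0) + c * w
    return out

def instantiate(rhs, t):
    if rhs[0] == "var":                      # ("var", i, q): recurse into subtree i
        _, i, q = rhs
        return apply_state(q, t[i])
    combos = {(): 1.0}                       # cartesian product over template children
    for child in rhs[1:]:
        child_out = instantiate(child, t)
        combos = {us + (u,): w1 * w2
                  for us, w1 in combos.items() for u, w2 in child_out.items()}
    return {(rhs[0],) + us: w for us, w in combos.items()}

def M(t):
    """M(t, u) for every u: the sum of I(q) * c1 * ... * cn over derivations."""
    out = {}
    for q, iw in I.items():
        for u, w in apply_state(q, t).items():
            out[u] = out.get(u, 0.0) + iw * w
    return out
```

For t = σ(α, α) the toy transducer produces four outputs, each with weight 0.25, and the weights of all derivations sum to 1.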
Preservation of Recognizability
Preservation of recognizability

Definition (Forward application)
For M : T_Σ × T_Δ → C and A : T_Σ → C:

[M(A)](u) = Σ_{t ∈ T_Σ} A(t) · M(t, u)

Approach
1. Input (or output) product followed by projection
2. Direct construction
Input product + projection

Definition (Forward application)
For M : T_Σ × T_Δ → C and A : T_Σ → C:

[M(A)](u) = Σ_{t ∈ T_Σ} A(t) · M(t, u)

Definition (Input product)
The input product of a WTA A and a WXTT M is a WXTT A_M with A_M(t, u) = A(t) · M(t, u)

Definition (Range projection)
For a WXTT M: [ran(M)](u) = Σ_{t ∈ T_Σ} M(t, u)

Consequently, M(A) = ran(A_M)
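When M and A are given extensionally as finite dictionaries (only sensible as a toy; real transducers are given by rules), forward application, input product, and range projection are direct sums; a sketch:

```python
def forward(M, A):
    """[M(A)](u) = sum over t of A(t) * M(t, u); M is a finite dict {(t, u): weight}."""
    out = {}
    for (t, u), w in M.items():
        out[u] = out.get(u, 0) + A.get(t, 0) * w
    return out

def input_product(M, A):
    """A_M(t, u) = A(t) * M(t, u)."""
    return {(t, u): A.get(t, 0) * w for (t, u), w in M.items()}

def ran(M):
    """[ran(M)](u) = sum over t of M(t, u): forward application with A identically 1."""
    out = {}
    for (t, u), w in M.items():
        out[u] = out.get(u, 0) + w
    return out
```

The identity M(A) = ran(A_M) from the slide then holds by construction: forward(M, A) == ran(input_product(M, A)).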
Product + projection

Positive
◮ two simple generic constructions: Bar-Hillel construction and projection
◮ reusable
◮ explain most known cases

Negative
◮ requirements of the two constructions
◮ inefficiencies
Product + projection

Requirement
model    | input product | range projection | output product | domain projection
ln-XTOP  |       ✓       |        ✓         |       ✓        |         ✓
l-XTOP   |      ✓/✗      |        ✗         |       ✓        |         ✓
XTOP     |       ✗       |        ✗         |       ✓        |         ✗
ln-MBOT  |       ✓       |        ✗         |       ✓        |         ✓
l-MBOT   |       ✓       |        ✗         |       ✓        |         ✓
MBOT     |       ✗       |        ✗         |       ✓        |         ✗
ln-STSSG |       ✓       |        ✗         |       ✓        |         ✗

Conclusion
Nondeletion is essential for the input product!
Nondeletion

Example
[rule diagrams omitted: the first rule is nondeleting; the second and third are linear]

Definition
A WXTT M is
◮ nondeleting if var(l) = var(r) for all rules l → r
◮ linear if no variable appears twice in r, for all rules l → r
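Both conditions are easy to test syntactically on rule trees; a minimal sketch with variables encoded as ("x", i):

```python
from collections import Counter

def var_occurrences(t):
    """All variable occurrences in one side of a rule, given as nested tuples."""
    if t[0] == "x":
        return [t]
    return [v for child in t[1:] for v in var_occurrences(child)]

def is_nondeleting(lhs, rhs):
    """var(l) = var(r): every left-hand-side variable reappears on the right."""
    return set(var_occurrences(lhs)) == set(var_occurrences(rhs))

def is_linear(rhs):
    """No variable appears twice in the right-hand side."""
    return all(n == 1 for n in Counter(var_occurrences(rhs)).values())
```

Note the two properties are independent, as in the example above: a copying rule (x2 used twice) is nondeleting but not linear, while a rule that drops x2 is linear but deleting.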
Nondeletion

Example
[rule diagrams omitted: the first rule is nondeleting; the second deletes x2; the third deletes x1]

Definition
◮ all-copies nondeleting (= nondeleting): every copy of an input subtree is fully explored
◮ some-copy nondeleting: at least one copy of each input subtree is fully explored
Nondeletion

Example
[rule diagrams omitted] This rule set is not some-copy nondeleting.

Example (Derivation)
[diagram omitted]
Nondeletion

Example
[rule diagrams omitted] This variant of the rule set can be some-copy nondeleting.

Example (Derivation)
[diagram omitted]
Scenario 1

Theorem (Engelfriet 1977)
For a nondeleting WXTT M and a WTA A we can construct A_M

Proof.
[diagram omitted: from an original rule with weight 0.4 and original WTA transitions with weight c, a product rule over paired states such as ⟨q_S, p⟩, ⟨q_V, p2⟩, ⟨q_NP, p1⟩ is constructed with weight c · 0.4]

For the original nondeleting rules, construct new rules that mark one state for each variable (one possibility). [diagram omitted]
Scenario 2

Theorem (∼2010)
For a some-copy nondeleting WXTT M and a WTA A over an idempotent semiring we can construct A_M

Proof.
For the original nondeleting rules, construct new rules that mark one state for each variable, in all possibilities. [diagram omitted]
◮ at least one exploration will succeed (some-copy nondeletion)
◮ if several succeed, idempotency collapses the duplicates: aebfd + abfde = abdef
Scenario 3

Theorem (∼2010)
For a some-copy nondeleting WXTT M and a WTA A over a ring we can construct A_M

Proof.
For the original nondeleting rules, construct several new rules and mark states according to an elimination scheme; one alternative carries weight −1. [diagram omitted]
◮ at least one exploration will succeed
◮ if several succeed, the negatively weighted alternative cancels the surplus: aebfd + abfde − aebfd = abfde [diagram omitted]
Elimination schemes

Question
Do elimination schemes exist?

Answer
      | 001 010 100 011 101 110 111 | Σ
signs |  +   +   +   −   −   −   +  |
001   |  0   0   0   0   0   0   a  | a
010   |  0   0   0   0   0   0   a  | a
100   |  0   0   0   0   0   0   a  | a
011   |  0   0   0   0   a   a  −a  | a
101   |  0   0   0   0   a   a  −a  | a
110   |  0   0   0   0   a   a  −a  | a
111   |  a   a   a  −a  −a  −a   a  | a
Direct construction

Applicability
Here only l-XTOP (product + projection fails)

Failure
The input product fails because it cannot attach weights to deleted subtrees, but the range projection disregards the input trees.

Solution
Assign the aggregate weight to transitions deleting subtrees.
Bonus scenario

[diagram omitted: the constructed rule carries weight c′ = c · in(p1), charging the deleted subtree via the inside weight of p1]

Inside weight of p:

in(p) = Σ_{t ∈ T_Σ} Σ_{r run on t with root(r) = p} wt(r)
Bonus scenario

Theorem
For a linear WXTT M and a WTA A we can construct A_M if the inside weights of A can be computed

Computation of inside weights
◮ trivial in the Boolean semiring
◮ typically simple in extremal semirings (Viterbi algorithms)
◮ possible in ℕ (deciding finiteness of support) → possible in many interesting cases
◮ approximation possible for ℝ (Newton's method)
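Over ℝ the inside weights solve a polynomial fixpoint system, in(p) = Σ over rules w · Π in(p_i). For the toy single-state automaton below (an assumption for illustration: α → p with weight 0.4, σ(p, p) → p with weight 0.3) plain Kleene iteration already converges to the least fixpoint, which here also has a closed form:

```python
import math

# Hypothetical WTA over the reals with one state p:
#   alpha -> p with weight 0.4,  sigma(p, p) -> p with weight 0.3.
# The inside weight satisfies x = 0.4 + 0.3 * x**2.
def inside_weight(leaf_w=0.4, binary_w=0.3, steps=200):
    x = 0.0
    for _ in range(steps):
        x = leaf_w + binary_w * x * x   # fixpoint iteration from below
    return x

def closed_form(leaf_w=0.4, binary_w=0.3):
    # Least root of binary_w * x**2 - x + leaf_w = 0, for comparison.
    return (1 - math.sqrt(1 - 4 * leaf_w * binary_w)) / (2 * binary_w)
```

Newton's method, mentioned above, accelerates exactly this kind of iteration when plain iteration converges slowly.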
Nonpreservation of Recognizability
Overview

model    | M(L) recognizable? | M⁻¹(L) recognizable?
ln-XTOP  |        ✓           |          ✓
l-XTOP   |     ✓/✗ (✓)        |          ✓
XTOP     |      ✗ (✓)         |          ✗
ln-MBOT  |        ✗           |          ✓
l-MBOT   |      ✓ (✓)         |          ✗
MBOT     |      ✗ (✓)         |          ✗
ln-STSSG |        ✗           |          ✗

Limitation
◮ no coverage of unweighted failures
◮ only backward application of XTOP! (same phenomenon for MBOT)
Counterexample

Example (WXTT M and WTA A)
[rule diagrams omitted: M uses states q and p, with rule weights 1 and 2]

Transformation
[diagram omitted: a chain of γ's is mapped to a full binary σ-tree over α-leaves]

Weighted tree language
A(u) = 2^(|u|_α)
Counterexample

Transformation M: [diagram omitted] an input tree t containing |t|_γ occurrences of γ is mapped to the full binary σ-tree u with |u|_α = 2^(|t|_γ) leaves

Weighted tree language A(u) = 2^(|u|_α)

Backward application
[M⁻¹(A)](t) = 2^(2^(|t|_γ))

Theorem
For every WTA A over ℕ there exists n ∈ ℕ such that A(t) ≤ n^(|t|+1) for all t ∈ T_Σ

Since [M⁻¹(A)](t) = 2^(2^(|t|_γ)) grows faster than every such bound, M⁻¹(A) is not recognizable.

[Fülöp, ∼, Vogler: Weighted extended tree transducers. Fundam. Inform. 2011]
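The gap between the two growth rates can be checked numerically. The size convention |t| = k + 1 for the chain with k occurrences of γ is an assumption; the conclusion does not depend on it.

```python
def backward_weight(k):
    """[M^{-1}(A)](t) for the chain t with k occurrences of gamma: 2^(2^k)."""
    return 2 ** (2 ** k)

def wta_bound(n, k):
    """The theorem's bound n^(|t| + 1), with |t| = k + 1 assumed."""
    return n ** (k + 2)
```

Even for a generous base such as n = 10^6 the doubly exponential weight overtakes the bound at small k and stays above it, since squaring 2^(2^k) per step beats multiplying the bound by n per step.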