Grammatical inference and subregular phonology Adam Jardine Rutgers University December 11, 2019 · Tel Aviv University
Review
Outline of course • Day 1: Learning, languages, and grammars • Day 2: Learning strictly local grammars • Day 3: Automata and input strictly local functions • Day 4: Learning functions and stochastic patterns, other open questions 2
Review of days 1 & 2 • Phonological patterns are governed by restrictive computational universals • We studied one such universal of strict locality 3
Review of days 1 & 2 • We studied learning SL languages under the paradigm of identification in the limit from positive data [Figure: a presentation p of the target language L ⋆ — e.g. p (0) = abab, p (1) = ababab, p (2) = ab, ... — is fed to the learner A, which outputs a grammar G i after each finite prefix p [ i ]] 4
Today • Learning with finite-state automata for – strictly local languages – input-strictly local functions 5
Strictly local acceptors
Strictly local acceptors Engelfriet & Hoogeboom, 2001 “It is always a pleasant surprise when two formalisms, introduced with different motivations, turn out to be equally powerful, as this indicates that the underlying concept is a natural one.” (p. 216) 6
Strictly local acceptors • A finite-state acceptor (FSA) is a set of states and transitions between states b b a 0 1 a 7
Strictly local acceptors b b a 0 1 a a b b a 8
Strictly local acceptors b b a 0 1 a a b b a 0 → 1 → 1 → 1 → 0 ✓ 8
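A run like the one traced above can be sketched in a few lines of Python. The machine below is a hypothetical bigram-style acceptor over {a, b} (state '>' before anything is read; otherwise the state is the last symbol seen), not necessarily the exact acceptor on the slide; it bans the factor bb simply by omitting that transition.

```python
def accepts(delta, start, finals, w):
    """Follow one transition per input symbol; a missing transition
    (a forbidden configuration) rejects immediately."""
    q = start
    for sym in w:
        if (q, sym) not in delta:
            return False
        q = delta[(q, sym)]
    return q in finals

# Hypothetical SL2-style machine: state 0 = "last symbol was b",
# state 1 = "last symbol was a".  (0, 'b') is deliberately absent,
# so any string containing the factor bb is rejected.
delta = {('>', 'a'): 1, ('>', 'b'): 0,
         (0, 'a'): 1, (1, 'a'): 1, (1, 'b'): 0}

accepts(delta, '>', {'>', 0, 1}, 'abab')   # accepted: no bb
accepts(delta, '>', {'>', 0, 1}, 'abba')   # rejected: contains bb
```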
Strictly local acceptors b b a 0 1 a b a a b b a 9
Strictly local acceptors b b a 0 1 a b a a b b a 0 → 0 → 1 → 0 → 0 → 0 → 1 ✗ 9
Strictly local acceptors • An SL k FSA’s states represent the ( k − 1)-factors of Σ ∗ b b a b 0 1 0 1 a a Not SL k for any k SL 2 ; 0 = b , 1 = a 10
Strictly local acceptors • Traversing a SL k FSA is equivalent to scanning for k factors b a b a ⋊ a b a b ⋉ 11
Strictly local acceptors • Forbidden k -factors are expressed by missing transitions/accepting states b a b a ⋊ a b b ⋉ ✗ 12
Strictly local acceptors • SLFSAs describe exactly the SL languages • Thus, they capture the same concept of locality as SL grammars, but in a different way 13
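The scanning view of strict locality can be written down directly: acceptance is just checking that no width-k window over ⋊w⋉ is a forbidden factor. A minimal sketch for k = 2, with '>' and '<' standing in for ⋊ and ⋉ and a made-up forbidden set (for k > 2 one would pad with k − 1 boundary symbols on each side):

```python
def sl_accepts(forbidden, k, w):
    """Accept w iff no k-factor of >w< (boundaries added) is forbidden."""
    s = '>' + w + '<'
    return not any(s[i:i + k] in forbidden
                   for i in range(len(s) - k + 1))

sl_accepts({'bb'}, 2, 'abab')   # True: no window matches bb
sl_accepts({'bb'}, 2, 'abba')   # False: a bb window occurs
```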
Learning with strictly local acceptors
Learning with strictly local acceptors • Finite-state automata are useful because a number of learning techniques exist for them (de la Higuera, 2010) • We’ll use the ‘transition filling’ technique of Heinz and Rogers (2013) 14
Learning with strictly local acceptors b a a 0 1 b 15
Learning with strictly local acceptors ⊥ ⊥ b : ⊥ ⊥ a : ⊥ ⊥ ⊤ a : ⊤ ⊤ 0 : ⊥ ⊥ ⊥ 1 : ⊤ ⊤ ⊤ ⊤ b : ⊤ ⊤ • output function Q × Σ → {⊤ , ⊥} • ending function Q → {⊤ , ⊥} 15
Learning with strictly local acceptors C : ⊥ C : ⊥ C : ⊥ V : ⊥ C : ⊥ ⋊ : ⊥ V : ⊥ V : ⊥ V : ⊥ Learning procedure: • Start with ‘empty’ SL k FSA • Change ⊥ transitions to ⊤ when traversed by input data 16
Learning with strictly local acceptors data C : ⊥ 0 CV 1 V C : ⊥ C : ⊤ V : ⊤ 2 CV CV C : ⊤ ⋊ : ⊥ V : ⊤ V : ⊥ V : ⊤ Learning procedure: • Start with ‘empty’ SL k FSA • Change ⊥ transitions to ⊤ when traversed by input data 16
Learning with strictly local acceptors C : ⊥ C : ⊥ C : ⊥ V : ⊥ C : ⊥ ⋊ : ⊥ V : ⊥ V : ⊥ V : ⊥ • Any SL 2 language can be described by varying {⊤ , ⊥} on this structure 17
Learning with strictly local acceptors C : ⊥ C : ⊤ C : ⊤ V : ⊤ C : ⊤ ⋊ : ⊥ V : ⊤ V : ⊤ V : ⊤ • Any SL 2 language can be described by varying {⊤ , ⊥} on this structure 17
Learning with strictly local acceptors C : ⊥ C : ⊥ C : ⊤ V : ⊤ C : ⊤ ⋊ : ⊥ V : ⊤ V : ⊥ V : ⊤ • Any SL 2 language can be described by varying {⊤ , ⊥} on this structure 17
Learning with strictly local acceptors C : ⊥ C : ⊥ CC : ⊥ V : ⊥ C : ⊥ C : ⊥ C : ⊥ CV : ⊥ V : ⊥ ⋊ : ⊥ V : ⊥ C : ⊥ C : ⊥ V C : ⊥ V : ⊥ V : ⊥ C : ⊥ V : ⊥ V V : ⊥ V : ⊥ V : ⊥ • Any SL 3 language can be described by this structure 18
Learning with strictly local acceptors • This procedure identifies in the limit from positive data any SL k language, for a given k • It is a distinct learner, yet based on the same notion of locality as the grammar-based one 19
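The transition-filling procedure above can be sketched for k = 2. This is my own minimal rendering of the idea, not Heinz and Rogers’ exact formulation: start with every transition and ending flag set to ⊥ (False), then flip to ⊤ (True) each transition and ending actually used by a positive example.

```python
def learn_sl2(sample, alphabet):
    """Transition filling for k = 2: states are the previously read
    symbol, with '>' as the initial (word-boundary) state."""
    states = ['>'] + list(alphabet)
    trans = {(q, a): False for q in states for a in alphabet}
    end = {q: False for q in states}
    for w in sample:
        q = '>'
        for a in w:
            trans[(q, a)] = True   # this bigram was observed in the data
            q = a
        end[q] = True              # a word may end in this state
    return trans, end

# CV-syllable-like toy data: only observed bigrams/endings become ⊤.
trans, end = learn_sl2(['CV', 'V', 'CVCV'], 'CV')
```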
Input strictly local functions
Input strictly local functions • Generative phonology is primarily interested in maps /kam-pa/ → [kamba], e.g. the rule C → [+ voi ] / N _ , or the tableau

/kam-pa/ | Faith | *NC̥ | Ident(voi)
[kampa] | | *! |
[kama] | *! | |
☞ [kamba] | | | *

20
Input strictly local functions • Maps are (functional) relations /NC̥/ → [NC̬] { ( an , an ), ( anda , anda ), ( anta , anda ), ( lalalalampa , lalalalamba ), ... } • We can study classes of relations like we studied classes of formal languages 21
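The post-nasal voicing map itself is easy to compute with a window of width 2 over the input: the output for each segment depends only on that segment and the previous input segment. A sketch, with a hypothetical toy segment inventory:

```python
VOICED = {'p': 'b', 't': 'd', 'k': 'g'}   # voiceless stop -> voiced
NASALS = set('mn')

def postnasal_voicing(w):
    """Voice a voiceless stop immediately after a nasal.  Each output
    symbol depends only on the current and previous *input* symbols,
    so the map is computable left-to-right with a width-2 window."""
    out = []
    prev = None
    for c in w:
        out.append(VOICED[c] if prev in NASALS and c in VOICED else c)
        prev = c
    return ''.join(out)

postnasal_voicing('anta')          # 'anda'
postnasal_voicing('lalalalampa')   # 'lalalalamba'
```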
Input strictly local functions • Johnson (1972); Kaplan and Kay (1994): phonological maps are regular [Figure: memory needed as a function of the length of w — bounded for regular maps, growing for non-regular maps] • Regular functions ≠ regular languages! 22
Input strictly local functions computable functions Reg • How do we extend strict locality to functions? 23
Input strictly local functions computable functions Subseq Reg • How do we extend strict locality to functions? • Phonological maps are subsequential ... (Mohri, 1997; Heinz and Lai, 2013, et seq.) 23
Subsequential transducers ⊥ ⊥ b : ⊥ ⊥ a : ⊥ ⊥ ⊤ a : ⊤ ⊤ 0 : ⊥ ⊥ ⊥ 1 : ⊤ ⊤ ⊤ ⊤ b : ⊤ ⊤ Deterministic acceptor: • output function Q × Σ → {⊤ , ⊥} • ending function Q → {⊤ , ⊥} 24
Subsequential transducers a : a b : b a : a 0 : d 1 : λ b : c Subsequential transducer: • output function Q × Σ → Γ ∗ • ending function Q → Γ ∗ 24
Subsequential transducers a : a b : b a : a 0 : d 1 : λ b : c b a b b 25
Subsequential transducers a : a b : b a : a 0 : d 1 : λ b : c b a b b 0 → 0 → 1 → 0 → 0 b a c b d 25
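Running a subsequential transducer is a small extension of running an acceptor: emit a string on each transition, then append the ending string of the final state reached. Below is a sketch with the transitions as I reconstruct them from the slide (state 0 initial; endings 0 : d, 1 : λ).

```python
def transduce(delta, out, final, q0, w):
    """Emit out[(q, sym)] on each transition taken, then append the
    ending string final[q] for the state the run halts in."""
    q, result = q0, []
    for sym in w:
        result.append(out[(q, sym)])
        q = delta[(q, sym)]
    result.append(final[q])
    return ''.join(result)

# Reconstructed machine: states 0 and 1 over the alphabet {a, b}.
delta = {(0, 'a'): 1, (0, 'b'): 0, (1, 'a'): 1, (1, 'b'): 0}
out   = {(0, 'a'): 'a', (0, 'b'): 'b', (1, 'a'): 'a', (1, 'b'): 'c'}
final = {0: 'd', 1: ''}   # state 1's ending string is λ (empty)

transduce(delta, out, final, 0, 'babb')   # 'bacbd', as in the trace above
```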
Subsequential transducers Let’s do some examples... 26