Full reduction at full throttle


  1. Full reduction at full throttle, Section 1. Maxime Klusman, Radboud University Nijmegen, January 23, 2015.

  2. Recap

     Terms ∋ t ::= x | t₁ t₂ | v
     Val   ∋ v ::= λx.t | [x̃ v₁ ... vₙ]

     (λx.t) v → t[x ← v]                          (β_v)
     [x̃ v₁ ... vₙ] v → [x̃ v₁ ... vₙ v]             (β_s)
     Γ(t) → Γ(t′) if t → t′,   contexts Γ ::= t [ ] | [ ] v

     N(t)             = R(V(t))                              (1)
     R(λx.t)          = λy. N((λx.t) [ỹ])       (y fresh)    (2)
     R([x̃ v₁ ... vₙ]) = x R(v₁) ... R(vₙ)                     (3)
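
     A small worked example of these rules (my addition, not on the slide): take
     t = (λx.λy. x y) [z̃], the term λx.λy. x y applied to the free variable z.
     Weak evaluation gives V(t) = λy. [z̃] y by β_v, and readback then proceeds:

       N(t) = R(λy. [z̃] y)                          by (1)
            = λy′. N((λy. [z̃] y) [ỹ′])              by (2), y′ fresh
            = λy′. R([z̃ [ỹ′]])                      by (1), after β_v and β_s
            = λy′. z R([ỹ′]) = λy′. z y′            by (3)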

  3. Abstract setting

     module type Values = sig
       type t
       val app : t -> t -> t

       type atom = Var of var
       type head =
         | Lam of (t -> t)
         | Accu of atom * t list

       val head : t -> head
       val mkLam : (t -> t) -> t
       val mkAccu : atom -> t
     end
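
     This interface is what the normalizer is written against; the slides below give two
     implementations of it, tagged (slide 5) and tagless (slide 8). A minimal client sketch
     of my own, assuming var = string was declared before the signature:

       module Client (V : Values) = struct
         (* the value of the open term (λv. v) x, built through the interface only *)
         let value = V.app (V.mkLam (fun v -> v)) (V.mkAccu (V.Var "x"))
       end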

  4. Compilation and normalization

     ⟦x⟧_B     = x                              if x ∈ B
     ⟦x⟧_B     = mkAccu (Var x)                 otherwise
     ⟦λx.t⟧_B  = mkLam (fun x → ⟦t⟧_(B ∪ {x}))
     ⟦t₁ t₂⟧_B = app ⟦t₁⟧_B ⟦t₂⟧_B

     N_Λ t                       = R_V ⟦t⟧_∅
     R_V v                       = R (head v)
     R (Lam f)                   = λy. R_V (f (mkAccu (Var y)))      (y fresh)
     R (Accu (a, [vₙ; ...; v₁])) = (R_A a) (R_V v₁) ... (R_V vₙ)
     R_A (Var x)                 = x
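
     The scheme above is a compilation scheme: ⟦t⟧_B is OCaml code, with B the set of
     λ-bound variables. As an illustration only, here is an interpreter-style sketch of my
     own that follows the same equations, replacing B by a runtime environment and using a
     hypothetical source type term and fresh-name counter (assuming var = string):

       type term = Var of string | Lam of string * term | App of term * term

       module Norm (V : Values) = struct
         (* ⟦t⟧_B: λ-bound variables are looked up in env; free ones become accumulators *)
         let rec compile env t =
           match t with
           | Var x ->
               (match List.assoc_opt x env with
                | Some v -> v
                | None -> V.mkAccu (V.Var x))
           | Lam (x, body) -> V.mkLam (fun v -> compile ((x, v) :: env) body)
           | App (t1, t2) -> V.app (compile env t1) (compile env t2)

         (* fresh names for readback *)
         let counter = ref 0
         let fresh () = incr counter; Printf.sprintf "y%d" !counter

         (* R_V, R and R_A: read a value back into a source term *)
         let rec readback v = read_head (V.head v)
         and read_head = function
           | V.Lam f ->
               let y = fresh () in
               Lam (y, readback (f (V.mkAccu (V.Var y))))
           | V.Accu (a, args) ->
               (* args holds the most recently applied argument first *)
               List.fold_right (fun arg acc -> App (acc, readback arg)) args (read_atom a)
         and read_atom (V.Var x) = Var x

         (* N_Λ *)
         let normalize t = readback (compile [] t)
       end

     With this sketch, normalize (Lam ("a", Lam ("b", App (Var "a", Var "b")))) would return
     Lam ("y1", Lam ("y2", App (Var "y1", Var "y2"))), matching the hand derivations on the
     following slides.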

  5. Tagged normalization

     type t = head

     let app t v =
       match t with
       | Lam f -> f v
       | Accu (a, args) -> Accu (a, v :: args)

     let head v = v
     let mkLam f = Lam f
     let mkAccu a = Accu (a, [])
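
     As a quick illustration (my example, assuming var = string and the declarations above
     are in scope), both reduction behaviours are directly observable on tagged values:

       let id = mkLam (fun v -> v)          (* value of λx.x *)
       let x  = mkAccu (Var "x")            (* accumulator for the free variable x *)

       let () =
         (match head (app id x) with        (* β_v: returns the accumulator unchanged *)
          | Accu (Var v, args) -> Printf.printf "Accu %s, %d argument(s)\n" v (List.length args)
          | Lam _ -> print_endline "Lam");
         (match head (app x id) with        (* β_s: the accumulator now carries one argument *)
          | Accu (Var v, args) -> Printf.printf "Accu %s, %d argument(s)\n" v (List.length args)
          | Lam _ -> print_endline "Lam")

     This prints "Accu x, 0 argument(s)" followed by "Accu x, 1 argument(s)".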

  6. Example (tagged)

     N_Λ t                       =₁ R_V ⟦t⟧_∅
     R_V v                       =₂ R (head v)
     R (Lam f)                   =₃ λy. R_V (f (mkAccu (Var y)))      (y fresh)
     R (Accu (a, [vₙ; ...; v₁])) =₄ (R_A a) (R_V v₁) ... (R_V vₙ)
     R_A (Var x)                 =₅ x

     N_Λ t =₁ R_V ⟦t⟧_∅ = R_V (Lam f₁)
           =₂ R (head (Lam f₁)) = R (Lam f₁)
           =₃ λy. R_V (f₁ (mkAccu (Var y))) = λy. R_V (Lam f₂)                          (=₂)
           =₃ λy.λz. R_V (f₂ (mkAccu (Var z))) = λy.λz. R_V (Accu (Var y, v :: []))     (=₂)
           =₄ λy.λz. (R_A (Var y)) (R_V v) = λy.λz. (R_A (Var y)) (R_V (Accu (Var z, [])))
           =₅ λy.λz. y (R_V (Accu (Var z, [])))                                         (=₂)
           =₄ λy.λz. y (R_A (Var z))
           =₅ λy.λz. y z

     (here v = mkAccu (Var z) = Accu (Var z, []), the accumulator passed for z)

  7. Tagless normalization: memory representation

     Closures:

     +------+-------+-----+-----+------+     +------+
     | size | color | tag |  C  |  v1  | ... |  vn  |
     +------+-------+-----+-----+------+     +------+
     \____________________/\_____/\___________________/
            header         code ptr  values of free vars

     let g =
       let x = 4 and y = 3 in
       fun z -> x + y + z

     +----------+-------+------------+-----+-----+-----+
     | size = 3 |  ...  | tag =/= 0  | ... |  4  |  3  |
     +----------+-------+------------+-----+-----+-----+
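
     To see this layout from OCaml, one can inspect the closure with the Obj module. A small
     sketch of my own; the exact size and field offsets depend on the compiler version
     (recent runtimes insert an extra closure-information word), so the numbers may differ
     from the picture above:

       let g = let x = 4 and y = 3 in fun z -> x + y + z

       let () =
         let o = Obj.repr g in
         (* closures carry Obj.closure_tag (247), i.e. a tag different from 0 *)
         Printf.printf "tag = %d (closure_tag = %d), size = %d\n"
           (Obj.tag o) Obj.closure_tag (Obj.size o)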

  8. Tagless normalization

     type t = t -> t

     let app f v = f v

     let getAtom o = (Obj.magic (Obj.field o 3) : atom)
     let getArgs o = (Obj.magic (Obj.field o 4) : t list)

     let head (v : t) =
       let o = Obj.repr v in
       if Obj.tag o = 0 then Accu (getAtom o, getArgs o)
       else Lam v

     let mkLam f = f

     let rec accu atom args =
       let res = fun v -> accu atom (v :: args) in
       Obj.set_tag (Obj.repr res) 0;
       (res : t)

     let mkAccu atom = accu atom []
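
     This relies on two low-level facts: type t = t -> t needs the -rectypes compiler flag,
     and Obj.set_tag mutates the closure's header tag (it is deprecated in recent OCaml
     releases, and the field offsets 3 and 4 assume the closure layout of the previous
     slide). A small check of my own under those assumptions, with var = string:

       let x  = mkAccu (Var "x")            (* a closure whose tag was forced to 0 *)
       let id = mkLam (fun v -> v)          (* an ordinary closure, tag <> 0 *)

       let () =
         Printf.printf "accumulator tag = %d, function tag = %d\n"
           (Obj.tag (Obj.repr x)) (Obj.tag (Obj.repr id));
         (* head therefore classifies x as Accu (Var "x", []) and id as Lam id,
            so app/head behave exactly like the tagged version *)
         ignore (app x id)                  (* β_s: builds a new tag-0 closure carrying id *)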

  9. Example (tagless)

     N_Λ t                       =₁ R_V ⟦t⟧_∅
     R_V v                       =₂ R (head v)
     R (Lam f)                   =₃ λy. R_V (f (mkAccu (Var y)))      (y fresh)
     R (Accu (a, [vₙ; ...; v₁])) =₄ (R_A a) (R_V v₁) ... (R_V vₙ)
     R_A (Var x)                 =₅ x

     N_Λ t =₁ R_V ⟦t⟧_∅ = R_V f₁
           =₂ R (head f₁) = R (Lam f₁)
           =₃ λy. R_V (f₁ (mkAccu (Var y))) = λy. R_V f₂                                (=₂)
           =₃ λy.λz. R_V (f₂ (mkAccu (Var z))) = λy.λz. R_V (fun u → b){tag=0}
           =₂ λy.λz. R (head (fun u → b){tag=0}) = λy.λz. R (Accu (Var y, w :: []))
           =₄ λy.λz. (R_A (Var y)) (R_V w) = λy.λz. (R_A (Var y)) (R_V (fun v → c){tag=0})
           =₅ λy.λz. y (R_V (fun v → c){tag=0})
           =₂ λy.λz. y (R (head (fun v → c){tag=0})) = λy.λz. y (R (Accu (Var z, [])))
           =₄ λy.λz. y (R_A (Var z))
           =₅ λy.λz. y z

     (a closure whose tag has been set to 0 is written (fun u → b){tag=0};
      here w = mkAccu (Var z), itself the tag-0 closure (fun v → c){tag=0})

  10. Questions?
