
Continuations and Transducer Composition (Olin Shivers and Matthew Might)



  1. Continuations and Transducer Composition. Olin Shivers and Matthew Might, Georgia Tech. PLDI 2006.

  2. The Big Idea Observation Some programs easier to write with transducer abstraction. Goal Design features and compilation story to support this abstraction.

  3. The Big Idea Observation Some programs easier to write with transducer abstraction. Goal Design features and compilation story to support this abstraction. Oh... Transducer ≡ Coroutine ≡ Process

  4. A computational analogy The world of functions ◮ Agents are functions. ◮ Functions are stateless. ◮ Composed with ◦ operator: h = f ◦ g .

  5. A computational analogy. The world of functions: ◮ Agents are functions. ◮ Functions are stateless. ◮ Composed with the ◦ operator: h = f ◦ g. The world of online transducers: ◮ Agents are input/compute/output processes. ◮ Processes have local, bounded state. ◮ Composed with the Unix | operator: h = g | f. [Diagram: two transducer processes, each with an Input arrow into a Compute box and an Output arrow out.]

  6. Online transducers ◮ DSP networks Convolve / integrate / filter / difference / . . . ◮ Network-protocol stacks (“micro-protocols”, layer integration) packet-assembly / checksum / order / http-parse / html-lex / . . . ◮ Graphics processing viewpoint-transform / clip1 / . . . / clip6 / z-divide / light / scan ◮ Stream processing ◮ Unix pipelines . . .

  7. Optimisation across composition. Functional paradigm: f ◦ g optimised by β-reduction:
     f = λy. y + 3
     g = λz. z + 5

  8. Optimisation across composition. Functional paradigm: f ◦ g optimised by β-reduction:
     f = λy. y + 3
     g = λz. z + 5
     ◦ = λm n. λx. m (n x)   (“Plumbing” made explicit in the λ rep.)

  9. Optimisation across composition. Functional paradigm: f ◦ g optimised by β-reduction:
     f = λy. y + 3
     g = λz. z + 5
     ◦ = λm n. λx. m (n x)   (“Plumbing” made explicit in the λ rep.)
     f ◦ g = (λm n. λx. m (n x)) (λy. y + 3) (λz. z + 5)

  10. Optimisation across composition. Functional paradigm: f ◦ g optimised by β-reduction:
      f = λy. y + 3
      g = λz. z + 5
      ◦ = λm n. λx. m (n x)   (“Plumbing” made explicit in the λ rep.)
      f ◦ g = (λm n. λx. m (n x)) (λy. y + 3) (λz. z + 5)
            = λx. (λy. y + 3) ((λz. z + 5) x)
            = λx. (λy. y + 3) (x + 5)
            = λx. (x + 5) + 3
            = λx. x + (5 + 3)
            = λx. x + 8
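
The same composition, written as a quick Scheme sketch (the names compose and h are mine, not from the slides):

      ;; The ◦ operator as a Scheme procedure: the plumbing made explicit.
      (define (compose f g)
        (lambda (x) (f (g x))))

      (define f (lambda (y) (+ y 3)))
      (define g (lambda (z) (+ z 5)))
      (define h (compose f g))   ; behaves like (lambda (x) (+ x 8))

      ;; (h 1) => 9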

  11. Optimisation across composition. Transducer paradigm: no good optimisation story. Optimisation across composition is key technology supporting abstraction: enables construction by composition. If only... [Diagram: two separate transducer stages, each with its own Input, Compute box, and Output.]

  12. Optimisation across composition. Transducer paradigm: no good optimisation story. Optimisation across composition is key technology supporting abstraction: enables construction by composition. If only... [Diagram: the two Compute boxes connected, with a single Input and a single Output.]

  13. Optimisation across composition. Transducer paradigm: no good optimisation story. Optimisation across composition is key technology supporting abstraction: enables construction by composition. If only... [Diagram: a single fused Compute/Compute box between Input and Output.]

  14. Optimisation across composition Transducer paradigm No good optimisation story. Optimisation across composition is key technology supporting abstraction: Enables construction by composition. If only. . .

  15. Optimisation across composition. Transducer paradigm: no good optimisation story. Optimisation across composition is key technology supporting abstraction: enables construction by composition. If only... [Diagram: the pipeline stages running as Thread #1, Thread #2, Thread #3.]

  16. Optimisation across composition. Transducer paradigm: no good optimisation story. Optimisation across composition is key technology supporting abstraction: enables construction by composition. If only... [Diagram: the pipeline stages running as Thread #1, Thread #2, Thread #3.]

  17. Strategy ◮ Build transducers from continuations.

  18. Strategy ◮ Build transducers from continuations. ◮ Build continuations from λ .

  19. Strategy ◮ Build transducers from continuations. ◮ Build continuations from λ . ◮ Handle λ well.

  20. Strategy ◮ Build transducers from continuations. ◮ Build continuations from λ . ◮ Handle λ well. ◮ Watch what happens.

  21. Tool: Continuation-passing style (CPS). A restricted subset of the λ calculus: function calls do not return, so we cannot write f(g(x)). We must pass an extra argument, the continuation, to each call, to represent the rest of the computation: (- a (* b c)) ⇒ (* b c (λ (temp) (- a temp halt)))

  22. Tool: Continuation-passing style (CPS). A restricted subset of the λ calculus: function calls do not return, so we cannot write f(g(x)). We must pass an extra argument, the continuation, to each call, to represent the rest of the computation: (- a (* b c)) ⇒ (* b c (λ (temp) (- a temp halt))) CPS is the “assembler” of functional languages.
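
A runnable Scheme sketch of the same idea. The slide treats * and - themselves as CPS primitives; here the separately named wrappers mul-k and sub-k are mine, so the sketch runs in ordinary Scheme:

      ;; CPS versions of the primitives: compute, then call the continuation.
      (define (mul-k x y k) (k (* x y)))
      (define (sub-k x y k) (k (- x y)))

      ;; Direct style: (- a (* b c)).  In CPS nothing ever returns;
      ;; each call hands its result to the next continuation.
      (define (example a b c halt)
        (mul-k b c (lambda (temp)
                     (sub-k a temp halt))))

      ;; (example 10 2 3 display)  ; prints 4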

  23. CPS Payoff. CPS is universal representation of control & env. Construct → encoding: fun call → call to λ.

  24. CPS Payoff. CPS is universal representation of control & env. Construct → encoding: fun call → call to λ; fun return → call to λ.

  25. CPS Payoff. CPS is universal representation of control & env. Construct → encoding: fun call → call to λ; fun return → call to λ; iteration → call to λ.

  26. CPS Payoff. CPS is universal representation of control & env. Construct → encoding: fun call → call to λ; fun return → call to λ; iteration → call to λ; sequencing → call to λ.

  27. CPS Payoff. CPS is universal representation of control & env. Construct → encoding: fun call → call to λ; fun return → call to λ; iteration → call to λ; sequencing → call to λ; conditional → call to λ.

  28. CPS Payoff. CPS is universal representation of control & env. Construct → encoding: fun call → call to λ; fun return → call to λ; iteration → call to λ; sequencing → call to λ; conditional → call to λ; exception → call to λ.

  29. CPS Payoff. CPS is universal representation of control & env. Construct → encoding: fun call → call to λ; fun return → call to λ; iteration → call to λ; sequencing → call to λ; conditional → call to λ; exception → call to λ; continuation → call to λ.

  30. CPS Payoff. CPS is universal representation of control & env. Construct → encoding: fun call → call to λ; fun return → call to λ; iteration → call to λ; sequencing → call to λ; conditional → call to λ; exception → call to λ; continuation → call to λ; coroutine switch → call to λ; …
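
A small Scheme sketch (mine, not from the slides) of two of those rows, "iteration" and "fun return", both becoming calls to a λ:

      ;; In CPS, a loop is a λ that tail-calls itself, and "returning"
      ;; is just calling the continuation λ.
      (define (count-down n k)
        (if (= n 0)
            (k 'done)                  ; fun return = call to λ (the continuation)
            (count-down (- n 1) k)))   ; iteration  = call to λ (the loop itself)

      ;; (count-down 3 (lambda (v) v))  => done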

  31. Writing transducers with put and get (define (send-fives) (put 5) (send-fives))

  32. Writing transducers with put and get (define (send-fives) (put 5) (send-fives)) (define (doubler) (put (* 2 (get))) (doubler))

  33. Writing transducers with put and get (define (send-fives) (put 5) (send-fives)) (define (doubler) (put (* 2 (get))) (doubler)) (define (integ sum) (let ((next-sum (+ sum (get)))) (put next-sum) (integ next-sum)))
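
What a pipeline built from these three stages, send-fives | doubler | (integ 0), ought to produce, simulated here with an ordinary loop just to fix the expected values (the real composition mechanism is the subject of the next slides):

      ;; Each 5 from send-fives is doubled to 10, and integ keeps a running sum.
      (define (first-n-outputs n)
        (let loop ((i 0) (sum 0) (acc '()))
          (if (= i n)
              (reverse acc)
              (let ((next-sum (+ sum (* 2 5))))
                (loop (+ i 1) next-sum (cons next-sum acc))))))

      ;; (first-n-outputs 3)  => (10 20 30)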

  34. Tool: 3CPS & transducer pipelines. [Diagram: a pipeline stage f applied to a value x and three continuations k, u, d.]

  35. Tool: 3CPS & transducer pipelines. [Diagram as on slide 34.] ExpCont: rest of this stage's computation. Semantic domains / types: x ∈ Value; k ∈ ExpCont = Value → UpCont → DownCont → Ans (a transducer).

  36. Tool: 3CPS & transducer pipelines. [Diagram as on slide 34.] ExpCont: rest of this stage's computation; UpCont: rest of upstream computation. Semantic domains / types: x ∈ Value; k ∈ ExpCont = Value → UpCont → DownCont → Ans; u ∈ UpCont = DownCont → Ans (a transducer).

  37. Tool: 3CPS & transducer pipelines. [Diagram as on slide 34.] ExpCont: rest of this stage's computation; UpCont: rest of upstream computation; DownCont: rest of downstream computation. Semantic domains / types: x ∈ Value; k ∈ ExpCont = Value → UpCont → DownCont → Ans; u ∈ UpCont = DownCont → Ans; d ∈ DownCont = Value → UpCont → Ans; c ∈ CmdCont = UpCont → DownCont → Ans (a transducer).
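
A minimal illustration (mine, not from the slides) of how these domains fit together: an identity transducer, written directly in this 3CPS style as a Scheme sketch.

      ;; copy : UpCont → DownCont → Ans  (a CmdCont, i.e. a transducer).
      ;; Pull a value x from upstream u, push it to downstream d, and repeat.
      (define (copy u d)
        (u (lambda (x u2)                          ; the DownCont handed upstream
             (d x (lambda (d2) (copy u2 d2))))))   ; the UpCont handed downstream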

  38. Transducers in 3CPS. Get & put in 3CPS:
      get x k u d =
      put x k u d =
      Semantic domains / types: x ∈ Value; k ∈ ExpCont = Value → UpCont → DownCont → Ans; u ∈ UpCont = DownCont → Ans; d ∈ DownCont = Value → UpCont → Ans; c ∈ CmdCont = UpCont → DownCont → Ans

  39. Transducers in 3CPS. Get & put in 3CPS:
      get x k u d = u ( )
      put x k u d =
      Semantic domains / types: x ∈ Value; k ∈ ExpCont = Value → UpCont → DownCont → Ans; u ∈ UpCont = DownCont → Ans; d ∈ DownCont = Value → UpCont → Ans; c ∈ CmdCont = UpCont → DownCont → Ans

  40. Transducers in 3CPS. Get & put in 3CPS:
      get x k u d = u (λx′ u′. )
      put x k u d =
      Semantic domains / types: x ∈ Value; k ∈ ExpCont = Value → UpCont → DownCont → Ans; u ∈ UpCont = DownCont → Ans; d ∈ DownCont = Value → UpCont → Ans; c ∈ CmdCont = UpCont → DownCont → Ans

  41. Transducers in 3CPS. Get & put in 3CPS:
      get x k u d = u (λx′ u′. k )
      put x k u d =
      Semantic domains / types: x ∈ Value; k ∈ ExpCont = Value → UpCont → DownCont → Ans; u ∈ UpCont = DownCont → Ans; d ∈ DownCont = Value → UpCont → Ans; c ∈ CmdCont = UpCont → DownCont → Ans

  42. Transducers in 3CPS. Get & put in 3CPS:
      get x k u d = u (λx′ u′. k x′ )
      put x k u d =
      Semantic domains / types: x ∈ Value; k ∈ ExpCont = Value → UpCont → DownCont → Ans; u ∈ UpCont = DownCont → Ans; d ∈ DownCont = Value → UpCont → Ans; c ∈ CmdCont = UpCont → DownCont → Ans
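
The transcript stops here with get still being assembled and put not yet shown, so the following is only a sketch: a small, runnable Scheme encoding of the slides' 3CPS idea. The completion of get, the definition of put, the pipe combinator, and the take-n driver are guesses from the domain equations above, not the authors' code (get3/put3 are hypothetical names).

      ;; get x k u d: ask upstream u for a value, then resume k with it
      ;; (slide 42's build, completed here with the new upstream u2 and d).
      (define (get3 x k u d)
        (u (lambda (x2 u2) (k x2 u2 d))))

      ;; put x k u d: hand x to downstream d; when downstream pulls again,
      ;; resume k with the new downstream.  (A guess; not shown in the deck.)
      (define (put3 x k u d)
        (d x (lambda (d2) (k x u d2))))

      ;; The three stages from slide 33, written directly in 3CPS.
      (define (send-fives u d)
        (put3 5 (lambda (x u2 d2) (send-fives u2 d2)) u d))

      (define (doubler u d)
        (get3 #f
              (lambda (x u2 d2)
                (put3 (* 2 x) (lambda (y u3 d3) (doubler u3 d3)) u2 d2))
              u d))

      (define (integ sum u d)
        (get3 #f
              (lambda (x u2 d2)
                (let ((next-sum (+ sum x)))
                  (put3 next-sum (lambda (y u3 d3) (integ next-sum u3 d3)) u2 d2)))
              u d))

      ;; Compose two transducers: the downstream stage's upstream continuation
      ;; runs the upstream stage.  (A hypothetical combinator, for illustration.)
      (define (pipe f g)
        (lambda (u d) (g (lambda (d2) (f u d2)) d)))

      ;; Drive the pipeline from the downstream end: print n values.
      (define (take-n n)
        (lambda (x u)
          (display x) (newline)
          (if (> n 1) (u (take-n (- n 1))) 'done)))

      ;; ((pipe (pipe send-fives doubler) (lambda (u d) (integ 0 u d)))
      ;;  (lambda (d) (error "nothing upstream"))
      ;;  (take-n 3))
      ;; prints 10, 20, 30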
