Library Learning for Neurally-Guided Bayesian Program Induction



  1. Library Learning for Neurally-Guided Bayesian Program Induction
     Kevin Ellis¹, Lucas Morales¹, Mathias Sablé-Meyer², Armando Solar-Lezama¹, Joshua B. Tenenbaum¹ (1: MIT. 2: ENS Paris-Saclay.)
     Symbolic regression: ax³ + bx² + cx + d, a / (x − b)
     List processing: [5 2 9] → [9 2 5], [1 1 2 2] → [2 2 1 1], [1 2 3 2] → [2 3 2 1]
     Text editing: P Kohli → Dr. Kohli, Sumit Gulwani → Dr. Gulwani, Danny Tarlow → Dr. Tarlow
     Explore/Compress/Compile (EC²) learns to solve programming tasks like these by growing a library of code and training a neural net to search for programs written using the library.

  2. Library learning. Tasks: [7 2 3] → [7 3], [1 2 3 4] → [3 4], [4 3 2 1] → [4 3]

  3. Library learning. Tasks: [7 2 3] → [7 3], [1 2 3 4] → [3 4], [4 3 2 1] → [4 3]
     Library: f1(ℓ, p) = (foldr ℓ nil (λ (x a) (if (p x) (cons x a) a)))
     (f1: higher-order filter function; gets the elements x from ℓ where (p x) returns true)

  4. Library learning. Tasks: [7 2 3] → [7 3], [1 2 3 4] → [3 4], [4 3 2 1] → [4 3]
     Solution: f(ℓ) = (f1 ℓ (λ (x) (> x 2)))
     Library: f1(ℓ, p) = (foldr ℓ nil (λ (x a) (if (p x) (cons x a) a)))
     (f1: higher-order filter function; gets the elements x from ℓ where (p x) returns true)
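     As a concrete reading of slide 4, here is a minimal Python sketch of the same structure: foldr and f1 correspond to the library, and solve is the task-specific program that plugs in the predicate (> x 2). The Python names are illustrative, not the system's code.

        from functools import reduce

        def foldr(xs, init, step):
            # Right fold: apply step(x, acc) from the last element backwards.
            return reduce(lambda acc, x: step(x, acc), reversed(xs), init)

        def f1(xs, p):
            # Learned library routine: higher-order filter built on foldr.
            return foldr(xs, [], lambda x, acc: [x] + acc if p(x) else acc)

        def solve(xs):
            # Task-specific program: keep the elements greater than 2.
            return f1(xs, lambda x: x > 2)

        assert solve([7, 2, 3]) == [7, 3]
        assert solve([1, 2, 3, 4]) == [3, 4]
        assert solve([4, 3, 2, 1]) == [4, 3]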

  5. Subset of 38 learned library routines for list processing:
     f0(ℓ, r) = (foldr r ℓ cons)   (f0: append lists r and ℓ)
     f1(ℓ, p) = (foldr ℓ nil (λ (x a) (if (p x) (cons x a) a)))   (f1: higher-order filter function)
     f2(ℓ) = (foldr ℓ 0 (λ (x a) (if (> a x) a x)))   (f2: maximum element in list ℓ)
     f3(ℓ, k) = (foldr ℓ (is-nil ℓ) (λ (x a) (if a a (= k x))))   (f3: whether ℓ contains k)
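     All four routines above are specializations of foldr. A Python sketch of the other three (f0, f2, f3), with foldr as a hypothetical helper; the asserts only illustrate the intended behaviour.

        from functools import reduce

        def foldr(xs, init, step):
            return reduce(lambda acc, x: step(x, acc), reversed(xs), init)

        def f0(l, r):
            # Append: fold over r, seeding the accumulator with l, so the result is r ++ l.
            return foldr(r, l, lambda x, acc: [x] + acc)

        def f2(l):
            # Maximum element (0 for an empty list, matching the slide's base case).
            return foldr(l, 0, lambda x, acc: acc if acc > x else x)

        def f3(l, k):
            # Membership test; seeded with (is-nil l), exactly as the learned routine is.
            return foldr(l, len(l) == 0, lambda x, acc: acc if acc else x == k)

        assert f0([1, 2], [3, 4]) == [3, 4, 1, 2]
        assert f2([7, 2, 3]) == 7
        assert f3([7, 2, 3], 2) and not f3([7, 2, 3], 5)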

  6. Explore/Compress/Compile as Bayesian Inference
     (Graphical model: the DSL generates a program for each task, and each program generates the data for its task.)

  7. Explore/Compress/Compile as Amortized Bayesian Inference
     (Same graphical model, now with a neural recognition model that maps each task back to likely programs.)

  8. Explore/Compress/Compile as Amortized Bayesian Inference
     Explore: infer programs, fixing the DSL and the neural recognition model.

  9. Explore/Compress/Compile as Amortized Bayesian Inference
     Explore: infer programs, fixing the DSL and the neural recognition model.
     Given the DSL (f1, f2, ...) and the recognition model's output for a task such as [7 2 3] → [7 3], [1 2 3 4] → [3 4], [4 3 2 1] → [4 3], a neurally-guided enumerative search proposes programs, e.g. (f1 ℓ (λ (x) (> x 2))).
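     A toy Python sketch of what neurally-guided enumerative search means here: a weight per DSL primitive (standing in for the recognition network's output on this task) orders a best-first enumeration, and the first program consistent with every input-output example is returned. The three-primitive DSL and the weights are made up for illustration.

        import heapq
        import itertools
        import math

        # Toy DSL of unary list-to-list primitives.
        PRIMITIVES = {
            "tail":     lambda xs: xs[1:],
            "reverse":  lambda xs: list(reversed(xs)),
            "keep_gt2": lambda xs: [x for x in xs if x > 2],
        }

        def run(prog, xs):
            # A program is a pipeline of primitive names applied left to right.
            for name in prog:
                xs = PRIMITIVES[name](xs)
            return xs

        def guided_enumeration(examples, weights, max_len=3):
            # Best-first search ordered by -log probability under the recognition weights.
            counter = itertools.count()              # tie-breaker for the heap
            frontier = [(0.0, next(counter), ())]
            while frontier:
                cost, _, prog = heapq.heappop(frontier)
                if prog and all(run(prog, x) == y for x, y in examples):
                    return prog                      # first program that fits all examples
                if len(prog) < max_len:
                    for name, p in weights.items():
                        heapq.heappush(frontier, (cost - math.log(p), next(counter), prog + (name,)))
            return None

        examples = [([7, 2, 3], [7, 3]), ([1, 2, 3, 4], [3, 4]), ([4, 3, 2, 1], [4, 3])]
        weights = {"keep_gt2": 0.6, "tail": 0.2, "reverse": 0.2}   # recognition model's guess
        print(guided_enumeration(examples, weights))               # ('keep_gt2',)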

  10. Explore/Compress/Compile as Amortized Bayesian Inference
      Compress: update the DSL, fixing the programs.

  11. Explore/Compress/Compile as Amortized Bayesian Inference
      Compress: update the DSL, fixing the programs.
      (Figure: subexpressions shared between the programs found for task 1 and task 2, built from primitives such as cons, +, 1, and car, are factored out into a new primitive, growing the DSL into DSL+.)
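     A deliberately simplified Python sketch of the Compress idea: collect subexpressions that recur across the found programs and promote one to a new library primitive. The real step scores candidates by how much they shrink the joint description length of the DSL plus the rewritten programs; plain reuse counting is only a stand-in, and the two example programs are invented.

        from collections import Counter

        # Programs as nested tuples, e.g. (+ 1 (car z)) becomes ("+", 1, ("car", "z")).
        programs = [
            ("cons", ("+", 1, ("car", "z")), "nil"),
            ("+", 1, ("+", 1, ("car", "z"))),
        ]

        def subexpressions(expr):
            # Yield every compound subexpression of a program tree.
            if isinstance(expr, tuple):
                yield expr
                for child in expr:
                    yield from subexpressions(child)

        def size(expr):
            return 1 + sum(size(c) for c in expr) if isinstance(expr, tuple) else 1

        def best_new_primitive(progs):
            counts = Counter(s for p in progs for s in subexpressions(p))
            shared = [s for s, n in counts.items() if n > 1]
            # Prefer the most reused fragment, breaking ties toward bigger fragments.
            return max(shared, key=lambda s: (counts[s], size(s))) if shared else None

        print(best_new_primitive(programs))   # ('+', 1, ('car', 'z'))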

  12. Explore/Compress/Compile as Amortized Bayesian Inference
      Compile: train the recognition model.

  13. Explore/Compress/Compile as Amortized Bayesian Inference
      Compile: train the recognition model.
      (Sample a program from the DSL, run the program to produce a task, and train the recognition model to map that task back to the program.)
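     A sketch of the data-generation half of Compile: sample programs from the current DSL, execute them on random inputs to fabricate tasks, and collect (task, program) pairs to train the recognition network on. The toy DSL, the sampler, and the input distribution are all invented; the network training itself is omitted.

        import random

        DSL = {
            "tail":     lambda xs: xs[1:],
            "reverse":  lambda xs: list(reversed(xs)),
            "keep_gt2": lambda xs: [x for x in xs if x > 2],
        }

        def sample_program(max_len=3):
            # Sample a short composition of primitives from a uniform prior over the DSL.
            return tuple(random.choice(list(DSL)) for _ in range(random.randint(1, max_len)))

        def run(prog, xs):
            for name in prog:
                xs = DSL[name](xs)
            return xs

        def fantasy_pair():
            # One (task, program) training pair: the task is the program's I/O behaviour.
            prog = sample_program()
            inputs = [[random.randint(0, 9) for _ in range(4)] for _ in range(3)]
            task = [(x, run(prog, x)) for x in inputs]
            return task, prog

        task, prog = fantasy_pair()
        print(prog, task[0])   # the recognition model is trained to predict prog from task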

  14. Explore/Compress/Compile as Amortized Bayesian Inference
      (Figure: the three steps alternate in a loop: Explore infers programs for the tasks, Compress updates the DSL, and Compile retrains the recognition model.)
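     A skeleton of that alternation, with stubs standing in for the three steps sketched above; the function names and signatures are invented for illustration.

        def explore(tasks, dsl, recognition):
            # Search for a program for each task, guided by the recognition model.
            return {task: None for task in tasks}              # stub

        def compress(dsl, programs):
            # Fold reusable fragments of the found programs back into the DSL.
            return dsl                                         # stub

        def compile_recognition(dsl, recognition):
            # Retrain the recognition model on programs sampled from the updated DSL.
            return recognition                                 # stub

        def ec2(tasks, dsl, recognition, iterations=3):
            for _ in range(iterations):
                programs = explore(tasks, dsl, recognition)          # Explore
                dsl = compress(dsl, programs)                        # Compress
                recognition = compile_recognition(dsl, recognition)  # Compile
            return dsl, recognition, programs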

  15. Domain: Text Editing
      In the style of FlashFill (Gulwani 2012). Starts with map, fold, etc.
      Input           Output
      +106 769-438    106.769.438
      +83 973-831     83.973.831
      Temple Anna H   TAH
      Lara Gregori    LG
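     To make the task format concrete, here are two hand-written Python programs in the spirit of this domain, each a short composition of generic string primitives. The system's actual solutions are written in its DSL, not Python.

        import re

        def dotted_digits(s):
            # "+106 769-438" -> "106.769.438": keep the digit groups, rejoin with "."
            return ".".join(re.findall(r"\d+", s))

        def initials(s):
            # "Temple Anna H" -> "TAH": map first-character over the words, concatenate
            return "".join(word[0] for word in s.split())

        assert dotted_digits("+106 769-438") == "106.769.438"
        assert dotted_digits("+83 973-831") == "83.973.831"
        assert initials("Temple Anna H") == "TAH"
        assert initials("Lara Gregori") == "LG"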

  16. Text editing: library learning builds on itself
      (Figure: learned DSL primitives over 3 iterations, one column per iteration; arrows show learned primitives calling each other.)

  17. Programs with numerical parameters: symbolic regression from visual input
      Parameters are fit by running autograd through the program; the recognition model looks at a picture of the function's graph.
      Learned DSL:
        f0(x) = (+ x real)
        f1(x) = (f0 (* real x))
        f2(x) = (f1 (* x (f0 x)))
        f3(x) = (f0 (* x (f2 x)))
        f4(x) = (f0 (* x (f3 x)))   (f4: 4th-order polynomial)
        f5(x) = (/ real (f0 x))     (f5: rational function)
      Programs & tasks: f(x) = (f1 x), f(x) = (f3 x), f(x) = (f4 x), f(x) = (f5 x)
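     A sketch of what fitting the real-valued leaves by autograd looks like, using the linear template f1(x) = (f0 (* real x)), i.e. a*x + b, and PyTorch as a stand-in for whatever autograd machinery the system itself uses; the data and parameter values are made up.

        import torch

        xs = torch.linspace(-2.0, 2.0, 20)
        ys = 3.0 * xs + 1.0                      # data generated from a = 3, b = 1

        # The two "real" leaves of the program template are the parameters to fit.
        a = torch.tensor(0.0, requires_grad=True)
        b = torch.tensor(0.0, requires_grad=True)
        opt = torch.optim.Adam([a, b], lr=0.1)

        for _ in range(1000):
            opt.zero_grad()
            loss = ((a * xs + b - ys) ** 2).mean()   # squared error of the program's output
            loss.backward()                          # gradients flow through the program
            opt.step()

        print(round(a.item(), 2), round(b.item(), 2))   # approaches 3.0, 1.0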

  18. New domain: generative graphics programs (Turtle/LOGO)
      (Figure: training task images.)

  19. Generative graphics programs: samples from the DSL
      (Figure: DSL samples before learning vs. DSL samples after learning.)

  20. Generative graphics programs: the learned library contains parametric drawing routines
      (Figure: learned routines for semicircles, spirals, Greek spirals, S-curves, polygons & stars, and circles.)
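     To illustrate what a parametric drawing routine can look like, here are two written with Python's standard turtle module rather than the system's LOGO-like DSL; the routines and their parameters are invented, not the learned library entries.

        import turtle

        def polygon(t, sides, length):
            # Parametric polygon: the kind of reusable abstraction the library learns.
            for _ in range(sides):
                t.forward(length)
                t.left(360 / sides)

        def square_spiral(t, turns, step):
            # Square spiral: segment length grows with every quarter turn.
            for i in range(turns * 4):
                t.forward(step * (i + 1))
                t.left(90)

        if __name__ == "__main__":
            t = turtle.Turtle()
            polygon(t, sides=5, length=60)
            square_spiral(t, turns=3, step=5)
            turtle.done()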

  21. Learning to program: Poster AB #24
