Self-applicable probabilistic inference without interpretive overhead

  1. Self-applicable probabilistic inference without interpretive overhead
     Oleg Kiselyov (FNMOC, oleg@pobox.com) and Chung-chieh Shan (Rutgers University, ccshan@rutgers.edu)
     Tufts University, 12 February 2010

  2. Probabilistic inference
     Model (what): Pr(Reality) and Pr(Obs | Reality).
     Inference (how): Pr(Reality | Obs = obs) = Pr(Obs = obs | Reality) Pr(Reality) / Pr(Obs = obs).
     Example Bayes net: cloudy, rain, sprinkler, wet_roof, wet_grass; query Pr(rain | wet_grass = true).

  3. Declarative probabilistic inference
     Model (what): Pr(Reality) and Pr(Obs | Reality).
     Inference (how): Pr(Reality | Obs = obs) = Pr(Obs = obs | Reality) Pr(Reality) / Pr(Obs = obs).
     Example Bayes net: cloudy, rain, sprinkler, wet_roof, wet_grass; query Pr(rain | wet_grass = true).

  4. Declarative probabilistic inference
     Toolkit (BNT, PFP): the model (what) invokes the inference library (how), which provides distributions, conditionalization, ...
     Language (BLOG, IBAL, Church): the model expresses random choice, observation, ...; the inference engine interprets the model.

  5. Declarative probabilistic inference
     Toolkit (BNT, PFP): + use existing libraries, custom types, debugger; + easy to add inference.
     Language (BLOG, IBAL, Church): + random variables are ordinary variables; + compile models for faster inference.

  6. Declarative probabilistic inference
     Toolkit (BNT, PFP): + use existing libraries, custom types, debugger; + easy to add inference.
     Language (BLOG, IBAL, Church): + random variables are ordinary variables; + compile models for faster inference.
     Today, the best of both: express models and inference as interacting programs in the same general-purpose language (models invoke inference; inference interprets models).

  7. Declarative probabilistic inference
     Toolkit (BNT, PFP): + use existing libraries, custom types, debugger; + easy to add inference.
     Language (BLOG, IBAL, Church): + random variables are ordinary variables; + compile models for faster inference.
     Today, the best of both: express models and inference as interacting programs in the same general-purpose language.
     Payoff: expressive models. + models of inference: bounded-rational theory of mind.
     Payoff: fast inference. + deterministic parts of models run at full speed; + importance sampling.

  8. Outline (next: Expressivity)
     Expressivity: Memoization; Nested inference
     Implementation: Reifying a model into a search tree; Importance sampling with look-ahead
     Applications

  9. Grass model
     Bayes net: cloudy, rain, sprinkler, wet_roof, wet_grass.
     let flip = fun p -> dist [(p, true); (1.-.p, false)]
     Models are ordinary code (in OCaml) using a library function dist.

  10. Grass model
     let flip = fun p -> dist [(p, true); (1.-.p, false)]
     let cloudy = flip 0.5 in
     let rain = flip (if cloudy then 0.8 else 0.2) in
     let sprinkler = flip (if cloudy then 0.1 else 0.5) in
     let wet_roof = flip 0.7 && rain in
     let wet_grass = flip 0.9 && rain || flip 0.9 && sprinkler in
     if wet_grass then rain else fail ()
     Models are ordinary code (in OCaml) using a library function dist.
     Random variables are ordinary variables.

  11. Grass model
     let flip = fun p -> dist [(p, true); (1.-.p, false)]
     let grass_model = fun () ->
       let cloudy = flip 0.5 in
       let rain = flip (if cloudy then 0.8 else 0.2) in
       let sprinkler = flip (if cloudy then 0.1 else 0.5) in
       let wet_roof = flip 0.7 && rain in
       let wet_grass = flip 0.9 && rain || flip 0.9 && sprinkler in
       if wet_grass then rain else fail ()
     normalize (exact_reify grass_model)
     Models are ordinary code (in OCaml) using a library function dist.
     Random variables are ordinary variables.
     Inference applies to thunks and returns a distribution.

  12. Grass model
     let flip = fun p -> dist [(p, true); (1.-.p, false)]
     let grass_model = fun () ->
       let cloudy = flip 0.5 in
       let rain = flip (if cloudy then 0.8 else 0.2) in
       let sprinkler = flip (if cloudy then 0.1 else 0.5) in
       let wet_roof = flip 0.7 && rain in
       let wet_grass = flip 0.9 && rain || flip 0.9 && sprinkler in
       if wet_grass then rain else fail ()
     normalize (exact_reify grass_model)
     Models are ordinary code (in OCaml) using a library function dist.
     Random variables are ordinary variables.
     Inference applies to thunks and returns a distribution.
     Deterministic parts of models run at full speed.
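The library functions dist, fail, exact_reify and normalize are not defined on the slides; the authors implement them with delimited continuations so that models stay in direct style. As a rough sketch only, assuming a model whose only effects are dist and fail, the same interface can be mimicked by re-running the thunk once per path of random choices, steering each run with a prefix of choice indices held in global state. The names below mirror the slides, but the implementation is a toy added here: it does not merge equal outcomes, and its global state is not reentrant, so it cannot support the nested inference shown later.

     (* Toy stand-in for the library interface used on the slides.
        Exact enumeration of a direct-style thunk by replay: each run is
        guided by a prefix of choice indices; choices made beyond the
        prefix take the first alternative, and their siblings are
        scheduled as later runs. *)
     exception Fail
     let fail () = raise Fail

     let prefix : int list ref = ref []           (* forced choice indices       *)
     let pos    : int ref = ref 0                 (* position of the next choice *)
     let fresh  : (int * int) list ref = ref []   (* (arity, chosen) past prefix *)
     let weight : float ref = ref 1.0             (* probability of current path *)

     let dist (choices : (float * 'a) list) : 'a =
       let i = match List.nth_opt !prefix !pos with
         | Some i -> i
         | None   -> (fresh := (List.length choices, 0) :: !fresh; 0) in
       incr pos;
       let (p, v) = List.nth choices i in
       weight := !weight *. p;
       v

     let exact_reify (thunk : unit -> 'a) : (float * 'a) list =
       let results = ref [] in
       let rec run pfx =
         prefix := pfx; pos := 0; fresh := []; weight := 1.0;
         (try let v = thunk () in results := (!weight, v) :: !results
          with Fail -> ());
         (* Schedule the unexplored alternatives of every fresh choice point. *)
         let rec siblings made = function
           | [] -> ()
           | (arity, chosen) :: rest ->
               for i = chosen + 1 to arity - 1 do
                 run (pfx @ List.rev made @ [i])
               done;
               siblings (chosen :: made) rest
         in
         siblings [] (List.rev !fresh)
       in
       run []; !results

     let normalize (d : (float * 'a) list) : (float * 'a) list =
       let total = List.fold_left (fun acc (p, _) -> acc +. p) 0.0 d in
       List.map (fun (p, v) -> (p /. total, v)) d

With these definitions the grass_model thunk above runs unchanged, and normalize (exact_reify grass_model) returns a weighted list of outcomes whose entries for true sum to about 0.708, matching a hand computation of Pr(rain | wet_grass = true).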

  13. Models as programs in a general-purpose language
     Reuse existing infrastructure!
     - Rich libraries: lists, arrays, database access, I/O, ...
     - Type inference
     - Functions as first-class values
     - Compiler
     - Debugger
     - Memoization

  14. Models as programs in a general-purpose language
     Reuse existing infrastructure!
     - Rich libraries: lists, arrays, database access, I/O, ...
     - Type inference
     - Functions as first-class values
     - Compiler
     - Debugger
     - Memoization
     Express Dirichlet processes, etc. (Goodman et al. 2008).
     Speed up inference using lazy evaluation (bucket elimination) and sampling with memoization (Pfeffer 2007).
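Memoization in particular can be borrowed directly from the host language. As a sketch only (memo and eye_colour are illustrative names, not the library's API, and flip is the function from the grass-model slides), a hash table suffices to make a random function return the same answer whenever it is asked the same question, which is the ingredient behind expressing models such as Dirichlet processes (Goodman et al. 2008); making this interact correctly with re-running models during inference is left to the real library.

     (* Sketch: ordinary OCaml memoization applied to a random function,
        so that repeated calls with the same argument replay the same
        random choice instead of making a fresh one. *)
     let memo (f : 'a -> 'b) : 'a -> 'b =
       let table = Hashtbl.create 16 in
       fun x ->
         match Hashtbl.find_opt table x with
         | Some y -> y
         | None   -> let y = f x in Hashtbl.add table x y; y

     (* Each person's eye colour is random, but stable across queries. *)
     let eye_colour = memo (fun (_person : string) ->
       if flip 0.5 then "brown" else "blue")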

  15. Self-application: nested inference
     Choose a coin that is either fair or completely biased for true.
     let biased = flip 0.5 in
     let coin = fun () -> flip 0.5 || biased in

  16. Self-application: nested inference
     Choose a coin that is either fair or completely biased for true.
     let biased = flip 0.5 in
     let coin = fun () -> flip 0.5 || biased in
     Let p be the probability that flipping the coin yields true.
     What is the probability that p is at least 0.3?

  17. Self-application: nested inference
     Choose a coin that is either fair or completely biased for true.
     let biased = flip 0.5 in
     let coin = fun () -> flip 0.5 || biased in
     Let p be the probability that flipping the coin yields true.
     What is the probability that p is at least 0.3?  Answer: 1.
     at_least 0.3 true (exact_reify coin)

  18. Self-application: nested inference
     Choose a coin that is either fair or completely biased for true.
     Let p be the probability that flipping the coin yields true.
     What is the probability that p is at least 0.3?  Answer: 1.
     exact_reify (fun () ->
       let biased = flip 0.5 in
       let coin = fun () -> flip 0.5 || biased in
       at_least 0.3 true (exact_reify coin))

  19. Self-application: nested inference
     Choose a coin that is either fair or completely biased for true.
     Let p be the probability that flipping the coin yields true.
     Estimate p by flipping the coin twice.
     What is the probability that our estimate of p is at least 0.3?  Answer: 7/8.
     exact_reify (fun () ->
       let biased = flip 0.5 in
       let coin = fun () -> flip 0.5 || biased in
       at_least 0.3 true (sample 2 coin))

  20. Self-application: nested inference
     Choose a coin that is either fair or completely biased for true.
     Let p be the probability that flipping the coin yields true.
     Estimate p by flipping the coin twice.
     What is the probability that our estimate of p is at least 0.3?  Answer: 7/8.
     exact_reify (fun () ->
       let biased = flip 0.5 in
       let coin = fun () -> flip 0.5 || biased in
       at_least 0.3 true (sample 2 coin))
     Returns a distribution, not just a nested query (Goodman et al. 2008).
     Inference procedures are OCaml code using dist, like models.
     Works with observation, recursion, memoization.
     Bounded-rational theory of mind without interpretive overhead.
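Both answers can be checked by hand. With exact inference inside, the inner probability of true is 1 when the coin is biased and 0.5 when it is fair, so it is at least 0.3 in every world and the outer distribution puts probability 1 on true. With two samples (reading sample 2 coin as estimating p from two draws), the estimate falls below 0.3 only when the coin is fair and both flips come up false, which has probability 1/2 * 1/4 = 1/8, leaving 7/8. The helpers at_least and sample belong to the library and are not defined on the slides; the sketch below spells out one plausible reading of at_least over a reified distribution (prob_of is an invented helper) and writes out the outer query with exact inference inside. Replacing exact_reify coin by sample 2 coin gives the sampling version from the slide; running either nested query needs an inference library whose exact_reify can be nested, which the toy enumerator sketched earlier cannot do.

     (* Assumed reading of at_least, not the library's definition:
        d is a reified distribution, i.e. a weighted list of outcomes. *)
     let prob_of (v : 'a) (d : (float * 'a) list) : float =
       List.fold_left (fun acc (p, x) -> if x = v then acc +. p else acc) 0.0 d

     let at_least (q : float) (v : 'a) (d : (float * 'a) list) : bool =
       prob_of v d >= q

     (* The outer query from the slide, written out in full. *)
     let nested_query () =
       exact_reify (fun () ->
         let biased = flip 0.5 in
         let coin = fun () -> flip 0.5 || biased in
         at_least 0.3 true (exact_reify coin))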

  21. Grice and Marr
     probabilistic model (e.g., grammar)

  22. Grice and Marr
     approximate inference (e.g., comprehension)
     probabilistic model (e.g., grammar)

  23. Grice and Marr
     probabilistic model (e.g., joint activity and goal)
     approximate inference (e.g., comprehension)
     probabilistic model (e.g., grammar)

  24. Grice and Marr
     approximate inference (e.g., plan utterance)
     probabilistic model (e.g., joint activity and goal)
     approximate inference (e.g., comprehension)
     probabilistic model (e.g., grammar)

  25. Outline (next: Implementation)
     Expressivity: Memoization; Nested inference
     Implementation: Reifying a model into a search tree; Importance sampling with look-ahead
     Applications
