the role of mobility and control in the inference of representations


  1. the role of mobility and control in the inference of representations. stefano soatto (ucla), with t. lee, a. ayvaci, j. dong, d. davis, j. balzer, j. hernandez, l. valente (Saturday, November 30, 13)

  2. what is a “representation”? why do we need it? what does control have to do with it? keywords: data processing inequality, information bottleneck, lambert-ambient model, sufficient excitation, actionable information gap, active sensing/perception

  3.–11. (incremental build) data: $y^t \doteq \{ y_0, \dots, y_t \}$; task: $\xi$; “representation”? $\hat{\xi} = \phi(y^t)$
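the data processing inequality named among the keywords says that any statistic $\phi$ can only lose information about the task: $I(\xi; y^t) \ge I(\xi; \phi(y^t))$. a minimal numerical sketch (not from the talk; the joint distribution of $\xi$ and $y$, and the merging statistic $\phi$, are made up) that checks this for a lossy $\phi$:

```python
import numpy as np

def mutual_info(p_xy):
    """I(X;Y) in nats from a joint probability table p_xy[i, j]."""
    px = p_xy.sum(axis=1, keepdims=True)
    py = p_xy.sum(axis=0, keepdims=True)
    mask = p_xy > 0
    return float((p_xy[mask] * np.log(p_xy[mask] / (px @ py)[mask])).sum())

# hypothetical joint p(xi, y): 2 task values (rows), 4 data values (cols)
p = np.array([[0.20, 0.10, 0.05, 0.05],
              [0.05, 0.05, 0.10, 0.40]])

# a lossy statistic phi merges data values {0,1} and {2,3}
p_phi = np.stack([p[:, :2].sum(axis=1), p[:, 2:].sum(axis=1)], axis=1)

print(mutual_info(p) >= mutual_info(p_phi))  # True for any deterministic phi
```

equality holds exactly when $\phi$ is a sufficient statistic for the task, which is the requirement the next slides formalize.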

  12.–24. (incremental build) “representation”, for data $y^t \doteq \{ y_0, \dots, y_t \}$: by the data processing inequality $I(\xi; y^t) \ge I(\xi; \phi(y^t))$, so require
  - sufficient statistics [r. fisher]: $I(\xi; y^t) = I(\xi; \underbrace{\phi(y^t)}_{\hat{\xi}})$ and $R(u^t \mid y^t) \le R(u^t \mid \phi(y^t))$
  - $\min\, I(\xi; y^t) - I(\xi; \phi(y^t)) + \beta H(\hat{\xi})$
  - information bottleneck [n. tishby]: $\min\, I(y^t; \hat{\xi}) - \beta I(\hat{\xi}; \xi)$
  - actionable information: $\min\, H(y^\infty_{t+1} \mid \hat{\xi}) + \tfrac{1}{t}\, \beta H(\hat{\xi})$
  - representation = “state”: a function of the past that best predicts future (nuisance-invariants of the) data given available resources
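for small alphabets the information bottleneck trade-off can be explored by brute force. a sketch (not from the talk; the joint distribution, the two-cluster bottleneck, and $\beta = 4$ are made-up choices) that enumerates deterministic maps $\hat{\xi} = \phi(y)$ and minimizes $I(y; \hat{\xi}) - \beta I(\hat{\xi}; \xi)$:

```python
import numpy as np
from itertools import product

def mi(p_ab):
    """mutual information (nats) from a joint probability table."""
    pa = p_ab.sum(axis=1, keepdims=True)
    pb = p_ab.sum(axis=0, keepdims=True)
    m = p_ab > 0
    return float((p_ab[m] * np.log(p_ab[m] / (pa @ pb)[m])).sum())

# hypothetical joint p(xi, y): 2 task values (rows), 4 data values (cols)
p_xy = np.array([[0.20, 0.10, 0.05, 0.05],
                 [0.05, 0.05, 0.10, 0.40]])
beta = 4.0  # weight on preserving task information

best = None
for phi in product(range(2), repeat=4):  # all deterministic y -> xi_hat maps
    p_yz = np.zeros((4, 2))              # joint of y and xi_hat = phi(y)
    p_xz = np.zeros((2, 2))              # joint of xi and xi_hat
    for y, z in enumerate(phi):
        p_yz[y, z] = p_xy[:, y].sum()
        p_xz[:, z] += p_xy[:, y]
    cost = mi(p_yz) - beta * mi(p_xz)    # information bottleneck lagrangian
    if best is None or cost < best[0]:
        best = (cost, phi)

print(best[1])  # the grouping of y-values that best trades compression for task info
```

with a large enough $\beta$ the winning $\phi$ keeps the split of $y$ that is informative about $\xi$ while merging the rest, which is the sense in which $\hat{\xi}$ compresses the past for the task.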

  25.–27. (incremental build) “the past”: phylogenic data aggregation is encoded in the structure of $\phi$ (nuisances, invariances, policies, tradeoffs, tasks); ontogenic data aggregation is continuously integrated into the representation $\hat{\xi}$

  28. “nuisances” account for almost all uncertainty/variability in visual data. some can be removed from the data at the outset, losslessly: canonization (e.g., contrast, planar isometries)*, via co-variant detection/invariant description. others have to be sampled/marginalized: e.g., scale, class-specific deformation. others have to be “discovered”: e.g., occlusion, requiring “sufficient exploration”. all of this is instantiated for a specific data-formation model (the LA model)

  29.–31. (incremental build) [system block diagram] the scene $\xi$ and nuisances $\nu$, together with unmodeled phenomena $n$, drive the sensors (EO, IR, MS, IMU, LIDAR), producing data $y^t$; canonization yields $\phi^\wedge(y^t)$; inference $\min H(y_{t+1} \mid \hat{\xi}_t)$ computes the innovation $p(\hat{\xi} \mid y^t)$ (the actionable information increment) and, through an information bottleneck, the representation $\hat{\xi}_t$ and nuisance estimates $\hat{\nu}$; the loop is closed by the control $u_t$, chosen as $\max_u H(y_{t+1} \mid \hat{\xi}_t, u)$, which drives the sensing actions
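the control block in the diagram, $\max_u H(y_{t+1} \mid \hat{\xi}_t, u)$, picks the action whose next measurement is least predictable given the current representation. a toy sketch (not from the talk; the candidate actions and their predictive distributions are invented):

```python
import numpy as np

def entropy(p):
    """shannon entropy in nats of a discrete distribution."""
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

# hypothetical predictive distributions p(y_{t+1} | xi_hat_t, u) over 4
# measurement outcomes, one row per candidate control u
p_pred = np.array([[0.97, 0.01, 0.01, 0.01],   # u=0: stay put, little surprise
                   [0.40, 0.30, 0.20, 0.10],   # u=1: small motion
                   [0.25, 0.25, 0.25, 0.25]])  # u=2: explore an unseen area

# max_u H(y_{t+1} | xi_hat_t, u): pick the most informative action
u_star = int(np.argmax([entropy(p) for p in p_pred]))
print(u_star)  # -> 2
```

the exploratory action wins because, conditioned on what is already represented, it promises the largest actionable information increment.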

  32.–36. (incremental build) the LA model (lambert-ambient): a scene is a surface $S \subset \mathbb{R}^3$ with albedo $\rho: S \to \mathbb{R}^+$, $p \mapsto \rho(p)$; an image $I_t: D \to \mathbb{R}^+$ on the domain $D \subset \mathbb{R}^2$ is formed as $y_t(x) = \kappa_t(\rho(p)) + n_t(x) \in \mathbb{R}^+$, where $x = \pi(g_t p)$, $p \in S \subset \mathbb{R}^3$, with pose $g_t \in SE(3)$ and contrast $\kappa_t: \mathbb{R}^+ \to \mathbb{R}^+$; the scene is $\xi = \{ \rho, S \}$ and the nuisances are $\nu_t = \{ n_t, \pi \}$

  37.–38. the LA model, compactly: $y_t = h(g_t, \xi, \nu_t) + n_t$, with image $I_t: D \to \mathbb{R}^+$, $D \subset \mathbb{R}^2$, surface $S \subset \mathbb{R}^3$, albedo $\rho: S \to \mathbb{R}^+$, and pose $g_t \in SE(3)$
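the LA model can be simulated directly. a minimal numerical sketch (not from the talk; the surface points, albedo function, pose, contrast change, and noise level are all made up) that forms $y_t(x) = \kappa_t(\rho(p)) + n_t(x)$ at $x = \pi(g_t p)$ with a pinhole projection $\pi$:

```python
import numpy as np

rng = np.random.default_rng(0)

# hypothetical scene: N points p on a surface S, placed in front of the camera,
# with albedo rho: S -> R+
P = rng.uniform(-1, 1, size=(100, 3)) + np.array([0.0, 0.0, 4.0])
rho = np.abs(np.sin(P[:, 0]) * np.cos(P[:, 1]))

# pose g_t in SE(3): a small rotation R about the optical axis and translation T
theta = 0.1
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])
T = np.array([0.05, 0.0, 0.0])

# x = pi(g_t p): move the points, then project through a pinhole
Q = P @ R.T + T
x = Q[:, :2] / Q[:, 2:3]

# y_t(x) = kappa_t(rho(p)) + n_t(x): monotone contrast change plus noise
kappa = lambda r: r ** 0.8
n = 0.01 * rng.standard_normal(100)
y = kappa(rho) + n

print(x.shape, y.shape)  # -> (100, 2) (100,)
```

moving $g_t$ and re-rendering is exactly the sense in which mobility excites the model: viewpoint changes reveal which parts of $\xi = \{ \rho, S \}$ are observable.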

  39.–41. (incremental build) complete representation: given one or more images $y^t$, a representation is a statistic $\hat{\xi}_t = \phi(y^t)$ such that $\phi^\wedge(y^t) = \{ h(e, \hat{\xi}_t, \nu_t),\ e \in G,\ \nu_t \in \mathcal{V} \} \doteq L(\hat{\xi}_t)$, i.e., a statistic from which the (maximal invariant of the) images can be “hallucinated” up to an “uninformative” residual
