Action Respecting Embedding
Michael Bowling
University of Alberta
April 25, 2005
Acknowledgments
• Joint work with Ali Ghodsi and Dana Wilkinson.
• Localization work with Adam Milstein and Wesley Loh.
• Discussions and insight: Finnegan Southey, Dale Schuurmans, Tao Wang, Dan Lizotte, Michael Littman, and Pascal Poupart.
What is a Map?
Robot Maps
• Maps are models.
  – x_t: pose, u_t: action or control, z_t: observation.
  – Motion model: P(x_{t+1} | x_t, u_t).
  – Sensor model: P(z_t | x_t).
• Robot pose: x_t = (x_t, y_t, θ_t).
  – An objective representation.
  – The models relate actions (u_t) and observations (z_t) to this frame of reference.
• Can a map be learned from only the subjective experience z_1, u_1, z_2, u_2, ..., u_{T-1}, z_T, with no objective map?
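As an illustration of what "maps are models" means, here is a tiny hypothetical sketch of the two conditional distributions written as sampling functions. Everything in it (the noise levels, the range-to-landmark sensor, the function names) is an invented stand-in rather than anything from the talk.

```python
import math
import random

def motion_model(pose, action, noise=0.1):
    """Sample x_{t+1} ~ P(x_{t+1} | x_t, u_t): move forward and turn, with noise."""
    x, y, theta = pose
    forward, rotate = action
    x += forward * math.cos(theta) + random.gauss(0, noise)
    y += forward * math.sin(theta) + random.gauss(0, noise)
    theta += rotate + random.gauss(0, noise)
    return (x, y, theta)

def sensor_model(pose, landmark=(10.0, 5.0), noise=0.5):
    """Sample z_t ~ P(z_t | x_t): a noisy range reading to a known landmark."""
    x, y, _ = pose
    return math.hypot(landmark[0] - x, landmark[1] - y) + random.gauss(0, noise)
```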
ImageBot
• A “Robot” moving around on a large image.
  – Actions: Forward, Backward, Left, Right, Turn-CW, Turn-CCW, Zoom-In, Zoom-Out.
• Example trajectory: F × 5, CW × 8, F × 5, CW × 16, F × 5, CW × 8.
ImageBot
• Construct a map from a stream of input: z_1 ⇒ z_2 ⇒ z_3 ⇒ ... ⇒ z_T, where each arrow is labeled by the action u_t taken between observations. [Figure: two rows of image observations connected by action-labeled arrows]
• Actions are labels with no semantics.
• No image features, just high-dimensional pixel vectors.
• Can a map be learned with only subjective experience?
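To make the data concrete, here is a hedged, simplified ImageBot sketch that generates such a stream. It implements only the four translation actions, uses a random array as a stand-in for the large image, and every name in it is an assumption rather than the authors' code.

```python
import numpy as np

# Hypothetical simplified ImageBot: a fixed-size window sliding over a large image.
ACTIONS = {"F": (-10, 0), "B": (10, 0), "L": (0, -10), "R": (0, 10)}

def observe(image, row, col, size=64):
    """Observation z_t: the current window, flattened into a high-dimensional vector."""
    return image[row:row + size, col:col + size].ravel().astype(float)

def run_imagebot(image, actions, start=(100, 100), size=64):
    """Return the experience stream z_1, u_1, z_2, ..., u_{T-1}, z_T."""
    row, col = start
    Z, U = [observe(image, row, col, size)], []
    for a in actions:
        dr, dc = ACTIONS[a]
        row, col = row + dr, col + dc
        U.append(a)  # the learner sees only the label, not what the action does
        Z.append(observe(image, row, col, size))
    return np.array(Z), U

image = np.random.rand(500, 500)       # stand-in for the large image
Z, U = run_imagebot(image, ["F"] * 5)  # e.g. the first five Forward steps of the example
```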
Overview
• What is a Map?
• Subjective Mapping
• Action Respecting Embedding (ARE)
• Results
• Future Work
Subjective Maps
• What is a subjective map?
  – It allows you to do “map things”, e.g., localize and plan.
  – No models.
  – The representation (i.e., pose) can be anything.
• Subjective mapping becomes choosing a representation.
  – What is a good representation?
  – How do we extract it from experience?
  – How do we answer our map questions with it?
Subjective Representation
• (x, y, θ) is often a good representation. Why?
  – A sufficient representation for generating observations. [Figure: robot pose drawn as position (x, y) and heading θ]
  – Low dimensional (despite high-dimensional observations).
  – Actions are simple transformations:
      x_{t+1} = (x_t + F cos θ_t, y_t + F sin θ_t, θ_t + R)
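For concreteness, a minimal sketch of that pose update, where F is the forward distance and R the rotation applied by the action; the function and type names are mine, not from the talk.

```python
import math
from dataclasses import dataclass

@dataclass
class Pose:
    x: float
    y: float
    theta: float  # heading in radians

def apply_action(pose: Pose, F: float, R: float) -> Pose:
    """The slide's update: translate by F along the current heading, then rotate by R."""
    return Pose(
        x=pose.x + F * math.cos(pose.theta),
        y=pose.y + F * math.sin(pose.theta),
        theta=pose.theta + R,
    )

# Example: move forward 5 units, then turn 90 degrees counter-clockwise.
p = apply_action(Pose(0.0, 0.0, 0.0), F=5.0, R=0.0)
p = apply_action(p, F=0.0, R=math.pi / 2)
```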
Subjective Representation
• How do we extract a representation like (x, y, θ)?
  – A low-dimensional description of the observations? → Dimensionality Reduction
  – One that respects actions as simple transformations?
Semidefinite Embedding (SDE) (Weinberger & Saul, 2004)
• Goal: learn the kernel matrix K from the data.
• Optimization problem:
    Maximize:    Tr(K)
    Subject to:  K ⪰ 0
                 Σ_{ij} K_{ij} = 0
                 ∀ i, j:  η_{ij} > 0 ∨ [ηᵀη]_{ij} > 0  ⇒  K_{ii} − 2K_{ij} + K_{jj} = ||x_i − x_j||²
  where η comes from k-nearest neighbors.
• Use the learned K with kernel PCA.
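A minimal sketch of this semidefinite program, assuming cvxpy and scikit-learn are available; the function and parameter names are mine, and it is practical only for small data sets, since the SDP has O(T²) variables.

```python
import numpy as np
import cvxpy as cp
from sklearn.neighbors import kneighbors_graph

def sde_kernel(X, k=4):
    """Learn a centered PSD kernel K that preserves local distances on the
    k-NN graph while maximizing Tr(K), as in the program on the slide."""
    n = X.shape[0]
    eta = kneighbors_graph(X, n_neighbors=k).toarray()
    # eta_ij > 0 or [eta^T eta]_ij > 0: neighbors, or points sharing a neighbor.
    connected = (eta + eta.T + eta.T @ eta) > 0

    K = cp.Variable((n, n), PSD=True)
    constraints = [cp.sum(K) == 0]  # centering constraint
    for i in range(n):
        for j in range(i + 1, n):
            if connected[i, j]:
                d2 = float(np.sum((X[i] - X[j]) ** 2))
                constraints.append(K[i, i] - 2 * K[i, j] + K[j, j] == d2)
    cp.Problem(cp.Maximize(cp.trace(K)), constraints).solve()
    return K.value
```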
Semidefinite Embedding (SDE)
• It works.
• It reduces dimensionality reduction to a constrained optimization problem.
Dimensionality Reduction Maps
[Plot: position on the learned manifold versus time]
Subjective Representation
• How do we extract a representation like (x, y, θ)?
  – A low-dimensional description of the observations? → Dimensionality Reduction
  – One that respects actions as simple transformations? → Action Respecting Embedding
Overview
• What is a Map?
• Subjective Mapping
• Action Respecting Embedding (ARE)
  – Non-Uniform Neighborhoods
  – Action Respecting Constraints
• Results
• Future Work
Action Respecting Embedding (ARE)
• Like SDE, it learns a kernel matrix K through optimization.
• The learned kernel matrix K is used with kernel PCA.
• It exploits the action data (u_t) in two ways:
  – A non-uniform neighborhood graph.
  – Action respecting constraints.
Non-Uniform Neighborhoods
• Actions have only a small effect on the robot’s pose.
• If z_t ⇒ z_{t+1} under action u_t, then Φ(z_t) and Φ(z_{t+1}) should be close.
• Set z_t’s neighborhood size just large enough to include z_{t-1} and z_{t+1}. [Figure: neighborhood balls around z_t and z_{t+1}]
• The neighbor graph uses this non-uniform neighborhood size.
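One plausible reading of that rule, written out as code; the radius choice and all names are my assumptions, not necessarily the paper's exact construction.

```python
import numpy as np

def nonuniform_neighbor_graph(Z):
    """eta[t, s] = 1 when z_s lies within z_t's neighborhood, whose radius is
    chosen just large enough to contain the temporal neighbors z_{t-1}, z_{t+1}."""
    T = Z.shape[0]
    D = np.linalg.norm(Z[:, None, :] - Z[None, :, :], axis=-1)  # pairwise distances
    eta = np.zeros((T, T), dtype=int)
    for t in range(T):
        temporal = [s for s in (t - 1, t + 1) if 0 <= s < T]
        radius = max(D[t, s] for s in temporal)
        eta[t] = (D[t] <= radius).astype(int)
        eta[t, t] = 0
    return eta
```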
Action Respecting Constraints
• Constrain the representation so that actions must be simple transformations.
• What’s a simple transformation?
  – Affine transformations are simple: f(x) = Ax + b.
  – Distance-preserving ones are just slightly simpler: f(x) = Ax + b where AᵀA = I, i.e., rotation and translation, but no scaling.
Action Respecting Constraints
• Distance preserving ⇔ ||f(x) − f(x′)|| = ||x − x′||. [Figure: points x, x′ and their images f(x), f(x′) at equal distance]
• For our representation, if u_i = u_j = u, then, since applying u to z_t yields z_{t+1} (so f_u(Φ(z_t)) = Φ(z_{t+1})):
    ||f_u(Φ(z_i)) − f_u(Φ(z_j))|| = ||Φ(z_i) − Φ(z_j)||
    ||Φ(z_{i+1}) − Φ(z_{j+1})|| = ||Φ(z_i) − Φ(z_j)||
    K_{(i+1)(i+1)} − 2K_{(i+1)(j+1)} + K_{(j+1)(j+1)} = K_{ii} − 2K_{ij} + K_{jj}
  where the last line rewrites both squared distances in terms of the kernel K.
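As a concrete check (the function name and the convention that u has length T−1 are mine), the final equality can be verified numerically for any candidate kernel matrix K and action sequence u:

```python
import numpy as np

def action_constraint_gap(K, u):
    """Largest violation of the action respecting equality: for i, j with
    u_i = u_j, the kernel-space distance between the successors must equal
    the distance between the points themselves."""
    d2 = lambda a, b: K[a, a] - 2 * K[a, b] + K[b, b]
    gaps = [abs(d2(i + 1, j + 1) - d2(i, j))
            for i in range(len(u)) for j in range(i + 1, len(u))
            if u[i] == u[j]]
    return max(gaps, default=0.0)
```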
Action Respecting Embedding
• Optimization problem:
    Maximize:    Tr(K)
    Subject to:  K ⪰ 0
                 Σ_{ij} K_{ij} = 0
                 ∀ i, j:  η_{ij} > 0 ∨ [ηᵀη]_{ij} > 0  ⇒  K_{ii} − 2K_{ij} + K_{jj} = ||z_i − z_j||²
                 ∀ i, j:  u_i = u_j  ⇒  K_{(i+1)(i+1)} − 2K_{(i+1)(j+1)} + K_{(j+1)(j+1)} = K_{ii} − 2K_{ij} + K_{jj}
  where η is the non-uniform neighborhood graph.
• Use the learned K with kernel PCA to extract x_1, …, x_T.
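Putting the pieces together, here is a hedged sketch of the full program plus the kernel PCA step. It assumes cvxpy, takes the non-uniform neighborhood graph η as an input (e.g., from the sketch above), and all names are my own rather than the authors'; it is a small-scale illustration, not the paper's implementation.

```python
import numpy as np
import cvxpy as cp

def are_kernel(Z, u, eta):
    """ARE-style SDP: the SDE program plus action respecting constraints.
    Z: (T, d) observations; u: length T-1 action labels; eta: neighbor graph."""
    T = Z.shape[0]
    connected = (eta + eta.T + eta.T @ eta) > 0
    K = cp.Variable((T, T), PSD=True)
    cons = [cp.sum(K) == 0]  # centering
    for i in range(T):
        for j in range(i + 1, T):
            if connected[i, j]:  # local isometry on the neighborhood graph
                cons.append(K[i, i] - 2 * K[i, j] + K[j, j]
                            == float(np.sum((Z[i] - Z[j]) ** 2)))
    for i in range(T - 1):
        for j in range(i + 1, T - 1):
            if u[i] == u[j]:     # same action: distance preserved one step later
                cons.append(K[i + 1, i + 1] - 2 * K[i + 1, j + 1] + K[j + 1, j + 1]
                            == K[i, i] - 2 * K[i, j] + K[j, j])
    cp.Problem(cp.Maximize(cp.trace(K)), cons).solve()
    return K.value

def kernel_pca_coords(K, dims=2):
    """Top eigenvectors of the learned (already centered) kernel give x_1, ..., x_T."""
    vals, vecs = np.linalg.eigh(K)
    order = np.argsort(vals)[::-1][:dims]
    return vecs[:, order] * np.sqrt(np.maximum(vals[order], 0.0))
```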
Overview
• What is a Map?
• Subjective Mapping
• Action Respecting Embedding (ARE)
• Results
• Future Work
ImageBot
• Construct a map from a stream of input: z_1 ⇒ z_2 ⇒ z_3 ⇒ ... ⇒ z_T, where each arrow is labeled by the action u_t taken between observations. [Figure: two rows of image observations connected by action-labeled arrows]
• Actions are labels with no semantics.
• No image features, just high-dimensional pixel vectors.
Learned Representations
[Plot: SDE vs. ARE, position on the learned manifold versus time]
Learned Representations
[Plot: SDE vs. ARE, first two manifold dimensions, trajectory annotated with θ]
Learned Representations
[Plot: SDE vs. ARE, first two manifold dimensions, trajectory annotated with x and y segments]
Learned Representations
[Plot: SDE vs. ARE, first two manifold dimensions, trajectory annotated with x and z segments]
Learned Representations
[Plot: learned embedding shown in its 1st, 2nd, and 3rd dimensions]