Subjective Mapping
Michael Bowling
April 22, 2005
University of Alberta
Acknowledgments
• Joint work with Ali Ghodsi and Dana Wilkinson.
• Localization work with Adam Milstein and Wesley Loh.
• Discussions and insight: Finnegan Southey, Dale Schuurmans, Tao Wang, Dan Lizotte, Michael Littman, and Pascal Poupart.
What is a Map?
What is a Map?
Essentially, maps answer three questions:
1. Where have I been?
2. Where am I now?
3. How do I get where I want to go?
(Figure: map of the McConnell building.)
Robot Maps
∗ From Sebastian Thrun's web page.
Robot Maps
• Maps are models.
  – x_t: pose, u_t: action or control, z_t: observation.
  – Motion model: P(x_{t+1} | x_t, u_t).
  – Sensor model: P(z_t | x_t).
• Robot pose: x_t = (x_t, y_t, θ_t).
  – An objective representation.
  – Models relate actions (u_t) and observations (z_t) to this frame of reference.
• Can a map be learned with only subjective experience?
    z_1, u_1, z_2, u_2, . . . , u_{T−1}, z_T
  Not an objective map.
ImageBot
• A "Robot" moving around on a large image.
  – Forward
  – Backward
  – Left
  – Right
  – Turn-CW
  – Turn-CCW
  – Zoom-In
  – Zoom-Out
• Example: F × 5, CW × 8, F × 5, CW × 16, F × 5, CW × 8
ImageBot
• Construct a map from a stream of input:
    z_1, u_1, z_2, u_2, . . .
  (Figure: the observation images z_1 through z_10, linked by their actions.)
• Actions are labels with no semantics.
• No image features, just high-dimensional vectors.
• Can a map be learned with only subjective experience?
Overview
• What is a Map?
• Subjective Mapping
• Action Respecting Embedding (ARE)
• Results and Applications
• Future Work
Subjective Maps
• What is a subjective map?
  – Answers the map questions:
    1. Where have I been?
    2. Where am I now?
    3. How do I get where I want to go?
  – No models.
  – Representation (i.e., pose) can be anything.
• Subjective mapping becomes choosing a representation.
  – What is a good representation?
  – How do we extract it from experience?
  – How do we answer our map questions with it?
Subjective Representation
• (x, y, θ) is often a good representation. Why?
  – Sufficient representation for generating observations.
    (Figure: robot pose (x, y, θ) in the plane.)
  – Low dimensional (despite high-dimensional observations).
  – Actions are simple transformations:
      x_{t+1} = (x_t + F cos θ_t, y_t + F sin θ_t, θ_t + R)
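The action update above is easy to state directly as code. A minimal sketch; the names `step`, `forward`, and `rotate` are illustrative choices of mine, not from the talk:

```python
import math

def step(pose, forward, rotate):
    """Motion update from the slide:
    x_{t+1} = (x_t + F cos θ_t, y_t + F sin θ_t, θ_t + R)."""
    x, y, theta = pose
    return (x + forward * math.cos(theta),
            y + forward * math.sin(theta),
            theta + rotate)

# Drive forward one unit while turning 90 degrees, then forward again.
pose = (0.0, 0.0, 0.0)
pose = step(pose, 1.0, math.pi / 2)  # moves along +x, ends facing +y
pose = step(pose, 1.0, 0.0)          # moves along +y
```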
Subjective Representation
• How do we extract a representation like (x, y, θ)?
  – Low dimensional description of observations?
    Dimensionality Reduction
  – Respects actions as simple transformations?
Dimensionality Reduction Maps
(Plot: x position on the manifold vs. time.)
Subjective Representation
• How do we extract a representation like (x, y, θ)?
  – Low dimensional description of observations?
    Dimensionality Reduction
  – Respects actions as simple transformations?
    Action Respecting Embedding
Overview
• What is a Map?
• Subjective Mapping
• Action Respecting Embedding (ARE)
  – Dimensionality Reduction 101
  – Non-Uniform Neighborhoods
  – Action Respecting Constraints
• Results and Applications
• Future Work
Dimensionality Reduction 101
• Principal Components Analysis (PCA)
    X = [ x_1  x_2  · · ·  x_n ]
  – Top d eigenvectors of X^T X form the subspace basis.
  – X^T X is the inner product matrix of X (a.k.a. the kernel).
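The inner-product route to the embedding can be sketched in a few lines of NumPy (uncentered, as on the slide; `pca_embed` is my name for illustration):

```python
import numpy as np

def pca_embed(X, d):
    """d x n embedding from the top d eigenvectors of the inner
    product (Gram) matrix X^T X, with columns of X as data points."""
    K = X.T @ X                       # n x n inner product matrix
    vals, vecs = np.linalg.eigh(K)    # eigenvalues in ascending order
    idx = np.argsort(vals)[::-1][:d]  # indices of the top d
    sigma = np.sqrt(np.maximum(vals[idx], 0.0))
    return np.diag(sigma) @ vecs[:, idx].T

# Points on a line through the origin in 3-D: one dimension suffices,
# and the 1-D embedding preserves all pairwise distances exactly.
X = np.outer([1.0, 2.0, 3.0], [0.0, 1.0, 2.0, 3.0])  # 3 x 4
Y = pca_embed(X, 1)                                   # 1 x 4
```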
Dimensionality Reduction 101
• Non-linear mapping followed by PCA.
  – Let Φ be a non-linear mapping, X → Y.
  – Top d eigenvectors of Φ(X)^T Φ(X) (a.k.a. the kernel).
  – Φ is expected to be very high dimensional.
Dimensionality Reduction 101
• Kernel PCA
  – Compute K = Φ(X)^T Φ(X) directly.
    ∗ K is n × n regardless of the dimensionality of Y.
    ∗ K_ij = ⟨Φ(x_i), Φ(x_j)⟩ = k(x_i, x_j).
  – Y = diag(σ) V^T, where
    ∗ σ is the square root of the top d eigenvalues of K.
    ∗ V is the top d eigenvectors of K.
    ∗ Y is d × n: the low-dimensional embedding of X.
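A sketch of the kernel trick in action: K is filled in directly from a kernel function, and Φ is never formed. The degree-2 polynomial kernel and the name `kernel_pca` are my illustrative choices, not from the talk:

```python
import numpy as np

def kernel_pca(K, d):
    """Y = diag(sigma) V^T, where sigma is the square root of the
    top-d eigenvalues of K and V holds the top-d eigenvectors,
    giving the d x n embedding."""
    vals, vecs = np.linalg.eigh(K)
    idx = np.argsort(vals)[::-1][:d]
    return np.diag(np.sqrt(np.maximum(vals[idx], 0.0))) @ vecs[:, idx].T

# K_ij = k(x_i, x_j) = <x_i, x_j>^2: the inner product in the
# (never constructed) degree-2 polynomial feature space.
X = np.array([[0.0, 1.0, -1.0, 2.0],
              [1.0, 0.0,  1.0, 1.0]])  # four 2-D points as columns
K = (X.T @ X) ** 2
Y = kernel_pca(K, 3)                   # 3 x 4 nonlinear embedding
```

Since this feature space is only 3-dimensional, the 3-D embedding reproduces K exactly: Y^T Y = K.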
Semidefinite Embedding (SDE) (Weinberger & Saul, 2004)
• Goal: learn the kernel matrix, K, from the data.
• Optimization problem:
    Maximize:    Tr(K)
    Subject to:  K ⪰ 0
                 Σ_ij K_ij = 0
                 ∀ i, j:  η_ij > 0 ∨ [η^T η]_ij > 0  ⇒
                          K_ii − 2K_ij + K_jj = ||x_i − x_j||²
  where η comes from k-nearest neighbors.
• Use the learned K with kernel PCA.
Semidefinite Embedding (SDE)
• It works.
• Reduces dimensionality reduction to a constrained optimization problem.
Subjective Mapping
• Given a stream of data, z_1, u_1, z_2, u_2, . . . , u_{T−1}, z_T, how do we construct a good subjective representation (x_1, . . . , x_T)?
  – Low dimensional description of observations.
  – Actions are simple transformations.
• SDE on the z_i will find a low dimensional representation.
  – Does not make use of the action labels.
  – The resulting representations (already seen) are poor.
• Action Respecting Embedding to the rescue.
Action Respecting Embedding (ARE)
• Like SDE, learns a kernel matrix K through optimization.
• Uses the kernel matrix K with kernel PCA.
• Exploits the action data (u_t) in two ways:
  – Non-uniform neighborhood graph.
  – Action respecting constraints.
Non-Uniform Neighborhoods
• Actions only have a small effect on the robot's pose.
• If u_t takes z_t to z_{t+1}, then Φ(z_t) and Φ(z_{t+1}) should be close.
• Set z_t's neighborhood size to include z_{t−1} and z_{t+1}.
• The neighbor graph uses this non-uniform neighborhood size.
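One plausible reading of this rule as code; the function name and the exact radius rule are my interpretation of the slide, not a specification from the talk. Each point's neighborhood radius grows until it reaches both temporal neighbors, and every point within that radius is then included:

```python
import numpy as np

def temporal_neighborhoods(Z):
    """Non-uniform neighborhoods for a T-step observation stream
    (columns of Z): z_t's radius is the larger of its distances to
    z_{t-1} and z_{t+1}, so both are always in its neighborhood."""
    T = Z.shape[1]
    D = np.linalg.norm(Z[:, :, None] - Z[:, None, :], axis=0)
    neighbors = []
    for t in range(T):
        radius = max(D[t, max(t - 1, 0)], D[t, min(t + 1, T - 1)])
        neighbors.append({j for j in range(T)
                          if j != t and D[t, j] <= radius})
    return neighbors
```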
Action Respecting Constraints
• Constrain the representation so that actions must be simple transformations.
• What's a simple transformation?
  – Linear transformations are simple: f(x) = Ax + b.
  – Distance-preserving ones are simpler still: f(x) = Ax + b where A^T A = I,
    i.e., rotation and translation, but no scaling.
Action Respecting Constraints
• Distance preserving ⟺ ||f(x) − f(x′)|| = ||x − x′||.
• For our representation, if u_i = u_j = u:
    ||f_u(Φ(z_i)) − f_u(Φ(z_j))|| = ||Φ(z_i) − Φ(z_j)||
    ||Φ(z_{i+1}) − Φ(z_{j+1})|| = ||Φ(z_i) − Φ(z_j)||
    K_{(i+1)(i+1)} − 2K_{(i+1)(j+1)} + K_{(j+1)(j+1)} = K_ii − 2K_ij + K_jj
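The last step of the derivation can be checked numerically: for a rotation-plus-translation f_u (the angle, translation, and points below are arbitrary example values of mine), the kernel equality is exactly distance preservation written in Gram-matrix entries.

```python
import numpy as np

theta = 0.7                                # arbitrary rotation angle
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
b = np.array([2.0, -1.0])                  # arbitrary translation
f_u = lambda x: A @ x + b                  # A^T A = I: distance preserving

z_i, z_j = np.array([0.0, 1.0]), np.array([3.0, 4.0])
z_i1, z_j1 = f_u(z_i), f_u(z_j)            # play the roles of z_{i+1}, z_{j+1}

# With K_ab = <z_a, z_b>, the two sides of the slide's constraint are
# the squared distances after and before applying f_u.
lhs = z_i1 @ z_i1 - 2 * (z_i1 @ z_j1) + z_j1 @ z_j1
rhs = z_i @ z_i - 2 * (z_i @ z_j) + z_j @ z_j
```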