Kangaroo: An Efficient Constraint-Based Local Search System Using Lazy Propagation. M.A.Hakim Newton 1,2, Duc Nghia Pham 1,2, Abdul Sattar 1,2, Michael Maher 1,3,4. (1 National ICT Australia; 2 IIIS, Griffith University; 3 CSE, University of NSW; 4 Reasoning Research Institute, Sydney.) CP'2011, September 12–16, Perugia, Italy. 1 / 26
Presentation Outline: 1. CBLS Introduction 2. Comet & Kangaroo 3. Kangaroo Details 4. Conclusions 2 / 26
Constraint-Based Local Search (CBLS) Constraint satisfaction as optimisation ◮ Minimise the constraints' violation metrics. ◮ The problem is solved when all violation metrics are zero. ◮ The search may be guided by the constraints. Using local search to solve the problem: 1 Start from a random value-to-variable assignment. 2 Repeatedly make the "best" possible move. 3 If stuck at a plateau, restart the search process. 3 / 26
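The three steps above can be sketched in a few lines of C++. This is a hypothetical miniature CBLS loop, not Kangaroo's or Comet's actual code: each constraint reports a violation metric, the search simulates every single-variable move, executes the one that lowers total violation the most, and restarts randomly on a plateau.

```cpp
#include <cassert>
#include <cstddef>
#include <cstdlib>
#include <functional>
#include <vector>

// A constraint is any function mapping an assignment to a violation metric.
using Constraint = std::function<int(const std::vector<int>&)>;

int totalViolation(const std::vector<Constraint>& cs, const std::vector<int>& v) {
    int t = 0;
    for (const auto& c : cs) t += c(v);
    return t;
}

// Greedy local search: repeatedly take the best strictly-improving
// single-variable move; restart randomly when stuck on a plateau.
bool solve(std::vector<int>& vars, int domainMax,
           const std::vector<Constraint>& cs, int maxIters) {
    std::srand(42);
    for (int it = 0; it < maxIters; ++it) {
        int cur = totalViolation(cs, vars);
        if (cur == 0) return true;                 // all metrics zero: solved
        int bestVar = -1, bestVal = -1, bestDelta = 0;
        for (std::size_t i = 0; i < vars.size(); ++i) {  // simulate every move
            int old = vars[i];
            for (int val = 0; val <= domainMax; ++val) {
                vars[i] = val;
                int delta = totalViolation(cs, vars) - cur;
                if (delta < bestDelta) {
                    bestDelta = delta;
                    bestVar = static_cast<int>(i);
                    bestVal = val;
                }
            }
            vars[i] = old;                         // undo the simulated move
        }
        if (bestVar < 0) {                         // plateau: random restart
            for (auto& v : vars) v = std::rand() % (domainMax + 1);
        } else {
            vars[bestVar] = bestVal;               // execute the best move
        }
    }
    return totalViolation(cs, vars) == 0;
}
```

Here every simulation recomputes the violation from scratch; the incremental techniques discussed later exist precisely to avoid that cost.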
Magic Square Example Frequent Operations During Search ◮ Execution: given a move, propagate changes as needed. ◮ Simulation: given a move, what would be the violation metric? ◮ Simulations help find the “best” move for next execution. 4 / 26
CBLS System Separation of Representation and Search ◮ Representation: easy problem specification (e.g. sum, equal). ◮ Search: easy to play with strategies (e.g. best swap). ◮ The CBLS system supports the interactions in between. Invariants and Differentiable Objects ◮ Invariants are expressions that are maintained up-to-date. ◮ Diff. objects efficiently compute effects of potential moves. ◮ These commitments are maintained by the CBLS system. 5 / 26
Local Search and CBLS Systems iOpt, 2000: A toolkit for private use within British Telecom, UK. HotFrame, 2002: Template-based libraries focusing on flexibility rather than efficiency. EasyLocal++, 2003: Template-based libraries focusing on flexibility rather than efficiency. Localizer, 2000: A precursor to Comet; supports mainly invariants. Comet, 2005: The state-of-the-art CBLS system with its own language for problem model & search specification. Kangaroo, 2011: An efficient C++ library; first exposition of key implementation details. 6 / 26
Magic Square Representation

Comet:
    int C = n*(n*n + 1)/2;
    Solver<LS> m();
    range Size = 1..n*n;
    var{int} s[Size](m, Size);
    ConstraintSystem<LS> S(m);
    forall(i in 1..n) {
      S.post((sum(j in 1..n) s[(i-1)*n + j]) == C);
      S.post((sum(j in 1..n) s[(j-1)*n + i]) == C);
    }
    S.post((sum(i in 1..n) s[(i-1)*n + i]) == C);
    S.post((sum(i in 1..n) s[(i-1)*n + n-i+1]) == C);

Kangaroo:
    // Row[i]: vars in row i;  Col[j]: vars in col j
    // Diag1, Diag2: vars in the two diagonals resp.
    // RS, CS, DS1, DS2: row, col, and diag. sums resp.
    // Eq, AC: equal constraints and their combinator
    defSolver(Solver);
    int C = n*(n*n + 1)/2;
    forall(i in 1..n*n)
      defVar(Solver, s[i], 1, n*n);
    forall(k in 1..n) {
      defSum(Solver, RS[k], Row[k]);
      defEqConstr(Solver, Eq[k], RS[k], C);
      defSum(Solver, CS[k], Col[k]);
      defEqConstr(Solver, Eq[n+k], CS[k], C);
    }
    defSum(Solver, DS1, Diag1);
    defEqConstr(Solver, Eq[2*n+1], DS1, C);
    defSum(Solver, DS2, Diag2);
    defEqConstr(Solver, Eq[2*n+2], DS2, C);
    defAndConstr(Solver, AC, Eq);
7 / 26
Magic Square Search

Comet:
    RandomPermutation distr(Size);
    forall(i in Size) s[i] := distr.get();
    int tabu[Size] = 0;
    while (S.violations() > 0) {
      if it >= MaxIt then return;
      selectMin(i in Size : tabu[i] <= it,
                j in Size : i < j && tabu[j] <= it)
               (S.getSwapDelta(s[i], s[j])) {
        s[i] :=: s[j];  // swap the values
        tabu[i] = it + TabuLength;
        tabu[j] = it + TabuLength;
      }
      it = it + 1;
    }

Kangaroo:
    setTabuLength(Solver, TabuLength);
    defTabuMinSwapSel  // handles tabu
      (Solver, Selector, AndConstr);
    assign a random permutation of [1..n*n] to s;
    while (AC.violations() > 0) {
      if it >= MaxIt then return;
      run Selector to select a var-pair (s[i], s[j]);
      swap the values of s[i] and s[j];
      it = it + 1;
    }
8 / 26
Comet and Kangaroo: Performance

Benchmark problems from CSPLib; the problem models and search algorithms are semantically very close.

    Instances                |        Kangaroo                |          Comet
                             | %Succ    #Iter   Time   Mem    | %Succ      #Iter   Time   Mem
    all interval series (4)  |   75%   21,734    1.6    20    |   50%    504,465   65.1    42
    golomb ruler (11)        |   91%  681,452    8.0    21    |   45%  (not computed)      42
    graph coloring (20)      |  100%      774    0.0    22    |   50%        590    0.3    44
    magic square (9)         |  100%      212  172.6    22    |  100%        213  103.3    43
    n-queens (18)            |  100%    8,532  104.7   111    |  100%      8,597  140.9   293
    social golfer (16)       |   88%  987,822   21.7    22    |   19%  (not computed)      47
    vessel loading (45)      |  100%      212    0.0    24    |   62%  3,741,397   96.4    43

Time in CPU-seconds; memory in megabytes.
◮ Kangaroo's memory usage is about half of Comet's.
◮ Kangaroo solves problems more frequently.
◮ Kangaroo is usually very efficient.
9 / 26
Kangaroo Features Key Features ◮ Lazy on-demand recomputation using top-down traversal ◮ Specialised incremental execution for aggregate formulas ◮ Specialised incremental simulation boosted by caching ◮ When ranges of values are tried for the same variables ◮ When different sets of variables differ by just one variable Other features ◮ Array-based fast data structures ◮ Indices unify values of any data type ◮ Low-level memory management for compactness ◮ System-level data encapsulation (not object level) 10 / 26
Incremental Computation Key to Efficient Propagation in CBLS Systems ◮ Each move normally changes the values of only a few variables. ◮ Expression values are recomputed only by incorporating the changes. ◮ Undo/redo with the old/new value of the changed operand. Summation Example: S = A + B + C with A = 3, B = 4, C = 2, so S = 9. Now set B = 5: S = 9 + undo(B) + redo(B) = 9 − B.old + B.new = 9 − 4 + 5 = 10. 11 / 26
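The summation example above can be sketched as a small C++ invariant. This is a hypothetical illustration of the undo/redo idea, not Kangaroo's actual class: the cached sum is repaired in O(1) per changed operand instead of being re-added from scratch.

```cpp
#include <cassert>
#include <cstddef>
#include <utility>
#include <vector>

// An incremental sum invariant: S = sum of its operands.
class SumInvariant {
    std::vector<int> operands;
    int cached;  // maintained up-to-date across moves

public:
    explicit SumInvariant(std::vector<int> ops)
        : operands(std::move(ops)), cached(0) {
        for (int v : operands) cached += v;  // full evaluation, done once
    }

    int value() const { return cached; }

    // One operand changes: undo its old value, redo its new value.
    void update(std::size_t i, int newVal) {
        cached -= operands[i];  // undo: S - old
        cached += newVal;       // redo: S + new
        operands[i] = newVal;
    }
};
```

With operands {3, 4, 2} the invariant holds 9; updating the middle operand to 5 repairs it to 10, matching the slide's 9 − 4 + 5 = 10.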
Incremental Propagation ◮ Change variables and propagate the changes incrementally. ◮ Propagation is required for both execution and simulation. 12 / 26
Prompt Propagation ◮ Ancestors of the changed variable would be affected ◮ The would-be-affected nodes are to be recomputed 13 / 26
Comet Propagation ◮ Prompt, bottom-up, topological ordering, priority queue ◮ No change in a recomputed node, no further propagation 14 / 26
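The prompt bottom-up scheme above can be sketched as follows. This is a simplified hypothetical model, not Comet's actual implementation: affected nodes are recomputed in topological order via a priority queue keyed by depth, and a node whose value does not change enqueues no parents.

```cpp
#include <cassert>
#include <functional>
#include <queue>
#include <vector>

struct Node {
    int level;                       // topological depth (variables = 0)
    int value;                       // current cached value
    std::vector<int> parents;        // indices of nodes that depend on me
    std::function<int()> recompute;  // reads children, returns my new value
};

// Prompt bottom-up propagation from a changed variable.
void propagate(std::vector<Node>& g, int changed) {
    auto cmp = [&](int a, int b) { return g[a].level > g[b].level; };
    std::priority_queue<int, std::vector<int>, decltype(cmp)> pq(cmp);
    for (int p : g[changed].parents) pq.push(p);
    while (!pq.empty()) {
        int n = pq.top();
        pq.pop();
        int v = g[n].recompute();
        if (v == g[n].value) continue;  // no change: no further propagation
        g[n].value = v;
        for (int p : g[n].parents) pq.push(p);
    }
}
```

Processing in level order guarantees each node sees up-to-date children; a duplicate queue entry is benign because the second recomputation finds no change.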
Lazy Propagation ◮ Recompute only nodes that are currently required ◮ Defer computations for nodes not required currently 15 / 26
Mark-Sweep † Propagation: Marking 1 Two phases in the algorithm: marking and sweeping. 2 Mark the would-be-affected nodes recursively bottom-up. 3 The would-be-affected nodes become out-of-date. † Incremental Attribute Evaluation: A Flexible Algorithm for Lazy Update, Scott Hudson, ACM Transactions on Programming Languages and Systems, Vol 13, No 3, July 1991. 16 / 26
Mark-Sweep † Propagation: Sweeping 1 Sweep only the required nodes recursively top-down. 2 The required nodes are then recomputed and up-to-date. 3 Recomputation of out-of-date nodes appears non-incremental. † Incremental Attribute Evaluation: A Flexible Algorithm for Lazy Update, Scott Hudson, ACM Transactions on Programming Languages and Systems, Vol 13, No 3, July 1991. 17 / 26
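The two phases can be sketched as follows. This is a simplified hypothetical rendering of Hudson-style lazy update, not the paper's full algorithm: a change marks all ancestors out-of-date bottom-up, and reading a node sweeps top-down, recomputing only the out-of-date nodes actually required.

```cpp
#include <cassert>
#include <functional>
#include <vector>

struct LazyNode {
    int value = 0;
    bool outOfDate = false;
    std::vector<int> parents;   // nodes that depend on me
    std::vector<int> children;  // nodes I depend on
    std::function<int(const std::vector<int>&)> eval;  // combine child values
};

// Phase 1: bottom-up marking from a changed node.
void mark(std::vector<LazyNode>& g, int n) {
    for (int p : g[n].parents)
        if (!g[p].outOfDate) {
            g[p].outOfDate = true;
            mark(g, p);
        }
}

// Phase 2: top-down sweeping, triggered only when a value is required.
int sweep(std::vector<LazyNode>& g, int n) {
    if (g[n].outOfDate) {
        std::vector<int> vals;
        for (int c : g[n].children) vals.push_back(sweep(g, c));
        g[n].value = g[n].eval(vals);  // non-incremental recomputation
        g[n].outOfDate = false;
    }
    return g[n].value;
}
```

Note how an out-of-date node that is never read is never recomputed (the laziness), but that each swept node is re-evaluated from all its children (the non-incrementality the slide points out).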
Kangaroo Propagation 1 Top-down recursive recomputation from requisite nodes. 2 Bottom-up recursive notification to deferred nodes. 3 No separate marking/sweeping/notification phase. 18 / 26
Kangaroo Propagation If a node is changed 1 Perform undo operation in its deferred parents. 2 Undeferred parents take the change into account. 19 / 26
Kangaroo Propagation If a node is unchanged 1 No propagation is required to the deferred parents. 2 Undeferred parents take the no-change into account. 20 / 26
Kangaroo Propagation If a deferred node is undone 1 Deferred ancestors are notified about potential changes. 2 This (out-of-date) notification is recursive and bottom-up. 21 / 26
Kangaroo Propagation Deferred Computation Summary 1 Immediate deferred parents perform undo operation. 2 Higher-level deferred parents just get notifications. 3 Recomp. of a deferred node carefully performs undo/redo. 22 / 26
Incremental Simulation Comet and Mark-Sweep ◮ Comet propagates and then reverses the effect; very costly. ◮ Comet depends on specialised APIs, not simulations. ◮ The Mark-Sweep approach does not consider simulations. Kangaroo ◮ Two separate sets of data buffers, so no reversing. ◮ Simulation is supported for both out-of-date and up-to-date nodes. ◮ Simulation, boosted by caching, is at the centre of the design. 23 / 26
Kangaroo Simulation with Caching Same variables, Ranges of values: factor out and cache. ◮ S = X + Y + Z , X = 1, Simulate for X = 2 , 3 ◮ S . newval = S . oldval + undo (1) + redo (2) ◮ S . newval = S . oldval + undo (1) + redo (3) Shared variables in swap: factor out and cache ◮ S = f ( A , B , C ) = f ′ ( X , Y , Z ), Simulate swaps (X, Y), (X, Z) ◮ Factor out and cache children dependent on X. ◮ Factor out and cache undo operation for X. 24 / 26
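The "same variable, range of values" case above can be sketched as follows. This is a hypothetical illustration of the factor-out-and-cache idea, not Kangaroo's actual API: the undo part S − old(X) is computed once per run of simulations, and each candidate value then costs a single addition.

```cpp
#include <cassert>
#include <cstddef>
#include <utility>
#include <vector>

class CachedSum {
    std::vector<int> operands;
    int total = 0;

public:
    explicit CachedSum(std::vector<int> ops) : operands(std::move(ops)) {
        for (int v : operands) total += v;
    }

    // Factor out and cache undo(X) once for a run of simulations
    // that all vary the same operand i.
    int cacheUndo(std::size_t i) const { return total - operands[i]; }

    // Each simulated candidate value then needs only redo(newVal).
    static int simulate(int cachedUndo, int newVal) {
        return cachedUndo + newVal;
    }
};
```

For S = X + Y + Z with X = 1, Y = 5, Z = 6, the cached undo is 11, so simulating X = 2 and X = 3 yields 13 and 14 with one addition each; the swap case on the slide works the same way, factoring out the children dependent on the shared variable.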
Conclusions ◮ We introduced a lazy but efficient CBLS system. ◮ We plan to release it under open source license. ◮ We plan to further extend its functionalities. 25 / 26
Kangaroo vs Kangaroo ◮ Australian Software ◮ Australian Marsupial ◮ Explore Neighbor & Move ◮ Look around and hop ◮ Performs lazy propagation ◮ Usually very lazy ◮ Cache to hold results ◮ A pouch to hold kids ◮ Normally no backtracking ◮ Cannot move backward 26 / 26