Black Box Search By Unbiased Variation

  1. Black Box Search By Unbiased Variation
     Per Kristian Lehre, CERCIA, University of Birmingham, UK
     Carsten Witt, DTU Informatics, Copenhagen, Denmark
     ThRaSH - March 24th 2010

  2. State of the Art in Runtime Analysis of RSHs

     Problem              Algorithm      Runtime                               Reference
     OneMax               (1+1) EA       O(n log n)                            [Mühlenbein, 1992]
                          (1+λ) EA       O(λn + n log n)                       [Jansen et al., 2005]
                          (µ+1) EA       O(µn + n log n)                       [Witt, 2006]
                          1-ANT          O(n^2) w.h.p.                         [Neumann and Witt, 2006]
                          (µ+1) IA       O(µn + n log n)                       [Zarges, 2009]
     Linear Functions     (1+1) EA       Θ(n log n)                            [Droste et al., 2002], [He and Yao, 2003]
                          cGA            Θ(n^(2+ε)), ε > 0 const.              [Droste, 2006]
     Max. Matching        (1+1) EA       e^Ω(n), PRAS                          [Giel and Wegener, 2003]
     Sorting              (1+1) EA       Θ(n^2 log n)                          [Scharnow et al., 2002]
     SS Shortest Path     (1+1) EA       O(n^3 log(n w_max))                   [Baswana et al., 2009]
                          MO (1+1) EA    O(n^3)                                [Scharnow et al., 2002]
     MST                  (1+1) EA       Θ(m^2 log(n w_max))                   [Neumann and Wegener, 2007]
                          (1+λ) EA       O(nλ log(n w_max)), λ = ⌈m^2/n⌉       [Neumann and Wegener, 2007]
                          1-ANT          O(mn log(n w_max))                    [Neumann and Witt, 2008]
     Max. Clique          (1+1) EA       Θ(n^5)                                [Storch, 2006]
     (rand. planar)       (16n+1) RLS    Θ(n^(5/3))                            [Storch, 2006]
     Eulerian Cycle       (1+1) EA       Θ(m^2 log m)                          [Doerr et al., 2007]
     Partition            (1+1) EA       PRAS, avg.                            [Witt, 2005]
     Vertex Cover         (1+1) EA       e^Ω(n), arb. bad approx.              [Friedrich et al., 2007], [Oliveto et al., 2007a]
     Set Cover            (1+1) EA       e^Ω(n), arb. bad approx.              [Friedrich et al., 2007]
                          SEMO           Pol. O(log n)-approx.                 [Friedrich et al., 2007]
     Intersection of      (1+1) EA       1/p-approximation in                  [Reichel and Skutella, 2008]
     p ≥ 3 matroids                      O(|E|^(p+2) log(|E| w_max))
     UIO/FSM conf.        (1+1) EA       e^Ω(n)                                [Lehre and Yao, 2007]

     See survey [Oliveto et al., 2007b].

  3. Motivation - A Theory of Randomised Search Heuristics
     Computational Complexity
     ◮ Classification of problems according to inherent difficulty.
     ◮ Common limits on the efficiency of all algorithms.
     ◮ Assuming a particular model of computation.
     Computational Complexity of Search Problems
     ◮ Polynomial-time Local Search [Johnson et al., 1988].
     ◮ Black-Box Complexity [Droste et al., 2006].

  4.-5. Black Box Complexity
     [Figure: an algorithm A interacts with an unknown function f from a function class F through a black box, submitting query points x_1, x_2, x_3, ... and receiving the values f(x_1), f(x_2), f(x_3), ... Photo: E. Gerhard (1846).]
     [Droste et al., 2006]

  6. Black Box Complexity
     [Figure: after t queries x_1, ..., x_t the algorithm has observed f(x_1), ..., f(x_t). Photo: E. Gerhard (1846).]
     ◮ Black box complexity on function class F: T_F := min_A max_{f ∈ F} T_{A,f}
     [Droste et al., 2006]

  7. Results with old Model
     ◮ Very general model with few restrictions on resources.
     ◮ Example: Needle has BB complexity (2^n + 1)/2.
     ◮ Some NP-hard problems have polynomial BB complexity.
     ◮ Artificially low BB complexity on example functions, e.g.
       ◮ n/log(2n + 1) − 1 on OneMax
       ◮ n/2 − o(n) on LeadingOnes

  8.-13. Refined Black Box Model
     [Figure, built up over slides 8-13: the algorithm A queries points x_0, x_1, x_2, ... and the black box f (drawn from function class F) returns the values f(x_0), f(x_1), f(x_2), ...; the growing history of fitness values is the only information available to A. Photo: E. Gerhard (1846).]

  14. Refined Black Box Model
     [Figure: the same query/answer interaction between A and f. Photo: E. Gerhard (1846).]
     ◮ Unbiased black box complexity on function class F: T_F := min_A max_{f ∈ F} T_{A,f}

  15.-18. Unbiased Variation Operators
     Encoding of a solution by a bitstring x = x_1 x_2 x_3 x_4 x_5: each bit decides whether the corresponding element is part of the solution (e.g. x_2 = 1 means the blue element is in, x_4 = 1 means the orange element is in), and flipping a bit removes or inserts that element (flipping x_4 takes the orange element out).
     [Figure by Dake, available under a Creative Commons Attribution-Share Alike 2.5 Generic license.]
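     A minimal sketch (not from the slides) of this encoding idea: the bitstring is an indicator vector over a fixed set of candidate elements, and flipping a bit inserts or removes the corresponding element. The element names below are purely illustrative.

         # Hypothetical elements standing in for the coloured pieces in the figure.
         elements = ["red", "blue", "green", "orange", "purple"]

         def decode(x):
             """Return the subset selected by bitstring x (x[i] = 1 means element i is in)."""
             return [e for e, bit in zip(elements, x) if bit == 1]

         x = [0, 1, 0, 1, 0]
         print(decode(x))            # ['blue', 'orange']
         y = list(x); y[3] = 0       # flipping x_4 takes the orange element out
         print(decode(y))            # ['blue']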

  19. Unbiased Variation Operators p(y | x)
     For all bitstrings x, y, z and every permutation σ of the bit positions, we require
     1) p(y | x) = p(y ⊕ z | x ⊕ z)
     2) p(y | x) = p(y_σ(1) y_σ(2) ··· y_σ(n) | x_σ(1) x_σ(2) ··· x_σ(n))
     → We consider unary operators, but higher arities are possible.
     [Droste and Wiesmann, 2000, Rowe et al., 2007]
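     Standard bit mutation, which flips each bit independently with probability 1/n, is the textbook example of a unary operator satisfying both conditions: the set of flipped positions is chosen without looking at the contents of x, so the offspring distribution is invariant under XOR with any z and under any relabelling of the positions. A minimal Python sketch (our own illustration, not code from the talk):

         import random

         def standard_bit_mutation(x, p=None):
             """Unary unbiased variation: flip each bit of x independently with probability p (default 1/n)."""
             n = len(x)
             p = 1.0 / n if p is None else p
             return [bit ^ (random.random() < p) for bit in x]

         # Condition 1): the offspring differs from the parent in a random set of positions
         # whose distribution does not depend on the parent, so p(y | x) = p(y XOR z | x XOR z).
         # Condition 2): the flip probability is the same for every position, so any
         # permutation of the bit positions leaves the distribution unchanged.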

  20. Unbiased Variation Operators
     [Figure: search points x, y and x* with a Hamming distance r indicated.]
     Conditions 1) and 2) imply Hamming-invariance: p(y | x) depends on x and y only through their Hamming distance.
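     Concretely, for standard bit mutation with rate p the probability of producing a specific offspring y from x is p^d (1 − p)^(n − d) with d = H(x, y), so which bits differ is irrelevant. A small helper spelling this out (a consequence of the operator sketched above, not code from the slides):

         from math import comb

         def prob_specific_offspring(n, d, p):
             """p(y | x) for standard bit mutation when H(x, y) = d: depends only on d."""
             return p**d * (1 - p)**(n - d)

         def prob_offspring_at_distance(n, d, p):
             """Probability that the offspring lies at Hamming distance exactly d from the parent."""
             return comb(n, d) * p**d * (1 - p)**(n - d)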

  21. Unbiased Black-Box Algorithm Scheme
      1: t ← 0.
      2: Choose x(t) uniformly at random from {0, 1}^n.
      3: repeat t ← t + 1.
      4:    Compute f(x(t − 1)).
      5:    I(t) ← (f(x(0)), ..., f(x(t − 1))).
      6:    Depending on I(t), choose a probability distribution p_s on {0, ..., t − 1}.
      7:    Randomly choose an index j according to p_s.
      8:    Depending on I(t), choose an unbiased variation operator p_v(· | x(j)).
      9:    Randomly choose a bitstring x(t) according to p_v.
     11: until termination condition met.
     → Covers the (µ +, λ) EA, simulated annealing, the Metropolis algorithm, RLS, any population size, any selection mechanism, steady-state EAs, cellular EAs, rank-based mutation, ...
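     To make the scheme concrete, here is a minimal Python sketch (our own illustration) that instantiates it: the index distribution p_s puts all its mass on the best point queried so far and the variation operator is standard bit mutation, which yields a (1+1) EA-style algorithm within the scheme. The objective f and the termination condition are placeholders.

         import random

         def standard_bit_mutation(x):
             n = len(x)
             return [bit ^ (random.random() < 1.0 / n) for bit in x]

         def unbiased_black_box_scheme(f, n, max_queries=100_000):
             """Sketch of the scheme: keep the full history, pick a previous point
             (here: the best one seen so far), apply a unary unbiased operator to it."""
             history = [[random.randint(0, 1) for _ in range(n)]]     # x(0) uniform at random
             values = [f(history[0])]                                 # I(t): history of fitness values
             while values[-1] < n and len(values) < max_queries:      # toy termination condition
                 j = max(range(len(values)), key=values.__getitem__)  # p_s: point mass on the best index
                 x_new = standard_bit_mutation(history[j])            # unbiased variation of x(j)
                 history.append(x_new)
                 values.append(f(x_new))
             return max(values), len(values)

         # Example usage on OneMax:
         # best, queries = unbiased_black_box_scheme(lambda x: sum(x), n=50)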

  22.-24. Simple Unimodal Functions

     Algorithm    LeadingOnes
     (1+1) EA     Θ(n^2)
     (1+λ) EA     Θ(n^2 + λn)
     (µ+1) EA     Θ(n^2 + µn log n)
     BB           Ω(n)

     Theorem
     The expected runtime of any black box algorithm with unary, unbiased variation on LeadingOnes is Ω(n^2).

     Proof idea
     ◮ Potential between n/2 and 3n/4.
     ◮ Number of 0-bits flipped is hypergeometrically distributed.
     ◮ Lower bound by polynomial drift.
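     For reference, LeadingOnes(x) counts the number of leading 1-bits. A small sketch (our own illustration, not part of the proof) that runs the (1+1) EA on it; the observed query counts grow roughly quadratically in n, matching the Θ(n^2) row of the table.

         import random

         def leading_ones(x):
             """Number of consecutive 1-bits at the start of x."""
             count = 0
             for bit in x:
                 if bit == 0:
                     break
                 count += 1
             return count

         def one_plus_one_ea(f, n):
             """(1+1) EA with standard bit mutation; returns the number of queries to reach fitness n."""
             x = [random.randint(0, 1) for _ in range(n)]
             fx, queries = f(x), 1
             while fx < n:
                 y = [b ^ (random.random() < 1.0 / n) for b in x]
                 fy, queries = f(y), queries + 1
                 if fy >= fx:
                     x, fx = y, fy
             return queries

         # for n in (50, 100, 200):
         #     print(n, one_plus_one_ea(leading_ones, n))   # roughly quadratic growth in n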

  25.-27. Escaping from Local Optima

     [Figure: the Jump function plotted against the number of 1-bits |x|, with a gap of width m before the global optimum.]

     Theorem
     For any m ≤ n(1 − ε)/2 with 0 < ε < 1, the runtime of any black box algorithm with unary, unbiased variation is at least
     ◮ 2^(cm) with probability 1 − 2^(−Ω(m)),
     ◮ (n/(rm))^(cm) with probability 1 − 2^(−Ω(m ln(n/(rm)))).
     → These bounds are lower than the Θ(n^m) bound for the (1+1) EA!

     Proof idea
     ◮ Simplified drift in gaps.
       1. Expectation of the hypergeometric distribution.
       2. Chvátal's bound.
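     The Jump function rewards increasing the number of 1-bits up to n − m, then introduces a gap of width m that only a simultaneous flip of the remaining bits can cross. A minimal definition following the usual Droste/Jansen/Wegener convention (the exact variant used in the talk may differ slightly):

         def jump(x, m):
             """Jump_m: fitness grows with the number of 1-bits up to n - m and at the
             all-ones optimum, but drops inside the gap of width m just below the optimum."""
             n = len(x)
             ones = sum(x)
             if ones <= n - m or ones == n:
                 return m + ones
             return n - ones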

  28.-30. General Pseudo-boolean Functions

     Algorithm    OneMax
     (1+1) EA     Θ(n log n)
     (1+λ) EA     O(λn + n log n)
     (µ+1) EA     O(µn + n log n)
     BB           Ω(n/log n)

     Theorem
     The expected runtime of any black box search algorithm with unbiased, unary variation on any pseudo-boolean function with a single global optimum is Ω(n log n).

     Proof idea
     ◮ Expected multiplicative weight decrease.
     ◮ Chvátal's bound.
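     As a sanity check on the Θ(n log n) row (an empirical illustration only, not the lower-bound argument): the (1+1) EA on OneMax needs on average roughly e · n · ln n evaluations. A self-contained sketch:

         import math, random

         def one_plus_one_ea_onemax(n):
             """(1+1) EA on OneMax; returns the number of fitness evaluations until the optimum."""
             x = [random.randint(0, 1) for _ in range(n)]
             fx, queries = sum(x), 1
             while fx < n:
                 y = [b ^ (random.random() < 1.0 / n) for b in x]
                 fy, queries = sum(y), queries + 1
                 if fy >= fx:
                     x, fx = y, fy
             return queries

         for n in (100, 200, 400):
             avg = sum(one_plus_one_ea_onemax(n) for _ in range(10)) / 10
             print(n, round(avg), "~", round(math.e * n * math.log(n)))  # empirically close to e*n*ln(n)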

  31. Summary and Conclusion
     ◮ Refined black box model.
     ◮ Proofs are (relatively) easy!
     ◮ Comprises EAs never previously analysed.
     ◮ Ω(n log n) on general functions.
     ◮ Some bounds coincide with the runtime of the (1+1) EA.
     ◮ Future work: k-ary variation operators for k > 1.
