Genetic Algorithms [Read Chapter 9] [Exercises 9.1, 9.2, 9.3, 9.4]

Lecture slides for the textbook Machine Learning, T. Mitchell, McGraw Hill, 1997.

  1. Genetic Algorithms [Read Chapter 9] [Exercises 9.1, 9.2, 9.3, 9.4]
- Evolutionary computation
- Prototypical GA
- An example: GABIL
- Genetic Programming
- Individual learning and population evolution

  2. Evolutionary Computation
1. Computational procedures patterned after biological evolution
2. Search procedures that probabilistically apply search operators to a set of points in the search space

  3. Biological Evolution
Lamarck and others:
- Species "transmute" over time

Darwin and Wallace:
- Consistent, heritable variation among individuals in a population
- Natural selection of the fittest

Mendel and genetics:
- A mechanism for inheriting traits
- genotype → phenotype mapping

  4. GA(Fitness, Fitness_threshold, p, r, m)
- Initialize: P ← p random hypotheses
- Evaluate: for each h in P, compute Fitness(h)
- While [max_h Fitness(h)] < Fitness_threshold:
  1. Select: probabilistically select (1 - r)p members of P to add to P_s, where
     \Pr(h_i) = \frac{Fitness(h_i)}{\sum_{j=1}^{p} Fitness(h_j)}
  2. Crossover: probabilistically select (r \cdot p)/2 pairs of hypotheses from P. For each pair <h_1, h_2>, produce two offspring by applying the Crossover operator. Add all offspring to P_s.
  3. Mutate: invert a randomly selected bit in m \cdot p random members of P_s
  4. Update: P ← P_s
  5. Evaluate: for each h in P, compute Fitness(h)
- Return the hypothesis from P that has the highest fitness.
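A minimal Python sketch of this loop, assuming the caller supplies `random_hypothesis`, `crossover`, and `mutate`; these helpers and the list-based population representation are assumptions, not part of the slide:

```python
import random

def ga(fitness, fitness_threshold, p, r, m, random_hypothesis, crossover, mutate):
    # Initialize: P <- p random hypotheses
    P = [random_hypothesis() for _ in range(p)]
    while max(fitness(h) for h in P) < fitness_threshold:
        scores = [fitness(h) for h in P]
        # Select: keep (1-r)*p members, Pr(h_i) = Fitness(h_i) / sum_j Fitness(h_j)
        P_s = random.choices(P, weights=scores, k=round((1 - r) * p))
        # Crossover: r*p/2 pairs, each producing two offspring
        for _ in range(round(r * p / 2)):
            h1, h2 = random.choices(P, weights=scores, k=2)
            P_s.extend(crossover(h1, h2))
        # Mutate: change a random bit in m*p randomly chosen members of P_s
        for i in random.sample(range(len(P_s)), round(m * p)):
            P_s[i] = mutate(P_s[i])
        P = P_s  # Update
    return max(P, key=fitness)
```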

  5. Representing Hypotheses
Represent (Outlook = Overcast ∨ Rain) ∧ (Wind = Strong) by

    Outlook  Wind
      011     10

Represent IF Wind = Strong THEN PlayTennis = yes by

    Outlook  Wind  PlayTennis
      111     10       10
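A small illustration of this bit-per-value encoding; the attribute value orderings below are assumptions chosen to reproduce the slide's strings:

```python
OUTLOOK = ["Sunny", "Overcast", "Rain"]   # assumed value order
WIND = ["Strong", "Weak"]                 # assumed value order

def encode(allowed, values):
    # One bit per attribute value: 1 means the value is allowed
    return "".join("1" if v in allowed else "0" for v in values)

# (Outlook = Overcast v Rain) ^ (Wind = Strong)  ->  "011 10"
print(encode({"Overcast", "Rain"}, OUTLOOK), encode({"Strong"}, WIND))
```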

  6. Operators for Genetic Algorithms
Initial strings (the same two parents throughout):
    11101001000    00001010101

Single-point crossover:
    Crossover mask:  11111000000
    Offspring:       11101010101    00001001000

Two-point crossover:
    Crossover mask:  00111110000
    Offspring:       11001011000    00101000101

Uniform crossover:
    Crossover mask:  10011010011
    Offspring:       10001000100    01101011001

Point mutation:
    11101001000  ->  11101011000
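All three crossover forms can be expressed as one mask-application step; a sketch, assuming the convention (taken from the single-point example) that offspring 1 copies parent 1 where the mask is 1:

```python
def crossover_with_mask(p1, p2, mask):
    # Offspring 1 copies p1 where the mask bit is 1, p2 where it is 0;
    # offspring 2 is the complement.
    o1 = "".join(a if m == "1" else b for a, b, m in zip(p1, p2, mask))
    o2 = "".join(b if m == "1" else a for a, b, m in zip(p1, p2, mask))
    return o1, o2

# Single-point example from the slide
print(crossover_with_mask("11101001000", "00001010101", "11111000000"))
# -> ('11101010101', '00001001000')
```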

  7. Selecting Most Fit Hypotheses
Fitness proportionate selection:
    \Pr(h_i) = \frac{Fitness(h_i)}{\sum_{j=1}^{p} Fitness(h_j)}
... can lead to crowding

Tournament selection:
- Pick h_1, h_2 at random with uniform probability
- With probability p, select the more fit

Rank selection:
- Sort all hypotheses by fitness
- Probability of selection is proportional to rank
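Sketches of the three schemes in Python; `p_win` plays the role of the tournament probability p above, and its default value 0.8 is an arbitrary assumption:

```python
import random

def fitness_proportionate(P, fitness):
    # Pr(h_i) proportional to Fitness(h_i)
    return random.choices(P, weights=[fitness(h) for h in P], k=1)[0]

def tournament(P, fitness, p_win=0.8):
    # Pick two at random; with probability p_win keep the fitter one
    h1, h2 = random.sample(P, 2)
    fitter, other = (h1, h2) if fitness(h1) >= fitness(h2) else (h2, h1)
    return fitter if random.random() < p_win else other

def rank_selection(P, fitness):
    # Selection probability proportional to rank (1 = least fit)
    ranked = sorted(P, key=fitness)
    return random.choices(ranked, weights=range(1, len(P) + 1), k=1)[0]
```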

  8. GABIL [DeJong et al. 1993]
Learn disjunctive sets of propositional rules; competitive with C4.5.

Fitness: Fitness(h) = (correct(h))^2

Representation:
    IF a_1 = T ∧ a_2 = F THEN c = T; IF a_2 = T THEN c = F
is represented by
    a1 a2 c    a1 a2 c
    10 01 1    11 10 0

Genetic operators: ???
- want variable-length rule sets
- want only well-formed bitstring hypotheses
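A sketch of how such a rule-set bitstring could be interpreted. Two assumptions beyond the slide: within each attribute's two bits, the true-value bit comes first (consistent with the example above), and the first matching rule fires:

```python
def matches(attr_bits, value):
    # Two bits per boolean attribute: first bit allows True, second allows False
    # (bit order assumed from the slide's example)
    return attr_bits[0] == "1" if value else attr_bits[1] == "1"

def classify(ruleset, a1, a2):
    # First matching rule fires (an assumption; GABIL's details may differ)
    for b1, b2, c in ruleset:
        if matches(b1, a1) and matches(b2, a2):
            return c == "1"
    return False

# "10 01 1  11 10 0" = IF a1=T ^ a2=F THEN c=T; IF a2=T THEN c=F
ruleset = [("10", "01", "1"), ("11", "10", "0")]
print(classify(ruleset, True, False))  # True: the first rule fires
```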

  9. Crossover with Variable-Length Bitstrings
Start with
         a1 a2 c    a1 a2 c
    h_1: 10 01 1    11 10 0
    h_2: 01 11 0    10 01 0

1. Choose crossover points for h_1, e.g., after bits 1, 8.
2. Now restrict the points in h_2 to those that produce bitstrings with well-defined semantics, e.g., <1, 3>, <1, 8>, <6, 8>.

If we choose <1, 3>, the result is
         a1 a2 c
    h_3: 11 10 0
         a1 a2 c    a1 a2 c    a1 a2 c
    h_4: 00 01 1    11 11 0    10 01 0
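Under one reading of step 2, the restriction is that the cut positions in h_2 must fall at the same offsets within a rule as the cuts chosen in h_1. A sketch under that assumption, reproducing the slide's example (5-bit rules, h_2 of length 10, cuts after bits 1 and 8 giving within-rule offsets 1 and 3):

```python
def allowed_points(d1, d2, rule_len, h2_len):
    # Legal cut pairs in h2: each cut must sit at the same within-rule
    # offset as the corresponding cut chosen in h1
    firsts = range(d1, h2_len, rule_len)
    seconds = range(d2, h2_len, rule_len)
    return [(i, j) for i in firsts for j in seconds if i <= j]

print(allowed_points(1, 3, 5, 10))  # [(1, 3), (1, 8), (6, 8)]
```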

  10. GABIL Extensions
Add new genetic operators, also applied probabilistically:
1. AddAlternative: generalize the constraint on a_i by changing a 0 to 1
2. DropCondition: generalize the constraint on a_i by changing every 0 to 1

And add new fields to the bitstring to determine whether to allow these:
    a1 a2 c    a1 a2 c    AA DC
    01 11 0    10 01 0     1  0

So now the learning strategy also evolves!
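Both operators are simple bit manipulations on one attribute's substring; a sketch:

```python
import random

def add_alternative(attr_bits):
    # Generalize one constraint: flip a single randomly chosen 0 to 1
    zeros = [i for i, b in enumerate(attr_bits) if b == "0"]
    if not zeros:
        return attr_bits
    i = random.choice(zeros)
    return attr_bits[:i] + "1" + attr_bits[i + 1:]

def drop_condition(attr_bits):
    # Drop the constraint entirely: set every bit to 1 ("don't care")
    return "1" * len(attr_bits)
```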

  11. GABIL Results
Performance of GABIL is comparable to symbolic rule/tree learning methods such as C4.5, ID5R, and AQ14.

Average performance on a set of 12 synthetic problems:
- GABIL without AA and DC operators: 92.1% accuracy
- GABIL with AA and DC operators: 95.2% accuracy
- symbolic learning methods ranged from 91.2% to 96.6%

  12. Schemas
How to characterize the evolution of a population in a GA?

Schema = string containing 0, 1, * ("don't care")
- Typical schema: 10**0*
- Instances of the above schema: 101101, 100000, ...

Characterize the population by the number of instances representing each possible schema:
- m(s, t) = number of instances of schema s in the population at time t
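Checking schema membership and counting m(s, t) in Python:

```python
def is_instance(bitstring, schema):
    # A string instantiates a schema if it agrees on every non-'*' position
    return all(s in ("*", b) for b, s in zip(bitstring, schema))

def m(schema, population):
    # m(s, t): number of instances of schema s in the population at time t
    return sum(is_instance(h, schema) for h in population)

print(is_instance("101101", "10**0*"))  # True, per the slide's example
```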

  13. Consider Just Selection
- \bar{f}(t) = average fitness of the population at time t
- m(s, t) = number of instances of schema s in the population at time t
- \hat{u}(s, t) = average fitness of the instances of s at time t

Probability of selecting h in one selection step:
    \Pr(h) = \frac{f(h)}{\sum_{i=1}^{n} f(h_i)} = \frac{f(h)}{n \bar{f}(t)}

Probability of selecting an instance of s in one step:
    \Pr(h \in s) = \sum_{h \in s \cap p_t} \frac{f(h)}{n \bar{f}(t)} = \frac{\hat{u}(s, t)}{n \bar{f}(t)} m(s, t)

Expected number of instances of s after n selections:
    E[m(s, t+1)] = \frac{\hat{u}(s, t)}{\bar{f}(t)} m(s, t)
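A numeric sanity check of the last line on a small hypothetical population (all fitness values assumed for illustration):

```python
# Hypothetical 4-member population; schema s = 11*
fit = {"110": 4.0, "111": 2.0, "001": 1.0, "000": 1.0}
instances = [h for h in fit if h.startswith("11")]        # instances of s
f_bar = sum(fit.values()) / len(fit)                      # average fitness: 2.0
u_hat = sum(fit[h] for h in instances) / len(instances)   # avg fitness of s: 3.0
m_st = len(instances)                                     # m(s, t) = 2
print(u_hat / f_bar * m_st)  # E[m(s, t+1)] = 3.0 under selection alone
```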

  14. Schema Theorem
    E[m(s, t+1)] \ge \frac{\hat{u}(s, t)}{\bar{f}(t)} m(s, t) \left(1 - p_c \frac{d(s)}{l - 1}\right) (1 - p_m)^{o(s)}

- m(s, t) = number of instances of schema s in the population at time t
- \bar{f}(t) = average fitness of the population at time t
- \hat{u}(s, t) = average fitness of the instances of s at time t
- p_c = probability of the single-point crossover operator
- p_m = probability of the mutation operator
- l = length of the individual bit strings
- o(s) = number of defined (non-"*") bits in s
- d(s) = distance between the leftmost and rightmost defined bits in s
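Plugging the same hypothetical numbers into the full bound, with assumed operator rates, shows how crossover and mutation discount the selection-only expectation:

```python
# Assumed values: schema s = 11* over 3-bit strings, so o(s)=2, d(s)=1, l=3
m_st, u_hat, f_bar = 2, 3.0, 2.0   # from the assumed population above
p_c, p_m = 0.6, 0.001              # assumed operator probabilities
o_s, d_s, l = 2, 1, 3
bound = (u_hat / f_bar) * m_st * (1 - p_c * d_s / (l - 1)) * (1 - p_m) ** o_s
print(f"E[m(s,t+1)] >= {bound:.3f}")  # ~2.096, below the selection-only 3.0
```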

  15. Genetic Programming
Population of programs represented by trees, e.g., for

    \sin(x) + \sqrt{x^2 + y}

[Figure: the expression tree for this program: a + root with a sin(x) subtree and a sqrt(x^2 + y) subtree]
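A minimal tree representation and evaluator for programs like this one, using nested tuples; the particular function set is an assumption for illustration:

```python
import math

def evaluate(tree, env):
    if isinstance(tree, str):
        return env[tree]            # variable leaf
    if isinstance(tree, (int, float)):
        return tree                 # constant leaf
    op, *args = tree                # internal node: (operator, children...)
    vals = [evaluate(a, env) for a in args]
    return {"+": lambda: vals[0] + vals[1],
            "*": lambda: vals[0] * vals[1],
            "^": lambda: vals[0] ** vals[1],
            "sin": lambda: math.sin(vals[0]),
            "sqrt": lambda: math.sqrt(vals[0])}[op]()

# The slide's program: sin(x) + sqrt(x^2 + y)
tree = ("+", ("sin", "x"), ("sqrt", ("+", ("^", "x", 2), "y")))
print(evaluate(tree, {"x": 1.0, "y": 3.0}))  # sin(1) + 2
```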

  16. Crossover
[Figure: two parent program trees built from +, sin, ^, x, y, 2; crossover swaps randomly chosen subtrees between the parents, yielding two offspring trees]
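A sketch of subtree-swap crossover over the tuple representation above:

```python
import random

def subtrees(tree, path=()):
    # Enumerate (path, subtree) pairs; a path is a tuple of child indices
    yield path, tree
    if isinstance(tree, tuple):
        for i, child in enumerate(tree[1:], start=1):
            yield from subtrees(child, path + (i,))

def replace(tree, path, new):
    # Rebuild the tree with the subtree at `path` replaced by `new`
    if not path:
        return new
    i = path[0]
    return tree[:i] + (replace(tree[i], path[1:], new),) + tree[i + 1:]

def gp_crossover(t1, t2):
    # Swap randomly chosen subtrees between two parents
    p1, s1 = random.choice(list(subtrees(t1)))
    p2, s2 = random.choice(list(subtrees(t2)))
    return replace(t1, p1, s2), replace(t2, p2, s1)
```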

  17. Block Problem
[Figure: blocks labeled u, n, i, v, e, r, s, a, l, some on the table and some stacked]

Goal: spell UNIVERSAL

Terminals:
- CS ("current stack") = name of the top block on the stack, or F.
- TB ("top correct block") = name of the topmost correct block on the stack
- NN ("next necessary") = name of the next block needed above TB in the stack
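A sketch of the three terminals. The state conventions are assumptions, not from the slide: the stack is a list of block letters from bottom to top, and the goal word is matched from the bottom up:

```python
GOAL = "universal"

def _correct_prefix(stack):
    # Length of the stack prefix that already matches the goal word
    n = 0
    while n < len(stack) and stack[n] == GOAL[n]:
        n += 1
    return n

def CS(stack):
    # 'current stack': top block on the stack, or False if empty
    return stack[-1] if stack else False

def TB(stack):
    # 'top correct block': topmost block with everything below it correct
    n = _correct_prefix(stack)
    return stack[n - 1] if n else False

def NN(stack):
    # 'next necessary': next block needed above TB, or False if done
    n = _correct_prefix(stack)
    return GOAL[n] if n < len(GOAL) else False

print(TB(list("unX")), NN(list("unX")))  # 'n' 'i'
```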
