ARTIFICIAL NEURAL NETWORKS DESIGN USING EVOLUTIONARY ALGORITHMS - PowerPoint PPT Presentation

  1. ARTIFICIAL NEURAL NETWORKS DESIGN USING EVOLUTIONARY ALGORITHMS
     Pedro Ángel Castillo Valdivieso
     Department of Architecture and Computer Technology, University of Granada
     September 2002

  2. Dealing with ANNs
     • Powerful methods commonly used in several areas:
       - pattern recognition problems
       - function approximation and prediction problems
     • Problems when using ANNs:
       - structure in layers
       - initial weights
       - learning parameters
       - training the net

  3. Dealing with ANNs
     • Incremental / decremental methods:
       - cascade correlation
       - tiling algorithm
       - OBD (Optimal Brain Damage)
       - OBS (Optimal Brain Surgeon)
     • Gradient descent:
       - back-propagation
       - quick-propagation
       - resilient propagation
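
A minimal sketch of the gradient-descent family listed above: plain back-propagation on a tiny 2-3-1 MLP learning XOR. The layer sizes, learning rate and epoch count are illustrative choices, not values taken from the presentation.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.uniform(-0.5, 0.5, (2, 3))    # input -> hidden weights
b1 = np.zeros((1, 3))
W2 = rng.uniform(-0.5, 0.5, (3, 1))    # hidden -> output weights
b2 = np.zeros((1, 1))
eta = 0.5                              # learning rate

for epoch in range(10000):
    # forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # backward pass: gradients of the squared error
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # gradient-descent update (back-propagation)
    W2 -= eta * h.T @ d_out
    b2 -= eta * d_out.sum(axis=0, keepdims=True)
    W1 -= eta * X.T @ d_h
    b1 -= eta * d_h.sum(axis=0, keepdims=True)

print(out.round(2))   # typically approaches [[0], [1], [1], [0]]
```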

  4. ANN design using EA
     Which kind of ANN is commonly evolved?
     • A generic ANN, regardless of its architecture
       + avoids restricting the search to a specific area
       - requires deciding on the coding, the representation and the genetic operators
     • A prefixed architecture that is easy to evolve
       + some previous knowledge about the problem is available
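
An illustrative sketch (not from the slides) of the two genome styles this trade-off refers to: a generic genome that encodes an arbitrary connection graph, and a prefixed-architecture genome that only encodes the weights of a fixed topology. The class names and the 2-3-1 layout are assumptions made for the example.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class GenericGenome:
    """Arbitrary topology: every connection is listed explicitly."""
    n_neurons: int
    connections: List[Tuple[int, int, float]] = field(default_factory=list)  # (src, dst, weight)

@dataclass
class FixedGenome:
    """Prefixed 2-3-1 MLP: only the weight values evolve."""
    weights: List[float] = field(default_factory=lambda: [0.0] * (2 * 3 + 3 * 1))

# A generic genome needs operators that respect the graph structure,
# whereas the fixed genome can reuse standard real-valued operators.
g = GenericGenome(n_neurons=4, connections=[(0, 2, 0.3), (1, 2, -1.1), (2, 3, 0.7)])
f = FixedGenome()
print(len(g.connections), len(f.weights))
```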

  5. ANN design using EA
     Choosing the coding and representation
     • Representation: binary vs. real
       Binary: + simplicity, classical mutation operators can be used
               - a balance between precision and the length of individuals must be found
       Real:   + more precise
               - specific mutation and crossover operators are needed
     • Coding: direct vs. indirect
       Direct:   + easy to implement
                 - lack of scalability
       Indirect: + compact representations
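
A small sketch of the representation trade-off: the same weight stored as a real number and as a fixed-point binary string. The weight range [-4, +4] and the bit count are illustrative assumptions.

```python
LOW, HIGH, BITS = -4.0, 4.0, 12

def encode_binary(w: float) -> str:
    """Map a weight in [LOW, HIGH] to a BITS-bit string."""
    levels = 2 ** BITS - 1
    idx = round((w - LOW) / (HIGH - LOW) * levels)
    return format(idx, f"0{BITS}b")

def decode_binary(bits: str) -> float:
    """Map a bit string back to a (quantized) weight value."""
    levels = 2 ** BITS - 1
    return LOW + int(bits, 2) / levels * (HIGH - LOW)

w = 0.731
bits = encode_binary(w)
print(bits, decode_binary(bits))   # more bits -> more precision, but longer individuals
```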

  6. ANN design using EA
     Evolving connection weights
     • Weight initialization: random initialization in [-4, +4] or [-0.5, +0.5], depending on the problem
     • Training the weights: descent-based algorithms vs. training the weights using an EA
       Descent-based: + faster
                      - can only find the local optimum
       EA:            + less sensitive to initial conditions
                      - slower
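
A minimal sketch of the EA alternative, under assumptions not stated in the slides: a fixed 2-3-1 MLP, XOR as the task, and a simple elitist EA with Gaussian mutation evolving the flat weight vector instead of training it by gradient descent.

```python
import numpy as np

rng = np.random.default_rng(1)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0], dtype=float)
N_W = 2 * 3 + 3 + 3 * 1 + 1          # weights + biases of a 2-3-1 MLP

def mse(genome):
    """Decode the flat genome into the MLP and return its mean squared error."""
    W1 = genome[:6].reshape(2, 3); b1 = genome[6:9]
    W2 = genome[9:12].reshape(3, 1); b2 = genome[12]
    h = np.tanh(X @ W1 + b1)
    out = 1.0 / (1.0 + np.exp(-(h @ W2).ravel() - b2))
    return np.mean((out - y) ** 2)

# initial population: random weights in [-0.5, +0.5], as suggested above
pop = rng.uniform(-0.5, 0.5, (30, N_W))
for gen in range(400):
    fitness = np.array([mse(g) for g in pop])
    parents = pop[np.argsort(fitness)[:10]]                    # keep the 10 best (elitism)
    children = parents.repeat(2, axis=0) + rng.normal(0, 0.2, (20, N_W))
    pop = np.vstack([parents, children])                       # next generation: 10 + 20

best = pop[np.argmin([mse(g) for g in pop])]
print("best MSE:", mse(best))      # typically drops well below the 0.25 baseline
```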

  7. ANN design using EA
     Evolving the network architecture
     This problem can be solved more easily using an EA than using incremental or pruning methods, because:
     • the search space is infinitely large
     • the error function surface is not differentiable
     • similar architectures can have different abilities
     • different architectures can have similar abilities
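
An illustrative sketch (not the presentation's method) of evolving an architecture description: each genome is a list of hidden-layer sizes, fitness is the validation error after a short training run plus a small size penalty, and scikit-learn's MLPClassifier is used only as a convenient stand-in trainer. The task, the penalty weight and all parameter values are assumptions.

```python
import random
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = random.Random(2)
X = np.random.default_rng(2).uniform(-1, 1, (200, 2))
y = (X[:, 0] * X[:, 1] > 0).astype(int)              # simple non-linear toy task

def fitness(layers):
    """Train briefly, then score: validation error plus a penalty on net size."""
    net = MLPClassifier(hidden_layer_sizes=tuple(layers), max_iter=300, random_state=0)
    net.fit(X[:150], y[:150])
    err = 1.0 - net.score(X[150:], y[150:])
    return err + 0.001 * sum(layers)

def mutate(layers):
    """Architecture mutation: add, remove, or resize a hidden layer."""
    layers = list(layers)
    op = rng.choice(["grow", "shrink", "resize"])
    if op == "grow":
        layers.insert(rng.randrange(len(layers) + 1), rng.randint(2, 10))
    elif op == "shrink" and len(layers) > 1:
        layers.pop(rng.randrange(len(layers)))
    else:
        i = rng.randrange(len(layers))
        layers[i] = max(2, layers[i] + rng.choice([-2, -1, 1, 2]))
    return layers

pop = [[rng.randint(2, 10)] for _ in range(6)]
for gen in range(10):
    pop.sort(key=fitness)
    pop = pop[:3] + [mutate(p) for p in pop[:3]]      # keep the best, mutate them
print("best architecture:", min(pop, key=fitness))
```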

  8. ANN design using EA
     Evolving the learning rule
     • Searching for the learning algorithm parameters: e.g. the back-propagation learning parameters
     • Evolving the learning rule itself: generating a learning rule suited to the problem at hand
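
A sketch of the first option above, under an assumed setup: each genome is a (learning rate, momentum) pair, and its fitness is the error reached after a fixed, short run of gradient descent with momentum on a small synthetic task.

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(size=(100, 3))
y = (X @ np.array([1.5, -2.0, 0.5]) > 0).astype(float)   # linearly separable toy task

def fitness(lr, momentum, steps=50):
    """Squared error of a logistic unit after a short momentum-based training run."""
    w = np.zeros(3); v = np.zeros(3)
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-(X @ w)))
        grad = X.T @ ((p - y) * p * (1 - p)) / len(y)      # gradient of the squared error
        v = momentum * v - lr * grad                       # momentum update
        w = w + v
    p = 1.0 / (1.0 + np.exp(-(X @ w)))
    return np.mean((p - y) ** 2)                           # lower is better

pop = [(rng.uniform(0.01, 1.0), rng.uniform(0.0, 0.95)) for _ in range(12)]
for gen in range(20):
    pop.sort(key=lambda g: fitness(*g))
    best = pop[:4]
    # children: Gaussian perturbation of the best parameter pairs
    children = [(abs(b[0] + rng.normal(0, 0.05)), min(0.99, abs(b[1] + rng.normal(0, 0.05))))
                for b in best for _ in range(2)]
    pop = best + children
print("evolved (learning rate, momentum):", pop[0])
```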

  9. Specific genetic operators
     • Mutation is used...
       - to tune solutions, if small changes are applied, or
       - to change the area of the search space being explored, if local search operators are used
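
A sketch of the two mutation roles described above (the sigma, rate and range values are illustrative): a small Gaussian perturbation fine-tunes a weight vector, while re-initializing part of it moves the search to another region.

```python
import numpy as np

rng = np.random.default_rng(4)

def mutate_fine(genome, sigma=0.05):
    """Small change: tune the current solution."""
    return genome + rng.normal(0.0, sigma, genome.shape)

def mutate_jump(genome, rate=0.3, low=-4.0, high=4.0):
    """Large change: re-initialize a fraction of the weights to explore a new area."""
    mask = rng.random(genome.shape) < rate
    return np.where(mask, rng.uniform(low, high, genome.shape), genome)

weights = rng.uniform(-0.5, 0.5, 10)
print(mutate_fine(weights))
print(mutate_jump(weights))
```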

  10. Specific genetic operators
      • Crossover is used...
        - to recombine useful parts of the individuals in the population, or
        - to make changes similar to those of the mutation operator
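
A sketch of recombining "useful parts", under an assumed layout rather than the presentation's exact operator: for a 2-3-1 MLP, each hidden unit owns one column of W1 and one row of W2, and the child inherits each whole unit from one parent or the other.

```python
import numpy as np

rng = np.random.default_rng(5)

def crossover(parent_a, parent_b):
    """Unit-level crossover: inherit each hidden unit from parent A or parent B."""
    (W1a, W2a), (W1b, W2b) = parent_a, parent_b
    W1c, W2c = W1a.copy(), W2a.copy()
    for h in range(W1a.shape[1]):                 # one decision per hidden unit
        if rng.random() < 0.5:
            W1c[:, h] = W1b[:, h]                 # incoming weights of unit h
            W2c[h, :] = W2b[h, :]                 # outgoing weights of unit h
    return W1c, W2c

a = (rng.uniform(-0.5, 0.5, (2, 3)), rng.uniform(-0.5, 0.5, (3, 1)))
b = (rng.uniform(-0.5, 0.5, (2, 3)), rng.uniform(-0.5, 0.5, (3, 1)))
child = crossover(a, b)
print(child[0], child[1], sep="\n")
```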

  11. Specific genetic operators
      • Incremental operators: start with small networks and grow them by adding new units (nets that are too small may have difficulty learning)
      • Decremental operators: remove hidden units to obtain smaller nets, avoiding the problem of overfitting
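
A sketch (illustrative shapes, not the author's code) of incremental and decremental operators on a single-hidden-layer MLP: a unit is added by appending a new column to W1 and a new row to W2, and removed by deleting them.

```python
import numpy as np

rng = np.random.default_rng(6)

def add_unit(W1, W2, low=-0.5, high=0.5):
    """Incremental operator: append one freshly initialized hidden unit."""
    new_in = rng.uniform(low, high, (W1.shape[0], 1))
    new_out = rng.uniform(low, high, (1, W2.shape[1]))
    return np.hstack([W1, new_in]), np.vstack([W2, new_out])

def remove_unit(W1, W2):
    """Decremental operator: delete a randomly chosen hidden unit."""
    if W1.shape[1] <= 1:
        return W1, W2                               # keep at least one hidden unit
    h = rng.integers(W1.shape[1])
    return np.delete(W1, h, axis=1), np.delete(W2, h, axis=0)

W1 = rng.uniform(-0.5, 0.5, (2, 3))                 # 2 inputs, 3 hidden units
W2 = rng.uniform(-0.5, 0.5, (3, 1))                 # 3 hidden units, 1 output
W1, W2 = add_unit(W1, W2)
print(W1.shape, W2.shape)                           # (2, 4) (4, 1)
W1, W2 = remove_unit(W1, W2)
print(W1.shape, W2.shape)                           # (2, 3) (3, 1)
```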

  12. Specific genetic operators
      • Local search: using local search algorithms as genetic operators...
        + increases training speed
        - reduces the diversity of the population
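
A sketch of local search used as a genetic operator, in a Lamarckian style (an assumption about the setup, not the author's exact scheme): a few back-propagation steps are applied to an individual and the improved weights are written back into the genome before it re-enters the population.

```python
import numpy as np

rng = np.random.default_rng(7)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def local_search(W1, W2, steps=20, eta=0.5):
    """Genetic operator: improve one individual with a short back-propagation run."""
    for _ in range(steps):
        h = sigmoid(X @ W1)
        out = sigmoid(h @ W2)
        d_out = (out - y) * out * (1 - out)
        d_h = (d_out @ W2.T) * h * (1 - h)
        W2 = W2 - eta * h.T @ d_out
        W1 = W1 - eta * X.T @ d_h
    return W1, W2, float(np.mean((out - y) ** 2))

W1 = rng.uniform(-0.5, 0.5, (2, 3))
W2 = rng.uniform(-0.5, 0.5, (3, 1))
W1, W2, err = local_search(W1, W2)
print("error after local search:", err)
```

Applying this operator to every individual speeds up training, but because all individuals descend toward nearby optima it also reduces the diversity of the population, as the slide notes.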

  13. Neuro-genetic software
      • INANNA: flexible and extensible, written in C++; needs SNNS; lacks documentation
      • ENZO: only MLP; not easily extensible; needs SNNS; good documentation
      • EO+G-Prop: flexible and extensible, written in C++; handles MLP and RBF; good documentation

  14. Conclusions
      • A method that establishes all the parameters of an MLP
      • Searches for the topology
      • Optimizes both the precision and the size of the nets
      Work in progress
      • Develop new genetic operators
      • Apply the method to optimize other ANNs
      • Parallel and distributed version of the EO library
