Automatic generation of neural network architecture using evolutionary computation

Bibliographic Details
Main Author: Vonk, E.
Other Authors: Jain, L. C.; Johnson, R. P. (Ray P.)
Corporate Author: World Scientific (Firm)
Published: World Scientific Pub. Co.
Publisher Address: Singapore; River Edge, N.J.
Publication Date: 1997
Literature type: eBook
Language: English
Series: Advances in fuzzy systems--Applications and theory ; v. 14
Online Access: http://www.worldscientific.com/worldscibooks/10.1142/3449#t=toc
Summary: This book describes the application of evolutionary computation to the automatic generation of neural network architectures. The architecture has a significant influence on the performance of a neural network, and the usual practice is to find a suitable architecture for a given problem by trial and error, a process that is not only time-consuming but may not produce an optimal network. The use of evolutionary computation is a step towards automating neural network architecture generation. An overview of the field of evolutionary computation is presented, together with the biological background that inspired the field. The most commonly used approaches to a mathematical foundation of genetic algorithms are given, as well as an overview of the hybridisation of evolutionary computation and neural networks. Experiments on automatic neural network generation, one implementation using genetic programming and one using genetic algorithms, are described, and the efficacy of a genetic algorithm as a learning algorithm for a feedforward neural network is also investigated.
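The summary mentions investigating a genetic algorithm as the learning algorithm for a feedforward neural network (the topic of chapter 9). As a rough illustration only, and not the book's actual implementation, a minimal elitist GA evolving the nine weights of a fixed 2-2-1 network on the XOR problem might look like this (network size, population size, mutation rate, and operators are all illustrative assumptions):

```python
import math
import random

random.seed(0)

# A tiny fixed-architecture feedforward net: 2 inputs -> 2 tanh hidden units
# -> 1 linear output. Its 9 weights (including biases) form the GA chromosome.
def predict(w, x):
    h0 = math.tanh(w[0] * x[0] + w[1] * x[1] + w[2])
    h1 = math.tanh(w[3] * x[0] + w[4] * x[1] + w[5])
    return w[6] * h0 + w[7] * h1 + w[8]

XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

def error(w):
    """Sum-of-squares error over the XOR patterns (lower is fitter)."""
    return sum((predict(w, x) - y) ** 2 for x, y in XOR)

def evolve(pop, gens=150, mut_rate=0.3):
    """Elitist GA: keep the better half each generation, refill the
    population via one-point crossover plus per-gene Gaussian mutation."""
    for _ in range(gens):
        pop.sort(key=error)
        parents = pop[: len(pop) // 2]          # elitism: survivors kept as-is
        children = []
        while len(parents) + len(children) < len(pop):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, len(a))
            child = a[:cut] + b[cut:]           # one-point crossover
            child = [g + random.gauss(0, 0.3) if random.random() < mut_rate else g
                     for g in child]            # mutation
            children.append(child)
        pop = parents + children
    return min(pop, key=error)

pop = [[random.uniform(-1, 1) for _ in range(9)] for _ in range(40)]
start_err = min(error(w) for w in pop)
best = evolve(pop)
```

Because the survivors are carried over unchanged, the best error is non-increasing across generations, so `error(best)` never exceeds `start_err`. The book's chapter 10 goes further and evolves the architecture itself via grammar encoding rather than fixing it in advance.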
Carrier Form: 1 online resource (x, 182 pages) : illustrations.
Bibliography: Includes bibliographical references (pages 173-179) and index.
ISBN: 9789814366441
Index Number: QA76
CLC: TP183
Contents: 1. Introduction -- 2. Artificial neural networks. 2.1. Introduction. 2.2. Basic types of neural networks. 2.3. Conclusion -- 3. Evolutionary computation. 3.1. Genetic algorithms (GAs). 3.2. Genetic programming (GP). 3.3. Evolutionary algorithms (EAs) -- 4. The biological background. 4.1. Genetic structures. 4.2. Reproduction. 4.3. Mutations. 4.4. Natural evolution. 4.5. Links to evolutionary computation -- 5. Mathematical foundations of genetic algorithms. 5.1. The operation of genetic algorithms. 5.2. The schema theorem and the building block hypothesis. 5.3. Criticism on the schema theorem and the building block hypothesis. 5.4. Price's theorem as an alternative to the schema theorem. 5.5. Markov chain analysis -- 6. Implementing GAs. 6.1. GA performance. 6.2. Fitness function. 6.3. Coding. 6.4. Selection schemes. 6.5. Crossover, mutation and inversion -- 7. Hybridisation of evolutionary computation and neural networks. 7.1. Evolutionary computing to train the weights of a NN. 7.2. Evolutionary computing to analyse a NN. 7.3. Evolutionary computing to optimise a NN architecture and its weights -- 8. Using genetic programming to generate neural networks. 8.1. Set-up. 8.2. Example of a genetically programmed neural network. 8.3. Creation and crossover rules for genetic programming for neural networks. 8.4. Automatically defined functions (ADFs). 8.5. Implementation of the fitness function. 8.6. Experiments with genetic programming for neural networks. 8.7. Discussion of genetic programming for neural networks -- 9. Using a GA to optimise the weights of a neural network. 9.1. Description of the GA software. 9.2. Set-up. 9.3. Experiments. 9.4. Discussion -- 10. Using a GA with grammar encoding to generate neural networks. 10.1. Structured genetic algorithms in neural network design. 10.2. Kitano's matrix grammar. 10.3. The modified matrix grammar. 10.4. Combining structured GAs with the matrix grammar. 10.5. Direct encoding. 10.6. Network pruning and reduction. 10.7. Experiments. 10.8. Discussion -- 11. Conclusions and future directions.