
Neural Networks and Their Applications, Edited by J. G. Taylor



Neural Networks and Their Applications
Edited by J. G. Taylor, King's College London
JOHN WILEY AND SONS
Chichester • New York • Brisbane • Toronto • Singapore

Contents

Preface xiii
List of Figures xvii
List of Tables xxiii
List of Contributors xxv

1 Modelling and Controller Design of an EGR Solenoid Valve using Neural Networks
  M.A. Aram 1
  1.1 Introduction 1
  1.2 Neural Networks 2
    1.2.1 Neural Networks in General 2
    1.2.2 Multi-Layer Networks 3
    1.2.3 Radial Basis Function (RBF) Network 4
  1.3 Training of the Neural Networks as Predictor 6
    1.3.1 Testing of Models 7
  1.4 Training of a Neuro-Controller 9
    1.4.1 Testing of Neuro-Controller 10
    1.4.2 Analysis of Controller Outputs 13
  1.5 Conclusion on Modelling 14
    1.5.1 Significant Points Observed 15
  1.6 Conclusion on Neurocontroller 15
    1.6.1 Significant Features Observed for Neurocontroller 15
  1.7 Future Work 16
  References 16

2 Neural Networks as Decoders in Error-Correcting Systems for Digital Data Transmission
  J. Serra-Sagrista 19
  2.1 Introduction 19
  2.2 Digital Transmission 21
    2.2.1 Coding 21
    2.2.2 Channel 21
    2.2.3 Decoding 21
  2.3 Neural Networks as Decoders 22
  2.4 Neural Network Model 22
    2.4.1 Training Phase 25
  2.5 Simulation Results 27
  2.6 Conclusion 31
  References 32

3 Estimating Helicopter Strain Using a Neural Network Approach
  A.D. Vella et al. 35
  3.1 Introduction 35
  3.2 Fatigue 36
  3.3 Problem Definition from the ANN Point of View 38
  3.4 Problems 40
    3.4.1 Dealing with Large Training Sets 40
    3.4.2 Ill-Conditioning 40
    3.4.3 Transient Behaviour 40
    3.4.4 Scale 40
    3.4.5 Evaluation 41
  3.5 Strategies Evolved 41
    3.5.1 Introduction 41
    3.5.2 Smoothing 42
    3.5.3 Selecting Points by Mixing 43
    3.5.4 Timing 43
    3.5.5 Model Refinement 44
    3.5.6 Visualization 45
  3.6 Results 45
  3.7 Further Work 47
    3.7.1 Better Correlation Coefficients 47
    3.7.2 Better Measures of Success 47
    3.7.3 Transferability to Other Applications 47
    3.7.4 Genetic Algorithms 49
  3.8 Conclusion 49
  References 49

4 Neural Networks for Texture Classification
  J.F. Boyce and J.F. Haddon 51
  4.1 Introduction 51
    4.1.1 Texture Information 51
    4.1.2 Characterizing a Texture 51
  4.2 Classification 52
  4.3 Scene Analysis 52
    4.3.1 Problem Definition 52
    4.3.2 Mutual Information 53
  4.4 Co-occurrence Matrices 55
    4.4.1 Definition 55
    4.4.2 Temporal Consistency 57
    4.4.3 Labelling Segmented Images 57
    4.4.4 Hermite Functions on a Lattice 57
    4.4.5 Decomposition of Co-occurrence Matrices 59
    4.4.6 Texture Description using Lattice Hermite Functions 60
  4.5 Texture Classification using Neural Networks 62
    4.5.1 Feature Selection 62
    4.5.2 Multilayer Perceptron Neural Networks 62
    4.5.3 Application of the Neural Network 63
  4.6 Conclusions 64
  References 64

5 Modelling Psychiatric Decisions with Linear Regression and Neural Networks
  W. Penny and D.P. Frost 67
  5.1 Introduction 67
  5.2 A Hierarchy of Networks 69
  5.3 Selection of Input Variables 72
  5.4 Linear Regression 75
  5.5 Neural Networks 79
  5.6 Conclusions 79
  References 79

6 Visualizing Nuclear Magnetic Resonance Spectra with Self-Organizing Neural Networks
  A. Koski et al. 83
  6.1 Introduction 83
    6.1.1 Nuclear Magnetic Resonance Spectroscopy 84
  6.2 Material 84
    6.2.1 Basics of Hyperornithinemia with Gyrate Atrophy 84
  6.3 Methods 87
    6.3.1 Feature Extraction from the Spectra 87
    6.3.2 Self-organizing Neural Network 87

    6.3.3 Clustering Properties of the Self-organizing Map 89
  6.4 Results 89
  6.5 Discussion 91
  References 92

7 Comparative Study of Two Self-Organizing and Structurally Adaptive Dynamic Neural Tree Networks
  K. Butchart et al. 93
  7.1 Introduction 93
  7.2 Clustering and Competitive Learning Neural Networks 94
  7.3 Dynamic Neural Tree Networks 95
    7.3.1 Network Structure and Operation 95
    7.3.2 The Production of the Tree Structure 96
    7.3.3 Advantages of Dynamic Neural Tree Networks 97
  7.4 The Networks 98
    7.4.1 Li et al. Network 98
    7.4.2 Racz and Klotz 99
  7.5 The Tests 100
  7.6 Network Performance 101
    7.6.1 The Squares Data 101
    7.6.2 Single Source Gaussian Data 102
    7.6.3 Gaussian Clusters with Noisy Inputs 105
    7.6.4 Highly Structured Data 107
  7.7 Comparison of Performance with Fixed Size CLNNs 108
    7.7.1 Single Source Gaussian Data 108
    7.7.2 Gaussian Clusters 109
  7.8 Non-Stationary Data 109
  7.9 Conclusions 110
  References 112

8 A Clustering Algorithm to Produce Context Rich Networks
  N. Allott et al. 113
  8.1 Introduction 113
  8.2 Algorithm 114
  8.3 Saliency 115
  8.4 Modifying the Network 116
  8.5 Formal Specification of Data Structure and Heuristics 117
  8.6 Application of Produced Trees 119
  8.7 Further Work 119
  References 120

9 Robust Financial Modelling by Combining Neural Network Estimators of Mean and Median
  A.N. Burgess 121
  9.1 Introduction 121
  9.2 Learning Conditional Expectations 122
  9.3 Median and Quantile Learning 123
  9.4 Implications 126
  9.5 Using Combined Estimators for Trading the FTSE100 128
    9.5.1 Results for Model Trained Using MSE 128
    9.5.2 Results for Model Trained Using MAD 128
    9.5.3 Results for Combined Model 129
  9.6 Conclusion 130
  References 131

10 ISTRIA: An On-Line Neural Network System for the Analysis of Financial Markets
  S. Di Pasquale et al. 133
  10.1 Introduction 133
  10.2 Laboratory Phase 133
  10.3 Executive Phase (Phase 1) 134
  10.4 Introduction of the Genetic Algorithms (Phase 2) 138
  10.5 Conclusion 140
  References 142

11 Prediction of the S&P 500 Index with Neural Networks
  J. Angstenberger 143
  11.1 Introduction: The Problem of Forecasting Stock Prices 143
  11.2 DataEngine: A Software Package for Intelligent Data Analysis 144
  11.3 Data Material and Objective 145
  11.4 Data Pre-processing 146
  11.5 Neural Networks for Forecasting 146
  11.6 Validation of Results 148
  11.7 Conclusion 151
  11.8 Application to Similar Problems 151
  References 152

12 Neural Network Based Models for Forecasting
  X. Ding et al. 153
  12.1 Introduction 153
  12.2 Water Demand 155
  12.3 Energy Demand 156
  12.4 Highway Traffic 157
  12.5 Urban Storm Water Pollution Forecasting 159
  12.6 Rainfall Forecasting Using Radar Images 160

    12.6.1 Rain Cell Tracking 160
    12.6.2 Rain Field Modelling 161
  12.7 Towards a Methodology for the Development 162
  12.8 Conclusion 165
  References 166

13 GA to Train NNs Using Sharing and Pruning: Global GA Search Combined with Local BP Search
  M. Schmidt and T. Stidsen 169
  13.1 Introduction 169
  13.2 Why Sharing and Pruning? 170
  13.3 Why and How Use a GA to Train a NN? 170
    13.3.1 Why a GA? 170
    13.3.2 Coding of the NN 171
    13.3.3 Description of the GA Used 172
  13.4 Fitness Functions 173
  13.5 BP-afterburner 175
  13.6 Lamarckian Learning 175
  13.7 Description of Problems 176
    13.7.1 The Left-Right-Shift Problem 176
    13.7.2 The Monks3 Problem 176
    13.7.3 Proben1 Problems 176
  13.8 Detailed Analysis 177
  13.9 Comments on our Results 184
  13.10 Comparing our Results with Other NN Techniques 184
  13.11 Overall Conclusion 188
  References 190

14 Evolutionary Neurocontrol of Chaos and the Attitude Control Problem
  D.C. Dracopoulos 193
  14.1 Introduction 193
  14.2 The Euler Equations 194
  14.3 The Adaptive Control Architecture 195
  14.4 Spin Stabilization of a Satellite from a Random Initial Position 199
  14.5 Satellite Attitude Control Subject 200
  14.6 Conclusions 201
  References 202

15 An Object-Oriented Framework for NN-ES Hybrid Systems
  A.I. Vermesan and O. Vermesan 205
  15.1 Introduction 205
  15.2 Hybrid Framework for Design 207
    15.2.1 Physical Integration 209

  15.3 An Object Oriented Neural Expert System 210
  15.4 Hybrid Framework for Specification 214
  15.5 Dealing with Incomplete Knowledge 222
  15.6 Discussion 225
  References 226

16 Using Correlation Matrix Memories for Inferencing in Expert Systems
  J. Austin and R. Filer 229
  16.1 Introduction 229
  16.2 Correlation Matrix Memory 230
  16.3 CMM Inference Engine 232
    16.3.1 Lexical Token Converter 232
    16.3.2 Binding Variable and Value Tokens 232
    16.3.3 Superimposing Inputs 234
    16.3.4 Identifying CMM Units of Appropriate Arity 234
    16.3.5 Occurrence Checking 234
    16.3.6 Providing Separator Tokens (Training) 234
    16.3.7 Thresholding (Retrieval) 235
    16.3.8 Decoding the Output 235
  16.4 System Characteristics 236
    16.4.1 Storage 236
    16.4.2 Partial Matching Capability 237
    16.4.3 Implementing a Predicate Calculus 241
  16.5 Conclusions 242
  References 243

17 Neural Networks in VLSI Hardware
  T. Clarkson 245
  17.1 Introduction 245
  17.2 Neural Applications 245
  17.3 Custom Integrated Circuits 247
  17.4 Artificial Neurons 247
  17.5 Training 248
  17.6 Neuron Models 249
  17.7 Interface to External Systems 249
  17.8 On-chip Training 250
  17.9 Temporal Neurons 251
  17.10 Conclusions 252
  References 252
