1.

Book
Kevin L. Priddy, Paul E. Keller, Peter J. Angeline, chair/editor ; sponsored and published by SPIE--the International Society for Optical Engineering
Publication info: Bellingham, Wash. : SPIE, c2001  ix, 306 p. ; 28 cm
Series: Proceedings / SPIE -- the International Society for Optical Engineering ; v. 4390
2.

Book
Kevin L. Priddy, Paul E. Keller, Peter J. Angeline, chairs/editors ; sponsored and published by SPIE--the International Society for Optical Engineering
Publication info: Bellingham, Wash. : SPIE, c2002  ix, 226 p. ; 28 cm
Series: Proceedings / SPIE -- the International Society for Optical Engineering ; v. 4739
3.

Book
Peter J. Angeline ... [et al.] ; in cooperation with IEEE Neural Networks Council
Publication info: Berlin : Springer, c1997  x, 456 p. ; 24 cm
Series: Lecture notes in computer science ; 1213
4.

Book
edited by Kenneth E. Kinnear, Jr.
Publication info: Cambridge, Mass. : MIT Press, c1994-  v. ; 24 cm
Series: Complex adaptive systems
Bradford book
Contents:
Contributors
Acknowledgments
An Introduction to the Third Volume / 1:
A brief overview of genetic programming / 1.1:
Other GP Resources / 1.2:
Public Domain GP Implementations (unsupported) / 1.3:
The work in this volume / 1.4:
Part I: Applications / 1.4.1:
Part II: Theory / 1.4.2:
Part III: Extensions / 1.4.3:
Bibliography
Applications / I:
An Automatic Software Re-Engineering Tool Based on Genetic Programming / 2:
Introduction / 2.1:
Software Re-engineering / 2.1.1:
Parallel Problems / 2.2:
Problems with Data Dependency Analysis / 2.2.1:
Genetic Structure / 2.3:
Atom Mode / 2.3.1:
Atom Mode Transformations / 2.3.2:
P and S / 2.3.2.1:
F and L / 2.3.2.2:
Shift / 2.3.2.3:
Null/Parnull / 2.3.2.4:
Atom Mode Fitness / 2.3.3:
Directed Analysis for Fpar / 2.3.3.1:
Directed Analysis for Lpar / 2.3.3.2:
Directed Analysis for Pxx / 2.3.3.3:
Directed Analysis / 2.3.3.4:
Loop Mode / 2.3.4:
Loop Fusion / 2.3.4.1:
Loop Shrinking / 2.3.4.2:
Example Individual / 2.4:
Resumption of Atom Mode / 2.4.1:
Directed Data Dependency Analysis / 2.4.3:
Experimental Results / 2.4.4:
Scheduling / 2.4.5:
Conclusion / 2.5:
CAD Surface Reconstruction from Digitized 3D Point Data with a Genetic Programming/Evolution Strategy Hybrid / 3:
Classic Context / 3.1:
Digitizing and Preprocessing / 3.2.1:
Gridded Representation and Topological Information / 3.2.2:
Surreal - a Genetic Programming/Evolution Strategy Hybrid / 3.3:
The Approach / 3.3.1:
Overview / 3.3.2:
Algorithmic Structure / 3.3.3:
Genetic Representation / 3.3.4:
Constructive Solid Geometry / 3.3.4.1:
Terminal and Function Set / 3.3.4.2:
Search Space / 3.3.4.3:
Quality Measure / 3.3.5:
Distance Criterion Delta / 3.3.5.1:
Angle Criterion Abn / 3.3.5.2:
Curvature-type Criterion Ctype / 3.3.5.3:
Primitive-number Criterion Prim / 3.3.5.4:
Variation / 3.3.6:
Mutation / 3.3.6.1:
Recombination / 3.3.6.2:
Creation / 3.3.7:
Selection for variation / 3.3.8:
Selection for the next generation / 3.3.9:
Results / 3.4:
Problem: dowel reconstruction / 3.4.1:
Parameters / 3.4.2:
Discussion / 3.4.3:
Incremental optimization / 3.4.3.1:
Fitness progression / 3.4.3.2:
Population size and convergence / 3.4.3.3:
Interactive evolution / 3.4.3.4:
Problem: cross-structure reconstruction / 3.4.4:
Conclusion and future work / 3.5:
A Genetic Programming Approach for Robust Language Interpretation / 4:
Abstraction on the Problem of Parse Repair / 4.1:
The Basic Problem / 4.2.1:
Program Induction as a Solution / 4.2.2:
ROSE's Application of GP / 4.3:
The Partial Parsing Stage / 4.3.1:
The Combination Stage / 4.3.2:
Applying Genetic Programming / 4.3.2.1:
Fitness Evaluation for the Combination Problem / 4.3.2.2:
Why GP? / 4.4:
Evaluation / 4.5:
Challenges / 4.6:
Acknowledgements
Time Series Modeling Using Genetic Programming: An Application to Rainfall-Runoff Models / 5:
The Genetic Programming System CFG-GP / 5.1:
Rainfall Runoff Modelling / 5.3:
CFG-GP Setup / 5.4:
Catchment Descriptions and Results / 5.5:
The Glan Teifi Catchment / 5.5.1:
The Namoi River Catchment / 5.5.2:
References / 5.5.4:
Automatic Synthesis, Placement, and Routing of Electrical Circuits by Means of Genetic Programming / 6:
Automatic Creation of Circuit Topology and Sizing / 6.1:
Method for Automatic Creation of Circuit Topology, Sizing, Placement, and Routing / 6.3:
The Initial Circuit / 6.3.1:
Circuit-Constructing Functions / 6.3.2:
Component-Creating Functions / 6.3.2.1:
Topology-Modifying Functions / 6.3.2.2:
Development-Controlling Functions / 6.3.2.3:
The Developmental Process / 6.3.3:
Statement of the Illustrative Problem / 6.4:
Preparatory Steps / 6.5:
Initial Circuit / 6.5.1:
Program Architecture / 6.5.2:
Function and Terminal Sets / 6.5.3:
Fitness Measure / 6.5.4:
Control Parameters / 6.5.5:
Termination Criterion and Results Designation / 6.5.6:
Implementation on Parallel Computer / 6.5.7:
Computer Time / 6.6:
Genetic Programming as an Invention Machine / 6.8:
Quantum Computing Applications of Genetic Programming / 6.9:
Quantum Computation / 7.1:
A Virtual Quantum Computer / 7.2:
State Representation and Notation / 7.2.1:
Quantum Gates / 7.2.2:
Quantum NOT and SQUARE ROOT OF NOT / 7.2.2.1:
Applying quantum gates to multi-qubit systems / 7.2.2.2:
Other Quantum Gates / 7.2.2.3:
Running a Quantum Algorithm / 7.2.3:
Example Execution Trace / 7.2.4:
The Power of Quantum Computation / 7.2.5:
Evolving Quantum Algorithms / 7.3:
Standard Tree-based Genetic Programming / 7.3.1:
Stack-Based, Linear Genome Genetic Programming / 7.3.2:
Stackless Linear Genome Genetic Programming / 7.3.3:
Fitness Function / 7.3.4:
Deutsch's Early Promise Problem / 7.4:
The Scaling Majority-On Problem / 7.4.2:
The Database Search Problem / 7.4.3:
The And-Or Query Problem / 7.4.4:
Conclusions / 7.5:
Theory / II:
The Evolution of Size and Shape / 8:
Background / 8.1:
Program Search Spaces / 8.3:
Bloat Inherent in Variable Length Representations / 8.4:
Sextic Polynomial / 8.5:
GP Runs / 8.5.1:
Non GP Search Strategies / 8.5.2:
New Tree Mutation Operators / 8.5.3:
50-150% Fair Mutation Runs / 8.5.3.1:
Subtree Fair Mutation Runs / 8.5.3.2:
Direct Measurement of Genetic Operators Effects on Performance / 8.5.4:
Self Crossover / 8.5.4.1:
Mutation Operators / 8.5.4.2:
Bloat in Discrete Problems / 8.6:
Code Bloat as Protection / 8.6.1:
Code bloat due to "Removal Bias" / 8.6.2:
Evolution of Program Shapes / 8.7:
Future Work / 8.8:
Fitness Distributions: Tools for Designing Efficient Evolutionary Computations / 9:
Fitness distributions / 9.1:
Evolving computer programs using evolutionary programming / 9.4:
Initialization / 9.4.1:
Offspring generation through variation / 9.4.2:
Parent selection / 9.4.3:
Test problems / 9.4.4:
Experiments / 9.4.5:
Analysis of Single-Node (Building) Blocks in Genetic Programming / 9.5:
Current Usages and Definitions / 10.1:
Objectives / 10.1.2:
Case Study Description / 10.2:
Motivations / 10.2.1:
Method / 10.2.2:
Fitness-Centric Experiment / 10.3:
Fitness-Centric Results / 10.3.1:
Fitness-Centric Discussion / 10.3.2:
ERC-Centric Experiment / 10.4:
ERC-Centric Results / 10.4.1:
ERC-Centric Discussion / 10.4.2:
Implications for Building Blocks / 10.5:
Appendix A.10.1 Approaches to Solving f(x) / 10.6:
Appendix A.10.2 Known ERC Strategies
Appendix A.10.3 Alternative Frame for Analyzing GP and ERCs
Rooted-Tree Schemata in Genetic Programming / 11:
Schema Theory / 11.1:
Schemata in genetic algorithms / 11.2.1:
Schemata in genetic programming / 11.2.2:
Portraying Variable Complexity Representations / 11.3:
The rooted-tree schema property / 11.3.1:
Growth of rooted-tree schemata / 11.3.2:
The role of variable size during evolution / 11.3.3:
Adding Parsimony / 11.4:
Selection with a parsimonious fitness function / 11.4.1:
Growth of rooted-tree schemata with parsimony / 11.4.2:
Controlling Schema Growth / 11.5:
Fitness based on pure performance / 11.6:
Parsimonious fitness / 11.6.2:
Adaptive probability of destruction / 11.6.3:
Summary of experiments / 11.6.4:
Solution Acquisition in GP / 11.7:
Related Work / 11.8:
Extensions / 11.9:
Efficient Evolution of Machine Code for CISC Architectures Using Instruction Blocks and Homologous Crossover / 12:
Why Evolve Machine Code? / 12.1:
Advantages of Evolving Machine Code / 12.2.1:
Why is Binary Manipulation so Fast? / 12.3:
Overview of AIM-GP / 12.4:
Making Machine Code Genetic Programming Work on CISC Processors / 12.5:
The Importance of Extending AIM-GP to CISC Processors / 12.5.1:
Challenges in Moving AIM-GP to CISC Processors / 12.5.2:
Instruction Blocks / 12.5.3:
Instruction Annotations / 12.5.4:
The Benefits of "Glue" / 12.5.5:
Other AIM-GP Innovations / 12.6:
Memory Access and Large Input Sets / 12.6.1:
Decompilation / 12.6.2:
Homologous Crossover / 12.6.3:
Floating Point Arithmetic / 12.6.4:
Automatically Defined Functions / 12.6.5:
AIM-GP and Tree-Based GP / 12.7:
Summary and Conclusion / 12.8:
Sub-machine-code Genetic Programming / 13:
Sub-machine-code GP / 13.1:
Examples / 13.4:
1-bit and 2-bit Adder Problems / 13.4.1:
Character Recognition Problem / 13.4.2:
Fast Parallel Evaluation of Fitness Cases / 13.4.3:
Appendix: Implementation / 13.6:
Description / 13.A.1:
Code / 13.A.2:
The Internal Reinforcement of Evolving Algorithms / 14:
Neural Programming / 14.1:
The Neural Programming Representation / 14.2.1:
Illustrative Examples / 14.2.2:
Example 1: The Fibonacci Series / 14.2.2.1:
Example 2: The Golden Mean / 14.2.2.2:
Example 3: Foveation / 14.2.2.3:
Internal Reinforcement in NP / 14.3:
Creating a Credit-Blame Map / 14.3.1:
Accumulation of Explicit Credit Scores / 14.3.1.1:
Function Sensitivity Approximation / 14.3.1.2:
Refining the Credit-Blame Map / 14.3.1.3:
Credit Scoring the NP arcs / 14.3.1.4:
Exploration vs. Exploitation Within a Program / 14.3.2:
Using a Credit-Blame Map / 14.3.3:
Mutation: Applying a Credit-Blame Map / 14.3.3.1:
Crossover: Applying a Credit-Blame Map / 14.3.3.2:
The Credit-Blame Map Before/After Refinement / 14.3.4:
IRNP Discussion / 14.3.5:
Experimental Overview / 14.4:
Natural Images / 14.4.2:
Setting PADO up to Solve the Problem / 14.4.2.1:
The Results / 14.4.2.2:
Acoustic Signals / 14.4.3:
Acoustic Signals Revisited / 14.4.3.1:
Inductive Genetic Programming with Immune Network Dynamics / 14.4.4.1:
Immune Version of the GP System / 15.1:
Biological Networks / 15.2.1:
Computational CounterParts in GP / 15.2.2:
The Dynamic Fitness Function / 15.2.3:
Micromechanisms of the Inductive GP / 15.3:
Inductive Learning and Regression / 15.3.1:
Multivariate Trees / 15.3.2:
Context-Preserving Mutation / 15.3.3:
Crossover by Cut and Splice / 15.3.4:
Practical Induction by Immune Dynamics / 15.4:
Traditional and Immune Versions of iGP / 15.4.1:
Performance Measures / 15.4.2:
Machine Learning / 15.4.3:
Time-Series Prediction / 15.4.4:
Relevance to Other Works / 15.5:
A Self-Tuning Mechanism for Depth-Dependent Crossover / 15.7:
The 11MX problem / 16.1:
The ANT problem / 16.3.2:
The robot problem / 16.3.3:
Genetic Recursive Regression for Modeling and Forecasting Real-World Chaotic Time Series / 16.4:
Problem Definition: Data Driven Model Building / 17.1:
Genetic Symbolic Regression and Data Driven Model Building / 17.2:
A New Algorithm Genetic Recursive Regression (GRR) / 17.3:
Recursive Regression / 17.3.1:
Representation of the Regression Model as a Series Expansion / 17.3.2:
Parallel Computational Architecture / 17.3.3:
Adaptive Update of the Numerical Coefficients / 17.3.4:
Derived Terminal Set / 17.3.5:
Implementation Issues / 17.4:
Fitness Assignment / 17.4.1:
Super Population and Migration between Multiple Populations / 17.4.2:
Dealing with Absurd Attributes of a Symbolic Form / 17.4.3:
Division of the Data Set: Training, Validation and Prediction Regions / 17.4.4:
Termination Conditions / 17.4.5:
Some Manipulations on the Raw Data / 17.4.6:
Application to Real World Chaotic Time Series / 17.5:
Benchmarking / 17.5.1:
Data and Computational Settings / 17.5.1.1:
Benchmarking Results and Discussion / 17.5.1.2:
Effects of the Adaptive Update of the Numerical Coefficients
Effects of the Parallel Architecture
Effects of the Recursive Model Building
Real World Chaotic Time Series / 17.5.2:
Results and Discussion / 17.5.2.1:
Effects of the Derived Terminal Set (DTS)
Comparisons with Earlier Works / 17.5.3:
ASHRAE and Santa Fe Competitions / 17.5.3.1:
Mackey-Glass Equation / 17.5.3.2:
Conclusion and Issues Remaining / 17.6:
Co-evolutionary Fitness Switching: Learning Complex Collective Behaviors Using Genetic Programming / 18:
Genetic Programming with Coevolutionary Fitness Switching / 18.1:
Results on Table Transport / 18.3:
Application to Robotic Soccer / 18.4:
Evolving Multiple Agents by Genetic Programming / 18.5:
Example Tasks / 19.1:
Fitness Assignment and Breeding Strategies / 19.3:
Comparison with Reinforcement Learning / 19.4:
Evolving Agents with Communication / 19.5:
Evolving Controlling Agents / 19.5.1:
Evolving Negotiating Agents / 19.5.2:
Evolving Other Types of Communicating Agents / 19.6:
Q-learning and Genetic Programming / 19.6.2:
A Multi-agent reinforcement learning / 19.6.3:
Index
Preface
A Perspective on the Work in this Book
What is Evolutionary Computation?
Why is Evolutionary Computation interesting? / 1.1.1:
Styles of Evolutionary Computation / 1.1.2:
What defines Genetic Programming?
Current activity in Genetic Programming
Part II: Increasing the Power of Genetic Programming / 1.3.1:
Part III: Innovative Applications of Genetic Programming / 1.3.2:
Practical Guidance
It isn't as easy as it looks -- but it does work.
The fitness function is exceptionally important.
Representation is important too.
It all comes together in the transmission function. / 1.4.4:
Population size and diversity are also important. / 1.4.5:
Don't generalize from one run. / 1.4.6:
Genetic programming is robust. / 1.4.7:
Know your problem, know your data. / 1.4.8:
Where to go for more information and inspiration. / 1.5:
Biology / 1.5.1:
Complex Adaptive Systems / 1.5.2:
Genetic Algorithms and other Evolutionary Computation paradigms / 1.5.3:
Introduction to Genetic Programming / 1.6:
Introduction to Genetic Algorithms
Program Trees and the LISP Programming Language
Genetic Programming
Automatic Function Definition in Genetic Programming
Sources of Additional Information about Genetic Programming
Sources of Additional Information about Genetic Algorithms / 2.6:
Increasing the Power of Genetic Programming
The Evolution of Evolvability in Genetic Programming
Evolvability / 3.1.1:
Representations / 3.1.2:
Evolving evolvability / 3.1.3:
Constructional selection / 3.1.4:
Synopsis of the models / 3.1.5:
Selection, Transmission, and Evolvability
A general model of the canonical genetic algorithm
Measurement functions
Price's theorem applied to evolvability / 3.2.3:
Price's theorem and the Schema Theorem / 3.2.4:
Dynamics of Genetic Programming
A model of genetic programming dynamics
The "constructional" fitness of a block of code
The constructional fitness is distinct from the marginal fitness
Conservative versus exploratory code
Principle 1 applied to genetic programming
Further work
Notice
Genetic Programming and Emergent Intelligence
Prelude
General Computational Problem Solving
Weak and Strong Methods
Credit Assignment
Evolutionary Weak Methods and Empirical Credit Assignment / 4.2.3:
Emergent Intelligence / 4.2.4:
Genetic Programming: The Inside Story
Comparing Genetic Algorithms and Genetic Programming
Representation of Genotypes / 4.3.1.1:
Complexity of Interpretation / 4.3.1.2:
Syntax Preserving Crossover / 4.3.1.3:
Innate Emergent Intelligence in GP
Emergence of Introns
Emergent Diploidy and Dominance
Exploiting Emergent Intelligence
Emergent Problem Decomposition in Genetic Programs / 4.4.1:
The Genetic Library Builder / 4.4.1.1:
Emergent Evaluation of Modules / 4.4.1.2:
Comparison to ADFs / 4.4.1.3:
Emergence of High-Level Representations / 4.4.1.4:
Generality of Emergent Modules / 4.4.1.5:
Emergent Goal-Directed Behavior / 4.4.2:
Guidelines for Emergent Intelligence
Coaxing Rather than Coercing / 4.5.1:
Hitchhiking as a Technique / 4.5.2:
Opportunism Over Prespecification / 4.5.3:
Explicit Knowledge As A Last Resort / 4.5.4:
Scalable Learning in Genetic Programming using Automatic Function Definition
The Lawn Mower Problem
Preparatory Steps Without Automatic Function Definition
Lawn Size of 64 Without Automatic Function Definition
Lawn Size of 96 Without Automatic Function Definition / 5.4.1:
Lawn Size of 32 Without Automatic Function Definition / 5.4.2:
Preparatory Steps With Automatic Function Definition
Lawn Size of 64 With Automatic Function Definition / 5.6:
Lawn Size of 96 With Automatic Function Definition / 5.6.1:
Lawn Size of 32 With Automatic Function Definition / 5.6.2:
Relationship of Parsimony to Problem Size
Relationship of Computational Effort to Problem Size
Alternatives in Automatic Function Definition: A Comparison of Performance / 5.9:
The Even-4-Parity Problem
Automatic Function Definition: Two Approaches
Automatically Defined Functions: ADF
Module Acquisition: MA
The Genetic Programming Environment
Steady State GP / 6.4.1:
Selection Procedures / 6.4.2:
Genetic Operators / 6.4.3:
Comparisons of ADF and MA strategies
Basic Performance Comparison
Use of a priori Knowledge
Frequency of Function Calls
Functions Including Global Variables
Multiple Use of Parameters
Local vs. Global Function Definitions
Evolution of Function Definitions
Structural Regularity / 6.5.8:
Self-Crossover / 6.5.8.1:
Modular Crossover / 6.5.8.2:
Discussion of Structural Regularity / 6.5.8.3:
Further Work
The Donut Problem: Scalability, Generalization and Breeding Policies in Genetic Programming
Introduction: Depth vs. Breadth
The Donut Problem
Purposely Introduced Imperfections
There is a Solution (Sort of)
Breeding Policies
"Demes" and Spatially Distributed Evolution
Implementation of Distributed Evolution
Elitism and the Steady State Model
Implementation of Steady-State Elitism
Experimental Method
Performance of GP as Class Overlap Increases
Generalization and Uniform Undersampling of the Training Set
Generalization and Nonuniform Sampling of the Training Set
Assessing the Effects of Demes and Elitism
Summary of Experimental Configurations / 7.4.5:
Scalability With Respect to Class Overlap / 7.5.1:
Generalization With Respect to Class Overlap / 7.5.2:
Generalization and Uniform Undersampling of Training Data / 7.5.3:
Generalization and Nonuniformly Distributed Training Data / 7.5.4:
Comparative Performance and the Optimal Function Set / 7.5.5:
Comparative Performance of Breeding Policies / 7.5.6:
Performance Across All Experiments / 7.5.6.1:
Performance Using Uniformly Sparse Training Data / 7.5.6.2:
Performance Using Nonuniform Training Data / 7.5.6.3:
Performance Using Sparse and Nonuniform Training Data / 7.5.6.4:
Performance Using Sparse Data With High Degree of Class Overlap / 7.5.6.5:
Performance Using Non-Sparse Data Sets / 7.5.6.6:
Conclusions About Distributed Evolution / 7.6:
Conclusions Concerning Elitism / 7.6.2:
Comments on the Procedures Used / 7.6.3:
Need for Benchmark Test Functions / 7.6.4:
Big Deme Grids and Other Parameters / 7.6.5:
Gene Frequencies and Distributed Evolution / 7.6.6:
Effects of Locality in Individual and Population Evolution
Domain
Terminal Set / 8.3.1:
Function Set / 8.3.2:
Fitness Evaluation / 8.3.3:
Discussion of Results
Population Seeding
Statistical Analysis
Emergence of Demes
Structural Analysis of Individuals
Recommendations for Future Work
Appendix A: Seed Tank Code
Appendix B: Original Example Tank Code
Appendix C: Stimulus-Response Maps of Example Tanks
The Evolution of Mental Models
The Method and The Model
The Environment
The Implementation
Discussion of Indexed Memory
Discussion of Mental Models
Evolution of Obstacle Avoidance Behavior: Using Noise to Promote Robust Solutions / 9.8:
Previous Work
Obstacle Avoidance as Genetic Programming
The Vehicle
The Obstacle Course
Sensors
Noise / 10.7:
Measuring Fitness in the Presence of Noise / 10.8:
Pygmies and Civil Servants / 10.9:
The Problem Space / 11.1.1:
Genetic Programming Or String GA? / 11.1.2:
Implementation Notes / 11.1.3:
The Benefits of Elitism / 11.1.4:
Traditional Methods
The Fitness Function - Punish or Reward?
Early Results
Maintaining Diversity In Artificial Evolution
Sharing And Crowding
Isolation by Distance
Steady State Genetic Algorithms
Restricted Mating / 11.3.4:
Breeding For Secondary Features / 11.3.5:
Pygmies And Civil Servants / 11.3.6:
Implementation / 11.3.6.1:
Extending the model / 11.3.6.2:
Generalising the model / 11.3.6.3:
Pygmies and Genetic Programming / 11.4.0:
Appendix: The Pygmy Algorithm / 11.5.0:
Acknowledgements
Genetic Programming Using a Minimum Description Length Principle
GP using an MDL principle
Decision Trees and Genetic Programming
MDL-based fitness functions / 12.2.2:
Evolving decision trees / 12.2.3:
Evolving trees with an MDL-based fitness function
Genetic Programming in C++: Implementation Issues
Pointer Based Implementations
A Postfix, Stack-Based Approach
Memory Efficiency / 13.3.1:
Manipulating Postfix Programs / 13.3.2:
Postfix Initialization / 13.3.2.1:
Postfix Crossover / 13.3.2.2:
Postfix Mutation / 13.3.2.3:
The Flow Control Problem with Postfix / 13.3.3:
Mixfix
Prefix Ordering
Initialization, Crossover and Mutation with Prefix / 13.5.1:
Handling Program Flow with Prefix / 13.5.2:
The Node Representation
General Data Support / 13.6.1:
The Opcode Format / 13.6.2:
The Jump Table Mechanism / 13.6.3:
The Prefix, Jump-Table (PJT) Approach / 13.7:
Advanced Topics (Looking for Roadblocks) / 13.8:
Beyond Closure: Handling Multiple Data Types / 13.9.1:
Module Implementation / 13.9.2:
Encapsulation / 13.9.2.1:
Module Execution / 13.9.2.2:
Handling Recursion / 13.9.3:
Simulated Multi-Tasking / 13.9.4:
Using Tables to Evaluate Diversity / 13.9.5:
Conclusion and Future Directions / 13.10:
A Compiling Genetic Programming System that Directly Manipulates the Machine Code
Prologue / 14.0:
Reading Guidance / 14.0.1:
The Compiling Genetic Programming System (CGPS)
The Hardware Environment
The Language for the Genetic Algorithm Implementation
The Structure of a Machine Code Function Callable by a 'C'-function / 14.2.3:
The SPARC Architecture / 14.2.4:
The Instruction Set / 14.2.5:
The Genetic Algorithm / 14.2.6:
Comparison between CGPS and interpreting GP Systems / 14.2.7:
A Genetic Programming System for Heuristic Classification
Comparison between the CGPS and a Neural Network
The Sample Problem
The Training Set / 14.4.1.1:
Coding of Words for the Genetic Programming System / 14.4.1.2:
Coding of words for the Neural Network / 14.4.1.3:
The Neural Network
Training Method
Results of comparison
Population Size and Efficiency / 14.4.5:
Applicability
Concluding Remarks / 14.7:
Bibliography
Innovative Applications of Genetic Programming
Automatic Generation of Programs for Crawling and Walking
The problem
The approach
Functions and terminals
Side-effecting functions and simulated memory
Constant perturbation
Fitness evaluation
Program structure / 15.3.5:
Experiment 1 / 15.3.6:
Experiment 2
Experiment 3
Analysis of the results
Analysis of the method
Comparison with random search / 15.6.1:
Comparison with other methods / 15.6.2:
Practical considerations / 15.6.3:
Scalability / 15.6.3.1:
Real-Time control / 15.6.3.2:
Genetic Programming for the Acquisition of Double Auction Market Strategies
Double Auction Markets and the Santa Fe Tournaments
The Double Auction Mechanism / 16.1.1:
Measuring Trading Efficiency / 16.1.2:
The Santa Fe Tournaments / 16.1.3:
Structure of Local Double Auctions / 16.1.4:
The Local Experiments / 16.1.5:
Genetic Programming of Strategies
The GP Environment / 16.2.1:
The Programming Constructs / 16.2.2:
GP Selection Parameters / 16.2.3:
A Comparable Simulated Annealing Environment / 16.2.4:
Is Genetic Search Useful?
Economising on Fitness Evaluations
Two Scientific Applications of Genetic Programming: Stack Filters and Non-Linear Equation Fitting to Chaotic Data
Development of Stack Filters
Methods / 17.2.1:
Fitting of Non-Linear Equations to Chaotic Data / 17.2.3:
Deception and Fitness / 17.3.4.1:
Fitness Measures / 17.3.4.2:
Effectiveness of Prediction / 17.3.4.3:
Structural Insights / 17.3.4.4:
Conclusions and Prospects
The Automatic Generation of Plans for a Mobile Robot via Genetic Programming with Automatically Defined Functions
The Genetic Planner
An Example World: A Robot on a 2-D Grid.
A set of procedurally-defined operators / 18.3.1:
A set of predicates that describe the world / 18.3.2:
Fitness functions for each of the predicates / 18.3.3:
A ground goal expression / 18.3.4:
A simulation of the world / 18.3.5:
A Demonstration of The Genetic Planner
Analysis of Some Best-Of-Run ADFs / 18.6:
Competitively Evolving Decision Trees Against Fixed Training Cases for Natural Language Processing / 18.7:
The Domain: Word Sense Disambiguation
The Training Cases
How Decision Trees Work
Crossover Operations on Decision Trees
How Fixed Training Data Participate in Competitive Adaptation
Averting Overlearning with Decision Trees: Fitness Penalty
Non-trivial Learning and Generalization Performance / 19.8:
Competition / 19.8.2:
Fitness Penalty / 19.8.3:
Linguistic Data / 19.8.4:
Cracking and Co-Evolving Randomizers / 19.9:
Motivation / 20.1:
Arguments for Success / 20.3:
Two Player Penny Matching Game / 20.3.1:
Uniform Distribution / 20.3.2:
Models / 20.4:
Two Player Multi-Penny Matching Game / 20.4.1:
Single Generator / 20.4.2:
Separate Generators and Guessers / 20.4.4:
Sexing Populations / 20.4.5:
New Techniques / 20.5:
Dynamic Sampling / 20.5.1:
Fitness Tournament / 20.5.2:
Tested Randomizers / 20.6:
Tableau / 20.6.2:
Functions and Terminals / 20.6.3:
GP Shell Modifications / 20.6.4:
Optimizing Confidence of Text Classification by Evolution of Symbolic Expressions / 20.7:
The News Story Classification Problem / 21.1:
Automated keyword assignment using MBR / 21.3:
The Coding Algorithm / 21.4:
The Referral Problem / 21.5:
Referral with single keyword per document / 21.5.1:
Referral with multiple keywords / 21.5.2:
Brief Overview of Genetic Algorithms / 21.6:
Example formulae / 21.7:
Example results / 21.7.0.1:
Representation of evolved formulae / 21.7.1:
The environment for Genetic evolution / 21.8:
Generation of the initial random populations / 21.8.1:
Evaluating fitness / 21.8.2:
Fitness proportionate reproduction / 21.8.3:
Cross-over / 21.8.4:
Population size / 21.8.5:
Number of generations / 21.8.7:
The test environment / 21.9:
Discussion of results / 21.10:
Continuing and Future Work / 21.11:
Evolvable 3D Modeling for Model-Based Object Recognition Systems / 22:
Evolvable 3D Modeling with Multi-level GP/GA / 22.1:
Evolvable modeling of jetliners--implementation / 22.3:
The Data Structure of the Evolvable 3D Jet Models / 22.3.1:
The Parameter Constraints and Parametric Relations / 22.3.2:
The Population Structure and Population Transition / 22.3.3:
Related Works, Extensions, and Applications / 22.4:
Comments and Conclusions / 22.5:
Automatically Defined Features: The Simultaneous Evolution of 2-Dimensional Feature Detectors and an Algorithm for Using Them / 23:
Introduction and Overview / 23.1:
Hit-Miss Matrices and the Two-Dimensional Genetic Algorithm / 23.2:
Hit-miss Matrices in OCR / 23.2.1:
Hit-Miss Matrices in GP / 23.2.2:
Evolutionary Operators for Hit-Miss Matrices / 23.2.3:
Definition of Three Test Problem Sets / 23.3:
Problem Set 1: The {L,T} problem: / 23.3.1:
Problem Set 2: The single example digit discrimination task: / 23.3.2:
Problem Set 3: The multiple example digit discrimination task: / 23.3.3:
Method for applying GP and GA to digit recognition / 23.4:
Functions, Terminals, and Basic Architecture / 23.4.1:
Methods of generation and application of the genetic operators / 23.4.2:
Example of an individual / 23.4.3:
The fitness function / 23.4.4:
Values of run parameters and the success predicate / 23.4.5:
Problem Set #2: Single example digit discrimination problem / 23.5:
Problem Set #3: Multiple example digit discrimination tasks / 23.5.3:
Conclusions and Future work / 23.6:
Genetic Micro Programming of Neural Networks / 24:
Review of Cellular Encoding / 24.1:
JaNNeT: A Neural Compiler of Pascal Program / 24.3:
A Hierarchy of Genetic Languages / 24.5:
Cellular Encoding Versus LISP / 24.6:
Author Index / 24.7: