1 Mathematical Preliminaries
1.1 Mathematical Concepts and Notations
1.1.1 Vector Space Concepts
1.1.2 Matrix Notations
1.1.3 Eigenvectors and Eigenvalues of Matrices
1.1.4 Further Properties of Matrices
1.1.5 On Matrix Differential Calculus
1.2 Distance Measures for Patterns
1.2.1 Measures of Similarity and Distance in Vector Spaces
1.2.2 Measures of Similarity and Distance Between Symbol Strings
1.2.3 Averages Over Nonvectorial Variables
1.3 Statistical Pattern Analysis
1.3.1 Basic Probabilistic Concepts
1.3.2 Projection Methods
1.3.3 Supervised Classification
1.3.4 Unsupervised Classification
1.4 The Subspace Methods of Classification
1.4.1 The Basic Subspace Method
1.4.2 Adaptation of a Model Subspace to Input Subspace
1.4.3 The Learning Subspace Method (LSM)
1.5 Vector Quantization
1.5.1 Definitions
1.5.2 Derivation of the VQ Algorithm
1.5.3 Point Density in VQ
1.6 Dynamically Expanding Context
1.6.1 Setting Up the Problem
1.6.2 Automatic Determination of Context-Independent Productions
1.6.3 Conflict Bit
1.6.4 Construction of Memory for the Context-Dependent Productions
1.6.5 The Algorithm for the Correction of New Strings
1.6.6 Estimation Procedure for Unsuccessful Searches
1.6.7 Practical Experiments
2 Neural Modeling
2.1 Models, Paradigms, and Methods
2.2 A History of Some Main Ideas in Neural Modeling
2.3 Issues on Artificial Intelligence
2.4 On the Complexity of Biological Nervous Systems
2.5 What the Brain Circuits Are Not
2.6 Relation Between Biological and Artificial Neural Networks
2.7 What Functions of the Brain Are Usually Modeled?
2.8 When Do We Have to Use Neural Computing?
2.9 Transformation, Relaxation, and Decoder
2.10 Categories of ANNs
2.11 A Simple Nonlinear Dynamic Model of the Neuron
2.12 Three Phases of Development of Neural Models
2.13 Learning Laws
2.13.1 Hebb's Law
2.13.2 The Riccati-Type Learning Law
2.13.3 The PCA-Type Learning Law
2.14 Some Really Hard Problems
2.15 Brain Maps
3 The Basic SOM
3.1 A Qualitative Introduction to the SOM
3.2 The Original Incremental SOM Algorithm
3.3 The "Dot-Product SOM"
3.4 Other Preliminary Demonstrations of Topology-Preserving Mappings
3.4.1 Ordering of Reference Vectors in the Input Space
3.4.2 Demonstrations of Ordering of Responses in the Output Space
3.5 Basic Mathematical Approaches to Self-Organization
3.5.1 One-Dimensional Case
3.5.2 Constructive Proof of Ordering of Another One-Dimensional SOM
3.6 The Batch Map
3.7 Initialization of the SOM Algorithms
3.8 On the "Optimal" Learning-Rate Factor
3.9 Effect of the Form of the Neighborhood Function
3.10 Does the SOM Algorithm Ensue from a Distortion Measure?
3.11 An Attempt to Optimize the SOM
3.12 Point Density of the Model Vectors
3.12.1 Earlier Studies
3.12.2 Numerical Check of Point Densities in a Finite One-Dimensional SOM
3.13 Practical Advice for the Construction of Good Maps
3.14 Examples of Data Analyses Implemented by the SOM
3.14.1 Attribute Maps with Full Data Matrix
3.14.2 Case Example of Attribute Maps Based on Incomplete Data Matrices (Missing Data): "Poverty Map"
3.15 Using Gray Levels to Indicate Clusters in the SOM
3.16 Interpretation of the SOM Mapping
3.16.1 "Local Principal Components"
3.16.2 Contribution of a Variable to Cluster Structures
3.17 Speedup of SOM Computation
3.17.1 Shortcut Winner Search
3.17.2 Increasing the Number of Units in the SOM
3.17.3 Smoothing
3.17.4 Combination of Smoothing, Lattice Growing, and SOM Algorithm
4 Physiological Interpretation of SOM
4.1 Conditions for Abstract Feature Maps in the Brain
4.2 Two Different Lateral Control Mechanisms
4.2.1 The WTA Function, Based on Lateral Activity Control
4.2.2 Lateral Control of Plasticity
4.3 Learning Equation
4.4 System Models of SOM and Their Simulations
4.5 Recapitulation of the Features of the Physiological SOM Model
4.6 Similarities Between the Brain Maps and Simulated Feature Maps
4.6.1 Magnification
4.6.2 Imperfect Maps
4.6.3 Overlapping Maps
5 Variants of SOM
5.1 Overview of Ideas to Modify the Basic SOM
5.2 Adaptive Tensorial Weights
5.3 Tree-Structured SOM in Searching
5.4 Different Definitions of the Neighborhood
5.5 Neighborhoods in the Signal Space
5.6 Dynamical Elements Added to the SOM
5.7 The SOM for Symbol Strings
5.7.1 Initialization of the SOM for Strings
5.7.2 The Batch Map for Strings
5.7.3 Tie-Break Rules
5.7.4 A Simple Example: The SOM of Phonemic Transcriptions
5.8 Operator Maps
5.9 Evolutionary-Learning SOM
5.9.1 Evolutionary-Learning Filters
5.9.2 Self-Organization According to a Fitness Function
5.10 Supervised SOM
5.11 The Adaptive-Subspace SOM (ASSOM)
5.11.1 The Problem of Invariant Features
5.11.2 Relation Between Invariant Features and Linear Subspaces
5.11.3 The ASSOM Algorithm
5.11.4 Derivation of the ASSOM Algorithm by Stochastic Approximation
5.11.5 ASSOM Experiments
5.12 Feedback-Controlled Adaptive-Subspace SOM (FASSOM)
6 Learning Vector Quantization
6.1 Optimal Decision
6.2 The LVQ1
6.3 The Optimized-Learning-Rate LVQ1 (OLVQ1)
6.4 The Batch-LVQ1
6.5 The Batch-LVQ1 for Symbol Strings
6.6 The LVQ2 (LVQ 2.1)
6.7 The LVQ3
6.8 Differences Between LVQ1, LVQ2 and LVQ3
6.9 General Considerations
6.10 The Hypermap-Type LVQ
6.11 The "LVQ-SOM"
7 Applications
7.1 Preprocessing of Optic Patterns
7.1.1 Blurring
7.1.2 Expansion in Terms of Global Features
7.1.3 Spectral Analysis
7.1.4 Expansion in Terms of Local Features (Wavelets)
7.1.5 Recapitulation of Features of Optic Patterns
7.2 Acoustic Preprocessing
7.3 Process and Machine Monitoring
7.3.1 Selection of Input Variables and Their Scaling
7.3.2 Analysis of Large Systems
7.4 Diagnosis of Speech Voicing
7.5 Transcription of Continuous Speech
7.6 Texture Analysis
7.7 Contextual Maps
7.7.1 Artificially Generated Clauses
7.7.2 Natural Text
7.8 Organization of Large Document Files
7.8.1 Statistical Models of Documents
7.8.2 Construction of Very Large WEBSOM Maps by the Projection Method
7.8.3 The WEBSOM of All Electronic Patent Abstracts
7.9 Robot-Arm Control
7.9.1 Simultaneous Learning of Input and Output Parameters
7.9.2 Another Simple Robot-Arm Control
7.10 Telecommunications
7.10.1 Adaptive Detector for Quantized Signals
7.10.2 Channel Equalization in the Adaptive QAM
7.10.3 Error-Tolerant Transmission of Images by a Pair of SOMs
7.11 The SOM as an Estimator
7.11.1 Symmetric (Autoassociative) Mapping
7.11.2 Asymmetric (Heteroassociative) Mapping
8 Software Tools for SOM
8.1 Necessary Requirements
8.2 Desirable Auxiliary Features
8.3 SOM Program Packages
8.3.1 SOM_PAK
8.3.2 SOM Toolbox
8.3.3 Nenet (Neural Networks Tool)
8.3.4 Viscovery SOMine
8.4 Examples of the Use of SOM_PAK
8.4.1 File Formats
8.4.2 Description of the Programs in SOM_PAK
8.4.3 A Typical Training Sequence
8.5 Neural-Networks Software with the SOM Option
9 Hardware for SOM
9.1 An Analog Classifier Circuit
9.2 Fast Digital Classifier Circuits
9.3 SIMD Implementation of SOM
9.4 Transputer Implementation of SOM
9.5 Systolic-Array Implementation of SOM
9.6 The COKOS Chip
9.7 The TInMANN Chip
9.8 NBISOM_25 Chip
10 An Overview of SOM Literature
10.1 Books and Review Articles
10.2 Early Works on Competitive Learning
10.3 Status of the Mathematical Analyses
10.3.1 Zero-Order Topology (Classical VQ) Results
10.3.2 Alternative Topological Mappings
10.3.3 Alternative Architectures
10.3.4 Functional Variants
10.3.5 Theory of the Basic SOM
10.4 The Learning Vector Quantization
10.5 Diverse Applications of SOM
10.5.1 Machine Vision and Image Analysis
10.5.2 Optical Character and Script Reading
10.5.3 Speech Analysis and Recognition
10.5.4 Acoustic and Musical Studies
10.5.5 Signal Processing and Radar Measurements
10.5.6 Industrial and Other Real-World Measurements
10.5.8 Process Control
10.5.9 Robotics
10.5.10 Electronic-Circuit Design
10.5.11 Physics
10.5.12 Chemistry
10.5.13 Biomedical Applications Without Image Processing
10.5.14 Neurophysiological Research
10.5.15 Data Processing and Analysis
10.5.16 Linguistic and AI Problems
10.5.17 Mathematical and Other Theoretical Problems
10.6 Applications of LVQ
10.7 Survey of SOM and LVQ Implementations
Glossary of "Neural" Terms / 11: |
References |
Index |