1.
Book
By T. Kohonen; translation supervised by Kazuo Nakatani
Publication info: Tokyo : Springer-Verlag Tokyo, 1993.2  xvi, 341p ; 22cm
2.
Book
Teuvo Kohonen
Publication info: Berlin ; Tokyo : Springer-Verlag, c1987  xiii, 388 p. ; 23 cm
Series: Springer series in information sciences ; 1
3.
Book
Teuvo Kohonen
Publication info: Berlin ; New York ; Tokyo : Springer-Verlag, c1988  xv, 312 p. ; 24 cm
Series: Springer series in information sciences ; v. 8
4.
Book
Edited by G. Deboeck and T. Kohonen; translation supervised by Heizo Tokutaka and Masahiro Tanaka
Publication info: Tokyo : Springer-Verlag Tokyo, 1999.7  l, 357p, 8 p. of plates ; 21cm
5.
Book
Teuvo Kohonen
Publication info: Berlin ; New York : Springer, c1995  ix, 362 p. ; 25 cm
Series: Springer series in information sciences ; 30
Table of contents:
Mathematical Preliminaries / 1:
Mathematical Concepts and Notations / 1.1:
Vector Space Concepts / 1.1.1:
Matrix Notations / 1.1.2:
Eigenvectors and Eigenvalues of Matrices / 1.1.3:
Further Properties of Matrices / 1.1.4:
On Matrix Differential Calculus / 1.1.5:
Distance Measures for Patterns / 1.2:
Measures of Similarity and Distance in Vector Spaces / 1.2.1:
Measures of Similarity and Distance Between Symbol Strings / 1.2.2:
Averages Over Nonvectorial Variables / 1.2.3:
Statistical Pattern Analysis / 1.3:
Basic Probabilistic Concepts / 1.3.1:
Projection Methods / 1.3.2:
Supervised Classification / 1.3.3:
Unsupervised Classification / 1.3.4:
The Subspace Methods of Classification / 1.4:
The Basic Subspace Method / 1.4.1:
Adaptation of a Model Subspace to Input Subspace / 1.4.2:
The Learning Subspace Method (LSM) / 1.4.3:
Vector Quantization / 1.5:
Definitions / 1.5.1:
Derivation of the VQ Algorithm / 1.5.2:
Point Density in VQ / 1.5.3:
Dynamically Expanding Context / 1.6:
Setting Up the Problem / 1.6.1:
Automatic Determination of Context-Independent Productions / 1.6.2:
Conflict Bit / 1.6.3:
Construction of Memory for the Context-Dependent Productions / 1.6.4:
The Algorithm for the Correction of New Strings / 1.6.5:
Estimation Procedure for Unsuccessful Searches / 1.6.6:
Practical Experiments / 1.6.7:
Neural Modeling / 2:
Models, Paradigms, and Methods / 2.1:
A History of Some Main Ideas in Neural Modeling / 2.2:
Issues on Artificial Intelligence / 2.3:
On the Complexity of Biological Nervous Systems / 2.4:
What the Brain Circuits Are Not / 2.5:
Relation Between Biological and Artificial Neural Networks / 2.6:
What Functions of the Brain Are Usually Modeled? / 2.7:
When Do We Have to Use Neural Computing? / 2.8:
Transformation, Relaxation, and Decoder / 2.9:
Categories of ANNs / 2.10:
A Simple Nonlinear Dynamic Model of the Neuron / 2.11:
Three Phases of Development of Neural Models / 2.12:
Learning Laws / 2.13:
Hebb's Law / 2.13.1:
The Riccati-Type Learning Law / 2.13.2:
The PCA-Type Learning Law / 2.13.3:
Some Really Hard Problems / 2.14:
Brain Maps / 2.15:
The Basic SOM / 3:
A Qualitative Introduction to the SOM / 3.1:
The Original Incremental SOM Algorithm / 3.2:
The "Dot-Product SOM" / 3.3:
Other Preliminary Demonstrations of Topology-Preserving Mappings / 3.4:
Ordering of Reference Vectors in the Input Space / 3.4.1:
Demonstrations of Ordering of Responses in the Output Space / 3.4.2:
Basic Mathematical Approaches to Self-Organization / 3.5:
One-Dimensional Case / 3.5.1:
Constructive Proof of Ordering of Another One-Dimensional SOM / 3.5.2:
The Batch Map / 3.6:
Initialization of the SOM Algorithms / 3.7:
On the "Optimal" Learning-Rate Factor / 3.8:
Effect of the Form of the Neighborhood Function / 3.9:
Does the SOM Algorithm Ensue from a Distortion Measure? / 3.10:
An Attempt to Optimize the SOM / 3.11:
Point Density of the Model Vectors / 3.12:
Earlier Studies / 3.12.1:
Numerical Check of Point Densities in a Finite One-Dimensional SOM / 3.12.2:
Practical Advice for the Construction of Good Maps / 3.13:
Examples of Data Analyses Implemented by the SOM / 3.14:
Attribute Maps with Full Data Matrix / 3.14.1:
Case Example of Attribute Maps Based on Incomplete Data Matrices (Missing Data): "Poverty Map" / 3.14.2:
Using Gray Levels to Indicate Clusters in the SOM / 3.15:
Interpretation of the SOM Mapping / 3.16:
"Local Principal Components" / 3.16.1:
Contribution of a Variable to Cluster Structures / 3.16.2:
Speedup of SOM Computation / 3.17:
Shortcut Winner Search / 3.17.1:
Increasing the Number of Units in the SOM / 3.17.2:
Smoothing / 3.17.3:
Combination of Smoothing, Lattice Growing, and SOM Algorithm / 3.17.4:
Physiological Interpretation of SOM / 4:
Conditions for Abstract Feature Maps in the Brain / 4.1:
Two Different Lateral Control Mechanisms / 4.2:
The WTA Function, Based on Lateral Activity Control / 4.2.1:
Lateral Control of Plasticity / 4.2.2:
Learning Equation / 4.3:
System Models of SOM and Their Simulations / 4.4:
Recapitulation of the Features of the Physiological SOM Model / 4.5:
Similarities Between the Brain Maps and Simulated Feature Maps / 4.6:
Magnification / 4.6.1:
Imperfect Maps / 4.6.2:
Overlapping Maps / 4.6.3:
Variants of SOM / 5:
Overview of Ideas to Modify the Basic SOM / 5.1:
Adaptive Tensorial Weights / 5.2:
Tree-Structured SOM in Searching / 5.3:
Different Definitions of the Neighborhood / 5.4:
Neighborhoods in the Signal Space / 5.5:
Dynamical Elements Added to the SOM / 5.6:
The SOM for Symbol Strings / 5.7:
Initialization of the SOM for Strings / 5.7.1:
The Batch Map for Strings / 5.7.2:
Tie-Break Rules / 5.7.3:
A Simple Example: The SOM of Phonemic Transcriptions / 5.7.4:
Operator Maps / 5.8:
Evolutionary-Learning SOM / 5.9:
Evolutionary-Learning Filters / 5.9.1:
Self-Organization According to a Fitness Function / 5.9.2:
Supervised SOM / 5.10:
The Adaptive-Subspace SOM (ASSOM) / 5.11:
The Problem of Invariant Features / 5.11.1:
Relation Between Invariant Features and Linear Subspaces / 5.11.2:
The ASSOM Algorithm / 5.11.3:
Derivation of the ASSOM Algorithm by Stochastic Approximation / 5.11.4:
ASSOM Experiments / 5.11.5:
Feedback-Controlled Adaptive-Subspace SOM (FASSOM) / 5.12:
Learning Vector Quantization / 6:
Optimal Decision / 6.1:
The LVQ1 / 6.2:
The Optimized-Learning-Rate LVQ1 (OLVQ1) / 6.3:
The Batch-LVQ1 / 6.4:
The Batch-LVQ1 for Symbol Strings / 6.5:
The LVQ2 (LVQ 2.1) / 6.6:
The LVQ3 / 6.7:
Differences Between LVQ1, LVQ2 and LVQ3 / 6.8:
General Considerations / 6.9:
The Hypermap-Type LVQ / 6.10:
The "LVQ-SOM" / 6.11:
Applications / 7:
Preprocessing of Optic Patterns / 7.1:
Blurring / 7.1.1:
Expansion in Terms of Global Features / 7.1.2:
Spectral Analysis / 7.1.3:
Expansion in Terms of Local Features (Wavelets) / 7.1.4:
Recapitulation of Features of Optic Patterns / 7.1.5:
Acoustic Preprocessing / 7.2:
Process and Machine Monitoring / 7.3:
Selection of Input Variables and Their Scaling / 7.3.1:
Analysis of Large Systems / 7.3.2:
Diagnosis of Speech Voicing / 7.4:
Transcription of Continuous Speech / 7.5:
Texture Analysis / 7.6:
Contextual Maps / 7.7:
Artificially Generated Clauses / 7.7.1:
Natural Text / 7.7.2:
Organization of Large Document Files / 7.8:
Statistical Models of Documents / 7.8.1:
Construction of Very Large WEBSOM Maps by the Projection Method / 7.8.2:
The WEBSOM of All Electronic Patent Abstracts / 7.8.3:
Robot-Arm Control / 7.9:
Simultaneous Learning of Input and Output Parameters / 7.9.1:
Another Simple Robot-Arm Control / 7.9.2:
Telecommunications / 7.10:
Adaptive Detector for Quantized Signals / 7.10.1:
Channel Equalization in the Adaptive QAM / 7.10.2:
Error-Tolerant Transmission of Images by a Pair of SOMs / 7.10.3:
The SOM as an Estimator / 7.11:
Symmetric (Autoassociative) Mapping / 7.11.1:
Asymmetric (Heteroassociative) Mapping / 7.11.2:
Software Tools for SOM / 8:
Necessary Requirements / 8.1:
Desirable Auxiliary Features / 8.2:
SOM Program Packages / 8.3:
SOM_PAK / 8.3.1:
SOM Toolbox / 8.3.2:
Nenet (Neural Networks Tool) / 8.3.3:
Viscovery SOMine / 8.3.4:
Examples of the Use of SOM_PAK / 8.4:
File Formats / 8.4.1:
Description of the Programs in SOM_PAK / 8.4.2:
A Typical Training Sequence / 8.4.3:
Neural-Networks Software with the SOM Option / 8.5:
Hardware for SOM / 9:
An Analog Classifier Circuit / 9.1:
Fast Digital Classifier Circuits / 9.2:
SIMD Implementation of SOM / 9.3:
Transputer Implementation of SOM / 9.4:
Systolic-Array Implementation of SOM / 9.5:
The COKOS Chip / 9.6:
The TInMANN Chip / 9.7:
NBISOM_25 Chip / 9.8:
An Overview of SOM Literature / 10:
Books and Review Articles / 10.1:
Early Works on Competitive Learning / 10.2:
Status of the Mathematical Analyses / 10.3:
Zero-Order Topology (Classical VQ) Results / 10.3.1:
Alternative Topological Mappings / 10.3.2:
Alternative Architectures / 10.3.3:
Functional Variants / 10.3.4:
Theory of the Basic SOM / 10.3.5:
The Learning Vector Quantization / 10.4:
Diverse Applications of SOM / 10.5:
Machine Vision and Image Analysis / 10.5.1:
Optical Character and Script Reading / 10.5.2:
Speech Analysis and Recognition / 10.5.3:
Acoustic and Musical Studies / 10.5.4:
Signal Processing and Radar Measurements / 10.5.5:
Industrial and Other Real-World Measurements / 10.5.6:
Process Control / 10.5.8:
Robotics / 10.5.9:
Electronic-Circuit Design / 10.5.10:
Physics / 10.5.11:
Chemistry / 10.5.12:
Biomedical Applications Without Image Processing / 10.5.13:
Neurophysiological Research / 10.5.14:
Data Processing and Analysis / 10.5.15:
Linguistic and AI Problems / 10.5.16:
Mathematical and Other Theoretical Problems / 10.5.17:
Applications of LVQ / 10.6:
Survey of SOM and LVQ Implementations / 10.7:
Glossary of "Neural" Terms / 11:
References
Index
6.
Book
Teuvo Kohonen
Publication info: Berlin : Springer-Verlag, c1980  xi, 368 p. ; 24 cm
Series: Springer series in information sciences ; 1
7.
Book
Teuvo Kohonen
Publication info: Berlin ; New York : Springer-Verlag, c1977  ix, 176 p. ; 25 cm
Series: Communication and cybernetics ; 17
8.
Book
Teuvo Kohonen
Publication info: Berlin ; Tokyo : Springer-Verlag, c1989  xv, 312 p. ; 24 cm
Series: Springer series in information sciences ; 8
9.
Book
By T. Kohonen; supervised by Heizo Tokutaka [et al.]
Publication info: Tokyo : Maruzen Publishing, 2012.6  xvii, 479p ; 25cm
Table of contents:
Mathematical Preliminaries
Neural Modeling
The Basic SOM
Physiological Interpretation of SOM
Variants of SOM
Learning Vector Quantization
Applications
Software Tools for SOM
Hardware for SOM
An Overview of SOM Literature
Glossary of "Neural" Terms
Summary: Visual information processing modeled on the functions of the brain, applicable in every field, to things that move and things that do not, in engineering of course, but also in medicine, agriculture, and even the social sciences. This book gives an accessible account, from basic concepts through applications, of the author Kohonen's original neural-network paradigm, Self-Organizing Maps (SOM), known as the definitive treatment of the subject. This revised edition reflects the major revision made for the third edition of the original: a chapter on software tools has been added, further information useful in practice has been included, and each chapter has been updated with the latest information and rewritten to give more thorough explanations and a broader perspective.
10.
Book
Teuvo Kohonen
Publication info: Berlin ; Tokyo : Springer-Verlag, 1984  xii, 255 p. ; 24 cm
Series: Springer series in information sciences ; 8