1.

Book
edited by Erkki Oja and Samuel Kaski
Publication information: Amsterdam : Elsevier, 1999. ix, 390 p. ; 25 cm
Table of contents:
Selected papers only
Preface / Kohonen Maps
Analyzing and representing multidimensional quantitative and qualitative data: Demographic study of the Rhône valley. The domestic consumption of the Canadian families / M. Cottrell ; P. Gaubert ; P. Letremy ; P. Rousset
Value maps: Finding value in markets that are expensive / G.J. Deboeck
Data mining and knowledge discovery with emergent Self-Organizing Feature Maps for multivariate time series / A. Ultsch
Tree structured Self-Organizing Maps / P. Koikkalainen
On the optimization of Self-Organizing Maps by genetic algorithms / D. Polani
Self organization of a massive text document collection / T. Kohonen ; S. Kaski ; K. Lagus ; J. Salojärvi ; J. Honkela ; V. Paatero ; A. Saarela
Document classification with Self-Organizing Maps / D. Merkl
Navigation in databases using Self-Organizing Maps / S.A. Shumsky
Self-Organising Maps in computer aided design of electronic circuits / A. Hemani ; A. Postula
Modeling self-organization in the visual cortex / R. Miikkulainen ; J.A. Bednar ; Y. Choe ; J. Sirosh
A spatio-temporal memory based on SOMs with activity diffusion / N.R. Euliano ; J.C. Principe
Advances in modeling cortical maps / P.G. Morasso ; V. Sanguineti ; F. Frisone
Topology preservation in Self-Organizing Maps / T. Villmann
Second-order learning in Self-Organizing Maps / R. Der ; M. Herrmann
Energy functions for Self-Organizing Maps / T. Heskes
LVQ and single trial EEG classification / G. Pfurtscheller ; M. Pregenzer
Self-Organizing Map in categorization of voice qualities / L. Leinonen
Self-Organizing Map in analysis of large-scale industrial systems / O. Simula ; J. Ahola ; E. Alhoniemi ; J. Himberg ; J. Vesanto
Keyword index
2.

Book
Witold Pedrycz
Publication information: Boca Raton, Fla. : CRC Press, c1998. 284 p. ; 26 cm
3.

Book
Pierre Baldi, Søren Brunak
Publication information: Cambridge, Mass. : The MIT Press, 1998. xviii, 351 p., [8] p. of plates ; 24 cm
Series: Adaptive computation and machine learning
Bradford book
Table of contents:
Series Foreword
Preface
Introduction / 1:
Biological Data in Digital Symbol Sequences / 1.1:
Genomes--Diversity, Size, and Structure / 1.2:
Proteins and Proteomes / 1.3:
On the Information Content of Biological Sequences / 1.4:
Prediction of Molecular Function and Structure / 1.5:
Machine Learning Foundations: The Probabilistic Framework / 2:
Introduction: Bayesian Modeling / 2.1:
The Cox-Jaynes Axioms / 2.2:
Bayesian Inference and Induction / 2.3:
Model Structures: Graphical Models and Other Tricks / 2.4:
Summary / 2.5:
Probabilistic Modeling and Inference: Examples / 3:
The Simplest Sequence Models / 3.1:
Statistical Mechanics / 3.2:
Machine Learning Algorithms / 4:
Dynamic Programming / 4.1:
Gradient Descent / 4.3:
EM/GEM Algorithms / 4.4:
Markov Chain Monte Carlo Methods / 4.5:
Simulated Annealing / 4.6:
Evolutionary and Genetic Algorithms / 4.7:
Learning Algorithms: Miscellaneous Aspects / 4.8:
Neural Networks: The Theory / 5:
Universal Approximation Properties / 5.1:
Priors and Likelihoods / 5.3:
Learning Algorithms: Backpropagation / 5.4:
Neural Networks: Applications / 6:
Sequence Encoding and Output Interpretation / 6.1:
Prediction of Protein Secondary Structure / 6.2:
Prediction of Signal Peptides and Their Cleavage Sites / 6.3:
Applications for DNA and RNA Nucleotide Sequences / 6.4:
Hidden Markov Models: The Theory / 7:
Prior Information and Initialization / 7.1:
Likelihood and Basic Algorithms / 7.3:
Learning Algorithms / 7.4:
Applications of HMMs: General Aspects / 7.5:
Hidden Markov Models: Applications / 8:
Protein Applications / 8.1:
DNA and RNA Applications / 8.2:
Conclusion: Advantages and Limitations of HMMs / 8.3:
Hybrid Systems: Hidden Markov Models and Neural Networks / 9:
Introduction to Hybrid Models / 9.1:
The Single-Model Case / 9.2:
The Multiple-Model Case / 9.3:
Simulation Results / 9.4:
Probabilistic Models of Evolution: Phylogenetic Trees / 9.5:
Introduction to Probabilistic Models of Evolution / 10.1:
Substitution Probabilities and Evolutionary Rates / 10.2:
Rates of Evolution / 10.3:
Data Likelihood / 10.4:
Optimal Trees and Learning / 10.5:
Parsimony / 10.6:
Extensions / 10.7:
Stochastic Grammars and Linguistics / 11:
Introduction to Formal Grammars / 11.1:
Formal Grammars and the Chomsky Hierarchy / 11.2:
Applications of Grammars to Biological Sequences / 11.3:
Likelihood / 11.4:
Applications of SCFGs / 11.6:
Experiments / 11.8:
Future Directions / 11.9:
Internet Resources and Public Databases / 12:
A Rapidly Changing Set of Resources / 12.1:
Databases over Databases and Tools / 12.2:
Databases over Databases / 12.3:
Databases / 12.4:
Sequence Similarity Searches / 12.5:
Alignment / 12.6:
Selected Prediction Servers / 12.7:
Molecular Biology Software Links / 12.8:
Ph.D. Courses over the Internet / 12.9:
HMM/NN Simulator / 12.10:
Statistics / A:
Decision Theory and Loss Functions / A.1:
Quadratic Loss Functions / A.2:
The Bias/Variance Trade-off / A.3:
Combining Estimators / A.4:
Error Bars / A.5:
Sufficient Statistics / A.6:
Exponential Family / A.7:
Gaussian Process Models / A.8:
Variational Methods / A.9:
Information Theory, Entropy, and Relative Entropy / B:
Entropy / B.1:
Relative Entropy / B.2:
Mutual Information / B.3:
Jensen's Inequality / B.4:
Maximum Entropy / B.5:
Minimum Relative Entropy / B.6:
Probabilistic Graphical Models / C:
Notation and Preliminaries / C.1:
The Undirected Case: Markov Random Fields / C.2:
The Directed Case: Bayesian Networks / C.3:
HMM Technicalities, Scaling, Periodic Architectures, State Functions, and Dirichlet Mixtures / D:
Scaling / D.1:
Periodic Architectures / D.2:
State Functions: Bendability / D.3:
Dirichlet Mixtures / D.4:
List of Main Symbols and Abbreviations / E:
References
Index
4.

Book
edited by Cornelius T. Leondes
Publication information: San Diego : Academic Press, c1998. xxix, 460 p. ; 24 cm
Series: Neural network systems techniques and applications ; vol. 1
5.

Book
edited by Omid Omidvar, Patrick van der Smagt
Publication information: San Diego ; Tokyo : Academic Press, c1997. xvii, 346 p. ; 24 cm
Table of contents:
Neural Network Sonar as a Perceptual Modality for Robotics / W.T. Miller III ; A.L. Kun
Dynamic Balance of a Biped Walking Robot / P. van der Smagt ; F. Groen
Visual Feedback in Motion / D. DeMers ; K. Kreutz-Delgado
Inverse Kinematics of Dextrous Manipulators / Y. Jin ; T. Pipe ; A. Winfield
Stable Manipulator Trajectory Control Using Neural Networks / P. Gaudiano ; F.H. Guenther ; E. Zalama
The Neural Dynamics Approach to Sensory-Motor Control / A. Buhlmeier ; G. Manteuffel
Operant Conditioning in Robots / B. Hallam ; J. Hallam ; G. Hayes
A Dynamic Net for Robot Control / Ben Kröse ; J. van Dam
Neural Vehicles / J. Heikkonen ; P. Koikkalainen
Self-Organization and Autonomous Robots
6.

Book
James M. Bower and David Beeman
Publication information: Santa Clara, Calif. : TELOS, Springer-Verlag, c1995. xx, 409 p. ; 24 cm
7.

Book
edited by A.M.S. Zalzala and A.S. Morris
Publication information: New York ; Tokyo : Ellis Horwood, 1996. viii, 278 p. ; 25 cm
8.

Book
Kenneth Hunt, George Irwin and Kevin Warwick (eds.)
Publication information: Berlin ; New York : Springer, c1995. 278 p. ; 24 cm
Series: Advances in industrial control
9.

Book
B.D. Ripley
Publication information: New York : Cambridge University Press, 1996. xi, 403 p. ; 26 cm
Table of contents:
Introduction and examples / 1:
Statistical decision theory / 2:
Linear discriminant analysis / 3:
Flexible discriminants / 4:
Feed-forward neural networks / 5:
Non-parametric methods / 6:
Tree-structured classifiers / 7:
Belief networks / 8:
Unsupervised methods / 9:
Finding good pattern features / 10:
Statistical sidelines / Appendix:
Glossary
References
Author index
Subject index
10.

Book
N. K. Bose, P. Liang
Publication information: New York ; Tokyo : McGraw-Hill, c1996. xxxiii, 478 p. ; 25 cm
Series: McGraw-Hill series in electrical and computer engineering. Communications and signal processing
11.

Book
by Anne-Johan Annema
Publication information: Boston : Kluwer Academic Publishers, c1995. xiii, 238 p. ; 25 cm
Series: The Kluwer international series in engineering and computer science ; Analog circuits and signal processing
12.

Book
Bahram Nabet, Robert B. Pinter
Publication information: Boca Raton : CRC Press, c1991. xi, 182 p. ; 25 cm
13.

Book
edited by Gail A. Carpenter and Stephen Grossberg
Publication information: Cambridge, Mass. ; London : MIT Press, c1992. 467 p. ; 26 cm
14.

Book
Tomas Hrycej
Publication information: New York, NY : Wiley, c1992. xiii, 235 p. ; 25 cm
Series: Sixth-generation computer technology series
15.

Book
Paolo Antognetti and Veljko Milutinović, editors
Publication information: Englewood Cliffs, N.J. : Prentice Hall, 1991. 4 v. ; 24 cm
Series: Prentice Hall advanced reference series. Engineering
16.

Book
by David P. Morgan, Christopher L. Scofield ; foreword by Leon N. Cooper
Publication information: Boston : Kluwer Academic Publishers, c1991. xvi, 391 p. ; 25 cm
Series: The Kluwer international series in engineering and computer science. VLSI, computer architecture, and digital signal processing
17.

Book
edited by Harry Wechsler
Publication information: Boston : Academic Press, c1992. xix, 363 p. ; 24 cm
Series: Neural networks for perception / edited by Harry Wechsler ; v. 2
18.

Book
Yi-Tong Zhou, Rama Chellappa
Publication information: New York ; Tokyo : Springer-Verlag, c1992. xi, 170 p. ; 24 cm
Series: Research notes in neural computing ; v. 5
19.

Book
Halbert White with A.R. Gallant ... [et al.]
Publication information: Cambridge, Mass., USA ; Oxford : Blackwell, 1992. x, 329 p.
20.

Book
edited by M.A. Arbib and J.A. Robinson
Publication information: Cambridge, MA : MIT Press, c1990. x, 345 p. ; 24 cm
21.

Book
by Hervé A. Bourlard, Nelson Morgan ; foreword by Richard Lippmann
Publication information: Boston : Kluwer Academic Publishers, c1994. xxviii, 312 p. ; 25 cm
Series: The Kluwer international series in engineering and computer science ; SECS 247. VLSI, computer architecture, and digital signal processing
Table of contents:
List of Figures
List of Tables
Notation
Foreword
Preface
Background / I:
Introduction / 1:
Statistical Pattern Classification / 2:
Hidden Markov Models / 3:
Multilayer Perceptrons / 4:
Hybrid HMM/MLP Systems / II:
Speech Recognition using ANNs / 5:
Statistical Inference in MLPs / 6:
The Hybrid HMM/MLP Approach / 7:
Experimental Systems / 8:
Context-Dependent MLPs / 9:
System Tradeoffs / 10:
Training Hardware and Software / 11:
Additional Topics / III:
Cross-Validation in MLP Training / 12:
HMM/MLP and Predictive Models / 13:
Feature Extraction by MLP / 14:
Finale / IV:
Final System Overview / 15:
Conclusions / 16:
Bibliography
Index
Acronyms
22.

Book
edited by Richard J. Mammone
Publication information: London ; New York : Chapman & Hall, 1994. xx, 586 p.
Series: Chapman & Hall neural computing ; 4
23.

Book
Albert Nigrin
Publication information: Cambridge, Mass. : MIT Press, c1993. xvii, 413 p. ; 24 cm
24.

Book
Paul John Werbos
Publication information: New York : J. Wiley & Sons, c1994. xii, 319 p. ; 25 cm
Series: Adaptive and learning systems for signal processing, communications, and control
A Wiley-Interscience publication
Table of contents:
Thesis
Beyond Regression: New Tools for Prediction and Analysis in the Behavioral Sciences
Dynamic Feedback, Statistical Estimation, and Systems Optimization: General Techniques
The Multivariate ARMA(1,1) Model: Its Significance and Estimation
Simulation Studies of Techniques of Time-Series Analysis
General Applications of These Ideas: Practical Hazards and New Possibilities
Nationalism and Social Communications: A Test Case for Mathematical Approaches
Applications and Extensions
Forms of Backpropagation for Sensitivity Analysis, Optimization, and Neural Networks
Backpropagation Through Time: What It Does and How to Do It
Neurocontrol: Where It Is Going and Why It Is Crucial
Neural Networks and the Human Mind: New Mathematics Fits Humanistic Insight
Index
25.

Book
P.J.G. Lisboa, M.J. Taylor
Publication information: New York : Ellis Horwood, 1993. 307 p. ; 25 cm
Series: Ellis Horwood workshop series
26.

Book
Derong Liu and Anthony N. Michel
Publication information: London ; Berlin ; New York : Springer-Verlag, c1994. xiv, 191 p. ; 24 cm
Series: Lecture notes in control and information sciences ; 195
27.

Book
edited by Alan F. Murray
Publication information: Dordrecht ; Boston : Kluwer Academic Publishers, c1995. xii, 322 p. ; 25 cm
28.

Book
by Bing J. Sheu, Joongho Choi ; with special assistance from Robert C. Chang ... [et al.]
Publication information: Boston : Kluwer Academic Publishers, c1995. xix, 559 p. ; 25 cm
Series: The Kluwer international series in engineering and computer science ; SECS 304
29.

Book
Stephen I. Gallant
Publication information: Cambridge, Mass. : MIT Press, c1993. xvi, 365 p. ; 24 cm
Series: Bradford book
Table of contents:
Foreword
Basics / I:
Introduction and Important Definitions / 1:
Why Connectionist Models? / 1.1:
The Grand Goals of AI and Its Current Impasse / 1.1.1:
The Computational Appeal of Neural Networks / 1.1.2:
The Structure of Connectionist Models / 1.2:
Network Properties / 1.2.1:
Cell Properties / 1.2.2:
Dynamic Properties / 1.2.3:
Learning Properties / 1.2.4:
Two Fundamental Models: Multilayer Perceptrons (MLP's) and Backpropagation Networks (BPN's) / 1.3:
Multilayer Perceptrons (MLP's) / 1.3.1:
Backpropagation Networks (BPN's) / 1.3.2:
Gradient Descent / 1.4:
The Algorithm / 1.4.1:
Practical Problems / 1.4.2:
Comments / 1.4.3:
Historic and Bibliographic Notes / 1.5:
Early Work / 1.5.1:
The Decline of the Perceptron / 1.5.2:
The Rise of Connectionist Research / 1.5.3:
Other Bibliographic Notes / 1.5.4:
Exercises / 1.6:
Programming Project / 1.7:
Representation Issues / 2:
Representing Boolean Functions / 2.1:
Equivalence of {+1, -1,0} and {1,0} Forms / 2.1.1:
Single-Cell Models / 2.1.2:
Nonseparable Functions / 2.1.3:
Representing Arbitrary Boolean Functions / 2.1.4:
Representing Boolean Functions Using Continuous Connectionist Models / 2.1.5:
Distributed Representations / 2.2:
Definition / 2.2.1:
Storage Efficiency and Resistance to Error / 2.2.2:
Superposition / 2.2.3:
Learning / 2.2.4:
Feature Spaces and ISA Relations / 2.3:
Feature Spaces / 2.3.1:
Concept-Function Unification / 2.3.2:
ISA Relations / 2.3.3:
Binding / 2.3.4:
Representing Real-Valued Functions / 2.4:
Approximating Real Numbers by Collections of Discrete Cells / 2.4.1:
Precision / 2.4.2:
Approximating Real Numbers by Collections of Continuous Cells / 2.4.3:
Example: Taxtime! / 2.5:
Programming Projects / 2.6:
Learning In Single-Layer Models / II:
Perceptron Learning and the Pocket Algorithm / 3:
Perceptron Learning for Separable Sets of Training Examples / 3.1:
Statement of the Problem / 3.1.1:
Computing the Bias / 3.1.2:
The Perceptron Learning Algorithm / 3.1.3:
Perceptron Convergence Theorem / 3.1.4:
The Perceptron Cycling Theorem / 3.1.5:
The Pocket Algorithm for Nonseparable Sets of Training Examples / 3.2:
Problem Statement / 3.2.1:
Perceptron Learning Is Poorly Behaved / 3.2.2:
The Pocket Algorithm / 3.2.3:
Ratchets / 3.2.4:
Examples / 3.2.5:
Noisy and Contradictory Sets of Training Examples / 3.2.6:
Rules / 3.2.7:
Implementation Considerations / 3.2.8:
Proof of the Pocket Convergence Theorem / 3.2.9:
Khachiyan's Linear Programming Algorithm / 3.3:
Winner-Take-All Groups or Linear Machines / 3.4:
Generalizes Single-Cell Models / 4.1:
Perceptron Learning for Winner-Take-All Groups / 4.2:
The Pocket Algorithm for Winner-Take-All Groups / 4.3:
Kessler's Construction, Perceptron Cycling, and the Pocket Algorithm Proof / 4.4:
Independent Training / 4.5:
Autoassociators and One-Shot Learning / 4.6:
Linear Autoassociators and the Outer-Product Training Rule / 5.1:
Anderson's BSB Model / 5.2:
Hopfield's Model / 5.3:
Energy / 5.3.1:
The Traveling Salesman Problem / 5.4:
The Cohen-Grossberg Theorem / 5.5:
Kanerva's Model / 5.6:
Autoassociative Filtering for Feedforward Networks / 5.7:
Concluding Remarks / 5.8:
Mean Squared Error (MSE) Algorithms / 5.9:
Motivation / 6.1:
MSE Approximations / 6.2:
The Widrow-Hoff Rule or LMS Algorithm / 6.3:
Number of Training Examples Required / 6.3.1:
Adaline / 6.4:
Adaptive Noise Cancellation / 6.5:
Decision-Directed Learning / 6.6:
Unsupervised Learning / 6.7:
Introduction / 7.1:
No Teacher / 7.1.1:
Clustering Algorithms / 7.1.2:
k-Means Clustering / 7.2:
Topology-Preserving Maps / 7.2.1:
Example / 7.3.1:
Demonstrations / 7.3.4:
Dimensionality, Neighborhood Size, and Final Comments / 7.3.5:
Art1 / 7.4:
Important Aspects of the Algorithm / 7.4.1:
Art2 / 7.4.2:
Using Clustering Algorithms for Supervised Learning / 7.6:
Labeling Clusters / 7.6.1:
ARTMAP or Supervised ART / 7.6.2:
Learning In Multilayer Models / 7.7:
The Distributed Method and Radial Basis Functions / 8:
Rosenblatt's Approach / 8.1:
The Distributed Method / 8.2:
Cover's Formula / 8.2.1:
Robustness-Preserving Functions / 8.2.2:
Hepatobiliary Data / 8.3:
Artificial Data / 8.3.2:
How Many Cells? / 8.4:
Pruning Data / 8.4.1:
Leave-One-Out / 8.4.2:
Radial Basis Functions / 8.5:
A Variant: The Anchor Algorithm / 8.6:
Scaling, Multiple Outputs, and Parallelism / 8.7:
Scaling Properties / 8.7.1:
Multiple Outputs and Parallelism / 8.7.2:
A Computational Speedup for Learning / 8.7.3:
Computational Learning Theory and the BRD Algorithm / 8.7.4:
Introduction to Computational Learning Theory / 9.1:
PAC-Learning / 9.1.1:
Bounded Distributed Connectionist Networks / 9.1.2:
Probabilistic Bounded Distributed Concepts / 9.1.3:
A Learning Algorithm for Probabilistic Bounded Distributed Concepts / 9.2:
The BRD Theorem / 9.3:
Polynomial Learning / 9.3.1:
Noisy Data and Fallback Estimates / 9.4:
Vapnik-Chervonenkis Bounds / 9.4.1:
Hoeffding and Chernoff Bounds / 9.4.2:
Pocket Algorithm / 9.4.3:
Additional Training Examples / 9.4.4:
Bounds for Single-Layer Algorithms / 9.5:
Fitting Data by Limiting the Number of Iterations / 9.6:
Discussion / 9.7:
Exercise / 9.8:
Constructive Algorithms / 9.9:
The Tower and Pyramid Algorithms / 10.1:
The Tower Algorithm / 10.1.1:
Proof of Convergence / 10.1.2:
A Computational Speedup / 10.1.4:
The Pyramid Algorithm / 10.1.5:
The Cascade-Correlation Algorithm / 10.2:
The Tiling Algorithm / 10.3:
The Upstart Algorithm / 10.4:
Other Constructive Algorithms and Pruning / 10.5:
Easy Learning Problems / 10.6:
Decomposition / 10.6.1:
Expandable Network Problems / 10.6.2:
Limits of Easy Learning / 10.6.3:
Backpropagation / 10.7:
The Backpropagation Algorithm / 11.1:
Statement of the Algorithm / 11.1.1:
A Numerical Example / 11.1.2:
Derivation / 11.2:
Practical Considerations / 11.3:
Determination of Correct Outputs / 11.3.1:
Initial Weights / 11.3.2:
Choice of r / 11.3.3:
Momentum / 11.3.4:
Network Topology / 11.3.5:
Local Minima / 11.3.6:
Activations in [0,1] versus [-1, 1] / 11.3.7:
Update after Every Training Example / 11.3.8:
Other Squashing Functions / 11.3.9:
NP-Completeness / 11.4:
Overuse / 11.5:
Interesting Intermediate Cells / 11.5.2:
Continuous Outputs / 11.5.3:
Probability Outputs / 11.5.4:
Using Backpropagation to Train Multilayer Perceptrons / 11.5.5:
Backpropagation: Variations and Applications / 11.6:
NETtalk / 12.1:
Input and Output Representations / 12.1.1:
Experiments / 12.1.2:
Backpropagation through Time / 12.1.3:
Handwritten Character Recognition / 12.3:
Neocognitron Architecture / 12.3.1:
The Network / 12.3.2:
Robot Manipulator with Excess Degrees of Freedom / 12.3.3:
The Problem / 12.4.1:
Training the Inverse Network / 12.4.2:
Plan Units / 12.4.3:
Simulated Annealing and Boltzmann Machines / 12.4.4:
Simulated Annealing / 13.1:
Boltzmann Machines / 13.2:
The Boltzmann Model / 13.2.1:
Boltzmann Learning / 13.2.2:
The Boltzmann Algorithm and Noise Clamping / 13.2.3:
Example: The 4-2-4 Encoder Problem / 13.2.4:
Remarks / 13.3:
Neural Network Expert Systems / 13.4:
Expert Systems and Neural Networks / 14:
Expert Systems / 14.1:
What Is an Expert System? / 14.1.1:
Why Expert Systems? / 14.1.2:
Historically Important Expert Systems / 14.1.3:
Critique of Conventional Expert Systems / 14.1.4:
Neural Network Decision Systems / 14.2:
Example: Diagnosis of Acute Coronary Occlusion / 14.2.1:
Example: Autonomous Navigation / 14.2.2:
Other Examples / 14.2.3:
Decision Systems versus Expert Systems / 14.2.4:
MACIE, and an Example Problem / 14.3:
Diagnosis and Treatment of Acute Sarcophagal Disease / 14.3.1:
Network Generation / 14.3.2:
Sample Run of Macie / 14.3.3:
Real-Valued Variables and Winner-Take-All Groups / 14.3.4:
Not-Yet-Known versus Unavailable Variables / 14.3.5:
Applicability of Neural Network Expert Systems / 14.4:
Details of the MACIE System / 14.5:
Inferencing and Forward Chaining / 15.1:
Discrete Multilayer Perceptron Models / 15.1.1:
Continuous Variables / 15.1.2:
Winner-Take-All Groups / 15.1.3:
Using Prior Probabilities for More Aggressive Inferencing / 15.1.4:
Confidence Estimation / 15.2:
A Confidence Heuristic Prior to Inference / 15.2.1:
Confidence in Inferences / 15.2.2:
Information Acquisition and Backward Chaining / 15.3:
Concluding Comment / 15.4:
Noise, Redundancy, Fault Detection, and Bayesian Decision Theory / 15.5:
The High Tech Lemonade Corporation's Problem / 16.1:
The Deep Model and the Noise Model / 16.2:
Generating the Expert System / 16.3:
Probabilistic Analysis / 16.4:
Noisy Single-Pattern Boolean Fault Detection Problems / 16.5:
Convergence Theorem / 16.6:
Extracting Rules from Networks / 16.7:
Why Rules? / 17.1:
What Kind of Rules? / 17.2:
Criteria / 17.2.1:
Inference Justifications versus Rule Sets / 17.2.2:
Which Variables in Conditions / 17.2.3:
Inference Justifications / 17.3:
MACIE's Algorithm / 17.3.1:
The Removal Algorithm / 17.3.2:
Key Factor Justifications / 17.3.3:
Justifications for Continuous Models / 17.3.4:
Rule Sets / 17.4:
Limiting the Number of Conditions / 17.4.1:
Approximating Rules / 17.4.2:
Conventional + Neural Network Expert Systems / 17.5:
Debugging an Expert System Knowledge Base / 17.5.1:
The Short-Rule Debugging Cycle / 17.5.2:
Appendix Representation Comparisons / 17.6:
DNF Expressions and Polynomial Representability / A.1:
DNF Expressions / A.1.1:
Polynomial Representability / A.1.2:
Space Comparison of MLP and DNF Representations / A.1.3:
Speed Comparison of MLP and DNF Representations / A.1.4:
MLP versus DNF Representations / A.1.5:
Decision Trees / A.2:
Representing Decision Trees by MLP's / A.2.1:
Speed Comparison / A.2.2:
Decision Trees versus MLP's / A.2.3:
p-l Diagrams / A.3:
Symmetric Functions and Depth Complexity / A.4:
Bibliography / A.5:
Index
30.

Book
Larry R. Medsker
Publication information: Boston : Kluwer Academic, c1994. 240 p. ; 25 cm
31.

Book
Bart Kosko, editor
Publication information: Englewood Cliffs, N.J. : Prentice Hall, c1992. xv, 399 p. ; 25 cm