1.

eBook

Timo Koski, John M. Noble
Publication info: Wiley Online Library - AutoHoldings Books, John Wiley & Sons, Inc., 2009
Table of contents:
Preface
Graphical models and probabilistic reasoning / 1:
Introduction / 1.1:
Axioms of probability and basic notations / 1.2:
The Bayes update of probability / 1.3:
Inductive learning / 1.4:
Bayes' rule / 1.4.1:
Jeffrey's rule / 1.4.2:
Pearl's method of virtual evidence / 1.4.3:
Interpretations of probability and Bayesian networks / 1.5:
Learning as inference about parameters / 1.6:
Bayesian statistical inference / 1.7:
Tossing a thumb-tack / 1.8:
Multinomial sampling and the Dirichlet integral / 1.9:
Notes
Exercises: Probabilistic theories of causality, Bayes' rule, multinomial sampling and the Dirichlet density
Conditional independence, graphs and d-separation / 2:
Joint probabilities / 2.1:
Conditional independence / 2.2:
Directed acyclic graphs and d-separation / 2.3:
Graphs / 2.3.1:
Directed acyclic graphs and probability distributions / 2.3.2:
The Bayes ball / 2.4:
Illustrations / 2.4.1:
Potentials / 2.5:
Bayesian networks / 2.6:
Object-oriented Bayesian networks / 2.7:
d-Separation and conditional independence / 2.8:
Markov models and Bayesian networks / 2.9:
I-maps and Markov equivalence / 2.10:
The trek and a distribution without a faithful graph / 2.10.1:
Exercises: Conditional independence and d-separation
Evidence, sufficiency and Monte Carlo methods / 3:
Hard evidence / 3.1:
Soft evidence and virtual evidence / 3.2:
Queries in probabilistic inference / 3.3:
The chest clinic problem / 3.3.1:
Bucket elimination / 3.4:
Bayesian sufficient statistics and prediction sufficiency / 3.5:
Bayesian sufficient statistics / 3.5.1:
Prediction sufficiency / 3.5.2:
Prediction sufficiency for a Bayesian network / 3.5.3:
Time variables / 3.6:
A brief introduction to Markov chain Monte Carlo methods / 3.7:
Simulating a Markov chain / 3.7.1:
Irreducibility, aperiodicity and time reversibility / 3.7.2:
The Metropolis-Hastings algorithm / 3.7.3:
The one-dimensional discrete Metropolis algorithm / 3.7.4:
Exercises: Evidence, sufficiency and Monte Carlo methods
Decomposable graphs and chain graphs / 4:
Definitions and notations / 4.1:
Decomposable graphs and triangulation of graphs / 4.2:
Junction trees / 4.3:
Markov equivalence / 4.4:
Markov equivalence, the essential graph and chain graphs / 4.5:
Exercises: Decomposable graphs and chain graphs
Learning the conditional probability potentials / 5:
Initial illustration: maximum likelihood estimate for a fork connection / 5.1:
The maximum likelihood estimator for multinomial sampling / 5.2:
MLE for the parameters in a DAG: the general setting / 5.3:
Updating, missing data, fractional updating / 5.4:
Exercises: Learning the conditional probability potentials
Learning the graph structure / 6:
Assigning a probability distribution to the graph structure / 6.1:
Markov equivalence and consistency / 6.2:
Establishing the DAG isomorphic property / 6.2.1:
Reducing the size of the search / 6.3:
The Chow-Liu tree / 6.3.1:
The Chow-Liu tree: A predictive approach / 6.3.2:
The K2 structural learning algorithm / 6.3.3:
The MMHC algorithm / 6.3.4:
Monte Carlo methods for locating the graph structure / 6.4:
Women in mathematics / 6.5:
Exercises: Learning the graph structure
Parameters and sensitivity / 7:
Changing parameters in a network / 7.1:
Measures of divergence between probability distributions / 7.2:
The Chan-Darwiche distance measure / 7.3:
Comparison with the Kullback-Leibler divergence and Euclidean distance / 7.3.1:
Global bounds for queries / 7.3.2:
Applications to updating / 7.3.3:
Parameter changes to satisfy query constraints / 7.4:
Binary variables / 7.4.1:
The sensitivity of queries to parameter changes / 7.5:
Exercises: Parameters and sensitivity
Graphical models and exponential families / 8:
Introduction to exponential families / 8.1:
Standard examples of exponential families / 8.2:
Noisy 'or' as an exponential family / 8.3:
Properties of the log partition function / 8.5:
Fenchel-Legendre conjugate / 8.6:
Kullback-Leibler divergence / 8.7:
Mean field theory / 8.8:
Conditional Gaussian distributions / 8.9:
CG potentials / 8.9.1:
Some results on marginalization / 8.9.2:
CG regression / 8.9.3:
Exercises: Graphical models and exponential families
Causality and intervention calculus / 9:
Conditioning by observation and by intervention / 9.1:
The intervention calculus for a Bayesian network / 9.3:
Establishing the model via a controlled experiment / 9.3.1:
Properties of intervention calculus / 9.4:
Transformations of probability / 9.5:
A note on the order of 'see' and 'do' conditioning / 9.6:
The 'Sure Thing' principle / 9.7:
Back-door criterion, confounding and identifiability / 9.8:
Exercises: Causality and intervention calculus
The junction tree and probability updating / 10:
Probability updating using a junction tree / 10.1:
Potentials and the distributive law / 10.2:
Marginalization and the distributive law / 10.2.1:
Elimination and domain graphs / 10.3:
Factorization along an undirected graph / 10.4:
Factorizing along a junction tree / 10.5:
Flow of messages: initial illustration / 10.5.1:
Local computation on junction trees / 10.6:
Schedules / 10.7:
Local and global consistency / 10.8:
Message passing for conditional Gaussian distributions / 10.9:
Using a junction tree with virtual evidence and soft evidence / 10.10:
Exercises: The junction tree and probability updating
Factor graphs and the sum-product algorithm / 11:
Factorization and local potentials / 11.1:
Examples of factor graphs / 11.1.1:
The sum-product algorithm / 11.2:
Detailed illustration of the algorithm / 11.3:
Exercises: Factor graphs and the sum-product algorithm
References
Index
2.

Book

by Timo Koski
Publication info: Dordrecht : Kluwer Academic Publishers, c2001. xvii, 391 p. ; 25 cm
Series: Computational biology series ; vol. 2