1. Book
Paul Glasserman
Publication info: New York ; Tokyo : Springer, c2004  xiii, 596 p. ; 25 cm
Series: Applications of mathematics ; 53
Table of contents:
Foundations
Generating Random Numbers and Random Variables
Generating Sample Paths
Variance Reduction Techniques
Quasi-Monte Carlo Methods
Discretization Methods
Estimating Sensitivities
Pricing American Options
Applications in Risk Management
Appendices
2. Book
Michael Evans and Tim Swartz
Publication info: Oxford ; New York : Oxford University Press, c2000  ix, 288 p. ; 24 cm
Series: Oxford statistical science series ; 20
3. Book
David P. Landau, Kurt Binder
Publication info: Cambridge : Cambridge University Press, 2000  xiii, 384 p. ; 25 cm
Table of contents:
Preface
Introduction / 1:
Some necessary background / 2:
Simple sampling Monte Carlo methods / 3:
The Metropolis Monte Carlo algorithm for lattice models / 4:
More on importance sampling Monte Carlo methods for lattice systems / 5:
Off-lattice models / 6:
Reweighting methods / 7:
Quantum Monte Carlo methods / 8:
Monte Carlo renormalization group methods / 9:
Non-equilibrium and irreversible processes / 10:
Lattice gauge models: a brief introduction / 11:
A brief review of other methods of computer simulation / 12:
Outlook / 13:
4. Book
Gerhard Winkler
Publication info: Berlin ; Tokyo : Springer, c2003  xvi, 387 p. ; 25 cm.
Series: Applications of mathematics ; 27
Table of contents:
Introduction
Bayesian Image Analysis: Introduction / Part I:
The Bayesian Paradigm / 1:
Warming up for Absolute Beginners / 1.1:
Images and Observations / 1.2:
Prior and Posterior Distributions / 1.3:
Bayes Estimators / 1.4:
Cleaning Dirty Pictures / 2:
Boundaries and Their Information Content / 2.1:
Towards Piecewise Smoothing / 2.2:
Filters, Smoothers, and Bayes Estimators / 2.3:
Boundary Extraction / 2.4:
Dependence on Hyperparameters / 2.5:
Finite Random Fields / 3:
Markov Random Fields / 3.1:
Gibbs Fields and Potentials / 3.2:
Potentials Continued / 3.3:
The Gibbs Sampler and Simulated Annealing / Part II:
Markov Chains: Limit Theorems / 4:
Preliminaries / 4.1:
The Contraction Coefficient / 4.2:
Homogeneous Markov Chains / 4.3:
Exact Sampling / 4.4:
Inhomogeneous Markov Chains / 4.5:
A Law of Large Numbers for Inhomogeneous Chains / 4.6:
A Counterexample for the Law of Large Numbers / 4.7:
Gibbsian Sampling and Annealing / 5:
Sampling / 5.1:
Simulated Annealing / 5.2:
Discussion / 5.3:
Cooling Schedules / 6:
The ICM Algorithm / 6.1:
Exact MAP Estimation Versus Fast Cooling / 6.2:
Finite Time Annealing / 6.3:
Variations of the Gibbs Sampler / Part III:
Gibbsian Sampling and Annealing Revisited / 7:
A General Gibbs Sampler / 7.1:
Sampling and Annealing Under Constraints / 7.2:
Partially Parallel Algorithms / 8:
Synchronous Updating on Independent Sets / 8.1:
The Swendsen-Wang Algorithm / 8.2:
Synchronous Algorithms / 9:
Invariant Distributions and Convergence / 9.1:
Support of the Limit Distribution / 9.2:
Synchronous Algorithms and Reversibility / 9.3:
Metropolis Algorithms and Spectral Methods / Part IV:
Metropolis Algorithms / 10:
Metropolis Sampling and Annealing / 10.1:
Convergence Theorems / 10.2:
Best Constants / 10.3:
About Visiting Schemes / 10.4:
Generalizations and Modifications / 10.5:
The Metropolis Algorithm in Combinatorial Optimization / 10.6:
The Spectral Gap and Convergence of Markov Chains / 11:
Eigenvalues of Markov Kernels / 11.1:
Geometric Convergence Rates / 11.2:
Eigenvalues, Sampling, Variance Reduction / 12:
Samplers and Their Eigenvalues / 12.1:
Variance Reduction / 12.2:
Importance Sampling / 12.3:
Continuous Time Processes / 13:
Discrete State Space / 13.1:
Continuous State Space / 13.2:
Texture Analysis / Part V:
Partitioning / 14:
How to Tell Textures Apart / 14.1:
Bayesian Texture Segmentation / 14.2:
Segmentation by a Boundary Model / 14.3:
Julesz's Conjecture and Two Point Processes / 14.4:
Random Fields and Texture Models / 15:
Neighbourhood Relations / 15.1:
Random Field Texture Models / 15.2:
Texture Synthesis / 15.3:
Bayesian Texture Classification / 16:
Contextual Classification / 16.1:
Marginal Posterior Modes Methods / 16.2:
Parameter Estimation / Part VI:
Maximum Likelihood Estimation / 17:
The Likelihood Function / 17.1:
Objective Functions / 17.2:
Consistency of Spatial ML Estimators / 18:
Observation Windows and Specifications / 18.1:
Pseudolikelihood Methods / 18.2:
Large Deviations and Full Maximum Likelihood / 18.3:
Partially Observed Data / 18.4:
Computation of Full ML Estimators / 19:
A Naive Algorithm / 19.1:
Stochastic Optimization for the Full Likelihood / 19.2:
Main Results / 19.3:
Error Decomposition / 19.4:
Supplement / 19.5:
A Glance at Neural Networks / 20:
Boltzmann Machines / 20.1:
A Learning Rule / 20.2:
Three Applications / 21:
Motion Analysis / 21.1:
Tomographic Image Reconstruction / 21.2:
Biological Shape / 21.3:
Appendix / Part VIII:
Simulation of Random Variables / A:
Pseudorandom Numbers / A.1:
Discrete Random Variables / A.2:
Special Distributions / A.3:
Analytical Tools / B:
Concave Functions / B.1:
Convergence of Descent Algorithms / B.2:
A Discrete Gronwall Lemma / B.3:
A Gradient System / B.4:
Physical Imaging Systems / C:
The Software Package AntsInFields / D:
References
Symbols
Index
5. Book
Arnaud Doucet, Nando de Freitas, Neil Gordon, editors ; foreword by Adrian Smith
Publication info: New York : Springer, c2001  xxvii, 581 p. ; 24 cm
Series: Statistics for engineering and information science
6. Book
Christian P. Robert, George Casella
Publication info: New York : Springer, c2004  xxx, 645 p. ; 24 cm
Series: Springer texts in statistics
7. Book
Daniel J. Duffy, Jörg Kienitz
Publication info: Chichester, U.K. : Wiley, 2009  xxv, 750 p. ; 26 cm.
Table of contents:
Preface
My First Monte Carlo Application: One-Factor Problems / Chapter 0:
Mathematical Preparations for the Monte Carlo Method / Chapter 1:
The Mathematics of Stochastic Differential Equations (SDE) / Chapter 2:
Alternative SDEs and Toolkit Functionality / Chapter 3:
An Introduction to the Finite Difference Method for SDE / Chapter 4:
Design and Implementation of Finite Difference Schemes in Computational Finance / Chapter 5:
Advanced Finance Models and Numerical Methods / Chapter 6:
Architectures and Frameworks for Monte Carlo Methods: Overview / Chapter 8:
System Decomposition and System Patterns / Chapter 9:
Detailed Design using the GOF Patterns / Chapter 10:
Combining Object-Oriented and Generic Programming Models / Chapter 11:
Data Structures and their Application to the Monte Carlo Method / Chapter 12:
The Boost Library: An Introduction / Chapter 13:
C++ Application Optimisation and Performance Improvement / Chapter 21:
An Introduction to Multi-threaded and Parallel Programming / Chapter 24:
An Introduction to OpenMP and its Applications to the Monte Carlo Method / Chapter 25:
Excel, C++ and Monte Carlo Integration / Chapter 27:
8. Book
Ming-Hui Chen, Qi-Man Shao, Joseph G. Ibrahim
Publication info: New York : Springer, c2000  xiii, 386 p. ; 25 cm
Series: Springer series in statistics
9. Book
James J. Buckley and Leonard J. Jowers
Publication info: Berlin : Springer, c2008  xiii, 260 p. ; 25 cm
Series: Studies in fuzziness and soft computing ; 222
10. E-book
Faming Liang, Chuanhai Liu, Raymond J. Carroll
Publication info: [S.l.] : Wiley Online Library, [20--]  1 online resource (xix, 357 p.)
Series: Wiley series in computational statistics
Table of contents:
Preface
Acknowledgements
List of Figures
List of Tables
Bayesian Inference and Markov chain Monte Carlo / 1:
Bayes / 1.1:
Bayes output / 1.2:
Monte Carlo Integration / 1.3:
Random variable generation / 1.4:
Markov chain Monte Carlo / 1.5:
Exercises
The Gibbs sampler / 2:
Data Augmentation / 2.1:
Implementation strategies and acceleration methods / 2.3:
Applications / 2.4:
The Metropolis-Hastings Algorithm / 3:
Some Variants of the Metropolis-Hastings Algorithm / 3.1:
Reversible Jump MCMC Algorithm for Bayesian Model Selection / 3.3:
Problems
Metropolis-within-Gibbs Sampler for ChIP-chip Data Analysis / 3.4:
Auxiliary Variable MCMC Methods / 4:
Simulated Annealing / 4.1:
Simulated Tempering / 4.2:
Slice Sampler / 4.3:
The Swendsen-Wang Algorithm / 4.4:
The Wolff Algorithm / 4.5:
The Møller algorithm / 4.6:
The Exchange Algorithm / 4.7:
Double MH Sampler / 4.8:
Monte Carlo MH Sampler / 4.9:
Population-Based MCMC Methods / 4.10:
Adaptive Direction Sampling / 5.1:
Conjugate Gradient Monte Carlo / 5.2:
Sample Metropolis-Hastings Algorithm / 5.3:
Parallel Tempering / 5.4:
Evolutionary Monte Carlo / 5.5:
Sequential Parallel Tempering for Simulation of High Dimensional Systems / 5.6:
Equi-Energy Sampler / 5.7:
Forecasting / 5.8:
Dynamic Weighting / 6:
Dynamically Weighted Importance Sampling / 6.1:
Monte Carlo Dynamically Weighted Importance Sampling / 6.3:
Sequentially Dynamically Weighted Importance Sampling / 6.4:
Stochastic Approximation Monte Carlo / 7:
Multicanonical Monte Carlo / 7.1:
1/k-Ensemble Sampling / 7.2:
Wang-Landau Algorithm / 7.3:
Applications of Stochastic Approximation Monte Carlo / 7.4:
Variants of Stochastic Approximation Monte Carlo / 7.6:
Theory of Stochastic Approximation Monte Carlo / 7.7:
Trajectory Averaging: Toward the Optimal Convergence Rate / 7.8:
Markov Chain Monte Carlo with Adaptive Proposals / 8:
Stochastic Approximation-based Adaptive Algorithms / 8.1:
Adaptive Independent Metropolis-Hastings Algorithms / 8.2:
Regeneration-based Adaptive Algorithms / 8.3:
Population-based Adaptive Algorithms / 8.4:
References
Index
Acknowledgments
Publisher's Acknowledgments
Bayesian Inference and Markov Chain Monte Carlo
Specification of Bayesian Models / 1.1.1:
The Jeffreys Priors and Beyond / 1.1.2:
Bayes Output
Credible Intervals and Regions / 1.2.1:
Hypothesis Testing: Bayes Factors / 1.2.2:
The Problem / 1.3.1:
Monte Carlo Approximation / 1.3.2:
Monte Carlo via Importance Sampling / 1.3.3:
Random Variable Generation
Direct or Transformation Methods / 1.4.1:
Acceptance-Rejection Methods / 1.4.2:
The Ratio-of-Uniforms Method and Beyond / 1.4.3:
Adaptive Rejection Sampling / 1.4.4:
Perfect Sampling / 1.4.5:
Markov Chain Monte Carlo
Markov Chains / 1.5.1:
Convergence Results / 1.5.2:
Convergence Diagnostics / 1.5.3:
The Gibbs Sampler
Implementation Strategies and Acceleration Methods
Blocking and Collapsing / 2.3.1:
Hierarchical Centering and Reparameterization / 2.3.2:
Parameter Expansion for Data Augmentation / 2.3.3:
Alternating Subspace-Spanning Resampling / 2.3.4:
The Student-t Model / 2.4.1:
Robit Regression or Binary Regression with the Student-t Link / 2.4.2:
Linear Regression with Interval-Censored Responses / 2.4.3:
The EM and PX-EM Algorithms / Appendix 2A:
Independence Sampler / 3.1.1:
Random Walk Chains / 3.1.2:
Problems with Metropolis-Hastings Simulations / 3.1.3:
Variants of the Metropolis-Hastings Algorithm
The Hit-and-Run Algorithm / 3.2.1:
The Langevin Algorithm / 3.2.2:
The Multiple-Try MH Algorithm / 3.2.3:
Reversible Jump MCMC Algorithm for Bayesian Model Selection Problems
Reversible Jump MCMC Algorithm / 3.3.1:
Change-Point Identification / 3.3.2:
Metropolis-Within-Gibbs Sampler for ChIP-chip Data Analysis
Metropolis-Within-Gibbs Sampler / 3.4.1:
Bayesian Analysis for ChIP-chip Data / 3.4.2:
The Slice Sampler
The Møller Algorithm
The Double MH Sampler
Spatial Autologistic Models / 4.8.1:
Monte Carlo MH Algorithm / 4.9.1:
Convergence / 4.9.2:
Spatial Autologistic Models (Revisited) / 4.9.3:
Marginal Inference / 4.9.4:
Autonormal Models / 4.10.1:
Social Networks / 4.10.2:
Evolutionary Monte Carlo in Binary-Coded Space / 5.5.1:
Evolutionary Monte Carlo in Continuous Space / 5.5.2:
Implementation Issues / 5.5.3:
Two Illustrative Examples / 5.5.4:
Discussion / 5.5.5:
Sequential Parallel Tempering for Simulation of High Dimensional Systems
Build-up Ladder Construction / 5.6.1:
Sequential Parallel Tempering / 5.6.2:
An Illustrative Example: the Witch's Hat Distribution / 5.6.3:
Bayesian Curve Fitting / 5.6.4:
Protein Folding Simulations: 2D HP Model / 5.8.2:
Bayesian Neural Networks for Nonlinear Time Series Forecasting / 5.8.3:
Protein Sequences for 2D HP Models / Appendix 5A:
The IWIW Principle / 6.1.1:
Tempering Dynamic Weighting Algorithm / 6.1.2:
Dynamic Weighting in Optimization / 6.1.3:
The Basic Idea / 6.2.1:
A Theory of DWIS / 6.2.2:
Two DWIS Schemes / 6.2.3:
Weight Behavior Analysis / 6.2.5:
A Numerical Example / 6.2.6:
Sampling from Distributions with Intractable Normalizing Constants / 6.3.1:
Bayesian Analysis for Spatial Autologistic Models / 6.3.2:
The Wang-Landau Algorithm
Efficient p-Value Evaluation for Resampling-Based Tests / 7.5.1:
Bayesian Phylogeny Inference / 7.5.2:
Bayesian Network Learning / 7.5.3:
Smoothing SAMC for Model Selection Problems / 7.6.1:
Continuous SAMC for Marginal Density Estimation / 7.6.2:
Annealing SAMC for Global Optimization / 7.6.3:
Convergence Rate / 7.7.1:
Ergodicity and its IWIW Property / 7.7.3:
Trajectory Averaging for a SAMCMC Algorithm / 7.8.1:
Trajectory Averaging for SAMC / 7.8.2:
Proof of Theorems 7.8.2 and 7.8.3 / 7.8.3:
Test Functions for Global Optimization / Appendix 7A:
Stochastic Approximation-Based Adaptive Algorithms
Ergodicity and Weak Law of Large Numbers / 8.1.1:
Adaptive Metropolis Algorithms / 8.1.2:
Regeneration-Based Adaptive Algorithms
Identification of Regeneration Times / 8.3.1:
Proposal Adaptation at Regeneration Times / 8.3.2:
Population-Based Adaptive Algorithms
ADS, EMC, NKC and More / 8.4.1:
Adaptive EMC / 8.4.2:
Application to Sensor Placement Problems / 8.4.3: