1 Introduction

Part I: Information and Coding

2 Shannon-Wiener Information
2.1 Coding of Random Variables
2.1.1 Universal prior for integers and their coding
2.2 Basic Properties of Entropy and Related Quantities
2.3 Channel Capacity
2.4 Chain Rules
2.5 Jensen's Inequality
2.6 Theory of Types
2.7 Equipartition Property

3 Coding of Random Processes
3.1 Random Processes
3.2 Entropy of Stationary Processes
3.3 Markov Processes
3.4 Tree Machines
3.5 Tunstall's Algorithm
3.6 Arithmetic Codes
3.7 Universal Coding
3.7.1 Lempel-Ziv and Ma algorithms

Part II: Statistical Modeling

4 Kolmogorov Complexity
4.1 Elements of Recursive Function Theory
4.2 Complexities
4.3 Kolmogorov's Structure Function

5 Stochastic Complexity
5.1 Model Classes
5.1.1 Exponential family
5.1.2 Maximum entropy family with simple loss functions
5.2 Universal Models
5.2.1 Mixture models
5.2.2 Normalized maximum-likelihood model
5.2.3 A predictive universal model
5.2.4 Conditional NML model
5.3 Strong Optimality
5.3.1 Prediction bound for α-loss functions

6 Structure Function
6.1 Partition of Parameter Space
6.2 Model Complexity
6.3 Sets of Typical Sequences

7 Optimally Distinguishable Models
7.1 Special Cases
7.1.1 Bernoulli class
7.1.2 Normal distributions

8 The MDL Principle

9 Applications
9.1 Hypothesis Testing
9.2 Universal Tree Machine and Variable-Order Markov Chain
9.2.1 Extension to time series
9.3 Linear Regression
9.3.1 Orthonormal regression matrix
9.4 MDL Denoising
9.4.1 Linear-quadratic denoising
9.4.2 Histogram denoising
9.5 AR and ARMA Models
9.5.1 AR models
9.5.2 ARMA models
9.6 Logit Regression

References
Index