Part I: Information Measures in Simple Coding Problems
1. Source coding and hypothesis testing; information measures
2. Types and typical sequences
3. Formal properties of Shannon's information measures
4. Non-block source coding
5. Blowing up lemma: a combinatorial digression
Part II: Two-Terminal Systems
6. The noisy channel coding problem
7. Rate-distortion trade-off in source coding and the source-channel transmission problem
8. Computation of channel capacity and Δ-distortion rates
9. A covering lemma and the error exponent in source coding
10. A packing lemma and the error exponent in channel coding
11. The compound channel revisited: zero-error information theory and extremal combinatorics
12. Arbitrarily varying channels
Part III: Multi-Terminal Systems
13. Separate coding of correlated sources
14. Multiple-access channels
15. Entropy and image size characterization
16. Source and channel networks
17. Information-theoretic security
Preface to the first edition |
Preface to the second edition |
Basic notation and conventions |
Introduction |
References |
Name index |
Index of symbols and abbreviations |
Subject index |