Information Theory: Coding Theorems for Discrete Memoryless Systems
✍ Written by Imre Csiszár and János Körner
- Publisher
- Cambridge University Press
- Year
- 2011
- Language
- English
- Pages
- 523
- Edition
- 2
- Category
- Library
✦ Synopsis
Csiszár and Körner's book is widely regarded as a classic in the field of information theory, providing deep insights and expert treatment of the key theoretical issues. It includes in-depth coverage of the mathematics of reliable information transmission, in both two-terminal and multi-terminal network scenarios. Updated and considerably expanded, this new edition presents unique discussions of information-theoretic secrecy and of zero-error information theory, including the deep connections of the latter with extremal combinatorics. The presentations of all core subjects are self-contained, even for the advanced topics, which helps readers understand the important connections between seemingly different problems. Finally, 320 end-of-chapter problems, together with helpful solving hints, allow readers to develop a full command of the mathematical techniques. It is an ideal resource for graduate students and researchers in electrical and electronic engineering, computer science, and applied mathematics.
✦ Table of Contents
Cover
Information Theory
Title
Copyright
Contents
Preface to the first edition
Preface to the second edition
Basic notation and conventions
Preliminaries on random variables and probability distributions
Introduction
Intuitive background
Informal description of the basic mathematical model
Measuring information
Multi-terminal systems
Part I Information measures in simple coding problems
1 Source coding and hypothesis testing; information measures
Discussion
Problems
Postulational characterizations of entropy (Problems 1.11–1.14)
Story of the results
2 Types and typical sequences
Discussion
Problems
Story of the results
3 Formal properties of Shannon's information measures
Problems
Properties of informational divergence (Problems 3.17–3.20)
Structural results on entropy (Problems 3.21–3.22)
Story of the results
4 Non-block source coding
Problems
General noiseless channels (Problems 4.20–4.22)
Universal variable-length codes (Problems 4.23–4.26)
Story of the results
5 Blowing up lemma: a combinatorial digression
Problems
Story of the results
Part II Two-terminal systems
6 The noisy channel coding problem
Discussion
Problems
Comparison of channels (Problems 6.16–6.18)
Zero-error capacity and graphs (Problems 6.23–6.25)
Story of the results
7 Rate-distortion trade-off in source coding and the source–channel transmission problem
Discussion
Problems
Story of the results
8 Computation of channel capacity and ∆-distortion rates
Problems
Story of the results
9 A covering lemma and the error exponent in source coding
Problems
Graph entropy and convex corners
Story of the results
10 A packing lemma and the error exponent in channel coding
Discussion
Problems
Compound DMCs (Problems 10.12–10.14)
Reliability at R = 0 (Problems 10.20–10.23)
Story of the results
11 The compound channel revisited: zero-error information theory and extremal combinatorics
Discussion
Problems
Story of the results
12 Arbitrarily varying channels
Discussion
Problems
Story of the results
Part III Multi-terminal systems
13 Separate coding of correlated sources
Discussion
Problems
Story of the results
14 Multiple-access channels
Discussion
Problems
Reduction of channel network problems (Problems 14.22–14.24)
Story of the results
15 Entropy and image size characterization
Discussion
Problems
Image size of arbitrary sets (Problems 15.4–15.5)
More-than-three-component sources (Problems 15.16–15.21)
Story of the results
16 Source and channel networks
Discussion
Problems
Broadcast channels (Problems 16.8–16.12)
Source networks with three inputs and one helper (Problems 16.13–16.18)
Source networks with two helpers
General fidelity criteria (Problems 16.22–16.24)
Common information (Problems 16.27–16.30)
Miscellaneous source networks (Problems 16.31–16.33)
Story of the results
17 Information-theoretic security
17.1 Basic concepts and tools
17.2 Secure transmission over an insecure channel
17.3 Secret key generation using public discussion
Discussion
Problems
Computation of PK capacities (Problems 17.17–17.20)
Seeded extractors (Problems 17.21–17.23)
Story of the results
References
Name index
Index of symbols and abbreviations
Subject index
📜 SIMILAR VOLUMES
Information Theory: Coding Theorems for Discrete Memoryless Systems presents mathematical models that involve independent random variables with finite range. This three-chapter text specifically describes the characteristic phenomena of information theory.