Elements of Information Theory 2nd Edition
by Thomas M. Cover, Joy A. Thomas
- Publisher: Wiley-Interscience
- Year: 2006
- Language: English
- Pages: 773
- Series: Wiley Series in Telecommunications and Signal Processing
- Edition: 2nd ed.
- Category: Library
Synopsis
The latest edition of this classic is updated with new problem sets and material
The Second Edition of this fundamental textbook maintains the book's tradition of clear, thought-provoking instruction. Readers are provided once again with an instructive mix of mathematics, physics, statistics, and information theory.
All the essential topics in information theory are covered in detail, including entropy, data compression, channel capacity, rate distortion, network information theory, and hypothesis testing. The authors provide readers with a solid understanding of the underlying theory and applications. Problem sets and a telegraphic summary at the end of each chapter further assist readers. The historical notes that follow each chapter recap the main points.
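To give a flavor of the book's central quantity, Shannon entropy measures the average uncertainty of a random variable: H(X) = -Σ p(x) log₂ p(x), in bits. The sketch below is purely illustrative and not taken from the book; the function name `entropy` is our own.

```python
from math import log2

def entropy(probs):
    """Shannon entropy H(X) = -sum p(x) log2 p(x), in bits.

    Terms with p(x) = 0 contribute nothing, by the convention
    0 log 0 = 0 used throughout information theory.
    """
    return -sum(p * log2(p) for p in probs if p > 0)

# A fair coin carries exactly 1 bit of uncertainty;
# a biased coin carries less, since outcomes are more predictable.
print(entropy([0.5, 0.5]))  # 1.0
print(entropy([0.9, 0.1]))  # ~0.469
```

A uniform distribution over n outcomes attains the maximum entropy log₂ n, which is why entropy also bounds achievable data compression, the subject of Chapter 5.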
The Second Edition features:
* Chapters reorganized to improve teaching
* 200 new problems
* New material on source coding, portfolio theory, and feedback capacity
* Updated references
Now current and enhanced, the Second Edition of Elements of Information Theory remains the ideal textbook for upper-level undergraduate and graduate courses in electrical engineering, statistics, and telecommunications.
An Instructor's Manual presenting detailed solutions to all the problems in the book is available from the Wiley editorial department.
Table of Contents
Half Title page
Title page
Copyright page
Preface to the Second Edition
Preface to the First Edition
Acknowledgments for the Second Edition
Acknowledgments for the First Edition
Chapter 1: Introduction and Preview
1.1 Preview of the Book
Chapter 2: Entropy, Relative Entropy, and Mutual Information
2.1 Entropy
2.2 Joint Entropy and Conditional Entropy
2.3 Relative Entropy and Mutual Information
2.4 Relationship Between Entropy and Mutual Information
2.5 Chain Rules for Entropy, Relative Entropy, and Mutual Information
2.6 Jensen's Inequality and Its Consequences
2.7 Log Sum Inequality and Its Applications
2.8 Data-Processing Inequality
2.9 Sufficient Statistics
2.10 Fano's Inequality
Summary
Problems
Historical Notes
Chapter 3: Asymptotic Equipartition Property
3.1 Asymptotic Equipartition Property Theorem
3.2 Consequences of the AEP: Data Compression
3.3 High-Probability Sets and The Typical Set
Summary
Problems
Historical Notes
Chapter 4: Entropy Rates of a Stochastic Process
4.1 Markov Chains
4.2 Entropy Rate
4.3 Example: Entropy Rate of a Random Walk on a Weighted Graph
4.4 Second Law of Thermodynamics
4.5 Functions of Markov Chains
Summary
Problems
Historical Notes
Chapter 5: Data Compression
5.1 Examples of Codes
5.2 Kraft Inequality
5.3 Optimal Codes
5.4 Bounds on the Optimal Code Length
5.5 Kraft Inequality for Uniquely Decodable Codes
5.6 Huffman Codes
5.7 Some Comments on Huffman Codes
5.8 Optimality of Huffman Codes
5.9 Shannon-Fano-Elias Coding
5.10 Competitive Optimality of the Shannon Code
5.11 Generation of Discrete Distributions from Fair Coins
Summary
Problems
Historical Notes
Chapter 6: Gambling and Data Compression
6.1 The Horse Race
6.2 Gambling and Side Information
6.3 Dependent Horse Races and Entropy Rate
6.4 The Entropy of English
6.5 Data Compression and Gambling
6.6 Gambling Estimate of the Entropy of English
Summary
Problems
Historical Notes
Chapter 7: Channel Capacity
7.1 Examples of Channel Capacity
7.2 Symmetric Channels
7.3 Properties of Channel Capacity
7.4 Preview of the Channel Coding Theorem
7.5 Definitions
7.6 Jointly Typical Sequences
7.7 Channel Coding Theorem
7.8 Zero-Error Codes
7.9 Fano's Inequality and the Converse to the Coding Theorem
7.10 Equality in the Converse to the Channel Coding Theorem
7.11 Hamming Codes
7.12 Feedback Capacity
7.13 Source-Channel Separation Theorem
Summary
Problems
Historical Notes
Chapter 8: Differential Entropy
8.1 Definitions
8.2 AEP for Continuous Random Variables
8.3 Relation of Differential Entropy to Discrete Entropy
8.4 Joint and Conditional Differential Entropy
8.5 Relative Entropy and Mutual Information
8.6 Properties of Differential Entropy, Relative Entropy, and Mutual Information
Summary
Problems
Historical Notes
Chapter 9: Gaussian Channel
Similar Volumes
Information Theory and Evolution discusses the phenomenon of life, including its origin and evolution (and also human cultural evolution), against the background of thermodynamics, statistical mechanics, and information theory. Among the central themes is the seeming contradiction between the second