Information and Coding Theory
by Gareth A. Jones and J. Mary Jones
- Publisher: Springer
- Year: 2000
- Language: English
- Pages: 217
- Series: Springer Undergraduate Mathematics Series
- Edition: 1st
- Category: Library
Synopsis
This text is an elementary introduction to information and coding theory. The first part focuses on information theory, covering uniquely decodable and instantaneous codes, Huffman coding, entropy, information channels, and Shannon's Fundamental Theorem. The second part turns to error-correcting codes, using linear algebra to construct examples such as the Hamming, Hadamard, Golay and Reed-Muller codes. The text contains proofs, worked examples, and exercises.
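Two of the topics named in the synopsis, binary Huffman coding and entropy, are easy to experiment with in a few lines of code. Below is a minimal Python sketch (not from the book, which develops the theory by hand); the source symbols and probabilities are made up for illustration.

```python
import heapq
from math import log2

def huffman_code(probs):
    """Binary Huffman code for a dict {symbol: probability}.

    Repeatedly merges the two least probable subtrees, prefixing
    '0' to one side's codewords and '1' to the other's.
    """
    # Heap entries: (probability, tiebreak, {symbol: partial codeword}).
    # The tiebreak integer keeps the non-comparable dicts out of comparisons.
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        p0, _, c0 = heapq.heappop(heap)
        p1, _, c1 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c0.items()}
        merged.update({s: "1" + w for s, w in c1.items()})
        heapq.heappush(heap, (p0 + p1, tiebreak, merged))
        tiebreak += 1
    return heap[0][2]

def entropy(probs):
    """Shannon entropy H(S) = -sum p*log2(p), in bits per symbol."""
    return -sum(p * log2(p) for p in probs.values() if p > 0)

# A made-up four-symbol source, for illustration only.
probs = {"a": 0.4, "b": 0.3, "c": 0.2, "d": 0.1}
code = huffman_code(probs)
avg = sum(probs[s] * len(w) for s, w in code.items())
print(code)    # e.g. {'a': '0', 'b': '10', 'd': '110', 'c': '111'}
print(f"H = {entropy(probs):.3f} bits, average length = {avg:.2f}")
# Shannon's noiseless coding theorem guarantees H <= avg < H + 1,
# and indeed here H ~ 1.846 bits while the average length is 1.90.
```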
Table of Contents
Preface......Page 6
Contents......Page 10
Notes to the Reader......Page 13
1.1 Definitions and Examples......Page 14
1.2 Uniquely Decodable Codes......Page 17
1.3 Instantaneous Codes......Page 22
1.4 Constructing Instantaneous Codes......Page 24
1.5 Kraft's Inequality......Page 26
1.6 McMillan's Inequality......Page 27
1.7 Comments on Kraft's and McMillan's Inequalities......Page 29
1.8 Supplementary Exercises......Page 30
2.1 Optimality......Page 32
2.2 Binary Huffman Codes......Page 35
2.3 Average Word-length of Huffman Codes......Page 39
2.4 Optimality of Binary Huffman Codes......Page 40
2.5 r-ary Huffman Codes......Page 41
2.6 Extensions of Sources......Page 43
2.7 Supplementary Exercises......Page 45
3.1 Information and Entropy......Page 47
3.2 Properties of the Entropy Function......Page 52
3.3 Entropy and Average Word-length......Page 54
3.4 Shannon-Fano Coding......Page 57
3.5 Entropy of Extensions and Products......Page 59
3.6 Shannon's First Theorem......Page 60
3.7 An Example of Shannon's First Theorem......Page 61
3.8 Supplementary Exercises......Page 63
4.1 Notation and Definitions......Page 66
4.2 The Binary Symmetric Channel......Page 71
4.3 System Entropies......Page 73
4.4 System Entropies for the Binary Symmetric Channel......Page 75
4.5 Extension of Shannon's First Theorem to Information Channels......Page 78
4.6 Mutual Information......Page 81
4.7 Mutual Information for the Binary Symmetric Channel......Page 83
4.8 Channel Capacity......Page 84
4.9 Supplementary Exercises......Page 87
5.1 Decision Rules......Page 90
5.2 An Example of Improved Reliability......Page 93
5.3 Hamming Distance......Page 96
5.4 Statement and Outline Proof of Shannon's Theorem......Page 99
5.5 The Converse of Shannon's Theorem......Page 101
5.6 Comments on Shannon's Theorem......Page 104
5.7 Supplementary Exercises......Page 105
6.1 Introductory Concepts......Page 108
6.2 Examples of Codes......Page 111
6.3 Minimum Distance......Page 115
6.4 Hamming's Sphere-packing Bound......Page 118
6.5 The Gilbert-Varshamov Bound......Page 122
6.6 Hadamard Matrices and Codes......Page 125
6.7 Supplementary Exercises......Page 129
7.1 Matrix Description of Linear Codes......Page 131
7.2 Equivalence of Linear Codes......Page 137
7.3 Minimum Distance of Linear Codes......Page 141
7.4 The Hamming Codes......Page 143
7.5 The Golay Codes......Page 146
7.6 The Standard Array......Page 151
7.7 Syndrome Decoding......Page 153
7.8 Supplementary Exercises......Page 156
Suggestions for Further Reading......Page 159
A Proof of the Sardinas-Patterson Theorem......Page 162
B The Law of Large Numbers......Page 166
C Proof of Shannon's Fundamental Theorem......Page 168
Solutions to Exercises......Page 174
Bibliography......Page 199
Index of Symbols and Abbreviations......Page 203
Index......Page 208
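Chapters 6 and 7 of the contents above culminate in linear codes such as the Hamming codes. As a small taste of that material, here is a sketch of the binary Hamming [7,4] code with syndrome decoding. This is an illustration, not code from the book: it assumes numpy and one standard systematic choice of generator matrix, which may differ from the form the book uses.

```python
import numpy as np

# Generator and parity-check matrices of the binary Hamming [7,4] code
# in systematic form: G = [I4 | A], H = [A^T | I3]. This is one standard
# choice; any code with the same parameters is equivalent to it.
A = np.array([[1, 1, 0],
              [1, 0, 1],
              [0, 1, 1],
              [1, 1, 1]])
G = np.hstack([np.eye(4, dtype=int), A])
H = np.hstack([A.T, np.eye(3, dtype=int)])

def encode(msg):
    """Encode a length-4 0/1 message vector into a 7-bit codeword."""
    return (msg @ G) % 2

def decode(word):
    """Syndrome decoding: corrects at most one bit error."""
    syndrome = (H @ word) % 2
    if syndrome.any():
        # A single error at position j makes the syndrome equal
        # to column j of H, so that bit is flipped back.
        for j in range(7):
            if np.array_equal(H[:, j], syndrome):
                word = word.copy()
                word[j] ^= 1
                break
    return word[:4]  # systematic code: the message is the first 4 bits

msg = np.array([1, 0, 1, 1])
sent = encode(msg)
received = sent.copy()
received[5] ^= 1  # the channel flips one bit
assert np.array_equal(decode(received), msg)
```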
From the Preface
As this Preface is being written, the twentieth century is coming to an end. Historians may perhaps come to refer to it as the century of information, just as its predecessor is associated with the process of industrialisation. Successive technological developments such as the telephone, radio, …