Comprehensive, rigorous introduction to the work of Shannon, McMillan, Feinstein, and Khinchin. Translated by R. A. Silverman and M. D. Friedman.
Mathematical Foundations of Information Theory
By A. Ya. Khinchin
- Publisher
- Dover Publications
- Year
- 1957
- Language
- English
- Pages
- 125
- Series
- Dover Books on Mathematics
- Edition
- 1st Dover
- Category
- Library
No payment or registration required. For personal study only.
Synopsis
In his first paper, Dr. Khinchin develops the concept of entropy in probability theory as a measure of uncertainty of a finite “scheme,” and discusses a simple application to coding theory. The second paper investigates the restrictions previously placed on the study of sources, channels, and codes and attempts “to give a complete, detailed proof of both … Shannon theorems, assuming any ergodic source and any stationary channel with a finite memory.”
Partial Contents: I. The Entropy Concept in Probability Theory – Entropy of Finite Schemes. The Uniqueness Theorem. Entropy of Markov chains. Application to Coding Theory. II. On the Fundamental Theorems of Information Theory – Two generalizations of Shannon's inequality. Three inequalities of Feinstein. Concept of a source. Stationarity. Entropy. Ergodic sources. The E property. The martingale concept. Noise. Anticipation and memory. Connection of the channel to the source. Feinstein's Fundamental Lemma. Coding. The first Shannon theorem. The second Shannon theorem.
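As a small illustration of the central notion in Part I, the entropy of a finite scheme (a finite set of outcomes with given probabilities) can be sketched in a few lines of Python. This is not code from the book, only the standard formula H = -Σ pᵢ log₂ pᵢ that the book develops axiomatically:

```python
import math

def entropy(probs):
    """Shannon entropy (in bits) of a finite scheme with probabilities probs.

    Terms with p = 0 contribute nothing, by the convention 0 * log 0 = 0.
    """
    assert abs(sum(probs) - 1.0) < 1e-9, "probabilities must sum to 1"
    return sum(-p * math.log2(p) for p in probs if p > 0)

# The uniform scheme on n outcomes maximizes uncertainty: H = log2(n).
print(entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0
# A scheme with a certain outcome carries no uncertainty: H = 0.
print(entropy([1.0, 0.0]))  # 0.0
```

The two printed cases mirror the book's motivation for entropy as a measure of uncertainty: it is largest when all outcomes are equally likely and vanishes when the outcome is certain.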
Subjects
Information Theory;Computer Science;Computers & Technology;Electrical & Electronics;Circuits;Digital Design;Electric Machinery & Motors;Electronics;Fiber Optics;Networks;Superconductivity;Engineering;Engineering & Transportation;History & Philosophy;Science & Math;Applied;Biomathematics;Differential Equations;Game Theory;Graph Theory;Linear Programming;Probability & Statistics;Statistics;Stochastic Modeling;Vector Analysis;Mathematics;Science & Math