It was mentioned by Kolmogorov (1968, IEEE Trans. Inform. Theory 14, 662-664) that the properties of algorithmic complexity and Shannon entropy are similar. We investigate one aspect of this similarity: the linear inequalities that are valid both for Shannon entropy and for Kolmogorov complexity.
An inequality for logarithmic mapping and applications for the Shannon entropy
Written by S.S. Dragomir
- Publisher: Elsevier Science
- Year: 2003
- Language: English
- File size: 328 KB
- Volume: 46
- Category: Article
- ISSN: 0898-1221
Synopsis
Using an inequality for convex functions due to Andrica and Raşa [1] (2.1), we point out a new inequality for logarithmic mappings and apply it in information theory to the Shannon entropy and mutual information.
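For orientation only (this is standard material, not the paper's own inequality): the elementary logarithm bound ln x ≤ x − 1 already yields the classical estimate H(p) ≤ ln n for any probability distribution p over n outcomes. A minimal Python sketch, with the function name `shannon_entropy` chosen here for illustration:

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i ln p_i, in nats (natural log)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

# ln x <= x - 1 for x > 0, with equality iff x = 1.
# Taking x = 1/(n * p_i) and averaging against p_i gives H(p) <= ln n,
# with equality exactly for the uniform distribution.
p = [0.5, 0.25, 0.125, 0.125]
H = shannon_entropy(p)
print(H, math.log(len(p)))
assert H <= math.log(len(p))
```

Here H(p) = 1.75 ln 2 ≈ 1.213, strictly below ln 4 ≈ 1.386 because p is not uniform.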
SIMILAR VOLUMES
A new analytic inequality for logarithms, which provides a converse to the arithmetic mean-geometric mean inequality, together with its applications in information theory, is given.
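For context (again standard material, not this volume's converse result): the arithmetic mean-geometric mean inequality itself follows from concavity of the logarithm, since by Jensen's inequality ln(AM) ≥ mean of the ln x_i = ln(GM). A quick deterministic check in Python:

```python
import math

xs = [1.0, 2.0, 3.0, 4.0, 5.0]

# Arithmetic mean.
arithmetic = sum(xs) / len(xs)
# Geometric mean, computed via logs: exp of the mean of ln x_i.
geometric = math.exp(sum(math.log(x) for x in xs) / len(xs))

# Concavity of ln (Jensen): ln(arithmetic) >= mean of ln x_i = ln(geometric),
# hence arithmetic >= geometric, with equality iff all x_i are equal.
print(arithmetic, geometric)
assert arithmetic >= geometric
```

Here the arithmetic mean is 3.0 and the geometric mean is 120^(1/5) ≈ 2.605.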
We propose a method, based on logarithmic convexity, for producing sharp bounds for ratios of integrals. As an application, we present an inequality that sharpens and generalizes inequalities due to Gautschi, Chu, Boyd, Lazarević-Lupaş, and Kershaw.