Information measures, effective complexity, and total information
By Murray Gell-Mann and Seth Lloyd
- Publisher
- John Wiley and Sons
- Year
- 1996
- Language
- English
- File size
- 120 KB
- Volume
- 2
- Category
- Article
- ISSN
- 1076-2787
Synopsis
This article defines the concept of an information measure and shows how common information measures such as entropy, Shannon information, and algorithmic information content can be combined to solve problems of characterization, inference, and learning for complex systems. Particularly useful quantities are the effective complexity, which is roughly the length of a compact description of the identified regularities of an entity, and total information, which is effective complexity plus an entropy term that measures the information required to describe the random aspects of the entity. Mathematical definitions are given for both quantities and some applications are discussed. In particular, it is pointed out that if one compares different sets of identified regularities of an entity, the 'best' set minimizes the total information, and then, subject to that constraint and to constraints on computation time, minimizes the effective complexity; the resulting effective complexity is then in many respects independent of the observer.
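The selection procedure described in the synopsis, minimize total information first, then break ties by minimizing effective complexity, can be illustrated with a small sketch. This is not code from the article; the candidate names and bit counts below are hypothetical, and real effective complexities and entropies would come from actual descriptions of an entity's regularities:

```python
# Hypothetical candidate sets of identified regularities for some entity.
# Each entry: (name, effective complexity in bits, entropy term in bits).
candidates = [
    ("memorize everything",   1000,    0),  # treats all detail as regularity
    ("pure noise model",         0, 1200),  # treats everything as random
    ("compact regularities",   120,  880),  # captures genuine structure
    ("redundant regularities", 200,  800),
]

def total_information(candidate):
    """Total information = effective complexity + entropy term."""
    _, effective_complexity, entropy = candidate
    return effective_complexity + entropy

# Step 1: find the candidates that minimize total information.
min_total = min(total_information(c) for c in candidates)
minimizers = [c for c in candidates if total_information(c) == min_total]

# Step 2: among those, pick the one with minimal effective complexity.
best = min(minimizers, key=lambda c: c[1])
print(best[0], total_information(best))  # → compact regularities 1000
```

In this toy example three candidates tie at a total of 1000 bits, so the second step matters: it rejects both the model that encodes noise as regularity and the redundant description, leaving the compact set of regularities, in line with the article's claim that the resulting effective complexity is largely observer-independent.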