Semiclassical information from deformed and escort information measures
Written by F. Pennini; A. Plastino; G.L. Ferri
- Publisher
- Elsevier Science
- Year
- 2007
- Language
- English
- File size
- 375 KB
- Volume
- 383
- Category
- Article
- ISSN
- 0378-4371
Synopsis
Escort distributions are a well-established but, for physicists, relatively new concept that is rapidly gaining wide acceptance. In this work we wish to revisit the concept within the strictures of the celebrated semiclassical Husimi distributions (HDs) and thereby investigate the possibility of extracting new semiclassical information contained not in the HDs themselves, but in their associated escort Husimi distributions. We will also establish relations, for various information measures, between their deformed versions [J. Naudts, Physica A 316 (2002) 323] and those built up with escort HDs. Bounds on the concomitant power exponents will be determined.
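As background to the synopsis, the escort distribution of order q associated with a probability distribution p is the standard construction P_i = p_i^q / Σ_j p_j^q: raising each probability to the power q and renormalizing, which emphasizes high-probability states for q > 1 and rare states for q < 1, and recovers p itself at q = 1. The snippet below is a minimal sketch of that general definition (the function name and example vector are illustrative, not taken from the paper):

```python
import numpy as np

def escort(p, q):
    """Return the escort distribution P_i = p_i^q / sum_j p_j^q."""
    p = np.asarray(p, dtype=float)
    weights = p ** q          # deform each probability
    return weights / weights.sum()  # renormalize to a distribution

p = np.array([0.5, 0.3, 0.2])
print(escort(p, 1.0))  # q = 1 recovers p itself
print(escort(p, 2.0))  # q > 1 emphasizes the most probable state
```

The same construction applies when p is a (discretized) Husimi distribution over phase space, which is the setting the abstract refers to.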
SIMILAR VOLUMES
The paper presents a new method for measuring displacement fields using the mutual-information concept. Based on an analysis of digital image sequences, we propose a formulation derived from the mutual information between subsets of the un-deformed and deformed images. The quantity of the mutual info
A suggestion is made regarding the nature of information: that the information in a theory be evaluated by measuring either its distance from the perfect theory or … day notion of information that is central and is independent of cognitive factors. It focuses on the concept of information as being sp