In this paper, we derive some upper bounds for the relative entropy D(p‖q) of two probability distributions and apply them to mutual information and the entropy mapping. To achieve this, we use an inequality for the logarithm function, (2.3) below, and some classical inequalities such as the Kantorovi
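The quantity studied in the abstract, the relative entropy (Kullback-Leibler divergence) D(p‖q), can be sketched in a few lines. This is an illustrative implementation of the standard definition, not code from the paper; the function name and the example distributions are my own.

```python
import math

def relative_entropy(p, q):
    """Kullback-Leibler divergence D(p || q) in nats.

    Assumes p and q are probability vectors over the same alphabet,
    with q[i] > 0 wherever p[i] > 0 (otherwise D is infinite).
    Terms with p[i] == 0 contribute 0 by convention.
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Toy example: a fair coin versus a heavily biased one.
p = [0.5, 0.5]
q = [0.9, 0.1]
d = relative_entropy(p, q)
```

Note that D(p‖q) is always non-negative and equals zero exactly when p = q, which is what makes upper bounds on it informative.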
Relative entropy of DNA and its application
- Authors: Chun Li; Jun Wang
- Publisher: Elsevier Science
- Year: 2005
- Language: English
- File size: 187 KB
- Volume: 347
- Category: Article
- ISSN: 0378-4371
SIMILAR VOLUMES
Some upper bounds for relative entropy a
- Authors: S.S. Dragomir; M.L. Scholz; J. Sunde
- Category: Article
- Year: 2000
- Publisher: Elsevier Science
- Language: English
- File size: 500 KB
The fuzzy entropy concept and its applic
- Authors: Akira Ishikawa; Hiroshi Mieno
- Category: Article
- Year: 1979
- Publisher: Elsevier Science
- Language: English
- File size: 617 KB
Entropy of Endomorphisms and Relative En
- Author: Erling Størmer
- Category: Article
- Year: 2000
- Publisher: Elsevier Science
- Language: English
- File size: 186 KB
We show the analogue, for the entropy of automorphisms of finite von Neumann algebras, of the classical formula $H(T)=H\left(\bigvee_{i=0}^{\infty} T^{-i}P \,\middle|\, \bigvee_{i=1}^{\infty} T^{-i}P\right)$, where $T$ is a measure-preserving transformation of a probability space and $P$ is a generator.
Research group on relative information a
- Author: Guy Jumarie
- Category: Article
- Year: 1988
- Publisher: Elsevier Science
- Language: English
- File size: 283 KB
An upper bound for the entropy and its a
- Authors: Y. Alhassid; N. Agmon; R.D. Levine
- Category: Article
- Year: 1978
- Publisher: Elsevier Science
- Language: English
- File size: 442 KB
Linear regression model of DNA sequences
- Authors: Qi Dai; Xiao-Qing Liu; Tian-Ming Wang; Damir Vukicevic
- Category: Article
- Year: 2007
- Publisher: John Wiley and Sons
- Language: English
- File size: 412 KB
## Abstract We constructed six new models to analyze DNA sequences. First, we regarded a DNA primary sequence as a random process in __t__ and gave three ways to define the nucleotides' random distribution functions. We extracted some parameters from the linear model and analyzed the changes of the
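The modeling idea in this abstract can be illustrated with a minimal sketch: treat a DNA sequence as a process in t by tracking the cumulative count of one nucleotide, then fit a least-squares line and read off its slope. This is a hypothetical toy, not any of the paper's six models; the sequence, function names, and the choice of "cumulative count" as the distribution function are all my own assumptions.

```python
def cumulative_count(seq, nucleotide):
    """Running count of `nucleotide` at each position along the sequence."""
    counts, c = [], 0
    for base in seq:
        if base == nucleotide:
            c += 1
        counts.append(c)
    return counts

def linear_fit(xs, ys):
    """Ordinary least-squares fit ys ~ slope * xs + intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

seq = "ATGCGATACGATT"                 # toy sequence, not real data
t = list(range(1, len(seq) + 1))      # position index as "time"
y = cumulative_count(seq, "A")
slope, intercept = linear_fit(t, y)   # slope ~ occurrence rate of A
```

The slope here acts as a crude single-nucleotide rate parameter; the paper's models presumably extract richer parameters, but the fit-then-compare workflow is the same shape.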