Statistical Inference Based on Divergence Measures
By Pardo L., Llorente L. P.
- Year: 2005
- Language: English
- Pages: 492
- Category: Library
Synopsis
Pardo (statistics and operations research, Complutense U. of Madrid, Spain) analyzes problems of statistical inference, such as estimation and hypothesis testing, using measures of entropy and divergence. He discusses information theory, the asymptotic behavior of entropy measures in statistical problems, statistical analysis of discrete multivariate data, goodness-of-fit under simple and composite null hypotheses, optimality of phi-divergence test statistics, minimum phi-divergence estimators, loglinear models, contingency tables, and testing in general populations.
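The phi-divergence test statistics mentioned in the synopsis include Pearson's chi-square and the likelihood-ratio statistic as special cases of the Cressie-Read power-divergence family. As a minimal illustrative sketch (not taken from the book; the function name and example data are hypothetical), a goodness-of-fit statistic from this family can be computed as follows:

```python
import numpy as np

def power_divergence_stat(observed, expected, lam=1.0):
    """Cressie-Read power-divergence goodness-of-fit statistic.

    lam = 1   gives Pearson's chi-square statistic;
    lam -> 0  gives the likelihood-ratio statistic G^2 (handled as a limit).
    """
    observed = np.asarray(observed, dtype=float)
    expected = np.asarray(expected, dtype=float)
    if abs(lam) < 1e-12:
        # Limit lam -> 0: 2 * sum(O * log(O / E))
        return 2.0 * np.sum(observed * np.log(observed / expected))
    return (2.0 / (lam * (lam + 1.0))) * np.sum(
        observed * ((observed / expected) ** lam - 1.0)
    )

# Example: 60 rolls of a die, testing the fair-die null hypothesis
obs = [8, 9, 12, 11, 10, 10]
exp = [10] * 6
print(round(power_divergence_stat(obs, exp, lam=1.0), 3))  # Pearson chi-square
```

Under the null hypothesis, each member of this family is asymptotically chi-square distributed with (number of cells minus 1) degrees of freedom, which is what makes the family a unified alternative to the classical tests.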
SIMILAR VOLUMES
The idea of using functionals of Information Theory, such as entropies or divergences, in statistical inference is not new. However, in spite of the fact that divergence statistics have become a very good alternative to the classical likelihood ratio test and the Pearson-type statistic in discrete m
A coherent, unified set of statistical methods, based on ranks, for analyzing data resulting from various experimental designs. Uses MINITAB, a statistical computing system for the implementation of the methods. Assesses the statistical and stability properties of the methods through asymptotic effi
This book presents a study of statistical inferences based on the kernel-type estimators of distribution functions. The inferences involve matters such as quantile estimation, nonparametric tests, and mean residual life expectation, to name just some. Convergence rates for the kernel estimators of d