𝔖 Scriptorium
✦   LIBER   ✦


Statistical Inference Based on Divergence Measures

โœ Scribed by Leandro Pardo


Publisher
Chapman and Hall/CRC
Year
2005
Tongue
English
Leaves
497
Series
Statistics: A Series of Textbooks and Monographs
Category
Library

⬇  Acquire This Volume

No coin nor oath required. For personal study only.

✦ Synopsis


The idea of using functionals of Information Theory, such as entropies or divergences, in statistical inference is not new. However, in spite of the fact that divergence statistics have become a very good alternative to the classical likelihood ratio test and the Pearson-type statistic in discrete models, many statisticians remain unaware of this powerful approach.

Statistical Inference Based on Divergence Measures explores classical problems of statistical inference, such as estimation and hypothesis testing, on the basis of measures of entropy and divergence. The first two chapters form an overview, from a statistical perspective, of the most important measures of entropy and divergence and study their properties. The author then examines the statistical analysis of discrete multivariate data, with emphasis on problems in contingency tables and loglinear models, using phi-divergence test statistics as well as minimum phi-divergence estimators. The final chapter looks at testing in general populations, presenting the interesting possibility of introducing alternative test statistics to classical ones such as the Wald, Rao, and likelihood-ratio statistics. Each chapter concludes with exercises that clarify the theoretical results and present additional results that complement the main discussions.
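As a flavor of the approach the synopsis describes, here is a small illustrative sketch (not taken from the book) of the Cressie-Read power-divergence family, a well-known subfamily of the phi-divergence test statistics: the parameter `lam = 1` recovers Pearson's chi-square statistic, and the `lam → 0` limit recovers the likelihood-ratio (G) statistic.

```python
import math

def power_divergence(observed, expected, lam):
    """Cressie-Read power-divergence statistic:
    2 / (lam * (lam + 1)) * sum( O * ((O / E)**lam - 1) ).
    lam = 1 gives Pearson's chi-square; lam -> 0 gives the G statistic."""
    if lam == 0:  # limiting case: likelihood-ratio / G statistic
        return 2.0 * sum(o * math.log(o / e)
                         for o, e in zip(observed, expected))
    return (2.0 / (lam * (lam + 1))) * sum(
        o * ((o / e) ** lam - 1.0) for o, e in zip(observed, expected)
    )

# Observed counts in a 1x4 table against equiprobable expected counts
obs = [30, 20, 25, 25]
exp = [25.0, 25.0, 25.0, 25.0]

# lam = 1 coincides with the classical Pearson chi-square statistic
pearson = sum((o - e) ** 2 / e for o, e in zip(obs, exp))
print(power_divergence(obs, exp, 1.0))  # → 2.0, equal to `pearson`
print(power_divergence(obs, exp, 0.0))  # likelihood-ratio statistic
```

Under the null hypothesis, each member of the family is asymptotically chi-square distributed with the same degrees of freedom, which is what makes these statistics drop-in alternatives to the Pearson and likelihood-ratio tests.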

Clear, comprehensive, and logically developed, this book offers a unique opportunity to gain not only a new perspective on some standard statistical problems, but also the tools to put it into practice.


📜 SIMILAR VOLUMES


Statistical Inference Based on Divergence Measures
✍ Pardo L., Llorente L. P. 📂 Library 📅 2005 🌐 English

Pardo (statistics and operations research, Complutense U. of Madrid, Spain) analyzes issues of statistical inference, such as estimation and hypothesis testing, using measures of entropy and divergence. He discusses Information Theory, the asymptotic behavior of measures of entropy in solving statistical …

Statistical inference based on divergence measures
✍ Leandro Pardo 📂 Library 📅 2006 🏛 Chapman & Hall/CRC 🌐 English

The idea of using functionals of Information Theory, such as entropies or divergences, in statistical inference is not new. However, in spite of the fact that divergence statistics have become a very good alternative to the classical likelihood ratio test and the Pearson-type statistic in discrete models …

Statistical inference based on ranks
โœ Hettmansperger, T.P. ๐Ÿ“‚ Library ๐Ÿ“… 1984 ๐Ÿ› Wiley ๐ŸŒ English

A coherent, unified set of statistical methods, based on ranks, for analyzing data resulting from various experimental designs. Uses MINITAB, a statistical computing system, for the implementation of the methods. Assesses the statistical and stability properties of the methods through asymptotic …

Statistical Inference Based on Kernel Di…
✍ Rizky Reza Fauzi; Yoshihiko Maesono 📂 Library 📅 2023 🏛 Springer Nature 🌐 English

This book presents a study of statistical inferences based on the kernel-type estimators of distribution functions. The inferences involve matters such as quantile estimation, nonparametric tests, and mean residual life expectation, to name just some. Convergence rates for the kernel estimators of distribution functions …