We consider a wide class of statistics, namely C-divergences, and obtain their asymptotic distributions in nested models. Our results generalize previous results in this field.
Minimum Kφ-divergence estimator
✍ Written by T. Pérez; J. A. Pardo
- Publisher: Elsevier Science
- Year: 2004
- Language: English
- File size: 379 KB
- Volume: 17
- Category: Article
- ISSN: 0893-9659
✦ Synopsis
In the present work, the problem of estimating the parameters of statistical models for categorical data is analyzed. The minimum Kφ-divergence estimator is obtained by minimizing the Kφ-divergence measure between the theoretical and the empirical probability vectors. Its asymptotic properties are derived. A simulation study indicates that this estimator is an attractive alternative to the classical maximum likelihood estimator. © 2004 Elsevier Ltd. All rights reserved.
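To make the idea of a minimum-divergence estimator concrete, here is a minimal sketch for multinomial data: the empirical probability vector is formed from observed counts and a divergence to the model probabilities is minimized over the parameter. The exact Kφ-divergence of the paper is not reproduced on this page, so a generic φ-divergence with the Kullback-Leibler generator stands in for it, and the three-cell model `model_probs` is a hypothetical example, not one from the paper.

```python
# Minimal sketch of a minimum phi-divergence estimator for multinomial data.
# Assumptions (not from the paper): phi(t) = t*log(t) - t + 1 is a stand-in
# generator, and model_probs is a hypothetical three-cell parametric model.

import numpy as np
from scipy.optimize import minimize_scalar

def phi(t):
    """KL-type generator phi(t) = t*log(t) - t + 1 (stand-in choice)."""
    return t * np.log(t) - t + 1.0

def phi_divergence(p_hat, p_theta):
    """D_phi(p_hat, p_theta) = sum_j p_theta_j * phi(p_hat_j / p_theta_j)."""
    return np.sum(p_theta * phi(p_hat / p_theta))

def model_probs(theta):
    """Hypothetical model: p(theta) = (theta, theta**2, 1 - theta - theta**2)."""
    return np.array([theta, theta**2, 1.0 - theta - theta**2])

# Empirical probability vector from observed multinomial counts.
counts = np.array([52, 9, 39])
p_hat = counts / counts.sum()

# Minimum-divergence estimate: minimize the divergence over the parameter space.
res = minimize_scalar(lambda th: phi_divergence(p_hat, model_probs(th)),
                      bounds=(1e-6, 0.6), method="bounded")
print("minimum phi-divergence estimate:", res.x)
```

With the KL generator this minimization coincides with maximum likelihood; other choices of the generator yield different members of the minimum-divergence family, which is the sense in which such estimators are alternatives to the MLE.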
📜 SIMILAR VOLUMES
In this paper we consider statistical problems involving experiments whose observations do not provide exact information but may be assimilated with fuzzy information. First, we present the minimum φ-divergence estimator θ̂_k for the unknown parameter θ on the basis of the φ-divergence betwe…
gives a goodness-of-fit statistic for multinomially distributed data. We define a generalized f-divergence that unifies the f-divergence approach with that of C. R. Rao and S. K. Mitra ("Generalized Inverse of Matrices and Its Applications," Wiley, New York, 1971) and derive weak convergence to a q…