Analysis of the posterior for spline estimators in logistic regression
Authors: Nandini Raghavan; Dennis D. Cox
- Publisher
- Elsevier Science
- Year
- 1998
- Language
- English
- Size
- 172 KB
- Volume
- 71
- Category
- Article
- ISSN
- 0378-3758
Synopsis
A 'partially improper' Gaussian prior is considered for Bayesian inference in logistic regression. This includes generalized smoothing spline priors that are used for nonparametric inference about the logit, and also priors that correspond to generalized linear mixed models. Necessary and sufficient conditions are given for the posterior to be a proper probability measure, and bounds are given for the tails of the posterior density. These results are applied to investigate Monte Carlo methods for approximating the posterior.
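The setting described in the synopsis can be illustrated with a minimal sketch: Bayesian logistic regression under a proper Gaussian prior (a simple special case of the 'partially improper' priors studied in the paper), with the posterior approximated by a random-walk Metropolis sampler. The prior scale `tau`, the step size, and the synthetic data below are all illustrative assumptions, not taken from the paper.

```python
import numpy as np

def log_posterior(beta, X, y, tau=10.0):
    """Log posterior for logistic regression with an N(0, tau^2 I) prior.
    (A proper special case; the paper's priors may be improper on part
    of the parameter space.)"""
    eta = X @ beta
    # Log-likelihood: sum_i [ y_i * eta_i - log(1 + exp(eta_i)) ], stably.
    loglik = np.sum(y * eta - np.logaddexp(0.0, eta))
    logprior = -0.5 * np.sum(beta ** 2) / tau ** 2
    return loglik + logprior

def metropolis(X, y, n_samples=1000, step=0.5, seed=0):
    """Random-walk Metropolis chain targeting the posterior."""
    rng = np.random.default_rng(seed)
    beta = np.zeros(X.shape[1])
    lp = log_posterior(beta, X, y)
    out = np.empty((n_samples, X.shape[1]))
    for i in range(n_samples):
        prop = beta + step * rng.standard_normal(beta.shape)
        lp_prop = log_posterior(prop, X, y)
        # Accept with probability min(1, posterior ratio).
        if np.log(rng.uniform()) < lp_prop - lp:
            beta, lp = prop, lp_prop
        out[i] = beta
    return out

# Synthetic data: intercept plus one covariate, true beta = (0.5, -1.0).
rng = np.random.default_rng(42)
X = np.column_stack([np.ones(200), rng.standard_normal(200)])
true_beta = np.array([0.5, -1.0])
p = 1.0 / (1.0 + np.exp(-X @ true_beta))
y = (rng.uniform(size=200) < p).astype(float)

samples = metropolis(X, y)
```

With a proper prior the posterior is automatically proper; the paper's conditions matter precisely when the prior is improper in some directions, in which case a sampler like the one above may run without error even though the target is not a probability measure.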
SIMILAR VOLUMES
In this note we discuss the breakdown behavior of the maximum likelihood (ML) estimator in the logistic regression model. We formally prove that the ML-estimator never explodes to infinity, but rather breaks down to zero when adding severe outliers to a data set. An example confirms this behavior.
A new modification of Berkson's minimum logit chi-squared estimator in simple linear logistic regression is suggested in order to achieve reduction of first-order bias of the estimator as well as in the model. Furthermore, unlike estimators currently available, our procedure is quite simple to apply.
The aim of this study was to estimate the risk of viable unbalanced offspring for a parental carrier of reciprocal translocation. On a large computerized database of reciprocal translocations we used logistic regression to model this risk. The status of the progeny is the outcome variable. Explanato