Strong Consistency of Bayes Estimates in Stochastic Regression Models
By Inchi Hu
- Publisher
- Elsevier Science
- Year
- 1996
- Language
- English
- File size
- 517 KB
- Volume
- 57
- Category
- Article
- ISSN
- 0047-259X
Synopsis
Under minimal assumptions on the stochastic regressors, strong consistency of Bayes estimates is established in stochastic regression models in two cases:
(1) When the prior distribution is discrete, the p.d.f. f of the i.i.d. random errors is assumed to have finite Fisher information, I = ∫ (f′(x))²/f(x) dx < ∞; (2) for general priors, f is assumed to be strongly unimodal. The result can be considered an application of a theorem of Doob to stochastic regression models.
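The finite-Fisher-information condition in case (1) can be checked numerically for a given error density. Below is a minimal sketch (not from the paper; the function name and discretization grid are illustrative choices) that approximates I = ∫ (f′(x))²/f(x) dx for the standard normal density, whose Fisher information for the location family is exactly 1.

```python
import numpy as np

def fisher_information(f, df, lo=-10.0, hi=10.0, n=200001):
    """Approximate I = integral of (f'(x))^2 / f(x) dx on [lo, hi].

    f  : density function
    df : its derivative
    The integral is truncated to [lo, hi] and computed by a Riemann sum,
    which is adequate for light-tailed densities like the normal.
    """
    x = np.linspace(lo, hi, n)
    fx = f(x)
    # Guard against division by (numerically) zero density in the tails.
    integrand = np.where(fx > 1e-300, df(x) ** 2 / np.maximum(fx, 1e-300), 0.0)
    dx = x[1] - x[0]
    return float(np.sum(integrand) * dx)

# Standard normal: f(x) = exp(-x^2/2)/sqrt(2*pi), so f'(x) = -x * f(x),
# and the integrand (f')^2/f reduces to x^2 f(x), whose integral is Var = 1.
f = lambda x: np.exp(-x ** 2 / 2) / np.sqrt(2 * np.pi)
df = lambda x: -x * f(x)

I = fisher_information(f, df)
print(I)  # approximately 1.0
```

A density with heavier tails or sharp spikes may fail the condition, in which case the truncated numerical integral grows without settling as [lo, hi] widens.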
SIMILAR VOLUMES
The strong universal pointwise consistency of some modified versions of the standard regression function estimates of partitioning, kernel, and nearest neighbor type is shown.
In this paper we propose a new approach for estimating the unknown parameter in the stochastic linear regression model with a stationary ergodic sequence of covariates. Under mild conditions on the joint distribution of the covariate and the error, the estimator constructed is shown to be strongly consistent.
Conditions for superiority of the minimum dispersion estimator over another with respect to the covariance matrix are derived when the vector parameter of a regression model is subject to competing stochastic restrictions. The restrictions may also consist of both a deterministic part and a stochastic part.
Conditions for the strong consistency of the parameter estimates in direct self-tuning control algorithms based on stochastic approximation are derived in the general delay, general dither case, and the influence of a priori information on the estimator dimension is analyzed.