A new criterion for variable selection
By R. Philips; I. Guttman
- Publisher
- Elsevier Science
- Year
- 1998
- Language
- English
- File size
- 493 KB
- Volume
- 38
- Category
- Article
- ISSN
- 0167-7152
Synopsis
The variable/model selection problem is re-examined from a Bayesian perspective, using data splitting to establish a joint prior for the relevant parameters. This allows the required integrations to be performed over a parameter space of the same dimension for every candidate model, and it yields a result that is independent of the scaling of both the independent and the dependent variables. The posterior probability of each model M_γ is calculated, where the subscript γ indexes the subsets of the predictor variables. This probability is shown to be asymptotically equal to 1 if M_γ is the correct model. A new model selection criterion is also derived from this expression. Examples using simulated data and real data sets are provided.
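The synopsis does not give the computational details, but the general idea it describes (split the data, use one part to form a proper prior, and score each subset of predictors by its posterior probability on the remainder) can be sketched roughly as follows. Everything below is an illustrative assumption rather than the authors' construction: the 50/50 split, the conjugate normal prior centred at the first-half least-squares fit, and the equal prior weights on models.

```python
import itertools
import numpy as np

def split_data(X, y, frac=0.5, seed=0):
    """Randomly split the data into a 'prior' part and a scoring part.
    The 50/50 split fraction is an assumption for illustration only."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(y))
    n1 = int(frac * len(y))
    a, b = idx[:n1], idx[n1:]
    return X[a], y[a], X[b], y[b]

def log_marginal(X1, y1, X2, y2, c=1.0):
    """Log marginal likelihood of the held-out responses y2 for one
    predictor subset: a least-squares fit on (X1, y1) supplies the prior
    mean and scale, and y2 is scored under the implied Gaussian
    predictive distribution.  A rough conjugate stand-in for the paper's
    prior; assumes the first split has more rows than predictors."""
    n1, k = X1.shape
    XtX = X1.T @ X1
    beta_hat = np.linalg.solve(XtX, X1.T @ y1)
    resid = y1 - X1 @ beta_hat
    s2 = resid @ resid / max(n1 - k, 1)           # noise variance estimate
    V = c * s2 * np.linalg.inv(XtX)               # prior covariance of beta
    m = X2 @ beta_hat                             # predictive mean of y2
    S = s2 * np.eye(len(y2)) + X2 @ V @ X2.T      # predictive covariance of y2
    sign, logdet = np.linalg.slogdet(S)
    quad = (y2 - m) @ np.linalg.solve(S, y2 - m)
    return -0.5 * (len(y2) * np.log(2 * np.pi) + logdet + quad)

def posterior_model_probs(X, y):
    """Enumerate all non-empty subsets of predictor columns, score each
    by its log marginal likelihood on the held-out split, and normalise
    under equal prior model weights."""
    X1, y1, X2, y2 = split_data(X, y)
    p = X.shape[1]
    models, scores = [], []
    for k in range(1, p + 1):
        for cols in itertools.combinations(range(p), k):
            models.append(cols)
            scores.append(log_marginal(X1[:, cols], y1, X2[:, cols], y2))
    scores = np.array(scores)
    w = np.exp(scores - scores.max())
    return dict(zip(models, w / w.sum()))
```

Because every subset is scored through the same predictive density of the held-out responses, the comparison is over the same-dimensional quantity for each model, which is the point the synopsis emphasises.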
SIMILAR VOLUMES
Model or variable selection is usually achieved by ranking models in increasing order of preference. One such method applies the Kullback-Leibler distance, or relative entropy, as a selection criterion. Yet this raises two questions: why use this criterion, and are there any other…
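The excerpt breaks off here, but the Kullback-Leibler ranking it refers to can be illustrated concretely: for Gaussian linear models, AIC is a standard estimate of relative expected Kullback-Leibler discrepancy, so ranking predictor subsets by AIC is one common instance of such a criterion. The code below is a minimal sketch under that reading, not taken from the cited work.

```python
import itertools
import numpy as np

def aic_gaussian(X, y):
    """AIC for a Gaussian linear model fitted by least squares.
    AIC estimates (up to a constant) the expected Kullback-Leibler
    discrepancy between the fitted model and the data-generating one."""
    n, k = X.shape
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    rss = np.sum((y - X @ beta) ** 2)
    loglik = -0.5 * n * (np.log(2 * np.pi * rss / n) + 1)
    return 2 * (k + 1) - 2 * loglik               # +1 for the noise variance

def rank_subsets_by_aic(X, y):
    """Rank every non-empty subset of predictor columns by AIC,
    smallest (most preferred under the KL view) first."""
    p = X.shape[1]
    scored = []
    for k in range(1, p + 1):
        for cols in itertools.combinations(range(p), k):
            scored.append((aic_gaussian(X[:, cols], y), cols))
    return sorted(scored)
```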