Convergence analysis for a multiplicatively relaxed EM algorithm
By Alfredo N. Iusem
- Publisher: John Wiley and Sons
- Year: 1991
- Language: English
- Size: 810 KB
- Volume: 14
- Category: Article
- ISSN: 0170-4214
Abstract
The expectation maximization (EM) algorithm is an iterative procedure used to determine maximum likelihood estimators in situations of incomplete data. In the case of independent Poisson variables it converges to a solution of a problem of the form min Σᵢ [⟨aⁱ, x⟩ − bᵢ log⟨aⁱ, x⟩] such that x ≥ 0. Thus, it can be used to solve systems of the form Ax = b, x ≥ 0 (with A stochastic and b positive). It converges to a feasible solution if one exists and to an approximate one otherwise (the one that minimizes d(b, Ax), where d is the Kullback-Leibler information divergence). We study the convergence of the multiplicatively relaxed version proposed by Tanaka for use in positron emission tomography. We prove global convergence in the underrelaxed and unrelaxed cases. In the overrelaxed case we present a local convergence theorem together with two partial global results: the sequence generated by the algorithm is bounded and, if it converges, its limit point is a solution of the aforementioned problem.
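To make the iteration discussed in the abstract concrete, below is a minimal sketch (not taken from the paper) of the classical EM update for the Poisson-type problem min Σᵢ [⟨aⁱ, x⟩ − bᵢ log⟨aⁱ, x⟩], x ≥ 0, together with a multiplicatively relaxed variant in which the EM factor is raised to a relaxation exponent ω, as is commonly attributed to Tanaka. The function names, the uniform starting point, and the fixed iteration count are illustrative assumptions, not the author's formulation.

```python
# Illustrative sketch: EM iteration for  min sum_i [<a^i, x> - b_i log <a^i, x>],  x >= 0,
# assuming A is column-stochastic (columns sum to 1) and b > 0, plus a multiplicative
# relaxation in which the EM update factor is raised to a power omega.
import numpy as np

def relaxed_em(A, b, omega=1.0, iters=500):
    """Multiplicatively relaxed EM for Ax ~ b, x >= 0.

    A     : (m, n) nonnegative, column-stochastic matrix
    b     : (m,) positive data vector
    omega : relaxation exponent (<1 underrelaxed, =1 plain EM, >1 overrelaxed)
    """
    m, n = A.shape
    x = np.ones(n)                      # strictly positive starting point (assumption)
    for _ in range(iters):
        Ax = A @ x                      # current fitted values <a^i, x>
        factor = A.T @ (b / Ax)         # classical EM multiplicative factor
        x = x * factor ** omega         # relaxed update; omega = 1 recovers plain EM
    return x

def kl_divergence(b, Ax):
    """Kullback-Leibler information divergence d(b, Ax) minimized by EM."""
    return float(np.sum(b * np.log(b / Ax) - b + Ax))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.random((20, 5))
    A /= A.sum(axis=0)                                  # make A column-stochastic
    b = A @ np.array([2.0, 1.0, 4.0, 0.5, 3.0])          # consistent system, b > 0
    x = relaxed_em(A, b, omega=0.8)                      # an underrelaxed run
    print("residual KL divergence:", kl_divergence(b, A @ x))
```

In this sketch the consistent case should drive d(b, Ax) toward zero; for inconsistent data the iterate would instead settle near a minimizer of the divergence, which is the behavior the abstract describes for the unrelaxed and underrelaxed cases.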
SIMILAR VOLUMES
It is well-known that the EM algorithm generally converges to a local maximum likelihood estimate. However, there have been many evidences to show that the EM algorithm can converge correctly to the true parameters as long as the overlap of Gaussians in the sample data is small enough. This paper st
We propose a new algorithm for solving the weighted histogram analysis method (WHAM) equations to estimate free energies out of a set of Monte Carlo (MC) or molecular dynamics (MD) simulations. The algorithm, based on free-energy differences, provides a more natural way of approaching the problem an
This paper illustrates that the conditions and the main proof of two main theorems of Verma [R.U. Verma, The over-relaxed proximal point algorithm based on H-maximal monotonicity design and applications, Computers and Mathematics with Applications 55 (2008) 2673-2679] concerning the strong convergen
Minor component analysis (MCA) is a powerful statistical tool for signal processing and data analysis. Convergence of MCA learning algorithms is an important issue in practical applications. In this paper, we will propose a simple MCA learning algorithm to extract minor component from input signals.