𝔖 Bobbio Scriptorium
✦   LIBER   ✦

Convergence analysis for a multiplicatively relaxed EM algorithm

✍ Scribed by Alfredo N. Iusem


Publisher
John Wiley and Sons
Year
1991
Tongue
English
Weight
810 KB
Volume
14
Category
Article
ISSN
0170-4214


✦ Synopsis



The expectation-maximization (EM) algorithm is an iterative procedure used to determine maximum likelihood estimators in situations of incomplete data. In the case of independent Poisson variables it converges to a solution of the problem min Σᵢ [⟨aⁱ, x⟩ − bᵢ log⟨aⁱ, x⟩] subject to x ⩾ 0. Thus, it can be used to solve systems of the form Ax = b, x ⩾ 0 (with A stochastic and b positive). It converges to a feasible solution if one exists and to an approximate one otherwise (the one that minimizes d(b, Ax), where d is the Kullback–Leibler information divergence). We study the convergence of the multiplicatively relaxed version proposed by Tanaka for use in positron emission tomography. We prove global convergence in the underrelaxed and unrelaxed cases. In the overrelaxed case we present a local convergence theorem together with two partial global results: the sequence generated by the algorithm is bounded and, if it converges, its limit point is a solution of the aforementioned problem.
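The relaxed iteration discussed in the abstract can be sketched in a few lines. This is a minimal illustration, assuming (as in Tanaka's proposal for PET) that the relaxation parameter ω enters as an exponent on the multiplicative EM factor, that A is column-stochastic with positive entries, and that b > 0; the function name and iteration count are illustrative, not taken from the paper.

```python
import numpy as np

def relaxed_em(A, b, omega=1.0, iters=500, x0=None):
    """Multiplicatively relaxed EM for Ax = b, x >= 0 (sketch).

    omega = 1 recovers the plain EM iteration; omega < 1 underrelaxes,
    omega > 1 overrelaxes. Assumes A column-stochastic and b > 0.
    """
    m, n = A.shape
    # Start from a uniform positive point; the iteration keeps x > 0.
    x = np.full(n, b.sum() / n) if x0 is None else np.asarray(x0, float).copy()
    for _ in range(iters):
        Ax = A @ x                    # current fit to the data b
        factor = A.T @ (b / Ax)       # multiplicative EM factor
        x = x * factor ** omega       # relaxation applied as an exponent
    return x
```

For a feasible system the unrelaxed case (omega = 1) drives Ax toward b; per the abstract, when no feasible point exists the limit instead minimizes the Kullback–Leibler divergence d(b, Ax).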


📜 SIMILAR VOLUMES


EM Algorithm for Segregation Analysis
✍ N. R. Achuthan; T. Krishnan 📂 Article 📅 1992 🏛 John Wiley and Sons 🌐 English ⚖ 713 KB
On the correct convergence of the EM alg…
✍ Jinwen Ma; Shuqun Fu 📂 Article 📅 2005 🏛 Elsevier Science 🌐 English ⚖ 235 KB

It is well known that the EM algorithm generally converges to a local maximum likelihood estimate. However, much evidence shows that the EM algorithm can converge correctly to the true parameters as long as the overlap of Gaussians in the sample data is small enough. This paper st…

Optimized convergence for multiple histo…
✍ Tristan Bereau; Robert H. Swendsen 📂 Article 📅 2009 🏛 Elsevier Science 🌐 English ⚖ 377 KB

We propose a new algorithm for solving the weighted histogram analysis method (WHAM) equations to estimate free energies from a set of Monte Carlo (MC) or molecular dynamics (MD) simulations. The algorithm, based on free-energy differences, provides a more natural way of approaching the problem an…

A remark on the strong convergence of th…
✍ Zhenyu Huang 📂 Article 📅 2010 🏛 Elsevier Science 🌐 English ⚖ 233 KB

This paper illustrates that the conditions and the main proof of two main theorems of Verma [R.U. Verma, The over-relaxed proximal point algorithm based on H-maximal monotonicity design and applications, Computers and Mathematics with Applications 55 (2008) 2673–2679] concerning the strong convergen…

Convergence analysis of a simple minor c…
✍ Dezhong Peng; Zhang Yi; Wenjing Luo 📂 Article 📅 2007 🏛 Elsevier Science 🌐 English ⚖ 541 KB

Minor component analysis (MCA) is a powerful statistical tool for signal processing and data analysis. Convergence of MCA learning algorithms is an important issue in practical applications. In this paper, we will propose a simple MCA learning algorithm to extract minor component from input signals.