A fast snake model based on non-linear diffusion for medical image segmentation
- Authors: Min Wei; Yongjin Zhou; Mingxi Wan
- Publisher: Elsevier Science
- Year: 2004
- Language: English
- File size: 364 KB
- Volume: 28
- Category: Article
- ISSN: 0895-6111
## Synopsis
In this paper, the traditional snake model and the gradient vector flow (GVF) snake model are studied; both are slow because they require computing an inverse matrix. Moreover, the GVF field in the latter is generated by a biased linear diffusion process, which produces oscillations around the edges of the object. Based on a GVF field generated by non-linear diffusion, we present a fast GVF (FGVF) snake model that is much faster than both the traditional snake model and the GVF snake model, does not degrade stability or flexibility, and also reduces the oscillations around edges. Segmentation results using FGVF and an error analysis on simulated images are presented. Finally, FGVF is demonstrated on Computed Tomography and Magnetic Resonance images; the segmentation results are visually satisfactory and require much less computation time than earlier snakes.
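The idea sketched in the synopsis can be illustrated with a short numpy example. The loop below implements the standard iterative GVF diffusion (Xu and Prince's scheme), where the gradient of the edge map is spread across the image; an optional edge-stopping diffusivity `g` (a Perona–Malik-style choice, which is an assumption for illustration, not the paper's exact non-linear scheme) damps diffusion across strong edges, which is the kind of modification that suppresses oscillations near object boundaries:

```python
import numpy as np

def laplacian(a):
    # Five-point Laplacian with periodic boundaries (a simplification
    # for this sketch; real code would use reflecting boundaries).
    return (np.roll(a, 1, 0) + np.roll(a, -1, 0)
            + np.roll(a, 1, 1) + np.roll(a, -1, 1) - 4.0 * a)

def gvf(f, mu=0.15, iters=50, nonlinear=False, K=0.05):
    """Diffuse the gradient of edge map f (values in [0, 1]) into a
    vector field (u, v). With nonlinear=True, the diffusion is weighted
    by an edge-stopping coefficient instead of the constant mu alone."""
    fy, fx = np.gradient(f)          # gradient along rows, then columns
    mag2 = fx**2 + fy**2             # squared gradient magnitude
    # Edge-stopping diffusivity: ~1 in flat regions, ~0 at strong edges.
    g = np.exp(-mag2 / K) if nonlinear else np.ones_like(f)
    u, v = fx.copy(), fy.copy()
    for _ in range(iters):
        # Diffuse everywhere, but keep the field anchored to the true
        # gradient where the edge response mag2 is large.
        u += mu * g * laplacian(u) - mag2 * (u - fx)
        v += mu * g * laplacian(v) - mag2 * (v - fy)
    return u, v

# Demo on a synthetic vertical step edge.
f = np.zeros((32, 32))
f[:, 16:] = 1.0
u, v = gvf(f, nonlinear=True)
```

After the loop, `u` is non-zero well away from the edge at column 16, which is the property that gives GVF snakes their large capture range; the anchoring term `mag2 * (u - fx)` keeps the field faithful to the original gradient on the edge itself.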
## Similar volumes
The paper describes the development of a constitutive model for a poorly graded sand, which was used in geotechnical experiments on buried pipes (reported elsewhere). The sand was tested extensively in the laboratory to determine the state parameter constants. Triaxial tests on the sand included con…
This short communication comments on a series of papers using artificial neural networks published by Guessasma and co-workers in structures journals. The issues discussed include the size of the database for training a neural network, database enlargement for training a neural network, and extrapol…