𝔖 Bobbio Scriptorium
✦   LIBER   ✦

A self-organizing neural-network-based fuzzy system

โœ Scribed by Yin Wang; Gang Rong


Publisher: Elsevier Science
Year: 1999
Tongue: English
Weight: 716 KB
Volume: 103
Category: Article
ISSN: 0165-0114


✦ Synopsis


A neural-network-based fuzzy system (NNFS) is proposed in this paper. The NNFS is a self-organizing neural network that partitions the input space flexibly, according to the distribution of the training data, so as to reduce the number of rules without any loss of modeling accuracy. Associated with the NNFS is a two-phase hybrid learning algorithm, which uses a nearest-neighborhood clustering scheme for structure learning and initial parameter setting, followed by a gradient-descent method for fine-tuning the parameters of the NNFS. By combining these two methods, learning converges much faster than with the original back-propagation algorithm. Simulation results suggest that the NNFS has the merits of a simple structure, fast learning, few fuzzy logic rules, and relatively high modeling accuracy. Finally, the NNFS is applied to the construction of a soft sensor for a distillation column.
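The two-phase scheme the synopsis describes, nearest-neighborhood clustering to grow rules and set initial parameters, then gradient descent to fine-tune, can be sketched for a simple Gaussian-membership fuzzy system. This is a minimal illustration, not the authors' exact NNFS: the cluster radius `r`, the shared width `sigma`, and the weight-only fine-tuning are all assumptions made here for brevity.

```python
# Hedged sketch of a two-phase hybrid learning scheme like the one the
# synopsis describes, on a zero-order Takagi-Sugeno / RBF-style fuzzy
# system. r, sigma, and the weight-only tuning are illustrative choices,
# not the paper's exact method.
import numpy as np

def nn_clustering(X, y, r):
    """Phase 1: nearest-neighborhood clustering.
    A sample farther than r from every existing centre spawns a new rule;
    otherwise it is folded into the nearest cluster by a running mean."""
    centers, weights, counts = [], [], []
    for x, t in zip(X, y):
        if centers:
            d = [np.linalg.norm(x - c) for c in centers]
            k = int(np.argmin(d))
            if d[k] <= r:
                counts[k] += 1
                centers[k] += (x - centers[k]) / counts[k]  # running mean
                weights[k] += (t - weights[k]) / counts[k]
                continue
        centers.append(np.array(x, dtype=float))
        weights.append(float(t))
        counts.append(1)
    return np.array(centers), np.array(weights)

def predict(X, centers, weights, sigma):
    """Gaussian memberships with normalized (centre-of-gravity) output."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    mu = np.exp(-d2 / (2 * sigma ** 2))
    return (mu * weights).sum(1) / (mu.sum(1) + 1e-12)

def fine_tune(X, y, centers, weights, sigma, lr=0.1, epochs=200):
    """Phase 2: gradient descent on the rule consequents only (the paper
    also tunes the premise parameters; omitted here for brevity)."""
    for _ in range(epochs):
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        mu = np.exp(-d2 / (2 * sigma ** 2))
        s = mu.sum(1) + 1e-12
        yhat = (mu * weights).sum(1) / s
        grad = ((yhat - y)[:, None] * mu / s[:, None]).mean(0)
        weights -= lr * grad
    return weights
```

On a toy task such as fitting y = sin(x), phase 1 typically yields far fewer rules than training samples, and phase 2 only has to polish the cluster-average consequents, which is why this hybrid converges faster than back-propagation from random initial weights.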


📜 SIMILAR VOLUMES


Generalized fuzzy inference neural netwo
โœ Hiroshi Kitajima; Masafumi Hagiwara ๐Ÿ“‚ Article ๐Ÿ“… 1999 ๐Ÿ› John Wiley and Sons ๐ŸŒ English โš– 194 KB ๐Ÿ‘ 2 views

A new model for generalized fuzzy inference neural networks (GFINN) is proposed in this paper. The networks consist of three layers: an input-output layer, an if layer, and a then layer. Each layer contains operational nodes. A GFINN can perform three representative fuzzy inference methods

Parts clustering by self-organizing map
โœ Ping-Feng Pai; E.S. Lee ๐Ÿ“‚ Article ๐Ÿ“… 2001 ๐Ÿ› Elsevier Science ๐ŸŒ English โš– 607 KB

The description of the attributes or characteristics of the individual parts in a feature-based clustering system is frequently vague, and linguistic, fuzzy-number, or fuzzy coding is ideally suited to represent these attributes. However, due to the vagueness of the description, the resulting fuzzy me

Convergence of a self-organizing stochas
โœ Olivier Francois; Jacques Demongeot; Thierry Herve ๐Ÿ“‚ Article ๐Ÿ“… 1992 ๐Ÿ› Elsevier Science ๐ŸŒ English โš– 451 KB

In this paper, we focus on the convergence of a stochastic neural process. In this process, a "physiologically plausible" Hebb's learning rule gives rise to a self-organization phenomenon. Some preliminary results concern the asymptotic behaviour of the network given that the update of neurons is eit

Extracting rules from a (fuzzy/crisp) re
โœ A. Blanco; M. Delgado; M. C. Pegalajar ๐Ÿ“‚ Article ๐Ÿ“… 2000 ๐Ÿ› John Wiley and Sons ๐ŸŒ English โš– 347 KB ๐Ÿ‘ 2 views

Although the extraction of symbolic knowledge from trained feedforward neural networks has been widely studied, research on recurrent neural networks (RNNs) has been more neglected, even though they perform better in areas such as control, speech recognition, time series prediction, etc. Nowadays,

Early lexical development in a self-orga
โœ Ping Li; Igor Farkas; Brian MacWhinney ๐Ÿ“‚ Article ๐Ÿ“… 2004 ๐Ÿ› Elsevier Science ๐ŸŒ English โš– 404 KB

In this paper we present a self-organizing neural network model of early lexical development called DevLex. The network consists of two self-organizing maps (a growing semantic map and a growing phonological map) that are connected via associative links trained by Hebbian learning. The model capture