𝔖 Bobbio Scriptorium
✦   LIBER   ✦

A functional neural network computing some eigenvalues and eigenvectors of a special real matrix

โœ Scribed by Yiguang Liu; Zhisheng You; Liping Cao


Publisher
Elsevier Science
Year
2005
Tongue
English
Weight
256 KB
Volume
18
Category
Article
ISSN
0893-6080

No coin nor oath required. For personal study only.

✦ Synopsis


How to quickly compute eigenvalues and eigenvectors of a matrix, especially a general real matrix, is a significant problem in engineering. Because neural networks run asynchronously and concurrently and can therefore achieve high speed, this paper designs a concise functional neural network (FNN) to extract some eigenvalues and eigenvectors of a special real matrix. After the FNN is transformed into an equivalent complex differential equation and its analytic solution is obtained, the convergence properties of the FNN are analyzed. If there is a unique eigenvalue whose imaginary part is nonzero and the largest among all eigenvalues, the FNN converges, from a general nonzero initial vector, to the eigenvector corresponding to this eigenvalue. If all eigenvalues are real, or if more than one eigenvalue attains the largest imaginary part, the FNN either converges to the zero point or falls into a cycle. Compared with other neural networks designed for the same purpose, the restriction on the matrix is very slack. Finally, three examples illustrate the performance of the FNN.
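The convergence behaviour described in the synopsis can be illustrated numerically. The sketch below is not the authors' FNN; it assumes hypothetical dynamics of the form z′ = −iAz (a choice consistent with growth of the component whose eigenvalue has the largest imaginary part, since that mode evolves like e^{Im(λ)t}), integrated by forward Euler with per-step normalization, and recovers the eigenvalue via the Rayleigh quotient.

```python
import numpy as np

def dominant_imag_eigpair(A, steps=20000, dt=1e-3, seed=0):
    """Sketch of an FNN-like flow: integrate z' = -iAz (assumed dynamics,
    not the paper's exact network) from a random nonzero complex vector.
    The mode whose eigenvalue has the largest imaginary part grows fastest,
    so the normalized state aligns with its eigenvector."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    z = rng.standard_normal(n) + 1j * rng.standard_normal(n)
    z /= np.linalg.norm(z)
    for _ in range(steps):
        z = z + dt * (-1j) * (A @ z)   # forward-Euler step of z' = -iAz
        z /= np.linalg.norm(z)         # keep the state bounded
    lam = z.conj() @ A @ z             # Rayleigh quotient: eigenvalue estimate
    return lam, z

# Example: eigenvalues of A are 1 + 2i, 1 - 2i and 0.5, so the flow should
# settle on the eigenpair of 1 + 2i (unique largest imaginary part).
A = np.array([[1.0, -2.0, 0.0],
              [2.0,  1.0, 0.0],
              [0.0,  0.0, 0.5]])
lam, v = dominant_imag_eigpair(A)
```

With the example matrix above, `lam` approaches 1 + 2i and `v` the corresponding unit eigenvector; if all eigenvalues were real, no single mode would dominate in magnitude, mirroring the degenerate cases discussed in the abstract.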


📜 SIMILAR VOLUMES


Another neural network based approach fo
โœ Ying Tang; Jianping Li ๐Ÿ“‚ Article ๐Ÿ“… 2010 ๐Ÿ› Elsevier Science ๐ŸŒ English โš– 390 KB

This paper introduces a novel neural network based approach for extracting the eigenvalues with the largest or smallest modulus of real skew-symmetric matrices, as well as the corresponding eigenvectors. To this end, unlike the previous neural network based methods that can be summarized by some 2n-

Computation of eigenvalues and eigenvect
โœ David J. Evans ๐Ÿ“‚ Article ๐Ÿ“… 1977 ๐Ÿ› Elsevier Science ๐ŸŒ English โš– 556 KB

A recursive algorithm for the implicit derivation of the determinant of a symmetric quindiagonal matrix is developed in terms of its leading principal minors. The algorithm is shown to yield a Sturmian sequence of polynomials from which the eigenvalues can be obtained by use of the bisection process

A recurrent neural network computing the
โœ Yiguang Liu; Zhisheng You; Liping Cao ๐Ÿ“‚ Article ๐Ÿ“… 2007 ๐Ÿ› Elsevier Science ๐ŸŒ English โš– 820 KB

As the efficient calculation of eigenpairs of a matrix, especially, a general real matrix, is significant in engineering, and neural networks run asynchronously and can achieve high performance in calculation, this paper introduces a recurrent neural network (RNN) to extract some eigenpair. The RNN,

Iterative calculation of eigenvalues and
โœ G. A. Gallup ๐Ÿ“‚ Article ๐Ÿ“… 1982 ๐Ÿ› John Wiley and Sons ๐ŸŒ English โš– 262 KB

An improved method for obtaining a few eigenvalues and eigenvectors of the symmetric matrix system is presented: where **S** ≠ **I**. The method allows us to handle larger systems more easily than any other known to the author. It requires the inversion of **S**, an *N*³ step, but

Matrix pseudo-spectroscopy: iterative ca
โœ Gregory A. Parker; Wei Zhu; Youhong Huang; David K. Hoffman; Donald J. Kouri ๐Ÿ“‚ Article ๐Ÿ“… 1996 ๐Ÿ› Elsevier Science ๐ŸŒ English โš– 597 KB

The method of diagonalizing Hermitian matrices based on a polynomial expansion of the Dirac delta function δ(E − H) is further refined so as to accelerate the convergence. Improved choices of the bases used for subspace diagonalization of the matrix, along with accuracy controls and estimates, are i