Publications catalog - books



Advances in Neural Networks: 4th International Symposium on Neural Networks, ISNN 2007, Nanjing, China, June 3-7, 2007, Proceedings, Part I

Derong Liu ; Shumin Fei ; Zeng-Guang Hou ; Huaguang Zhang ; Changyin Sun (eds.)

Conference: 4th International Symposium on Neural Networks (ISNN), Nanjing, China, June 3-7, 2007

Abstract/Description – provided by the publisher

Not available.

Keywords – provided by the publisher

Artificial Intelligence (incl. Robotics); Computation by Abstract Devices; Computer Communication Networks; Algorithm Analysis and Problem Complexity; Discrete Mathematics in Computer Science; Pattern Recognition

Availability

Detected institution: none. Publication year: 2007. Available via: SpringerLink.

Information

Resource type:

books

Print ISBN

978-3-540-72382-0

Electronic ISBN

978-3-540-72383-7

Publisher

Springer Nature

Country of publication

United Kingdom

Publication date

2007

Publication rights information

© Springer-Verlag Berlin Heidelberg 2007

Table of contents

Existence and Stability of Periodic Solution of Non-autonomous Neural Networks with Delay

Minghui Jiang; Xiaohong Wang; Yi Shen

This paper investigates the existence and global stability of periodic solutions of non-autonomous neural networks with delay. The existence and uniqueness of periodic solutions of the neural networks are discussed, and a criterion for the stability of the periodic solutions is obtained using a matrix function inequality; an algorithm for checking the criterion is also provided. The results generalize and improve those in the existing references. Finally, an illustrative example is given to verify the results.

- Stability Analysis of Neural Networks | Pp. 952-957

Stability Analysis of Generalized Nonautonomous Cellular Neural Networks with Time-Varying Delays

Xiaobing Nie; Jinde Cao; Min Xiao

In this paper, a class of generalized nonautonomous cellular neural networks with time-varying delays is studied. By means of the Lyapunov functional method, an improved Young inequality x^α y^β ≤ αx + βy (0 ≤ α ≤ 1, α + β = 1, x, y > 0) and homeomorphism theory, several sufficient conditions are given guaranteeing the existence, uniqueness and global exponential stability of the equilibrium point. The proposed results generalize and improve previous works. An illustrative example is also given to demonstrate the effectiveness of the proposed results.

- Stability Analysis of Neural Networks | Pp. 958-967
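The Young inequality invoked above (assuming the weighted arithmetic-geometric-mean form, which matches the stated constraints) can be spot-checked numerically; the random sampling below is purely illustrative:

```python
import random

def young_holds(x, y, alpha):
    """Check the weighted Young (AM-GM) inequality
    x**alpha * y**beta <= alpha*x + beta*y, with beta = 1 - alpha."""
    beta = 1.0 - alpha
    # Small tolerance guards against floating-point round-off.
    return x**alpha * y**beta <= alpha * x + beta * y + 1e-12

random.seed(0)
trials = [(random.uniform(0.01, 10), random.uniform(0.01, 10), random.uniform(0, 1))
          for _ in range(1000)]
print(all(young_holds(x, y, a) for x, y, a in trials))  # True
```

Equality holds exactly when x = y, which is why the inequality is sharp enough to drive the stability estimates.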

LMI-Based Approach for Global Asymptotic Stability Analysis of Discrete-Time Cohen-Grossberg Neural Networks

Sida Lin; Meiqin Liu; Yanhui Shi; Jianhai Zhang; Yaoyao Zhang; Gangfeng Yan

The global asymptotic stability of discrete-time Cohen–Grossberg neural networks (CGNNs) with or without time delays is studied in this paper. The CGNNs are transformed into discrete-time interval systems, and several sufficient conditions of asymptotic stability for these interval systems are derived by constructing some suitable Lyapunov functionals. The obtained conditions are given in the form of linear matrix inequalities that can be checked numerically and very efficiently by resorting to the MATLAB LMI Control Toolbox.

- Stability Analysis of Neural Networks | Pp. 968-976
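The LMI route described above can be illustrated in miniature. For a plain linear discrete-time system (a toy stand-in for the paper's interval systems, with a made-up system matrix), solving the discrete Lyapunov equation and checking positive definiteness plays the role of the LMI feasibility test:

```python
import numpy as np
from scipy.linalg import solve_discrete_lyapunov

# Hypothetical Schur-stable system matrix (spectral radius < 1).
A = np.array([[0.5, 0.1],
              [-0.2, 0.4]])

# Solve the discrete Lyapunov equation A^T P A - P = -I.
# solve_discrete_lyapunov(M, Q) solves M X M^T - X + Q = 0, so pass A^T.
P = solve_discrete_lyapunov(A.T, np.eye(2))

# P > 0 certifies global asymptotic stability of x_{k+1} = A x_k,
# via the Lyapunov function V(x) = x^T P x.
print(np.all(np.linalg.eigvalsh(P) > 0))  # True
```

For the nonlinear interval systems in the paper the conditions are genuine LMIs rather than equations, which is why a numerical solver such as the MATLAB LMI Control Toolbox is needed there.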

Novel LMI Criteria for Stability of Neural Networks with Distributed Delays

Qiankun Song; Jianting Zhou

In this paper, the global asymptotic and exponential stability are investigated for a class of neural networks with distributed time-varying delays. By using an appropriate Lyapunov-Krasovskii functional and the linear matrix inequality (LMI) technique, two delay-dependent sufficient conditions in LMI form are obtained to guarantee the global asymptotic and exponential stability of the addressed neural networks. The proposed stability criteria do not require the monotonicity of the activation functions or the differentiability of the distributed time-varying delays, which means that the results generalize and further improve those in earlier publications. An example is given to show the effectiveness of the obtained conditions.

- Stability Analysis of Neural Networks | Pp. 977-985

Asymptotic Convergence Properties of Entropy Regularized Likelihood Learning on Finite Mixtures with Automatic Model Selection

Zhiwu Lu; Xiaoqing Lu; Zhiyuan Ye

In finite mixture modelling, it is crucial to select the number of components for a data set. We have proposed an entropy regularized likelihood (ERL) learning principle for finite mixtures to solve this model selection problem under regularization theory. In this paper, we further give an asymptotic analysis of ERL learning and find that global minimization of the ERL function in a simulated annealing manner (i.e., with the regularization factor gradually reduced to zero) leads to automatic model selection on finite mixtures with good parameter estimation. Compared with the EM algorithm, ERL learning can escape the local minima of the negative likelihood and remains robust with respect to initialization. Simulation experiments confirm the theoretical analysis.

- Stability Analysis of Neural Networks | Pp. 986-993

Existence and Stability of Periodic Solutions for Cohen-Grossberg Neural Networks with Less Restrictive Amplification

Haibin Li; Tianping Chen

The existence and global asymptotic stability of a large class of Cohen-Grossberg neural networks are discussed in this paper. Previous papers always assume that the amplification function has positive lower and upper bounds, which excludes a large class of functions. In our paper, the amplification function is only required to be positive. Moreover, the model discussed is general, the method used is direct, and the conditions needed are weak.

- Stability Analysis of Neural Networks | Pp. 994-1000

Global Exponential Convergence of Time-Varying Delayed Neural Networks with High Gain

Lei Zhang; Zhang Yi

This paper studies a general class of neural networks with time-varying delays whose neuron activations belong to the set of discontinuous monotone increasing functions. The discontinuities in the activations are an ideal model of the situation where the gain of the neuron amplifiers is very high. Because delay in combination with high-gain nonlinearities is a particularly harmful source of potential instability, conditions ensuring the global convergence of the neural network are derived in this paper.

- Stability Analysis of Neural Networks | Pp. 1001-1007

Differences in Input Space Stability Between Using the Inverted Output of Amplifier and Negative Conductance for Inhibitory Synapse

Min-Jae Kang; Ho-Chan Kim; Wang-Cheol Song; Junghoon Lee; Hee-Sang Ko; Jacek M. Zurada

In this paper, the difference between using the inverted neuron output and a negative resistor to express an inhibitory synapse is studied. We show that the total conductance seen at the neuron input differs between the two methods, and we prove that this total conductance affects system stability. We also propose a method to stabilize the input space and improve the system's performance by adjusting the input conductance between the neuron input and ground. PSpice is used for circuit-level simulation.

- Stability Analysis of Neural Networks | Pp. 1015-1024

Positive Solutions of General Delayed Competitive or Cooperative Lotka-Volterra Systems

Wenlian Lu; Tianping Chen

In this paper, we investigate the dynamical behavior of a general class of competitive or cooperative Lotka-Volterra systems with delays. Positive solutions and global stability of the nonnegative equilibrium are discussed. A delay-independent sufficient condition guaranteeing the existence of a globally stable equilibrium is given. A simulation verifying the theoretical results is also given.

- Stability Analysis of Neural Networks | Pp. 1034-1044
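Delay-independent stability of the kind claimed above can be observed in a toy simulation. The two-species competitive system below uses hypothetical, diagonally dominant coefficients (the intraspecific terms outweigh the delayed interspecific ones), so trajectories settle at the positive equilibrium (2/3, 2/3) regardless of the delay:

```python
# Euler discretization of a delayed competitive Lotka-Volterra system
# (hypothetical coefficients, chosen diagonally dominant):
#   x'(t) = x(t) * (1 - x(t) - 0.5 * y(t - tau))
#   y'(t) = y(t) * (1 - y(t) - 0.5 * x(t - tau))
dt, tau, steps = 0.01, 1.0, 20000
lag = int(tau / dt)
x = [0.2] * (lag + 1)  # constant initial history for x
y = [0.9] * (lag + 1)  # constant initial history for y
for _ in range(steps):
    xd, yd = x[-1 - lag], y[-1 - lag]  # delayed states
    x.append(x[-1] + dt * x[-1] * (1 - x[-1] - 0.5 * yd))
    y.append(y[-1] + dt * y[-1] * (1 - y[-1] - 0.5 * xd))
# Both species approach the positive equilibrium (2/3, 2/3).
print(round(x[-1], 3), round(y[-1], 3))  # 0.667 0.667
```

Diagonal dominance is one concrete way the delay drops out of the stability condition: the equilibrium of the Euler map coincides with that of the continuous system, so the discretization converges to the same point.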

Dynamics of Continuous-Time Neural Networks and Their Discrete-Time Analogues with Distributed Delays

Lingyao Wu; Liang Ju; Lei Guo

Discrete-time analogues of continuous-time neural networks with continuously distributed delays and periodic inputs are introduced. The discrete-time analogues are considered as numerical discretizations of the continuous-time networks, and we study their dynamical characteristics. By employing a Halanay-type inequality, we obtain easily verifiable sufficient conditions ensuring that every solution of the discrete-time analogue converges exponentially to the unique periodic solution. It is shown that the discrete-time analogues preserve the periodicity of the continuous-time networks.

- Stability Analysis of Neural Networks | Pp. 1054-1060
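A minimal sketch of the phenomenon the abstract describes, using a hypothetical scalar delayed network with a discrete (not distributed) delay and made-up coefficients satisfying a Halanay-type condition (decay rate a = 2 exceeds the delayed Lipschitz gain b = 1):

```python
import math

def simulate(x0, dt=0.01, tau=0.5, steps=30000):
    """Euler discretization of x'(t) = -2 x(t) + tanh(x(t - tau)) + sin(t),
    a toy delayed network driven by a periodic input."""
    lag = int(tau / dt)
    x = [x0] * (lag + 1)  # constant initial history
    for k in range(steps):
        t = k * dt
        x.append(x[-1] + dt * (-2 * x[-1] + math.tanh(x[-1 - lag]) + math.sin(t)))
    return x

xa, xb = simulate(5.0), simulate(-5.0)
# Trajectories from very different constant histories collapse onto the same
# periodic solution, so their difference decays exponentially to ~0.
print(abs(xa[-1] - xb[-1]) < 1e-6)  # True
```

That all trajectories of the discretization merge onto one periodic orbit is exactly the "discrete analogue preserves periodicity" claim, here checked on a single toy instance rather than proved.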