Publications catalog - books



Advances in Neural Networks: 4th International Symposium on Neural Networks, ISNN 2007, Nanjing, China, June 3-7, 2007, Proceedings, Part III

Derong Liu ; Shumin Fei ; Zengguang Hou ; Huaguang Zhang ; Changyin Sun (eds.)

Conference: 4th International Symposium on Neural Networks (ISNN), Nanjing, China, June 3-7, 2007

Abstract/Description – provided by the publisher

Not available.

Keywords – provided by the publisher

Computation by Abstract Devices; Computer Communication Networks; Algorithm Analysis and Problem Complexity; Discrete Mathematics in Computer Science; Artificial Intelligence (incl. Robotics); Pattern Recognition

Availability

Detected institution: none · Year of publication: 2007 · Online access: SpringerLink

Information

Resource type:

books

Print ISBN

978-3-540-72394-3

Electronic ISBN

978-3-540-72395-0

Publisher

Springer Nature

Country of publication

United Kingdom

Publication date

Copyright information

© Springer-Verlag Berlin Heidelberg 2007

Table of contents

A Novel Global Hybrid Algorithm for Feedforward Neural Networks

Hongru Li; Hailong Li; Yina Du

This paper presents a novel global optimization hybrid algorithm for training feedforward neural networks. When the weights are adjusted with the quasi-Newton (QN) method, the error function may become stuck in a local minimum. To address this problem, a new filled function is constructed and proved correct, and it is combined with the QN method to form a global optimization hybrid algorithm. When the network is trained with this hybrid algorithm and the error function is trapped at a local minimum, the algorithm helps the network escape it, after which the weights are adjusted further until the global minimum of the weight vector is found. An illustrative example demonstrates the effectiveness of the presented scheme.

- Feedforward Neural Networks | Pp. 9-16
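The hybrid scheme this abstract outlines — run a local optimizer, and when it stalls at a local minimum, descend a filled function centred there until a lower basin appears, then restart — can be sketched in numpy. Everything below is an illustrative assumption (a toy 1-D objective, a simple filled-function form, and plain gradient descent standing in for the quasi-Newton step), not the paper's actual construction:

```python
import numpy as np

def f(x):          # toy objective: local minimum near x = 1.35, global near x = -1.47
    return x**4 - 4 * x**2 + x

def grad_f(x):
    return 4 * x**3 - 8 * x + 1

def local_descent(x, lr=0.01, steps=2000):
    # stand-in for the quasi-Newton local search
    for _ in range(steps):
        x -= lr * grad_f(x)
    return x

def filled(x, x_star, rho=0.5):
    # filled function centred at the local minimum x_star: maximal at x_star
    # and decreasing outward, so descending it pushes x out of the basin
    lift = max(f(x) - f(x_star), 0.0)
    return np.exp(-(x - x_star) ** 2 / rho) / (1.0 + lift)

def escape(x_star, steps=1000, step=0.01):
    # sign-of-gradient descent on the filled function; stop as soon as we
    # find a point strictly below f(x_star), i.e. a lower basin
    f_star = f(x_star)
    for direction in (1.0, -1.0):
        x = x_star + direction * 0.1
        for _ in range(steps):
            g = (filled(x + 1e-6, x_star) - filled(x - 1e-6, x_star)) / 2e-6
            x -= step * np.sign(g)
            if f(x) < f_star - 1e-9:
                return x
    return None

x_local = local_descent(2.0)        # trapped: ends near the local minimum
x_seed = escape(x_local)            # filled-function phase exits the basin
x_global = local_descent(x_seed)    # restart: converges to the global minimum
```

The first descent stalls near x ≈ 1.35; the filled-function phase finds a point with lower objective value, and the restarted descent settles near the global minimum at x ≈ -1.47.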

Study on Relationship Between NIHSS and TCM-SSASD Based on the BP Neural Network Multiple Models Method

Zhang Yanxin; Xu Junfeng; Gao Ying; Hou Zhongsheng

In this paper, the complex nonlinear relationship between NIHSS and TCM-SSASD is analyzed. Several methods based on BP neural networks are introduced to approximate this complex nonlinear relationship, and two multiple-model schemes are proposed to improve the approximation performance of the BP neural network. Comparison among these schemes shows that a definite complex nonlinear relationship exists between the two diagnosis sheets. This work can guide the diagnosis of apoplexy syndromes and assist further study of the relation between Western medicine and Traditional Chinese Medicine.

- Feedforward Neural Networks | Pp. 17-25

Momentum BP Neural Networks in Structural Damage Detection Based on Static Displacements and Natural Frequencies

Xudong Yuan; Chao Gao; Shaoxia Gao

Modeling error, measurement noise, and incomplete measured data are the main difficulties preventing many structural damage detection methods from being applied. In this study, static displacements and natural frequencies constitute the input parameter vectors for the neural networks. A numerical damage identification study on a five-bay truss was carried out using an improved momentum BP neural network. The identification results indicate that the neural networks have excellent capability to identify structural damage location and extent under conditions of limited noise and incomplete measured data.

- Feedforward Neural Networks | Pp. 35-40
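The "momentum" in a momentum BP network refers to a weight update that mixes the current gradient with the previous weight change, v ← μv − η∇E, w ← w + v, which damps oscillation and speeds convergence. A minimal sketch with synthetic data standing in for the displacement/frequency input vectors (network size, rates, and data are all illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# toy data standing in for the paper's input vectors
# (static displacements + natural frequencies -> damage extent)
X = rng.uniform(-1, 1, size=(40, 3))
y = (X**2).sum(axis=1, keepdims=True) / 3.0   # smooth synthetic target

# 3-8-1 network: tanh hidden layer, linear output
W1 = rng.normal(0, 0.5, (3, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)
params = [W1, b1, W2, b2]
velocity = [np.zeros_like(p) for p in params]

lr, mu = 0.02, 0.9
losses = []
for _ in range(3000):
    # forward pass
    h = np.tanh(X @ W1 + b1)
    out = h @ W2 + b2
    err = out - y
    losses.append(float((err**2).mean()))
    # backward pass (full-batch gradients)
    g_out = 2 * err / len(X)
    gW2 = h.T @ g_out;  gb2 = g_out.sum(axis=0)
    g_h = (g_out @ W2.T) * (1 - h**2)
    gW1 = X.T @ g_h;    gb1 = g_h.sum(axis=0)
    # momentum update: the velocity accumulates past weight changes
    for p, v, g in zip(params, velocity, [gW1, gb1, gW2, gb2]):
        v *= mu
        v -= lr * g
        p += v
```

With μ = 0 this reduces to plain BP; the momentum term carries the update through flat regions and shallow ravines of the error surface.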

Deformation Measurement of the Large Flexible Surface by Improved RBFNN Algorithm and BPNN Algorithm

Xiyuan Chen

The Radial Basis Function (RBF) Neural Network (NN) is one of the approaches that has shown great promise for this sort of problem because of its fast learning capacity. Considering the complexity of the deformation of a large flexible structure, this paper presents an information fusion method based on an improved RBFNN to deduce the deformation of the whole flexible surface. A distributed Strapdown Inertial Units (SIU) information fusion model for deformation measurement of the large flexible structure is presented. Comparing the modeling results of the improved RBFNN and a back-propagation (BP) NN, simulation on a simple thin-plate model shows that information fusion based on the improved RBFNN is effective and achieves higher precision than that based on the BPNN.

- Feedforward Neural Networks | Pp. 41-48
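The fast learning the abstract credits to RBF networks comes from their structure: inputs pass through fixed Gaussian basis functions, so only the linear output weights need fitting, which is a least-squares problem. A minimal sketch on a toy 1-D "deflection profile" (the centres, width, and data are illustrative assumptions, not the paper's improved algorithm):

```python
import numpy as np

# toy 1-D deformation profile standing in for surface measurements
x = np.linspace(0, 1, 20)
y = 0.1 * np.sin(2 * np.pi * x)            # "deflection" at sensor positions

centers = np.linspace(0, 1, 10)            # fixed RBF centres
width = 0.15

def design(pts):
    # Gaussian RBF features, one column per centre
    return np.exp(-((pts[:, None] - centers[None, :])**2) / (2 * width**2))

# training reduces to a linear least-squares solve for the output weights
G = design(x)
w, *_ = np.linalg.lstsq(G, y, rcond=None)

y_hat = design(x) @ w                      # reconstructed profile
rmse = float(np.sqrt(np.mean((y_hat - y)**2)))
```

Deflection at an unmeasured position is then just `design(np.array([x_new])) @ w`, which is how a fitted RBF surface interpolates between sensors.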

GA-Based Neural Network to Identification of Nonlinear Structural Systems

Grace S. Wang; Fu-Kuo Huang

The initial weights of a neural network (NN) are randomly selected, so the optimization algorithm used to train the NN may become stuck in a local minimum. The genetic algorithm (GA) is a parallel, global search technique that examines multiple points simultaneously and is therefore more likely to obtain a global solution. In this regard, a new algorithm combining GA and NN is proposed here: the GA is employed to search for the initial weights, and the NN determines the network topology. Through the iterative process of selection, reproduction, crossover, and mutation, the optimal weights can then be obtained. The proposed algorithm is applied to the Duffing oscillator and Wen's degrading nonlinear systems. Finally, the accuracy of the method is illustrated by comparing the predicted response with the measured one.

- Feedforward Neural Networks | Pp. 57-65
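The division of labour the abstract describes — a GA globally explores the weight space through selection, crossover, and mutation, and a gradient method refines the best candidate — can be sketched as follows. The fitness surface below is a stand-in for a network's training error (the multimodal Rastrigin function), and all parameters are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

def fitness(w):
    # multimodal surface standing in for the network training error;
    # many local minima, global minimum 0 at the origin (Rastrigin)
    return 10 * len(w) + float(np.sum(w**2 - 10 * np.cos(2 * np.pi * w)))

def grad(w):
    return 2 * w + 20 * np.pi * np.sin(2 * np.pi * w)

POP, DIM, GENS = 60, 4, 150
pop = rng.uniform(-5, 5, (POP, DIM))       # population of weight vectors

for _ in range(GENS):
    scores = np.array([fitness(ind) for ind in pop])
    elite = pop[np.argsort(scores)[:POP // 2]]        # selection: best half
    pa = elite[rng.integers(0, len(elite), POP)]
    pb = elite[rng.integers(0, len(elite), POP)]
    children = np.where(rng.random((POP, DIM)) < 0.5, pa, pb)   # crossover
    mutate = rng.random((POP, DIM)) < 0.2                       # mutation
    children = children + mutate * rng.normal(0, 0.1, (POP, DIM))
    children[:2] = elite[:2]               # elitism keeps the best survivors
    pop = children

best = min(pop, key=fitness)
w = best.copy()
for _ in range(300):                       # gradient refinement of the GA result
    w -= 0.001 * grad(w)
```

The GA supplies a starting point already in a good basin; the gradient phase then does what plain training does from a random start, but without the risk of landing in a poor local minimum.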

Approximation Capability Analysis of Parallel Process Neural Network with Application to Aircraft Engine Health Condition Monitoring

Gang Ding; Shisheng Zhong

The parallel process neural network (PPNN) is a novel spatio-temporal artificial neural network. Analyzing its approximation capability is important for enhancing the PPNN's adaptability to time series prediction. This paper analyzes the approximation capability of the PPNN and proves that it can approximate any continuous functional to any degree of accuracy. Finally, the PPNN is used to predict the iron concentration of the lubricating oil in aircraft engine health condition monitoring to highlight this capability, and the application test results indicate that the PPNN can serve as a useful predictive maintenance tool in aircraft engine condition monitoring.

- Feedforward Neural Networks | Pp. 66-72

Tourism Room Occupancy Rate Prediction Based on Neural Network

Junping Du; Wensheng Guo; Ruijie Wang

We studied how to use a neural network to predict the tourism room occupancy rate in Beijing and present the prediction results. The experiments show that the neural network prediction of the room occupancy rate is superior to two commonly used methods, regression and naïve extrapolation.

- Feedforward Neural Networks | Pp. 80-84

Recurrent Neural Networks on Duty of Anomaly Detection in Databases

Jaroslaw Skaruz; Franciszek Seredynski

In the paper we present a new approach based on neural networks to detect SQL attacks, i.e. attacks performed by exploiting SQL statements. The problem of detecting this class of attacks is transformed into a time series prediction problem, with SQL queries used as the source of events in a protected environment. To differentiate between normal SQL queries and those sent by an attacker, we divide SQL statements into tokens and pass them to our detection system, which predicts the next token given the previously seen ones. In the learning phase, tokens are passed to a recurrent neural network (RNN) trained by the backpropagation through time (BPTT) algorithm; the training targets are the input tokens shifted forward by one position in time. The purpose of the testing phase is to predict the next token in the sequence. All experiments were conducted on Jordan and Elman networks using data gathered from a PHP Nuke portal. Experimental results show that the Jordan network outperforms the Elman network, correctly predicting queries of length up to ten tokens.

- Recurrent Neural Networks | Pp. 85-94
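The detection pipeline reduces to: tokenize each SQL statement, feed the tokens one at a time to a recurrent network, and score how well the network predicts the next token. A minimal Elman-style sketch on a toy token stream; for brevity the gradient is truncated to one time step rather than using full BPTT as in the paper, and the data, sizes, and rates are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)

# toy token stream: one canonical query repeated, standing in for the
# normal SQL traffic of a protected application
query = ["SELECT", "name", "FROM", "users", "WHERE", "id", "=", "?"]
vocab = {t: i for i, t in enumerate(query)}
V, H = len(vocab), 12
ids = np.array([vocab[t] for t in query * 40])

Wx = rng.normal(0, 0.1, (H, V)); Wh = rng.normal(0, 0.1, (H, H)); bh = np.zeros(H)
Wo = rng.normal(0, 0.1, (V, H)); bo = np.zeros(V)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

lr = 0.1
for _ in range(30):                           # epochs over the stream
    h = np.zeros(H)
    for t in range(len(ids) - 1):
        x = np.eye(V)[ids[t]]
        h_prev = h
        h = np.tanh(Wx @ x + Wh @ h_prev + bh)   # Elman hidden-state update
        p = softmax(Wo @ h + bo)
        # cross-entropy gradient, truncated to one step (a simplification
        # of the full BPTT training used in the paper)
        d_logit = p.copy(); d_logit[ids[t + 1]] -= 1.0
        d_h = (Wo.T @ d_logit) * (1 - h**2)
        Wo -= lr * np.outer(d_logit, h); bo -= lr * d_logit
        Wx -= lr * np.outer(d_h, x); bh -= lr * d_h
        Wh -= lr * np.outer(d_h, h_prev)

# evaluate next-token accuracy on one clean pass; an attack query would
# produce a run of poorly predicted tokens and hence a low score
h = np.zeros(H); correct = 0
for t in range(len(ids) - 1):
    h = np.tanh(Wx @ np.eye(V)[ids[t]] + Wh @ h + bh)
    correct += np.argmax(Wo @ h + bo) == ids[t + 1]
acc = correct / (len(ids) - 1)
```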

Solving Variational Inequality Problems with Linear Constraints Based on a Novel Recurrent Neural Network

Youshen Xia; Jun Wang

Variational inequalities with linear inequality constraints are widely used in constrained optimization and engineering problems. By extending a new recurrent neural network [14], this paper presents a recurrent neural network for solving variational inequalities with general linear constraints in real time. The proposed neural network has a one-layer projection structure and is amenable to parallel implementation. As special cases, it includes two existing recurrent neural networks for solving convex optimization problems and monotone variational inequality problems with box constraints, respectively. The proposed neural network is stable in the sense of Lyapunov and globally convergent to the solution under a monotonicity condition on the nonlinear mapping, without requiring the Lipschitz condition. Illustrative examples show that the proposed neural network is effective for solving this class of variational inequality problems.

- Recurrent Neural Networks | Pp. 95-104
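A one-layer projection network of the kind the abstract describes evolves by dx/dt = P_Ω(x − F(x)) − x, whose equilibria are exactly the variational-inequality solutions. A minimal Euler-integrated sketch for a small monotone affine mapping under box constraints (the mapping, constraint set, and step size are illustrative assumptions, not the paper's example):

```python
import numpy as np

# monotone affine mapping F(x) = M x + q with M positive definite
M = np.array([[3.0, 1.0], [1.0, 2.0]])
q = np.array([-2.0, 1.0])
lo, hi = np.zeros(2), np.ones(2)

def F(x):
    return M @ x + q

def project(x):
    # projection onto the box [0, 1]^2 -- the "one layer" of the network
    return np.clip(x, lo, hi)

# Euler integration of the projection dynamics dx/dt = P(x - F(x)) - x
x = np.array([0.5, 0.5])
for _ in range(1000):
    x = x + 0.05 * (project(x - F(x)) - x)

# at an equilibrium, x is a fixed point of the projection map
residual = float(np.linalg.norm(x - project(x - F(x))))
```

For this M and q the trajectory settles at x ≈ (2/3, 0): the second component hits the boundary, where F₂(x) > 0 holds as the VI optimality condition requires.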

Convergence of Gradient Descent Algorithm for a Recurrent Neuron

Dongpo Xu; Zhengxue Li; Wei Wu; Xiaoshuai Ding; Di Qu

Probabilistic convergence results for the online gradient descent algorithm have been obtained by many authors for the training of recurrent neural networks with infinitely many training samples. This paper proves deterministic convergence of the offline gradient descent algorithm for a recurrent neural network with a finite number of training samples. Our results can hopefully be extended to more complicated recurrent neural networks, and serve as a complement to the existing probabilistic convergence results.

- Recurrent Neural Networks | Pp. 117-122