Publications catalog - books

Advances in Neural Networks: 4th International Symposium on Neural Networks, ISNN 2007, Nanjing, China, June 3-7, 2007, Proceedings, Part III

Derong Liu; Shumin Fei; Zengguang Hou; Huaguang Zhang; Changyin Sun (eds.)

Conference: 4th International Symposium on Neural Networks (ISNN). Nanjing, China. June 3-7, 2007

Abstract/Description – provided by the publisher

Not available.

Keywords – provided by the publisher

Computation by Abstract Devices; Computer Communication Networks; Algorithm Analysis and Problem Complexity; Discrete Mathematics in Computer Science; Artificial Intelligence (incl. Robotics); Pattern Recognition

Availability

Detected institution: not detected
Year of publication: 2007
Browse: SpringerLink

Information

Resource type:

books

Print ISBN

978-3-540-72394-3

Electronic ISBN

978-3-540-72395-0

Publisher

Springer Nature

Country of publication

United Kingdom

Publication date

2007

Publication rights information

© Springer-Verlag Berlin Heidelberg 2007

Table of contents

Stabilizing Lagrange-Type Nonlinear Programming Neural Networks

Yuancan Huang

Inspired by the Lagrangian multiplier method with a quadratic penalty function, which is widely used in nonlinear programming theory, a Lagrange-type nonlinear programming neural network whose equilibria coincide with the KKT pairs of the underlying nonlinear programming problem was devised, with a minor modification in the handling of inequality constraints [1,2]. The structure of the neural network must, of course, be carefully conceived so that it is asymptotically stable; normally this is not easy to achieve, even for simple nonlinear programming problems. However, if the penalty parameters in these neural networks are taken as control variables and a control law is found to stabilize the network, we may reasonably conjecture that the class of solvable nonlinear programming problems will be greatly enlarged. In this paper, conditions stabilizing the Lagrange-type neural network are presented, and the control-Lyapunov function approach is used to synthesize the adjustment laws for the penalty parameters.

- Neural Networks for Optimization | Pp. 320-329
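
As an illustration of the Lagrange-type dynamics sketched in the abstract above, the toy script below (a minimal sketch with an invented equality-constrained problem, not code from the paper) integrates primal gradient descent and multiplier gradient ascent on a quadratic-penalty Lagrangian with a fixed penalty parameter; in the paper the penalty is instead treated as a control variable driven by a control-Lyapunov law.

```python
import numpy as np

# Toy problem (hypothetical example, not from the paper):
#   minimize  f(x) = x1^2 + x2^2   subject to  h(x) = x1 + x2 - 1 = 0
def f_grad(x):
    return 2.0 * x

def h(x):
    return x[0] + x[1] - 1.0

def h_grad(x):
    return np.array([1.0, 1.0])

def lagrange_network(x0, lam0, c=5.0, dt=1e-3, steps=20000):
    """Euler-integrate the Lagrange-type dynamics:
         x'   = -grad_x L_c(x, lam)   (gradient descent on the primal)
         lam' = +h(x)                 (gradient ascent on the multiplier)
    where L_c is the quadratic-penalty (augmented) Lagrangian. Here the
    penalty c is fixed; the paper adjusts it through a control law."""
    x, lam = np.array(x0, float), float(lam0)
    for _ in range(steps):
        grad_x = f_grad(x) + lam * h_grad(x) + c * h(x) * h_grad(x)
        x += dt * (-grad_x)
        lam += dt * h(x)
    return x, lam

x_star, lam_star = lagrange_network([2.0, -1.0], 0.0)
print(x_star)   # converges near the KKT point [0.5, 0.5]
```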

Incremental Nonlinear Proximal Support Vector Machine

Qiuge Liu; Qing He; Zhongzhi Shi

Proximal SVM (PSVM), a variation of the standard SVM, leads to an extremely fast and simple algorithm for generating a linear or nonlinear classifier compared with the classical SVM. An efficient incremental method for the linear PSVM classifier has been introduced, but it does not apply to nonlinear PSVM, and incremental techniques are the basis of online learning and large data set training. In this paper we focus on the online learning problem. We develop an incremental learning method for a new nonlinear PSVM classifier, with which online learning of a nonlinear PSVM classifier can be realized efficiently. Mathematical analysis and experimental results indicate that these methods greatly reduce computation time while maintaining similar accuracy.

- Support Vector Machines | Pp. 336-341
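
For readers unfamiliar with PSVM, the following rough sketch shows the linear PSVM baseline that the incremental idea builds on (the paper's nonlinear extension is not reproduced): training reduces to one small linear system in the sufficient statistics E'E and E'De, which can be accumulated chunk by chunk so that old data never have to be revisited. The class name and interface are illustrative assumptions.

```python
import numpy as np

class IncrementalLinearPSVM:
    """Sketch of a linear Proximal SVM trained incrementally from data chunks.
    Only the (n+1)x(n+1) sufficient statistics E'E and E'De are stored, so a
    new chunk can be absorbed without revisiting earlier data."""

    def __init__(self, n_features, nu=1.0):
        self.nu = nu
        d = n_features + 1                      # weights w plus offset gamma
        self.EtE = np.zeros((d, d))
        self.EtDe = np.zeros(d)

    def partial_fit(self, X, y):
        """X: (m, n) chunk of samples, y: labels in {-1, +1}."""
        E = np.hstack([X, -np.ones((X.shape[0], 1))])   # E = [A, -e]
        self.EtE += E.T @ E
        self.EtDe += E.T @ y                             # E'De, since De = y
        return self

    def solve(self):
        d = self.EtE.shape[0]
        sol = np.linalg.solve(np.eye(d) / self.nu + self.EtE, self.EtDe)
        self.w, self.gamma = sol[:-1], sol[-1]
        return self

    def predict(self, X):
        return np.sign(X @ self.w - self.gamma)
```

Each chunk costs O(n^2) memory regardless of how many samples have streamed in, which is what makes the linear case attractive for online use.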

Machinery Fault Diagnosis Using Least Squares Support Vector Machine

Lingling Zhao; Kuihe Yang

In order to enhance fault diagnosis precision, an improved fault diagnosis model based on the least squares support vector machine (LSSVM) is presented. In the model, wavelet packet analysis and LSSVM are combined effectively. The power spectra of fault signals are decomposed by wavelet packet analysis, which simplifies the selection of fault eigenvectors, and the LSSVM is then adopted to realize fault diagnosis. The ε-insensitive loss function is replaced by a quadratic loss function and the inequality constraints are replaced by equality constraints; consequently, the quadratic programming problem is reduced to solving a system of linear equations, and the SVM algorithm is realized by the least squares method. The kernel function parameter is chosen dynamically within a definite range, which raises the recognition accuracy. The simulation results show that the model has strong nonlinear solving and anti-interference ability, and that it can effectively distinguish fault types.

- Support Vector Machines | Pp. 342-349
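
The claim that the quadratic programming problem collapses into a system of linear equations can be made concrete with the minimal LS-SVM classifier below (generic RBF kernel and toy interface assumed; in the paper the inputs would be wavelet-packet features of fault signals).

```python
import numpy as np

def rbf_kernel(X1, X2, sigma=1.0):
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def lssvm_train(X, y, gamma=10.0, sigma=1.0):
    """LS-SVM classifier: with quadratic loss and equality constraints the
    dual is one linear system in (b, alpha) rather than a QP."""
    n = len(y)
    Omega = np.outer(y, y) * rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega + np.eye(n) / gamma
    rhs = np.concatenate([[0.0], np.ones(n)])
    sol = np.linalg.solve(A, rhs)
    return sol[1:], sol[0]                 # alpha, b

def lssvm_predict(X_train, y_train, alpha, b, X_new, sigma=1.0):
    K = rbf_kernel(X_new, X_train, sigma)
    return np.sign(K @ (alpha * y_train) + b)
```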

Support Vector Machine with Composite Kernels for Time Series Prediction

Tiejun Jiang; Shuzong Wang; Ruxiang Wei

In a support vector machine (SVM), kernels are employed to map the nonlinear model into a higher-dimensional feature space where linear learning is adopted. The characteristics of the kernels have a great impact on the learning and prediction results of the SVM. Considering the fitting and generalization characteristics of two typical kinds of kernels, the global kernel (polynomial kernel) and the local kernel (RBF kernel), a new SVM modeling method based on composite kernels is proposed. To evaluate the fitness of the kernel functions, the particle swarm optimization (PSO) algorithm is used to adaptively evolve the SVM toward the best prediction performance, in which each particle, represented as a real vector, corresponds to a set of candidate SVM parameters. Experiments in time series prediction demonstrate that the SVM with composite kernels performs better than one with a single kernel.

- Support Vector Machines | Pp. 350-356
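
A brief sketch of the composite-kernel idea (illustrative mixing weight and kernel parameters; the PSO tuning described in the paper is omitted): a convex combination of a global polynomial kernel and a local RBF kernel, plugged into an off-the-shelf SVR for lag-vector time series prediction.

```python
import numpy as np
from sklearn.svm import SVR

def composite_kernel(X1, X2, lam=0.6, degree=2, sigma=1.0):
    """K = lam * K_poly + (1 - lam) * K_rbf (a valid kernel for 0 <= lam <= 1).
    The polynomial part captures global trends, the RBF part local structure."""
    K_poly = (X1 @ X2.T + 1.0) ** degree
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    K_rbf = np.exp(-d2 / (2 * sigma ** 2))
    return lam * K_poly + (1 - lam) * K_rbf

# Time series prediction as regression on lagged values (toy data).
series = np.sin(np.linspace(0, 20, 200))
X = np.array([series[i:i + 5] for i in range(len(series) - 5)])
y = series[5:]

svr = SVR(kernel=composite_kernel, C=10.0)
svr.fit(X[:150], y[:150])
pred = svr.predict(X[150:])
```

In a real application lam, degree, sigma and C would be the kind of parameters encoded in each PSO particle.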

Neuromorphic Quantum-Based Adaptive Support Vector Regression for Tuning BWGC/NGARCH Forecast Model

Bao Rong Chang; Hsiu Fen Tsai

A prediction model, called the BPNN-weighted grey model with cumulated 3-point least squares polynomial (BWGC), is used to resolve the overshoot effect; however, it may encounter volatility clustering due to its lack of a localization property. Thus, we incorporate nonlinear generalized autoregressive conditional heteroscedasticity (NGARCH) into BWGC to compensate for the time-varying variance of residual errors when volatility clustering occurs. Furthermore, in order to adapt both models optimally, a neuromorphic quantum-based adaptive support vector regression (NQASVR) is devised to linearly regularize the coefficients of both BWGC and NGARCH, effectively improving generalization and localization at the same time.

- Support Vector Machines | Pp. 357-367

Modulation Classification of Analog and Digital Signals Using Neural Network and Support Vector Machine

Cheol-Sun Park; Dae Young Kim

Most of the algorithms proposed in the literature deal with the problem of digital modulation classification and consider classic probabilistic or decision-tree classifiers. In this paper, we compare and analyze the performance of 2 neural network classifiers and 3 support vector machine classifiers (1-v-r, 1-v-1 and DAG multi-class classifiers). The paper also addresses the problem of classifying both analog and digital modulation signals in military and civilian communications applications. A total of 7 statistical signal features are extracted and used to classify 9 modulation signals. It is known that existing techniques can classify reliably (accuracy ≥ 90%) only at SNRs above 10 dB when a large range of modulation types, including both digital and analog, is considered. Numerical simulations were conducted to compare the performance of the classifiers. Results indicated an overall success rate of over 95% at an SNR of 10 dB for all classifiers. In particular, the 3 support vector machine classifiers achieved probabilities of correct classification (Pcc) of 96.0%, 97.3% and 97.8%, respectively, at an SNR of 5 dB.

- Support Vector Machines | Pp. 368-373
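
For reference, the snippet below sketches the 1-v-r and 1-v-1 multi-class SVM setups compared in the paper using scikit-learn's standard wrappers; the DAG variant, the neural network classifiers and the actual signal-feature extraction are not reproduced, and random features stand in for the 7 statistical features.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.multiclass import OneVsRestClassifier, OneVsOneClassifier

# Placeholder data: in the paper each row would hold 7 statistical features
# extracted from a received signal, and labels would index 9 modulation types.
rng = np.random.default_rng(0)
X = rng.normal(size=(900, 7))
y = rng.integers(0, 9, size=900)

ovr = OneVsRestClassifier(SVC(kernel="rbf", C=10.0)).fit(X, y)  # 1-v-r: 9 binary SVMs
ovo = OneVsOneClassifier(SVC(kernel="rbf", C=10.0)).fit(X, y)   # 1-v-1: 9*8/2 = 36 binary SVMs

print(ovr.predict(X[:5]), ovo.predict(X[:5]))
```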

Tree-Structured Support Vector Machines for Multi-class Classification

Siyu Xia; Jiuxian Li; Liangzheng Xia; Chunhua Ju

In this paper, a non-balanced binary tree is proposed for extending support vector machines (SVM) to multi-class problems. The non-balanced binary tree is constructed based on the prior distribution of samples, which allows the more separable classes to be separated at the upper nodes of the tree. For an N-class problem, this method needs only N-1 SVM classifiers in the training phase and fewer than N binary tests to make a decision. Further, this method avoids the unclassifiable regions that exist in conventional multi-class SVMs. The experimental results indicate that, while maintaining comparable accuracy, this method is faster than other methods in classification.

- Support Vector Machines | Pp. 392-398
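
A compact sketch of a non-balanced tree of binary SVMs follows (the splitting rule here, centroid distance, is a hypothetical stand-in for the paper's prior-distribution criterion): each internal node peels off one class, so an N-class problem needs N-1 binary classifiers, and a test sample stops as soon as some node claims it.

```python
import numpy as np
from sklearn.svm import SVC

def train_tree_svm(X, y):
    """Non-balanced tree of binary SVMs: at each node the class whose centroid
    lies farthest from the centroid of the remaining data (a stand-in
    separability measure) is split off first. N classes -> N-1 classifiers."""
    nodes, classes = [], list(np.unique(y))
    while len(classes) > 1:
        mask = np.isin(y, classes)
        overall = X[mask].mean(axis=0)
        centroid = {c: X[y == c].mean(axis=0) for c in classes}
        best = max(classes, key=lambda c: np.linalg.norm(centroid[c] - overall))
        clf = SVC(kernel="rbf").fit(X[mask], (y[mask] == best).astype(int))
        nodes.append((best, clf))
        classes.remove(best)
    return nodes, classes[0]            # the remaining class is the last leaf

def predict_tree_svm(nodes, last_class, x):
    for cls, clf in nodes:              # walk down the tree until a node claims x
        if clf.predict(x.reshape(1, -1))[0] == 1:
            return cls
    return last_class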

Online Least Squares Support Vector Machines Based on Wavelet and Its Applications

Qian Zhang; Fuling Fan; Lan Wang

As the conventional training algorithms of least squares support vector machines (LS-SVM) are inefficient in online applications, an online learning algorithm is proposed. The online algorithm is suitable for large data sets and for practical applications where the data arrive sequentially. Aimed at the characteristics of signals, a wavelet kernel satisfying the wavelet frame condition is presented. The wavelet kernel can approximate arbitrary functions in square-integrable (L²) space, hence the generalization ability of the LS-SVM is improved. To illustrate its favorable performance, the wavelet-based online LS-SVM (WOLS-SVM) is applied to nonlinear system identification. The simulation results show that the WOLS-SVM outperforms existing algorithms with higher learning efficiency as well as better accuracy, indicating its effectiveness.

- Support Vector Machines | Pp. 416-425
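
The two ingredients mentioned in the abstract can be sketched as follows, under stated assumptions: a commonly used Morlet-style product wavelet kernel stands in for the paper's frame-based construction, and a bordered-matrix inverse update stands in for its online mechanism. Each new sample enlarges the LS-SVM linear system by one row and column, and the inverse is updated in O(n²) instead of retraining from scratch.

```python
import numpy as np

def wavelet_kernel(x, z, a=1.0):
    """Morlet-style translation-invariant wavelet kernel (a common choice for
    wavelet SVMs; the paper's kernel may differ in detail)."""
    u = (x - z) / a
    return np.prod(np.cos(1.75 * u) * np.exp(-u ** 2 / 2.0))

class OnlineLSSVM:
    """LS-SVM regression whose (n+1)x(n+1) system matrix
           H = [[0, 1'], [1, K + I/gamma]]
    grows by one bordered row/column per new sample; its inverse is updated
    via the Schur complement instead of being recomputed."""

    def __init__(self, gamma=100.0, a=1.0):
        self.gamma, self.a = gamma, a
        self.X, self.y = [], []
        self.Hinv = None

    def add(self, x, y):
        x = np.asarray(x, float)
        if self.Hinv is None:                     # first sample: 2x2 system
            H = np.array([[0.0, 1.0],
                          [1.0, wavelet_kernel(x, x, self.a) + 1.0 / self.gamma]])
            self.Hinv = np.linalg.inv(H)
        else:
            k = np.array([wavelet_kernel(x, xi, self.a) for xi in self.X])
            b = np.concatenate([[1.0], k])        # new bordering column
            c = wavelet_kernel(x, x, self.a) + 1.0 / self.gamma
            Hb = self.Hinv @ b
            s = c - b @ Hb                        # Schur complement
            top = self.Hinv + np.outer(Hb, Hb) / s
            self.Hinv = np.block([[top, -Hb[:, None] / s],
                                  [-Hb[None, :] / s, np.array([[1.0 / s]])]])
        self.X.append(x)
        self.y.append(float(y))
        sol = self.Hinv @ np.concatenate([[0.0], self.y])
        self.b, self.alpha = sol[0], sol[1:]

    def predict(self, x):
        x = np.asarray(x, float)
        return sum(a_i * wavelet_kernel(x, xi, self.a)
                   for a_i, xi in zip(self.alpha, self.X)) + self.b
```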

Ultrasound Estimation of Fetal Weight with Fuzzy Support Vector Regression

Jin-Hua Yu; Yuan-Yuan Wang; Ping Chen; Yue-Hua Song

Accurate acquisition of the expected fetal weight (EFW) based on ultrasound measurements is important in antenatal care. The accuracy of EFW is degraded by random measurement errors and by shortcomings of the regression method. Several studies have used neural networks to improve estimation validity, but these methods all presuppose accurate measurements. This paper uses fuzzy logic to handle measurement inconsistency and combines it with support vector regression (SVR) to pursue generalization ability. In this way, suspected inaccurate measurements contribute relatively less to the learning of the new fuzzy support vector regression (FSVR). Tests on a clinical database show that the proposed algorithm achieves a 6.09% mean absolute percentage error (MAPE) on the testing group, while the back-propagation algorithm and classical SVR achieve 8.95% and 7.23% MAPE, respectively. The experimental results show the effectiveness of the proposed algorithm over traditional neural-network-based methods.

- Support Vector Machines | Pp. 426-433
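
The core idea of fuzzy SVR can be approximated with per-sample weights, as in the minimal sketch below: each measurement vector receives a membership value reflecting how trustworthy it appears, and the SVR loss for suspect samples is down-weighted. The membership rule shown (distance from the centroid) and the synthetic data are illustrative placeholders, not the paper's method.

```python
import numpy as np
from sklearn.svm import SVR

def fuzzy_membership(X, eps=1e-6):
    """Illustrative memberships: samples far from the feature-space centroid
    are treated as suspect and receive lower weight (the paper derives its
    memberships from fuzzy logic on measurement consistency)."""
    d = np.linalg.norm(X - X.mean(axis=0), axis=1)
    return 1.0 - d / (d.max() + eps)

# X: ultrasound measurements (toy stand-ins), y: fetal weight (toy target)
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 4))
y = X @ np.array([0.5, 1.2, -0.3, 0.8]) + rng.normal(scale=0.1, size=200)

weights = fuzzy_membership(X)
fsvr = SVR(kernel="rbf", C=10.0).fit(X, y, sample_weight=weights)
estimate = fsvr.predict(X[:5])
```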

Hybrid Support Vector Machine and General Model Approach for Audio Classification

Xin He; Ling Guo; Xianzhong Zhou; Wen Luo

In recent years, searching and indexing techniques for multimedia data have received increasing attention in the area of multimedia databases. While much research has addressed content-based retrieval of image and video data, less attention has been paid to content-based retrieval of audio data. Audio is an important form of multimedia information, and there is a growing need for automatic audio indexing and retrieval techniques. Audio data contain rich semantics, and audio signal processing can reduce computational complexity, so effective and efficient indexing and retrieval techniques for audio data are receiving more attention. In this paper, the problems of audio retrieval are discussed first; then the main audio characteristics and features are introduced; finally, the combination of a support vector machine and a general model is described, and the hybrid model is applied to audio retrieval. Experiments show that the hybrid model is effective for audio classification.

- Support Vector Machines | Pp. 434-440