Publications catalog - books



Advances in Natural Computation: 1st International Conference, ICNC 2005, Changsha, China, August 27-29, 2005, Proceedings, Part I

Lipo Wang ; Ke Chen ; Yew Soon Ong (eds.)

Conference: 1st International Conference on Natural Computation (ICNC), Changsha, China, August 27-29, 2005

Abstract/Description – provided by the publisher

Not available.

Keywords – provided by the publisher

Artificial Intelligence (incl. Robotics); Image Processing and Computer Vision; Computation by Abstract Devices; Algorithm Analysis and Problem Complexity; Pattern Recognition; Evolutionary Biology

Availability
Institution detected: not detected
Year of publication: 2005
Browse: SpringerLink

Information

Resource type:

books

Print ISBN

978-3-540-28323-2

Electronic ISBN

978-3-540-31853-8

Publisher

Springer Nature

Country of publication

United Kingdom

Publication date

Publication rights information

© Springer-Verlag Berlin Heidelberg 2005

Table of contents

Training Data Selection for Support Vector Machines

Jigang Wang; Predrag Neskovic; Leon N. Cooper

In recent years, support vector machines (SVMs) have become a popular tool for pattern recognition and machine learning. Training an SVM involves solving a constrained quadratic programming problem, which requires large memory and enormous amounts of training time for large-scale problems. In contrast, the SVM decision function is fully determined by a small subset of the training data, called support vectors. Therefore, it is desirable to remove from the training set the data that is irrelevant to the final decision function. In this paper we propose two new methods that select a subset of data for SVM training. Using real-world datasets, we compare the effectiveness of the proposed data selection strategies in terms of their ability to reduce the training set size while maintaining the generalization performance of the resulting SVM classifiers. Our experimental results show that a significant amount of training data can be removed by our proposed methods without degrading the performance of the resulting SVM classifiers.

- Statistical Neural Network Models and Support Vector Machines | Pp. 554-564
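The selection idea above can be sketched with a simple boundary heuristic (not the paper's exact methods): keep only training points whose nearest neighbours include the opposite class, then compare an SVM trained on that subset against one trained on the full set. The synthetic dataset, `k`, and kernel settings are illustrative assumptions.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neighbors import NearestNeighbors
from sklearn.svm import SVC

# Toy stand-in for the paper's real-world datasets.
X, y = make_classification(n_samples=600, n_features=10, class_sep=2.0,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Heuristic: keep points whose k nearest neighbours are not all of the
# same class -- these lie near the class boundary and are the likeliest
# support-vector candidates.
k = 10
nn = NearestNeighbors(n_neighbors=k + 1).fit(X_tr)
_, idx = nn.kneighbors(X_tr)
neigh_labels = y_tr[idx[:, 1:]]               # drop each point itself
mixed = (neigh_labels != y_tr[:, None]).any(axis=1)

full = SVC(kernel="rbf", gamma="scale").fit(X_tr, y_tr)
sub = SVC(kernel="rbf", gamma="scale").fit(X_tr[mixed], y_tr[mixed])

print(f"kept {mixed.sum()} of {len(y_tr)} training points")
print(f"full-set accuracy: {full.score(X_te, y_te):.3f}")
print(f"subset accuracy:   {sub.score(X_te, y_te):.3f}")
```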

Modelling of Chaotic Systems with Recurrent Least Squares Support Vector Machines Combined with Reconstructed Embedding Phase Space

Zheng Xiang; Taiyi Zhang; Jiancheng Sun

A new strategy for modelling chaotic systems is presented. First, more information is acquired by utilizing the reconstructed embedding phase space. Then, based on Recurrent Least Squares Support Vector Machines (RLS-SVM), modelling of the chaotic system is realized. We use the power spectrum and dynamic invariants, namely the Lyapunov exponents and the correlation dimension, as criteria, and then apply our method to the Chua's circuit time series. The comparison of dynamic invariants between the original and generated time series shows that the proposed method can capture the dynamics of the chaotic time series effectively.

- Statistical Neural Network Models and Support Vector Machines | Pp. 573-581
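The phase-space reconstruction step can be sketched as a standard delay embedding (the RLS-SVM modelling itself is omitted); the logistic-map series and the embedding parameters below are illustrative assumptions, not values from the paper.

```python
import numpy as np

def embed(x, dim, tau):
    """Reconstruct a phase space from a scalar series x with embedding
    dimension `dim` and delay `tau` (Takens-style delay embedding)."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

# Scalar observable from a toy chaotic signal (the logistic map).
x = np.empty(1000)
x[0] = 0.4
for t in range(999):
    x[t + 1] = 4.0 * x[t] * (1.0 - x[t])

Z = embed(x, dim=3, tau=1)
print(Z.shape)
```

Each row of `Z` is one reconstructed state vector; a regression model such as an LS-SVM would then be trained to map each state to the next observation.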

Fuzzy Support Vector Machines Based on α-Cut

Shengwu Xiong; Hongbing Liu; Xiaoxiao Niu

A new Fuzzy Support Vector Machine (α-FSVM) based on the α-cut is proposed in this paper. The proposed learning machines combine fuzzy set membership with support vector machines. The α-cut set is introduced to partition the training samples in terms of the importance of the data, and the more important sets are selected as new training sets to construct the fuzzy support vector machines. Benchmark two-class and multi-class datasets are used to test the effectiveness and validity of α-FSVMs. The experimental results indicate that α-FSVMs not only achieve higher precision but also alleviate the overfitting problem of support vector machines more effectively.

- Statistical Neural Network Models and Support Vector Machines | Pp. 592-600
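A minimal sketch of α-cut-based sample selection, assuming a simple distance-to-class-centre membership function (the paper's actual membership function may differ):

```python
import numpy as np

def alpha_cut_select(X, y, alpha):
    """Assign each sample a fuzzy membership from its distance to the
    class centre, then keep the alpha-cut: samples with membership >= alpha."""
    keep = np.zeros(len(y), dtype=bool)
    for c in np.unique(y):
        mask = y == c
        d = np.linalg.norm(X[mask] - X[mask].mean(axis=0), axis=1)
        mu = 1.0 - d / (d.max() + 1e-12)   # closer to centre => higher membership
        idx = np.flatnonzero(mask)
        keep[idx[mu >= alpha]] = True
    return keep

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (100, 2)), rng.normal(3, 1, (100, 2))])
y = np.r_[np.zeros(100), np.ones(100)]

keep = alpha_cut_select(X, y, alpha=0.5)
print(keep.sum(), "of", len(y), "samples retained")
```

The retained samples would then be fed to a standard SVM trainer; raising `alpha` shrinks the training set toward the most central samples.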

Mixtures of Kernels for SVM Modeling

Yan-fei Zhu; Lian-fang Tian; Zong-yuan Mao; Wei LI

Kernels are employed in Support Vector Machines (SVM) to map the nonlinear model into a higher-dimensional feature space where linear learning is adopted. The characteristics of a kernel have a great impact on the learning and predictive results of an SVM, and good characteristics for fitting may not mean good characteristics for generalization. After studying two typical kinds of kernels, a global kernel (the polynomial kernel) and a local kernel (the RBF kernel), a new SVM modeling method based on mixtures of kernels is proposed. Its implementation in a Lithopone calcination process demonstrates the good performance of the proposed method compared to a single kernel.

- Statistical Neural Network Models and Support Vector Machines | Pp. 601-607
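A kernel mixture can be sketched as a convex combination of a global (polynomial) and a local (RBF) kernel passed to an SVM as a callable; the mixing weight `lam` and the toy dataset are illustrative assumptions, not values from the paper.

```python
from sklearn.datasets import make_moons
from sklearn.metrics.pairwise import polynomial_kernel, rbf_kernel
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Hypothetical mixing weight between the global and local kernels.
lam = 0.3

def mixed_kernel(A, B):
    # Convex mixture: lam * polynomial + (1 - lam) * RBF.
    return (lam * polynomial_kernel(A, B, degree=2)
            + (1 - lam) * rbf_kernel(A, B, gamma=1.0))

X, y = make_moons(n_samples=400, noise=0.2, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

clf = SVC(kernel=mixed_kernel).fit(X_tr, y_tr)
print(f"mixed-kernel accuracy: {clf.score(X_te, y_te):.3f}")
```

Since a convex combination of positive semidefinite kernels is itself positive semidefinite, the mixture remains a valid SVM kernel.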

Recurrent Support Vector Machines in Reliability Prediction

Wei-Chiang Hong; Ping-Feng Pai; Chen-Tung Chen; Ping-Teng Chang

Support vector machines (SVMs) have been successfully used in solving nonlinear regression and time series problems. However, the application of SVMs to reliability prediction has not been widely explored. Traditionally, recurrent neural networks are trained by back-propagation algorithms. In this study, SVM learning algorithms are applied to recurrent neural networks to predict system reliability. In addition, the parameter selection of the SVM model is performed by Genetic Algorithms (GAs). A numerical example from the existing literature is used to compare the prediction performance. Empirical results indicate that the proposed model performs better than the other existing approaches.

- Statistical Neural Network Models and Support Vector Machines | Pp. 619-629
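The reliability-prediction setup can be sketched as one-step-ahead regression on a sliding window of past values, here with a plain SVR on a synthetic series (the paper's recurrent formulation and GA-based parameter selection are omitted; all data and hyperparameters are illustrative):

```python
import numpy as np
from sklearn.svm import SVR

# Synthetic periodic "failure rate" series; the paper's numerical
# example is not reproduced here.
rng = np.random.default_rng(0)
t = np.arange(60, dtype=float)
r = 0.9 + 0.05 * np.sin(0.3 * t) + rng.normal(0, 0.002, 60)

# Turn the series into a supervised problem with a sliding window of lags.
w = 5
X = np.array([r[i : i + w] for i in range(len(r) - w)])
y = r[w:]

# Fit on the first 45 windows, predict the last 10 one step ahead.
model = SVR(kernel="rbf", C=10.0, epsilon=0.001).fit(X[:-10], y[:-10])
pred = model.predict(X[-10:])
mae = np.mean(np.abs(pred - y[-10:]))
print(f"one-step-ahead MAE on the last 10 points: {mae:.4f}")
```

A GA would search over `C`, `epsilon`, and the kernel width instead of fixing them by hand as above.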

Gait Recognition via Independent Component Analysis Based on Support Vector Machine and Neural Network

Erhu Zhang; Jiwen Lu; Ganglong Duan

This paper proposes a method of automatic gait recognition using Fourier descriptors and independent component analysis (ICA) for the purpose of human identification at a distance. Firstly, a simple background generation algorithm is introduced to extract the moving figures accurately and to obtain binary human silhouettes. Secondly, these silhouettes are described with Fourier descriptors and converted into associated one-dimensional signals. Then ICA is applied to obtain the independent components of the signals. To reduce the computational cost, a fast and robust fixed-point algorithm for calculating the ICs is adopted, and a criterion for selecting ICs is put forward. Lastly, the nearest neighbor (NN), support vector machine (SVM) and backpropagation neural network (BPNN) classifiers are chosen for recognition, and the method is tested on the small UMD gait database and the NLPR gait database. Experimental results show that our method achieves encouraging recognition accuracy.

- Statistical Neural Network Models and Support Vector Machines | Pp. 640-649
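The Fourier-descriptor step can be sketched as follows: represent a closed silhouette contour as a complex sequence, take its FFT, and normalise for translation and scale. The circle test data are illustrative, not from the gait databases.

```python
import numpy as np

def fourier_descriptors(contour, n_coeffs):
    """Translation- and scale-invariant Fourier descriptors of a closed
    2-D contour given as an (N, 2) array of boundary points."""
    z = contour[:, 0] + 1j * contour[:, 1]   # complex representation
    F = np.fft.fft(z)
    F[0] = 0                                  # drop DC -> translation invariance
    mag = np.abs(F)
    mag /= mag[1] + 1e-12                     # normalise -> scale invariance
    return mag[1 : n_coeffs + 1]

# A circle and a scaled, shifted copy should give identical descriptors.
theta = np.linspace(0, 2 * np.pi, 128, endpoint=False)
c1 = np.column_stack([np.cos(theta), np.sin(theta)])
c2 = 3.0 * c1 + np.array([5.0, -2.0])

d1 = fourier_descriptors(c1, 10)
d2 = fourier_descriptors(c2, 10)
print(np.allclose(d1, d2))
```

The resulting descriptor vectors are the one-dimensional signals to which ICA would then be applied.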

An Incremental Learning Method Based on SVM for Online Sketchy Shape Recognition

Zhengxing Sun; Lisha Zhang; Enyi Tang

This paper briefly presents an incremental learning method based on SVM for online sketchy shape recognition. It collects the classified results corrected by the user and selects some important samples as retraining data according to their distance to the hyperplane of the SVM classifier. The classifier can then perform incremental learning quickly on the newly added samples, and the retrained classifier adapts to the user's drawing styles. Experiments show the effectiveness of the proposed method.

- Statistical Neural Network Models and Support Vector Machines | Pp. 655-659
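The sample-selection idea can be sketched by keeping only new samples that fall close to the current hyperplane, measured with the decision function. Since scikit-learn's SVC has no true incremental mode, the "incremental" step below is approximated by retraining on the old support vectors plus the selected samples; dataset and threshold are illustrative assumptions.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC

X, y = make_classification(n_samples=400, n_features=8, random_state=1)
X0, y0, X_new, y_new = X[:200], y[:200], X[200:], y[200:]

# Initial classifier trained on the first batch.
clf = SVC(kernel="rbf", gamma="scale").fit(X0, y0)

# Keep only new (user-corrected) samples close to the current
# hyperplane -- the ones most likely to change the decision function.
margin = np.abs(clf.decision_function(X_new))
informative = margin < 1.0

# Approximate the incremental step by retraining on the old support
# vectors plus the informative new samples.
X_inc = np.vstack([X0[clf.support_], X_new[informative]])
y_inc = np.r_[y0[clf.support_], y_new[informative]]
clf2 = SVC(kernel="rbf", gamma="scale").fit(X_inc, y_inc)

print(f"retrained on {len(y_inc)} of {len(y)} available samples")
```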

Blind Extraction of Singularly Mixed Source Signals

Zhigang Zeng; Chaojin Fu

In this paper, a neural network model and its associated learning rule are developed for sequential blind extraction in the case where the number of observable mixed signals is less than the number of sources. The approach is also suitable for the case in which the mixing matrix is nonsingular. Using this approach, all separable sources can be extracted one by one. A solvability analysis of the problem is also presented, and the new solvability condition is weaker than existing conditions in the literature.

- Statistical Neural Network Models and Support Vector Machines | Pp. 664-667
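Sequential (one-by-one) extraction can be sketched with deflation-based FastICA on a toy two-source mixture; note this assumes a nonsingular square mixing matrix, not the underdetermined case the paper targets, and FastICA stands in for the paper's neural network model.

```python
import numpy as np
from sklearn.decomposition import FastICA

# Two toy sources and a nonsingular 2x2 mixing matrix.
t = np.linspace(0, 8, 2000)
S = np.c_[np.sign(np.sin(3 * t)), np.sin(5 * t)]   # square wave, sinusoid
A = np.array([[1.0, 0.5], [0.6, 1.0]])
X = S @ A.T                                        # observed mixtures

# Deflation mode extracts the components one at a time.
ica = FastICA(n_components=2, algorithm="deflation", random_state=0)
Y = ica.fit_transform(X)

# Each recovered component should match one source up to sign and scale.
corr = np.abs(np.corrcoef(Y.T, S.T)[:2, 2:])
print(np.round(corr, 2))
```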

Palmprint Recognition Based on Unsupervised Subspace Analysis

Guiyu Feng; Dewen Hu; Ming Li; Zongtan Zhou

As feature extraction techniques, Kernel Principal Component Analysis (KPCA) and Independent Component Analysis (ICA) can both be considered generalizations of Principal Component Analysis (PCA), which has been used for palmprint recognition with satisfactory results [3]; it is therefore natural to investigate how KPCA and ICA perform on this task. In this paper, palmprint recognition using the KPCA and ICA methods is developed and compared with the PCA method. Based on the experimental results, some useful conclusions are drawn, giving a clearer picture of these unsupervised subspace methods for palmprint recognition.

- Statistical Neural Network Models and Support Vector Machines | Pp. 675-678
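The comparison can be sketched with scikit-learn, using the digits dataset as a stand-in for palmprint images and 1-NN matching after each unsupervised projection; the component counts and kernel parameters are illustrative assumptions.

```python
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA, KernelPCA, FastICA
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline

# Digits as a stand-in for palmprint images: unsupervised projection
# followed by 1-NN matching.
X, y = load_digits(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

projections = [
    ("PCA", PCA(n_components=30)),
    ("KPCA", KernelPCA(n_components=30, kernel="rbf", gamma=1e-3)),
    ("ICA", FastICA(n_components=30, random_state=0, max_iter=1000)),
]

scores = {}
for name, proj in projections:
    pipe = make_pipeline(proj, KNeighborsClassifier(n_neighbors=1))
    scores[name] = pipe.fit(X_tr, y_tr).score(X_te, y_te)
    print(f"{name}: {scores[name]:.3f}")
```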

A New Alpha Seeding Method for Support Vector Machine Training

Du Feng; Wenkang Shi; Huawei Guo; Liangzhou Chen

In order to obtain good SVM hyperparameters, the user needs to conduct extensive cross-validation such as leave-one-out (LOO) cross-validation. Alpha seeding is often used to reduce the cost of SVM training. Compared with existing alpha seeding schemes, a new efficient alpha seeding method is proposed. Its good performance is demonstrated through several examples, and an interpretation from both geometrical and mathematical points of view is given.

- Statistical Neural Network Models and Support Vector Machines | Pp. 679-682
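The cost that alpha seeding attacks can be seen by running plain LOO cross-validation, which solves one quadratic program per fold from scratch; scikit-learn's SVC does not expose the alphas, so the warm-starting itself is not shown and only the baseline is sketched on an illustrative dataset.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.svm import SVC

# LOO CV trains n nearly identical SVMs; alpha seeding would initialise
# each fold's QP from the previous fold's solution instead of zero.
X, y = make_classification(n_samples=100, n_features=5, random_state=0)
scores = cross_val_score(SVC(C=1.0, kernel="rbf"), X, y, cv=LeaveOneOut())
print(f"LOO accuracy estimate: {scores.mean():.3f}")
```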