Publications catalog - books
Advances in Neural Networks: 4th International Symposium on Neural Networks, ISNN 2007, Nanjing, China, June 3-7, 2007, Proceedings, Part III
Derong Liu ; Shumin Fei ; Zengguang Hou ; Huaguang Zhang ; Changyin Sun (eds.)
In conference: 4th International Symposium on Neural Networks (ISNN). Nanjing, China. June 3, 2007 - June 7, 2007
Abstract/Description – provided by the publisher
Not available.
Keywords – provided by the publisher
Computation by Abstract Devices; Computer Communication Networks; Algorithm Analysis and Problem Complexity; Discrete Mathematics in Computer Science; Artificial Intelligence (incl. Robotics); Pattern Recognition
Availability
| Detected institution | Publication year | Browse | Download | Request |
|---|---|---|---|---|
| Not detected | 2007 | SpringerLink | | |
Information
Resource type:
books
Print ISBN
978-3-540-72394-3
Electronic ISBN
978-3-540-72395-0
Publisher
Springer Nature
Country of publication
United Kingdom
Publication date
2007
Publication rights information
© Springer-Verlag Berlin Heidelberg 2007
Table of contents
An Improved SVM Based on 1-Norm for Selection of Personal Credit Scoring Index System
Xin Xue; Guoping He
The selection of an evaluating index system is the key to personal credit scoring, and it is essentially a feature selection problem. By improving the typical SVM based on the 1-norm, which can select the important and necessary features of samples, an improved 1-norm SVM adapted to the selection of a personal credit scoring index system is proposed. Experimental results show that the new method can select an evaluating index system of small scale, enhance the generalization ability, and reduce the computational complexity of the classifier.
- Support Vector Machines | Pp. 441-447
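As an illustration of the underlying idea only, the Python sketch below uses a standard 1-norm (L1-regularized) linear SVM from scikit-learn to keep the features with non-zero weights; it is not the authors' improved formulation, and the data, `C` value, and threshold are placeholders.

```python
# Illustrative only: index (feature) selection with a standard L1-penalized
# linear SVM; the authors' improved 1-norm SVM is not reproduced here.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.svm import LinearSVC

def select_index_system(X, y, C=0.1):
    """Return indices of features kept (non-zero weight) by an L1 linear SVM."""
    Xs = StandardScaler().fit_transform(X)
    clf = LinearSVC(penalty="l1", dual=False, C=C, max_iter=10000)
    clf.fit(Xs, y)
    # Features whose weights are driven to (near) zero are dropped.
    return np.flatnonzero(np.abs(clf.coef_).ravel() > 1e-6)

# Placeholder data standing in for credit-scoring records.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 20))
y = (X[:, 0] + 0.5 * X[:, 3] > 0).astype(int)
print(select_index_system(X, y))
```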
Combining Weighted SVMs and Spectrum-Based NN for Multi-classification
Ling Ping; Lu Nan; Wang Jian-yu; Zhou Chun-Guang
This paper presents a Multi-Classification Schema (MCS) which combines Weighted SVMs (WSVM) and a Spectrum-based NN (SNN). Each basic SVM is equipped with belief coefficients that reveal its capacity to identify classes, and each is built in its own feature space to adapt to diverse training data contexts. Coupled with a weighted voting strategy and a local informative metric, the SNN is used to handle cases rejected by all basic classifiers. The local metric is derived from the most discriminant directions carried by the data's spectrum information. Two strategies reduce the computational cost of MCS: training dataset reduction and pre-specification of the SNN working set. Experiments on real datasets show that MCS improves classification accuracy at moderate cost compared with the state of the art.
- Support Vector Machines | Pp. 448-453
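A rough Python sketch of the combination described above: one weighted binary SVM per class, with a nearest-neighbour fallback for samples rejected by every basic classifier. The paper's individual feature spaces, spectrum-based metric, and working-set pre-specification are not reproduced; the class name and rejection threshold are invented for illustration.

```python
# Rough sketch: one weighted binary SVM per class, with a k-NN fallback for
# samples rejected by every basic classifier.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

class WeightedSVMsWithNNFallback:
    def __init__(self, reject_threshold=0.0):
        self.reject_threshold = reject_threshold

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.svms_, self.weights_ = [], []
        for c in self.classes_:
            yc = (y == c).astype(int)
            # Belief coefficient: cross-validated accuracy of the binary SVM.
            self.weights_.append(cross_val_score(SVC(gamma="scale"), X, yc, cv=3).mean())
            self.svms_.append(SVC(gamma="scale").fit(X, yc))
        self.nn_ = KNeighborsClassifier(n_neighbors=5).fit(X, y)
        return self

    def predict(self, X):
        scores = np.column_stack(
            [w * m.decision_function(X) for m, w in zip(self.svms_, self.weights_)]
        )
        pred = self.classes_[scores.argmax(axis=1)]
        rejected = scores.max(axis=1) < self.reject_threshold
        if rejected.any():                      # rejected by all basic classifiers
            pred[rejected] = self.nn_.predict(X[rejected])
        return pred
```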
SVM Based Adaptive Inverse Controller for Excitation Control
Xiaofang Yuan; Yaonan Wang
An adaptive inverse controller based on support vector machines (SVM) is designed for excitation control. Two SVM networks are used in the controller: an SVM identifier (SVMI) and an SVM inverse controller (SVMC). The plant is identified by the SVMI, which provides sensitivity information about the plant to the SVMC. The SVMC is established as the pseudo-inverse model using the inverse system method. Both SVMI and SVMC are first learned offline and then trained online with a back-propagation algorithm. To guarantee convergence and speed up learning, adaptive learning rates and convergence theorems are developed. Simulations show that this controller achieves better performance in system damping and transient improvement.
- Support Vector Machines | Pp. 469-478
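The skeleton below only mirrors the structure described above: an identifier mapping control input to plant output and an inverse model mapping desired output back to a control input, here built with offline-trained scikit-learn SVR models on a toy plant. The online back-propagation updates, adaptive learning rates, and convergence analysis from the paper are omitted, and all data are placeholders.

```python
# Structural skeleton only: SVR models standing in for the SVM identifier
# (SVMI, input -> output) and SVM inverse controller (SVMC, output -> input).
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(1)
u = rng.uniform(-1.0, 1.0, 500)                       # control inputs
y = np.tanh(1.5 * u) + 0.05 * rng.normal(size=500)    # toy nonlinear plant response

svmi = SVR(kernel="rbf").fit(u.reshape(-1, 1), y)     # identifier: u -> y_hat
svmc = SVR(kernel="rbf").fit(y.reshape(-1, 1), u)     # inverse model: y_ref -> u

y_ref = 0.5                                           # desired plant output
u_cmd = svmc.predict([[y_ref]])[0]                    # control action from the inverse model
y_hat = svmi.predict([[u_cmd]])[0]                    # predicted response from the identifier
print(u_cmd, y_hat)
```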
A Fast and Accurate Progressive Algorithm for Training Transductive SVMs
Lei Wang; Huading Jia; Shixin Sun
This paper develops a fast and accurate algorithm for training transductive SVM classifiers, which uses the classification information of unlabeled data in a progressive way. To further improve generalization accuracy, we employ three criteria to enhance the algorithm: confidence evaluation, suppression of labeled data, and stopping upon stabilization. Experimental results on several real-world datasets confirm the effectiveness of these criteria and show that the new algorithm can reach accuracy comparable to several state-of-the-art approaches for training transductive SVMs in much less training time.
- Support Vector Machines | Pp. 497-505
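A generic progressive self-labeling loop in Python, sketched under the assumption of a binary problem with labels in {0, 1}; the confidence threshold and the stopping rule are simplified stand-ins for the paper's three criteria, and labeled-data suppression is not modeled.

```python
# Generic progressive self-labeling loop; only an approximation of the
# paper's confidence-evaluation and stopping criteria.
import numpy as np
from sklearn.svm import SVC

def progressive_tsvm(X_lab, y_lab, X_unl, conf=1.0, max_rounds=20):
    clf = SVC(kernel="rbf", gamma="scale").fit(X_lab, y_lab)
    prev = None
    for _ in range(max_rounds):
        margins = clf.decision_function(X_unl)
        labels = (margins > 0).astype(int)          # tentative labels for unlabeled data
        if prev is not None and np.array_equal(labels, prev):
            break                                   # stop once the labeling stabilizes
        prev = labels
        keep = np.abs(margins) > conf               # transfer only confident points
        X = np.vstack([X_lab, X_unl[keep]])
        y = np.concatenate([y_lab, labels[keep]])
        clf = SVC(kernel="rbf", gamma="scale").fit(X, y)
    return clf, labels
```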
Fast Support Vector Data Description Using K-Means Clustering
Pyo Jae Kim; Hyung Jin Chang; Dong Sung Song; Jin Young Choi
Support Vector Data Description (SVDD) has a limitation when dealing with large data sets: the computational load increases drastically as the training data size grows. To handle this problem, we propose a new fast SVDD method using the K-means clustering method. Our method uses a decomposition strategy: it trains each decomposed sub-problem to obtain support vectors and then retrains on those support vectors to find a global data description of the whole target class. The proposed method yields results similar to the original SVDD while reducing computational cost. Through experiments, we show the efficiency of our method.
- Support Vector Machines | Pp. 506-514
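A compact sketch of the decompose-and-retrain idea, using scikit-learn's OneClassSVM as a stand-in for SVDD: partition the target class with K-means, train each sub-problem, pool the support vectors, and retrain on the pool. The number of clusters and the nu/gamma values are illustrative.

```python
# Decompose-and-retrain sketch with OneClassSVM standing in for SVDD.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import OneClassSVM

def fast_svdd(X, n_clusters=4, nu=0.1, gamma="scale"):
    parts = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(X)
    sv_pool = []
    for k in range(n_clusters):                      # solve each sub-problem
        sub = OneClassSVM(nu=nu, gamma=gamma).fit(X[parts == k])
        sv_pool.append(sub.support_vectors_)         # keep its support vectors
    # Retrain on the pooled support vectors to get the global description.
    return OneClassSVM(nu=nu, gamma=gamma).fit(np.vstack(sv_pool))
```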
A Kernel-Based Two-Stage One-Class Support Vector Machines Algorithm
Chi-Yuan Yeh; Shie-Jue Lee
One-class SVM is a kernel-based method that uses the kernel trick for data clustering. However, it can only detect a single cluster of non-convex shape in the input space. In this study, we propose an iterative two-stage one-class SVM to cluster data into several groups. In the first stage, a one-class SVM is used to find an optimal weight vector for each cluster in the feature space; in the second stage, the weight vectors are used to refine the clustering result. A mechanism is provided to control the optimal hyperplane so that it is robust against outliers. Experimental results show that our method compares favorably with other kernel-based clustering algorithms, such as KKM and KFCM, on several synthetic data sets and UCI real data sets.
- Support Vector Machines | Pp. 515-524
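A loose Python illustration of an iterative one-class-SVM clustering loop (initialize clusters, fit a one-class SVM per cluster, reassign points to the best-scoring model, repeat); it is not the authors' exact two-stage formulation and assumes every cluster stays non-empty across iterations.

```python
# Iterative one-class-SVM clustering loop (approximation only).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import OneClassSVM

def ocsvm_clustering(X, n_clusters=3, nu=0.2, iters=5):
    labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(X)
    for _ in range(iters):
        models = [OneClassSVM(nu=nu, gamma="scale").fit(X[labels == k])
                  for k in range(n_clusters)]
        scores = np.column_stack([m.decision_function(X) for m in models])
        new_labels = scores.argmax(axis=1)           # reassign to the best-scoring model
        if np.array_equal(new_labels, labels):
            break
        labels = new_labels
    return labels
```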
A Confident Majority Voting Strategy for Parallel and Modular Support Vector Machines
Yi-Min Wen; Bao-Liang Lu
Support vector machines (SVMs) have become a popular method in the machine learning community, but they cannot easily be scaled to large problems because their time and space complexities are roughly quadratic in the number of training samples. To overcome this drawback of conventional SVMs, we propose a new confident majority voting (CMV) strategy for SVMs, and call the SVMs using this strategy CMV-SVMs. In CMV-SVMs, a large-scale problem is divided into many smaller and simpler sub-problems in the training phase, and confident component classifiers are chosen to vote for the final outcome in the test phase. We compare CMV-SVMs with standard SVMs and with parallel SVMs using majority voting (MV-SVMs) on several benchmark problems. The experiments show that the proposed method significantly reduces the overall time consumed in both training and testing. More importantly, its classification accuracy is almost the same as that of standard SVMs and better than that of MV-SVMs.
- Support Vector Machines | Pp. 525-534
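A toy Python version of confidence-filtered majority voting over SVMs trained on random data partitions; the probability threshold, the partitioning scheme, and the fallback rule are placeholders rather than the paper's choices.

```python
# Toy confidence-filtered majority voting over SVMs trained on partitions.
import numpy as np
from sklearn.svm import SVC

def train_cmv(X, y, n_parts=4, seed=0):
    rng = np.random.default_rng(seed)
    parts = np.array_split(rng.permutation(len(X)), n_parts)
    return [SVC(probability=True, gamma="scale").fit(X[i], y[i]) for i in parts]

def predict_cmv(models, X, threshold=0.7):
    preds = []
    for x in X:
        votes = []
        for m in models:
            proba = m.predict_proba([x])[0]
            if proba.max() >= threshold:             # only confident members vote
                votes.append(m.classes_[proba.argmax()])
        if not votes:                                # fall back to plain majority voting
            votes = [m.predict([x])[0] for m in models]
        vals, counts = np.unique(votes, return_counts=True)
        preds.append(vals[counts.argmax()])
    return np.array(preds)
```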
Soft-Sensor Method Based on Least Square Support Vector Machines Within Bayesian Evidence Framework
Wei Wang; Tianmiao Wang; Hongxing Wei; Hongming Zhao
Based on the characteristics and requirements of dynamic loader weighing, the soft-sensor technique is adopted as the weighing method, with the least squares support vector machine (LS-SVM) as its modeling method. The Bayesian evidence framework is used to select and tune the LS-SVM parameters. After the nonlinear regression algorithm of LS-SVM and the principles of the Bayesian evidence framework are introduced, the soft-sensor model based on LS-SVM is given. Finally, simulation results indicate that the soft-sensor method based on LS-SVM within the Bayesian evidence framework is a valid means of solving the dynamic weighing of loaders.
- Support Vector Machines | Pp. 535-544
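For context, LS-SVM regression reduces to a single linear system. The sketch below solves that standard system with an RBF kernel in NumPy; the Bayesian evidence levels used in the paper to tune gamma and the kernel width are not implemented, and the data shown are arbitrary placeholders.

```python
# Standard LS-SVM regression solved as one linear system (RBF kernel); the
# Bayesian evidence framework for choosing gamma and sigma is not included.
import numpy as np

def rbf(A, B, sigma):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvm_fit(X, y, gamma=10.0, sigma=1.0):
    n = len(X)
    K = rbf(X, X, sigma)
    # [[0, 1^T], [1, K + I/gamma]] [b, alpha]^T = [0, y]^T
    A = np.block([[np.zeros((1, 1)), np.ones((1, n))],
                  [np.ones((n, 1)), K + np.eye(n) / gamma]])
    sol = np.linalg.solve(A, np.concatenate([[0.0], y]))
    b, alpha = sol[0], sol[1:]
    return lambda Xt: rbf(Xt, X, sigma) @ alpha + b

# Toy usage on a 1-D regression problem (placeholder for the loader data).
X = np.linspace(0.0, 6.0, 60).reshape(-1, 1)
y = np.sin(X).ravel() + 0.1 * np.random.default_rng(2).normal(size=60)
model = lssvm_fit(X, y)
print(model(np.array([[1.5]])))
```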
Extension Neural Network Based on Immune Algorithm for Fault Diagnosis
Changcheng Xiang; Xiyue Huang; Gang Zhao; Zuyuan Yang
In this paper, an extension neural network (ENN) is proposed. To tune the weights of the ENN for good clustering performance, the immune algorithm (IA) is applied to learn the ENN's weights, replacing the BP algorithm. The affinity between antibody and antigen is measured by the extension distance (ED), which is modified from the conjunction function (CF) of extension theory. The learning speed of the proposed ENN is shown to be faster than that of traditional neural networks and other fuzzy classification methods. Moreover, the immune learning algorithm is shown to have high accuracy and low memory consumption. Experimental results on two different examples verify the effectiveness and applicability of the proposed work.
- Fault Diagnosis/Detection | Pp. 553-560
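As a very rough stand-in, the sketch below implements a clonal-selection-style (immune) search that tunes per-class prototypes of a nearest-prototype classifier by cloning and hypermutating the best candidate set; the ENN's extension distance and weight intervals are not reproduced, and all parameters are invented for illustration.

```python
# Clonal-selection-style (immune) tuning of per-class prototypes; not the
# ENN/extension-distance formulation from the paper.
import numpy as np

def immune_tune_prototypes(X, y, generations=30, clones=20, sigma=0.1, rng=None):
    if rng is None:
        rng = np.random.default_rng(0)
    classes = np.unique(y)
    proto = np.stack([X[y == c].mean(axis=0) for c in classes])  # initial antibodies

    def affinity(p):                                  # affinity = training accuracy
        d = ((X[:, None, :] - p[None, :, :]) ** 2).sum(-1)
        return (classes[d.argmin(axis=1)] == y).mean()

    best, best_aff = proto, affinity(proto)
    for _ in range(generations):
        # Clone the best antibody set and hypermutate the clones.
        pool = best[None] + sigma * rng.normal(size=(clones,) + best.shape)
        for cand in pool:
            a = affinity(cand)
            if a > best_aff:
                best, best_aff = cand, a
    return classes, best
```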
Gear Fault Diagnosis by Using Wavelet Neural Networks
Y. Kang; C. C. Wang; Y. P. Chang
Fault diagnosis in gear train systems is important for transmitting power effectively. Artificial intelligence techniques such as neural networks are widely used in fault diagnosis and have largely replaced traditional methods such as the kurtosis method and time-domain analysis. The symptoms of vibration signals in the frequency domain are used as inputs to the neural network, and diagnosis results are obtained by network computation. This study presents gear fault diagnosis using wavelet neural networks (WNN), in which the Morlet wavelet is used as the activation function in the hidden layer of a back-propagation neural network (BPNN). Furthermore, the diagnosis results of the WNN and BPNN methods are compared for four gear cases.
- Fault Diagnosis/Detection | Pp. 580-588
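A minimal sketch of a wavelet neural network in the sense described above: a one-hidden-layer network whose hidden activation is a Morlet wavelet, trained by plain back-propagation on a mean-squared-error loss. Layer sizes, learning rate, and data shapes are assumptions, not values from the paper.

```python
# One-hidden-layer network with a Morlet-wavelet activation trained by plain
# back-propagation on an MSE loss; sizes, learning rate, and data are assumed.
import numpy as np

def morlet(x):
    return np.cos(1.75 * x) * np.exp(-0.5 * x ** 2)

def morlet_grad(x):
    return (-1.75 * np.sin(1.75 * x) - x * np.cos(1.75 * x)) * np.exp(-0.5 * x ** 2)

def train_wnn(X, Y, hidden=8, lr=0.05, epochs=2000, rng=None):
    if rng is None:
        rng = np.random.default_rng(0)
    W1 = rng.normal(scale=0.5, size=(X.shape[1], hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(scale=0.5, size=(hidden, Y.shape[1])); b2 = np.zeros(Y.shape[1])
    for _ in range(epochs):
        Z = X @ W1 + b1                  # hidden pre-activation
        H = morlet(Z)                    # Morlet wavelet activation
        P = H @ W2 + b2                  # linear output layer
        dP = 2.0 * (P - Y) / len(X)      # gradient of the MSE loss
        dZ = (dP @ W2.T) * morlet_grad(Z)
        W2 -= lr * H.T @ dP; b2 -= lr * dP.sum(0)
        W1 -= lr * X.T @ dZ; b1 -= lr * dZ.sum(0)
    return W1, b1, W2, b2
```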