Publications catalogue - books



Computational and Ambient Intelligence: 9th International Work-Conference on Artificial Neural Networks, IWANN 2007, San Sebastián, Spain, June 20-22, 2007. Proceedings

Francisco Sandoval ; Alberto Prieto ; Joan Cabestany ; Manuel Graña (eds.)

In conference: 9th International Work-Conference on Artificial Neural Networks (IWANN). San Sebastián, Spain. June 20, 2007 - June 22, 2007

Abstract/Description – provided by the publisher

Not available.

Keywords – provided by the publisher

Artificial Intelligence (incl. Robotics); Computation by Abstract Devices; Algorithm Analysis and Problem Complexity; Image Processing and Computer Vision; Pattern Recognition; Computational Biology/Bioinformatics

Availability

Publication year: 2007. Available via SpringerLink (no subscribing institution detected).

Information

Resource type:

books

Print ISBN

978-3-540-73006-4

Electronic ISBN

978-3-540-73007-1

Publisher

Springer Nature

Country of publication

United Kingdom

Publication date

2007

Publication rights information

© Springer-Verlag Berlin Heidelberg 2007

Table of contents

F. J. Ruiz; C. Angulo; N. Agell; A. Català

This work presents a short introduction to the main ideas behind the design of specific kernel functions for use in machine learning algorithms, for example support vector machines, when the patterns involved are described by non-vectorial information. In particular, the interval data case is analysed as an illustrative example: explicit kernels based on the centre-radius diagram are formulated for closed bounded intervals on the real line.

- Kernel Methods | Pp. 252-259
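The centre-radius idea can be illustrated with a minimal Python sketch that maps a closed bounded interval to a (centre, radius) pair and applies a Gaussian kernel on that diagram. The Gaussian choice and the `gamma` parameter here are illustrative assumptions, not the explicit kernels formulated in the paper.

```python
import math

def interval_to_cr(lo, hi):
    """Map a closed interval [lo, hi] to its centre-radius representation."""
    return ((lo + hi) / 2.0, (hi - lo) / 2.0)

def interval_rbf_kernel(i1, i2, gamma=1.0):
    """Gaussian kernel on the centre-radius diagram (illustrative choice)."""
    c1, r1 = interval_to_cr(*i1)
    c2, r2 = interval_to_cr(*i2)
    d2 = (c1 - c2) ** 2 + (r1 - r2) ** 2
    return math.exp(-gamma * d2)

# identical intervals have kernel value 1
print(interval_rbf_kernel((0.0, 2.0), (0.0, 2.0)))  # -> 1.0
```

Any positive-definite kernel on the 2-D centre-radius points yields a valid kernel on intervals, which is what lets standard SVM machinery consume interval-valued patterns.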

An EA Multi-model Selection for SVM Multiclass Schemes

G. Lebrun; O. Lezoray; C. Charrier; H. Cardot

Multiclass problems with binary SVM classifiers are commonly treated as a decomposition into several binary sub-problems. An open question is how to properly tune all these sub-problems (SVM hyperparameters) so as to obtain the lowest error rate for a decomposition-based SVM multiclass scheme. In this paper, we propose a new approach to optimizing the generalization capacity of such SVM multiclass schemes. This approach consists of a global selection of the hyperparameters of all sub-problems together, and is denoted multi-model selection. Multi-model selection can outperform the classical individual model selection used until now in the literature. An evolutionary algorithm (EA) is proposed to perform multi-model selection. Experiments with our EA method show the benefits of our approach over the classical one.

- Kernel Methods | Pp. 260-267
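The global-versus-individual distinction can be sketched in a few lines: one chromosome concatenates the hyperparameters of every binary sub-problem, and fitness is evaluated on the whole scheme. The truncation-selection EA, the (log C, log gamma) encoding, and the toy fitness below are assumptions for illustration, not the authors' operators.

```python
import random

def evolve_multimodel(fitness, n_subproblems, pop_size=20, generations=30, seed=0):
    """Jointly evolve one (log C, log gamma) pair per binary sub-problem.

    A chromosome concatenates the hyperparameters of ALL sub-problems, so
    selection acts on the error of the full multiclass scheme rather than
    tuning each binary SVM in isolation.
    """
    rng = random.Random(seed)
    dim = 2 * n_subproblems
    pop = [[rng.uniform(-3.0, 3.0) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        parents = pop[: pop_size // 2]                    # truncation selection
        children = [[g + rng.gauss(0.0, 0.2) for g in p]  # Gaussian mutation
                    for p in parents]
        pop = parents + children
    return min(pop, key=fitness)

# Toy surrogate for the multiclass error surface: each of the 3 sub-problems
# happens to be best at (log C, log gamma) = (1.0, -1.0).
toy_error = lambda ch: sum((ch[2 * i] - 1.0) ** 2 + (ch[2 * i + 1] + 1.0) ** 2
                           for i in range(3))
best = evolve_multimodel(toy_error, n_subproblems=3)
```

In a real setting `fitness` would train the decomposition scheme and return its cross-validation error, which is exactly where the joint encoding pays off over per-SVM tuning.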

Classifier Complexity Reduction by Support Vector Pruning in Kernel Matrix Learning

V. Vijaya Saradhi; Harish Karnick

This paper presents an algorithm for reducing a classifier’s complexity by pruning support vectors in learning the kernel matrix. The proposed algorithm retains the ‘best’ support vectors such that the span of the support vectors, as defined by Vapnik and Chapelle, is as small as possible. Experiments on real-world data sets show that the number of support vectors can be reduced in some cases by as much as 85% with little degradation in generalization performance.

- Kernel Methods | Pp. 268-275
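A simplified sketch of support-vector pruning follows. It ranks support vectors by |alpha| and keeps the top fraction; this proxy is an assumption for illustration only, since the paper ranks by the Vapnik-Chapelle span, not by coefficient magnitude.

```python
def prune_support_vectors(alphas, keep_fraction=0.15):
    """Return indices of the support vectors kept after pruning.

    Illustrative proxy: rank by |alpha| (the paper instead minimises the
    Vapnik-Chapelle span of the retained support vectors).
    """
    n_keep = max(1, int(round(len(alphas) * keep_fraction)))
    ranked = sorted(range(len(alphas)), key=lambda i: abs(alphas[i]), reverse=True)
    return sorted(ranked[:n_keep])

# keep the 2 most influential of 5 support vectors
print(prune_support_vectors([0.1, 2.0, 0.05, 1.5, 0.3], keep_fraction=0.4))  # -> [1, 3]
```

A default `keep_fraction` of 0.15 mirrors the abstract's claim that up to 85% of support vectors can sometimes be removed with little loss.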

Multi-classification with Tri-class Support Vector Machines. A Review

C. Angulo; L. González; A. Català; F. Velasco

In this article, a tri-class SVM approach is addressed, with the aim of avoiding the loss of information that occurs in the usual one-versus-one SVM decomposition procedure of the two-phase (decomposition, reconstruction) multi-classification scheme. As the most relevant result, the improved robustness of the scheme based on the tri-class machine over that based on the bi-class machine will be demonstrated.

- Kernel Methods | Pp. 276-283

Tuning L1-SVM Hyperparameters with Modified Radius Margin Bounds and Simulated Annealing

Javier Acevedo; Saturnino Maldonado; Philip Siegmann; Sergio Lafuente; Pedro Gil

In the design of support vector machines, an important step is to select the optimal hyperparameters. One of the most widely used estimators of performance is the Radius-Margin bound. Some modifications of this bound have been made to adapt it to soft margin problems, giving a convex optimization problem for the L2 soft margin formulation. However, it is still interesting to consider the L1 case, due to the reduction in the number of support vectors. There have been some proposals to adapt the Radius-Margin bound to the L1 case, but gradient descent cannot be used to test some of them because the bounds are not differentiable. In this work we propose simulated annealing as a method to find the optimal hyperparameters when the bounds are not differentiable, have multiple local minima, or the kernel is not differentiable with respect to its hyperparameters.

- Kernel Methods | Pp. 284-291
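Simulated annealing needs only objective evaluations, never gradients, which is why it suits non-differentiable bounds. The sketch below is a generic annealer over log-scale hyperparameters; the cooling schedule, step size, and the toy non-differentiable "bound" are assumptions for illustration, not the paper's settings.

```python
import math
import random

def simulated_annealing(objective, x0, steps=200, t0=1.0, cooling=0.97, seed=0):
    """Minimise `objective` by simulated annealing.

    `objective` need not be differentiable, which is the point of using
    SA for non-smooth radius-margin style bounds.
    """
    rng = random.Random(seed)
    x, fx = list(x0), objective(x0)
    best, fbest = list(x), fx
    t = t0
    for _ in range(steps):
        cand = [xi + rng.gauss(0.0, 0.3) for xi in x]   # random neighbour
        fc = objective(cand)
        # accept improvements always, worsenings with Boltzmann probability
        if fc < fx or rng.random() < math.exp(-(fc - fx) / t):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = list(x), fx
        t *= cooling                                    # geometric cooling
    return best, fbest

# toy non-differentiable "bound": minimum at log C = 1, log gamma = -2
bound = lambda p: abs(p[0] - 1.0) + abs(p[1] + 2.0)
params, value = simulated_annealing(bound, [0.0, 0.0])
```

In practice `objective` would evaluate the chosen L1 Radius-Margin-style bound after training the SVM at the candidate hyperparameters.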

Well-Distributed Pareto Front by Using the Evolutionary Algorithm

J. M. Herrero; M. Martínez; J. Sanchis; X. Blasco

In the field of multiobjective optimization, important efforts have been made in recent years to generate uniformly distributed global Pareto fronts. A new multiobjective evolutionary algorithm has been designed to converge towards a reduced but well-distributed representation of the Pareto set. The algorithm achieves good convergence and distribution of the Pareto front with bounded memory requirements, which are established with one of its parameters. Finally, an optimization problem of a three-bar truss is presented to illustrate the algorithm's performance.

- Evolutionary and Genetic Algorithms | Pp. 292-299
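The core operation shared by such algorithms is the non-dominance filter. The following generic sketch (for minimisation) is not the paper's specific algorithm, only the Pareto-dominance test that any well-distributed front must satisfy.

```python
def dominates(a, b):
    """a Pareto-dominates b (minimisation): no worse everywhere, better somewhere."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Return the non-dominated subset of a list of objective vectors."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

pts = [(1.0, 5.0), (2.0, 3.0), (4.0, 4.0), (3.0, 1.0)]
print(pareto_front(pts))  # -> [(1.0, 5.0), (2.0, 3.0), (3.0, 1.0)]
```

Here (4.0, 4.0) is discarded because (2.0, 3.0) is at least as good in both objectives and strictly better in one.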

The Parallel Single Front Genetic Algorithm (PSFGA) in Dynamic Multi-objective Optimization

Mario Cámara; Julio Ortega; Francisco de Toro

This paper analyzes the use of the previously proposed Parallel Single Front Genetic Algorithm (PSFGA) in applications in which the objective functions, the restrictions, and hence also the solutions, can change over time. These dynamic optimization problems appear in quite different real applications with relevant socio-economic impacts. PSFGA uses a master process that distributes the population among the processors in the system (which evolve their corresponding solutions according to an island model), and collects and adjusts the set of local Pareto fronts found by each processor (this way, the master also allows an implicit communication among islands). The procedure exclusively uses non-dominated individuals for selection and variation, and maintains the diversity of the approximation to the Pareto front by using a strategy based on a crowding distance.

- Evolutionary and Genetic Algorithms | Pp. 300-307
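The master's collect-and-adjust step can be sketched schematically: each island reports its local non-dominated set, and the master merges them and re-filters globally. This sequential sketch is an illustrative assumption; the actual PSFGA runs islands in parallel and also redistributes the population.

```python
def dominates(a, b):
    """a Pareto-dominates b (minimisation)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def local_front(pop):
    """Non-dominated subset of one island's population."""
    return [p for p in pop if not any(dominates(q, p) for q in pop)]

def master_merge(islands):
    """Merge the islands' local Pareto fronts and re-filter globally,
    mirroring the master's collect-and-adjust step (schematic only)."""
    merged = [p for isl in islands for p in local_front(isl)]
    return local_front(merged)

islands = [[(1, 5), (2, 6)], [(2, 3), (3, 3)], [(4, 1), (4, 2)]]
print(master_merge(islands))  # -> [(1, 5), (2, 3), (4, 1)]
```

Because the merged front is redistributed, islands implicitly see each other's progress without direct communication, which is the point made in the abstract.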

Exploring Macroevolutionary Algorithms: Some Extensions and Improvements

J. A. Becerra; V. Díaz Casás; R. J. Duro

Macroevolutionary Algorithms seem to work better than other Evolutionary Algorithms on problems with small populations where the evaluation of individuals is computationally very expensive, or on very difficult search spaces with multiple narrow hyper-dimensional peaks separated by large areas of identical fitness. This paper focuses on some aspects of Macroevolutionary Algorithms, introducing modifications that address weak points in the original algorithm which are very relevant in some types of complex real-world problems. All the modifications to the algorithm are tested on real-world problems.

- Evolutionary and Genetic Algorithms | Pp. 308-315

Optimal Scheduling of Multiple Dam System Using Harmony Search Algorithm

Zong Woo Geem

The musicians’ behavior-inspired harmony search (HS) algorithm was applied for the first time to the optimal operation scheduling of a multiple-dam system. The HS model tackled a popular benchmark system with four dams. Results showed that the HS model found five different global optimal solutions with identical maximum benefit from hydropower generation and irrigation, while an enhanced GA model (real-value coding, tournament selection, uniform crossover, and modified uniform mutation) found only near-optimal solutions under the same number of function evaluations. Furthermore, the HS model arrived at the global optima without performing any sensitivity analysis of algorithm parameters, whereas the GA model required tedious sensitivity analysis.

- Evolutionary and Genetic Algorithms | Pp. 316-323
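A textbook variant of harmony search is easy to sketch: new candidates are assembled note-by-note from a harmony memory (rate HMCR), occasionally pitch-adjusted (rate PAR), or drawn at random. The toy sphere objective and all parameter values below are illustrative assumptions, not the dam-scheduling model.

```python
import random

def harmony_search(objective, bounds, hms=10, hmcr=0.9, par=0.3,
                   bw=0.05, iterations=500, seed=1):
    """Minimise `objective` with a basic harmony search (textbook variant).

    hms: harmony memory size; hmcr: memory-considering rate;
    par: pitch-adjusting rate; bw: pitch-adjustment bandwidth.
    """
    rng = random.Random(seed)
    memory = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(hms)]
    scores = [objective(h) for h in memory]
    for _ in range(iterations):
        new = []
        for d, (lo, hi) in enumerate(bounds):
            if rng.random() < hmcr:                       # take note from memory
                value = memory[rng.randrange(hms)][d]
                if rng.random() < par:                    # pitch adjustment
                    value = min(hi, max(lo, value + rng.uniform(-bw, bw)))
            else:                                         # fresh random note
                value = rng.uniform(lo, hi)
            new.append(value)
        score = objective(new)
        worst = max(range(hms), key=scores.__getitem__)
        if score < scores[worst]:                         # replace worst harmony
            memory[worst], scores[worst] = new, score
    best = min(range(hms), key=scores.__getitem__)
    return memory[best], scores[best]

sphere = lambda x: sum(v * v for v in x)
sol, val = harmony_search(sphere, [(-5.0, 5.0), (-5.0, 5.0)])
```

For the dam problem, `objective` would instead return the negated benefit from hydropower and irrigation subject to release constraints.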

CoEvRBFN: An Approach to Solving the Classification Problem with a Hybrid Cooperative-Coevolutive Algorithm

M. Dolores Pérez-Godoy; Antonio J. Rivera; M. José del Jesus; Ignacio Rojas

This paper presents a new cooperative-coevolutive algorithm for the design of Radial Basis Function Networks (RBFNs) for classification problems. The algorithm promotes a coevolutive environment where each individual represents a radial basis function (RBF) and the entire population is responsible for the final solution. For credit assignment, three quality factors are considered, which measure the role of each RBF in the whole RBFN. A Fuzzy Rule Based System is used to calculate the application probability of the coevolutive operators. Evaluation of the algorithm on different datasets has shown promising results.

- Evolutionary Learning | Pp. 324-332
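The object being evolved, an RBFN classifier where each individual contributes one basis function, can be sketched as a plain forward pass. This shows only the network evaluation; the coevolution, credit assignment, and fuzzy operator selection are not modelled, and the centres, widths, and weights below are illustrative assumptions.

```python
import math

def rbf_activation(x, centre, width):
    """Gaussian radial basis function activation."""
    d2 = sum((xi - ci) ** 2 for xi, ci in zip(x, centre))
    return math.exp(-d2 / (2.0 * width ** 2))

def rbfn_predict(x, rbfs, weights, bias=0.0):
    """Linear combination of RBF activations; the sign gives the class."""
    s = bias + sum(w * rbf_activation(x, c, r)
                   for (c, r), w in zip(rbfs, weights))
    return 1 if s >= 0 else -1

# two RBFs, one "individual" per basis function in the coevolutive view
rbfs = [((0.0, 0.0), 1.0), ((3.0, 3.0), 1.0)]
weights = [1.0, -1.0]
print(rbfn_predict((0.1, -0.2), rbfs, weights))  # -> 1
print(rbfn_predict((2.9, 3.2), rbfs, weights))   # -> -1
```

In the paper's setting the population jointly forms `rbfs`, so each RBF's fitness must reflect its contribution to the whole network rather than its accuracy alone.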