Publications catalogue - books
Computational and Ambient Intelligence: 9th International Work-Conference on Artificial Neural Networks, IWANN 2007, San Sebastián, Spain, June 20-22, 2007. Proceedings
Francisco Sandoval ; Alberto Prieto ; Joan Cabestany ; Manuel Graña (eds.)
Conference: 9th International Work-Conference on Artificial Neural Networks (IWANN), San Sebastián, Spain, June 20-22, 2007
Abstract/Description - provided by the publisher
Not available.
Keywords - provided by the publisher
Artificial Intelligence (incl. Robotics); Computation by Abstract Devices; Algorithm Analysis and Problem Complexity; Image Processing and Computer Vision; Pattern Recognition; Computational Biology/Bioinformatics
Availability
| Institution detected | Publication year | Browse | Download | Request |
|---|---|---|---|---|
| Not detected | 2007 | SpringerLink | | |
Information
Resource type:
books
Print ISBN
978-3-540-73006-4
Electronic ISBN
978-3-540-73007-1
Publisher
Springer Nature
Country of publication
United Kingdom
Publication date
2007
Copyright information
© Springer-Verlag Berlin Heidelberg 2007
Table of contents
Surface Modelling with Radial Basis Functions Neural Networks Using Virtual Environments
Miguel Ángel López; Héctor Pomares; Miguel Damas; Antonio Díaz-Estrella; Alberto Prieto; Francisco Pelayo; Eva María de la Plaza Hernández
Modelling capabilities of Radial Basis Function Neural Networks (RBFNNs) depend strongly on four main factors: the number of neurons, the central location of each neuron, their associated weights, and their widths (radii). In order to model surfaces defined, for example, as z = f(x, y), it is common to use three-dimensional Gaussian functions with centres in the (x, y) domain. In this scenario, it is very useful to have visual environments where the user can interact with every radial basis function, modifying, inserting and removing them, thus visually attaining an initial configuration as similar as possible to the surface to be approximated. In this way, the user (the novice researcher) can learn how every factor affects the approximation capability of the network, thus gaining important knowledge about how the algorithms proposed in the literature tend to improve the approximation accuracy. This paper presents a didactic tool we have developed to facilitate the understanding of surface modelling concepts with ANNs in general, and with RBFNNs in particular, with the aid of a virtual environment.
- Improving Models and Learning Procedures | Pp. 170-177
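The abstract above does not describe the tool at code level, but the underlying surface model is a weighted sum of Gaussian RBFs. A minimal sketch of that model (the function name `rbf_surface` and its parameters are illustrative, not from the paper):

```python
import numpy as np

def rbf_surface(x, y, centres, radii, weights):
    """Evaluate a weighted sum of 2-D Gaussian RBFs at the points (x, y).

    Each neuron k contributes w_k * exp(-((x-cx_k)^2 + (y-cy_k)^2) / (2 r_k^2)),
    so the four factors from the abstract (count, centres, weights, radii)
    appear explicitly as the lengths and values of the parameter lists.
    """
    x, y = np.asarray(x, float), np.asarray(y, float)
    z = np.zeros_like(x)
    for (cx, cy), r, w in zip(centres, radii, weights):
        z += w * np.exp(-((x - cx) ** 2 + (y - cy) ** 2) / (2.0 * r ** 2))
    return z

# A single unit-weight Gaussian centred at the origin gives z = 1 there.
z0 = rbf_surface(0.0, 0.0, centres=[(0.0, 0.0)], radii=[1.0], weights=[1.0])
```

Interactively moving a centre, changing a radius, or rescaling a weight in a visual tool corresponds to editing the respective list entries here and re-evaluating the surface.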
A New Learning Strategy for Classification Problems with Different Training and Test Distributions
Óscar Pérez; Manuel Sánchez-Montañés
Standard machine learning techniques assume that the statistical structure of the training and test datasets is the same (i.e. the same attribute distribution p(x) and the same class distribution p(c|x)). However, in real prediction problems this is usually not the case, for different reasons. For example, the training set is often not representative of the whole problem due to sample selection biases during its acquisition. In addition, the measurement biases in training may differ from those in test (for example, when the measurement devices are different). Another reason is that in real prediction tasks the statistical structure of the classes is usually not static but evolves in time, and there is usually a time lag between training and test sets. Due to these problems, the performance of a learning algorithm can degrade severely. Here we present a new learning strategy that constructs a classifier in two steps. First, the labeled examples of the training set are used to construct a statistical model of the problem. In the second step, the model is improved using the unlabeled patterns of the test set by means of a novel extension of the Expectation-Maximization (EM) algorithm presented here. We show the convergence properties of the algorithm and illustrate its performance with an artificial problem. Finally, we demonstrate its strengths in a heart disease diagnosis problem where the training set is taken from a different hospital than the test set.
- Improving Models and Learning Procedures | Pp. 178-185
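The paper's EM extension is not specified in the abstract. As a hedged illustration of the general idea, adapting a trained model to an unlabeled test set with EM, the sketch below re-estimates the class priors from test-set posteriors (the classic prior re-adjustment scheme, not necessarily the authors' algorithm; `em_adjust_priors` is a hypothetical name):

```python
import numpy as np

def em_adjust_priors(test_probs, train_priors, n_iter=50):
    """Re-estimate class priors on an unlabeled test set via EM.

    test_probs: (n, c) array of p(class | x) produced by a classifier that
    was trained under priors `train_priors`.  Returns priors adapted to the
    test distribution; the posteriors can then be re-weighted accordingly.
    """
    priors = np.asarray(train_priors, float)
    # Class-conditional likelihoods p(x | class) are proportional to
    # posterior / training prior, so divide the shift out once up front.
    lik = np.asarray(test_probs, float) / priors
    for _ in range(n_iter):
        post = lik * priors                     # E-step: unnormalised posteriors
        post /= post.sum(axis=1, keepdims=True)
        priors = post.mean(axis=0)              # M-step: updated prior estimate
    return priors

# If the classifier consistently favours class 0 on the test set,
# EM shifts the prior mass towards class 0.
probs = np.tile([0.9, 0.1], (20, 1))
new_priors = em_adjust_priors(probs, [0.5, 0.5])
```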
Gaussian Fitting Based FDA for Chemometrics
Tuomas Kärnä; Amaury Lendasse
In Functional Data Analysis (FDA), multivariate data are considered as sampled functions. We propose an unsupervised method for finding a good function basis that is built on the data set. The basis consists of a set of Gaussian kernels that are optimized for an accurate fit. The proposed methodology is tested on two spectrometric data sets. The obtained weights are further scaled using the Delta Test (DT) to improve the prediction performance. A Least Squares Support Vector Machine (LS-SVM) model is used for the final estimation.
- Improving Models and Learning Procedures | Pp. 186-193
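The core fitting step, projecting a sampled spectrum onto a Gaussian kernel basis, can be written as an ordinary least-squares problem. A minimal sketch (the centres, width, and function name are assumptions for illustration, not the paper's optimized values):

```python
import numpy as np

def fit_gaussian_basis(wavelengths, spectrum, centres, width):
    """Least-squares weights for a fixed Gaussian basis fitted to one spectrum."""
    t = np.asarray(wavelengths, float)[:, None]
    c = np.asarray(centres, float)[None, :]
    # Design matrix: one Gaussian kernel per column, evaluated at each sample.
    phi = np.exp(-((t - c) ** 2) / (2.0 * width ** 2))
    w, *_ = np.linalg.lstsq(phi, np.asarray(spectrum, float), rcond=None)
    return w, phi @ w          # fitted weights and the reconstructed spectrum

# A spectrum that is itself one of the kernels is recovered exactly,
# with all the weight on the matching kernel.
t = np.linspace(0.0, 1.0, 50)
target = np.exp(-((t - 0.5) ** 2) / (2 * 0.1 ** 2))
w, recon = fit_gaussian_basis(t, target, centres=[0.2, 0.5, 0.8], width=0.1)
```

The resulting weight vector `w` is the functional representation that downstream steps (DT scaling, LS-SVM regression) would consume.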
Two Pages Graph Layout Via Recurrent Multivalued Neural Networks
Domingo López-Rodríguez; Enrique Mérida-Casermeiro; Juan M. Ortíz-de-Lazcano-Lobato; Gloria Galán-Marín
In this work, we propose the use of two neural models operating jointly to minimize the same energy function. The model is focused on obtaining good solutions for the two-page book crossing problem, although some other problems can be efficiently solved by the same model. The neural technique applied to this problem reduces the energy function by changing the outputs of both networks: the outputs of the first network represent the location of the nodes on the node line, while those of the second indicate the half-plane in which each edge is drawn.
A detailed description of the model is presented, and the technique used to minimize the energy function is fully described. It has proved to be a very competitive and efficient algorithm, in terms of solution quality and computational time, when compared to state-of-the-art methods. Some simulation results are presented to show the comparative efficiency of the methods.
- Improving Models and Learning Procedures | Pp. 194-202
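The quantity being minimized, the number of edge crossings in a two-page book drawing, can be computed directly: two edges on the same page cross exactly when their endpoints interleave along the node line. A small sketch of this objective (the names are illustrative, not from the paper):

```python
from itertools import combinations

def two_page_crossings(order, pages, edges):
    """Count crossings in a two-page book embedding.

    order: dict node -> position on the node line (the spine).
    pages: dict edge -> 0 or 1 (which half-plane the edge is drawn in).
    Two same-page edges (a,b) and (c,d) cross iff a < c < b < d (or vice versa).
    """
    crossings = 0
    for e, f in combinations(edges, 2):
        if pages[e] != pages[f]:
            continue                       # different half-planes never cross
        a, b = sorted((order[e[0]], order[e[1]]))
        c, d = sorted((order[f[0]], order[f[1]]))
        if a < c < b < d or c < a < d < b:
            crossings += 1
    return crossings

# K4 with all edges on one page has one crossing ((0,2) against (1,3));
# moving edge (1,3) to the other page removes it.
order = {i: i for i in range(4)}
edges = [(0, 1), (0, 2), (0, 3), (1, 2), (1, 3), (2, 3)]
one_page = two_page_crossings(order, {e: 0 for e in edges}, edges)
pages = {e: 0 for e in edges}
pages[(1, 3)] = 1
two_pages = two_page_crossings(order, pages, edges)
```

The two networks in the paper correspond to the two decision variables here: `order` (first network) and `pages` (second network).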
Speeding Up the Dissimilarity Self-Organizing Maps by Branch and Bound
Brieuc Conan-Guez; Fabrice Rossi
This paper proposes to apply the branch-and-bound principle from combinatorial optimization to the Dissimilarity Self-Organizing Map (DSOM), a variant of the SOM that can handle dissimilarity data. A new reference model optimization method is derived from this principle. Its results are strictly identical to those of the original DSOM algorithm by Kohonen and Somervuo, while its running time is reduced by a factor of up to 2.5 compared to that of the previously proposed optimized implementation.
- Self-organizing Networks | Pp. 203-210
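The bound-and-prune idea can be illustrated on the DSOM's costly inner step, choosing a reference model that minimizes a sum of dissimilarities: a candidate is abandoned as soon as its partial sum exceeds the best cost found so far. This sketch shows only the pruning principle, not the authors' actual algorithm:

```python
def best_reference(candidates, members, dissim):
    """Pick the candidate minimising the summed dissimilarity to `members`.

    Because dissimilarities are non-negative, a candidate whose partial sum
    already reaches the best known cost cannot win, so it is pruned early.
    The result is identical to exhaustive search, only faster on average.
    """
    best, best_cost = None, float("inf")
    for c in candidates:
        cost = 0.0
        for m in members:
            cost += dissim(c, m)
            if cost >= best_cost:      # partial sum already too large: prune
                break
        else:
            best, best_cost = c, cost  # completed the sum with a new optimum
    return best, best_cost

# Toy dissimilarity: absolute difference on numbers.
best, cost = best_reference([0, 5, 10], [4, 5, 6], lambda a, b: abs(a - b))
```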
Self-organization of Probabilistic PCA Models
Ezequiel López-Rubio; Juan Miguel Ortiz-de-Lazcano-Lobato; Domingo López-Rodríguez; María del Carmen Vargas-González
We present a new neural model, which extends Kohonen's self-organizing map (SOM) by performing a Probabilistic Principal Components Analysis (PPCA) at each neuron. Several self-organizing maps have been proposed in the literature to capture the local principal subspaces, but our approach offers a probabilistic model at each neuron while having linear complexity in the dimensionality of the input space. This makes it possible to process very high-dimensional data and to obtain reliable estimations of the local probability densities, which are based on the PPCA framework. Experimental results are presented which show the map formation capabilities of the proposal with high-dimensional data.
- Self-organizing Networks | Pp. 211-218
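The PPCA model fitted at each neuron has a closed-form maximum-likelihood solution (Tipping and Bishop). A minimal sketch for one neuron's assigned data, assuming a plain eigendecomposition of the sample covariance (the paper's actual per-neuron update is not detailed in the abstract):

```python
import numpy as np

def ppca(X, q):
    """Closed-form probabilistic PCA for a (n, d) data matrix X.

    Returns the loading matrix W (d, q) and the isotropic noise variance
    sigma^2: sigma^2 is the mean of the discarded eigenvalues, and
    W = U_q (L_q - sigma^2 I)^(1/2) with U_q, L_q the top-q eigenpairs.
    """
    Xc = X - X.mean(axis=0)
    C = Xc.T @ Xc / len(X)                       # sample covariance
    evals, evecs = np.linalg.eigh(C)             # ascending order
    evals, evecs = evals[::-1], evecs[:, ::-1]   # make it descending
    sigma2 = evals[q:].mean() if len(evals) > q else 0.0
    W = evecs[:, :q] * np.sqrt(np.maximum(evals[:q] - sigma2, 0.0))
    return W, sigma2

# Data with almost all variance along the first axis: the recovered noise
# variance is small and the single loading vector points along that axis.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3)) * np.array([2.0, 0.1, 0.1])
W, sigma2 = ppca(X, 1)
```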
A New Adaptation of Self-Organizing Map for Dissimilarity Data
Tien Ho-Phuoc; Anne Guérin-Dugué
Adaptation of the Self-Organizing Map to dissimilarity data is of growing interest. For many applications a vector representation is not available; only proximity data (distances, dissimilarities, similarities, ranks ...) are. In this article, we present a new adaptation of the SOM algorithm, which is compared with two existing ones. Three quality metrics (quantization and neighborhood) are used for the comparison. Numerical experiments on artificial and real data show the quality of the algorithm. The strong point of the proposed algorithm is a more accurate prototype estimate, which is one of the difficult parts of Dissimilarity SOM (DSOM) algorithms.
- Self-organizing Networks | Pp. 219-226
Fusion of Self Organizing Maps
Carolina Saavedra; Rodrigo Salas; Sebastián Moreno; Héctor Allende
An important issue in data-mining is to find effective and optimal forms to learn and preserve the topological relations of highly dimensional input spaces and project the data to lower dimensions for visualization purposes.
In this paper we propose a novel ensemble method to combine a finite number of Self Organizing Maps; we call this model Fusion-SOM. In the fusion process, nodes with similar Voronoi polygons are merged into one fused node, and the neighborhood relation is given by links that measure the similarity between these fused nodes. The aim of combining the SOMs is to improve the quality and robustness of the topological representation of the single model.
Computational experiments show that the Fusion-SOM model effectively preserves the topology of the input space and improves the representation of the single SOM. We report the performance results using synthetic and real datasets, the latter obtained from a benchmark site.
- Self-organizing Networks | Pp. 227-234
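The abstract does not detail the merging rule. As a loose simplification of fusing nodes with similar Voronoi regions, the sketch below greedily merges prototype vectors from several maps whenever they lie within a distance threshold (the threshold criterion and all names are assumptions, not the paper's method):

```python
import numpy as np

def fuse_prototypes(maps, threshold):
    """Greedy fusion of prototype vectors from several trained SOMs.

    Prototypes closer than `threshold` to an existing group's mean are
    absorbed into that group; each group's mean becomes one fused node.
    """
    pool = [np.asarray(p, float) for som in maps for p in som]
    groups = []                       # each group: [running sum, count]
    for p in pool:
        for g in groups:
            if np.linalg.norm(p - g[0] / g[1]) < threshold:
                g[0] += p             # absorb prototype into this group
                g[1] += 1
                break
        else:
            groups.append([p.copy(), 1])
    return [s / n for s, n in groups]

# Two maps whose prototypes nearly coincide fuse into two nodes.
maps = [[[0.0, 0.0], [10.0, 10.0]], [[0.1, 0.0], [10.0, 10.1]]]
nodes = fuse_prototypes(maps, threshold=1.0)
```

In the paper the similarity is judged on Voronoi polygons rather than raw prototype distance, and the links between fused nodes carry the neighborhood structure.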
ViSOM Ensembles for Visualization and Classification
Bruno Baruque; Emilio Corchado; Hujun Yin
In this paper, ensemble techniques are applied in the framework of topology-preserving mappings in two applications: classification and visualization. These techniques are applied for the first time to the ViSOM, and their performance is compared with ensemble combinations of other topology-preserving mappings such as the SOM or the MLSIM. Several methods to obtain a meaningful combination of the components of an ensemble are presented and tested, together with the existing ones, in order to identify the best performing method for the applications of these models.
- Self-organizing Networks | Pp. 235-243
Adaptive Representation of Objects Topology Deformations with Growing Neural Gas
José García-Rodríguez; Francisco Flórez-Revuelta; Juan Manuel García-Chamizo
Self-organising neural networks try to preserve the topology of an input space by means of competitive learning. This capacity has been used, among other applications, for the representation of objects and their motion. In this work we use a kind of self-organising network, the Growing Neural Gas, to represent deformations in objects along a sequence of images. As a result of an adaptive process, the objects are represented by a topology-representing graph that constitutes an induced Delaunay triangulation of their shapes. These maps adapt to changes in the objects' topology without resetting the learning process.
- Self-organizing Networks | Pp. 244-251
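A single Growing Neural Gas adaptation step can be sketched as follows. This is a simplification of the standard GNG update (no error accumulation, edge aging, or node insertion, which the full algorithm also performs; the learning rates are the commonly used defaults, not values from the paper):

```python
import numpy as np

def gng_step(nodes, edges, x, eps_b=0.2, eps_n=0.006):
    """One simplified Growing Neural Gas adaptation step.

    nodes: list of mutable prototype vectors; edges: set of (i, j) tuples.
    The winner moves towards the input x, its topological neighbours move
    less, and the winner/second-winner edge is (re)created.
    """
    d = [np.linalg.norm(x - w) for w in nodes]
    s1, s2 = np.argsort(d)[:2]                  # winner and second winner
    nodes[s1] += eps_b * (x - nodes[s1])        # move winner towards x
    for a, b in edges:
        if s1 in (a, b):                        # move topological neighbours
            n = b if a == s1 else a
            nodes[n] += eps_n * (x - nodes[n])
    edges.add(tuple(sorted((int(s1), int(s2)))))
    return int(s1)

# The node nearest the input wins, moves 20% of the way towards it,
# and becomes connected to the runner-up.
nodes = [np.array([0.0, 0.0]), np.array([1.0, 0.0]), np.array([5.0, 5.0])]
edges = set()
winner = gng_step(nodes, edges, np.array([0.2, 0.0]))
```

Repeating such steps over the pixels of each frame lets the graph track an object's deformations without restarting learning, which is the behaviour the abstract describes.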