Publications catalogue - books

Simulated Evolution and Learning: 6th International Conference, SEAL 2006, Hefei, China, October 15-18, 2006, Proceedings

Tzai-Der Wang; Xiaodong Li; Shu-Heng Chen; Xufa Wang; Hussein Abbass; Hitoshi Iba; Guo-Liang Chen; Xin Yao (eds.)

Conference: 6th Asia-Pacific Conference on Simulated Evolution and Learning (SEAL), Hefei, China, October 15-18, 2006

Abstract/Description – provided by the publisher

Not available.

Keywords – provided by the publisher

Computation by Abstract Devices; Artificial Intelligence (incl. Robotics); Simulation and Modeling; User Interfaces and Human Computer Interaction; Discrete Mathematics in Computer Science; Computer Appl. in Social and Behavioral Sciences

Availability

Institution detected: Not detected
Year of publication: 2006
Browse: SpringerLink

Information

Resource type:

books

Print ISBN

978-3-540-47331-2

Electronic ISBN

978-3-540-47332-9

Publisher

Springer Nature

Country of publication

United Kingdom

Publication date

Publication rights information

© Springer-Verlag Berlin Heidelberg 2006

Table of contents

A New Dynamic Particle Swarm Optimizer

Binbin Zheng; Yuanxiang Li; Xianjun Shen; Bojin Zheng

This paper presents a new optimization model, the Dynamic Particle Swarm Optimizer (DPSO). A new acceptance rule, based on the principle of minimal free energy from statistical mechanics, is introduced into the standard particle swarm optimizer. A definition of the entropy of the particle system is given, and the law of entropy increase is then applied to control the algorithm. Simulations illustrate the significant and effective impact of this new rule on the particle swarm optimizer.

- Evolutionary Optimisation | Pp. 481-488
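The abstract does not give the acceptance rule's exact form; as a minimal sketch, a minimal-free-energy rule can be assumed to act like a Metropolis criterion, accepting worse positions with a temperature-controlled probability. The function name and parameters below are illustrative, not from the paper.

```python
import math
import random

def accept(curr_f, cand_f, temperature, rng=random.random):
    """Metropolis-style acceptance for a minimising particle swarm:
    improvements are always accepted; worse candidates are accepted
    with probability exp(-(cand_f - curr_f) / temperature), which lets
    particles occasionally move uphill and preserves swarm entropy."""
    if cand_f <= curr_f:          # better or equal candidate: accept
        return True
    return rng() < math.exp(-(cand_f - curr_f) / temperature)
```

Lowering `temperature` over the run makes the rule increasingly greedy, mirroring how the entropy law is said to control the algorithm.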

A Diversity-Guided Quantum-Behaved Particle Swarm Optimization Algorithm

Jun Sun; Wenbo Xu; Wei Fang

One of the primary complaints about Particle Swarm Optimization (PSO) is the occurrence of premature convergence. Quantum-behaved Particle Swarm Optimization (QPSO), a novel variant of PSO, is a globally convergent algorithm whose search strategy gives it stronger global search ability than PSO. But as in PSO and other evolutionary optimization techniques, premature convergence in QPSO is also inevitable, and may worsen as the problem to be solved becomes more complex. In this paper, we propose a new Diversity-Guided QPSO (DGQPSO), in which a mutation operation is exerted on the global best particle to prevent the swarm from clustering, enabling particles to escape sub-optima more easily. The DGQPSO, along with PSO and QPSO, is tested on several benchmark functions for performance comparison. The experimental results show that DGQPSO outperforms PSO and QPSO in alleviating premature convergence.

- Evolutionary Optimisation | Pp. 497-504
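The diversity-guided mutation can be sketched as follows; the diversity measure (mean distance to the swarm centroid), the threshold, and the Gaussian mutation are common choices assumed here, not taken from the paper.

```python
import math
import random

def diversity(swarm):
    """Mean Euclidean distance of the particles to the swarm centroid."""
    dim = len(swarm[0])
    centroid = [sum(p[d] for p in swarm) / len(swarm) for d in range(dim)]
    return sum(math.dist(p, centroid) for p in swarm) / len(swarm)

def guided_gbest(gbest, swarm, d_low=1e-2, sigma=0.1, rng=random.gauss):
    """When diversity drops below d_low, perturb the global best with
    Gaussian noise so the clustered swarm can escape a sub-optimum."""
    if diversity(swarm) < d_low:
        return [x + rng(0.0, sigma) for x in gbest]
    return gbest
```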

Multimodal Optimisation with Structured Populations and Local Environments

Grant Dick; Peter A. Whigham

Spatially-structured evolutionary algorithms are frequently implemented using a homogeneous environment throughout space. Such a configuration does not promote local adaptation of individuals in space. This paper introduces an evolutionary algorithm using space and localised environments to promote speciation. Surprisingly, a randomly generated “rugged” landscape appears to best support speciation by encouraging crossover between niches, while maintaining locally distinct species.

- Evolutionary Optimisation | Pp. 505-512

A Time Complexity Analysis of ACO for Linear Functions

Zhifeng Hao; Han Huang; Xili Zhang; Kun Tu

The time complexity analysis of ant colony optimization (ACO) is one of the open problems in ACO research, on which little work has been proposed so far. In the present paper, two ACO algorithms (ACO I and ACO II) for linear functions with Boolean input are presented, and their time complexity is estimated using drift analysis, a mathematical tool for analyzing evolutionary algorithms. It is proved that ACO II can find the optimal solution with polynomial time complexity. This is preliminary work on estimating the time complexity of ACO, to be improved in future studies.

- Evolutionary Optimisation | Pp. 513-520
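Drift analysis, the tool named above, rests on the additive drift theorem; a standard statement of that theorem (not the paper's specific derivation) is:

```latex
\textbf{Additive drift theorem.} Let $(X_t)_{t \ge 0}$ be a stochastic
process over $\mathbb{R}_{\ge 0}$ with first hitting time
$T = \min\{t : X_t = 0\}$. If there exists $\delta > 0$ such that
\[
  \mathbb{E}[X_t - X_{t+1} \mid X_t > 0] \ \ge\ \delta
  \quad \text{for all } t \ge 0,
\]
then the expected hitting time satisfies
$\mathbb{E}[T \mid X_0] \le X_0 / \delta$.
\]
Here $X_t$ measures the distance of the algorithm's state at time $t$
from the optimum; a polynomial bound on $X_0/\delta$ yields a polynomial
expected running time.
```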

Particle Swarm Optimization Based on Information Diffusion and Clonal Selection

Yanping Lv; Shaozi Li; Shuili Chen; Qingshan Jiang; Wenzhong Guo

A novel PSO algorithm called InformPSO is introduced in this paper. Premature convergence is a deficiency of PSOs. First, we analyze the causes of premature convergence in conventional PSO. Second, the principles of information diffusion and clonal selection are incorporated into the proposed PSO algorithm to achieve better diversity and to break away from local optima. Finally, compared with several other PSO variants, it yields better performance on the optimization of unimodal and multimodal benchmark functions.

- Evolutionary Optimisation | Pp. 521-528

Evolutionary Bayesian Classifier-Based Optimization in Continuous Domains

Teresa Miquélez; Endika Bengoetxea; Pedro Larrañaga

In this work, we present a generalisation to continuous domains of an optimization method based on evolutionary computation that applies Bayesian classifiers in the learning process. The main difference between other estimation of distribution algorithms (EDAs) and this new method –known as Evolutionary Bayesian Classifier-based Optimization Algorithms (EBCOAs)– is the way the fitness function is taken into account, as a new variable, to generate the probabilistic graphical model that will be applied for sampling the next population.

We also present experimental results comparing the performance of this new method with other methods from the evolutionary computation field, such as evolution strategies and EDAs. The results obtained show that this new approach achieves performance at least similar to that of these other paradigms.

- Evolutionary Optimisation | Pp. 529-536
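The shared skeleton of EDAs (and, with a classifier in place of the density model, of EBCOAs) is an estimate-and-sample loop. The sketch below is a plain per-dimension Gaussian EDA, shown only to illustrate that loop; EBCOAs differ in learning a Bayesian classifier over fitness classes. All names and parameters are illustrative.

```python
import random
import statistics

def gaussian_eda(f, dim, pop=60, elite=15, iters=40, seed=1):
    """Minimal continuous EDA: fit a per-dimension Gaussian to the best
    `elite` individuals, then sample the rest of the next population
    from it. Elites are carried over, so the best value never worsens."""
    rng = random.Random(seed)
    xs = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop)]
    for _ in range(iters):
        xs.sort(key=f)                      # minimisation
        best = xs[:elite]
        mu = [statistics.mean(x[d] for x in best) for d in range(dim)]
        sd = [statistics.stdev(x[d] for x in best) + 1e-9 for d in range(dim)]
        xs = best + [[rng.gauss(mu[d], sd[d]) for d in range(dim)]
                     for _ in range(pop - elite)]
    return min(xs, key=f)
```

On the sphere function the loop contracts quickly toward the origin, which is the behaviour the estimate-and-sample scheme is designed to produce.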

A Simple Ranking and Selection for Constrained Evolutionary Optimization

Ehab Z. Elfeky; Ruhul A. Sarker; Daryl L. Essam

Many optimization problems arising in practical applications have functional constraints, and some of these constraints are active, meaning that they prevent any solution from improving the objective function value beyond the constraint limits. Therefore, the optimal solution usually lies on the boundary of the feasible region. In order to converge faster when solving such problems, a new ranking and selection scheme is introduced which exploits this feature of constrained problems. In conjunction with selection, a new crossover method based on three parents is also presented. When the results of this new algorithm are compared with those of four other evolutionary methods on nine benchmark problems from the relevant literature, it shows very encouraging performance.

- Evolutionary Optimisation | Pp. 537-544
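A widely used baseline for ranking in constrained evolutionary optimization is the feasibility-rules scheme: feasible solutions are ranked by objective value and always precede infeasible ones, which are ranked by constraint violation. The paper's scheme additionally exploits boundary solutions, so the helper below conveys only the baseline idea; its names are illustrative.

```python
def feasibility_rank_key(individual, objective, violation):
    """Sort key for Deb-style feasibility rules (minimisation):
    (0, f(x)) for feasible solutions, (1, v(x)) for infeasible ones,
    so every feasible solution outranks every infeasible one."""
    v = violation(individual)
    if v == 0:
        return (0, objective(individual))
    return (1, v)
```

Usage: `sorted(population, key=lambda x: feasibility_rank_key(x, f, v))` gives the selection order directly.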

Population Climbing Evolutionary Algorithm for Multimodal Function Global Optimization

Chen Ziyi; Kang Lishan

This paper presents a population climbing evolutionary algorithm (PCEA) for function optimization problems with multiple global optima. The algorithm combines a multi-parent crossover operator with a complete local search. The multi-parent crossover operator enables individuals to draw closer to each optimal solution, so the population divides into subpopulations automatically; meanwhile, the local search enables each individual to converge to the nearest optimal solution within the same basin of attraction. In this way, each individual converges to a global optimum, and the population maintains all global optima. Compared with other algorithms, it has the following advantages: (1) the algorithm is very simple, with little computational complexity; (2) it needs no additional problem-dependent control parameter. The experimental results show that PCEA is very efficient for the optimization of multimodal functions; it usually obtains all the global optimal solutions in a single run.

- Evolutionary Optimisation | Pp. 553-559
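The abstract does not specify the multi-parent crossover; one plausible, commonly used form is a random convex combination of the parents, which pulls the child toward the region the parents span. The sketch below assumes that form with illustrative names.

```python
import random

def multi_parent_crossover(parents, rng=None):
    """Child = random convex combination of m parents: weights are drawn
    uniformly and normalised to sum to 1, so each child coordinate stays
    within the per-dimension range spanned by the parents."""
    rng = rng or random.Random(0)
    ws = [rng.random() for _ in parents]
    total = sum(ws)
    ws = [w / total for w in ws]
    dim = len(parents[0])
    return [sum(w * p[d] for w, p in zip(ws, parents)) for d in range(dim)]
```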

Nucleus Classification and Recognition of Uterine Cervical Pap-Smears Using Fuzzy ART Algorithm

Kwang-Baek Kim; Sungshin Kim; Kwee-Bo Sim

Segmentation of the nucleus region in images of uterine cervical cytodiagnosis is known as the most difficult and important part of an automatic cervical cancer recognition system. In this paper, the nucleus region is extracted from an image of uterine cervical cytodiagnosis using the HSI model. The characteristics of the nucleus are extracted through analysis of morphometric, densitometric, colorimetric, and textural features based on the detected nucleus area. The classification criterion for a nucleus is defined according to the standard categories of the Bethesda system. The fuzzy ART algorithm is applied to the extracted nuclei, and the results show that the proposed method is efficient in nucleus recognition and classification for uterine cervical Pap smears.

- Hybrid Learning | Pp. 560-567
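A minimal fuzzy ART pass over feature vectors, using the standard complement coding, choice function, and vigilance test; the parameter values below are illustrative, not the paper's settings.

```python
def fuzzy_art(inputs, rho=0.75, alpha=0.001, beta=1.0):
    """Cluster feature vectors (components in [0, 1]) with fuzzy ART.
    rho: vigilance, alpha: choice parameter, beta: learning rate.
    Returns the category index assigned to each input."""
    weights, labels = [], []
    for x in inputs:
        i = list(x) + [1.0 - a for a in x]             # complement coding
        best_choice, best_j = None, -1
        for j, w in enumerate(weights):
            m = sum(min(p, q) for p, q in zip(i, w))   # |i ^ w| (fuzzy AND)
            if m / sum(i) >= rho:                      # vigilance test
                choice = m / (alpha + sum(w))
                if best_choice is None or choice > best_choice:
                    best_choice, best_j = choice, j
        if best_j == -1:                               # commit a new category
            weights.append(list(i))
            best_j = len(weights) - 1
        else:                                          # fast/partial learning
            w = weights[best_j]
            weights[best_j] = [beta * min(p, q) + (1 - beta) * q
                               for p, q in zip(i, w)]
        labels.append(best_j)
    return labels
```

Raising `rho` makes categories narrower, which is how an ART network trades off generality against precision when grouping nucleus features.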

Learning Bayesian Networks Structures Based on Memory Binary Particle Swarm Optimization

Xiao-Lin Li; Shuang-Cheng Wang; Xiang-Dong He

This paper describes a new data mining algorithm that learns Bayesian network structures based on a memory binary particle swarm optimization method and the Minimum Description Length (MDL) principle. A memory binary particle swarm optimization (MBPSO) is proposed, in which a memory influence is added to binary particle swarm optimization. The purpose of the added memory feature is to prevent and overcome premature convergence by providing particle-specific alternate target points to be used at times instead of the particle's current best position. In addition, like some previous work, our algorithm does not need a complete variable ordering as input. The experimental results illustrate that our algorithm not only improves the quality of the solutions but also reduces the time cost.

- Hybrid Learning | Pp. 568-574
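The binary PSO substrate that MBPSO builds on updates velocities as in real-valued PSO and maps them through a sigmoid to bit-flip probabilities. The sketch below shows only that substrate; the memory-influence term is described just qualitatively in the abstract, so it is omitted, and all parameter names are illustrative.

```python
import math
import random

def bpso_step(x, v, pbest, gbest, w=0.7, c1=1.5, c2=1.5, rng=None):
    """One binary-PSO update: real-valued velocity update, then a
    sigmoid turns each velocity into the probability that the
    corresponding bit of the new position is 1."""
    rng = rng or random.Random(0)
    new_v, new_x = [], []
    for xi, vi, pi, gi in zip(x, v, pbest, gbest):
        vel = (w * vi
               + c1 * rng.random() * (pi - xi)
               + c2 * rng.random() * (gi - xi))
        prob = 1.0 / (1.0 + math.exp(-vel))    # sigmoid squashing
        new_v.append(vel)
        new_x.append(1 if rng.random() < prob else 0)
    return new_x, new_v
```

For structure learning, each bit would encode the presence of one candidate edge, with an MDL score as the fitness.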