Publications catalogue - books

Simulated Evolution and Learning: 6th International Conference, SEAL 2006, Hefei, China, October 15-18, 2006, Proceedings

Tzai-Der Wang ; Xiaodong Li ; Shu-Heng Chen ; Xufa Wang ; Hussein Abbass ; Hitoshi Iba ; Guo-Liang Chen ; Xin Yao (eds.)

Conference: 6th Asia-Pacific Conference on Simulated Evolution and Learning (SEAL). Hefei, China. October 15-18, 2006

Abstract/Description – provided by the publisher

Not available.

Keywords – provided by the publisher

Computation by Abstract Devices; Artificial Intelligence (incl. Robotics); Simulation and Modeling; User Interfaces and Human Computer Interaction; Discrete Mathematics in Computer Science; Computer Appl. in Social and Behavioral Sciences

Availability

Detected institution: not detected
Year of publication: 2006
Available via: SpringerLink

Information

Resource type:

books

Print ISBN

978-3-540-47331-2

Electronic ISBN

978-3-540-47332-9

Publisher

Springer Nature

Country of publication

United Kingdom

Publication date

Publication rights information

© Springer-Verlag Berlin Heidelberg 2006

Table of contents

Dual Guidance in Evolutionary Multi-objective Optimization by Localization

Lam T. Bui; Kalyanmoy Deb; Hussein A. Abbass; Daryl Essam

In this paper, we propose a framework using local models for multi-objective optimization to guide the search heuristic in both the decision and objective spaces. The localization is built using a limited number of adaptive spheres in the decision space. These spheres are guided, using direction information, towards areas of the decision space containing non-dominated solutions. A second mechanism adjusts the spheres to specialize on different parts of the Pareto front, using the guided dominance technique in the objective space. With this dual guidance, we can easily steer spheres towards different parts of the Pareto front while also exploring the decision space efficiently.

- Evolutionary Optimisation | Pp. 384-391
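
As a rough sketch of the localization idea described above (this is not the authors' implementation; the step size, sampling rule, and toy objectives are assumptions), a sphere's centre can be pulled towards the mean of its non-dominated members before new candidates are sampled inside it:

    import numpy as np

    def dominates(a, b):
        # a Pareto-dominates b (minimization)
        return np.all(a <= b) and np.any(a < b)

    def nondominated_indices(F):
        # indices of rows of F not dominated by any other row
        return [i for i in range(len(F))
                if not any(dominates(F[j], F[i]) for j in range(len(F)) if j != i)]

    def guide_sphere(center, radius, X, F, step=0.5, rng=np.random.default_rng()):
        # pull the sphere centre towards the mean of its non-dominated members,
        # then resample new candidate solutions inside the moved sphere
        idx = nondominated_indices(F)
        direction = X[idx].mean(axis=0) - center
        new_center = center + step * direction
        samples = new_center + radius * rng.uniform(-1.0, 1.0, size=X.shape)
        return new_center, samples

    # toy usage: 10 two-dimensional solutions, two conflicting objectives
    X = np.random.rand(10, 2)
    F = np.column_stack([X.sum(axis=1), (1.0 - X).sum(axis=1)])
    center, offspring = guide_sphere(X.mean(axis=0), 0.1, X, F)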

Hybridisation of Particle Swarm Optimization and Fast Evolutionary Programming

Jingsong He; Zhengyu Yang; Xin Yao

Particle swarm optimization (PSO) and fast evolutionary programming (FEP) are two widely used population-based optimisation algorithms. The ideas behind these two algorithms are quite different. While PSO is very efficient at converging locally to an optimum, owing to its use of directional information, FEP is better at global exploration and at finding a near-optimum globally. This paper proposes a novel hybridisation of PSO and FEP, i.e., fast PSO (FPSO), in which the strengths of PSO and FEP are combined. In particular, the ideas behind Gaussian and Cauchy mutations are incorporated into PSO. The new FPSO has been tested on a number of benchmark functions. The preliminary results show that FPSO outperformed both PSO and FEP significantly.

- Evolutionary Optimisation | Pp. 392-399
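
A minimal sketch of the kind of hybrid step the abstract describes, assuming a standard PSO update followed by an FEP-style Gaussian/Cauchy perturbation (the parameter values and the mutation rule are illustrative, not taken from the paper):

    import numpy as np

    def fpso_step(x, v, pbest, gbest, w=0.7, c1=1.5, c2=1.5, mut_prob=0.1,
                  rng=np.random.default_rng()):
        # standard PSO velocity/position update
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        # FEP-style perturbation: Cauchy jumps are long-tailed (global moves),
        # Gaussian jumps are local refinements
        mutate = rng.random(len(x)) < mut_prob
        jump = np.where(rng.random(x.shape) < 0.5,
                        rng.standard_cauchy(x.shape),
                        rng.standard_normal(x.shape))
        x[mutate] += jump[mutate]
        return x, v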

An Improved Particle Swarm Pareto Optimizer with Local Search and Clustering

Ching-Shih Tsou; Hsiao-Hua Fang; Hsu-Hwa Chang; Chia-Hung Kao

In this paper, local search and clustering mechanisms are incorporated into Multi-Objective Particle Swarm Optimization (MOPSO). The local search mechanism prevents premature convergence and hence enhances the convergence of the optimizer towards the true Pareto-optimal front. The clustering mechanism reduces the nondominated solutions to a handful, so that the search can be sped up while the diversity of the nondominated solutions is maintained. The performance of this approach is evaluated on metrics from the literature. The results on a three-objective optimization problem show that the proposed Pareto optimizer is competitive with the Strength Pareto Evolutionary Algorithm (SPEA) in converging towards the front, and generates a well-distributed nondominated set.

- Evolutionary Optimisation | Pp. 400-407
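
The clustering step can be illustrated with a generic archive-pruning routine that keeps one representative per cluster of the objective vectors; the use of k-means here is an assumption for illustration, not necessarily the authors' clustering method:

    import numpy as np
    from scipy.cluster.vq import kmeans2

    def prune_archive(F, k=10):
        # reduce the non-dominated archive (objective vectors F) to at most k
        # representatives: one member per cluster, the one closest to its centroid
        F = np.asarray(F, dtype=float)
        if len(F) <= k:
            return np.arange(len(F))
        centroids, labels = kmeans2(F, k, minit='++')
        keep = []
        for c in range(k):
            members = np.flatnonzero(labels == c)
            if members.size:
                d = np.linalg.norm(F[members] - centroids[c], axis=1)
                keep.append(members[d.argmin()])
        return np.array(keep)

    # usage: keep only 10 well-spread points out of a 200-point archive
    archive = np.random.rand(200, 3)
    print(prune_archive(archive, k=10))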

A Hybrid Genetic Algorithm for Solving a Class of Nonlinear Bilevel Programming Problems

Hecheng Li; Yuping Wang

In this paper, a special nonlinear bilevel programming problem (BLPP), in which the follower's problem is a convex quadratic program, is transformed into an equivalent single-level programming problem by using the Karush-Kuhn-Tucker (K-K-T) conditions. To solve the equivalent problem effectively, firstly, a genetic algorithm is combined with an algorithm for the follower's problem: for a fixed leader decision, the optimal solution of the follower's problem can be obtained by that algorithm, and the resulting pair of leader and follower decisions is a feasible or approximately feasible solution of the transformed problem and is taken as a point in the population; secondly, based on the best individuals in the population, a special crossover operator is designed to generate high-quality individuals; finally, a new hybrid genetic algorithm is proposed for solving this class of bilevel programming problems. The simulation on 20 benchmark problems demonstrates the effectiveness of the proposed algorithm.

- Evolutionary Optimisation | Pp. 408-415
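
A hedged sketch of the bilevel structure the abstract works with (not the paper's K-K-T reformulation or its follower algorithm): for each fixed leader decision a convex follower problem is solved exactly, and the resulting pair is scored for the leader; the quadratic forms, bounds, and objectives below are toy assumptions:

    import numpy as np
    from scipy.optimize import minimize

    def follower_best_response(x, Q, c):
        # follower: minimize 0.5*y'Qy + (c + x)'y subject to y >= 0
        # (convex whenever Q is positive semidefinite)
        n = len(c)
        obj = lambda y: 0.5 * y @ Q @ y + (c + x) @ y
        res = minimize(obj, np.zeros(n), bounds=[(0.0, None)] * n)
        return res.x

    def leader_fitness(x, Q, c):
        y = follower_best_response(x, Q, c)                  # follower reacts optimally
        return float(np.sum(x**2) + np.sum((y - 1.0)**2))    # toy upper-level objective

    # a GA over the leader variable x would call leader_fitness for every individual
    Q = np.eye(2)
    c = np.array([-1.0, -1.0])
    print(leader_fitness(np.array([0.3, 0.3]), Q, c))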

A Synergistic Selection Strategy in the Genetic Algorithms

Ting Kuo

According to Neo-Darwinism, natural selection can be classified into three categories: directional selection, disruptive selection, and stabilizing selection. Traditional genetic algorithms can be viewed as a process of evolution based on directional selection, which gives more chances of reproduction to superior individuals. However, this strategy is sometimes myopic and apt to trap the search in a local optimum. Should we restrict genetic algorithms to directional selection? No! First, we show that stabilizing selection and disruptive selection are complementary and that hybridizing them may supersede directional selection. Then, we adopt an island model of parallel genetic algorithms in which the two types of selection strategy are applied to two subpopulations that evolve independently, with migration allowed between them periodically. Experimental results show that the cooperation of disruptive selection and stabilizing selection is an effective and robust strategy in genetic algorithms.

- Evolutionary Optimisation | Pp. 416-423
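
The two complementary pressures can be sketched as simple fitness transformations, as below; the exact transformations and the roulette-wheel selection are illustrative assumptions, not the paper's formulas:

    import numpy as np

    def disruptive_fitness(raw):
        # reward individuals far from the mean raw fitness (favour the extremes)
        return np.abs(raw - raw.mean())

    def stabilizing_fitness(raw):
        # reward individuals close to the mean raw fitness (favour the middle)
        dev = np.abs(raw - raw.mean())
        return dev.max() - dev

    def roulette(fitness, n, rng=np.random.default_rng()):
        total = fitness.sum()
        p = fitness / total if total > 0 else np.full(len(fitness), 1.0 / len(fitness))
        return rng.choice(len(fitness), size=n, p=p)

    # two islands, each under a different selection pressure; in the island model
    # described above, individuals would migrate between them periodically
    raw = np.random.rand(20)
    island_a_parents = roulette(disruptive_fitness(raw), 10)
    island_b_parents = roulette(stabilizing_fitness(raw), 10)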

Priority-Based Genetic Local Search and Its Application to the Traveling Salesman Problem

Jyh-Da Wei; D. T. Lee

Genetic algorithms and genetic local search are population-based general-purpose search algorithms. Nevertheless, most combinatorial optimization problems have critical requirements in their definition and are usually not easy to solve, owing to the difficulty of gene encoding. The traveling salesman problem is an example that requires each node to be visited exactly once. In this paper, we propose a genetic local search method with priority-based encoding. This method retains generality in applications, supports schema analysis during the search process, and is verified to obtain remarkable search results for the traveling salesman problem.

- Evolutionary Optimisation | Pp. 424-432
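
A small sketch of a priority-based (random-keys style) encoding for the TSP, under the assumption that a tour simply visits cities in decreasing order of their priority genes; the authors' decoder may differ, but the point of such encodings is that any real-valued chromosome decodes to a legal tour:

    import numpy as np

    def decode_tour(priorities):
        # visit cities in decreasing order of their priority genes, so any
        # real-valued chromosome decodes to a legal permutation (each city once)
        return list(np.argsort(-np.asarray(priorities)))

    def tour_length(tour, dist):
        return sum(dist[tour[i], tour[(i + 1) % len(tour)]] for i in range(len(tour)))

    # usage: random 6-city instance, one random chromosome
    rng = np.random.default_rng(0)
    cities = rng.random((6, 2))
    dist = np.linalg.norm(cities[:, None, :] - cities[None, :, :], axis=-1)
    chromosome = rng.random(6)
    print(decode_tour(chromosome), tour_length(decode_tour(chromosome), dist))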

Integrated the Simplified Interpolation and Clonal Selection into the Particle Swarm Optimization for Optimization Problems

Jing Wang; Xiaohua Zhang; Licheng Jiao

Particle Swarm Optimization (PSO) is gaining momentum as a simple and effective optimization technique. However, its performance on complex problems with multiple minima falls short of that of the Ant Colony Optimization (ACO) algorithm. The new algorithm, which we call Hybrid Particle Swarm Optimization, combines the ideas of particle swarm optimization with a clonal selection strategy and simplified quadratic interpolation (SQI), which are used to improve its local search ability and to escape from local optima. Simulation results on 14 benchmark test functions show that the hybrid algorithm is able to avoid premature convergence and find much better solutions at high speed.

- Evolutionary Optimisation | Pp. 433-440
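
The simplified quadratic interpolation step is commonly written as the coordinate-wise minimiser of a parabola fitted through three particles, as sketched below; whether the paper uses exactly this form is an assumption:

    import numpy as np

    def sqi_point(x1, f1, x2, f2, x3, f3, eps=1e-12):
        # coordinate-wise minimiser of the parabola through (x1,f1), (x2,f2), (x3,f3)
        num = (x2**2 - x3**2) * f1 + (x3**2 - x1**2) * f2 + (x1**2 - x2**2) * f3
        den = (x2 - x3) * f1 + (x3 - x1) * f2 + (x1 - x2) * f3
        return 0.5 * num / (den + eps)

    # usage: interpolate the global best with two other particles
    x_new = sqi_point(np.array([0.1, 0.2]), 0.5,
                      np.array([0.4, 0.1]), 0.9,
                      np.array([0.3, 0.7]), 1.2)
    print(x_new)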

Selection Enthusiasm

A. Agrawal; I. Mitchell

Selection Enthusiasm is a technique that allows weaker individuals in a population to compete with stronger individuals. In essence, each time an individual is selected, its enthusiasm for being selected again is diminished relative to the rest; conversely, the raw fitness of the unselected individuals is adjusted upwards. The fitness of an individual is therefore based on two parameters: objective fitness and Selection Enthusiasm. The effects of this technique are measured, and results show that using Selection Enthusiasm yields fitter individuals and a more diverse population.

- Evolutionary Optimisation | Pp. 449-456
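
A minimal sketch of the mechanism described above; the decay and boost factors, and the multiplicative form of the adjustment, are illustrative assumptions rather than values from the paper:

    import numpy as np

    def select_with_enthusiasm(raw_fitness, n_picks, decay=0.9, boost=1.05,
                               rng=np.random.default_rng()):
        # every pick damps the winner's enthusiasm and boosts everyone else's,
        # so repeatedly selected individuals gradually make room for weaker ones
        enthusiasm = np.ones_like(raw_fitness, dtype=float)
        chosen = []
        for _ in range(n_picks):
            effective = raw_fitness * enthusiasm
            p = effective / effective.sum()
            i = rng.choice(len(effective), p=p)
            chosen.append(int(i))
            enthusiasm[i] *= decay
            enthusiasm[np.arange(len(effective)) != i] *= boost
        return chosen, enthusiasm

    raw = np.array([5.0, 3.0, 1.0, 0.5])
    picks, enthusiasm = select_with_enthusiasm(raw, n_picks=6)
    print(picks, enthusiasm)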

Suggested Algorithms for Blood Cell Images

Weixing Wang; Qi Zhao; Wang Luo

Mathematical morphology is a powerful tool for image segmentation, and the watershed transform is popularly used for images containing multiple objects. In this paper, a new watershed algorithm that does not rely on accurate boundary information is presented for grey-scale image segmentation, and mathematical morphology is also used for cluster splitting. The results of running this algorithm show that blood cell images can be well segmented using the proposed algorithm. A genetic algorithm is suggested for further study.

- Evolutionary Optimisation | Pp. 465-472
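
For orientation, a generic marker-based watershed pipeline in scikit-image is sketched below; it illustrates the technique named in the abstract rather than the authors' boundary-free variant, and the Otsu thresholding and peak-detection parameters are assumptions:

    import numpy as np
    from scipy import ndimage as ndi
    from skimage.filters import threshold_otsu
    from skimage.feature import peak_local_max
    from skimage.segmentation import watershed

    def segment_cells(grey):
        # rough foreground mask, distance transform, one marker per distance peak,
        # then watershed on the inverted distance map to split touching cells
        mask = grey > threshold_otsu(grey)
        distance = ndi.distance_transform_edt(mask)
        peaks = peak_local_max(distance, labels=mask, min_distance=5)
        markers = np.zeros(grey.shape, dtype=int)
        markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)
        return watershed(-distance, markers, mask=mask)

    labels = segment_cells(np.random.rand(64, 64))   # placeholder grey-scale image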

An Adaptive Multi-objective Particle Swarm Optimization for Color Image Fusion

Yifeng Niu; Lincheng Shen

A novel adaptive multi-objective particle swarm optimization algorithm (AMOPSO-II) is proposed and used to search for the optimal color image fusion parameters, which achieve the optimal fusion indices. First, the AMOPSO-II algorithm is designed; then the model of color image fusion in YUV color space is established and appropriate evaluation indices are given; finally, AMOPSO-II is used to search for the optimal fusion parameters. AMOPSO-II uses a new crowding operator to improve the distribution of nondominated solutions along the Pareto front, and uses uniform design to obtain the optimal combination of its own parameters. Experimental results indicate that AMOPSO-II has better exploratory capabilities than MOPSO and AMOPSO-I, and that the approach to color image fusion based on AMOPSO-II realizes Pareto-optimal color image fusion.

- Evolutionary Optimisation | Pp. 473-480
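
A sketch of the fusion-model layer only, assuming a simple per-channel weighted blend in YUV space; the weighting scheme and the quality indices mentioned in the comments are illustrative assumptions, not the paper's model, and the fusion weights are what an algorithm like AMOPSO-II would search over:

    import cv2
    import numpy as np

    def fuse_yuv(img_a, img_b, w):
        # blend two RGB images channel-by-channel in YUV space; w = (wY, wU, wV)
        # are the decision variables a swarm algorithm would tune
        a = cv2.cvtColor(img_a, cv2.COLOR_RGB2YUV).astype(float)
        b = cv2.cvtColor(img_b, cv2.COLOR_RGB2YUV).astype(float)
        fused = np.empty_like(a)
        for ch in range(3):
            fused[..., ch] = w[ch] * a[..., ch] + (1.0 - w[ch]) * b[..., ch]
        fused = np.clip(fused, 0, 255).astype(np.uint8)
        return cv2.cvtColor(fused, cv2.COLOR_YUV2RGB)

    # each candidate weight vector would be scored on several quality indices
    # (e.g. entropy, average gradient) and only the non-dominated ones kept
    a = np.random.randint(0, 256, (32, 32, 3), dtype=np.uint8)
    b = np.random.randint(0, 256, (32, 32, 3), dtype=np.uint8)
    fused = fuse_yuv(a, b, (0.6, 0.5, 0.5))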