Publication catalog - books



Linear Genetic Programming

Markus F. Brameier Wolfgang Banzhaf

Abstract/Description – provided by the publisher

Not available.

Keywords – provided by the publisher

Artificial Intelligence (incl. Robotics); Theory of Computation; Computing Methodologies

Availability

SpringerLink, 2007

Information

Resource type:

Books

Print ISBN

978-0-387-31029-9

Electronic ISBN

978-0-387-31030-5

Publisher

Springer Nature

Country of publication

United Kingdom

Publication date

2007

Publication rights information

© Springer Science+Business Media, LLC 2007

Table of contents

Introduction

Markus F. Brameier; Wolfgang Banzhaf

Natural evolution has always been a source of inspiration for science and engineering. “Endless forms, most beautiful”, as Darwin put it. Who would not marvel at the unbelievable variety of life, who would not admire its complexity? A process that could bring about such wonders in nature, couldn't we glean from it some tricks that would be useful for our own activities? Couldn't we learn some methods for the design of other systems, for instance, machines and their behaviors? It is along these lines of thought that algorithms were conceived, able to catch aspects of natural evolution.

- Introduction | Pp. 1-9

Basic Concepts of Linear Genetic Programming

Markus F. Brameier; Wolfgang Banzhaf

In this chapter linear genetic programming (LGP) will be explored in further detail. The basis of the specific linear GP variant we want to investigate in this book will be described, in particular the programming language used for evolution, the representation of individuals, and the specific evolutionary algorithm employed. This will form the core of our LGP system, while fundamental concepts of linear GP will also be discussed, including various forms of program execution. Linear GP operates with imperative programs. All discussions and experiments in this book are conducted independently from a special type of programming language or processor architecture. Even though genetic programs are interpreted and partly written in the high-level language C, the applied programming concepts exist principally in or may be translated into most modern imperative programming languages, down to the level of machine languages.
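To make the imperative, register-based representation described above more concrete, the following sketch shows one way such a program and its interpreter might look in C. The instruction fields, the register count, the small operator set, and the choice of register r0 as output are assumptions made for this example, not the book's exact implementation.

/* Minimal sketch of a register-based linear genetic program and its
   interpreter.  Field names, register conventions and the operator set
   are illustrative assumptions, not the book's exact implementation. */
#include <stdio.h>

enum Op { ADD, SUB, MUL, DIV };          /* small instruction set */

typedef struct {
    enum Op op;       /* operator                           */
    int dst;          /* destination (calculation) register */
    int src1, src2;   /* operand registers                  */
} Instruction;

#define NUM_REGISTERS 8

/* Execute the instructions in sequence on a register file;
   register 0 is assumed to hold the program output. */
double run_program(const Instruction *prog, int len,
                   const double *input, int n_inputs)
{
    double r[NUM_REGISTERS] = { 0.0 };
    for (int i = 0; i < n_inputs && i < NUM_REGISTERS; i++)
        r[i] = input[i];                  /* load inputs into registers */

    for (int i = 0; i < len; i++) {
        double a = r[prog[i].src1], b = r[prog[i].src2];
        switch (prog[i].op) {
        case ADD: r[prog[i].dst] = a + b; break;
        case SUB: r[prog[i].dst] = a - b; break;
        case MUL: r[prog[i].dst] = a * b; break;
        case DIV: r[prog[i].dst] = (b != 0.0) ? a / b : a; break;  /* protected division */
        }
    }
    return r[0];
}

int main(void)
{
    /* r0 = (x + y) * x, written as two imperative instructions */
    Instruction prog[] = {
        { ADD, 2, 0, 1 },   /* r2 = r0 + r1 */
        { MUL, 0, 2, 0 },   /* r0 = r2 * r0 */
    };
    double input[] = { 3.0, 4.0 };
    printf("%f\n", run_program(prog, 2, input, 2));   /* prints 21.000000 */
    return 0;
}

Here the two-instruction program computes r0 = (x + y) * x from the inputs loaded into r0 and r1; any imperative computation over a fixed register set can be encoded and executed in this style.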

Part I - Fundamental Analysis | Pp. 13-34

Characteristics of the Linear Representation

Markus F. Brameier; Wolfgang Banzhaf

In the first instance linear genetic programming has been introduced for the benefit of shorter execution time. Genetic programs written as binary machine code do not have to pass through a time-consuming interpretation step (see Section 2.2). In this chapter we investigate other, more general features of the linear representation. One basic difference to a tree representation is the emergence of unused code parts in linear genetic programs that are independent of program semantics. Another fundamental difference is that the data flow in a linear genetic program has a directed graph structure that is more general than a tree.
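The notion of unused (structurally noneffective) code can be illustrated with a small sketch: a single backward pass over the instruction sequence marks every instruction that can still influence the output register. This follows the general idea of effective-code detection for linear programs; the data structures and the restriction to branch-free code are simplifications made for this example.

/* Sketch: identify structurally noneffective instructions (introns) in a
   branch-free linear program by a single backward pass.  The Instruction
   type from the previous sketch is repeated here so the example stands
   alone; register 0 is assumed to be the output register. */
#include <stdio.h>

enum Op { ADD, SUB, MUL, DIV };

typedef struct {
    enum Op op;
    int dst, src1, src2;
} Instruction;

#define NUM_REGISTERS 8

/* effective[i] becomes 1 if instruction i can influence the value left in
   the output register, 0 if it is a structural intron. */
void mark_effective(const Instruction *prog, int len,
                    int output_reg, int *effective)
{
    int needed[NUM_REGISTERS] = { 0 };
    needed[output_reg] = 1;

    for (int i = len - 1; i >= 0; i--) {
        if (needed[prog[i].dst]) {
            effective[i] = 1;
            needed[prog[i].dst]  = 0;   /* value is (re)written here ...   */
            needed[prog[i].src1] = 1;   /* ... and depends on the operands */
            needed[prog[i].src2] = 1;
        } else {
            effective[i] = 0;           /* structural intron */
        }
    }
}

int main(void)
{
    /* The second instruction writes r3, which is never read on the way to
       the output register r0 -- it is structurally noneffective code. */
    Instruction prog[] = {
        { ADD, 2, 0, 1 },   /* r2 = r0 + r1 */
        { MUL, 3, 2, 2 },   /* r3 = r2 * r2   <-- intron */
        { SUB, 0, 2, 1 },   /* r0 = r2 - r1 */
    };
    int effective[3];
    mark_effective(prog, 3, 0, effective);
    for (int i = 0; i < 3; i++)
        printf("instruction %d: %s\n", i, effective[i] ? "effective" : "intron");
    return 0;
}

Running the example marks the first and third instructions as effective and the second as an intron, because the value it writes to r3 is never read on any path to the output register.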

Part I - Fundamental Analysis | Pp. 35-62

A Comparison with Neural Networks

Markus F. Brameier; Wolfgang Banzhaf

We reported on LGP applied to a number of medical classification tasks. It was demonstrated that, on average, genetic programming performs competitively with RPROP neural networks with respect to generalization performance.

The runtime performance of genetic programming becomes especially important for time-critical applications or when operating with large data sets from real-world domains like medicine. Two techniques were presented that reduced the computational costs significantly.

First, the elimination of noneffective code from linear genetic programs reduced runtime by an average factor of about 5. Second, by using a demetic population in combination with an elitist migration strategy, the number of effective generations was reduced by a factor of about 3, without decreasing the performance of the evolutionary algorithm.
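The demetic population with elitist migration mentioned above can be sketched roughly as follows: after each generation, the best individual of every subpopulation (deme) is copied into the next deme of a ring, replacing that deme's worst individual. The Individual structure, the ring topology, and the migration interval are assumptions made for this example, not necessarily the configuration used in the book.

/* Sketch of elitist migration in a demetic (multi-population) EA: after
   each generation the best individual of every deme replaces the worst
   individual of the next deme in a ring. */
#include <stdio.h>

#define NUM_DEMES 4
#define DEME_SIZE 5

typedef struct {
    double fitness;   /* lower is better (error) in this sketch         */
    /* ... the genome (linear program) would be stored here as well ... */
} Individual;

static int best_index(const Individual *deme)
{
    int best = 0;
    for (int i = 1; i < DEME_SIZE; i++)
        if (deme[i].fitness < deme[best].fitness) best = i;
    return best;
}

static int worst_index(const Individual *deme)
{
    int worst = 0;
    for (int i = 1; i < DEME_SIZE; i++)
        if (deme[i].fitness > deme[worst].fitness) worst = i;
    return worst;
}

/* Elitist ring migration: deme d sends a copy of its best individual to
   deme (d+1) % NUM_DEMES, overwriting that deme's worst individual. */
void migrate(Individual demes[NUM_DEMES][DEME_SIZE])
{
    Individual emigrants[NUM_DEMES];
    for (int d = 0; d < NUM_DEMES; d++)
        emigrants[d] = demes[d][best_index(demes[d])];
    for (int d = 0; d < NUM_DEMES; d++) {
        int target = (d + 1) % NUM_DEMES;
        demes[target][worst_index(demes[target])] = emigrants[d];
    }
}

int main(void)
{
    Individual demes[NUM_DEMES][DEME_SIZE];
    /* arbitrary fitness values, just to demonstrate the migration step */
    for (int d = 0; d < NUM_DEMES; d++)
        for (int i = 0; i < DEME_SIZE; i++)
            demes[d][i].fitness = (double)(d * DEME_SIZE + i + 1);

    migrate(demes);

    for (int d = 0; d < NUM_DEMES; d++)
        printf("deme %d best fitness after migration: %.1f\n",
               d, demes[d][best_index(demes[d])].fitness);
    return 0;
}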

Part I - Fundamental Analysis | Pp. 63-74

Linear Genetic Operators I — Segment Variations

Markus F. Brameier; Wolfgang Banzhaf

Crossover has been the traditional operator in tree-based GP for varying the content and size of programs. In this chapter we systematically introduce crossover and mutation operators for the linear program representation and compare their influence on prediction performance and the complexity of evolved solutions.

We can distinguish between two different levels of variation done by these operators. Macro variations operate on the instruction level (or macro level). In this perspective, an instruction represents the smallest unit. Micro variations operate on the level of instruction components (micro level) and manipulate registers, operators, and constants. Only macro variations influence program growth. Macro variations may be further divided into recombinations and mutations.

We will see that the performance of a variation operator strongly depends on its maximum (and average) step size on the symbolic program structure, on its influence on code growth, and on the proportion of effective and neutral variations. Among other things, macro mutations with minimum step size will turn out to be most effective provided that a change of the structurally effective code can be guaranteed. We will also investigate how linear genetic programs can be manipulated more efficiently through respecting their functional structure.
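As an illustration of a macro variation on the linear representation, the following sketch implements a simple segment crossover: a randomly chosen segment of one parent is replaced by a randomly chosen segment of the other parent. The program layout, the segment length limit, and the way the random positions are drawn are assumptions made for this example, not the book's exact operator.

/* Sketch of segment crossover between two linear programs: a randomly
   chosen segment of one parent replaces a randomly chosen segment of the
   other. */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

enum Op { ADD, SUB, MUL, DIV };
typedef struct { enum Op op; int dst, src1, src2; } Instruction;

#define MAX_LEN     64
#define MAX_SEGMENT 5      /* upper bound on exchanged segment length */

typedef struct {
    Instruction code[MAX_LEN];
    int len;
} Program;

/* Replace a random segment of dst_parent by a random segment of src_parent
   and write the result to child.  Returns 0 if the child would exceed MAX_LEN. */
int segment_crossover(const Program *dst_parent, const Program *src_parent,
                      Program *child)
{
    int l1 = 1 + rand() % MAX_SEGMENT;   /* length of removed segment  */
    int l2 = 1 + rand() % MAX_SEGMENT;   /* length of inserted segment */
    if (l1 > dst_parent->len) l1 = dst_parent->len;
    if (l2 > src_parent->len) l2 = src_parent->len;

    int p1 = rand() % (dst_parent->len - l1 + 1);   /* segment start in dst_parent */
    int p2 = rand() % (src_parent->len - l2 + 1);   /* segment start in src_parent */

    int new_len = dst_parent->len - l1 + l2;
    if (new_len > MAX_LEN) return 0;

    int k = 0;
    for (int i = 0; i < p1; i++)                    child->code[k++] = dst_parent->code[i];
    for (int i = p2; i < p2 + l2; i++)              child->code[k++] = src_parent->code[i];
    for (int i = p1 + l1; i < dst_parent->len; i++) child->code[k++] = dst_parent->code[i];
    child->len = new_len;
    return 1;
}

int main(void)
{
    srand((unsigned)time(NULL));

    Program a = { .len = 6 }, b = { .len = 6 }, child;
    for (int i = 0; i < 6; i++) {
        a.code[i] = (Instruction){ ADD, 0, 1, 2 };
        b.code[i] = (Instruction){ MUL, 0, 1, 2 };
    }
    if (segment_crossover(&a, &b, &child))
        printf("child length: %d\n", child.len);
    return 0;
}

Because the removed and inserted segments may differ in length, repeated application of such an operator changes program size, which is why the maximum segment length acts as a bound on the variation step size discussed above.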

Part II - Method Design | Pp. 77-118

Linear Genetic Operators II — Instruction Mutations

Markus F. Brameier; Wolfgang Banzhaf

The experimental results from Chapter 5 have confirmed two important assumptions. First, when using recombination, best results were obtained with a relatively low limit to segment length. Second, segment recombination has not been found to be more powerful than segment mutation. Both aspects motivate the use of mutations that affect single instructions only. The following considerations try to point out why linear programs in particular are likely to be served better by using only minimal mutation operations.
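A minimal sketch of such an instruction-level (macro) mutation is given below: with roughly equal probability a single randomly created instruction is inserted at a random position, or a single randomly chosen instruction is deleted (with guards at the minimum and maximum program length). The instruction format and the probability settings are assumptions for this example; the effective mutation variant discussed in the book, which guarantees a change of the structurally effective code, is not modeled here.

/* Sketch of an instruction-level (macro) mutation: insert one randomly
   created instruction at a random position or delete one randomly chosen
   instruction, with guards at the length limits. */
#include <stdio.h>
#include <stdlib.h>

enum Op { ADD, SUB, MUL, DIV };
typedef struct { enum Op op; int dst, src1, src2; } Instruction;

#define MAX_LEN       64
#define NUM_REGISTERS 8

typedef struct { Instruction code[MAX_LEN]; int len; } Program;

static Instruction random_instruction(void)
{
    Instruction ins;
    ins.op   = (enum Op)(rand() % 4);
    ins.dst  = rand() % NUM_REGISTERS;
    ins.src1 = rand() % NUM_REGISTERS;
    ins.src2 = rand() % NUM_REGISTERS;
    return ins;
}

void instruction_mutation(Program *p)
{
    if (p->len < MAX_LEN && (p->len <= 1 || rand() % 2 == 0)) {
        /* insertion: shift the tail one slot to the right */
        int pos = rand() % (p->len + 1);
        for (int i = p->len; i > pos; i--)
            p->code[i] = p->code[i - 1];
        p->code[pos] = random_instruction();
        p->len++;
    } else if (p->len > 1) {
        /* deletion: shift the tail one slot to the left */
        int pos = rand() % p->len;
        for (int i = pos; i < p->len - 1; i++)
            p->code[i] = p->code[i + 1];
        p->len--;
    }
}

int main(void)
{
    srand(42);                      /* fixed seed for reproducibility */
    Program p = { .len = 0 };
    for (int i = 0; i < 10; i++)
        instruction_mutation(&p);
    printf("program length after 10 mutations: %d\n", p.len);
    return 0;
}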

Part II - Method Design | Pp. 119-148

Analysis of Control Parameters

Markus F. Brameier; Wolfgang Banzhaf

In the previous two chapters parameters have been analyzed that are closely related to one of the variation operators. In this chapter we analyze influences of more general system parameters that are relevant in linear genetic programming. In particular, the number of registers, the number of constants, the population size, and the maximum program length will be studied. Additionally, we compare different initialization techniques for linear genetic programs. Test problems are again the approximation problem mexican hat and the classification problem introduced in Section 5.8.1.
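For reference, one commonly used two-dimensional form of the mexican hat (sombrero) function is

\[
f_{\text{mexican hat}}(x_1, x_2) = \left(1 - \frac{x_1^{2}}{4} - \frac{x_2^{2}}{4}\right) e^{-\frac{x_1^{2} + x_2^{2}}{8}}
\]

though the exact definition and input ranges used as the benchmark in the book may differ from this form.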

Part II - Method Design | Pp. 149-171

A Comparison with Tree-Based Genetic Programming

Markus F. Brameier; Wolfgang Banzhaf

In this chapter a comparison between the linear representation and the traditional tree representation of genetic programming is performed. The comparison examines prediction performance and model size based on two collections of benchmark problems that have been composed of artificial test problems and of real-world applications from bioinformatics, respectively. Both linear GP and tree-based GP use crossover for macro variations. Additionally, we apply the linear GP variant from Section 6.2.3 which works exclusively with (effective) instruction mutations to compare its performance for a larger number of problems. But first of all, we introduce tree-based GP in some further detail.

Part II - Method Design | Pp. 173-192

Control of Diversity and Variation Step Size

Markus F. Brameier; Wolfgang Banzhaf

In this chapter we will investigate structural and semantic distance metrics for linear genetic programs. A causal connection between changes of genotypes and changes of phenotypes forms a necessary condition for being able to control differences between genetic programs. The two objectives of this chapter are to show (1) how distance information between individuals can be used to control structural diversity of individuals and (2) how variation distance on effective code can be controlled probabilistically with different linear genetic operators.
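To give a flavor of what a structural (genotype) distance between linear programs can look like, the sketch below counts how many instruction components differ when two programs are aligned position by position; unmatched positions count as fully different. This is a deliberate simplification for illustration only, not the (effective) distance metric developed in the book.

/* Sketch of a crude structural distance between two linear programs:
   count how many instruction components (operator, destination, operands)
   differ under a position-wise alignment; positions present in only one
   program count as fully different. */
#include <stdio.h>

enum Op { ADD, SUB, MUL, DIV };
typedef struct { enum Op op; int dst, src1, src2; } Instruction;

int structural_distance(const Instruction *a, int len_a,
                        const Instruction *b, int len_b)
{
    int common = (len_a < len_b) ? len_a : len_b;
    int longer = (len_a > len_b) ? len_a : len_b;
    int dist = 0;

    for (int i = 0; i < common; i++) {
        if (a[i].op   != b[i].op)   dist++;
        if (a[i].dst  != b[i].dst)  dist++;
        if (a[i].src1 != b[i].src1) dist++;
        if (a[i].src2 != b[i].src2) dist++;
    }
    dist += 4 * (longer - common);   /* unmatched instructions: all 4 components differ */
    return dist;
}

int main(void)
{
    Instruction p1[] = { { ADD, 2, 0, 1 }, { MUL, 0, 2, 0 } };
    Instruction p2[] = { { ADD, 2, 0, 1 }, { SUB, 0, 2, 1 }, { MUL, 0, 0, 0 } };
    printf("distance: %d\n", structural_distance(p1, 2, p2, 3));   /* prints 6 */
    return 0;
}

A distance of this kind, averaged over the population, gives one possible measure of structural diversity that selection could then act upon.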

Part III - Advanced Techniques and Phenomena | Pp. 195-223

Code Growth and Neutral Variations

Markus F. Brameier; Wolfgang Banzhaf

This chapter brings together theories about neutral variations and code growth in genetic programming. We argue that neutral variations are important for the growth of code in GP runs. Other existing theories about code growth are verified for linear GP and are partly reevaluated from a different perspective.

In evolutionary computation neutral variations are argued to explore flat regions of the fitness landscape while non-neutral variations exploit regions with gradient information. We investigate the influence of different variation effects on the growth of code and the prediction quality for different kinds of variation operators. It is well known that a high proportion of neutral code (introns) in genetic programs may increase the probability for variations to be neutral. But which type of variation creates the introns in the first place? For linear GP with minimum mutation step size, results show that neutral variations almost exclusively represent a cause for (rather than a result of) the emergence and growth of intron code. This part of the chapter is a continuation of our earlier studies [25].

We also examine different linear genetic operators regarding an implicit length bias. In contrast to an explicit bias, implicit bias does not result from the dynamics of the operator alone, but requires the existence of a fitness pressure.

We will close this chapter with a discussion about how to control code growth in linear GP. Different approaches are reviewed including variation-based methods and selection-based methods. Both may be applied specifically to effective code and/or to noneffective code.

Part III - Advanced Techniques and Phenomena | Pp. 225-260