Publications catalog - books

Mining Sequential Patterns from Large Data Sets

Wei Wang; Jiong Yang

Abstract/description – provided by the publisher

Not available.

Keywords – provided by the publisher

Data Mining and Knowledge Discovery; Database Management; Information Storage and Retrieval; Data Structures; Multimedia Information Systems; Computer Communication Networks

Availability
Detected institution: Not detected
Year of publication: 2005
Browse: SpringerLink

Information

Resource type:

books

Print ISBN

978-0-387-24246-0

Electronic ISBN

978-0-387-24247-7

Publisher

Springer Nature

Country of publication

United Kingdom

Publication date

Publication rights information

© Springer Science+Business Media, Inc. 2005

Table of contents

Introduction

Wei Wang; Jiong Yang

Regarding the scientific method of experimentation, it is desirable to construct an accurate, reliable, consistent and non-arbitrary representation of multi-objective evolutionary algorithm (MOEA) architectures and performance over a variety of multi-objective optimization problems (MOPs). In particular, through the use of standard procedures and criteria, one should attempt to minimize the influence of the experimenter's bias or prejudice when testing a MOEA hypothesis. The design of each experiment must then conform to an accepted “standard” approach, as reflected in any generic scientific method. When employing the scientific method, the detailed design of MOEA experiments can draw heavily on the outlines presented by Barr et al. [93] and Jackson et al. [765]. These articles discuss computational experiment design for heuristic methods, providing guidelines for reporting results and ensuring their reproducibility. Specifically, they suggest that a well-designed experiment follows these steps: 1. Define experimental goals; 2. Choose measures of performance (metrics); 3. Design and execute the experiment; 4. Analyze the data and draw conclusions; 5. Report experimental results.

Pp. 1-3
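The five-step experiment workflow listed in the chapter description above is generic enough to sketch in code. The following is a minimal illustration only, not material from the book: the benchmark, the run_trial function, and the single quality metric are hypothetical placeholders standing in for a real heuristic and a real performance measure.

```python
# Minimal sketch of the five-step computational-experiment workflow.
# All names (run_trial, the benchmark, the score distribution) are hypothetical.

import random
import statistics


def run_trial(seed: int) -> float:
    """Step 3 (execute): one run of a hypothetical stochastic heuristic,
    returning a single quality score. A stand-in for a real solver."""
    rng = random.Random(seed)
    return rng.gauss(mu=0.8, sigma=0.05)  # pretend solution quality


def main() -> None:
    # Step 1: define the experimental goal.
    goal = "Estimate the mean solution quality of heuristic H on benchmark B"

    # Step 2: choose measures of performance (here: mean and stdev of quality).
    n_runs = 30

    # Step 3: design and execute the experiment with fixed seeds 0..n_runs-1.
    scores = [run_trial(seed) for seed in range(n_runs)]

    # Step 4: analyze the data.
    mean = statistics.mean(scores)
    stdev = statistics.stdev(scores)

    # Step 5: report results in a form another experimenter can reproduce.
    print(goal)
    print(f"runs={n_runs}  mean={mean:.3f}  stdev={stdev:.3f}")


if __name__ == "__main__":
    main()
```

Fixing the seeds and reporting them alongside the metrics is what makes the reported numbers repeatable by another experimenter, which is the reproducibility requirement the guidelines emphasize.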

Related Work

Wei Wang; Jiong Yang

Pp. 5-12

Periodic Patterns

Wei Wang; Jiong Yang

Pp. 13-61

Statistically Significant Patterns

Wei Wang; Jiong Yang

Pp. 63-112

Approximate Patterns

Wei Wang; Jiong Yang

Pp. 113-160

Conclusion Remark

Wei Wang; Jiong Yang

Pp. 161-161