Publications catalog - books
Process Optimization: A Statistical Approach
Enrique Del Castillo
Abstract/Description – provided by the publisher
Not available.
Keywords – provided by the publisher
Not available.
Availability
| Detected institution | Publication year | Browse | Download | Request |
|---|---|---|---|---|
| Not detected | 2007 | SpringerLink | | |
Information
Resource type:
books
Print ISBN
978-0-387-71434-9
Electronic ISBN
978-0-387-71435-6
Publisher
Springer Nature
Country of publication
United Kingdom
Publication date
2007
Publication rights information
© Springer-Verlag US 2007
Subject coverage
Table of contents
An Overview of Empirical Process Optimization
Enrique Del Castillo
Every engineering system or process is designed with an intended purpose. The purpose frequently entails a desired performance in the operation of the product being manufactured or of the process that manufactures it. In many cases, engineering design activities involve tests or experimentation, since the product or process is not well understood and the desired performance cannot be guaranteed. Classical examples abound in Chemical Engineering, in which results from a pilot plant experiment are scaled up to the manufacturing site. In traditional discrete part manufacturing, e.g., machining processes, experimental design and analysis have been used to improve the performance of processes given the inherent noise in the various responses of interest. In designing new products, research and development groups run experiments, build models, and try to optimize responses related to the performance of the new product being designed. In this chapter, we provide an overview of these methods and introduce some basic terminology that will be used in later chapters.
Part I - Preliminaries | Pp. 3-25
Optimization Of First Order Models
Enrique Del Castillo
In Response Surface Methods, the optimal region to run a process is usually determined after a sequence of experiments is conducted and a series of empirical models are obtained. As mentioned in Chapter 1, in a new or poorly understood process it is likely that a first order model will fit well. The Box-Wilson methodology suggests the use of a steepest ascent technique coupled with lack of fit and curvature tests to move the process from a region of little curvature to one where curvature – and the presence of a stationary point – exists. In this chapter we discuss, at an elementary level, steepest ascent/descent methods for optimizing a process described by a first order model. We will assume readers are familiar with the linear regression material reviewed in Appendix A. More advanced techniques related to exploring a new region that incorporate the statistical inference issues into the optimization methods are discussed in Chapter 6.
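As a minimal sketch of the steepest ascent computation described above (the two-variable setup and coefficient values are hypothetical, not taken from the book), the direction of steepest ascent of a fitted first order model is simply proportional to its vector of first order coefficients:

```python
import numpy as np

# Hypothetical fitted first order model in coded variables:
#   y_hat = b0 + b1*x1 + b2*x2  (coefficient values are illustrative only)
b = np.array([2.5, -1.0])          # first order coefficients (b1, b2)
direction = b / np.linalg.norm(b)  # unit vector of steepest ascent

# Candidate runs along the steepest ascent path, one coded unit apart,
# starting from the center of the current experimental region
center = np.zeros_like(b)
path = [center + step * direction for step in range(1, 6)]
```

In practice, each point on such a path would be run as a confirmatory experiment until the response stops improving, in line with the Box-Wilson methodology.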
Part II - Elements of Response Surface Methods | Pp. 29-43
Experimental Designs For First Order Models
Enrique Del Castillo
From the late 1950s up to the late 1970s, a debate ensued in the Statistics community between two schools of thought in experimental design. The main point of contention was the practical utility of optimal experimental design theory, mainly developed by J. Kiefer and co-workers. We will refer to this school as the “optimal design school” in this chapter. As mentioned in Section 5.7, using optimality theory one designs an experiment that is optimal in some precisely defined way for a given model form; the design will not be optimal, and probably not even “robust”, if the true process obeys a model different from the assumed one. This point of view is held by G. Box and his co-workers, whom we will refer to here as the “Applied Statistics school”. The purpose of the present chapter is to introduce the main ideas behind this debate. Since very few practical conclusions resulted from the debate itself, the chapter is necessarily short (the methods developed by both schools have had a great impact, but that is a matter for other chapters in this book).
Part II - Elements of Response Surface Methods | Pp. 45-83
Analysis and Optimization of Second Order Models
Enrique Del Castillo
As can be seen from previous chapters, experimental design and process optimization are two intertwined tasks. Sequences of designed experiments are frequently run to optimize a process. In traditional RSM practice, such sequences often begin with first order designs with center runs that allow testing for curvature. If curvature is detected, second order experimental designs and models are used as a local approximation for process optimization. In this chapter we look at optimizing a second order model. Designs used to fit these models are described in Chapter 5.
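As an illustrative sketch of the canonical analysis this chapter covers (the fitted coefficients below are hypothetical), the stationary point of a fitted second order model written in matrix form, y_hat = b0 + b'x + x'Bx, is obtained by setting the gradient to zero, and the eigenvalues of B classify it as a maximum, minimum, or saddle:

```python
import numpy as np

# Hypothetical fitted second order model in matrix form:
#   y_hat = b0 + b'x + x'Bx
b0 = 50.0
b = np.array([1.0, 2.0])               # first order coefficients
B = np.array([[-2.0, 0.5],             # symmetric matrix of quadratic
              [ 0.5, -3.0]])           # and interaction coefficients

# Stationary point: solve  b + 2 B x = 0  ->  x* = -(1/2) B^{-1} b
x_star = -0.5 * np.linalg.solve(B, b)
eigvals = np.linalg.eigvalsh(B)        # all negative -> x* is a maximum
y_star = b0 + b @ x_star + x_star @ B @ x_star
```

Here both eigenvalues are negative, so the stationary point is a local maximum of the fitted surface; a mix of signs would indicate a saddle, the case where ridge analysis becomes useful.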
Part II - Elements of Response Surface Methods | Pp. 85-107
Experimental Designs for Second Order Models
Enrique Del Castillo
Response Surface Methods suggest estimating a second order polynomial when there is evidence that the response is curved in the current region of interest, or when lack of fit tests point to an inadequacy of the first order model. The decision of when to change from first order designs and models to second order designs and models is therefore based on the single degree of freedom test for curvature and the lack of fit (LOF) tests explained earlier. In this chapter we provide a description of designed experiments with which we can fit the second order model.
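The second order model referred to here is the standard full quadratic polynomial in k controllable factors:

```latex
y = \beta_0 + \sum_{i=1}^{k} \beta_i x_i
  + \sum_{i=1}^{k} \beta_{ii} x_i^2
  + \sum_{i<j} \beta_{ij} x_i x_j + \varepsilon
```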
Part II - Elements of Response Surface Methods | Pp. 109-156
Statistical Inference in First Order RSM Optimization
Enrique Del Castillo
All of the standard inferences in RSM as presented in previous chapters are based on point estimators which have sampling, or experimental, variability. Assuming a classical or frequentist point of view, every quantity computed based on experimental data is subject to sampling variability and is therefore a random quantity itself. As Draper [48] pointed out, one should not expect precise conclusions when using mathematical optimization techniques based on data subject to large errors. This comment applies to every technique previously discussed, namely, the steepest ascent/descent direction, eigenvalues of the quadratic matrix and point estimators of the stationary or optimal points in quadratic (second order) optimization for both canonical and ridge analysis. It also applies to more sophisticated mathematical programming techniques. In the RSM literature, there has been an over-emphasis on using different types of such mathematical techniques which neglect the main statistical issue that arises from random data: if the experiment is repeated and new models fitted, the parameters (or even the response model form) may change, and this will necessarily result in a different optimal solution.
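The sampling-variability point above can be illustrated with a small simulation (the design, effect sizes, and noise level are hypothetical): two replicates of the same 2² factorial experiment, differing only in random error, generally yield different estimated steepest ascent directions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 2^2 factorial design in coded units
X = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1]], dtype=float)
true_beta = np.array([3.0, 1.5])   # assumed true first order effects

def fitted_direction():
    """Simulate one experiment, fit the first order model by least
    squares, and return the estimated steepest ascent direction."""
    y = 10.0 + X @ true_beta + rng.normal(scale=2.0, size=len(X))
    A = np.column_stack([np.ones(len(X)), X])   # add intercept column
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    b = coef[1:]                                # drop the intercept
    return b / np.linalg.norm(b)

# Two "replicated experiments" give two different directions;
# the angle between them measures the disagreement
d1, d2 = fitted_direction(), fitted_direction()
angle = np.degrees(np.arccos(np.clip(d1 @ d2, -1.0, 1.0)))
```

Repeating this many times traces out the sampling distribution of the direction, which is exactly what the confidence regions discussed in this chapter are designed to quantify.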
Part III - Statistical Inference in Process Optimization | Pp. 159-192
Statistical Inference in Second Order RSM Optimization
Enrique Del Castillo
We continue in this chapter the discussion of methods for dealing with sampling variability in experimental optimization techniques. This chapter considers the effect of statistical sampling error in RSM techniques that are based on second order (quadratic) polynomial models. We first discuss finding confidence intervals for the eigenvalues of the Hessian matrix, that is, the effect of sampling variability on these eigenvalues. Later sections consider the related and important problem of finding a confidence region for the optimal operating conditions. The unconstrained case is discussed first, after which methods for the computation and display of confidence regions on constrained optima are discussed. Any traditional (frequentist) RSM optimization analysis should probably always include such regions.
Part III - Statistical Inference in Process Optimization | Pp. 193-208
Bias Vs. Variance
Enrique Del Castillo
From the late 1950s up to the late 1970s, a debate ensued in the Statistics community between two schools of thought in experimental design. The main point of contention was the practical utility of optimal experimental design theory, mainly developed by J. Kiefer and co-workers. We will refer to this school as the “optimal design school” in this chapter. As mentioned in Section 5.7, using optimality theory one designs an experiment that is optimal in some precisely defined way for a given model form; the design will not be optimal, and probably not even “robust”, if the true process obeys a model different from the assumed one. This point of view is held by G. Box and his co-workers, whom we will refer to here as the “Applied Statistics school”. The purpose of the present chapter is to introduce the main ideas behind this debate. Since very few practical conclusions resulted from the debate itself, the chapter is necessarily short (the methods developed by both schools have had a great impact, but that is a matter for other chapters in this book).
Part III - Statistical Inference in Process Optimization | Pp. 209-219
Robust Parameter Design
Enrique Del Castillo
Just as in competitive sports, finding process settings and product design parameters that are “prepared” against any eventuality or uncertainty is the basic idea followed in industry to obtain robust processes and products. In this chapter we consider robustness with respect to variation in uncontrollable factors, also called noise factors, a problem that has received the name “Robust Parameter Design” (RPD), a term coined by Taguchi. Genichi Taguchi [149], a textile engineer with training in statistics, introduced a series of innovative ideas in designed experiments and process optimization which have had a strong influence on the way we look at process optimization today. Some of Taguchi’s ideas and concepts have been criticized by several authors, mainly in the USA. This chapter first discusses the main ideas behind Taguchi’s approach to the RPD problem. In later sections, we describe how the same goals and ideas introduced by Taguchi can be approached using response surface techniques, including techniques developed relatively recently in answer to the controversy created by Taguchi in quality control and Applied Statistics circles.
Part IV - Robust Parameter Design and Robust Optimization | Pp. 223-278
Robust Optimization
Enrique Del Castillo
In this chapter we discuss robustness from a more general perspective, that of model building in general, without specific discussion of noise factors. This notion of robustness in the sense of lack of sensitivity of an optimal solution to variations in the model has always been a key idea in mathematical modeling, and in particular, in mathematical programming. This differs from environmental variation in the sense of Taguchi, as discussed in the previous chapter. Thus, in this chapter, no “noise factors” are assumed to exist.
In this chapter we present the “Minimax Deviation method” for robust optimization of Xu and Albin. This is a method that attempts to protect against sampling variability of the parameter estimates in the model, hence it is a frequentist method. We relate this method to confidence regions on the optimal settings (Chapter 7) and to some other proposals for process optimization from the area of Stochastic Programming.
A natural alternative to the Xu-Albin method is to employ a Bayesian approach in which the uncertainty in the model parameters, considered as random variables, is incorporated into the optimization. Such a Bayesian approach to process optimization is presented in Part V of this book.
Part IV - Robust Parameter Design and Robust Optimization | Pp. 279-287