Publications catalog - books

Applied Multivariate Statistical Analysis

Wolfgang Härdle; Léopold Simar

Second Edition.

Abstract/Description – provided by the publisher

Not available.

Keywords – provided by the publisher

Probability Theory and Stochastic Processes; Statistical Theory and Methods; Economic Theory/Quantitative Economics/Mathematical Methods; Quantitative Finance; Statistics for Business/Economics/Mathematical Finance/Insurance

Availability

Detected institution: Not detected
Publication year: 2007
Browse: SpringerLink

Information

Resource type:

books

Print ISBN

978-3-540-72243-4

Electronic ISBN

978-3-540-72244-1

Publisher

Springer Nature

Country of publication

United Kingdom

Publication date

2007

Publication rights information

© Springer-Verlag Berlin Heidelberg 2007

Table of contents

Comparison of Batches

Wolfgang Härdle; Léopold Simar

Multivariate statistical analysis is concerned with analyzing and understanding data in high dimensions. We suppose that we are given a set {x_i}_{i=1}^n of n observations of a variable vector X in ℝ^p. That is, we suppose that each observation x_i has p dimensions: x_i = (x_{i1}, x_{i2}, ..., x_{ip}), and that it is an observed value of a variable vector X ∈ ℝ^p. Therefore, X is composed of p random variables: X = (X_1, X_2, ..., X_p), where X_j, for j = 1, ..., p, is a one-dimensional random variable. How do we begin to analyze this kind of data? Before we investigate questions on what inferences we can reach from the data, we should think about how to look at the data. This involves descriptive techniques. Questions that we could answer by descriptive techniques are:

Part I - Descriptive Techniques | Pp. 3-37
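Descriptive comparisons of batches typically start from summary statistics such as those underlying a boxplot. As an illustrative sketch (NumPy and the two toy batches are assumptions, not material from the book), the five-number summary of each batch can be computed as:

```python
import numpy as np

def five_number_summary(x):
    """Return (min, lower quartile, median, upper quartile, max) of a batch."""
    x = np.asarray(x, dtype=float)
    return (x.min(),
            np.percentile(x, 25),
            np.median(x),
            np.percentile(x, 75),
            x.max())

# Two hypothetical batches of univariate observations to compare.
batch_a = np.array([3.1, 4.7, 2.8, 5.0, 3.9, 4.2])
batch_b = np.array([6.0, 5.5, 7.2, 6.8, 5.9, 6.4])

summary_a = five_number_summary(batch_a)
summary_b = five_number_summary(batch_b)
```

Comparing the two summaries side by side already reveals a clear location shift between the batches.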

A Short Excursion into Matrix Algebra

Wolfgang Härdle; Léopold Simar

This chapter is a reminder of basic concepts of matrix algebra, which are particularly useful in multivariate analysis. It also introduces the notations used in this book for vectors and matrices. Eigenvalues and eigenvectors play an important role in multivariate techniques. In Sections 2.2 and 2.3, we present the spectral decomposition of matrices and consider the maximization (minimization) of quadratic forms given some constraints.

Part II - Multivariate Random Variables | Pp. 41-60
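The spectral decomposition and constrained maximization of quadratic forms mentioned above can be sketched numerically; the matrix A and the NumPy calls here are illustrative assumptions, not examples from the chapter:

```python
import numpy as np

# Spectral decomposition of a symmetric matrix: A = Gamma Lambda Gamma^T.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigvals, Gamma = np.linalg.eigh(A)   # eigh is specialized for symmetric matrices
Lambda = np.diag(eigvals)

# The decomposition reconstructs A exactly.
A_rec = Gamma @ Lambda @ Gamma.T

# Maximizing the quadratic form x^T A x subject to ||x|| = 1:
# the maximum equals the largest eigenvalue, attained at its eigenvector.
x_max = Gamma[:, -1]
quad_max = x_max @ A @ x_max
```

For this A the eigenvalues are 1 and 3, so the constrained maximum of the quadratic form is 3.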

Moving to Higher Dimensions

Wolfgang Härdle; Léopold Simar

We have seen in the previous chapters how very simple graphical devices can help in understanding the structure and dependency of data. The graphical tools were based on either univariate (bivariate) data representations or on "slick" transformations of multivariate information perceivable by the human eye. Most of the tools are extremely useful in a modelling step, but unfortunately do not give the full picture of the data set. One reason for this is that the graphical tools presented capture only certain dimensions of the data and do not necessarily concentrate on those dimensions or subparts of the data under analysis that carry the maximum structural information. In Part III of this book, powerful tools for reducing the dimension of a data set will be presented. In this chapter, as a starting point, simple and basic tools are used to describe dependency. They are constructed from elementary facts of probability theory and introductory statistics (for example, the covariance and correlation between two variables).

Part II - Multivariate Random Variables | Pp. 61-91
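As a minimal sketch of the elementary dependency measures the chapter starts from, covariance and correlation between two variables can be computed like this (the simulated data, seed, and NumPy usage are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Two linearly dependent variables: y = 2x + noise.
x = rng.normal(size=500)
y = 2.0 * x + rng.normal(scale=0.5, size=500)

cov_xy = np.cov(x, y)[0, 1]        # covariance between the two variables
corr_xy = np.corrcoef(x, y)[0, 1]  # scale-free correlation, in [-1, 1]
```

Unlike the covariance, the correlation is invariant to rescaling either variable, which is why it is the preferred descriptive measure of linear dependence.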

Multivariate Distributions

Wolfgang Härdle; Léopold Simar

The preceding chapter showed that by using the first two moments of a multivariate distribution (the mean and the covariance matrix), a lot of information on the relationship between the variables can be made available. Only basic statistical theory was used to derive tests of independence or of linear relationships. In this chapter we give an introduction to the basic probability tools useful in statistical multivariate analysis.

Part II - Multivariate Random Variables | Pp. 93-146
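The first two moments referred to above — the mean vector and the covariance matrix — can be computed directly from an n × p data matrix. The toy matrix below and the NumPy calls are illustrative assumptions:

```python
import numpy as np

# n = 4 observations of a p = 2 dimensional variable vector.
X = np.array([[1.0, 2.0],
              [2.0, 4.1],
              [3.0, 5.9],
              [4.0, 8.2]])

mean_vec = X.mean(axis=0)            # first moment: p-dimensional mean vector
cov_mat = np.cov(X, rowvar=False)    # second moment: p x p covariance matrix
```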

Theory of the Multinormal

Wolfgang Härdle; Léopold Simar

In the preceding chapter we saw how the multivariate normal distribution comes into play in many applications. It is useful to know more about this distribution, since it is often a good approximate distribution in many situations. Another reason for considering the multinormal distribution relies on the fact that it has many appealing properties: it is stable under linear transforms, zero correlation corresponds to independence, the marginals and all the conditionals are also multivariate normal variates, etc. The mathematical properties of the multinormal make analyses much simpler.

Part II - Multivariate Random Variables | Pp. 147-160
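The stability under linear transforms mentioned above — if X ~ N_p(μ, Σ) then AX + b ~ N(Aμ + b, AΣAᵀ) — can be checked empirically. This simulation is a sketch under assumed parameter values, not the book's treatment:

```python
import numpy as np

rng = np.random.default_rng(1)
mu = np.array([0.0, 1.0])
Sigma = np.array([[1.0, 0.5],
                  [0.5, 2.0]])

# Draw a large multinormal sample.
X = rng.multivariate_normal(mu, Sigma, size=100_000)

A = np.array([[1.0, 1.0]])     # linear map down to one dimension
b = np.array([3.0])
Y = (X @ A.T + b).ravel()      # Y = AX + b is again normally distributed

theory_mean = (A @ mu + b)[0]            # = 4.0
theory_var = (A @ Sigma @ A.T)[0, 0]     # = 4.0
```

The empirical mean and variance of Y match the theoretical values Aμ + b and AΣAᵀ up to sampling error.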

Theory of Estimation

Wolfgang Härdle; Léopold Simar

We know from our basic knowledge of statistics that one of the objectives in statistics is to better understand and model the underlying process which generates the data. This is known as statistical inference: we infer from information contained in a sample properties of the population from which the observations are taken. In multivariate statistical inference, we do exactly the same. The basic ideas were introduced in Section 4.5 on sampling theory: we observed the values of a multivariate random variable X and obtained a sample {x_i}_{i=1}^n. Under random sampling, these observations are considered to be realizations of a sequence of i.i.d. random variables X_1, ..., X_n, where each X_i is a p-variate random variable which replicates the parent or population random variable X. In this chapter, for notational convenience, we will no longer differentiate between a random variable X_i and an observation of it, x_i, in our notation. We will simply write x_i and it should be clear from the context whether a random variable or an observed value is meant.

Part II - Multivariate Random Variables | Pp. 161-169
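A minimal sketch of the inference step described above — estimating the population mean vector and covariance matrix from an i.i.d. sample (the population parameters, seed, and NumPy usage are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(2)
mu = np.array([1.0, -1.0])
Sigma = np.array([[2.0, 0.3],
                  [0.3, 1.0]])

# i.i.d. sample x_1, ..., x_n of a p-variate random variable.
X = rng.multivariate_normal(mu, Sigma, size=50_000)

mu_hat = X.mean(axis=0)                # sample mean: estimator of mu
Sigma_hat = np.cov(X, rowvar=False)    # empirical covariance: estimator of Sigma
```

With n this large, both estimators land close to the population values they target.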

Hypothesis Testing

Wolfgang Härdle; Léopold Simar

In the preceding chapter, the theoretical basis of estimation theory was presented. Now we turn our interest towards testing issues: we want to test the hypothesis that the unknown parameter θ belongs to some subspace of ℝ^q. This subspace is called the null set and will be denoted by Ω_0 ⊂ ℝ^q.

Part II - Multivariate Random Variables | Pp. 171-199
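One concrete instance of such a test is checking whether the unknown mean equals a hypothesized point μ₀. The sketch below uses a Hotelling-type statistic for illustration; the simulated data, seed, and the hard-coded χ² quantile are assumptions, not the chapter's examples:

```python
import numpy as np

rng = np.random.default_rng(3)
n, p = 200, 2
mu0 = np.array([0.0, 0.0])     # hypothesized mean under the null hypothesis

# Simulated data for which the null hypothesis actually holds.
X = rng.multivariate_normal(mu0, np.eye(p), size=n)

xbar = X.mean(axis=0)
S = np.cov(X, rowvar=False)

# Hotelling-type statistic; for large n it is approximately chi-square(p) under H0.
T2 = n * (xbar - mu0) @ np.linalg.solve(S, xbar - mu0)
critical = 5.991               # 95% quantile of chi-square with p = 2 d.f.
reject = bool(T2 > critical)
```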

Decomposition of Data Matrices by Factors

Wolfgang Härdle; Léopold Simar

In Chapter 1 basic descriptive techniques were developed which provided tools for “looking” at multivariate data. They were based on adaptations of bivariate or univariate devices used to reduce the dimensions of the observations. In the following three chapters, issues of reducing the dimension of a multivariate data set will be discussed. The perspectives will be different but the tools will be related.

Part III - Multivariate Techniques | Pp. 203-214
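The geometric decompositions this chapter builds on rest on factoring the (centered) data matrix, which the singular value decomposition provides. As an illustrative sketch (the toy matrix and NumPy are assumptions):

```python
import numpy as np

# A small 4 x 3 data matrix: rows are observations, columns are variables.
X = np.array([[2.0, 0.0, 1.0],
              [0.0, 1.0, 0.0],
              [1.0, 0.0, 2.0],
              [0.0, 2.0, 0.0]])

# Center the columns, then factor the matrix: Xc = U D V^T.
Xc = X - X.mean(axis=0)
U, d, Vt = np.linalg.svd(Xc, full_matrices=False)

# Best one-factor (rank-1) description of rows and columns simultaneously.
X1 = d[0] * np.outer(U[:, 0], Vt[0])
residual = np.linalg.norm(Xc - X1)
```

Keeping more singular values shrinks the residual, until the full decomposition reconstructs Xc exactly.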

Principal Components Analysis

Wolfgang Härdle; Léopold Simar

Chapter 8 presented the basic geometric tools needed to produce a lower dimensional description of the rows and columns of a multivariate data matrix. Principal components analysis has the same objective, with the exception that the rows of the data matrix will now be considered as observations from a p-variate random variable X. The principal idea of reducing the dimension of X is achieved through linear combinations. Low dimensional linear combinations are often easier to interpret and serve as an intermediate step in a more complex data analysis. More precisely, one looks for linear combinations which create the largest spread among the values of X. In other words, one is searching for linear combinations with the largest variances.

Part III - Multivariate Techniques | Pp. 215-249
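The search for the linear combination with the largest variance is solved by the eigenvectors of the covariance matrix; this simulation (the population parameters and seed are assumptions) sketches the idea:

```python
import numpy as np

rng = np.random.default_rng(4)
# Correlated 2-D data: most of the spread lies along the (1, 1) direction.
X = rng.multivariate_normal([0.0, 0.0],
                            [[3.0, 2.5],
                             [2.5, 3.0]], size=5000)

S = np.cov(X, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(S)   # eigenvalues in ascending order

pc1 = eigvecs[:, -1]                   # first principal direction
scores = X @ pc1                       # linear combination with maximal variance
```

The variance of the scores equals the largest eigenvalue of S, and the estimated direction is close to (1, 1)/√2 as the construction suggests.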

Factor Analysis

Wolfgang Härdle; Léopold Simar

A frequently applied paradigm in analyzing data from multivariate observations is to model the relevant information (represented in a multivariate variable X) as coming from a limited number of latent factors. In a survey on household consumption, for example, the consumption levels of p different goods during one month could be observed. The variations and covariations of the p components of X throughout the survey might in fact be explained by two or three main social behavior factors of the household. For instance, a basic desire of comfort or the willingness to achieve a certain social level or other social latent concepts might explain most of the consumption behavior. These unobserved factors are much more interesting to the social scientist than the observed quantitative measures (X) themselves, because they give a better understanding of the behavior of households. As shown in the examples below, the same kind of factor analysis is of interest in many fields such as psychology, marketing, economics, political science, etc.

Part III - Multivariate Techniques | Pp. 251-270
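The latent-factor idea can be sketched by simulating a one-factor model X = qf + u and observing the correlations the shared factor induces. The loadings, noise scale, and NumPy usage below are hypothetical assumptions, not survey data from the book:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 10_000

f = rng.normal(size=n)                     # unobserved latent factor
q = np.array([0.9, 0.8, 0.7])              # hypothetical factor loadings
U = rng.normal(scale=0.4, size=(n, 3))     # specific (idiosyncratic) noise

# Observed variables: each is loading * factor + its own noise.
X = np.outer(f, q) + U

# Off-diagonal correlations are induced purely by the shared factor.
R = np.corrcoef(X, rowvar=False)
```

Even though the three observed variables never interact directly, the common factor produces strong pairwise correlations — the pattern factor analysis tries to recover in reverse.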