Publications catalog - books

Independent Component Analysis and Signal Separation: 7th International Conference, ICA 2007, London, UK, September 9-12, 2007. Proceedings

Mike E. Davies; Christopher J. James; Samer A. Abdallah; Mark D. Plumbley (eds.)

Conference: 7th International Conference on Independent Component Analysis and Signal Separation (ICA). London, UK. September 9, 2007 - September 12, 2007

Abstract/Description - provided by the publisher

Not available.

Keywords - provided by the publisher

Not available.

Availability
Detected institution   Year of publication   Browse         Download   Request
Not detected           2007                  SpringerLink

Information

Resource type:

books

Print ISBN

978-3-540-74493-1

Electronic ISBN

978-3-540-74494-8

Publisher

Springer Nature

Country of publication

United Kingdom

Publication date

Publication rights information

© Springer-Verlag Berlin Heidelberg 2007

Table of contents

Stable Higher-Order Recurrent Neural Network Structures for Nonlinear Blind Source Separation

Yannick Deville; Shahram Hosseini

This paper concerns our general recurrent neural network structures for nonlinear blind source separation, especially suited to polynomial mixtures. Here we focus on linear-quadratic mixtures. We introduce an extended structure, with additional free parameters as compared to the structure that we previously proposed. We derive the equilibrium points of our new structure, thus showing that it has no spurious fixed points. We analyze its stability in detail and propose a practical procedure for selecting its free parameters, so as to guarantee the stability of a separating point. We thus solve the stability issue of our previous structure. Numerical results illustrate the effectiveness of this approach.

- Algorithms | Pp. 161-168
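
As a rough illustration of the recurrent separating structure discussed in the abstract above (not the authors' extended network or its blind adaptation), the sketch below inverts a hypothetical two-source linear-quadratic mixture by iterating the basic recurrent equations to a fixed point; the mixing coefficients are assumed known and small enough for the iteration to be a contraction.

    import numpy as np

    rng = np.random.default_rng(0)
    T = 1000
    s1, s2 = rng.uniform(-1, 1, T), rng.uniform(-1, 1, T)

    # Hypothetical linear-quadratic mixing coefficients, kept small so the
    # recurrent inversion below converges.
    a12, a21, b1, b2 = 0.3, 0.2, 0.05, 0.05
    x1 = s1 + a12 * s2 + b1 * s1 * s2
    x2 = s2 + a21 * s1 + b2 * s1 * s2

    # Basic recurrent separating structure: iterate the network outputs to a
    # fixed point (here with the true coefficients; the paper adapts free
    # parameters blindly and analyses which fixed points are stable).
    y1, y2 = x1.copy(), x2.copy()
    for _ in range(100):
        y1, y2 = x1 - a12 * y2 - b1 * y1 * y2, x2 - a21 * y1 - b2 * y1 * y2

    print("max recovery error:", max(np.abs(y1 - s1).max(), np.abs(y2 - s2).max()))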

Hierarchical ALS Algorithms for Nonnegative Matrix and 3D Tensor Factorization

Andrzej Cichocki; Rafal Zdunek; Shun-ichi Amari

In the paper we present new Alternating Least Squares (ALS) algorithms for Nonnegative Matrix Factorization (NMF) and their extensions to 3D Nonnegative Tensor Factorization (NTF) that are robust in the presence of noise and have many potential applications, including multi-way Blind Source Separation (BSS), multi-sensory or multi-dimensional data analysis, and nonnegative neural sparse coding. We propose to use local cost functions whose simultaneous or sequential (one-by-one) minimization leads to a very simple ALS algorithm which works under some sparsity constraints for both under-determined (fewer sensors than sources) and over-determined models. The extensive experimental results confirm the validity and high performance of the developed algorithms, especially when the multi-layer hierarchical NMF is used. An extension of the proposed algorithm to multidimensional Sparse Component Analysis and Smooth Component Analysis is also proposed.

- Algorithms | Pp. 169-176
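
For reference, a bare-bones projected ALS update for plain two-factor NMF is sketched below; it only shows the alternating least-squares skeleton that the hierarchical and local-cost variants described above build on, and uses none of the paper's sparsity constraints or multi-layer scheme.

    import numpy as np

    def als_nmf(A, r, n_iter=200, eps=1e-9):
        """Plain projected ALS for A ~= W @ H with W, H >= 0 (A is m x n)."""
        rng = np.random.default_rng(0)
        m, n = A.shape
        W = rng.random((m, r))
        H = rng.random((r, n))
        for _ in range(n_iter):
            # Unconstrained least-squares update followed by projection onto
            # the nonnegative orthant (the usual projected-ALS heuristic).
            H = np.maximum(np.linalg.lstsq(W, A, rcond=None)[0], eps)
            W = np.maximum(np.linalg.lstsq(H.T, A.T, rcond=None)[0].T, eps)
            # Fix the scaling ambiguity by normalising the columns of W.
            scale = W.sum(axis=0)
            W /= scale
            H *= scale[:, None]
        return W, H

    # Toy usage: recover an approximate nonnegative rank-3 structure.
    rng = np.random.default_rng(1)
    A = rng.random((40, 3)) @ rng.random((3, 60)) + 0.01 * rng.random((40, 60))
    W, H = als_nmf(A, 3)
    print("relative error:", np.linalg.norm(A - W @ H) / np.linalg.norm(A))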

Pivot Selection Strategies in Jacobi Joint Block-Diagonalization

Cédric Févotte; Fabian J. Theis

A common problem in independent component analysis after prewhitening is to optimize some contrast on the orthogonal or unitary group. A popular approach is to optimize the contrast only with respect to a single angle (Givens rotation) and to iterate this procedure. In this paper we discuss the choice of the sequence of rotations for such so-called Jacobi-based techniques, in the context of joint block-diagonalization (JBD). Indeed, extensive simulations with synthetic data, reported in the paper, illustrate the sensitivity of this choice, as standard cyclic sweeps often appear to lead to non-optimal solutions. While we cannot guarantee convergence to an optimal solution, we propose a new schedule which, from empirical testing, considerably increases the chances of achieving global minimization of the criterion. We also point out the interest of initializing JBD with the output of joint diagonalization (JD), corroborating the idea that JD could in fact perform JBD up to permutations, as conjectured in previous works.

- Algorithms | Pp. 177-184
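
To make the Givens-rotation machinery concrete, here is a naive orthogonal joint diagonalization sketch that uses standard cyclic sweeps over the pivot pairs; the rotation angle is found by a brute-force grid search rather than the usual closed-form update, and neither the block case nor the paper's improved pivot schedule is reproduced.

    import numpy as np

    def off_criterion(Ms):
        """Sum of squared off-diagonal entries over the whole matrix set."""
        return sum(np.sum(M ** 2) - np.sum(np.diag(M) ** 2) for M in Ms)

    def givens(n, p, q, theta):
        """Givens rotation of angle theta in the (p, q) plane."""
        G = np.eye(n)
        c, s = np.cos(theta), np.sin(theta)
        G[p, p], G[p, q], G[q, p], G[q, q] = c, -s, s, c
        return G

    def jacobi_joint_diag(Ms, sweeps=10, n_angles=181):
        """Cyclic-sweep Jacobi joint diagonalization of symmetric matrices."""
        Ms = [M.copy() for M in Ms]
        n = Ms[0].shape[0]
        V = np.eye(n)
        angles = np.linspace(-np.pi / 4, np.pi / 4, n_angles)
        for _ in range(sweeps):
            for p in range(n - 1):
                for q in range(p + 1, n):      # standard cyclic pivot order
                    best_t = min(angles, key=lambda t: off_criterion(
                        [givens(n, p, q, t).T @ M @ givens(n, p, q, t) for M in Ms]))
                    G = givens(n, p, q, best_t)
                    Ms = [G.T @ M @ G for M in Ms]
                    V = V @ G
        return V, Ms

    # Toy usage: matrices sharing an orthogonal diagonalizer Q.
    rng = np.random.default_rng(2)
    Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))
    Ms = [Q @ np.diag(rng.standard_normal(4)) @ Q.T for _ in range(5)]
    V, Ds = jacobi_joint_diag(Ms)
    print("residual off-diagonal energy:", off_criterion(Ds))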

Speeding Up FastICA by Mixture Random Pruning

Sabrina Gaito; Giuliano Grossi

We study and derive a method to speed up kurtosis-based FastICA in the presence of information redundancy, i.e., for large samples. It consists of randomly decimating the data set as much as possible while preserving the quality of the reconstructed signals. By analyzing the kurtosis estimator, we find the maximum reduction rate which guarantees a narrow confidence interval for this estimator at a high confidence level. This rate depends on a parameter that is easily computed a priori by combining the fourth and the eighth norms of the observations.

Extensive simulations have been carried out on different sets of real-world signals. They show that the achievable sample-size reduction is in fact very high, preserves the quality of the decomposition and impressively speeds up FastICA. On the other hand, the simulations also show that decimating the data beyond the rate fixed by this parameter compromises the decomposition ability of FastICA, thus validating the reliability of the parameter. We are confident that our method will help bring FastICA closer to real-time applications.

- Algorithms | Pp. 185-192
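
A rough sketch of the decimation idea: whiten the observations, randomly keep only a fraction of the samples, and run the one-unit kurtosis-based FastICA fixed point on the pruned set. The retention rate below is a hard-coded guess purely for illustration, not the paper's a-priori bound derived from the fourth and eighth norms.

    import numpy as np

    def whiten(X):
        """Center and whiten observations X (channels x samples)."""
        X = X - X.mean(axis=1, keepdims=True)
        d, E = np.linalg.eigh(X @ X.T / X.shape[1])
        return E @ np.diag(d ** -0.5) @ E.T @ X

    def fastica_kurtosis(Z, n_iter=100, seed=0):
        """One-unit kurtosis-based FastICA fixed point on whitened data Z."""
        rng = np.random.default_rng(seed)
        w = rng.standard_normal(Z.shape[0])
        w /= np.linalg.norm(w)
        for _ in range(n_iter):
            w = (Z * (w @ Z) ** 3).mean(axis=1) - 3 * w
            w /= np.linalg.norm(w)
        return w

    rng = np.random.default_rng(3)
    T = 20000
    S = np.vstack([np.sign(rng.standard_normal(T)), rng.laplace(size=T)])
    Z = whiten(rng.standard_normal((2, 2)) @ S)

    keep = 0.2                                  # illustrative pruning rate only
    idx = rng.choice(T, size=int(keep * T), replace=False)
    w = fastica_kurtosis(Z[:, idx])             # estimate on the decimated sample
    print("correlations with the sources:", np.corrcoef(w @ Z, S)[0, 1:])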

An Algebraic Non Orthogonal Joint Block Diagonalization Algorithm for Blind Separation of Convolutive Mixtures of Sources

Hicham Ghennioui; El Mostafa Fadaili; Nadège Thirion-Moreau; Abdellah Adib; Eric Moreau

This paper deals with the problem of the blind separation of convolutive mixtures of sources. We present a novel method based on a new non-orthogonal joint block diagonalization (NO-JBD) algorithm for a given set of matrices. The main advantages of the proposed method are that it is more general and that a preliminary whitening stage is no longer required. The proposed joint block diagonalization algorithm is based on the algebraic optimization of a least mean squares criterion. Computer simulations are provided in order to illustrate the effectiveness of the proposed approach in three cases: when exact block-diagonal matrices are considered, when they are progressively perturbed by additive Gaussian noise, and finally when estimated correlation matrices are used. A comparison with a classical orthogonal joint block-diagonalization algorithm is also performed, emphasizing the good performance of the method.

- Algorithms | Pp. 193-200
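
The least mean squares criterion mentioned above essentially measures how much energy a candidate transform leaves outside the prescribed diagonal blocks. A small helper for evaluating that criterion (with an arbitrary, illustrative block layout) is sketched below; the algebraic optimization itself is not reproduced.

    import numpy as np

    def off_block(M, block_sizes):
        """Squared Frobenius norm of M outside its diagonal blocks."""
        mask = np.ones(M.shape, dtype=bool)
        start = 0
        for b in block_sizes:
            mask[start:start + b, start:start + b] = False
            start += b
        return np.sum(M[mask] ** 2)

    def jbd_criterion(Ms, B, block_sizes):
        """Least-squares JBD criterion: off-block energy of B @ M @ B.T summed
        over the matrix set (B need not be orthogonal)."""
        return sum(off_block(B @ M @ B.T, block_sizes) for M in Ms)

    # Illustrative usage: two 4x4 matrices, target block sizes (2, 2).
    rng = np.random.default_rng(4)
    Ms = [rng.standard_normal((4, 4)) for _ in range(2)]
    print(jbd_criterion(Ms, np.eye(4), (2, 2)))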

A Toolbox for Model-Free Analysis of fMRI Data

P. Gruber; C. Kohler; F. J. Theis

We introduce the Model-free Toolbox (MFBOX), a Matlab toolbox for analyzing multivariate data sets in an explorative fashion. Its main focus lies on the analysis of functional Nuclear Magnetic Resonance Imaging (fMRI) data sets with various model-free or data-driven techniques. In this context, it can also be used as a plugin for SPM5, a popular tool in regression-based fMRI analysis. The toolbox includes BSS algorithms based on various source models, including ICA, spatiotemporal ICA, autodecorrelation and NMF. They can all be easily combined with higher-level analysis methods such as reliability analysis using projective clustering of the components, sliding time window analysis or hierarchical decomposition. As an example, we use MFBOX for the analysis of an fMRI experiment and present short comparisons with the SPM results. The MFBOX is freely available for download at .

- Algorithms | Pp. 209-217

An Eigenvector Algorithm with Reference Signals Using a Deflation Approach for Blind Deconvolution

Mitsuru Kawamoto; Yujiro Inouye; Kiyotaka Kohno; Takehide Maeda

We propose an eigenvector algorithm (EVA) with reference signals for blind deconvolution (BD) of multiple-input multiple-output infinite impulse response (MIMO-IIR) channels. Unlike conventional EVAs, each output of the deconvolver is used as a reference signal, and moreover the BD can be achieved without using whitening techniques. The validity of the proposed EVA is shown by comparison with our conventional EVA.

- Algorithms | Pp. 218-226
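
The eigenvector step behind such algorithms can be illustrated on a much simpler instantaneous (memoryless) mixture: with a reference signal z, an extracting vector can be obtained as a generalized eigenvector of a fourth-order cross-cumulant matrix and the covariance matrix. The sketch below is a simplified illustration under that assumption, for real-valued signals; it ignores the MIMO-IIR convolutive setting and the deflation procedure that the paper actually addresses.

    import numpy as np

    rng = np.random.default_rng(5)
    T = 50000
    S = np.vstack([rng.laplace(size=T),
                   np.sign(rng.standard_normal(T)),
                   rng.laplace(size=T)])
    S /= S.std(axis=1, keepdims=True)
    X = rng.standard_normal((3, 3)) @ S
    z = X[0]                       # use one observation as the reference signal

    # Covariance R and cross-cumulant matrix C_ij = cum(x_i, x_j, z, z).
    Xc = X - X.mean(axis=1, keepdims=True)
    zc = z - z.mean()
    R = Xc @ Xc.T / T
    xz = Xc @ zc / T
    C = (Xc * zc ** 2) @ Xc.T / T - R * np.mean(zc ** 2) - 2 * np.outer(xz, xz)

    # Eigenvector step: the extracting vector is a generalized eigenvector of
    # (C, R); picking the eigenvalue of largest magnitude extracts one source.
    evals, evecs = np.linalg.eig(np.linalg.inv(R) @ C)
    w = np.real(evecs[:, np.argmax(np.abs(evals))])
    print("correlations with the sources:", np.corrcoef(w @ Xc, S)[0, 1:])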

Robust Independent Component Analysis Using Quadratic Negentropy

Jaehyung Lee; Taesu Kim; Soo-Young Lee

We present a robust algorithm for independent component analysis that uses the sum of marginal quadratic negentropies as a dependence measure. It can handle arbitrary source density functions by using kernel density estimation, but is robust for a small number of samples because it avoids empirical expectation and directly calculates the integration of quadratic densities. In addition, our algorithm is scalable because the gradient of our contrast function can be calculated in O(LN) using the fast Gauss transform, where L is the number of sources and N is the number of samples. In our experiments, we evaluated the performance of our algorithm for various source distributions and compared it with other well-known algorithms. The results show that the proposed algorithm consistently outperforms the others. Moreover, it is extremely robust to outliers and is particularly effective when the number of observed samples is small and the number of mixed sources is large.

- Algorithms | Pp. 227-235
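
The "integration of quadratic densities" mentioned above has a closed form when a Gaussian kernel density estimate is used, which is what makes the measure cheap to evaluate on small samples. Below is a one-dimensional illustration of that idea (an illustrative sketch only: naive O(N^2) pairwise sums, standardized data, a standard-normal reference, and an arbitrary bandwidth), not the paper's multivariate contrast or its fast-Gauss-transform acceleration.

    import numpy as np

    def gauss(u, var):
        """Zero-mean Gaussian density with variance var, evaluated at u."""
        return np.exp(-u ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)

    def quadratic_negentropy(y, sigma=0.25):
        """Integral of (p_hat - phi)^2 for standardized 1-D data y, where p_hat
        is a Gaussian KDE with bandwidth sigma and phi is the standard normal.
        All three terms are Gaussian integrals with closed forms."""
        y = (y - y.mean()) / y.std()
        n = len(y)
        pp = gauss(y[:, None] - y[None, :], 2 * sigma ** 2).sum() / n ** 2  # int p_hat^2
        pg = gauss(y, sigma ** 2 + 1.0).sum() / n                           # int p_hat*phi
        gg = 1.0 / (2.0 * np.sqrt(np.pi))                                   # int phi^2
        return pp - 2 * pg + gg

    rng = np.random.default_rng(6)
    print("Gaussian sample :", quadratic_negentropy(rng.standard_normal(2000)))
    print("Laplacian sample:", quadratic_negentropy(rng.laplace(size=2000)))

For the Gaussian sample the value should stay close to zero (up to kernel-smoothing bias), while the Laplacian sample should yield a clearly larger value.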

Underdetermined Source Separation Using Mixtures of Warped Laplacians

Nikolaos Mitianoudis; Tania Stathaki

In a previous work, the authors introduced a Mixture of Laplacians model in order to cluster the observed data into the sound sources that exist in an underdetermined two-sensor setup. Since the assumed linear support of the ordinary Laplacian distribution is not appropriate for modelling angular quantities, such as the Direction of Arrival to the set of sensors, the authors investigate the performance of a Mixture of Warped Laplacians for efficient source separation, with promising results.

- Algorithms | Pp. 236-243
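
As a rough illustration of the geometry behind this model (not the warped-Laplacian mixture itself), the sketch below maps each two-sensor observation to a direction-of-arrival-like angle and assigns it to the nearest source direction; the paper instead fits a mixture of warped Laplacians to such angular data, whereas here the mixing directions are simply assumed known.

    import numpy as np

    rng = np.random.default_rng(7)
    T, n_src = 5000, 3

    # Sparse sources: mostly silent, so most samples are dominated by one source.
    S = rng.laplace(size=(n_src, T)) * (rng.random((n_src, T)) < 0.3)

    # Two-sensor mixing matrix with unit-norm columns (assumed known here).
    phis = np.array([0.3, 1.1, 2.4])
    A = np.vstack([np.cos(phis), np.sin(phis)])
    X = A @ S

    # Angle of every observation, folded onto [0, pi) because x and -x point
    # along the same source direction.
    theta = np.mod(np.arctan2(X[1], X[0]), np.pi)
    dist = np.abs(theta[None, :] - phis[:, None])
    dist = np.minimum(dist, np.pi - dist)        # circular distance
    labels = dist.argmin(axis=0)                 # hard clustering of the samples

    # Reconstruct each source from the samples assigned to its direction.
    S_hat = np.zeros_like(S)
    for k in range(n_src):
        sel = labels == k
        S_hat[k, sel] = A[:, k] @ X[:, sel]

    print([round(np.corrcoef(S[k], S_hat[k])[0, 1], 3) for k in range(n_src)])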

Blind Separation of Cyclostationary Sources Using Joint Block Approximate Diagonalization

D. T. Pham

This paper introduces an extension of an earlier method of the author for separating stationary sources, based on the joint approximate diagonalization of interspectral matrices, to the case of cyclostationary sources, in order to take advantage of their cyclostationarity. The proposed method is based on the joint block approximate diagonalization of the cyclic interspectral density. An algorithm for this diagonalization is described. Some simulation experiments are provided, showing the good performance of the method.

- Algorithms | Pp. 244-251