Multidataset Independent Subspace Analysis with Application to Multimodal Fusion.

Published on Nov 11, 2019 in arXiv: Machine Learning
Rogers F. Silva (Estimated H-index: 9), Sergey M. Plis (Estimated H-index: 2), + 2 authors, Vince D. Calhoun (Estimated H-index: 93)
In the last two decades, unsupervised latent variable models---blind source separation (BSS) especially---have enjoyed a strong reputation for the interpretable features they produce. Seldom do these models combine the rich diversity of information available in multiple datasets. Multidatasets, on the other hand, yield joint solutions otherwise unavailable in isolation, with a potential for pivotal insights into complex systems. To take advantage of the complex multidimensional subspace structures that capture underlying modes of shared and unique variability across and within datasets, we present a direct, principled approach to multidataset combination. We design a new method called multidataset independent subspace analysis (MISA) that leverages joint information from multiple heterogeneous datasets in a flexible and synergistic fashion. Methodological innovations exploiting the Kotz distribution for subspace modeling in conjunction with a novel combinatorial optimization for evasion of local minima enable MISA to produce a robust generalization of independent component analysis (ICA), independent vector analysis (IVA), and independent subspace analysis (ISA) in a single unified model. We highlight the utility of MISA for multimodal information fusion, including sample-poor regimes and low signal-to-noise ratio scenarios, promoting novel applications in both unimodal and multimodal brain imaging data.
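The within-subspace dependence that ISA-family models such as MISA exploit can be illustrated with a toy numpy sketch (not MISA itself; the gamma-modulated construction below is purely illustrative): two sources sharing a random variance profile are linearly uncorrelated yet jointly dependent, while a third source in its own one-dimensional subspace is fully independent.

```python
import numpy as np

# Toy sketch (not MISA itself) of the subspace dependence ISA-family models
# capture: s1 and s2 share a variance modulator, so they are linearly
# uncorrelated yet dependent; s3 is fully independent of both.
rng = np.random.default_rng(0)
T = 100_000
v = rng.gamma(shape=2.0, scale=1.0, size=T)   # shared variance profile
s1 = np.sqrt(v) * rng.standard_normal(T)      # same subspace as s2
s2 = np.sqrt(v) * rng.standard_normal(T)
s3 = rng.standard_normal(T)                   # its own 1-D subspace

def corr(a, b):
    return np.corrcoef(a, b)[0, 1]

# Linear correlation misses the within-subspace link; the energies reveal it.
within = corr(s1 ** 2, s2 ** 2)   # clearly positive
across = corr(s1 ** 2, s3 ** 2)   # near zero
```

An ICA model assuming full independence cannot resolve s1 and s2 up to the usual ambiguities; modeling them as one subspace, as ISA does, can.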
  • References (17)
  • Citations (1)
Srinivas Rachakonda (The Mind Research Network; H-index: 14), Rogers F. Silva (UNM: University of New Mexico; H-index: 9), ..., Vince D. Calhoun (UNM: University of New Mexico; H-index: 93) (4 authors)
Principal component analysis (PCA) is widely used for data reduction in group independent component analysis (ICA) of fMRI data. Commonly, group-level PCA of temporally concatenated datasets is computed prior to ICA of the group principal components. This work focuses on reducing very high dimensional temporally concatenated datasets into their group PCA space. Existing randomized PCA methods can determine the PCA subspace with minimal memory requirements and, thus, are ideal for solving large PCA...
9 Citations
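The memory-efficient group PCA idea above can be sketched with a generic randomized SVD in the spirit of Halko et al. (a sketch of the technique class, not the authors' exact method; the function name and parameters are illustrative):

```python
import numpy as np

def randomized_pca(X, k, oversample=10, power_iters=5, seed=0):
    # Randomized PCA sketch: project onto a small random range, refine it
    # with QR-stabilized power iterations, then solve a tiny SVD. Only a
    # (features x (k + oversample)) sketch is ever factored.
    rng = np.random.default_rng(seed)
    Xc = X - X.mean(axis=0)                       # center the data
    Q = rng.standard_normal((Xc.shape[1], k + oversample))
    Q, _ = np.linalg.qr(Xc @ Q)                   # initial range estimate
    for _ in range(power_iters):                  # sharpen the top subspace
        Q, _ = np.linalg.qr(Xc @ (Xc.T @ Q))
    _, s, Vt = np.linalg.svd(Q.T @ Xc, full_matrices=False)
    return Vt[:k], s[:k] ** 2 / (Xc.shape[0] - 1)

rng = np.random.default_rng(1)
X = rng.standard_normal((500, 50)) @ rng.standard_normal((50, 200))  # low-rank data
components, explained_var = randomized_pca(X, k=10)
```

On low-rank data like this, the sketched eigenvalues match a full SVD closely while never forming the full covariance matrix.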
Rogers F. Silva (UNM: University of New Mexico; H-index: 9), Sergey M. Plis (The Mind Research Network; H-index: 20), ..., Vince D. Calhoun (UNM: University of New Mexico; H-index: 93) (4 authors)
Abstract Multimodal fusion is becoming more common as it proves to be a powerful approach to identify complementary information from multimodal datasets. However, simulation of joint information is not straightforward. Published approaches mostly employ limited, provisional designs that often break the link between the model assumptions and the data for the sake of demonstrating properties of fusion techniques. This work introduces a new approach to synthetic data generation which allows full-co...
6 Citations
Dana Lahat (TAU: Tel Aviv University; H-index: 7), Jean-Francois Cardoso (ENST: Télécom ParisTech; H-index: 76), Hagit Messer (TAU: Tel Aviv University; H-index: 28) (3 authors)
Independent component analysis (ICA) and blind source separation (BSS) deal with extracting a number of mutually independent elements from a set of observed linear mixtures. Motivated by various applications, this paper considers a more general and more flexible model: the sources can be partitioned into groups exhibiting dependence within a given group but independence between two different groups. We argue that this is tantamount to considering multidimensional components as opposed to the sta...
35 Citations
Matthew Anderson (UMBC: University of Maryland, Baltimore County; H-index: 13), Tulay Adali (UMBC: University of Maryland, Baltimore County; H-index: 56), Xi-Lin Li (UMBC: University of Maryland, Baltimore County; H-index: 15) (3 authors)
In this paper, we consider the joint blind source separation (JBSS) problem and introduce a number of algorithms to solve the JBSS problem using the independent vector analysis (IVA) framework. Source separation of multiple datasets simultaneously is possible when the sources within each and every dataset are independent of one another and each source is dependent on at most one source within each of the other datasets. In addition to source separation, the IVA framework solves an essential prob...
98 Citations
Jun 1, 2011 in CVPR (Computer Vision and Pattern Recognition)
Quoc V. Le (Stanford University; H-index: 58), Will Y. Zou (Stanford University; H-index: 5), ..., Andrew Y. Ng (Stanford University; H-index: 100) (4 authors)
Previous work on action recognition has focused on adapting hand-designed local features, such as SIFT or HOG, from static images to the video domain. In this paper, we propose using unsupervised feature learning as a way to learn features directly from video data. More specifically, we present an extension of the Independent Subspace Analysis algorithm to learn invariant spatio-temporal features from unlabeled video data. We discovered that, despite its simplicity, this method performs surprisi...
771 Citations
Jun 1, 2011 in SPAWC (International Workshop on Signal Processing Advances in Wireless Communications)
Klaus Nordhausen (RMIT: RMIT University; H-index: 16), Esa Ollila (Aalto University; H-index: 23), Hannu Oja (RMIT: RMIT University; H-index: 35) (3 authors)
For assessing the separation performance (quality and accuracy) of ICA estimators, several performance indices have been introduced in the literature. The purpose of this note is to outline, review and study the properties of performance indices as well as propose some new ones. Special emphasis is put on the properties that such performance indices ought to satisfy. We categorize the indices in two separate groups that differ in the approaches they deploy in measuring the closeness of the achie...
26 Citations
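One widely used separation performance index of the kind surveyed above is the Moreau-Amari inter-symbol interference (a standard index, not necessarily among the new ones this note proposes):

```python
import numpy as np

def amari_index(P):
    # Moreau-Amari inter-symbol interference of the gain matrix P = W @ A:
    # 0 when P is a scaled permutation (perfect separation up to the usual
    # BSS ambiguities), 1 at the other extreme.
    P = np.abs(np.asarray(P, dtype=float))
    n = P.shape[0]
    rows = (P / P.max(axis=1, keepdims=True)).sum(axis=1) - 1
    cols = (P / P.max(axis=0, keepdims=True)).sum(axis=0) - 1
    return (rows.sum() + cols.sum()) / (2 * n * (n - 1))

# A scaled permutation matrix scores exactly 0:
perfect = amari_index(np.array([[0.0, 2.0], [-3.0, 0.0]]))
```

Because it normalizes each row and column by its largest entry, the index is invariant to the scaling and permutation ambiguities inherent to ICA, one of the key properties such indices ought to satisfy.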
Matthew Anderson (UMBC: University of Maryland, Baltimore County; H-index: 13), Xi-Lin Li (UMBC: University of Maryland, Baltimore County; H-index: 15), Tulay Adali (UMBC: University of Maryland, Baltimore County; H-index: 56) (3 authors)
We consider the problem of joint blind source separation of multiple datasets and introduce an effective solution to the problem. We pose the problem in an independent vector analysis (IVA) framework utilizing the multivariate Gaussian source vector distribution. We provide a new general IVA implementation using a decoupled nonorthogonal optimization algorithm and establish the connection between the new approach and another approach using second-order statistics, multiset canonical correlation ...
22 Citations
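The multivariate Gaussian IVA objective described above can be sketched as follows (a sketch of the cost only, up to dropped constants, not the paper's decoupled nonorthogonal optimizer; all names are illustrative):

```python
import numpy as np

def iva_g_cost(W_list, X_list):
    # IVA cost under a multivariate Gaussian source-component-vector (SCV)
    # model, constants dropped:
    #   sum_n 0.5*logdet Cov(scv_n) - sum_k log|det W_k|
    # W_list[k] is the N x N demixing matrix for dataset k; X_list[k] is N x T.
    Y = [W @ X for W, X in zip(W_list, X_list)]    # estimated sources
    N, T = Y[0].shape
    cost = -sum(np.linalg.slogdet(W)[1] for W in W_list)
    for n in range(N):
        scv = np.stack([Yk[n] for Yk in Y])        # K x T: the n-th SCV
        cost += 0.5 * np.linalg.slogdet(scv @ scv.T / T)[1]
    return cost

rng = np.random.default_rng(0)
X_list = [rng.standard_normal((3, 5000)) for _ in range(2)]
# Identity demixing on already-independent white sources sits near the floor;
# mixing components within a dataset raises the cost.
cost_id = iva_g_cost([np.eye(3), np.eye(3)], X_list)
```

Minimizing this cost simultaneously separates each dataset and aligns dependent components across datasets, which is how IVA sidesteps the cross-dataset permutation problem.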
Richard A. Waltz (NU: Northwestern University; H-index: 10), José Luis Morales (ITAM: Instituto Tecnológico Autónomo de México; H-index: 12), ..., Dominique Orban (NU: Northwestern University; H-index: 18) (4 authors)
An interior-point method for nonlinear programming is presented. It enjoys the flexibility of switching between a line search method that computes steps by factoring the primal-dual equations and a trust region method that uses a conjugate gradient iteration. Steps computed by direct factorization are always tried first, but if they are deemed ineffective, a trust region iteration that guarantees progress toward stationarity is invoked. To demonstrate its effectiveness, the algorithm is implemen...
496 Citations
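A feel for the problem class such interior-point nonlinear programming solvers target can be had with SciPy's open-source `trust-constr` method, which likewise combines trust-region and interior-point ideas (this only illustrates the problem class, not the algorithm in the paper; the toy problem is purely illustrative):

```python
import numpy as np
from scipy.optimize import NonlinearConstraint, minimize

# Nonconvex objective with a nonlinear inequality constraint, solved by
# SciPy's trust-region interior-point method ('trust-constr').
def rosenbrock(x):
    return (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2

ball = NonlinearConstraint(lambda x: x[0] ** 2 + x[1] ** 2, -np.inf, 1.5)
res = minimize(rosenbrock, x0=np.zeros(2), method="trust-constr",
               constraints=[ball])
# The unconstrained optimum (1, 1) is infeasible here, so the solver lands
# on the constraint boundary with a small objective value.
```

Interior-point methods matter in this context because flexible BSS objectives (like MISA's) lead to exactly this kind of smooth, constrained, nonconvex optimization.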
Taesu Kim (UCSD: University of California, San Diego; H-index: 11), Torbjørn Eltoft (University of Tromsø; H-index: 26), Te-Won Lee (UCSD: University of California, San Diego; H-index: 44) (3 authors)
In this paper, we solve an ICA problem where both source and observation signals are multivariate, thus, vectorized signals. To derive the algorithm, we define dependence between vectors as Kullback-Leibler divergence between joint probability and the product of marginal probabilities, and propose a vector density model that has a variance dependency within a source vector. The example shows that the algorithm successfully recovers the sources and it does not cause any permutation ambiguities wi...
109 Citations
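For the special case of jointly Gaussian vectors, the Kullback-Leibler dependence measure described above has a closed form, which makes the idea concrete (a sketch of the dependence measure only; the paper's variance-dependent source density is more general than the Gaussian case):

```python
import numpy as np

def gaussian_total_correlation(sigma, block_sizes):
    # KL divergence between a zero-mean Gaussian N(0, sigma) and the product
    # of its block marginals: 0.5 * (sum_i logdet Sigma_ii - logdet Sigma),
    # where Sigma_ii are the diagonal blocks given by block_sizes.
    total = -np.linalg.slogdet(sigma)[1]
    i = 0
    for b in block_sizes:
        total += np.linalg.slogdet(sigma[i:i + b, i:i + b])[1]
        i += b
    return 0.5 * total

# Independent blocks give zero dependence; cross-block correlation is penalized.
independent = gaussian_total_correlation(np.eye(4), [2, 2])
coupled = gaussian_total_correlation(np.array([[1.0, 0.5],
                                               [0.5, 1.0]]), [1, 1])
```

Driving this quantity to zero between groups, while leaving dependence within groups intact, is the common thread linking IVA, ISA, and the vector density model of this paper.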
A. Hyvärinen (Helsinki Institute for Information Technology; H-index: 53), Urs Köster (Helsinki Institute for Information Technology; H-index: 9) (2 authors)
Independent Subspace Analysis (ISA; Hyvarinen & Hoyer, 2000) is an extension of ICA. In ISA, the components are divided into subspaces and components in different subspaces are assumed independent, whereas components in the same subspace have dependencies. In this paper we describe a fixed-point algorithm for ISA estimation, formulated in analogy to FastICA. In particular we give a proof of the quadratic convergence of the algorithm, and present simulations that confirm the fast convergence, bu...
42 Citations
Cited By (1)
Kuaikuai Duan (Georgia Institute of Technology), Rogers F. Silva (GSU: Georgia State University), ..., Jingyu Liu (Georgia Institute of Technology; H-index: 30) (4 authors)
Multimodal data fusion is a topic of great interest. Several fusion methods have been proposed to investigate coherent patterns and corresponding linkages across modalities, such as joint independent component analysis (jICA), multiset canonical correlation analysis (mCCA), mCCA+jICA, and parallel ICA. jICA exploits source independence but assumes shared loading parameters. mCCA maximizes correlation linkage across modalities directly but is limited to orthogonal features. While there is no theo...