Jayaraman J. Thiagarajan

Lawrence Livermore National Laboratory

Machine learning · Pattern recognition · Mathematics · Computer science · Artificial neural network

140 Publications

14 H-index

791 Citations


Publications 146

Newest

#1 Rushil Anirudh (LLNL: Lawrence Livermore National Laboratory), H-Index: 6

#2 Jayaraman J. Thiagarajan (LLNL: Lawrence Livermore National Laboratory), H-Index: 14

Last. Peer-Timo Bremer (LLNL: Lawrence Livermore National Laboratory), H-Index: 26

view all 4 authors...

In the past few years, Generative Adversarial Networks (GANs) have dramatically advanced our ability to represent and parameterize high-dimensional, non-linear image manifolds. As a result, they have been widely adopted across a variety of applications, ranging from challenging inverse problems like image completion, to problems such as anomaly detection and adversarial defense. A recurring theme in many of these applications is the notion of projecting an image observation onto the manifold tha...
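The projection mentioned in this abstract is commonly posed as an optimization over the generator's latent code. A minimal sketch with a toy linear "generator" standing in for a trained GAN (the names, dimensions, and step sizes here are illustrative, not from the paper):

```python
import numpy as np

# Toy stand-in for a trained GAN generator: a fixed linear map from a
# 2-D latent space to a 5-D "image" space. Illustrative only.
rng = np.random.default_rng(0)
W = rng.normal(size=(5, 2))

def G(z):
    return W @ z

def project_onto_manifold(x, steps=2000, lr=0.01):
    """Gradient descent on z to minimize ||G(z) - x||^2, i.e. find the
    point on the generator's range closest to the observation x."""
    z = np.zeros(2)
    for _ in range(steps):
        grad = 2.0 * W.T @ (G(z) - x)   # analytic gradient for linear G
        z -= lr * grad
    return z

x = rng.normal(size=5)                  # an observation off the manifold
z_hat = project_onto_manifold(x)
```

For a real GAN one would backpropagate through the generator with automatic differentiation rather than using this closed-form gradient, but the optimization loop is the same shape.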

#1 Uday Shankar Shanthamallu (ASU: Arizona State University), H-Index: 2

#2 Jayaraman J. Thiagarajan (LLNL: Lawrence Livermore National Laboratory), H-Index: 14

Last. Andreas Spanias, H-Index: 28

view all 3 authors...

Machine learning models that can exploit the inherent structure in data have gained prominence. In particular, there is a surge in deep learning solutions for graph-structured data, due to its widespread applicability in several fields. Graph attention networks (GAT), a recent addition to the broad class of feature learning models for graphs, utilize the attention mechanism to efficiently learn continuous vector representations for semi-supervised learning problems. In this paper, we perform a ...
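The attention mechanism referenced here, as published for GAT, scores each edge with a shared linear form and normalizes the scores over each node's neighborhood. A toy single-head NumPy sketch (array names and shapes are illustrative):

```python
import numpy as np

def leaky_relu(x, alpha=0.2):
    return np.where(x > 0, x, alpha * x)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def gat_layer(H, adj, W, a):
    """Single-head GAT layer: score e_ij = LeakyReLU(a^T [Wh_i || Wh_j]),
    softmax the scores over each neighborhood, then aggregate."""
    Z = H @ W.T                          # linearly transformed features
    out = np.zeros_like(Z)
    for i in range(H.shape[0]):
        nbrs = np.flatnonzero(adj[i])    # includes self-loop if adj[i, i] = 1
        scores = np.array([leaky_relu(a @ np.concatenate([Z[i], Z[j]]))
                           for j in nbrs])
        out[i] = softmax(scores) @ Z[nbrs]  # attention-weighted sum
    return out
```

A production implementation would vectorize the neighborhood loop and use multiple attention heads; this loop form just makes the per-edge scoring explicit.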

#1 Bindya Venkatesh (ASU: Arizona State University), H-Index: 1


Last. Prasanna Sattigeri (ITU: IT University of Copenhagen)

view all 4 authors...

The hypothesis that sub-network initializations (lottery) exist within the initializations of over-parameterized networks, which when trained in isolation produce highly generalizable models, has led to crucial insights into network initialization and has enabled computationally efficient inferencing. In order to realize the full potential of these pruning strategies, particularly when utilized in transfer learning scenarios, it is necessary to understand the behavior of winning tickets when the...
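The pruning recipe this abstract builds on, from the lottery ticket hypothesis literature, keeps the largest-magnitude weights of a trained network and rewinds them to their original initialization. A minimal NumPy sketch on a single weight matrix (the sizes and "training" perturbation are made up for illustration):

```python
import numpy as np

def winning_ticket_mask(trained_w, sparsity):
    """Mask keeping the top (1 - sparsity) fraction of weights by
    trained magnitude -- the candidate 'winning ticket'."""
    k = int(trained_w.size * (1 - sparsity))
    thresh = np.sort(np.abs(trained_w).ravel())[::-1][k - 1]
    return (np.abs(trained_w) >= thresh).astype(float)

rng = np.random.default_rng(1)
w_init = rng.normal(size=(4, 4))                         # original init
w_trained = w_init + rng.normal(scale=0.5, size=(4, 4))  # after "training"
mask = winning_ticket_mask(w_trained, sparsity=0.75)
w_ticket = mask * w_init                # rewind survivors to their init
```

The ticket `w_ticket` would then be retrained in isolation; in transfer settings the open question the paper targets is how such masks behave on new tasks.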

Scalable Topological Data Analysis and Visualization for Evaluating Data-Driven Models in Scientific Applications

#1 Shusen Liu (LLNL: Lawrence Livermore National Laboratory), H-Index: 10

#2 Rushil Anirudh (LLNL: Lawrence Livermore National Laboratory), H-Index: 6

Last. Valerio Pascucci (UofU: University of Utah), H-Index: 45

view all 16 authors...

With the rapid adoption of machine learning techniques for large-scale applications in science and engineering comes the convergence of two grand challenges in visualization. First, the utilization of black box models (e.g., deep neural networks) calls for advanced techniques in exploring and interpreting model behaviors. Second, the rapid growth in computing has produced enormous datasets that require techniques that can handle millions or more samples. Although some solutions to these interpre...

Transfer Learning as a Tool for Reducing Simulation Bias: Application to Inertial Confinement Fusion

#1 Bogdan Kustowski (LLNL: Lawrence Livermore National Laboratory)

#2 Jim Gaffney (LLNL: Lawrence Livermore National Laboratory), H-Index: 6

Last. Rushil Anirudh (LLNL: Lawrence Livermore National Laboratory), H-Index: 6

view all 6 authors...

We adopt a technique, known in the machine learning community as transfer learning, to reduce the bias of computer simulations using very sparse experimental data. Unlike Bayesian calibration, which is commonly used to estimate the simulation bias, the transfer learning approach discussed in this article involves calculating an artificial neural network surrogate model of the simulations. Assuming that the simulation code correctly predicts the trends in the experimental data but it is subjec...
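Reduced to its simplest form, the idea is: fit a surrogate on plentiful simulations, then recalibrate only part of it against a handful of experiments. A toy NumPy sketch (a linear "surrogate" stands in for the paper's neural network; all numbers are made up):

```python
import numpy as np

rng = np.random.default_rng(2)

# Plentiful "simulation" data: right trend, systematically biased.
x_sim = rng.uniform(0.0, 1.0, 200)
y_sim = 3.0 * x_sim
slope, _ = np.polyfit(x_sim, y_sim, 1)   # surrogate fit on simulations

# Very sparse experiments: same trend, unknown offset bias.
x_exp = np.array([0.2, 0.5, 0.8])
y_exp = 3.0 * x_exp + 1.0

# Transfer step: freeze the simulation-learned trend and recalibrate
# only the intercept against the few experimental points.
offset = float(np.mean(y_exp - slope * x_exp))

def corrected(x):
    return slope * x + offset
```

With a neural network surrogate the analogue is freezing the early layers learned from simulations and retraining only the last layer(s) on the experimental points.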

#1 Rushil Anirudh (LLNL: Lawrence Livermore National Laboratory), H-Index: 6

#2 Jayaraman J. Thiagarajan, H-Index: 14

Last. Brian Spears, H-Index: 24

view all 4 authors...

Neural networks have become very popular in surrogate modeling because of their ability to characterize arbitrary, high-dimensional functions in a data-driven fashion. This paper advocates for training surrogates that are consistent with the physical manifold -- i.e., predictions are always physically meaningful -- and are cyclically consistent -- i.e., predictions of the surrogate, when passed through an independently trained inverse model, give back the original input parameters. ...
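The cyclic-consistency constraint reads concretely as a forward model f, an inverse model g, and a penalty that vanishes when g(f(x)) returns the inputs. A toy sketch with linear maps (the matrices are illustrative, not from the paper):

```python
import numpy as np

# Toy forward surrogate f (inputs -> observables) and an independently
# obtained inverse model g (observables -> inputs). Illustrative only.
F = np.array([[2.0, 0.0],
              [0.0, 0.5]])
G_inv = np.linalg.inv(F)

def f(x):
    return F @ x

def g(y):
    return G_inv @ y

def cyclic_consistency_loss(x):
    """||g(f(x)) - x||^2: zero exactly when the inverse model
    recovers the surrogate's input parameters."""
    return float(np.sum((g(f(x)) - x) ** 2))
```

In training, this penalty would be added to the usual data-fit loss so that surrogate and inverse are pushed to agree on the training distribution.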

#1 Rushil Anirudh (LLNL: Lawrence Livermore National Laboratory), H-Index: 6

#2 Jayaraman J. Thiagarajan, H-Index: 14

Last. Timo Bremer, H-Index: 4

view all 4 authors...

In the past few years, generative models like Generative Adversarial Networks (GANs) have dramatically advanced our ability to represent and parameterize high-dimensional, non-linear image manifolds. As a result, they have been widely adopted across a variety of applications, ranging from challenging inverse problems like image completion, to being used as a prior in problems such as anomaly detection and adversarial defense. A recurring theme in many of these applications is the notion of proje...

#1 J. Luc Peterson, H-Index: 18

#2 Rushil Anirudh, H-Index: 6

Last. Jae-Seung Yeom, H-Index: 8

view all 21 authors...

With the growing complexity of computational and experimental facilities, many scientific researchers are turning to machine learning (ML) techniques to analyze large scale ensemble data. With complexities such as multi-component workflows, heterogeneous machine architectures, parallel file systems, and batch scheduling, care must be taken to facilitate this analysis in a high performance computing (HPC) environment. In this paper, we present Merlin, a workflow framework to enable large ML-frien...

#1 Huan Song (Bosch), H-Index: 4

#2 Jayaraman J. Thiagarajan (LLNL: Lawrence Livermore National Laboratory), H-Index: 14

Inferencing with graph data necessitates the mapping of its nodes into a vector space, where the relationships are preserved. However, with multi-layered graphs, where multiple types of relationships exist for the same set of nodes, it is crucial to exploit the information shared between layers, in addition to the distinct aspects of each layer. In this paper, we propose a novel approach that first obtains node embeddings in all layers jointly via DeepWalk on a supra graph, which allows interact...
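A supra graph, as used here, stacks one copy of the node set per layer and couples each node to its replicas in the other layers, so that random walks (and hence DeepWalk contexts) can cross layers. A minimal NumPy sketch (the layer adjacencies and uniform coupling weight are illustrative):

```python
import numpy as np

def supra_adjacency(layers, coupling=1.0):
    """Block matrix with each layer's adjacency on the diagonal and
    identity couplings linking a node to its copies in other layers."""
    L, N = len(layers), layers[0].shape[0]
    A = np.zeros((L * N, L * N))
    for l, adj in enumerate(layers):
        A[l*N:(l+1)*N, l*N:(l+1)*N] = adj
    for l1 in range(L):
        for l2 in range(L):
            if l1 != l2:
                A[l1*N:(l1+1)*N, l2*N:(l2+1)*N] = coupling * np.eye(N)
    return A

def random_walk(A, start, length, rng):
    """Uniform random walk on the supra graph; the inter-layer edges
    let a walk hop between copies of the same node."""
    walk = [start]
    for _ in range(length):
        nbrs = np.flatnonzero(A[walk[-1]])
        walk.append(rng.choice(nbrs))
    return walk
```

DeepWalk would then feed such walks to a skip-gram model to produce the joint node embeddings.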

#1 Jayaraman J. Thiagarajan (LLNL: Lawrence Livermore National Laboratory), H-Index: 14

#2 Satyananda Kashyap (IBM), H-Index: 1

Last. Alexandros Karargyris (IBM), H-Index: 18

view all 3 authors...

Weakly supervised instance labeling using only image-level labels, in lieu of expensive fine-grained pixel annotations, is crucial in several applications including medical image analysis. In contrast to conventional instance segmentation in computer vision, the problems that we consider are characterized by a small number of training images and non-local patterns that lead to the diagnosis. In this paper, we explore the use of multiple instance learning (MIL) to design an instance label generat...
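Under the standard MIL assumption, an image (bag) is positive if at least one of its regions (instances) is, so image-level supervision constrains a pooled instance score. A minimal sketch with max pooling (the scores below are made up):

```python
import numpy as np

def bag_prediction(instance_scores):
    """Standard MIL pooling: a bag is positive iff at least one
    instance is, so take the max over instance scores."""
    return float(np.max(instance_scores))

# Per-region scores as some model might emit them; only the
# image-level (bag) label is available for supervision.
pos_bag = np.array([0.10, 0.05, 0.90, 0.20])  # one suspicious region
neg_bag = np.array([0.10, 0.20, 0.05])        # no suspicious regions
```

Training against bag labels through this pooling is what yields the instance-level (region) labels as a by-product.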
