Search Results for author: Rudrasis Chakraborty

Found 41 papers, 15 papers with code

Variational Sampling of Temporal Trajectories

no code implementations18 Mar 2024 Jurijs Nazarovs, Zhichun Huang, Xingjian Zhen, Sourav Pal, Rudrasis Chakraborty, Vikas Singh

In this work, we introduce a mechanism to learn the distribution of trajectories by parameterizing the transition function $f$ explicitly as an element in a function space.

Out-of-Distribution Detection

On the Versatile Uses of Partial Distance Correlation in Deep Learning

1 code implementation20 Jul 2022 Xingjian Zhen, Zihang Meng, Rudrasis Chakraborty, Vikas Singh

Comparing the functional behavior of neural network models, whether it is a single network over time or two (or more networks) during or post-training, is an essential step in understanding what they are learning (and what they are not), and for identifying strategies for regularization or efficiency improvements.
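
For context on the quantity this paper builds on, here is a minimal numpy sketch of plain sample distance correlation between two sets of per-example features (e.g., activations of two networks on the same inputs). This is the classical statistic, not the U-centered, partial version developed in the paper; function and variable names are illustrative.

```python
import numpy as np

def distance_correlation(X, Y):
    """Sample distance correlation between two (n_samples, n_features) matrices."""
    a = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)   # pairwise distances in X
    b = np.linalg.norm(Y[:, None, :] - Y[None, :, :], axis=-1)   # pairwise distances in Y
    # double-centering of the distance matrices
    A = a - a.mean(0, keepdims=True) - a.mean(1, keepdims=True) + a.mean()
    B = b - b.mean(0, keepdims=True) - b.mean(1, keepdims=True) + b.mean()
    dcov2 = (A * B).mean()                                        # squared distance covariance
    denom = np.sqrt((A * A).mean() * (B * B).mean())
    return np.sqrt(dcov2 / denom) if denom > 0 else 0.0

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 16))
Y = X @ rng.normal(size=(16, 8)) + 0.1 * rng.normal(size=(100, 8))
print(distance_correlation(X, Y))   # near 1 when the two feature sets are strongly dependent
```

Values near 1 indicate strongly dependent feature sets; values near 0 indicate near-independence, even for features of different dimensions.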

Equivariance Allows Handling Multiple Nuisance Variables When Analyzing Pooled Neuroimaging Datasets

1 code implementation CVPR 2022 Vishnu Suresh Lokhande, Rudrasis Chakraborty, Sathya N. Ravi, Vikas Singh

Pooling multiple neuroimaging datasets across institutions often enables improvements in statistical power when evaluating associations (e.g., between risk factors and disease outcomes) that may otherwise be too weak to detect.

Causal Inference Domain Adaptation +1

Mixed Effects Neural ODE: A Variational Approximation for Analyzing the Dynamics of Panel Data

no code implementations18 Feb 2022 Jurijs Nazarovs, Rudrasis Chakraborty, Songwong Tasneeyapant, Sathya N. Ravi, Vikas Singh

Panel data involving longitudinal measurements of the same set of participants taken over multiple time points is common in studies to understand childhood development and disease modeling.

Understanding Uncertainty Maps in Vision With Statistical Testing

no code implementations CVPR 2022 Jurijs Nazarovs, Zhichun Huang, Songwong Tasneeyapant, Rudrasis Chakraborty, Vikas Singh

Quantitative descriptions of confidence intervals and uncertainties of the predictions of a model are needed in many applications in vision and machine learning.

Forward Operator Estimation in Generative Models with Kernel Transfer Operators

no code implementations1 Dec 2021 Zhichun Huang, Rudrasis Chakraborty, Vikas Singh

Generative models which use explicit density modeling (e.g., variational autoencoders, flow-based generative models) involve finding a mapping from a known distribution, e.g. Gaussian, to the unknown input distribution.
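
As background for the mapping described above, a hedged sketch of the change-of-variables log-likelihood that explicit-density models rely on: log p_X(x) = log p_Z(f(x)) + log|det df/dx|. The elementwise affine map and its parameters below are illustrative stand-ins, not the kernel transfer operator proposed in the paper.

```python
import numpy as np

def log_likelihood_affine_flow(x, scale, shift):
    """Log-likelihood of x under a toy invertible elementwise map z = scale*x + shift
    with a standard Gaussian prior on z."""
    z = scale * x + shift                                          # f(x), invertible if scale != 0
    log_pz = -0.5 * np.sum(z**2 + np.log(2 * np.pi), axis=-1)      # log N(z; 0, I)
    log_det = np.sum(np.log(np.abs(scale)))                        # log|det Jacobian| of the map
    return log_pz + log_det

x = np.random.randn(4, 3)
print(log_likelihood_affine_flow(x, scale=np.array([2.0, 0.5, 1.5]), shift=np.zeros(3)))
```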

Distribution Matching in Deep Generative Models with Kernel Transfer Operators

no code implementations29 Sep 2021 Zhichun Huang, Rudrasis Chakraborty, Vikas Singh

Generative models which use explicit density modeling (e.g., variational autoencoders, flow-based generative models) involve finding a mapping from a known distribution, e.g. Gaussian, to the unknown input distribution.

An Online Riemannian PCA for Stochastic Canonical Correlation Analysis

1 code implementation NeurIPS 2021 Zihang Meng, Rudrasis Chakraborty, Vikas Singh

We present an efficient stochastic algorithm (RSG+) for canonical correlation analysis (CCA) using a reparametrization of the projection matrices.

Attribute
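
For reference, a minimal numpy sketch of the classical batch CCA solution that a stochastic algorithm such as RSG+ approximates: whiten each view, then take the SVD of the cross-covariance. This is not the Riemannian reparametrization from the paper; names and the small ridge term are illustrative.

```python
import numpy as np

def cca(X, Y, k):
    """Classical CCA: top-k canonical correlations and projection matrices."""
    X = X - X.mean(0); Y = Y - Y.mean(0)
    n = X.shape[0]
    Sxx = X.T @ X / n + 1e-6 * np.eye(X.shape[1])    # regularized covariances
    Syy = Y.T @ Y / n + 1e-6 * np.eye(Y.shape[1])
    Sxy = X.T @ Y / n
    def inv_sqrt(S):                                  # inverse matrix square root
        w, V = np.linalg.eigh(S)
        return V @ np.diag(1.0 / np.sqrt(w)) @ V.T
    T = inv_sqrt(Sxx) @ Sxy @ inv_sqrt(Syy)
    U, s, Vt = np.linalg.svd(T)
    return s[:k], inv_sqrt(Sxx) @ U[:, :k], inv_sqrt(Syy) @ Vt.T[:, :k]

rng = np.random.default_rng(0)
Z = rng.normal(size=(500, 4))                         # shared latent signal
X = Z @ rng.normal(size=(4, 10)) + 0.1 * rng.normal(size=(500, 10))
Y = Z @ rng.normal(size=(4, 8)) + 0.1 * rng.normal(size=(500, 8))
corrs, Wx, Wy = cca(X, Y, k=3)
print(corrs)                                          # leading canonical correlations, near 1 here
```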

VolterraNet: A higher order convolutional network with group equivariance for homogeneous manifolds

1 code implementation5 Jun 2021 Monami Banerjee, Rudrasis Chakraborty, Jose Bouza, Baba C. Vemuri

In this paper, we present a novel higher order Volterra convolutional neural network (VolterraNet) for data defined as samples of functions on Riemannian homogeneous spaces.

Translation

Simpler Certified Radius Maximization by Propagating Covariances

1 code implementation CVPR 2021 Xingjian Zhen, Rudrasis Chakraborty, Vikas Singh

One strategy for adversarially training a robust model is to maximize its certified radius -- the neighborhood around a given training sample for which the model's prediction remains unchanged.
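
For context, the certified radius under Gaussian randomized smoothing (Cohen et al., 2019) is commonly taken as R = sigma/2 * (Phi^{-1}(pA) - Phi^{-1}(pB)), where pA and pB bound the top-two class probabilities of the smoothed classifier. Whether the paper uses this exact bound is an assumption here; the sketch below is only a reference implementation of that formula.

```python
from scipy.stats import norm

def certified_radius(p_a, p_b, sigma):
    """Certified L2 radius of a Gaussian-smoothed classifier with noise level sigma."""
    return 0.5 * sigma * (norm.ppf(p_a) - norm.ppf(p_b))

print(certified_radius(p_a=0.9, p_b=0.05, sigma=0.25))  # ~0.37 in input-space L2 norm
```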

Stochastic Canonical Correlation Analysis: A Riemannian Approach

no code implementations1 Jan 2021 Zihang Meng, Rudrasis Chakraborty, Vikas Singh

We present an efficient stochastic algorithm (RSG+) for canonical correlation analysis (CCA) derived via a differential geometric perspective of the underlying optimization task.

Attribute

Can Kernel Transfer Operators Help Flow based Generative Models?

no code implementations1 Jan 2021 Zhichun Huang, Rudrasis Chakraborty, Xingjian Zhen, Vikas Singh

Flow-based generative models refer to deep generative models with tractable likelihoods, and offer several attractive properties including efficient density estimation and sampling.

Density Estimation

Flow-based Generative Models for Learning Manifold to Manifold Mappings

1 code implementation18 Dec 2020 Xingjian Zhen, Rudrasis Chakraborty, Liu Yang, Vikas Singh

Partly due to this gap, there are also no modality transfer/translation models for manifold-valued data, whereas numerous such methods based on generative models are available for natural images.

C-SURE: Shrinkage Estimator and Prototype Classifier for Complex-Valued Deep Learning

no code implementations22 Jun 2020 Yifei Xing, Rudrasis Chakraborty, Minxuan Duan, Stella Yu

We compare C-SURE with SurReal and a real-valued baseline on complex-valued MSTAR and RadioML datasets.

ManifoldNorm: Extending normalizations on Riemannian Manifolds

no code implementations30 Mar 2020 Rudrasis Chakraborty

Another remedy for training instabilities, including gradient explosion, is to use normalization techniques such as batch norm and group norm.
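
As a Euclidean reference point for what is being generalized, a minimal numpy sketch of standard batch normalization (names and shapes are illustrative; the manifold-valued version in the paper replaces the mean and variance with their Riemannian counterparts).

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    """Normalize each feature over the batch, then apply learnable scale and shift."""
    mean = x.mean(axis=0)                       # per-feature batch mean
    var = x.var(axis=0)                         # per-feature batch variance
    x_hat = (x - mean) / np.sqrt(var + eps)
    return gamma * x_hat + beta

x = np.random.randn(32, 8)
print(batch_norm(x, gamma=np.ones(8), beta=np.zeros(8)).std(axis=0))  # ~1 per feature
```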

A Deep Learning Approach for Meibomian Gland Atrophy Evaluation in Meibography Images

1 code implementation Translational Vision Science & Technology 2019 Jiayun Wang, Thao N. Yeh, Rudrasis Chakraborty, Stella X. Yu, Meng C. Lin

The development set was used to train and tune the deep learning model, while the evaluation set was used to evaluate the performance of the model.

Orthogonal Convolutional Neural Networks

1 code implementation CVPR 2020 Jiayun Wang, Yubei Chen, Rudrasis Chakraborty, Stella X. Yu

We develop an efficient approach to impose filter orthogonality on a convolutional layer based on the doubly block-Toeplitz matrix representation of the convolutional kernel, instead of the common kernel orthogonality approach, which we show is necessary but not sufficient for ensuring orthogonal convolutions.

Image Classification Image Retrieval
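
For the Orthogonal Convolutional Neural Networks entry above, a hedged PyTorch sketch of the stride-1 orthogonality condition on the doubly block-Toeplitz view: the kernel convolved with itself (with full padding) should match an identity-like target that is the identity matrix at the center spatial tap and zero elsewhere. This is a reconstruction in the spirit of the paper, not its exact regularizer.

```python
import torch
import torch.nn.functional as F

def orth_conv_penalty(kernel):
    """Deviation of a stride-1 conv kernel (out_ch, in_ch, k, k) from orthogonality."""
    out_ch, _, k, _ = kernel.shape
    self_conv = F.conv2d(kernel, kernel, padding=k - 1)    # shape (out_ch, out_ch, 2k-1, 2k-1)
    target = torch.zeros_like(self_conv)
    target[:, :, k - 1, k - 1] = torch.eye(out_ch)         # identity at the center tap
    return ((self_conv - target) ** 2).sum()

K = torch.randn(16, 3, 3, 3, requires_grad=True)
print(orth_conv_penalty(K))   # add to the task loss as an orthogonality regularizer
```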

A GMM based algorithm to generate point-cloud and its application to neuroimaging

no code implementations5 Nov 2019 Liu Yang, Rudrasis Chakraborty

Experimental validation has been performed to show that the proposed scheme can generate new 3D structures using interpolation techniques, i.e., given two 3D structures represented as point-clouds, we can generate point-clouds in between.

An "augmentation-free" rotation invariant classification scheme on point-cloud and its application to neuroimaging

no code implementations5 Nov 2019 Liu Yang, Rudrasis Chakraborty

Though 3D point-cloud processing is not a "go-to" choice in the medical imaging community, it is a canonical way to preserve rotation invariance.

Data Augmentation General Classification

POIRot: A rotation invariant omni-directional pointnet

no code implementations29 Oct 2019 Liu Yang, Rudrasis Chakraborty, Stella X. Yu

Our proposed model is rotationally invariant and can preserve geometric shape of a 3D point-cloud.

Data Augmentation Point Cloud Segmentation

SurReal: Complex-Valued Learning as Principled Transformations on a Scaling and Rotation Manifold

1 code implementation18 Oct 2019 Rudrasis Chakraborty, Yifei Xing, Stella Yu

We propose to extend the property, rather than the form, of real-valued functions to the complex domain.

Dilated Convolutional Neural Networks for Sequential Manifold-valued Data

1 code implementation ICCV 2019 Xingjian Zhen, Rudrasis Chakraborty, Nicholas Vogt, Barbara B. Bendlin, Vikas Singh

Efforts are underway to study ways via which the power of deep neural networks can be extended to non-standard data types such as structured data (e.g., graphs) or manifold-valued data (e.g., unit vectors or special matrices).

Spatial Transformer for 3D Point Clouds

1 code implementation26 Jun 2019 Jiayun Wang, Rudrasis Chakraborty, Stella X. Yu

We propose a novel end-to-end approach to learn different non-rigid transformations of the input point cloud so that optimal local neighborhoods can be adopted at each layer.

Semantic Segmentation

SurReal: Fréchet Mean and Distance Transform for Complex-Valued Deep Learning

1 code implementation24 Jun 2019 Rudrasis Chakraborty, Jiayun Wang, Stella X. Yu

On RadioML, our model achieves comparable RF modulation classification accuracy at 10% of the baseline model size.

General Classification

MANIFOLDNET: A DEEP NEURAL NETWORK FOR MANIFOLD-VALUED DATA

no code implementations ICLR 2019 Rudrasis Chakraborty, Jose Bouza, Jonathan Manton, Baba C. Vemuri

To this end, we present a provably convergent recursive computation of the wFM of the given data, where the weights, which make up the convolution mask, are to be learned.

General Classification Image Reconstruction +1
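
To make the recursive wFM concrete, a minimal numpy sketch on the unit hypersphere (a toy choice of manifold; in ManifoldNet the manifold is problem-dependent and the weights are learned): each new sample moves the running estimate along the geodesic by its relative weight.

```python
import numpy as np

def slerp(p, q, t):
    """Point at fraction t along the geodesic from p to q on the unit sphere."""
    theta = np.arccos(np.clip(p @ q, -1.0, 1.0))
    if theta < 1e-8:
        return p
    return (np.sin((1 - t) * theta) * p + np.sin(t * theta) * q) / np.sin(theta)

def recursive_wfm(points, weights):
    """Incremental weighted Frechet mean estimator (antipodal pairs ignored in this toy)."""
    m, w_sum = points[0], weights[0]
    for x, w in zip(points[1:], weights[1:]):
        w_sum += w
        m = slerp(m, x, w / w_sum)        # step size = relative weight of the new sample
    return m

pts = np.random.randn(50, 3)
pts /= np.linalg.norm(pts, axis=1, keepdims=True)
print(recursive_wfm(pts, np.ones(50)))
```

With equal weights the step size at sample k is 1/k, the manifold analogue of the running Euclidean mean.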

ManifoldNet: A Deep Network Framework for Manifold-valued Data

1 code implementation11 Sep 2018 Rudrasis Chakraborty, Jose Bouza, Jonathan Manton, Baba C. Vemuri

Thus, there is a need to generalize deep neural networks to cope with input data that reside on curved manifolds, where vector space operations are not naturally admissible.

Dimensionality Reduction

A mixture model for aggregation of multiple pre-trained weak classifiers

no code implementations31 May 2018 Rudrasis Chakraborty, Chun-Hao Yang, Baba C. Vemuri

An alternative way to increase performance is to learn multiple weak classifiers and boost them using a boosting algorithm or a variant thereof.

General Classification

A CNN for homogeneous Riemannian manifolds with applications to Neuroimaging

no code implementations14 May 2018 Rudrasis Chakraborty, Monami Banerjee, Baba C. Vemuri

(ii) As a corollary, we prove the equivariance of the correlation operation to group actions admitted by the input domains, which are Riemannian homogeneous manifolds.

Dictionary Learning and Sparse Coding on Statistical Manifolds

no code implementations3 May 2018 Rudrasis Chakraborty, Monami Banerjee, Baba C. Vemuri

In this paper, we propose a novel information theoretic framework for dictionary learning (DL) and sparse coding (SC) on a statistical manifold (the manifold of probability distributions).

Dictionary Learning General Classification
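
For reference, the Euclidean objective that the statistical-manifold formulation generalizes is the classical dictionary learning / sparse coding problem min_{D,A} sum_i ||x_i - D a_i||_2^2 + lambda * ||a_i||_1. A minimal scikit-learn sketch of that vector-space version (hyperparameters and data are illustrative):

```python
import numpy as np
from sklearn.decomposition import DictionaryLearning

X = np.random.randn(200, 20)                      # stand-in data; rows are signals x_i
dl = DictionaryLearning(n_components=30, alpha=1.0, max_iter=50, random_state=0)
codes = dl.fit_transform(X)                       # sparse codes a_i
print(dl.components_.shape, np.mean(codes != 0))  # dictionary D and fraction of nonzero codes
```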

Generative Adversarial Network based Autoencoder: Application to fault detection problem for closed loop dynamical systems

no code implementations15 Apr 2018 Indrasis Chakraborty, Rudrasis Chakraborty, Draguna Vrabie

Traditional classifier-based methods do not perform well because of the inherent difficulty of detecting system-level faults in closed-loop dynamical systems.

Fault Detection Generative Adversarial Network

Sparse Exact PGA on Riemannian Manifolds

no code implementations ICCV 2017 Monami Banerjee, Rudrasis Chakraborty, Baba C. Vemuri

In this paper, we present a novel generalization of SPCA, called sparse exact PGA (SEPGA) that can cope with manifold-valued input data and respect the intrinsic geometry of the underlying manifold.

Computational Efficiency Dimensionality Reduction

A Geometric Framework for Statistical Analysis of Trajectories With Distinct Temporal Spans

no code implementations ICCV 2017 Rudrasis Chakraborty, Vikas Singh, Nagesh Adluru, Baba C. Vemuri

Finally, by using existing algorithms for recursive Frechet mean and exact principal geodesic analysis on the hypersphere, we present several experiments on synthetic and real (vision and medical) data sets showing how group testing on such diversely sampled longitudinal data is possible by analyzing the reconstructed data in the subspace spanned by the first few PGs.

Statistics on the (compact) Stiefel manifold: Theory and Applications

no code implementations31 Jul 2017 Rudrasis Chakraborty, Baba Vemuri

The Stiefel manifold is a Riemannian homogeneous space but not a symmetric space.

Intrinsic Grassmann Averages for Online Linear and Robust Subspace Learning

no code implementations CVPR 2017 Rudrasis Chakraborty, Soren Hauberg, Baba C. Vemuri

We demonstrate competitive performance of our proposed online subspace algorithm on one synthetic and two real data sets.

Intrinsic Grassmann Averages for Online Linear, Robust and Nonlinear Subspace Learning

no code implementations3 Feb 2017 Rudrasis Chakraborty, Søren Hauberg, Baba C. Vemuri

In this paper, we present a geometric framework for computing the principal linear subspaces in both situations as well as for the robust PCA case, that amounts to computing the intrinsic average on the space of all subspaces: the Grassmann manifold.

Dimensionality Reduction

A Nonlinear Regression Technique for Manifold Valued Data With Applications to Medical Image Analysis

no code implementations CVPR 2016 Monami Banerjee, Rudrasis Chakraborty, Edward Ofori, Michael S. Okun, David E. Viallancourt, Baba C. Vemuri

With a few exceptions, most existing methods of regression for manifold-valued data are limited to geodesic regression, which is a generalization of linear regression in vector spaces.

regression
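
Geodesic regression, mentioned above, fits the model y_i ≈ Exp_p(x_i * v), a manifold analogue of linear regression in which the base point p plays the role of the intercept and the tangent vector v the role of the slope. A minimal numpy sketch on the unit sphere, evaluating the fit error for a given (p, v); fitting is left to an optimizer, and this is not the paper's nonlinear method.

```python
import numpy as np

def sphere_exp(p, u):
    """Exponential map on the unit sphere: shoot from p along tangent vector u."""
    nu = np.linalg.norm(u)
    if nu < 1e-12:
        return p
    return np.cos(nu) * p + np.sin(nu) * (u / nu)

def sphere_dist(a, b):
    return np.arccos(np.clip(a @ b, -1.0, 1.0))

def geodesic_regression_loss(p, v, xs, ys):
    """Sum of squared geodesic residuals of the model x -> Exp_p(x * v)."""
    return sum(sphere_dist(sphere_exp(p, x * v), y) ** 2 for x, y in zip(xs, ys))

p = np.array([0.0, 0.0, 1.0])                  # base point ("intercept")
v = np.array([0.3, 0.0, 0.0])                  # tangent vector at p ("slope")
xs = np.linspace(0, 1, 10)
ys = [sphere_exp(p, x * v) for x in xs]        # noiseless data on the model
print(geodesic_regression_loss(p, v, xs, ys))  # ~0 for the generating parameters
```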

An information theoretic formulation of the Dictionary Learning and Sparse Coding Problems on Statistical Manifolds

no code implementations23 Apr 2016 Rudrasis Chakraborty, Monami Banerjee, Victoria Crawford, Baba C. Vemuri

In this work, we propose a novel information theoretic framework for dictionary learning (DL) and sparse coding (SC) on a statistical manifold (the manifold of probability distributions).

Dictionary Learning General Classification

An efficient Exact-PGA algorithm for constant curvature manifolds

no code implementations CVPR 2016 Rudrasis Chakraborty, Dohyung Seo, Baba C. Vemuri

Recently, an alternative called exact PGA was proposed, which attempts to solve the optimization without any linearization.

Recursive Frechet Mean Computation on the Grassmannian and its Applications to Computer Vision

no code implementations ICCV 2015 Rudrasis Chakraborty, Baba C. Vemuri

In the limit as the number of samples tends to infinity, we prove that GiFME converges to the FM (this is called the weak consistency result on the Grassmann manifold).

Action Recognition Face Recognition +1
