Search Results for author: Sergey M. Plis

Found 13 papers, 8 papers with code

Tasting the cake: evaluating self-supervised generalization on out-of-distribution multimodal MRI data

1 code implementation • 29 Mar 2021 • Alex Fedorov, Eloy Geenjaar, Lei Wu, Thomas P. DeRamus, Vince D. Calhoun, Sergey M. Plis

We show that self-supervised models are not as robust as their results on natural-image benchmarks would suggest, and can be outperformed by supervised learning with dropout.

Self-Supervised Learning

Efficient Distributed Auto-Differentiation

no code implementations • 18 Feb 2021 • Bradley T. Baker, Vince D. Calhoun, Barak Pearlmutter, Sergey M. Plis

Although distributed machine learning has opened up numerous frontiers of research, the separation of large models across different devices, nodes, and sites can invite significant communication overhead, making reliable training difficult.
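One way to picture the communication issue: for a dense layer, reverse-mode auto-differentiation already yields the weight gradient as an outer product of the backpropagated error and the layer input, so sites can share the two small factor vectors instead of the full gradient matrix. A minimal numpy sketch of that idea, with illustrative shapes and names (this is an assumption-laden toy, not the paper's algorithm):

```python
import numpy as np

rng = np.random.default_rng(0)
out_dim, in_dim, n_sites = 64, 256, 4

# Each site sends the two backprop factors (out_dim + in_dim numbers)
# instead of the full out_dim x in_dim gradient matrix.
site_factors = []
for _ in range(n_sites):
    x = rng.normal(size=(in_dim, 1))       # layer input at this site
    delta = rng.normal(size=(out_dim, 1))  # backpropagated error signal
    site_factors.append((delta, x))

# The aggregator rebuilds each outer product and averages across sites.
avg_grad = sum(d @ x.T for d, x in site_factors) / n_sites

full_cost = n_sites * out_dim * in_dim       # naive per-round communication
factor_cost = n_sites * (out_dim + in_dim)   # factored communication
print(avg_grad.shape, factor_cost / full_cost)
```

The averaged result is exactly the mean gradient, while the traffic shrinks from O(out·in) to O(out+in) per site per layer.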


On self-supervised multi-modal representation learning: An application to Alzheimer's disease

1 code implementation • 25 Dec 2020 • Alex Fedorov, Lei Wu, Tristan Sylvain, Margaux Luck, Thomas P. DeRamus, Dmitry Bleklov, Sergey M. Plis, Vince D. Calhoun

In this paper, we introduce a way to exhaustively consider multimodal architectures for contrastive self-supervised fusion of fMRI and MRI of AD patients and controls.

Classification • General Classification • +1
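Contrastive multimodal fusion of this kind is commonly built on an InfoNCE-style objective: matched cross-modal pairs in a batch are positives, all other pairings are negatives. A hedged numpy sketch (the function name and temperature are illustrative assumptions, not the paper's exact loss):

```python
import numpy as np

def info_nce(z1, z2, temperature=0.1):
    """InfoNCE-style contrastive loss between two modality embeddings:
    row i of z1 and row i of z2 form the positive pair."""
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    logits = z1 @ z2.T / temperature               # pairwise similarities
    logits -= logits.max(axis=1, keepdims=True)    # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))            # -log p(matched pair)

rng = np.random.default_rng(1)
z = rng.normal(size=(8, 16))
loss_aligned = info_nce(z, z + 0.01 * rng.normal(size=z.shape))
loss_random = info_nce(z, rng.normal(size=z.shape))
```

Well-aligned modality embeddings give a much lower loss than unrelated ones, which is exactly the signal the fusion model is trained on.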

Whole MILC: generalizing learned dynamics across tasks, datasets, and populations

1 code implementation • 29 Jul 2020 • Usman Mahmood, Md Mahfuzur Rahman, Alex Fedorov, Noah Lewis, Zening Fu, Vince D. Calhoun, Sergey M. Plis

In this paper we present a novel self-supervised training scheme that reinforces whole-sequence mutual information local to context (whole MILC).

Feature Importance

Multidataset Independent Subspace Analysis with Application to Multimodal Fusion

1 code implementation • 11 Nov 2019 • Rogers F. Silva, Sergey M. Plis, Tulay Adali, Marios S. Pattichis, Vince D. Calhoun

In the last two decades, unsupervised latent variable models, blind source separation (BSS) especially, have enjoyed a strong reputation for the interpretable features they produce.

Combinatorial Optimization • Latent Variable Models

Improved Differentially Private Decentralized Source Separation for fMRI Data

no code implementations • 28 Oct 2019 • Hafiz Imtiaz, Jafar Mohammadi, Rogers Silva, Bradley Baker, Sergey M. Plis, Anand D. Sarwate, Vince Calhoun

In this work, we propose a differentially private algorithm for performing ICA in a decentralized data setting.
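The general recipe behind decentralized differentially private algorithms is to bound each site's contribution (clipping) and add calibrated noise before anything leaves the site. A minimal sketch using the standard Gaussian mechanism; the clipping bound and privacy parameters are illustrative, and this is not the paper's ICA-specific algorithm:

```python
import numpy as np

def gaussian_mechanism(stat, l2_sensitivity, epsilon, delta, rng):
    """Classic (epsilon, delta)-DP Gaussian mechanism: add calibrated
    noise to a statistic before sharing it with the aggregator."""
    sigma = l2_sensitivity * np.sqrt(2 * np.log(1.25 / delta)) / epsilon
    return stat + rng.normal(scale=sigma, size=stat.shape)

rng = np.random.default_rng(4)
# Each site holds a local statistic; only noisy versions leave the site.
sites = [rng.normal(size=100) for _ in range(5)]
clip = 1.0
noisy = []
for s in sites:
    s = s * min(1.0, clip / np.linalg.norm(s))  # clip to L2 norm <= clip
    noisy.append(gaussian_mechanism(s, 2 * clip, 1.0, 1e-5, rng))
avg = np.mean(noisy, axis=0)
```

Averaging across sites also averages the noise down, which is why decentralized aggregation can recover useful accuracy under a fixed privacy budget.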

Run, skeleton, run: skeletal model in a physics-based simulation

1 code implementation • 18 Nov 2017 • Mikhail Pavlov, Sergey Kolesnikov, Sergey M. Plis

In this paper, we present our approach to the physics-based reinforcement learning challenge "Learning to Run", whose objective is to train a physiologically based human model to navigate a complex obstacle course as quickly as possible.

Policy Gradient Methods
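The simplest policy gradient method, REINFORCE, captures the core idea behind such approaches: sample actions from a parameterized policy and nudge the parameters toward actions that earned higher reward. A toy two-action bandit sketch (an illustrative assumption, not the challenge solution, which uses a far richer model):

```python
import numpy as np

rng = np.random.default_rng(5)
theta = 0.0  # single logit: p(action 1) = sigmoid(theta)
lr = 0.5

def p1(theta):
    return 1.0 / (1.0 + np.exp(-theta))

for _ in range(300):
    a = (rng.random(64) < p1(theta)).astype(float)  # batch of sampled actions
    r = np.where(a == 1.0, 1.0, 0.2)                # action 1 pays more
    # REINFORCE estimator: mean of r * d/dtheta log pi(a), where for a
    # Bernoulli policy d/dtheta log pi(a) = a - p1(theta)
    theta += lr * np.mean(r * (a - p1(theta)))
```

After training, the policy concentrates on the higher-reward action; batching the Monte Carlo estimate keeps the gradient variance manageable, the same reason practical policy gradient methods average over many rollouts.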

Spatio-temporal Dynamics of Intrinsic Networks in Functional Magnetic Imaging Data Using Recurrent Neural Networks

no code implementations • 3 Nov 2016 • R. Devon Hjelm, Eswar Damaraju, Kyunghyun Cho, Helmut Laufs, Sergey M. Plis, Vince Calhoun

We introduce a novel recurrent neural network (RNN) approach to account for temporal dynamics and dependencies in brain networks observed via functional magnetic resonance imaging (fMRI).
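The core mechanism is the recurrent hidden-state update h_t = tanh(W_xh x_t + W_hh h_{t-1} + b), which lets the model carry temporal context forward across scan time points. A minimal forward-pass sketch with illustrative shapes (untrained random weights, not the paper's architecture):

```python
import numpy as np

rng = np.random.default_rng(6)
# Toy "fMRI-like" sequence: T time points, n_voxels features per point.
T, n_voxels, n_hidden = 50, 10, 8
x = rng.normal(size=(T, n_voxels))
W_xh = rng.normal(size=(n_voxels, n_hidden)) * 0.1
W_hh = rng.normal(size=(n_hidden, n_hidden)) * 0.1
b = np.zeros(n_hidden)

h = np.zeros(n_hidden)
states = []
for t in range(T):
    # Hidden state mixes the current input with the previous state,
    # so h_t depends on the whole history x_1..x_t.
    h = np.tanh(x[t] @ W_xh + h @ W_hh + b)
    states.append(h)
states = np.array(states)
```

The sequence of hidden states is what downstream analyses would read out as the spatio-temporal dynamics of the network.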

Variational Autoencoders for Feature Detection of Magnetic Resonance Imaging Data

no code implementations • 21 Mar 2016 • R. Devon Hjelm, Sergey M. Plis, Vince C. Calhoun

Independent component analysis (ICA), as an approach to the blind source separation (BSS) problem, has become the de facto standard in many medical imaging settings.

Dimensionality Reduction
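ICA pipelines almost always begin with whitening: a linear transform that makes the data covariance the identity, after which the remaining unmixing is just a rotation. A small numpy sketch of that first step on a toy two-source mixture (the sources and mixing matrix are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(2)
# Classic BSS setup: two non-Gaussian sources mixed linearly.
n = 5000
S = np.stack([rng.uniform(-1, 1, n), rng.laplace(size=n)])
A = np.array([[1.0, 0.6], [0.4, 1.0]])  # unknown mixing matrix
X = A @ S

# Whitening: center, then rescale along the covariance eigenbasis so
# that cov(Z) = I. ICA then only has to find a rotation of Z.
X = X - X.mean(axis=1, keepdims=True)
eigval, eigvec = np.linalg.eigh(np.cov(X))
W_white = eigvec @ np.diag(eigval ** -0.5) @ eigvec.T
Z = W_white @ X
```

The non-Gaussianity of the sources (uniform and Laplace here) is what lets the subsequent ICA rotation be identified at all; Gaussian sources would leave the rotation ambiguous.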

Deep learning for neuroimaging: a validation study

no code implementations • 20 Dec 2013 • Sergey M. Plis, Devon R. Hjelm, Ruslan Salakhutdinov, Vince D. Calhoun

In this work we demonstrate our results (and feasible parameter ranges) when applying deep learning methods to structural and functional brain imaging data.

Representation Learning

Block Coordinate Descent for Sparse NMF

1 code implementation • 15 Jan 2013 • Vamsi K. Potluru, Sergey M. Plis, Jonathan Le Roux, Barak A. Pearlmutter, Vince D. Calhoun, Thomas P. Hayes

However, existing algorithms for optimizing the mixed norm L$_1$/L$_2$ are slow, and alternative formulations for sparse NMF have been proposed, such as those based on the L$_1$ and L$_0$ norms.
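For intuition, the simplest sparse-NMF formulation adds an L$_1$ penalty on H to the Frobenius reconstruction error and folds it into multiplicative updates. A hedged numpy sketch in the spirit of the slower baselines the abstract alludes to (illustrative only, not the paper's block coordinate descent):

```python
import numpy as np

rng = np.random.default_rng(3)
# Objective: ||V - WH||_F^2 + lam * sum(H), with W, H >= 0.
V = rng.uniform(size=(20, 30))
k, lam, eps = 5, 0.1, 1e-9
W = rng.uniform(size=(20, k))
H = rng.uniform(size=(k, 30))

def loss(V, W, H, lam):
    return np.linalg.norm(V - W @ H) ** 2 + lam * H.sum()

start = loss(V, W, H, lam)
for _ in range(200):
    # The L1 penalty appears as a constant shift in H's denominator,
    # shrinking H entries toward zero (sparsity).
    H *= (W.T @ V) / (W.T @ W @ H + lam + eps)
    W *= (V @ H.T) / (W @ H @ H.T + eps)
```

Because the updates are ratios of nonnegative quantities, nonnegativity of W and H is preserved automatically, and the penalized objective decreases monotonically.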
