1 code implementation • 7 Aug 2024 • Feng Zhou, YanJie Zhou, Longjie Wang, Yun Peng, David E. Carlson, Liyun Tu
A registration-based data augmentation network creates realistic, labeled samples, while a feature distillation module helps the student network learn segmentation from these samples, guided by the teacher network.
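A minimal PyTorch-style sketch of the feature distillation idea described above: a supervised segmentation loss on the augmented, labeled samples plus an L2 term that pulls student features toward the (frozen) teacher's. The loss form and the weighting `alpha` are illustrative assumptions, not the paper's implementation.

```python
import torch
import torch.nn.functional as F

def feature_distillation_loss(student_feats, teacher_feats, seg_logits, labels, alpha=0.1):
    """Supervised segmentation loss on augmented, labeled samples plus an
    L2 feature-matching term toward the teacher's features.
    `alpha` is an illustrative weighting, not taken from the paper."""
    seg_loss = F.cross_entropy(seg_logits, labels)
    distill_loss = F.mse_loss(student_feats, teacher_feats.detach())
    return seg_loss + alpha * distill_loss
```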
no code implementations • 15 Mar 2019 • Yitong Li, Michael Murias, Samantha Major, Geraldine Dawson, David E. Carlson
In this work, we propose a method called Domain Adversarial nets for Target Shift (DATS) to address label shift while learning a domain invariant representation.
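Domain-adversarial representations of this kind are commonly trained with a gradient reversal layer; the sketch below illustrates that generic mechanism only, and does not include the DATS-specific label-shift correction.

```python
import torch

class GradReverse(torch.autograd.Function):
    """Identity on the forward pass; flips the gradient sign on the backward
    pass, so the feature extractor learns to confuse the domain classifier
    (the standard domain-adversarial trick)."""
    @staticmethod
    def forward(ctx, x, lamb=1.0):
        ctx.lamb = lamb
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lamb * grad_output, None

def grad_reverse(x, lamb=1.0):
    return GradReverse.apply(x, lamb)
```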
1 code implementation • NeurIPS 2018 • Yitong Li, Michael Murias, Geraldine Dawson, David E. Carlson
This methodology builds on existing distribution-matching approaches by assuming that source domains are varied and outcomes multi-factorial.
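One widely used distribution-matching criterion between feature batches from two domains is the kernel maximum mean discrepancy; the sketch below is a generic illustration of that idea and is not the specific matching objective used in the paper.

```python
import torch

def rbf_mmd(x, y, sigma=1.0):
    """Squared maximum mean discrepancy with an RBF kernel between two
    feature batches (one per domain). Illustrative only; the bandwidth
    `sigma` is a placeholder."""
    def k(a, b):
        d = torch.cdist(a, b) ** 2
        return torch.exp(-d / (2 * sigma ** 2))
    return k(x, x).mean() + k(y, y).mean() - 2 * k(x, y).mean()
```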
1 code implementation • NeurIPS 2017 • Jin Hyung Lee, David E. Carlson, Hooshmand Shokri Razaghi, Weichi Yao, Georges A. Goetz, Espen Hagen, Eleanor Batty, E.J. Chichilnisky, Gaute T. Einevoll, Liam Paninski
Spike sorting is a critical first step in extracting neural signals from large-scale electrophysiological data.
no code implementations • NeurIPS 2017 • Neil Gallagher, Kyle R. Ulrich, Austin Talbot, Kafui Dzirasa, Lawrence Carin, David E. Carlson
To facilitate understanding of network-level synchronization between brain regions, we introduce a novel model of multisite low-frequency neural recordings, such as local field potentials (LFPs) and electroencephalograms (EEGs).
no code implementations • NeurIPS 2017 • Yitong Li, Michael Murias, Samantha Major, Geraldine Dawson, Kafui Dzirasa, Lawrence Carin, David E. Carlson
We consider the analysis of Electroencephalography (EEG) and Local Field Potential (LFP) datasets, which are “big” in terms of the size of recorded data but rarely carry the labels required to train complex models (e.g., conventional deep learning methods).
no code implementations • NeurIPS 2015 • David E. Carlson, Edo Collins, Ya-Ping Hsieh, Lawrence Carin, Volkan Cevher
These challenges include, but are not limited to, the non-convexity of learning objectives and the difficulty of estimating the quantities that optimization algorithms require, such as gradients.
no code implementations • NeurIPS 2015 • Kyle R. Ulrich, David E. Carlson, Kafui Dzirasa, Lawrence Carin
An illustrative and motivating example of a multi-task problem is multi-region electrophysiological time-series data, where experimentalists are interested in both power and phase coherence between channels.
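As a concrete illustration of the two quantities named above, the snippet below estimates per-channel spectral power and the magnitude-squared coherence between two simulated channels with SciPy; the sampling rate and signal content are assumptions.

```python
import numpy as np
from scipy.signal import coherence, welch

fs = 1000.0                                   # sampling rate in Hz (assumed)
t = np.arange(0, 10, 1 / fs)
x = np.sin(2 * np.pi * 8 * t) + np.random.randn(t.size)        # channel 1
y = np.sin(2 * np.pi * 8 * t + 0.5) + np.random.randn(t.size)  # channel 2

f, pxx = welch(x, fs=fs, nperseg=1024)          # power spectrum of channel 1
f, cxy = coherence(x, y, fs=fs, nperseg=1024)   # magnitude-squared coherence
```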
no code implementations • NeurIPS 2014 • Kyle R. Ulrich, David E. Carlson, Wenzhao Lian, Jana S. Borg, Kafui Dzirasa, Lawrence Carin
The LFPs are modeled as a mixture of GPs, with state- and region-dependent mixture weights, and with the spectral content of the data encoded in GP spectral mixture covariance kernels.
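For reference, the one-dimensional spectral mixture covariance (Wilson and Adams, 2013) is a weighted sum of Gaussian-windowed cosines whose frequencies capture the spectral content of the signal; the sketch below evaluates it at a set of lags, with placeholder parameter values.

```python
import numpy as np

def spectral_mixture_kernel(tau, weights, means, variances):
    """Evaluate k(tau) = sum_q w_q * exp(-2*pi^2*tau^2*v_q) * cos(2*pi*tau*mu_q)
    at lags `tau`. Parameters are placeholders for illustration."""
    tau = np.asarray(tau)[..., None]   # (..., 1) lags
    w = np.asarray(weights)            # (Q,) mixture weights
    mu = np.asarray(means)             # (Q,) spectral means
    v = np.asarray(variances)          # (Q,) spectral variances
    return np.sum(w * np.exp(-2 * np.pi**2 * tau**2 * v)
                  * np.cos(2 * np.pi * tau * mu), axis=-1)
```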
no code implementations • NeurIPS 2014 • David E. Carlson, Jana Schaich Borg, Kafui Dzirasa, Lawrence Carin
One of the goals of neuroscience is to identify neural networks that correlate with important behaviors, environments, or genotypes.
no code implementations • NeurIPS 2013 • Liming Wang, David E. Carlson, Miguel Rodrigues, David Wilcox, Robert Calderbank, Lawrence Carin
We consider the design of linear projection measurements for a vector Poisson signal model.
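A small simulation of such a vector Poisson measurement model, in which counts are Poisson with rate given by a nonnegative linear projection of the signal; the dimensions and the dark-current offset are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 100, 20                                # signal dim, number of projections
x = rng.gamma(shape=2.0, scale=1.0, size=n)   # nonnegative signal (assumed prior)
Phi = rng.uniform(0, 1, size=(m, n))          # nonnegative projection matrix
lam0 = 0.1                                    # dark-current offset (assumed)
y = rng.poisson(Phi @ x + lam0)               # vector Poisson measurements
```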
no code implementations • NeurIPS 2013 • David E. Carlson, Vinayak Rao, Joshua T. Vogelstein, Lawrence Carin
With simultaneous measurements from ever-increasing populations of neurons, there is a growing need for sophisticated tools to recover signals from individual neurons.
no code implementations • NeurIPS 2011 • Bo Chen, David E. Carlson, Lawrence Carin
Nonparametric Bayesian methods are developed for analysis of multi-channel spike-train data, with the feature learning and spike sorting performed jointly.