
no code implementations • 10 Jan 2022 • Hadas Benisty, Alexander Song, Gal Mishne, Adam S. Charles

Functional optical imaging in neuroscience is rapidly growing with the development of new optical systems and fluorescence indicators.

1 code implementation • NeurIPS 2021 • Changhao Shi, Sivan Schwartz, Shahar Levy, Shay Achvat, Maisan Abboud, Amir Ghanayim, Jackie Schiller, Gal Mishne

To understand the relationship between behavior and neural activity, experiments in neuroscience often include an animal performing a repeated behavior such as a motor task.

no code implementations • 29 Sep 2021 • Chester Holtz, Tsui-Wei Weng, Gal Mishne

There has been great interest in enhancing the robustness of neural network classifiers to defend against adversarial perturbations through adversarial training, while balancing the trade-off between robust accuracy and standard accuracy.

1 code implementation • ICLR Workshop GTRL 2021 • Dhruv Kohli, Alexander Cloninger, Gal Mishne

We present Low Distortion Local Eigenmaps (LDLE), a manifold learning technique which constructs a set of low distortion local views of a dataset in lower dimension and registers them to obtain a global embedding.
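The "register local views into a global embedding" step can be illustrated with a toy rigid registration via orthogonal Procrustes on the overlap between two views. This is a hypothetical sketch: the point cloud, the random rigid transforms, and the overlap indices are all invented for the example, and LDLE's actual views come from local eigenmaps with a more involved registration.

```python
import numpy as np

rng = np.random.default_rng(2)

# Two overlapping "local views" of the same 2-D point cloud, each expressed
# in its own arbitrary rotated/translated coordinate frame.
X = rng.standard_normal((50, 2))

def random_rigid(Y, rng):
    theta = rng.random() * 2 * np.pi
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    return Y @ R.T + rng.standard_normal(2)

view_a = random_rigid(X[:30], rng)   # covers points 0..29
view_b = random_rigid(X[20:], rng)   # covers points 20..49

# Register view_b onto view_a using the shared points 20..29
# (orthogonal Procrustes: best orthogonal map between the two overlaps).
A, B = view_a[20:30], view_b[:10]
A0, B0 = A - A.mean(0), B - B.mean(0)
U, _, Vt = np.linalg.svd(B0.T @ A0)
R = U @ Vt
view_b_aligned = (view_b - B.mean(0)) @ R + A.mean(0)

err = np.linalg.norm(view_b_aligned[:10] - A)
print(f"overlap misfit after registration: {err:.2e}")
```

Because the two views are exact rigid transforms of the same points, the misfit on the overlap is numerically zero after alignment.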

no code implementations • 23 Jan 2021 • Changhao Shi, Chester Holtz, Gal Mishne

To the best of our knowledge, our paper is the first that generalizes the idea of using self-supervised signals to perform online test-time purification.

no code implementations • 1 Jan 2021 • Chester Holtz, Changhao Shi, Gal Mishne

Recent work has demonstrated that neural networks are vulnerable to small, adversarial perturbations of their input.
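For intuition about such perturbations, the canonical fast-gradient-sign attack can be written down for a toy logistic-regression "network". This is a hypothetical sketch: the weights, input, label, and epsilon are invented, and the papers above study deep classifiers rather than linear models.

```python
import numpy as np

rng = np.random.default_rng(3)

# A toy linear classifier (logistic regression) standing in for a network.
w = rng.standard_normal(10)
b = 0.0
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

x = rng.standard_normal(10)
y = 1.0                                  # true label
p_clean = sigmoid(w @ x + b)

# Fast gradient sign method: move each input coordinate one epsilon step
# along the sign of the loss gradient with respect to the input.
eps = 0.3
grad_x = (p_clean - y) * w               # d(cross-entropy)/dx for this model
x_adv = x + eps * np.sign(grad_x)
p_adv = sigmoid(w @ x_adv + b)

print(f"confidence in true class: clean {p_clean:.3f} -> adversarial {p_adv:.3f}")
```

For a linear model this step provably lowers the confidence in the true class, which is the small-perturbation vulnerability the abstract refers to.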

no code implementations • ICLR 2021 • Changhao Shi, Chester Holtz, Gal Mishne

Deep neural networks are known to be vulnerable to adversarial examples, where a perturbation in the input space leads to an amplified shift in the latent network representation.

no code implementations • 9 Sep 2020 • Ofir Lindenbaum, Amir Sagiv, Gal Mishne, Ronen Talmon

A low-dimensional dynamical system is observed in an experiment as a high-dimensional signal; for example, a video of a chaotic pendulum system.

no code implementations • 30 Jun 2020 • Jay S. Stanley III, Eric C. Chi, Gal Mishne

Graph signal processing (GSP) is an important methodology for studying data residing on irregular structures.
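The core GSP operation — analyzing a signal on a graph in the eigenbasis of its Laplacian — can be sketched in a few lines. This is a minimal illustration, not the method of the paper; the path graph, signal values, and two-component low-pass filter are all invented for the example.

```python
import numpy as np

# A small path graph on 5 nodes: adjacency matrix and combinatorial Laplacian.
n = 5
A = np.zeros((n, n))
for i in range(n - 1):
    A[i, i + 1] = A[i + 1, i] = 1.0
L = np.diag(A.sum(axis=1)) - A

# Graph Fourier basis: eigenvectors of L, ordered by "frequency" (eigenvalue).
lam, U = np.linalg.eigh(L)

# A signal on the nodes and its graph Fourier transform.
x = np.array([1.0, 2.0, 3.0, 2.0, 1.0])
x_hat = U.T @ x

# Low-pass graph filter: keep only the two lowest graph frequencies.
mask = np.zeros(n)
mask[:2] = 1.0
x_smooth = U @ (mask * x_hat)
print(np.round(x_smooth, 3))
```

The same transform-filter-invert pattern carries over to any irregular graph structure, which is what makes GSP a natural toolkit for such data.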

1 code implementation • NeurIPS 2019 • Scott Gigante, Adam S. Charles, Smita Krishnaswamy, Gal Mishne

We demonstrate M-PHATE with two vignettes: continual learning and generalization.

1 code implementation • 25 Oct 2018 • Xiuyuan Cheng, Gal Mishne

The extraction of clusters from a dataset which includes multiple clusters and a significant background component is a non-trivial task of practical importance.

no code implementations • 16 Oct 2018 • Gal Mishne, Eric C. Chi, Ronald R. Coifman

We propose utilizing this coupled structure to perform co-manifold learning: uncovering the underlying geometry of both the rows and the columns of a given matrix, where we focus on a missing data setting.

3 code implementations • 13 Nov 2017 • George C. Linderman, Gal Mishne, Yuval Kluger, Stefan Steinerberger

If we pick $n$ random points uniformly in $[0, 1]^d$ and connect each point to its $k$-nearest neighbors, then it is well known that there exists a giant connected component with high probability.
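The construction in the abstract is easy to simulate. The sketch below (brute-force distances, a small union-find for components; the parameters $n$, $d$, $k$ are chosen arbitrarily for illustration) measures the largest component for one random draw:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, k = 500, 2, 3

# Sample n points uniformly in the unit cube [0, 1]^d.
X = rng.random((n, d))

# Build the k-nearest-neighbor graph by brute force.
dists = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
np.fill_diagonal(dists, np.inf)
neighbors = np.argsort(dists, axis=1)[:, :k]

# Symmetrize and find connected components with a union-find.
parent = list(range(n))

def find(i):
    while parent[i] != i:
        parent[i] = parent[parent[i]]   # path compression
        i = parent[i]
    return i

def union(i, j):
    ri, rj = find(i), find(j)
    if ri != rj:
        parent[ri] = rj

for i in range(n):
    for j in neighbors[i]:
        union(i, int(j))

sizes = np.bincount([find(i) for i in range(n)])
giant = sizes.max() / n
print(f"largest component holds {giant:.0%} of the points")
```

Even for small $k$, the largest component typically contains the bulk of the points, consistent with the giant-component phenomenon the abstract describes.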

1 code implementation • 18 Aug 2017 • Gal Mishne, Ronen Talmon, Israel Cohen, Ronald R. Coifman, Yuval Kluger

Often the data is such that the observations do not reside on a regular grid, and the given order of the features is arbitrary and does not convey a notion of locality.

no code implementations • 5 Jun 2017 • Xiuyuan Cheng, Gal Mishne, Stefan Steinerberger

Let $(M, g)$ be a compact manifold and let $-\Delta \phi_k = \lambda_k \phi_k$ be the sequence of Laplacian eigenfunctions.

no code implementations • 6 Nov 2015 • Gal Mishne, Ronen Talmon, Ron Meir, Jackie Schiller, Uri Dubin, Ronald R. Coifman

In the wake of recent advances in experimental methods in neuroscience, the ability to record in-vivo neuronal activity from awake animals has become feasible.

no code implementations • 25 Jun 2015 • Gal Mishne, Uri Shaham, Alexander Cloninger, Israel Cohen

In this paper, we propose a manifold learning algorithm based on deep learning to create an encoder, which maps a high-dimensional dataset to its low-dimensional embedding, and a decoder, which takes the embedded data back to the high-dimensional space.
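The encoder/decoder idea can be illustrated with a toy stand-in: a spectral embedding playing the role of the encoder and a least-squares linear map as the decoder. This is a hypothetical sketch under invented synthetic data; the paper itself trains deep networks for both maps.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic data: a noisy closed 1-D curve embedded in 10-D space.
t = np.sort(rng.random(200))
X = np.stack([np.cos(2 * np.pi * t), np.sin(2 * np.pi * t)], axis=1)
X = X @ rng.standard_normal((2, 10)) + 0.01 * rng.standard_normal((200, 10))

# "Encoder" stand-in: top non-trivial eigenvectors of a Gaussian-kernel
# diffusion operator give a 2-D spectral embedding.
D2 = np.sum((X[:, None] - X[None, :]) ** 2, axis=-1)
K = np.exp(-D2 / np.median(D2))
P = K / K.sum(axis=1, keepdims=True)
lam, V = np.linalg.eig(P)
order = np.argsort(-lam.real)
Y = V[:, order[1:3]].real               # 2-D embedding

# "Decoder" stand-in: least-squares affine map from the embedding back
# to the high-dimensional data space.
W, *_ = np.linalg.lstsq(np.c_[Y, np.ones(len(Y))], X, rcond=None)
X_rec = np.c_[Y, np.ones(len(Y))] @ W
err = np.linalg.norm(X - X_rec) / np.linalg.norm(X)
print(f"relative reconstruction error: {err:.3f}")
```

For this near-circular dataset the spectral coordinates are close to a linear function of the original curve, so even an affine decoder reconstructs the data well; the deep decoder in the paper handles the general nonlinear case.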
