no code implementations • 28 Apr 2024 • Noga Mudrik, Eva Yezerets, Yenho Chen, Christopher Rozell, Adam Charles
Such systems, often modeled as dynamical systems, typically exhibit noisy, high-dimensional, and non-stationary temporal behavior that renders their identification challenging.
no code implementations • 13 Aug 2023 • Matthew O'Shaughnessy, Mark Davenport, Christopher Rozell
We analyze the popular "state-space" class of algorithms for detecting causal interaction in coupled dynamical systems.
no code implementations • 31 Mar 2023 • Marissa Connor, Bruno Olshausen, Christopher Rozell
When interacting in a three-dimensional world, humans must estimate 3D structure from visual inputs projected down to two-dimensional retinal images.
no code implementations • 27 May 2022 • Gregory Canal, Yancy Diaz-Mercado, Magnus Egerstedt, Christopher Rozell
We construct a scalable dictionary of robotic behaviors that a BCI user can search simply and efficiently, as we demonstrate through a large-scale user study testing the feasibility of our interaction algorithm, a user test of the full BCI system on (virtual and real) robot swarms, and simulations that verify our results against theoretical models.
1 code implementation • 22 Jun 2021 • Marissa Connor, Kion Fallah, Christopher Rozell
However, these approaches are limited because they require transformation labels during training and lack a method for determining which regions of the manifold are appropriate for applying each specific operator.
1 code implementation • 28 Feb 2021 • Gregory Canal, Matthieu Bloch, Christopher Rozell
The iterative selection of examples for labeling in active machine learning is conceptually similar to feedback channel coding in information theory: in both tasks, the objective is to seek a minimal sequence of actions to encode information in the presence of noise.
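The coding analogy above can be illustrated with a toy noisy bisection search — a hypothetical sketch, not the paper's coding-based algorithm: adaptively chosen queries, each repeated and majority-voted, encode the location of an unknown threshold despite a noisy answer channel. All function names and parameters here are illustrative.

```python
import random

def noisy_query(x, t, p, rng):
    """Oracle answering 'is x < t?'; the answer flips with probability p."""
    ans = x < t
    return ans if rng.random() > p else not ans

def active_threshold_search(t, p=0.2, repeats=31, steps=20, seed=0):
    """Locate threshold t in [0, 1) by adaptive bisection.

    Each query is repeated `repeats` times (odd, to avoid ties) and
    majority-voted, trading extra queries for robustness to noise --
    the active-learning analogue of coding against a noisy channel.
    """
    rng = random.Random(seed)
    lo, hi = 0.0, 1.0
    for _ in range(steps):
        mid = (lo + hi) / 2
        votes = sum(noisy_query(mid, t, p, rng) for _ in range(repeats))
        if votes > repeats / 2:  # majority believes mid < t
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2
```

With a 20% flip rate, 20 adaptive rounds still pin the threshold down to high precision, because each majority vote fails only rarely.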
1 code implementation • NeurIPS 2020 • Kion Fallah, Adam Willats, Ninghao Liu, Christopher Rozell
Unfortunately, current proposals for sparse coding in the compressed space require a centralized compression process (i.e., a dense random matrix) that is biologically unrealistic due to local wiring constraints observed in neural circuits.
2 code implementations • NeurIPS 2020 • Matthew O'Shaughnessy, Gregory Canal, Marissa Connor, Mark Davenport, Christopher Rozell
Our objective function encourages both the generative model to faithfully represent the data distribution and the latent factors to have a large causal influence on the classifier output.
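The shape of such an objective can be sketched as a reconstruction term minus a weighted causal-influence term. This is a hedged toy, not the paper's actual objective: `influence_proxy` below uses a simple perturbation test (how much nudging one latent coordinate moves the classifier output) in place of the paper's influence measure, and all names and parameters are hypothetical.

```python
import numpy as np

def influence_proxy(decoder, classifier, z, dim, eps=0.5):
    """Toy proxy for a latent factor's causal effect: how much does
    perturbing latent coordinate `dim` change the classifier output?"""
    z_pert = z.copy()
    z_pert[:, dim] += eps
    p0 = classifier(decoder(z))
    p1 = classifier(decoder(z_pert))
    return np.mean(np.abs(p1 - p0))

def combined_loss(x, decoder, classifier, z, lam=1.0):
    """Reconstruction fidelity plus a negated influence term, so that
    minimizing the loss rewards latents that sway the classifier."""
    recon = np.mean((decoder(z) - x) ** 2)
    infl = influence_proxy(decoder, classifier, z, dim=0)
    return recon - lam * infl
```

Minimizing this pulls the generative model toward faithful reconstruction while pushing the designated latent factor toward larger effect on the classifier.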
no code implementations • 5 Dec 2019 • Marissa Connor, Christopher Rozell
Deep generative networks have been widely used for learning mappings from a low-dimensional latent space to a high-dimensional data space.
1 code implementation • 9 Oct 2019 • Gregory Canal, Stefano Fenu, Christopher Rozell
Many machine learning tasks such as clustering, classification, and dataset search benefit from embedding data points in a space where distances reflect notions of relative similarity as perceived by humans.
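A minimal sketch of learning such an embedding from relative-similarity judgments uses a standard triplet hinge loss — a generic stand-in, not the paper's specific query model. Each judgment `(a, b, c)` reads "item a is more similar to b than to c"; all names and parameter values are illustrative.

```python
import numpy as np

def fit_triplet_embedding(n_items, triplets, dim=2, lr=0.05,
                          margin=1.0, epochs=200, seed=0):
    """Learn an embedding X so that each judged triplet (a, b, c)
    ends with item a closer to b than to c (squared distances,
    hinge with a margin)."""
    rng = np.random.default_rng(seed)
    X = rng.standard_normal((n_items, dim)) * 0.1
    for _ in range(epochs):
        for a, b, c in triplets:
            d_ab = np.sum((X[a] - X[b]) ** 2)
            d_ac = np.sum((X[a] - X[c]) ** 2)
            if margin + d_ab - d_ac > 0:  # triplet violated: take a step
                ga = 2 * (X[c] - X[b])    # gradient of hinge w.r.t. X[a]
                gb = -2 * (X[a] - X[b])   # ... w.r.t. X[b]
                gc = 2 * (X[a] - X[c])    # ... w.r.t. X[c]
                X[a] -= lr * ga
                X[b] -= lr * gb
                X[c] -= lr * gc
    return X
```

After training, distances in `X` respect the supplied judgments up to the margin, which is the sense in which the embedding "reflects relative similarity".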
no code implementations • ICLR 2018 • Marissa Connor, Christopher Rozell
The main contribution of this paper is to define two transfer learning methods that use this generative manifold representation to learn natural transformations and incorporate them into new data.
no code implementations • 26 May 2016 • Adam Charles, Dong Yin, Christopher Rozell
Most existing analyses of short-term memory (STM) capacity conclude that the ESN network size must scale linearly with the input size for unstructured inputs.
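The STM capacity being analyzed can be probed with the standard delayed-recall task: drive a random reservoir with a scalar input and train a linear readout to reproduce the input from a few steps back. The sketch below is a generic ESN probe under illustrative assumptions (reservoir size, spectral radius, input scaling), not the paper's analysis.

```python
import numpy as np

def esn_delay_recall(n_res=100, delay=3, T=1000, seed=0):
    """Return the correlation between an ESN readout's prediction and
    the input from `delay` steps ago (a standard STM probe)."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((n_res, n_res))
    W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius 0.9
    w_in = rng.standard_normal(n_res)
    u = rng.uniform(-0.5, 0.5, T)                    # unstructured input

    # run the reservoir
    X = np.zeros((T, n_res))
    x = np.zeros(n_res)
    for t in range(T):
        x = np.tanh(W @ x + w_in * u[t])
        X[t] = x

    # least-squares linear readout targeting u[t - delay] (skip warm-up)
    Xt, yt = X[delay + 50:], u[50:T - delay]
    w_out, *_ = np.linalg.lstsq(Xt, yt, rcond=None)
    pred = Xt @ w_out
    return np.corrcoef(pred, yt)[0, 1]
```

For short delays relative to the reservoir size, the recalled signal correlates strongly with the true delayed input; recall degrades as the delay approaches the network's memory capacity, which is the quantity whose scaling with input size the excerpt discusses.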
no code implementations • 22 Jul 2015 • Adam Charles, Aurele Balavoine, Christopher Rozell
Taken together, the algorithms presented in this paper represent the first strong performance analysis of dynamic filtering algorithms for time-varying sparse signals as well as state-of-the-art performance in this emerging application.