Search Results for author: Il Memming Park

Found 29 papers, 10 papers with code

Jointly learning visual motion and confidence from local patches in event cameras

no code implementations ECCV 2020 Daniel R. Kepple, Daewon Lee, Colin Prepsius, Volkan Isler, Il Memming Park, Daniel D. Lee

In the task of recovering pan-tilt ego velocities from events, we show that each individual confident local prediction of our network can be expected to be as accurate as state-of-the-art optimization approaches that utilize the full image.

Motion Segmentation

Large-scale variational Gaussian state-space models

no code implementations3 Mar 2024 Matthew Dowling, Yuan Zhao, Il Memming Park

We introduce an amortized variational inference algorithm and structured variational approximation for state-space models with nonlinear dynamics driven by Gaussian noise.

Variational Inference
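
As a hedged illustration of the setting (not the paper's actual model or code), the sketch below simulates a generic state-space model with nonlinear dynamics driven by Gaussian noise; the sizes, drift function, and linear readout are all assumptions for the example.

```python
import numpy as np

# Hypothetical example: a nonlinear state-space model with Gaussian state and
# observation noise, the model class the abstract refers to. Not the paper's model.
rng = np.random.default_rng(0)
T, dz, dy = 200, 2, 10
C = rng.normal(size=(dy, dz))          # assumed linear readout

def dynamics(z):
    # toy nonlinear drift; the paper learns this function rather than fixing it
    return z + 0.1 * np.tanh(z @ np.array([[0.0, 1.0], [-1.0, 0.0]]))

z = np.zeros((T, dz))
y = np.zeros((T, dy))
for t in range(1, T):
    z[t] = dynamics(z[t - 1]) + 0.05 * rng.normal(size=dz)   # Gaussian state noise
    y[t] = C @ z[t] + 0.1 * rng.normal(size=dy)              # noisy observations
# Amortized VI trains an inference network mapping y -> q(z) that is shared
# across time steps, instead of optimizing per-time-step variational parameters.
```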

Persistent learning signals and working memory without continuous attractors

no code implementations24 Aug 2023 Il Memming Park, Ábel Ságodi, Piotr Aleksander Sokół

Our theory has broad implications for the design of artificial learning systems and makes predictions about observable signatures of biological neural dynamics that can support temporal dependence learning and working memory.

Linear Time GPs for Inferring Latent Trajectories from Neural Spike Trains

no code implementations1 Jun 2023 Matthew Dowling, Yuan Zhao, Il Memming Park

In this work, we propose cvHM, a general inference framework for latent GP models leveraging Hida-Matérn kernels and conjugate computation variational inference (CVI).

Variational Inference

Real-Time Variational Method for Learning Neural Trajectory and its Dynamics

no code implementations18 May 2023 Matthew Dowling, Yuan Zhao, Il Memming Park

Latent variable models have become instrumental in computational neuroscience for reasoning about neural computation.

Experimental Design

Spectral learning of Bernoulli linear dynamical systems models

1 code implementation3 Mar 2023 Iris R. Stone, Yotam Sagiv, Il Memming Park, Jonathan W. Pillow

Latent linear dynamical systems with Bernoulli observations provide a powerful modeling framework for identifying the temporal dynamics underlying binary time series data, which arise in a variety of contexts such as binary decision-making and discrete stochastic processes (e.g., binned neural spike trains).

Decision Making, Time Series +1
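
For concreteness, here is a minimal sketch of the model class named in the abstract, a latent linear dynamical system with Bernoulli observations; the dynamics matrix, readout, and sizes are illustrative assumptions, not the paper's settings.

```python
import numpy as np

# Hypothetical Bernoulli-LDS generative model (e.g., binned spikes as binary data).
rng = np.random.default_rng(1)
T, dz, dy = 500, 2, 20
A = np.array([[0.99, 0.05], [-0.05, 0.99]])   # assumed rotational latent dynamics
C = rng.normal(scale=0.5, size=(dy, dz))
b = -1.0 * np.ones(dy)

z = np.zeros((T, dz))
y = np.zeros((T, dy), dtype=int)
for t in range(1, T):
    z[t] = A @ z[t - 1] + 0.1 * rng.normal(size=dz)
    p = 1.0 / (1.0 + np.exp(-(C @ z[t] + b)))   # logistic link
    y[t] = rng.binomial(1, p)                    # binary observations
# The paper's contribution is recovering A, C, b from y alone via spectral
# (moment-based) methods rather than iterative likelihood optimization.
```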

Metastable dynamics of neural circuits and networks

no code implementations6 Oct 2021 Braden A. W. Brinkman, Han Yan, Arianna Maffei, Il Memming Park, Alfredo Fontanini, Jin Wang, Giancarlo La Camera

In this article, we review the experimental evidence for neural metastable dynamics together with theoretical approaches to the study of metastable activity in neural circuits.

Decision Making

Gating Mechanisms Underlying Sequence-to-Sequence Working Memory

no code implementations29 Sep 2021 Ian D Jordan, Piotr A Sokol, Il Memming Park

We explain the learned mechanisms by which this network holds memory and extracts information from memory, and how gating is a natural architectural component to achieve these structures.

Hida-Matérn Kernel

no code implementations15 Jul 2021 Matthew Dowling, Piotr Sokół, Il Memming Park

We present the class of Hida-Matérn kernels, which is the canonical family of covariance functions over the entire space of stationary Gauss-Markov processes.
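
The Hida-Matérn family generalizes the familiar Matérn kernels; as a hedged example, the sketch below evaluates the standard Matérn-3/2 covariance, one simple stationary Gauss-Markov member (the quasi-periodic extensions the paper covers are not shown).

```python
import numpy as np

# Standard Matérn-3/2 covariance: k(tau) = s^2 (1 + sqrt(3)|tau|/l) exp(-sqrt(3)|tau|/l)
def matern32(tau, sigma=1.0, lengthscale=1.0):
    a = np.sqrt(3.0) * np.abs(tau) / lengthscale
    return sigma**2 * (1.0 + a) * np.exp(-a)

taus = np.linspace(-5, 5, 101)
K = matern32(taus[:, None] - taus[None, :])  # Gram matrix over a 1-D grid
```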

On 1/n neural representation and robustness

1 code implementation NeurIPS 2020 Josue Nassar, Piotr Aleksander Sokol, SueYeon Chung, Kenneth D. Harris, Il Memming Park

In this work, we investigate the latter by juxtaposing experimental results regarding the covariance spectrum of neural representations in mouse V1 (Stringer et al.) with artificial neural networks.

Adversarial Robustness
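
As a hedged sketch of the kind of analysis involved (the data and code here are stand-ins, not the paper's), one can estimate the covariance eigenspectrum of a response matrix and fit its log-log slope to test for a 1/n power law:

```python
import numpy as np

rng = np.random.default_rng(2)
responses = rng.normal(size=(5000, 500))     # stand-in for (stimuli x neurons) activity
cov = np.cov(responses, rowvar=False)
eigvals = np.sort(np.linalg.eigvalsh(cov))[::-1]

n = np.arange(1, len(eigvals) + 1)
slope = np.polyfit(np.log(n), np.log(eigvals), 1)[0]
print(f"log-log spectral slope: {slope:.2f}")   # a slope near -1 indicates a 1/n power law
```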

Rescuing neural spike train models from bad MLE

1 code implementation NeurIPS 2020 Diego M. Arribas, Yuan Zhao, Il Memming Park

The standard approach to fitting an autoregressive spike train model is to maximize the likelihood for one-step prediction.
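
A minimal sketch of that standard approach, assuming a Poisson autoregressive GLM with a spike-history filter fit by gradient ascent; the sizes, learning rate, and data are illustrative, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(3)
T, H = 2000, 10
spikes = rng.binomial(1, 0.1, size=T).astype(float)   # stand-in spike train

# design matrix of spike history (last H bins predict the current bin)
X = np.stack([spikes[t - H:t] for t in range(H, T)])
y = spikes[H:]

w, b, lr = np.zeros(H), -2.0, 1e-3
for _ in range(500):
    rate = np.exp(X @ w + b)                 # conditional intensity
    w += lr * (X.T @ (y - rate))             # gradient of Poisson log likelihood
    b += lr * np.sum(y - rate)
# The paper's point: optimizing one-step prediction alone can yield models whose
# free-running simulations diverge, motivating alternative objectives.
```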

Non-parametric generalized linear model

no code implementations2 Sep 2020 Matthew Dowling, Yuan Zhao, Il Memming Park

However, obtaining a satisfactory fit often requires burdensome model selection and fine-tuning of the form of the basis functions and their temporal span.

Model Selection
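
For context, a commonly used basis family in this literature is the raised cosine basis; the sketch below (an assumption for illustration, not this paper's construction) shows how its form and span become the tuning knobs the abstract refers to:

```python
import numpy as np

def raised_cosine_basis(n_basis=5, span=50):
    t = np.arange(span)
    centers = np.linspace(0, span, n_basis)
    width = centers[1] - centers[0]
    arg = np.clip((t[None, :] - centers[:, None]) * np.pi / (2 * width), -np.pi, np.pi)
    return 0.5 * (1 + np.cos(arg))   # (n_basis, span), each row a smooth bump

B = raised_cosine_basis()
# A GLM filter is then parameterized as B.T @ weights; choosing n_basis and span
# is exactly the model-selection burden mentioned above.
```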

Streaming Variational Monte Carlo

1 code implementation4 Jun 2019 Yuan Zhao, Josue Nassar, Ian Jordan, Mónica Bugallo, Il Memming Park

Nonlinear state-space models are powerful tools to describe dynamical structures in complex time series.

Gaussian Processes, Time Series +3
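
As a hedged illustration of the streaming-inference setting (the paper's method additionally learns proposals and dynamics online; this shows only the classical baseline it builds on), here is a minimal bootstrap particle filter on a toy nonlinear model:

```python
import numpy as np

rng = np.random.default_rng(4)

def step(particles, y, f, obs_loglik, q_noise=0.1):
    particles = f(particles) + q_noise * rng.normal(size=particles.shape)  # propagate
    logw = obs_loglik(y, particles)
    w = np.exp(logw - logw.max()); w /= w.sum()                            # weight
    idx = rng.choice(len(particles), size=len(particles), p=w)             # resample
    return particles[idx]

# toy model: sinusoidal drift, Gaussian observation of the state
f = lambda z: z + 0.1 * np.sin(z)
obs_loglik = lambda y, z: -0.5 * ((y - z) ** 2) / 0.1
particles = rng.normal(size=1000)
for y in [0.2, 0.3, 0.5]:
    particles = step(particles, y, f, obs_loglik)
print(particles.mean())   # filtering posterior mean estimate
```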

Gated recurrent units viewed through the lens of continuous time dynamical systems

no code implementations3 Jun 2019 Ian D. Jordan, Piotr Aleksander Sokol, Il Memming Park

As a result, it is difficult to know a priori both how well a GRU network will perform on a given task and to what extent it can mimic the underlying behavior of its biological counterparts.
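
A minimal sketch of the continuous-time view, with the reset gate and inputs omitted for brevity and random placeholder weights: the gated update can be read as an Euler step of an ODE.

```python
import numpy as np

rng = np.random.default_rng(5)
d = 4
Wz, Wh = rng.normal(scale=0.5, size=(d, d)), rng.normal(scale=0.5, size=(d, d))

sigmoid = lambda a: 1.0 / (1.0 + np.exp(-a))

def gru_vector_field(h):
    z = sigmoid(Wz @ h)          # update gate (reset gate and inputs omitted)
    h_cand = np.tanh(Wh @ h)     # candidate state
    return z * (h_cand - h)      # dh/dt = z(h) * (h_cand(h) - h)

h = rng.normal(size=d)
for _ in range(100):
    h = h + 0.1 * gru_vector_field(h)   # Euler integration; dt=1 recovers the GRU update
# Viewing the GRU as this flow is what permits fixed-point and bifurcation analysis.
```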

The Expressive Power of Gated Recurrent Units as a Continuous Dynamical System

no code implementations ICLR 2019 Ian D. Jordan, Piotr Aleksander Sokol, Il Memming Park

Gated recurrent units (GRUs) were inspired by the long short-term memory (LSTM) unit as a means of capturing temporal structure with a less complex memory unit architecture.

Time Series, Time Series Prediction

Information Geometry of Orthogonal Initializations and Training

no code implementations ICLR 2020 Piotr A. Sokol, Il Memming Park

Recently, mean field theory has been successfully used to analyze the properties of wide, random neural networks.
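
For reference, the orthogonal initialization in question is commonly implemented by a QR projection of a Gaussian matrix; a minimal sketch (the implementation details are assumptions, not the paper's code):

```python
import numpy as np

def orthogonal_init(n, rng):
    A = rng.normal(size=(n, n))
    Q, R = np.linalg.qr(A)
    return Q * np.sign(np.diag(R))   # sign correction gives a uniform (Haar) draw

rng = np.random.default_rng(6)
W = orthogonal_init(256, rng)
print(np.allclose(W.T @ W, np.eye(256)))   # True: exactly orthogonal
```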

Variational online learning of neural dynamics

1 code implementation27 Jul 2017 Yuan Zhao, Il Memming Park

This brings the challenge of learning both the latent neural state and the underlying dynamical system, since neither is known a priori for neural systems.

Decision Making, Experimental Design +1

Variational Latent Gaussian Process for Recovering Single-Trial Dynamics from Population Spike Trains

1 code implementation11 Apr 2016 Yuan Zhao, Il Memming Park

In the V1 dataset, we find that vLGP achieves substantially higher performance than previous methods for predicting omitted spike trains, as well as capturing both the toroidal topology of the visual stimulus space and the noise correlations.

Point Processes

Convolutional spike-triggered covariance analysis for neural subunit models

no code implementations NeurIPS 2015 Anqi Wu, Il Memming Park, Jonathan W. Pillow

Subunit models provide a powerful yet parsimonious description of neural spike responses to complex stimuli.

Black box variational inference for state space models

no code implementations23 Nov 2015 Evan Archer, Il Memming Park, Lars Buesing, John Cunningham, Liam Paninski

These models have the advantage of learning latent structure both from noisy observations and from the temporal ordering in the data, where it is assumed that meaningful correlation structure exists across time.

Time Series, Time Series Analysis +1

Universal models for binary spike patterns using centered Dirichlet processes

no code implementations NeurIPS 2013 Il Memming Park, Evan W. Archer, Kenneth Latimer, Jonathan W. Pillow

We also establish a condition for equivalence between the cascade-logistic and the 2nd-order maxent or "Ising" model, making the cascade-logistic model a reasonable choice for the base measure in a universal model.
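
For small populations, the 2nd-order maxent ("Ising") model referenced above can be written down explicitly by enumerating all binary words; a hedged sketch with random illustrative parameters:

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(7)
n = 5
h = rng.normal(scale=0.5, size=n)                           # biases
J = np.triu(rng.normal(scale=0.2, size=(n, n)), 1)          # pairwise couplings

patterns = np.array(list(product([0, 1], repeat=n)))        # all 2^n binary words
energy = patterns @ h + np.einsum('bi,ij,bj->b', patterns, J, patterns)
p = np.exp(energy); p /= p.sum()                            # P(x) ~ exp(h.x + x'Jx)
# The paper's equivalence condition says when a cascade-logistic model matches
# this family exactly.
```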

Bayesian entropy estimation for binary spike train data using parametric prior knowledge

1 code implementation NeurIPS 2013 Evan W. Archer, Il Memming Park, Jonathan W. Pillow

Shannon's entropy is a basic quantity in information theory, and a fundamental building block for the analysis of neural codes.

Spectral methods for neural characterization using generalized quadratic models

no code implementations NeurIPS 2013 Il Memming Park, Evan W. Archer, Nicholas Priebe, Jonathan W. Pillow

The quadratic form characterizes the neuron's stimulus selectivity in terms of a set of linear receptive fields followed by a quadratic combination rule, and the invertible nonlinearity maps this output to the desired response range.
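
A minimal sketch of a generalized quadratic model response as described, with an assumed exponential nonlinearity and random illustrative parameters:

```python
import numpy as np

rng = np.random.default_rng(8)
d = 20
ks = rng.normal(scale=0.3, size=(3, d))            # a set of linear receptive fields
C = ks.T @ np.diag([1.0, 1.0, -0.5]) @ ks          # quadratic combination rule
b, a = rng.normal(scale=0.1, size=d), -1.0

def gqm_rate(x):
    # rate = f(x'Cx + b'x + a) with f invertible (exp maps to positive rates)
    return np.exp(x @ C @ x + b @ x + a)

x = rng.normal(size=d)                             # one stimulus
print(gqm_rate(x))
```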

Bayesian Extensions of Kernel Least Mean Squares

no code implementations20 Oct 2013 Il Memming Park, Sohan Seth, Steven Van Vaerenbergh

The kernel least mean squares (KLMS) algorithm is a computationally efficient nonlinear adaptive filtering method that "kernelizes" the celebrated (linear) least mean squares algorithm.
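
A minimal sketch of plain (non-Bayesian) KLMS as characterized above, assuming an RBF kernel and a naively growing dictionary:

```python
import numpy as np

def rbf(x, y, gamma=1.0):
    return np.exp(-gamma * np.sum((x - y) ** 2, axis=-1))

def klms(X, y, eta=0.5):
    centers, alphas = [], []
    for x_t, y_t in zip(X, y):
        pred = sum(a * rbf(x_t, c) for a, c in zip(alphas, centers))
        err = y_t - pred                                # instantaneous error
        centers.append(x_t); alphas.append(eta * err)   # functional LMS step
    return centers, alphas

rng = np.random.default_rng(9)
X = rng.normal(size=(200, 2))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)
centers, alphas = klms(X, y)
```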

Bayesian Entropy Estimation for Countable Discrete Distributions

2 code implementations2 Feb 2013 Evan Archer, Il Memming Park, Jonathan Pillow

The Pitman-Yor process, a generalization of the Dirichlet process, provides a tractable prior distribution over the space of countably infinite discrete distributions, and has found major applications in Bayesian non-parametric statistics and machine learning.

Information Theory
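
As a hedged illustration, here is a truncated stick-breaking draw of Pitman-Yor weights (the standard construction; the truncation level and parameters are arbitrary choices for the example):

```python
import numpy as np

def pitman_yor_weights(d, a, K, rng):
    # stick breaking: V_k ~ Beta(1-d, a+k*d), w_k = V_k * prod_{j<k}(1-V_j)
    V = rng.beta(1 - d, a + d * np.arange(1, K + 1))
    return V * np.concatenate([[1.0], np.cumprod(1 - V[:-1])])

rng = np.random.default_rng(10)
w = pitman_yor_weights(d=0.5, a=1.0, K=1000, rng=rng)
print(w.sum())   # close to 1; the heavy tail is what supports countably infinite atoms
```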

Bayesian estimation of discrete entropy with mixtures of stick-breaking priors

no code implementations NeurIPS 2012 Evan Archer, Il Memming Park, Jonathan W. Pillow

We consider the problem of estimating Shannon's entropy H in the under-sampled regime, where the number of possible symbols may be unknown or countably infinite.
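
To see why the under-sampled regime is hard, the sketch below (illustrative numbers only) shows the naive plug-in estimator collapsing far below the true entropy when the alphabet dwarfs the sample size:

```python
import numpy as np

rng = np.random.default_rng(11)
K, N = 10_000, 500                       # alphabet far larger than sample count
p = np.ones(K) / K                       # uniform truth: H = log K
counts = np.bincount(rng.choice(K, size=N, p=p), minlength=K)
phat = counts / N
plugin = -np.sum(phat[phat > 0] * np.log(phat[phat > 0]))
print(plugin, np.log(K))                 # plug-in is near log N, far below log K
```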

Bayesian Spike-Triggered Covariance Analysis

no code implementations NeurIPS 2011 Il Memming Park, Jonathan W. Pillow

We describe an empirical Bayes method for selecting the number of features, and extend the model to accommodate an arbitrary elliptical nonlinear response function, which results in a more powerful and more flexible model for feature space inference.
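
For background, classical (non-Bayesian) spike-triggered covariance analysis, the starting point this paper extends, can be sketched as follows; the stimulus model and hidden feature are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(12)
T, d = 20000, 15
X = rng.normal(size=(T, d))                          # white-noise stimuli
k = np.zeros(d); k[0] = 1.0                          # hidden quadratic feature
spikes = rng.poisson(0.05 * (X @ k) ** 2)            # squared-feature drive

w = spikes / spikes.sum()
sta = w @ X                                          # spike-triggered average
stc = (X - sta).T @ ((X - sta) * w[:, None])         # spike-triggered covariance
eigvals, eigvecs = np.linalg.eigh(stc - np.cov(X, rowvar=False))
# Eigenvalues of large magnitude flag excitatory or suppressive feature dimensions;
# the Bayesian treatment above instead selects the number of features automatically.
```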
