Search Results for author: Guangyao Zhou

Found 11 papers, 5 papers with code

Learning Cognitive Maps from Transformer Representations for Efficient Planning in Partially Observed Environments

no code implementations • 11 Jan 2024 • Antoine Dedieu, Wolfgang Lehrach, Guangyao Zhou, Dileep George, Miguel Lázaro-Gredilla

Despite their stellar performance on a wide range of tasks, including in-context tasks only revealed during inference, vanilla transformers and their variants trained for next-token prediction (a) do not learn an explicit world model of their environment that can be flexibly queried and (b) cannot be used for planning or navigation.

Multi Loss-based Feature Fusion and Top Two Voting Ensemble Decision Strategy for Facial Expression Recognition in the Wild

no code implementations • 6 Nov 2023 • Guangyao Zhou, Yuanlun Xie, Wenhong Tian

Unlike previous studies, this paper applies both internal feature fusion within a single model and feature fusion across multiple networks, together with an ensemble decision strategy.

Facial Expression Recognition (FER)

RoboTAP: Tracking Arbitrary Points for Few-Shot Visual Imitation

no code implementations • 30 Aug 2023 • Mel Vecerik, Carl Doersch, Yi Yang, Todor Davchev, Yusuf Aytar, Guangyao Zhou, Raia Hadsell, Lourdes Agapito, Jon Scholz

For robots to be useful outside labs and specialized factories, we need a way to teach them useful new behaviors quickly.

Learning noisy-OR Bayesian Networks with Max-Product Belief Propagation

no code implementations • 31 Jan 2023 • Antoine Dedieu, Guangyao Zhou, Dileep George, Miguel Lázaro-Gredilla

We evaluate both approaches on several benchmarks where VI is the state of the art and show that our method (a) achieves better test performance than Ji et al. (2020) for learning noisy-OR BNs with hierarchical latent structures on large sparse real datasets; (b) recovers a higher number of ground-truth parameters than Buhai et al. (2020) from cluttered synthetic scenes; and (c) solves the 2D blind deconvolution problem from Lázaro-Gredilla et al. (2021) and its variants, including binary matrix factorization, while VI catastrophically fails and is up to two orders of magnitude slower.

Variational Inference

Space is a latent sequence: Structured sequence learning as a unified theory of representation in the hippocampus

no code implementations • 3 Dec 2022 • Rajkumar Vasudeva Raju, J. Swaroop Guntupalli, Guangyao Zhou, Miguel Lázaro-Gredilla, Dileep George

Fascinating and puzzling phenomena, such as landmark vector cells, splitter cells, and event-specific representations, are regularly discovered in the hippocampus.

Hippocampus

PGMax: Factor Graphs for Discrete Probabilistic Graphical Models and Loopy Belief Propagation in JAX

2 code implementations • 8 Feb 2022 • Guangyao Zhou, Antoine Dedieu, Nishanth Kumar, Wolfgang Lehrach, Miguel Lázaro-Gredilla, Shrinu Kushagra, Dileep George

PGMax is an open-source Python package for (a) easily specifying discrete Probabilistic Graphical Models (PGMs) as factor graphs; and (b) automatically running efficient and scalable loopy belief propagation (LBP) in JAX.
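PGMax itself exposes a JAX-based API for building factor graphs, which is not reproduced here. As a hedged, library-free sketch of the message-passing idea it automates, the snippet below runs sum-product belief propagation on a tiny three-variable binary chain (where BP is exact, so it can be checked against brute-force enumeration); all names (`pair_logpot`, `bp_chain_marginals`, the potentials themselves) are illustrative, not PGMax's interface.

```python
import itertools
import math

# Illustrative example (NOT the PGMax API): sum-product belief propagation
# on a binary chain x0 - x1 - x2 with pairwise log-potentials that reward
# agreement between neighbours, and a unary bias pulling x0 towards state 1.

COUPLING = 1.0  # log-potential bonus when neighbouring states agree

def pair_logpot(a, b):
    return COUPLING if a == b else 0.0

def unary_logpot(i, a):
    return 0.5 if (i == 0 and a == 1) else 0.0

def logsumexp(xs):
    m = max(xs)
    return m + math.log(sum(math.exp(x - m) for x in xs))

def bp_chain_marginals():
    """Exact marginals on the chain via forward/backward message passing."""
    # Forward messages: m01 flows x0 -> x1, m12 flows x1 -> x2.
    m01 = [logsumexp([unary_logpot(0, a) + pair_logpot(a, b) for a in (0, 1)])
           for b in (0, 1)]
    m12 = [logsumexp([unary_logpot(1, a) + m01[a] + pair_logpot(a, b)
                      for a in (0, 1)]) for b in (0, 1)]
    # Backward messages: m21 flows x2 -> x1, m10 flows x1 -> x0.
    m21 = [logsumexp([unary_logpot(2, b) + pair_logpot(a, b) for b in (0, 1)])
           for a in (0, 1)]
    m10 = [logsumexp([unary_logpot(1, b) + m21[b] + pair_logpot(a, b)
                      for b in (0, 1)]) for a in (0, 1)]

    def normalise(logs):
        z = logsumexp(logs)
        return [math.exp(x - z) for x in logs]

    return {
        0: normalise([unary_logpot(0, a) + m10[a] for a in (0, 1)]),
        1: normalise([unary_logpot(1, a) + m01[a] + m21[a] for a in (0, 1)]),
        2: normalise([unary_logpot(2, a) + m12[a] for a in (0, 1)]),
    }

def brute_force_marginals():
    """Reference marginals from enumerating all 2^3 joint configurations."""
    weights = {}
    for x in itertools.product((0, 1), repeat=3):
        logw = (sum(unary_logpot(i, x[i]) for i in range(3))
                + pair_logpot(x[0], x[1]) + pair_logpot(x[1], x[2]))
        weights[x] = math.exp(logw)
    z = sum(weights.values())
    marg = {i: [0.0, 0.0] for i in range(3)}
    for x, w in weights.items():
        for i in range(3):
            marg[i][x[i]] += w / z
    return marg
```

On graphs with cycles, "loopy" BP simply iterates these same message updates to (approximate) convergence; PGMax's contribution is making those updates efficient, batched, and differentiable via JAX.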

Graphical Models with Attention for Context-Specific Independence and an Application to Perceptual Grouping

1 code implementation • 6 Dec 2021 • Guangyao Zhou, Wolfgang Lehrach, Antoine Dedieu, Miguel Lázaro-Gredilla, Dileep George

To demonstrate MAMs' ability to capture CSIs at scale, we apply them to an important type of CSI present in a symbolic approach to recurrent computations in perceptual grouping.

Query Training: Learning a Worse Model to Infer Better Marginals in Undirected Graphical Models with Hidden Variables

1 code implementation • 11 Jun 2020 • Miguel Lázaro-Gredilla, Wolfgang Lehrach, Nishad Gothoskar, Guangyao Zhou, Antoine Dedieu, Dileep George

Here we introduce query training (QT), a mechanism to learn a PGM that is optimized for the approximate inference algorithm that will be paired with it.

Mixed Hamiltonian Monte Carlo for Mixed Discrete and Continuous Variables

1 code implementation • NeurIPS 2020 • Guangyao Zhou

Hamiltonian Monte Carlo (HMC) has emerged as a powerful Markov Chain Monte Carlo (MCMC) method to sample from complex continuous distributions.

Computation
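As a minimal, hedged sketch of the vanilla HMC that this paper extends (not its mixed discrete/continuous variant), the pure-Python snippet below samples a 1-D standard normal using leapfrog integration of Hamiltonian dynamics plus a Metropolis accept/reject step; all function names and parameter values are illustrative.

```python
import math
import random

# Vanilla HMC sketch targeting a 1-D standard normal,
# i.e. negative log density U(q) = q^2 / 2 up to a constant.

def neg_log_prob(q):
    return 0.5 * q * q

def grad_neg_log_prob(q):
    return q

def leapfrog(q, p, step_size, num_steps):
    """Approximate Hamiltonian dynamics with the leapfrog integrator."""
    p = p - 0.5 * step_size * grad_neg_log_prob(q)  # half step for momentum
    for _ in range(num_steps - 1):
        q = q + step_size * p                       # full step for position
        p = p - step_size * grad_neg_log_prob(q)    # full step for momentum
    q = q + step_size * p
    p = p - 0.5 * step_size * grad_neg_log_prob(q)  # final half step
    return q, p

def hmc(num_samples, step_size=0.2, num_steps=10, seed=0):
    rng = random.Random(seed)
    q = 0.0
    samples = []
    for _ in range(num_samples):
        p = rng.gauss(0.0, 1.0)                     # resample momentum
        h_old = neg_log_prob(q) + 0.5 * p * p       # Hamiltonian before
        q_new, p_new = leapfrog(q, p, step_size, num_steps)
        h_new = neg_log_prob(q_new) + 0.5 * p_new * p_new
        # Metropolis correction for integrator error.
        if rng.random() < math.exp(min(0.0, h_old - h_new)):
            q = q_new
        samples.append(q)
    return samples
```

The paper's mixed HMC handles distributions with both discrete and continuous variables; this sketch only covers the continuous case that the gradient-based leapfrog step requires.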
