Search Results for author: Dandan Guo

Found 12 papers, 4 papers with code

Extracting Clean and Balanced Subset for Noisy Long-tailed Classification

no code implementations · 10 Apr 2024 · Zhuo Li, He Zhao, Zhen Li, Tongliang Liu, Dandan Guo, Xiang Wan

To address the joint problem of long-tailed distribution and label noise, most previous works design a noise detector to distinguish noisy from clean samples.

Pseudo Label

Adaptive Distribution Calibration for Few-Shot Learning with Hierarchical Optimal Transport

no code implementations · 9 Oct 2022 · Dandan Guo, Long Tian, He Zhao, Mingyuan Zhou, Hongyuan Zha

A recent solution to this problem calibrates the distribution of these few-sample classes by transferring statistics from the base classes, which have sufficient examples; the key question is how to decide the transfer weights from base classes to novel classes.

Domain Generalization · Few-Shot Learning

Learning to Re-weight Examples with Optimal Transport for Imbalanced Classification

no code implementations · 5 Aug 2022 · Dandan Guo, Zhuo Li, Meixi Zheng, He Zhao, Mingyuan Zhou, Hongyuan Zha

Specifically, we view the training set as an imbalanced distribution over its samples, which is transported by OT to a balanced distribution obtained from the meta set.

Bilevel Optimization · Imbalanced Classification
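The re-weighting idea described above can be sketched with entropy-regularized optimal transport. The toy below is not the paper's bilevel algorithm: it fixes only the balanced target marginal (one made-up prototype per class from a "meta set"), relaxes the source marginal, and reads each sample's weight off the transported mass. All features, prototypes, and the regularization value are illustrative assumptions.

```python
import numpy as np

# Toy imbalanced training set: 4 majority-class samples near (0,0),
# 2 minority-class samples near (1,1). The meta set supplies one
# prototype per class and a balanced target distribution over classes.
feats = np.array([[0.1, 0.0], [0.0, 0.1], [-0.1, 0.0], [0.0, -0.1],
                  [1.1, 1.0], [0.9, 1.0]])
protos = np.array([[0.0, 0.0], [1.0, 1.0]])
b = np.full(2, 0.5)                              # balanced class marginal

# Squared-Euclidean cost between each sample and each class prototype.
C = ((feats[:, None, :] - protos[None, :, :]) ** 2).sum(-1)
K = np.exp(-C / 0.1)                             # entropic kernel
P = K * (b / K.sum(axis=0))                      # enforce only the balanced (column) marginal
weights = P.sum(axis=1)                          # mass each sample receives = its weight

print(np.round(weights, 3))
```

Because the balanced target splits its mass evenly across classes, the two minority samples end up with roughly twice the weight of each majority sample, which is the qualitative effect the OT-based re-weighting aims for.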

Representing Mixtures of Word Embeddings with Mixtures of Topic Embeddings

2 code implementations · ICLR 2022 · Dongsheng Wang, Dandan Guo, He Zhao, Huangjie Zheng, Korawat Tanwisuth, Bo Chen, Mingyuan Zhou

This paper introduces a new topic-modeling framework where each document is viewed as a set of word embedding vectors and each topic is modeled as an embedding vector in the same embedding space.

Word Embeddings
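The document-as-a-set-of-embeddings view can be sketched in a few lines. This toy (random vectors, softmax word-to-topic affinities) only illustrates the shared embedding space; the actual model learns these embeddings and optimizes a transport-based objective rather than the simple averaging shown here.

```python
import numpy as np

rng = np.random.default_rng(0)
V, T, D = 8, 3, 4                      # vocab size, topics, embedding dim
word_emb = rng.normal(size=(V, D))     # one embedding per word
topic_emb = rng.normal(size=(T, D))    # each topic is a point in the same space

doc = np.array([0, 1, 1, 5])           # a document = a multiset of word ids

# Word-to-topic affinities from dot products in the shared space.
logits = word_emb[doc] @ topic_emb.T                            # (len(doc), T)
probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)

theta = probs.mean(axis=0)             # document's topic proportions, sums to 1
print(np.round(theta, 3))
```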

Learning Prototype-oriented Set Representations for Meta-Learning

no code implementations · ICLR 2022 · Dandan Guo, Long Tian, Minghe Zhang, Mingyuan Zhou, Hongyuan Zha

Since our plug-and-play framework can be applied to many meta-learning problems, we further instantiate it to the cases of few-shot classification and implicit meta generative modeling.

Meta-Learning

Matching Visual Features to Hierarchical Semantic Topics for Image Paragraph Captioning

1 code implementation · 10 May 2021 · Dandan Guo, Ruiying Lu, Bo Chen, Zequn Zeng, Mingyuan Zhou

Inspired by recent successes in integrating semantic topics into this task, this paper develops a plug-and-play hierarchical-topic-guided image paragraph generation framework, which couples a visual extractor with a deep topic model to guide the learning of a language model.

Image Paragraph Captioning · Language Modelling · +1

Variational Temporal Deep Generative Model for Radar HRRP Target Recognition

no code implementations · 28 Sep 2020 · Dandan Guo, Bo Chen, Wenchao Chen, Chaojie Wang, Hongwei Liu, Mingyuan Zhou

We develop a recurrent gamma belief network (rGBN) for radar automatic target recognition (RATR) based on high-resolution range profile (HRRP), which characterizes the temporal dependence across the range cells of HRRP.

Variational Inference

Deep Autoencoding Topic Model with Scalable Hybrid Bayesian Inference

no code implementations · 15 Jun 2020 · Hao Zhang, Bo Chen, Yulai Cong, Dandan Guo, Hongwei Liu, Mingyuan Zhou

Given a posterior sample of the global parameters, we propose a Weibull upward-downward variational encoder to efficiently infer a document's local latent representations under DATM across all stochastic layers: it deterministically propagates information upward through a deep neural network, then generates stochastically downward through a Weibull-distribution-based generative model.

Bayesian Inference
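The upward-downward pass can be sketched as follows, using the standard inverse-CDF reparameterization of the Weibull distribution (x = λ(-ln(1-u))^{1/k}, u ~ Uniform(0,1)). The weights, layer sizes, and parameterizations below are illustrative placeholders, not the trained DATM model.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_weibull(k, lam):
    """Reparameterized Weibull draw: x = lam * (-ln(1 - u))**(1/k)."""
    u = rng.uniform(size=np.shape(k))
    return lam * (-np.log1p(-u)) ** (1.0 / k)

# Tiny two-layer sketch: deterministic upward pass through random
# "encoder" weights, then a stochastic downward pass where each layer's
# latent representation is drawn from a Weibull whose parameters depend
# on the upward features (all weights here are untrained toys).
x = rng.poisson(3.0, size=10).astype(float)       # bag-of-words counts
W1, W2 = rng.uniform(size=(10, 6)), rng.uniform(size=(6, 3))

h1 = np.log1p(x @ W1)                             # upward, layer 1
h2 = np.log1p(h1 @ W2)                            # upward, layer 2

theta2 = sample_weibull(k=1.0 + h2, lam=1.0 + h2)              # downward, top layer
theta1 = sample_weibull(k=1.0 + h1, lam=theta2 @ W2.T + 0.1)   # downward, layer 1
print(theta1.shape, theta2.shape)
```

The reparameterized draw keeps the sampling step differentiable in (k, λ), which is what makes the Weibull a convenient surrogate for gamma-distributed latents in autoencoding variational inference.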

Recurrent Hierarchical Topic-Guided RNN for Language Generation

1 code implementation · ICML 2020 · Dandan Guo, Bo Chen, Ruiying Lu, Mingyuan Zhou

To simultaneously capture syntax and global semantics from a text corpus, we propose a new larger-context recurrent neural network (RNN) based language model, which extracts recurrent hierarchical semantic structure via a dynamic deep topic model to guide natural language generation.

Language Modelling · Sentence · +1

Recurrent Hierarchical Topic-Guided Neural Language Models

no code implementations · 25 Sep 2019 · Dandan Guo, Bo Chen, Ruiying Lu, Mingyuan Zhou

To simultaneously capture syntax and semantics from a text corpus, we propose a new larger-context language model that extracts recurrent hierarchical semantic structure via a dynamic deep topic model to guide natural language generation.

Language Modelling · Sentence · +1

Deep Poisson gamma dynamical systems

no code implementations · NeurIPS 2018 · Dandan Guo, Bo Chen, Hao Zhang, Mingyuan Zhou

We develop deep Poisson-gamma dynamical systems (DPGDS) to model sequentially observed multivariate count data, improving previously proposed models by not only mining deep hierarchical latent structure from the data, but also capturing both first-order and long-range temporal dependencies.

Data Augmentation · Time Series · +1
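The generative process behind a Poisson-gamma dynamical system can be sketched by chaining gamma-distributed latent states through a transition matrix and emitting Poisson counts at each step. Dimensions, the concentration value, and the single-layer depth here are arbitrary toy choices (DPGDS stacks several such layers).

```python
import numpy as np

rng = np.random.default_rng(1)
V, K, T = 5, 3, 20                # vocab size, latent factors, time steps

Phi = rng.dirichlet(np.ones(V), size=K).T     # (V, K) loadings, columns sum to 1
Pi = rng.dirichlet(np.ones(K), size=K).T      # (K, K) transitions, columns sum to 1
tau = 10.0                                    # concentration of the gamma chain

theta = rng.gamma(1.0, size=K)                # initial latent state
counts = []
for t in range(T):
    # Gamma-chained dynamics: E[theta_t | theta_{t-1}] = Pi @ theta_{t-1}.
    theta = rng.gamma(tau * (Pi @ theta) + 1e-6) / tau   # tiny floor for numerical safety
    counts.append(rng.poisson(Phi @ theta))   # observed multivariate counts
counts = np.array(counts)                     # (T, V) count time series
print(counts.shape)
```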

WHAI: Weibull Hybrid Autoencoding Inference for Deep Topic Modeling

1 code implementation · ICLR 2018 · Hao Zhang, Bo Chen, Dandan Guo, Mingyuan Zhou

To train an inference network jointly with a deep generative topic model, making it both scalable to big corpora and fast in out-of-sample prediction, we develop Weibull hybrid autoencoding inference (WHAI) for deep latent Dirichlet allocation. WHAI infers posterior samples via a hybrid of stochastic-gradient MCMC and autoencoding variational Bayes.
