Search Results for author: Le Fang

Found 9 papers, 8 papers with code

Surprisal Predicts Code-Switching in Chinese-English Bilingual Text

no code implementations • EMNLP 2020 • Jesús Calvillo, Le Fang, Jeremy Cole, David Reitter

We also found sentence length to be a significant predictor, which has been related to sentence complexity.

Sentence
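
Surprisal, the paper's key predictor, is the standard information-theoretic quantity -log2 p(token | context). As a rough illustration only (the paper fits its own language models to bilingual text; GPT-2 via Hugging Face transformers is a stand-in assumption here), per-token surprisal can be computed like this:

```python
# Illustrative sketch: per-token surprisal under a pretrained LM.
# GPT-2 is an assumption for demonstration, not the paper's model.
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2").eval()

def token_surprisals(text):
    ids = tokenizer(text, return_tensors="pt").input_ids
    with torch.no_grad():
        logits = model(ids).logits
    log_probs = torch.log_softmax(logits[0, :-1], dim=-1)  # position t predicts token t+1
    targets = ids[0, 1:]
    bits = -log_probs[torch.arange(len(targets)), targets] / torch.log(torch.tensor(2.0))
    return list(zip(tokenizer.convert_ids_to_tokens(targets.tolist()), bits.tolist()))
```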

Meta-Learning with Less Forgetting on Large-Scale Non-Stationary Task Distributions

1 code implementation • 3 Sep 2022 • Zhenyi Wang, Li Shen, Le Fang, Qiuling Suo, Donglin Zhan, Tiehang Duan, Mingchen Gao

Two key challenges arise in this more realistic setting: (i) how to use unlabeled data in the presence of a large amount of unlabeled out-of-distribution (OOD) data; and (ii) how to prevent catastrophic forgetting on previously learned task distributions due to the task distribution shift.

Meta-Learning

Improving Task-free Continual Learning by Distributionally Robust Memory Evolution

1 code implementation • 15 Jul 2022 • Zhenyi Wang, Li Shen, Le Fang, Qiuling Suo, Tiehang Duan, Mingchen Gao

To address these problems, we propose, for the first time, a principled memory evolution framework that dynamically evolves the memory data distribution, using distributionally robust optimization (DRO) to make the memory buffer gradually harder to memorize.

Continual Learning
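
A minimal sketch of the DRO intuition in the abstract, not the paper's exact algorithm: the inner step perturbs stored examples along the gradient that increases the loss, keeping the buffer hard to memorize. The step size eta and the single ascent step are assumptions of this sketch.

```python
import torch

def evolve_memory(model, loss_fn, mem_x, mem_y, eta=0.01):
    """One illustrative inner-maximization step: make memory harder to fit."""
    mem_x = mem_x.clone().requires_grad_(True)
    loss = loss_fn(model(mem_x), mem_y)
    grad, = torch.autograd.grad(loss, mem_x)
    return (mem_x + eta * grad).detach()  # gradient *ascent* on the memory data
```

The evolved buffer would then be replayed as usual; the paper formalizes this evolution as distributionally robust optimization rather than a single fixed-step ascent.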

Learning To Learn and Remember Super Long Multi-Domain Task Sequence

1 code implementation • CVPR 2022 • Zhenyi Wang, Li Shen, Tiehang Duan, Donglin Zhan, Le Fang, Mingchen Gao

We propose a domain shift detection technique to capture latent domain changes and equip the meta optimizer with it so that it works in this setting.

Meta-Learning
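
The abstract does not spell out the detector, so the following is a generic loss-statistics sketch of what shift detection can look like; the window size and the 3-sigma threshold are assumptions, not the paper's design.

```python
from collections import deque
import statistics

class ShiftDetector:
    """Flag a latent domain change when the task loss jumps above its recent range."""
    def __init__(self, window=50):
        self.losses = deque(maxlen=window)

    def update(self, loss):
        shifted = False
        if len(self.losses) >= 10:  # need enough history for stable statistics
            mu = statistics.mean(self.losses)
            sigma = statistics.stdev(self.losses)
            shifted = loss > mu + 3 * sigma
        self.losses.append(loss)
        return shifted
```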

Meta Learning on a Sequence of Imbalanced Domains with Difficulty Awareness

1 code implementation • ICCV 2021 • Zhenyi Wang, Tiehang Duan, Le Fang, Qiuling Suo, Mingchen Gao

In this paper, we explore a more practical and challenging setting where task distribution changes over time with domain shift.

Change Detection • Management +1

Transformer-based Conditional Variational Autoencoder for Controllable Story Generation

2 code implementations • 4 Jan 2021 • Le Fang, Tao Zeng, Chaochun Liu, Liefeng Bo, Wen Dong, Changyou Chen

In this paper, we advocate reviving latent variable modeling, essentially the power of representation learning, in the era of Transformers to enhance controllability without hurting state-of-the-art generation effectiveness.

Representation Learning • Story Generation
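
A minimal sketch of the general pattern the abstract points at, a Transformer autoencoder with a variational latent code, assuming a simple design in which the sampled latent is prepended to the decoder input as one extra position; the paper's architecture differs in its details.

```python
import torch
import torch.nn as nn

class TransformerCVAE(nn.Module):
    def __init__(self, vocab_size=50257, d_model=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        enc = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc, num_layers=2)
        self.to_mu = nn.Linear(d_model, d_model)
        self.to_logvar = nn.Linear(d_model, d_model)
        dec = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.decoder = nn.TransformerEncoder(dec, num_layers=2)
        self.out = nn.Linear(d_model, vocab_size)

    def forward(self, x):                                   # x: (batch, seq) token ids
        h = self.encoder(self.embed(x)).mean(dim=1)         # pooled sentence code
        mu, logvar = self.to_mu(h), self.to_logvar(h)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()  # reparameterization trick
        dec_in = torch.cat([z.unsqueeze(1), self.embed(x)], dim=1)  # z as a prefix position
        L = dec_in.size(1)
        causal = torch.triu(torch.full((L, L), float("-inf")), diagonal=1)
        logits = self.out(self.decoder(dec_in, mask=causal))[:, :-1]  # position i predicts token i
        return logits, mu, logvar  # train with reconstruction loss + KL(q(z|x) || N(0, I))
```

Controllability then comes from manipulating z at generation time instead of relying only on the input tokens.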

Outline to Story: Fine-grained Controllable Story Generation from Cascaded Events

1 code implementation • 4 Jan 2021 • Le Fang, Tao Zeng, Chaochun Liu, Liefeng Bo, Wen Dong, Changyou Chen

To our knowledge, our paper is among the first to propose a model and to create datasets for the task of "outline to story".

Keyword Extraction • Language Modelling +1

Implicit Deep Latent Variable Models for Text Generation

1 code implementation • IJCNLP 2019 • Le Fang, Chunyuan Li, Jianfeng Gao, Wen Dong, Changyou Chen

Deep latent variable models (LVMs) such as the variational auto-encoder (VAE) have recently played an important role in text generation.

Language Modelling • Response Generation +2
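
For context, such models are trained by maximizing the ELBO: a reconstruction term minus a KL term. The sketch below shows the standard Gaussian-posterior objective; the closed-form KL here is what an implicit posterior, as in this paper, replaces with a sample-based estimate.

```python
import torch
import torch.nn.functional as F

def vae_loss(logits, targets, mu, logvar, pad_id=0):
    # Reconstruction: token-level cross-entropy, summed over the sequence.
    recon = F.cross_entropy(logits.reshape(-1, logits.size(-1)),
                            targets.reshape(-1),
                            ignore_index=pad_id, reduction="sum")
    # Closed-form KL(q(z|x) || N(0, I)) for a diagonal Gaussian posterior.
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return recon + kl
```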

Expectation Propagation with Stochastic Kinetic Model in Complex Interaction Systems

1 code implementation • NeurIPS 2017 • Le Fang, Fan Yang, Wen Dong, Tong Guan, Chunming Qiao

Technological breakthroughs allow us to collect data with increasing spatio-temporal resolution from complex interaction systems.
