no code implementations • ICML 2020 • Haoran Sun, Songtao Lu, Mingyi Hong
Similarly, for online problems, the proposed method achieves an $\mathcal{O}(m \epsilon^{-3/2})$ sample complexity and an $\mathcal{O}(\epsilon^{-1})$ communication complexity, while the best existing bounds are $\mathcal{O}(m\epsilon^{-2})$ and $\mathcal{O}(\epsilon^{-2})$, respectively.
no code implementations • COLING 2022 • Haoran Sun, Deyi Xiong
Knowledge transfer across languages is crucial for multilingual neural machine translation.
no code implementations • 13 Jun 2023 • Yurong Chen, Qian Wang, Zhijian Duan, Haoran Sun, Zhaohua Chen, Xiang Yan, Xiaotie Deng
To the best of our knowledge, we are the first to consider bidder coordination in online repeated auctions with constraints.
no code implementations • 20 May 2023 • Zhijian Duan, Haoran Sun, Yurong Chen, Xiaotie Deng
AMenuNet is always DSIC and individually rational (IR) due to the properties of AMAs, and it enhances scalability by generating candidate allocations through a neural network.
no code implementations • 19 May 2023 • Shun Zhang, Haoran Sun, Runze Yu, Hongshenyuan Cui, Jian Ren, Feifei Gao, Shi Jin, Hongxiang Xie, Hao Wang
In particular, we adopt a self-developed broadband intelligent communication system 40MHz-Net (BICT-40N) terminal to fully acquire the channel information.
1 code implementation • 17 Feb 2023 • Haoran Sun, Yang Wang, Haipeng Liu, Biao Qian
The proposed FF-Block integrates an attention block and several convolution layers to effectively fuse the fine-grained word-context features into the corresponding visual features, so that the text information is fully exploited to refine the initial image with more details.
no code implementations • 30 Nov 2022 • Haoran Sun, Lijun Yu, Bo Dai, Dale Schuurmans, Hanjun Dai
Score-based modeling through stochastic differential equations (SDEs) has provided a new perspective on diffusion models, and demonstrated superior performance on continuous data.
1 code implementation • 16 Sep 2022 • Haoran Sun, Hanjun Dai, Dale Schuurmans
Optimal scaling has been well studied for Metropolis-Hastings (M-H) algorithms in continuous spaces, but a similar understanding has been lacking in discrete spaces.
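For context, the scaling question concerns how the proposal step size trades off against the acceptance rate. The following is a minimal random-walk Metropolis sampler in the continuous setting (an illustrative sketch, not the paper's discrete-space method; the Gaussian target and the `sigma` values are assumed for the example):

```python
import math
import random

def rw_metropolis(log_p, x0, sigma, n_steps, seed=0):
    """Random-walk Metropolis: propose x' = x + sigma*z, accept w.p. min(1, p(x')/p(x))."""
    rng = random.Random(seed)
    x, accepted = x0, 0
    samples = []
    for _ in range(n_steps):
        x_prop = x + sigma * rng.gauss(0.0, 1.0)
        # The proposal is symmetric, so the acceptance ratio reduces to p(x')/p(x).
        if math.log(rng.random()) < log_p(x_prop) - log_p(x):
            x, accepted = x_prop, accepted + 1
        samples.append(x)
    return samples, accepted / n_steps

# Standard normal target: larger step sizes lower the acceptance rate,
# which is the trade-off that optimal-scaling theory quantifies.
log_p = lambda x: -0.5 * x * x
_, rate_small = rw_metropolis(log_p, 0.0, sigma=0.1, n_steps=5000)
_, rate_large = rw_metropolis(log_p, 0.0, sigma=5.0, n_steps=5000)
```

Optimal-scaling results characterize which intermediate `sigma` (equivalently, which acceptance rate) maximizes mixing efficiency.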
no code implementations • 23 Jul 2022 • Haoran Sun, Etash K. Guha, Hanjun Dai
However, learning neural networks for CO problems is notoriously difficult in the absence of labeled data, as the training is easily trapped at local optima.
no code implementations • 29 Jun 2022 • Haoran Sun, Hanjun Dai, Bo Dai, Haomin Zhou, Dale Schuurmans
It is known that gradient-based MCMC samplers for continuous spaces, such as Langevin Monte Carlo (LMC), can be derived as particle versions of a gradient flow that minimizes KL divergence on a Wasserstein manifold.
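The LMC update referred to above can be sketched in a few lines (an illustrative unadjusted-Langevin discretization for a standard normal target; the step size and iteration counts are assumed for the example):

```python
import math
import random

def lmc_step(x, grad_log_p, step, rng):
    """One Langevin Monte Carlo step: x <- x + (step/2) * grad log p(x) + sqrt(step) * z."""
    return x + 0.5 * step * grad_log_p(x) + math.sqrt(step) * rng.gauss(0.0, 1.0)

# Target: standard normal, so grad log p(x) = -x. The chain drifts toward
# high-density regions while the injected noise keeps it exploring.
rng = random.Random(0)
x = 5.0
xs = []
for _ in range(20000):
    x = lmc_step(x, lambda v: -v, step=0.1, rng=rng)
    xs.append(x)

# Discard burn-in, then check the empirical moments against the target's.
tail = xs[5000:]
mean = sum(tail) / len(tail)
var = sum((v - mean) ** 2 for v in tail) / len(tail)
```

The drift term is the particle-level counterpart of the Wasserstein gradient flow mentioned above: each step moves samples along the gradient of log-density, which decreases the KL divergence to the target in expectation.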
no code implementations • 28 Dec 2021 • Bingqing Song, Haoran Sun, Wenqiang Pu, Sijia Liu, Mingyi Hong
We then provide a series of theoretical results to further understand the properties of the two approaches.
no code implementations • NeurIPS 2021 • Xinshi Chen, Haoran Sun, Caleb Ellington, Eric Xing, Le Song
We consider the problem of discovering $K$ related Gaussian directed acyclic graphs (DAGs), where the involved graph structures share a consistent causal order and sparse unions of supports.
no code implementations • 18 Oct 2021 • Haoran Sun, Chen Chen, Lantian Li, Dong Wang
SpeechFlow is a powerful factorization model based on information bottleneck (IB), and its effectiveness has been reported by several studies.
no code implementations • ICLR 2022 • Haoran Sun, Hanjun Dai, Wei Xia, Arun Ramamurthy
Energy-based models (EBMs) offer a powerful approach to modeling discrete structure, but both inference and learning of EBMs are hard, as they involve sampling from discrete distributions.
no code implementations • ICLR 2022 • Xinshi Chen, Haoran Sun, Le Song
In this work, we propose PLISA (Provable Learning-based Iterative Sparse recovery Algorithm) to learn algorithms automatically from data.
1 code implementation • 3 May 2021 • Haoran Sun, Wenqiang Pu, Xiao Fu, Tsung-Hui Chang, Mingyi Hong
However, it is often challenging for these approaches to learn in a dynamic environment.
no code implementations • NeurIPS Workshop LMCA 2020 • Haoran Sun, Wenbo Chen, Hui Li, Le Song
Branch-and-Bound (B&B) is a general and widely used algorithm paradigm for solving Mixed Integer Programming (MIP).
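As a reminder of the paradigm, B&B can be sketched on a toy 0/1 knapsack instance (a hypothetical example, not the paper's learned method): branch on one variable at a time and prune any node whose fractional-relaxation bound cannot beat the best integer solution found so far.

```python
def branch_and_bound_knapsack(values, weights, capacity):
    """Minimal B&B for 0/1 knapsack, pruning with the fractional (LP) relaxation bound."""
    n = len(values)
    # Consider items in decreasing value density so the relaxation bound is tight.
    order = sorted(range(n), key=lambda i: values[i] / weights[i], reverse=True)

    def bound(idx, value, room):
        # Upper bound: greedily pack remaining items, taking a fraction of the last one.
        for i in order[idx:]:
            if weights[i] <= room:
                room -= weights[i]
                value += values[i]
            else:
                return value + values[i] * room / weights[i]
        return value

    best = 0
    stack = [(0, 0, capacity)]  # (position in `order`, value so far, remaining capacity)
    while stack:
        idx, value, room = stack.pop()
        if value > best:
            best = value  # new incumbent
        if idx == n or bound(idx, value, room) <= best:
            continue  # prune: this subtree cannot improve on the incumbent
        i = order[idx]
        stack.append((idx + 1, value, room))  # branch: exclude item i
        if weights[i] <= room:
            stack.append((idx + 1, value + values[i], room - weights[i]))  # branch: include item i
    return best
```

Real MIP solvers follow the same skeleton with LP relaxations and sophisticated branching rules; learning such rules from data is the direction this line of work explores.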
4 code implementations • 16 Nov 2020 • Haoran Sun, Wenqiang Pu, Minghe Zhu, Xiao Fu, Tsung-Hui Chang, Mingyi Hong
We propose to build the notion of continual learning (CL) into the modeling process of learning wireless systems, so that the learning model can incrementally adapt to the new episodes, without forgetting knowledge learned from the previous episodes.
no code implementations • 27 Oct 2020 • Haoran Sun, Lantian Li, Yunqi Cai, Yang Zhang, Thomas Fang Zheng, Dong Wang
Various information factors are blended in speech signals, which forms the primary difficulty for most speech information processing tasks.
no code implementations • 20 Jun 2020 • Mingyi Hong, Siliang Zeng, Junyu Zhang, Haoran Sun
However, by constructing some counter-examples, we show that when certain local Lipschitz conditions (LLC) on the local function gradient $\nabla f_i$'s are not satisfied, most of the existing decentralized algorithms diverge, even if the global Lipschitz condition (GLC) is satisfied, where the sum function $f$ has Lipschitz gradient.
no code implementations • 15 Jan 2020 • Haoran Sun, Xueqing Liu, Xinyang Feng, Chen Liu, Nanyan Zhu, Sabrina J. Gjerswold-Selleck, Hong-Jian Wei, Pavan S. Upadhyayula, Angeliki Mela, Cheng-Chia Wu, Peter D. Canoll, Andrew F. Laine, J. Thomas Vaughan, Scott A. Small, Jia Guo
Together, these studies validate our hypothesis that a deep learning approach can potentially replace the need for GBCAs in brain MRI.
no code implementations • 29 Oct 2019 • Haoran Sun, Yunqi Cai, Lantian Li, Dong Wang
Speech signals are complex composites of various information, including phonetic content, speaker traits, channel effect, etc.
no code implementations • 21 Jul 2019 • Zhanzhan Zhao, Haoran Sun
This paper proposes an algorithm, Alice, that has no access to the physics law of the environment (which is in fact linear with stochastic noise) and learns to make decisions directly online, without a training phase or a stable policy as initial input.
no code implementations • NeurIPS 2020 • Xiangyi Chen, Tiancong Chen, Haoran Sun, Zhiwei Steven Wu, Mingyi Hong
We show that these algorithms are non-convergent whenever there is some disparity between the expected median and mean over the local gradients.
no code implementations • 8 Jun 2018 • Xinye Cai, Haoran Sun, Chunyang Zhu, Zhenyu Li, Qingfu Zhang
In this paper, an evolutionary many-objective optimization algorithm based on corner solution search (MaOEA-CS) is proposed.