no code implementations • WMT (EMNLP) 2020 • Xiangpeng Wei, Ping Guo, Yunpeng Li, Xingsheng Zhang, Luxi Xing, Yue Hu
In this paper we introduce the systems IIE submitted for the WMT20 shared task on German-French news translation.
no code implementations • 2 May 2024 • Jiaxi Li, John-Joseph Brady, Xiongjie Chen, Yunpeng Li
Differentiable particle filters combine the flexibility of neural networks with the probabilistic nature of sequential Monte Carlo methods.
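Differentiable particle filters build on the classical bootstrap particle filter. As a point of reference, here is a minimal NumPy sketch of the (non-differentiable) bootstrap filter for a 1-D random-walk state-space model; the function name, noise levels and model are illustrative, not taken from the paper.

```python
import numpy as np

def bootstrap_particle_filter(observations, n_particles=500,
                              process_std=0.5, obs_std=1.0, seed=0):
    """Bootstrap particle filter for a 1-D random-walk model.

    Dynamics:    x_t = x_{t-1} + N(0, process_std^2)
    Measurement: y_t = x_t + N(0, obs_std^2)
    Returns the posterior-mean estimate of x_t at each step.
    """
    rng = np.random.default_rng(seed)
    particles = rng.normal(0.0, 1.0, size=n_particles)  # samples from the prior
    estimates = []
    for y in observations:
        # Propagate each particle through the dynamic model.
        particles = particles + rng.normal(0.0, process_std, size=n_particles)
        # Weight particles by the Gaussian measurement likelihood.
        log_w = -0.5 * ((y - particles) / obs_std) ** 2
        w = np.exp(log_w - log_w.max())
        w /= w.sum()
        estimates.append(float(np.sum(w * particles)))
        # Multinomial resampling to combat weight degeneracy.
        idx = rng.choice(n_particles, size=n_particles, p=w)
        particles = particles[idx]
    return np.array(estimates)
```

A differentiable variant would replace the fixed Gaussian dynamic and measurement models with neural networks and use a differentiable resampling scheme, so that gradients can flow through the whole filter during training.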
no code implementations • 3 Mar 2024 • Xiongjie Chen, Yunpeng Li
Recently, there has been a surge of interest in incorporating neural networks into particle filters, e.g. differentiable particle filters, to perform joint sequential state estimation and model learning for non-linear, non-Gaussian state-space models in complex environments.
no code implementations • 5 Jan 2024 • Yang Yang, Yury Kartynnik, Yunpeng Li, Jiuqiang Tang, Xing Li, George Sung, Matthias Grundmann
We present StreamVC, a streaming voice conversion solution that preserves the content and prosody of any source speech while matching the voice timbre from any target speech.
no code implementations • 10 Dec 2023 • Jiaxi Li, Xiongjie Chen, Yunpeng Li
Differentiable particle filters are an emerging class of sequential Bayesian inference techniques that use neural networks to construct components in state space models.
no code implementations • 13 Mar 2023 • Yang Yang, Shao-Fu Shih, Hakan Erdogan, Jamie Menjay Lin, Chehung Lee, Yunpeng Li, George Sung, Matthias Grundmann
The multi-microphone speech enhancement problem is often decomposed into two decoupled steps: a beamformer that provides spatial filtering, and a single-channel speech enhancement model that cleans up the beamformer output.
no code implementations • 20 Feb 2023 • Wenhan Li, Xiongjie Chen, Wenwu Wang, Víctor Elvira, Yunpeng Li
Differentiable particle filters are an emerging class of particle filtering methods that use neural networks to construct and learn parametric state-space models.
no code implementations • 19 Feb 2023 • Xiongjie Chen, Yunpeng Li
Due to the expressiveness of neural networks, differentiable particle filters are a promising computational tool for performing inference on sequential data in complex, high-dimensional tasks, such as vision-based robot localisation.
no code implementations • 9 Nov 2022 • Karthik Comandur, Yunpeng Li, Santosh Nannuru
In this paper, we use a bank of PFGPF filters to construct a particle flow Gaussian sum particle filter (PFGSPF), which approximates the predictive and posterior distributions as Gaussian mixture models.
no code implementations • 1 Nov 2022 • Wei Peng, Ziyuan Qin, Yue Hu, Yuqiang Xie, Yunpeng Li
The core module in FADO consists of a dual-level feedback strategy selector and a double control reader.
no code implementations • COLING 2022 • Yuqiang Xie, Yue Hu, Yunpeng Li, Guanqun Bi, Luxi Xing, Wei Peng
Inspired by psychology theories, we introduce global psychological state chains, which include the needs and emotions of the protagonists, to help a story generation system create more controllable and well-planned stories.
no code implementations • 4 Jul 2022 • Karthik Comandur, Yunpeng Li, Santosh Nannuru
State estimation in non-linear models is performed by tracking the posterior distribution recursively.
no code implementations • 26 Jun 2022 • Xiongjie Chen, Yunpeng Li, Yongxin Yang
Out-of-distribution (OOD) detection has recently received much attention from the machine learning community due to its importance in deploying machine learning models in real-world applications.
1 code implementation • 27 Apr 2022 • Wei Peng, Yue Hu, Luxi Xing, Yuqiang Xie, Yajing Sun, Yunpeng Li
Emotional support conversation aims at reducing the emotional distress of the help-seeker, which is a new and challenging task.
1 code implementation • 16 Mar 2022 • Xiongjie Chen, Yunpeng Li
Tuning of measurement models is challenging in real-world applications of sequential Monte Carlo methods.
1 code implementation • 1 Mar 2022 • Oleg Rybakov, Marco Tagliasacchi, Yunpeng Li, Liyang Jiang, Xia Zhang, Fadi Biadsy
We present two methods of real-time magnitude spectrogram inversion: streaming Griffin-Lim (GL) and streaming MelGAN.
1 code implementation • 18 Feb 2022 • Yuqiang Xie, Yue Hu, Luxi Xing, Yunpeng Li, Wei Peng, Ping Guo
To address these two issues, we propose a novel Contrastive Learning framework for Story Ending Generation (CLSEG), which has two steps: multi-aspect sampling and story-specific contrastive learning.
no code implementations • 12 Dec 2021 • Yunpeng Li, ZhaoHui Ye
Independent component analysis is intended to recover the mutually independent components from their linear mixtures.
no code implementations • 30 Nov 2021 • Yunpeng Li
Independent Component Analysis (ICA) is intended to recover the mutually independent sources from their linear mixtures, and FastICA is one of the most successful ICA algorithms.
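At the core of FastICA is a fixed-point iteration on a demixing vector applied to whitened data. A minimal one-unit sketch with the tanh nonlinearity follows; the function names, whitening routine and iteration limits are illustrative, not taken from the paper.

```python
import numpy as np

def whiten(X):
    """Centre X (n_features, n_samples) and whiten it to identity covariance."""
    X = X - X.mean(axis=1, keepdims=True)
    d, E = np.linalg.eigh(np.cov(X))
    return E @ np.diag(1.0 / np.sqrt(d)) @ E.T @ X

def fastica_one_unit(Z, n_iter=200, tol=1e-8, seed=0):
    """One-unit FastICA with the tanh nonlinearity on whitened data Z.

    Returns a unit vector w such that w @ Z estimates one independent source.
    """
    rng = np.random.default_rng(seed)
    w = rng.normal(size=Z.shape[0])
    w /= np.linalg.norm(w)
    for _ in range(n_iter):
        wz = w @ Z
        g, g_prime = np.tanh(wz), 1.0 - np.tanh(wz) ** 2
        # Fixed-point update: w <- E[Z g(w^T Z)] - E[g'(w^T Z)] w
        w_new = (Z * g).mean(axis=1) - g_prime.mean() * w
        w_new /= np.linalg.norm(w_new)
        converged = abs(abs(w_new @ w) - 1.0) < tol  # sign flips allowed
        w = w_new
        if converged:
            break
    return w
```

Further components would be extracted by repeating the iteration under a deflation (orthogonalisation) constraint against the vectors already found.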
1 code implementation • 14 Oct 2021 • Ivan Kiskin, Marianne Sinka, Adam D. Cobb, Waqas Rafique, Lawrence Wang, Davide Zilli, Benjamin Gutteridge, Rinita Dam, Theodoros Marinos, Yunpeng Li, Dickson Msaky, Emmanuel Kaindoa, Gerard Killeen, Eva Herreros-Moya, Kathy J. Willis, Stephen J. Roberts
Our extensive dataset is both challenging for machine learning researchers focusing on acoustic identification and critical for entomologists, geo-spatial modellers and other domain experts seeking to understand mosquito behaviour, model their distribution, and manage the threat they pose to humans.
no code implementations • 29 Sep 2021 • Narges Pourshahrokhi, Samaneh Kouchaki, Yunpeng Li, Payam M. Barnaghi
Generative Adversarial Networks (GANs) can learn complex distributions over images, audio, and other data that are difficult to model explicitly.
no code implementations • 24 Jul 2021 • Yunpeng Li
Convolutive blind source separation (BSS) is intended to recover the unknown components from their convolutive mixtures.
1 code implementation • 1 Jul 2021 • Xiongjie Chen, Hao Wen, Yunpeng Li
Differentiable particle filters provide a flexible mechanism to adaptively train dynamic and measurement models by learning from observed data.
no code implementations • 18 May 2021 • Conghui Hu, Yongxin Yang, Yunpeng Li, Timothy M. Hospedales, Yi-Zhe Song
The practical value of existing supervised sketch-based image retrieval (SBIR) algorithms is largely limited by the requirement for intensive data collection and labeling.
no code implementations • 21 Jan 2021 • Yunpeng Li, ZhaoHui Ye
Nonparametric maximum likelihood estimation is intended to infer the unknown probability density while making as few assumptions as possible.
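For illustration, a Gaussian kernel density estimator — a related but distinct nonparametric density estimator, not the maximum-likelihood approach studied in the paper — can be sketched as follows; the function name and bandwidth are illustrative.

```python
import numpy as np

def gaussian_kde_pdf(x_eval, samples, bandwidth):
    """Evaluate a Gaussian kernel density estimate at the points x_eval.

    Each sample contributes a Gaussian bump of width `bandwidth`;
    the estimate is the average of those bumps.
    """
    diffs = (x_eval[:, None] - samples[None, :]) / bandwidth
    kernels = np.exp(-0.5 * diffs ** 2) / np.sqrt(2.0 * np.pi)
    return kernels.mean(axis=1) / bandwidth
```

The bandwidth controls the bias-variance trade-off: small values track the samples closely but produce a noisy estimate, while large values over-smooth.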
1 code implementation • 11 Nov 2020 • Hao Wen, Xiongjie Chen, Georgios Papagiannis, Conghui Hu, Yunpeng Li
Recent advances in incorporating neural networks into particle filters provide the desired flexibility to apply particle filters in large-scale real-world applications.
1 code implementation • 19 Oct 2020 • Zalán Borsos, Yunpeng Li, Beat Gfeller, Marco Tagliasacchi
A crucial aspect for the successful deployment of audio-based models "in-the-wild" is the robustness to the transformations introduced by heterogeneous acquisition conditions.
1 code implementation • 4 Sep 2020 • Marco Tagliasacchi, Yunpeng Li, Karolis Misiunas, Dominik Roblek
We explore the possibility of leveraging accelerometer data to perform speech enhancement in very noisy conditions.
1 code implementation • 20 Aug 2020 • Georgios Papagiannis, Yunpeng Li
In this paper, we present tractable solutions by formulating imitation learning as minimization of the Sinkhorn distance between occupancy measures.
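The Sinkhorn distance is entropy-regularised optimal transport, computed by alternately rescaling the rows and columns of a Gibbs kernel until both marginals match. A minimal sketch for discrete histograms is below; the parameter values are illustrative, and this is not the paper's imitation-learning pipeline.

```python
import numpy as np

def sinkhorn_distance(a, b, cost, eps=0.1, n_iter=500):
    """Entropy-regularised OT cost between histograms a and b.

    a, b: strictly positive probability vectors; cost: pairwise cost matrix;
    eps: entropic regularisation strength.
    Returns <P, C> for the converged Sinkhorn transport plan P.
    """
    K = np.exp(-cost / eps)              # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)                # scale columns to match marginal b
        u = a / (K @ v)                  # scale rows to match marginal a
    P = u[:, None] * K * v[None, :]      # transport plan
    return float(np.sum(P * cost))
```

As eps shrinks, the result approaches the unregularised optimal transport cost, at the price of slower convergence and potential numerical underflow in the kernel.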
no code implementations • 5 Aug 2020 • Yunpeng Li, Beat Gfeller, Marco Tagliasacchi, Dominik Roblek
We propose an audio-to-audio neural network model that learns to denoise old music recordings.
1 code implementation • ICLR 2022 • Xiongjie Chen, Yongxin Yang, Yunpeng Li
While theoretically appealing, the application of the Wasserstein distance to large-scale machine learning problems has been hampered by its prohibitive computational cost.
1 code implementation • 24 May 2019 • Yunpeng Li, Dominik Roblek, Marco Tagliasacchi
We first obtain a latent video representation using a stochastic fusion mechanism that learns how to incorporate information from the start and end frames.
no code implementations • 19 Dec 2018 • Paul K. Rubenstein, Yunpeng Li, Dominik Roblek
Generative adversarial networks (GANs) are capable of producing high quality image samples.
no code implementations • 29 Nov 2018 • Olga Isupova, Yunpeng Li, Danil Kuzin, Stephen J. Roberts, Katherine Willis, Steven Reece
Machine learning research for developing countries can demonstrate clear sustainable impact by delivering actionable and timely information to in-country government organisations (GOs) and NGOs in response to their critical information requirements.
no code implementations • 7 Dec 2017 • Yunpeng Li, Ivan Kiskin, Davide Zilli, Marianne Sinka, Henry Chan, Kathy Willis, Stephen Roberts
Environmental acoustic sensing involves the retrieval and processing of audio signals to better understand our surroundings.
no code implementations • 16 Nov 2017 • Yunpeng Li, Davide Zilli, Henry Chan, Ivan Kiskin, Marianne Sinka, Stephen Roberts, Kathy Willis
Mosquitoes are a major vector for malaria, causing hundreds of thousands of deaths in the developing world each year.
no code implementations • 24 Feb 2017 • Hongchao Song, Yunpeng Li, Mark Coates, Aidong Men
One of the most widely used feature extraction methods is principal component analysis (PCA).
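As context for the PCA-based feature extraction mentioned above, a minimal SVD-based sketch follows; the function name and test setup are illustrative, not taken from the paper.

```python
import numpy as np

def pca_features(X, k):
    """Project the rows of X onto its top-k principal components.

    X: (n_samples, n_features) data matrix.
    Returns (scores, components): the reduced representation and the
    orthonormal component directions, one per row.
    """
    Xc = X - X.mean(axis=0)                            # centre each feature
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)  # thin SVD
    components = Vt[:k]                                # top-k directions
    scores = Xc @ components.T                         # low-dimensional scores
    return scores, components
```

The centred data can be approximately reconstructed as `scores @ components`, with reconstruction error equal to the variance left in the discarded components.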
no code implementations • 20 Mar 2014 • Yunpeng Li, Ya Li, Jie Liu, Yong Deng
The results of defuzzification at the first step do not coincide with those of defuzzification at the final step. It therefore seems preferable to defuzzify at the final step in fuzzy DEMATEL.
no code implementations • 23 Nov 2013 • Yunpeng Li, Jie Liu, Yong Deng
In this paper, we present an illustration of the history of Artificial Intelligence (AI) with a statistical analysis of publications since 1940.
no code implementations • CVPR 2013 • Aurelien Lucchi, Yunpeng Li, Pascal Fua
We propose a working set based approximate subgradient descent algorithm to minimize the margin-sensitive hinge loss arising from the soft constraints in max-margin learning frameworks, such as the structured SVM.