no code implementations • EMNLP (WNUT) 2020 • Xiangyu Yang, Giannis Bekoulis, Nikos Deligiannis
To cope with the noisy nature of the Twitter stream, our system makes use of COVID-Twitter-BERT (CT-BERT), a language model pre-trained on a large corpus of COVID-19-related Twitter messages.
no code implementations • 2 Aug 2022 • Tiange Li, Xiangyu Yang, Hao Wang
In this paper, we develop a simple yet effective screening rule strategy to improve the computational efficiency in solving structured optimization involving nonconvex $\ell_{q, p}$ regularization.
no code implementations • 13 Sep 2021 • Xiangyu Yang, Giannis Bekoulis, Nikos Deligiannis
In particular, we experiment with several models to identify (i) whether a tweet is traffic-related or not, and (ii) if it is, more fine-grained information about the event (e.g., the type of the event and where it happened).
no code implementations • 7 Apr 2021 • Hao Wang, Xiangyu Yang, Wei Jiang
Specifically, if the current iterate is in the interior of the feasible set, then the weighted $\ell_{1}$ ball is formed by linearizing the $\ell_{p}$ norm at the current iterate.
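The projection onto a weighted $\ell_{1}$ ball admits a closed-form soft-thresholding solution once a single scalar threshold is known. A minimal sketch (bisection on that threshold — an illustration, not the paper's exact routine) might look like:

```python
import numpy as np

def project_weighted_l1_ball(v, w, r, tol=1e-10):
    """Euclidean projection of v onto {x : sum_i w_i |x_i| <= r}, w_i > 0.

    The minimizer is a soft-thresholding of v, x_i = sign(v_i) *
    max(|v_i| - theta * w_i, 0); the scalar theta >= 0 is located by
    bisection, since the constraint value is decreasing in theta.
    """
    v = np.asarray(v, dtype=float)
    w = np.asarray(w, dtype=float)
    if np.sum(w * np.abs(v)) <= r:
        return v.copy()                      # already inside the ball
    lo, hi = 0.0, float(np.max(np.abs(v) / w))
    while hi - lo > tol:
        theta = 0.5 * (lo + hi)
        if np.sum(w * np.maximum(np.abs(v) - theta * w, 0.0)) > r:
            lo = theta                       # still infeasible: raise theta
        else:
            hi = theta
    theta = 0.5 * (lo + hi)
    return np.sign(v) * np.maximum(np.abs(v) - theta * w, 0.0)
```

For example, projecting $(3, 1)$ onto the unit (unweighted) $\ell_{1}$ ball yields $(1, 0)$.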
3 code implementations • 5 Jan 2021 • Xiangyu Yang, Jiashan Wang, Hao Wang
This paper primarily focuses on computing the Euclidean projection of a vector onto the $\ell_{p}$ ball, where $p\in(0, 1)$.
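One way to approach this nonconvex projection is the linearization idea described above: repeatedly replace the $\ell_{p}$ constraint by a weighted $\ell_{1}$ ball and project onto that. The sketch below is a hedged illustration of that scheme, not the paper's algorithm; the `eps` smoothing (guarding the $p-1$ power at zero) and the fixed iteration count are assumptions. Concavity of $t^{p}$ guarantees each iterate stays inside the $\ell_{p}$ ball.

```python
import numpy as np

def _project_weighted_l1_ball(v, w, r, tol=1e-10):
    """Projection onto {x : sum_i w_i |x_i| <= r} via soft-thresholding,
    with the scalar threshold located by bisection."""
    if np.sum(w * np.abs(v)) <= r:
        return v.copy()
    lo, hi = 0.0, float(np.max(np.abs(v) / w))
    while hi - lo > tol:
        theta = 0.5 * (lo + hi)
        if np.sum(w * np.maximum(np.abs(v) - theta * w, 0.0)) > r:
            lo = theta
        else:
            hi = theta
    return np.sign(v) * np.maximum(np.abs(v) - 0.5 * (lo + hi) * w, 0.0)

def project_lp_ball(v, p, r, eps=1e-8, iters=50):
    """Heuristic projection of v onto {x : ||x||_p^p <= r**p}, 0 < p < 1.

    Each pass linearizes sum_i |x_i|^p at the (eps-smoothed) current
    iterate and projects v onto the resulting weighted l1 ball; since
    t**p is concave, the linearization over-estimates the lp constraint,
    so every iterate is feasible for the original lp ball.
    """
    v = np.asarray(v, dtype=float)
    if np.sum(np.abs(v) ** p) <= r ** p:
        return v.copy()                      # nothing to do
    x = v.copy()
    for _ in range(iters):
        a = np.abs(x) + eps
        w = p * a ** (p - 1.0)                         # slope of t**p at a
        rk = r ** p - np.sum(a ** p) + np.sum(w * a)   # linearized radius
        x = _project_weighted_l1_ball(v, w, max(rk, 1e-12))
    return x
```

For instance, projecting $(1.2, 0.3)$ onto $\{x : |x_1|^{1/2} + |x_2|^{1/2} \le 1\}$ this way drives the iterates to roughly $(1, 0)$, a boundary point of the ball.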
no code implementations • 24 Feb 2020 • Xiangyu Yang, Sheng Hua, Yuanming Shi, Hao Wang, Jun Zhang, Khaled B. Letaief
By exploiting the inherent connections between task selection and the group sparsity structure of the transmit beamforming vector, we reformulate the optimization as a group sparse beamforming problem.
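Group sparsity of this kind is commonly induced through a mixed $\ell_{2,1}$-norm penalty, whose proximal operator zeroes out whole blocks at once. A hypothetical sketch (standard group soft-thresholding, not the paper's specific algorithm) of how a zeroed block corresponds to deselecting a task:

```python
import numpy as np

def group_soft_threshold(x, groups, lam):
    """Proximal operator of lam * sum_g ||x_g||_2 (the mixed l2,1 norm).

    Blocks whose l2 norm falls below lam are zeroed out entirely; in a
    beamforming setting a zeroed block would mean the corresponding task
    is deselected, while surviving blocks are uniformly shrunk.
    """
    x = np.asarray(x, dtype=float)
    out = np.zeros_like(x)
    for g in groups:                          # g: index list for one block
        block = x[g]
        norm = np.linalg.norm(block)
        if norm > lam:
            out[g] = (1.0 - lam / norm) * block   # shrink the whole block
        # else: the entire group stays zero
    return out
```

With `lam = 1`, the block $(3, 4)$ (norm 5) is shrunk to $(2.4, 3.2)$, while the block $(0.1, 0.2)$ (norm below 1) is removed entirely.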