no code implementations • 22 Feb 2024 • Yuwei Yang, Siqi Ouyang, Xueyu Hu, Mingyue Zheng, Hao Zhou, Lei Li
We develop a novel 3D graph editing model to generate molecules using fragments, and pre-train this model on abundant 3D ligands for learning target-independent properties.
no code implementations • 7 Dec 2023 • Xutai Ma, Anna Sun, Siqi Ouyang, Hirofumi Inaguma, Paden Tomasello
We introduce Efficient Monotonic Multihead Attention (EMMA), a state-of-the-art simultaneous translation model with numerically stable and unbiased monotonic alignment estimation.
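Monotonic alignment estimation computes, for each target step, the expected probability of stopping at each source position. A minimal division-free sketch of this quantity (the generic stable recurrence for expected monotonic alignments, not EMMA's exact estimator) — `p[i][j]` is an assumed per-step stopping probability:

```python
def expected_alignment(p):
    """Expected monotonic alignment probabilities.

    p[i][j]: probability that target step i stops at source position j,
    given that it has reached j. Returns alpha[i][j], the marginal
    probability that step i attends to source j, via the recurrence
        q[i][j] = (1 - p[i][j-1]) * q[i][j-1] + alpha[i-1][j]
        alpha[i][j] = p[i][j] * q[i][j]
    which avoids the division that makes naive estimators unstable.
    Generic illustration only, not the paper's exact formulation.
    """
    n_tgt, n_src = len(p), len(p[0])
    prev = [1.0] + [0.0] * (n_src - 1)  # virtual alpha[-1]: one-hot at source 0
    out = []
    for i in range(n_tgt):
        q = 0.0
        alpha = []
        for j in range(n_src):
            not_stop = (1.0 - p[i][j - 1]) if j > 0 else 0.0
            q = q * not_stop + prev[j]   # mass arriving at source j
            alpha.append(p[i][j] * q)    # mass that stops here
        out.append(alpha)
        prev = alpha
    return out
```

With stopping probability 1 everywhere, every target step deterministically attends to source position 0; with softer probabilities, mass spreads monotonically rightward and each row sums to at most 1.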
1 code implementation • 24 May 2023 • Siqi Ouyang, Lei Li
However, LLMs frequently fail in complex decision-making tasks due to the misalignment between the pre-trained knowledge in LLMs and the actual rules in the environment.
1 code implementation • 19 Dec 2022 • Siqi Ouyang, Rong Ye, Lei Li
In this paper, we propose Word-Aligned COntrastive learning (WACO), a simple and effective method for extremely low-resource speech-to-text translation.
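The core idea of word-aligned contrastive learning is to pull each speech-side representation toward its paired text-side representation while pushing it away from other pairs in the batch. A minimal InfoNCE-style sketch under that reading (pure Python; the function name, shapes, and temperature are illustrative, not WACO's actual API):

```python
import math

def contrastive_loss(speech_emb, text_emb, temperature=0.1):
    """InfoNCE contrastive loss over paired embeddings.

    speech_emb[i] and text_emb[i] are assumed to be representations of
    the same (aligned) word; all other rows serve as in-batch negatives.
    Hypothetical sketch of the general technique, not WACO's code.
    """
    def normalize(v):
        n = math.sqrt(sum(x * x for x in v))
        return [x / n for x in v]

    s = [normalize(v) for v in speech_emb]
    t = [normalize(v) for v in text_emb]
    loss = 0.0
    for i, si in enumerate(s):
        # Cosine similarities of speech item i to every text item.
        logits = [sum(a * b for a, b in zip(si, tj)) / temperature for tj in t]
        # Log-softmax with the max-subtraction trick for stability.
        m = max(logits)
        log_z = m + math.log(sum(math.exp(l - m) for l in logits))
        loss += -(logits[i] - log_z)  # negative log-prob of the true pair
    return loss / len(s)
```

When each speech embedding matches its paired text embedding, the loss is near zero; shuffling the pairing drives it up, which is the signal that aligns the two modalities.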
2 code implementations • 14 Dec 2022 • Xuandong Zhao, Siqi Ouyang, Zhiguo Yu, Ming Wu, Lei Li
How can we extend a pre-trained model to many language understanding tasks, without labeled or additional unlabeled data?
no code implementations • 17 Sep 2022 • Yujie Lu, Siqi Ouyang, Kairui Zhou
In this paper, we propose to leverage LMs alone to combine language and knowledge for knowledge-based question answering with flexibility, breadth of coverage, and structured reasoning.
1 code implementation • IWSLT (ACL) 2022 • Siqi Ouyang, Rong Ye, Lei Li
Training speech translation (ST) models requires large and high-quality datasets.
no code implementations • 29 Sep 2021 • Yuwei Yang, Siqi Ouyang, Meihua Dang, Mingyue Zheng, Lei Li, Hao Zhou
Deep learning models have been widely used in automatic drug design.
1 code implementation • 30 Jul 2018 • Wayne Xin Zhao, Gaole He, Hongjian Dou, Jin Huang, Siqi Ouyang, Ji-Rong Wen
Based on our linked dataset, we first perform some qualitative analysis experiments, in which we discuss the effect of two important factors (i.e., popularity and recency) on whether an RS item can be linked to a KB entity.