Search Results for author: Yue Wan

Found 5 papers, 2 papers with code

Bridging the Gap between Recognition-level Pre-training and Commonsensical Vision-language Tasks

no code implementations · CSRR (ACL) 2022 · Yue Wan, Yueen Ma, Haoxuan You, Zhecan Wang, Shih-Fu Chang

Large-scale visual-linguistic pre-training aims to capture the generic representations from multimodal features, which are essential for downstream vision-language tasks.

Tasks: Informativeness, Type prediction, +1

RiNALMo: General-Purpose RNA Language Models Can Generalize Well on Structure Prediction Tasks

1 code implementation · 29 Feb 2024 · Rafael Josip Penić, Tin Vlašić, Roland G. Huber, Yue Wan, Mile Šikić

RiNALMo is the largest RNA language model to date, with 650 million parameters pre-trained on 36 million non-coding RNA sequences from several available databases.

Tasks: Language Modelling

Improving Explainable Object-induced Model through Uncertainty for Automated Vehicles

no code implementations · 23 Feb 2024 · Shihong Ling, Yue Wan, Xiaowei Jia, Na Du

The rapid evolution of automated vehicles (AVs) has the potential to provide safer, more efficient, and comfortable travel options.

Tasks: Decision Making

From molecules to scaffolds to functional groups: building context-dependent molecular representation via multi-channel learning

no code implementations · 5 Nov 2023 · Yue Wan, Jialu Wu, Tingjun Hou, Chang-Yu Hsieh, Xiaowei Jia

Self-supervised learning (SSL) has emerged as a popular solution, using large-scale, unannotated molecular data to learn a foundational representation of chemical space that can benefit downstream tasks.

Tasks: Drug Discovery, Molecular Property Prediction, +3

Retroformer: Pushing the Limits of Interpretable End-to-end Retrosynthesis Transformer

1 code implementation · 29 Jan 2022 · Yue Wan, Benben Liao, Chang-Yu Hsieh, Shengyu Zhang

In this paper, we propose Retroformer, a novel Transformer-based architecture for retrosynthesis prediction without relying on any cheminformatics tools for molecule editing.

Tasks: Retrosynthesis
