Search Results for author: Zhengyi Li

Found 6 papers, 2 papers with code

Improving PTM Site Prediction by Coupling of Multi-Granularity Structure and Multi-Scale Sequence Representation

no code implementations4 Jan 2024 Zhengyi Li, Menglu Li, Lida Zhu, Wen Zhang

Specifically, multi-granularity structure-aware representation learning is designed to learn neighborhood structure representations at the amino acid, atom, and whole-protein granularity from AlphaFold-predicted structures, followed by contrastive learning to optimize the structure representations. Additionally, multi-scale sequence representation learning is used to extract context sequence information, and a motif generated by aligning all context sequences of PTM sites assists the prediction.

Contrastive Learning · Representation Learning
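
The abstract above outlines the overall design; below is a minimal, hypothetical sketch of how a multi-scale sequence encoder could be coupled with precomputed multi-granularity structure embeddings and a contrastive objective. All module names, dimensions, and the InfoNCE formulation are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch: couple multi-scale sequence features with precomputed
# multi-granularity structure embeddings for PTM site prediction.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiScaleSeqEncoder(nn.Module):
    """1-D convolutions with several kernel sizes over the PTM context window."""
    def __init__(self, vocab=21, dim=64, kernels=(3, 7, 15)):
        super().__init__()
        self.embed = nn.Embedding(vocab, dim)
        self.convs = nn.ModuleList(
            [nn.Conv1d(dim, dim, k, padding=k // 2) for k in kernels]
        )

    def forward(self, tokens):                       # (B, L) residue indices
        x = self.embed(tokens).transpose(1, 2)       # (B, dim, L)
        feats = [F.relu(c(x)).max(dim=2).values for c in self.convs]
        return torch.cat(feats, dim=1)               # (B, dim * len(kernels))

class PTMSitePredictor(nn.Module):
    def __init__(self, struct_dim=192, seq_dim=192):
        super().__init__()
        self.seq_enc = MultiScaleSeqEncoder(dim=seq_dim // 3)
        self.fuse = nn.Linear(struct_dim + seq_dim, 128)
        self.head = nn.Linear(128, 1)

    def forward(self, struct_repr, tokens):
        # struct_repr: pooled amino-acid / atom / whole-protein structure
        # embedding precomputed from a predicted structure, shape (B, struct_dim)
        z = torch.cat([struct_repr, self.seq_enc(tokens)], dim=1)
        return self.head(F.relu(self.fuse(z))).squeeze(-1)    # PTM-site logit

def info_nce(anchor, positive, temperature=0.1):
    """Contrastive (InfoNCE-style) loss to refine structure representations."""
    a = F.normalize(anchor, dim=1)
    p = F.normalize(positive, dim=1)
    logits = a @ p.t() / temperature                 # (B, B) similarity matrix
    targets = torch.arange(a.size(0), device=a.device)
    return F.cross_entropy(logits, targets)
```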

MAWSEO: Adversarial Wiki Search Poisoning for Illicit Online Promotion

no code implementations22 Apr 2023 Zilong Lin, Zhengyi Li, Xiaojing Liao, XiaoFeng Wang, Xiaozhong Liu

As a prominent form of vandalism edit, Wiki search poisoning for illicit promotion is a cybercrime in which the adversary edits Wiki articles so that illicit businesses are promoted through the Wiki search results of relevant queries.

Holistic Transformer: A Joint Neural Network for Trajectory Prediction and Decision-Making of Autonomous Vehicles

no code implementations17 Jun 2022 Hongyu Hu, Qi Wang, Zhengguang Zhang, Zhengyi Li, Zhenhai Gao

Trajectory prediction and behavioral decision-making are two important tasks for autonomous vehicles that require a good understanding of the environmental context; behavioral decisions are better made by referring to the outputs of trajectory prediction.

Autonomous Vehicles · Decision Making +2
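
As a rough illustration of the joint formulation described above, the sketch below lets a decision query cross-attend to the trajectory-prediction context so that the behavioral decision refers to the predicted trajectories. Layer sizes, the decision set, and all names are assumptions, not the Holistic Transformer architecture itself.

```python
# Hypothetical joint trajectory-prediction / decision head: the decision query
# attends over the same scene context used to predict trajectories.
import torch
import torch.nn as nn

class JointPredictDecide(nn.Module):
    def __init__(self, d_model=128, horizon=30, n_decisions=5):
        super().__init__()
        self.encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True),
            num_layers=2,
        )
        self.traj_head = nn.Linear(d_model, horizon * 2)        # future (x, y) per agent
        self.decision_query = nn.Parameter(torch.randn(1, 1, d_model))
        self.cross_attn = nn.MultiheadAttention(d_model, 4, batch_first=True)
        self.decision_head = nn.Linear(d_model, n_decisions)    # e.g. keep lane, change lane, ...

    def forward(self, scene_tokens):                 # (B, N_agents, d_model)
        ctx = self.encoder(scene_tokens)
        traj = self.traj_head(ctx)                   # (B, N_agents, horizon * 2)
        # The decision refers to the trajectory context via cross-attention.
        q = self.decision_query.expand(ctx.size(0), -1, -1)
        dec_ctx, _ = self.cross_attn(q, ctx, ctx)
        decision_logits = self.decision_head(dec_ctx.squeeze(1))
        return traj, decision_logits
```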

Transkimmer: Transformer Learns to Layer-wise Skim

1 code implementation ACL 2022 Yue Guan, Zhengyi Li, Jingwen Leng, Zhouhan Lin, Minyi Guo

To address the above limitations, we propose the Transkimmer architecture, which learns to identify hidden state tokens that are not required by each layer.

Computational Efficiency
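
A minimal sketch of the layer-wise skimming idea follows: a small gate predicts, per token and per layer, whether the hidden state is needed, and skipped tokens bypass the layer unchanged. The mask-based form shown here mirrors training-time behavior (an actual speedup would require gathering only the kept tokens); the gate design and sizes are assumptions, not the Transkimmer implementation.

```python
# Sketch of layer-wise token skimming: a per-layer gate decides which hidden
# states the layer needs; the rest are copied through unchanged.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SkimGate(nn.Module):
    def __init__(self, d_model):
        super().__init__()
        self.mlp = nn.Sequential(nn.Linear(d_model, d_model // 4),
                                 nn.GELU(),
                                 nn.Linear(d_model // 4, 2))    # [skip, keep] logits

    def forward(self, h, hard=True):
        # Gumbel-softmax gives a differentiable (soft) or discrete (hard) keep mask.
        probs = F.gumbel_softmax(self.mlp(h), tau=1.0, hard=hard)
        return probs[..., 1]                                     # (B, L) keep mask

class SkimmingLayer(nn.Module):
    def __init__(self, d_model=256):
        super().__init__()
        self.gate = SkimGate(d_model)
        self.layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)

    def forward(self, h):
        keep = self.gate(h).unsqueeze(-1)            # (B, L, 1)
        # Kept tokens go through the layer; skipped tokens are forwarded as-is.
        return keep * self.layer(h) + (1.0 - keep) * h
```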

Block-Skim: Efficient Question Answering for Transformer

1 code implementation16 Dec 2021 Yue Guan, Zhengyi Li, Jingwen Leng, Zhouhan Lin, Minyi Guo, Yuhao Zhu

We further prune the hidden states corresponding to the unnecessary positions early in lower layers, achieving significant inference-time speedup.

Extractive Question-Answering · Question Answering
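
The sketch below illustrates block-level pruning in the spirit described above: fixed-size blocks of the passage are scored in a lower layer, and the hidden states of low-scoring blocks are dropped before the upper layers run. The mean-pooled block scorer and threshold are illustrative assumptions, not necessarily the paper's exact scoring mechanism.

```python
# Rough sketch of block-level hidden-state pruning for extractive QA.
import torch
import torch.nn as nn

def prune_blocks(hidden, block_scorer, block_size=32, threshold=0.5):
    """hidden: (B, L, D) with L divisible by block_size (for simplicity)."""
    B, L, D = hidden.shape
    blocks = hidden.view(B, L // block_size, block_size, D)
    block_repr = blocks.mean(dim=2)                              # (B, n_blocks, D)
    keep_prob = torch.sigmoid(block_scorer(block_repr)).squeeze(-1)
    keep = keep_prob > threshold                                 # (B, n_blocks) bool
    pruned = []
    for b in range(B):
        kept_blocks = blocks[b][keep[b]]                         # (n_kept, block_size, D)
        pruned.append(kept_blocks.reshape(-1, D))                # shorter sequence
    return pruned                                                # ragged: list of (L_b, D)

# Example usage with a toy linear scorer (all sizes hypothetical):
scorer = nn.Linear(64, 1)
h = torch.randn(2, 128, 64)
short = prune_blocks(h, scorer)
print([t.shape for t in short])
```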