1 code implementation • 23 Nov 2023 • Cheng Tan, Jingxuan Wei, Zhangyang Gao, Linzhuang Sun, Siyuan Li, Xihong Yang, Stan Z. Li
Remarkably, we show that even smaller base models, when equipped with our proposed approach, can achieve results comparable to those of larger models, illustrating the potential of our approach in harnessing the power of rationales for improved multimodal reasoning.
no code implementations • 25 Oct 2023 • Zhe Li, Zhangyang Gao, Cheng Tan, Stan Z. Li, Laurence T. Yang
This model is versatile, allowing fine-tuning for downstream point cloud representation tasks, as well as unconditional and conditional generation tasks.
no code implementations • 14 Oct 2023 • Yufei Huang, Siyuan Li, Jin Su, Lirong Wu, Odin Zhang, Haitao Lin, Jingqi Qi, Zihan Liu, Zhangyang Gao, Yuyang Liu, Jiangbin Zheng, Stan Z. Li
To study this problem, we identify a Protein 3D Graph Structure Learning Problem for Robust Protein Property Prediction (PGSL-RP3), collect benchmark datasets, and present a protein Structure embedding Alignment Optimization framework (SAO) to mitigate the problem of structure embedding bias between the predicted and experimental protein structures.
no code implementations • 9 Oct 2023 • Cheng Tan, Jue Wang, Zhangyang Gao, Siyuan Li, Lirong Wu, Jun Xia, Stan Z. Li
In this paper, we re-examine the two dominant temporal modeling approaches within the realm of spatio-temporal predictive learning, offering a unified perspective.
1 code implementation • 24 Jul 2023 • Jingxuan Wei, Cheng Tan, Zhangyang Gao, Linzhuang Sun, Siyuan Li, Bihui Yu, Ruifeng Guo, Stan Z. Li
Multimodal reasoning is a critical component in the pursuit of artificial intelligence systems that exhibit human-like intelligence, especially when tackling complex tasks.
1 code implementation • 20 May 2023 • Zhangyang Gao, Cheng Tan, Stan Z. Li
After witnessing the great success of pretrained models on diverse protein-related tasks and the fact that recovery is highly correlated with confidence, we wonder whether this knowledge can push the limits of protein design further.
Ranked #1 on Word Sense Disambiguation on TS50
1 code implementation • 20 May 2023 • Zhangyang Gao, Xingran Chen, Cheng Tan, Stan Z. Li
Is there a unified framework for graph-based retrosynthesis prediction?
no code implementations • 21 Apr 2023 • Cheng Tan, Zhangyang Gao, Stan Z. Li
In this paper, we propose a \textit{simple yet effective} model that can co-design 1D sequences and 3D structures of CDRs in a one-shot manner.
no code implementations • 14 Feb 2023 • Zhangyang Gao, Yuqi Hu, Cheng Tan, Stan Z. Li
Is there a unified model for generating molecules considering different conditions, such as binding pockets and chemical properties?
1 code implementation • 25 Jan 2023 • Cheng Tan, Yijie Zhang, Zhangyang Gao, Hanqun Cao, Stan Z. Li
We crafted a large, well-curated benchmark dataset and designed a comprehensive structural modeling approach to represent the complex RNA tertiary structure.
1 code implementation • 22 Jan 2023 • Zhangyang Gao, Cheng Tan, Stan Z. Li
Have you ever been troubled by the complexity and computational cost of SE(3) protein structure modeling and been amazed by the simplicity and power of language modeling?
1 code implementation • 2 Dec 2022 • Cheng Tan, Zhangyang Gao, Stan Z. Li
The secondary structure of ribonucleic acid (RNA) is more stable and accessible in the cell than its tertiary structure, making it essential for functional prediction.
2 code implementations • 22 Nov 2022 • Cheng Tan, Zhangyang Gao, Siyuan Li, Stan Z. Li
Without introducing any extra tricks and strategies, SimVP can achieve superior performance on various benchmark datasets.
Ranked #1 on Video Prediction on Moving MNIST
no code implementations • 5 Oct 2022 • Lirong Wu, Jun Xia, Haitao Lin, Zhangyang Gao, Zicheng Liu, Guojiang Zhao, Stan Z. Li
Despite the great academic success of Graph Neural Networks (GNNs), Multi-Layer Perceptrons (MLPs) remain the primary workhorse for practical industrial applications.
1 code implementation • 22 Sep 2022 • Zhangyang Gao, Cheng Tan, Pablo Chacón, Stan Z. Li
How can we design protein sequences folding into the desired structures effectively and efficiently?
1 code implementation • 6 Sep 2022 • Hanqun Cao, Cheng Tan, Zhangyang Gao, Yilun Xu, Guangyong Chen, Pheng-Ann Heng, Stan Z. Li
Deep generative models are a prominent approach for data generation, and have been used to produce high quality samples in various domains.
2 code implementations • CVPR 2023 • Cheng Tan, Zhangyang Gao, Lirong Wu, Yongjie Xu, Jun Xia, Siyuan Li, Stan Z. Li
Spatiotemporal predictive learning aims to generate future frames by learning from historical frames.
Ranked #12 on Video Prediction on Moving MNIST
no code implementations • 23 Jun 2022 • Zhangyang Gao, Cheng Tan, Lirong Wu, Stan Z. Li
Can we inject the pocket-ligand interaction knowledge into the pre-trained model and jointly learn their chemical space?
3 code implementations • CVPR 2022 • Zhangyang Gao, Cheng Tan, Lirong Wu, Stan Z. Li
From CNN, RNN, to ViT, we have witnessed remarkable advancements in video prediction, incorporating auxiliary inputs, elaborate neural architectures, and sophisticated training strategies.
Ranked #2 on Video Prediction on Human3.6M
1 code implementation • CVPR 2022 • Cheng Tan, Zhangyang Gao, Lirong Wu, Siyuan Li, Stan Z. Li
Though it benefits from both the feature-dependent information of self-supervised learning and the label-dependent information of supervised learning, this scheme still suffers from classifier bias.
1 code implementation • 21 Apr 2022 • Cheng Tan, Zhangyang Gao, Jun Xia, Bozhen Hu, Stan Z. Li
Thus, we propose the Global-Context Aware generative de novo protein design method (GCA), consisting of local and global modules.
no code implementations • 12 Feb 2022 • Zhangyang Gao, Cheng Tan, Lirong Wu, Stan Z. Li
Experimental results show that SemiRetro significantly outperforms both existing TB and TF methods.
no code implementations • 10 Feb 2022 • Cheng Tan, Zhangyang Gao, Stan Z. Li
Building on the recent advantages of flow-based molecular generation models, we propose SiamFlow, which forces the flow to fit the distribution of target sequence embeddings in latent space.
1 code implementation • 1 Feb 2022 • Zhangyang Gao, Cheng Tan, Stan Z. Li
While DeepMind has tentatively solved protein folding, its inverse problem -- protein design which predicts protein sequences from their 3D structures -- still faces significant challenges.
1 code implementation • 19 Oct 2021 • Haitao Lin, Cheng Tan, Lirong Wu, Zhangyang Gao, Stan Z. Li
In this paper, we first review recent research emphases and difficulties in modeling asynchronous event sequences with deep temporal point processes, which can be grouped into four areas: encoding of the history sequence, formulation of the conditional intensity function, relational discovery among events, and learning approaches for optimization.
2 code implementations • 4 Oct 2021 • Zhangyang Gao, Haitao Lin, Cheng Tan, Lirong Wu, Stan Z. Li
\textbf{A}ccuracy, \textbf{R}obustness to noises and scales, \textbf{I}nterpretability, \textbf{S}peed, and \textbf{E}asy to use (ARISE) are crucial requirements of a good clustering algorithm.
Ranked #1 on Clustering Algorithms Evaluation on MNIST
no code implementations • 21 Jun 2021 • Lirong Wu, Haitao Lin, Zhangyang Gao, Cheng Tan, Stan Z. Li
Recent years have witnessed great success in handling node classification tasks with Graph Neural Networks (GNNs).
1 code implementation • 16 May 2021 • Lirong Wu, Haitao Lin, Zhangyang Gao, Cheng Tan, Stan Z. Li
In this survey, we extend the concept of SSL, which first emerged in the fields of computer vision and natural language processing, to present a timely and comprehensive review of existing SSL techniques for graph data.
1 code implementation • 4 Jan 2021 • Haitao Lin, Zhangyang Gao, Yongjie Xu, Lirong Wu, Ling Li, Stan Z. Li
We further propose the distance and orientation scaling terms to reduce the impacts of irregular spatial distribution.
no code implementations • 1 Jan 2021 • Jun Xia, Haitao Lin, Yongjie Xu, Lirong Wu, Zhangyang Gao, Siyuan Li, Stan Z. Li
A pseudo label is computed for each node in the training set from its neighbors' labels via label propagation (LP); meta-learning is then used to learn a proper aggregation of the original and pseudo labels into the final label.
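The pseudo-labeling step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: `propagate_labels` is one standard label-propagation scheme (iterated neighbor averaging with clamped training labels), and the fixed weight `alpha` in `aggregate` is a hypothetical stand-in for the aggregation the paper learns via meta-learning.

```python
import numpy as np

def propagate_labels(adj, labels, mask, n_iter=20):
    """Simple label propagation: repeatedly average neighbors' soft
    labels, clamping the observed training labels after every step."""
    deg = adj.sum(axis=1, keepdims=True)
    deg[deg == 0] = 1.0
    P = adj / deg                # row-normalized transition matrix
    Y = labels.copy()
    for _ in range(n_iter):
        Y = P @ Y
        Y[mask] = labels[mask]   # clamp known labels
    return Y

def aggregate(original, pseudo, alpha):
    """Convex combination standing in for the meta-learned aggregation."""
    return alpha * original + (1 - alpha) * pseudo
```

On a 4-node chain with the two endpoints labeled, the interior nodes pick up soft pseudo labels biased toward their nearer labeled endpoint, which `aggregate` then blends with the original (uninformative) labels.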
no code implementations • 28 Dec 2020 • Zhangyang Gao, Haitao Lin, Stan Z. Li
Convolution and pooling are the key operations for learning hierarchical representations for graph classification, where more expressive $k$-order ($k>1$) methods require higher computational cost, limiting further applications.
1 code implementation • 24 Sep 2020 • Zhangyang Gao, Haitao Lin, Stan Z. Li
GDT jointly considers the local and global structure of data samples: it first forms local clusters through a density-growing process, with strategies for proper noise handling and cluster boundary detection, and then estimates a GDT from the relationships between local clusters in terms of a connectivity measure, giving a global topological graph.
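The two-stage structure described above (local density-based clusters, then a connectivity graph between them) can be sketched roughly as below. This is a simplified stand-in, not the paper's GDT algorithm: the density-growing stage is approximated by linking each point to its nearest higher-density neighbor within a radius, and the connectivity measure is approximated by the minimum inter-cluster distance; function names and thresholds are illustrative.

```python
import numpy as np

def local_clusters(X, radius):
    """Stage 1 (sketch): grow local clusters by linking each point to its
    nearest higher-density neighbor within `radius`; link-free points
    become local cluster roots."""
    n = len(X)
    d = np.linalg.norm(X[:, None] - X[None, :], axis=-1)
    density = (d < radius).sum(axis=1)        # neighbor counts as density
    parent = np.arange(n)
    for i in range(n):
        cand = np.where((d[i] < radius) & (density > density[i]))[0]
        if len(cand):
            parent[i] = cand[np.argmin(d[i, cand])]
    labels = np.arange(n)
    for i in range(n):                        # follow pointers to the root
        j = i
        while parent[j] != j:
            j = parent[j]
        labels[i] = j
    return labels, d

def merge_clusters(labels, d, link_thresh):
    """Stage 2 (sketch): connect local clusters whose minimum inter-cluster
    distance (a crude connectivity measure) falls below `link_thresh`."""
    roots = np.unique(labels)
    group = {r: r for r in roots}
    def find(r):
        while group[r] != r:
            r = group[r]
        return r
    for a in roots:
        for b in roots:
            if a < b and d[np.ix_(labels == a, labels == b)].min() < link_thresh:
                group[find(b)] = find(a)
    return np.array([find(labels[i]) for i in range(len(labels))])
```

On two well-separated 1D blobs, stage 1 collapses each blob onto its densest point and stage 2 leaves the two blobs unconnected, yielding two final clusters.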