1 code implementation • 5 Oct 2023 • Yiren Jian, Tingkai Liu, Yunzhe Tao, Chunhui Zhang, Soroush Vosoughi, Hongxia Yang
Our experimental findings demonstrate that our approach accelerates the training of vision-language models by a factor of 5 without a noticeable impact on overall performance.
1 code implementation • NeurIPS 2023 • Yiren Jian, Chongyang Gao, Soroush Vosoughi
We present a novel methodology aimed at optimizing the application of frozen large language models (LLMs) for resource-intensive vision-language (VL) pre-training.
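For illustration, a minimal PyTorch sketch of the general "frozen LLM" pattern this abstract refers to: the LLM's weights stay fixed and only a lightweight projector that maps vision features into the LLM's embedding space is trained. Module names and shapes here are assumptions for the sketch, not the paper's actual architecture.

```python
import torch
import torch.nn as nn

class FrozenLLMVLModel(nn.Module):
    """Hypothetical wrapper: frozen LLM + trainable vision projector."""
    def __init__(self, vision_encoder: nn.Module, llm: nn.Module,
                 vision_dim: int, llm_dim: int):
        super().__init__()
        self.vision_encoder = vision_encoder
        self.llm = llm
        # Freeze the LLM: only the projector receives gradients.
        for p in self.llm.parameters():
            p.requires_grad = False
        self.projector = nn.Linear(vision_dim, llm_dim)

    def forward(self, images: torch.Tensor, text_embeds: torch.Tensor):
        # Project image features into the LLM embedding space and
        # prepend them to the text embeddings as a soft visual prefix.
        v = self.projector(self.vision_encoder(images))  # (B, Nv, llm_dim)
        inputs = torch.cat([v, text_embeds], dim=1)      # (B, Nv+Nt, llm_dim)
        return self.llm(inputs)
```

Only the projector's parameters are passed to the optimizer, which is what makes this setup cheap relative to fine-tuning the LLM itself.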
1 code implementation • 13 Feb 2023 • Yiren Jian, Chongyang Gao, Chen Zeng, Yunjie Zhao, Soroush Vosoughi
Our findings indicate that the learned structural patterns of proteins can be transferred to RNAs, opening up potential new avenues for research.
1 code implementation • 20 Sep 2022 • Yiren Jian, Chongyang Gao, Soroush Vosoughi
This indicates that Transformer models generalize better by performing a similar task (i.e., clustering) on unpaired examples from different modalities in a multi-task fashion.
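As a hedged sketch of the multi-task idea (not the paper's exact recipe): one shared Transformer encodes both text tokens and unpaired image-patch tokens, with a clustering-style contrastive loss applied per modality. The loaders, two-view batch construction, and sizes below are all illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Shared Transformer: the same weights process either modality's tokens.
encoder = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model=256, nhead=4, batch_first=True),
    num_layers=2,
)
optimizer = torch.optim.AdamW(encoder.parameters(), lr=1e-4)

def contrastive_loss(a, b, tau=0.1):
    # InfoNCE between two views of the same batch (a[i] matches b[i]).
    a, b = F.normalize(a, dim=1), F.normalize(b, dim=1)
    logits = a @ b.t() / tau
    return F.cross_entropy(logits, torch.arange(len(a)))

def encode(tokens):
    return encoder(tokens).mean(dim=1)  # mean-pool the sequence

# text_loader / image_loader are hypothetical; the batches are unpaired.
for (t1, t2), (i1, i2) in zip(text_loader, image_loader):
    loss = contrastive_loss(encode(t1), encode(t2)) \
         + contrastive_loss(encode(i1), encode(i2))  # multi-task sum
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```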
1 code implementation • NAACL 2022 • Yiren Jian, Chongyang Gao, Soroush Vosoughi
Following this line of work, we present a contrastive learning framework that clusters inputs from the same class to improve the generalization of models trained with only limited examples.
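A common formulation of such a class-clustering objective is a supervised contrastive loss that pulls together embeddings sharing a label. The sketch below illustrates that general formulation, not necessarily the paper's exact objective.

```python
import torch
import torch.nn.functional as F

def supervised_contrastive_loss(features, labels, temperature=0.1):
    """Pull together embeddings that share a class label (generic
    supervised contrastive loss, shown for illustration)."""
    z = F.normalize(features, dim=1)                 # (B, D) unit vectors
    sim = z @ z.t() / temperature                    # pairwise similarities
    pos_mask = labels.unsqueeze(0) == labels.unsqueeze(1)
    pos_mask.fill_diagonal_(False)                   # exclude self-pairs
    # Log-softmax over all other examples (diagonal masked out).
    self_mask = torch.eye(len(z), dtype=torch.bool)
    sim = sim.masked_fill(self_mask, float('-inf'))
    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)
    # Average log-probability over each anchor's positives.
    pos_counts = pos_mask.sum(1).clamp(min=1)
    loss = -(log_prob * pos_mask).sum(1) / pos_counts
    return loss.mean()
```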
1 code implementation • NAACL 2022 • Yiren Jian, Chongyang Gao, Soroush Vosoughi
Few-shot language learners adapt knowledge from a pre-trained model to recognize novel classes from a few labeled sentences.
1 code implementation • 6 Dec 2021 • Yiren Jian, Lorenzo Torresani
At the same time, training a simple linear classifier on top of "frozen" features learned from the large labeled dataset fails to adapt the model to the properties of the novel classes, effectively inducing underfitting.
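For reference, a minimal sketch of the "frozen features + linear classifier" baseline the sentence describes: the backbone is fixed, so only the linear head can adapt to the novel classes, which is where the underfitting risk comes from. `backbone`, `feature_dim`, `num_novel_classes`, and `novel_loader` are hypothetical placeholders.

```python
import torch
import torch.nn as nn

backbone.eval()
for p in backbone.parameters():
    p.requires_grad = False                 # freeze the feature extractor

head = nn.Linear(feature_dim, num_novel_classes)  # only trainable part
optimizer = torch.optim.SGD(head.parameters(), lr=0.01)

for x, y in novel_loader:
    with torch.no_grad():
        feats = backbone(x)                 # frozen features
    loss = nn.functional.cross_entropy(head(feats), y)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```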
1 code implementation • 5 Oct 2021 • Yiren Jian, Chongyang Gao
Previous work has shown that the performance of a semantic segmentation model can be improved by training jointly with real and synthetic examples with a proper weighting on the synthetic data.
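A hedged sketch of that joint-training setup: the segmentation loss on synthetic examples is down-weighted by a scalar before being added to the real-data loss. The fixed weight and the names below are illustrative assumptions; how to set the weighting is the question the work studies.

```python
import torch
import torch.nn.functional as F

synthetic_weight = 0.5  # hypothetical fixed weight on the synthetic term

# model, real_loader, synthetic_loader, optimizer are placeholders.
for (x_real, y_real), (x_syn, y_syn) in zip(real_loader, synthetic_loader):
    loss_real = F.cross_entropy(model(x_real), y_real)
    loss_syn = F.cross_entropy(model(x_syn), y_syn)
    loss = loss_real + synthetic_weight * loss_syn   # weighted joint loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```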