Search Results for author: Yuantao Fan

Found 5 papers, 2 papers with code

m2mKD: Module-to-Module Knowledge Distillation for Modular Transformers

1 code implementation • 26 Feb 2024 • Ka Man Lo, Yiming Liang, Wenyu Du, Yuantao Fan, Zili Wang, Wenhao Huang, Lei Ma, Jie Fu

Leveraging knowledge from monolithic models through techniques such as knowledge distillation is likely to facilitate the training of modular models and enable them to integrate knowledge from multiple models pretrained on diverse sources.

Knowledge Distillation
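
The paper's module-to-module pipeline is not reproduced here, but the underlying idea of distilling a larger teacher into a student can be sketched with a standard soft-label knowledge distillation loss. A minimal PyTorch sketch follows; the temperature and the toy teacher/student modules are illustrative assumptions, not the paper's m2mKD objective, which matches intermediate module outputs rather than final logits:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """Soft-label KD loss (Hinton-style); illustrative, not the m2mKD objective."""
    soft_targets = F.softmax(teacher_logits / temperature, dim=-1)
    log_probs = F.log_softmax(student_logits / temperature, dim=-1)
    # KL divergence between teacher and student distributions, scaled by T^2
    return F.kl_div(log_probs, soft_targets, reduction="batchmean") * temperature ** 2

# Hypothetical stand-ins for a monolithic teacher block and a modular student block
teacher = nn.Linear(16, 10)
student = nn.Linear(16, 10)
x = torch.randn(4, 16)
with torch.no_grad():
    t_logits = teacher(x)
loss = distillation_loss(student(x), t_logits)
loss.backward()
```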

Forecasting Auxiliary Energy Consumption for Electric Heavy-Duty Vehicles

no code implementations • 27 Nov 2023 • Yuantao Fan, Zhenkan Wang, Sepideh Pashami, Slawomir Nowaczyk, Henrik Ydreskog

Given that the employed groupings correspond to relevant sub-populations, the associations between the input features and the target values are consistent within each cluster but different across clusters.

Regression
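
As a rough illustration of the cluster-then-regress idea in the abstract (fit a separate regressor per sub-population, since feature-target associations differ across clusters), here is a minimal scikit-learn sketch; the synthetic data, feature dimension, and cluster count are placeholders, not values from the paper:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 5))  # placeholder vehicle/trip features
y = X @ rng.normal(size=5) + rng.normal(scale=0.1, size=300)

# Group samples into sub-populations, then fit one regressor per cluster
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
models = {
    c: LinearRegression().fit(X[kmeans.labels_ == c], y[kmeans.labels_ == c])
    for c in range(3)
}

def predict(x_new):
    """Route each sample to the regressor of its assigned cluster."""
    labels = kmeans.predict(x_new)
    return np.array([models[c].predict(x[None, :])[0] for c, x in zip(labels, x_new)])

print(predict(X[:5]))
```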

RefGPT: Dialogue Generation of GPT, by GPT, and for GPT

1 code implementation • 24 May 2023 • Dongjie Yang, Ruifeng Yuan, Yuantao Fan, Yifei Yang, Zili Wang, Shusen Wang, Hai Zhao

Therefore, we propose a method called RefGPT to generate large volumes of truthful, customized dialogues without worrying about factual errors caused by model hallucination.

Dialogue Generation • Hallucination
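
RefGPT's core move is to ground generated dialogues in a reference document so that facts come from the reference rather than the model's parametric memory. A minimal sketch of that pattern with the OpenAI chat API follows; the prompt wording and model name are assumptions for illustration, not the paper's exact pipeline:

```python
from openai import OpenAI  # assumes the openai>=1.0 client and an API key in the environment

client = OpenAI()

def generate_reference_dialogue(reference: str, turns: int = 3) -> str:
    """Ask the model to write a multi-turn dialogue whose facts must come
    only from the supplied reference text (the RefGPT idea, paraphrased)."""
    prompt = (
        "Using ONLY facts stated in the reference below, write a "
        f"{turns}-turn dialogue between a user and an assistant.\n\n"
        f"Reference:\n{reference}"
    )
    resp = client.chat.completions.create(
        model="gpt-3.5-turbo",  # placeholder model name
        messages=[{"role": "user", "content": prompt}],
    )
    return resp.choices[0].message.content
```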

Transfer learning for Remaining Useful Life Prediction Based on Consensus Self-Organizing Models

no code implementations • 16 Sep 2019 • Yuantao Fan, Sławomir Nowaczyk, Thorsteinn Rögnvaldsson

In this work, we present a TL method for predicting Remaining Useful Life (RUL) of equipment, under the assumption that labels are available only for the source domain and not the target domain.

Transfer Learning
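
The consensus self-organizing models approach is more involved than can be shown here, but its basic building block, using a self-organizing map trained on normal operation as a label-free deviation indicator, can be sketched with the MiniSom library. The synthetic data, map size, and training settings below are illustrative assumptions, not the paper's configuration:

```python
import numpy as np
from minisom import MiniSom  # pip install minisom

rng = np.random.default_rng(0)
healthy = rng.normal(0.0, 1.0, size=(500, 8))   # placeholder sensor features, healthy operation
degraded = rng.normal(0.8, 1.2, size=(100, 8))  # placeholder features from worn equipment

# Fit a SOM on (source-domain) healthy data only; no RUL labels are needed
som = MiniSom(6, 6, input_len=8, sigma=1.0, learning_rate=0.5, random_seed=0)
som.train(healthy, num_iteration=1000)

# Quantization error grows as data drifts away from the healthy manifold,
# giving a deviation-based health indicator usable in an unlabeled target domain
print("healthy QE: ", som.quantization_error(healthy))
print("degraded QE:", som.quantization_error(degraded))
```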
