Search Results for author: Qiyang Zhang

Found 7 papers, 5 papers with code

A Survey of Resource-efficient LLM and Multimodal Foundation Models

1 code implementation • 16 Jan 2024 • Mengwei Xu, Wangsong Yin, Dongqi Cai, Rongjie Yi, Daliang Xu, QiPeng Wang, Bingyang Wu, Yihao Zhao, Chen Yang, Shihe Wang, Qiyang Zhang, Zhenyan Lu, Li Zhang, Shangguang Wang, Yuanchun Li, Yunxin Liu, Xin Jin, Xuanzhe Liu

Large foundation models, including large language models (LLMs), vision transformers (ViTs), diffusion models, and LLM-based multimodal models, are revolutionizing the entire machine learning lifecycle, from training to deployment.

Not All Negatives Are Worth Attending to: Meta-Bootstrapping Negative Sampling Framework for Link Prediction

no code implementations • 8 Dec 2023 • Yakun Wang, Binbin Hu, Shuo Yang, Meiqi Zhu, Zhiqiang Zhang, Qiyang Zhang, Jun Zhou, Guo Ye, Huimei He

In particular, we devise a Meta-learning Supported Teacher-student GNN (MST-GNN) that is not only built on a teacher-student architecture to alleviate the migration between "easy" and "hard" samples, but is also equipped with a meta-learning-based sample re-weighting module that helps the student GNN distinguish "hard" samples in a fine-grained manner; a simplified sketch of the re-weighting idea follows the task tags below.

Link Prediction • Meta-Learning
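The MST-GNN code itself is not part of this listing, so the following is only a minimal, self-contained sketch of the per-negative re-weighting idea: a toy edge scorer stands in for the student GNN, and a fixed loss-based heuristic (damping negatives whose loss is so high that they may be unobserved true links) stands in for the meta-learned re-weighting module. All names (EdgeScorer, reweighted_loss) and the temperature parameter are illustrative assumptions, not the paper's implementation.

```python
# Illustrative sketch only: a loss-based stand-in for the meta-learned
# negative-sample re-weighting module described in the abstract above.
import torch
import torch.nn.functional as F

class EdgeScorer(torch.nn.Module):
    """Toy stand-in for the student GNN: scores node pairs (u, v)."""
    def __init__(self, num_nodes: int, dim: int = 32):
        super().__init__()
        self.emb = torch.nn.Embedding(num_nodes, dim)

    def forward(self, u: torch.Tensor, v: torch.Tensor) -> torch.Tensor:
        return (self.emb(u) * self.emb(v)).sum(dim=-1)  # dot-product score

def reweighted_loss(model: EdgeScorer,
                    pos: tuple[torch.Tensor, torch.Tensor],
                    neg: tuple[torch.Tensor, torch.Tensor],
                    temperature: float = 1.0) -> torch.Tensor:
    """Binary link-prediction loss with per-negative weights.

    Negatives with very high loss are often false negatives (unobserved
    true links), so this heuristic damps their influence via softmax
    weights over the negated loss; a meta-learned module would replace
    this fixed rule in the full method.
    """
    pos_logits = model(*pos)
    neg_logits = model(*neg)
    pos_loss = F.binary_cross_entropy_with_logits(
        pos_logits, torch.ones_like(pos_logits))
    neg_losses = F.binary_cross_entropy_with_logits(
        neg_logits, torch.zeros_like(neg_logits), reduction="none")
    weights = torch.softmax(-neg_losses.detach() / temperature, dim=0)
    return pos_loss + (weights * neg_losses).sum()

# Minimal usage on random edges
model = EdgeScorer(num_nodes=100)
pos = (torch.randint(0, 100, (64,)), torch.randint(0, 100, (64,)))
neg = (torch.randint(0, 100, (64,)), torch.randint(0, 100, (64,)))
loss = reweighted_loss(model, pos, neg)
loss.backward()
```

In the full method, the fixed softmax heuristic would be replaced by weights produced by a module trained with meta-learning, with the teacher GNN supplying the signal that separates "easy" from "hard" negatives.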

Paperfetcher: A tool to automate handsearch for systematic reviews

1 code implementation • 24 Oct 2021 • Akash Pallath, Qiyang Zhang

Handsearch is an important technique that contributes to a thorough literature search in systematic reviews; a minimal sketch of the kind of retrieval step such a tool automates appears below.

Management
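Handsearch means scanning every article a set of journals published over a given period. The core retrieval step can be sketched against the public Crossref REST API (one of the metadata sources this kind of tooling queries). The snippet below is a minimal illustration, not Paperfetcher's actual interface; the function name, ISSN, and dates are placeholders.

```python
# Minimal sketch of the retrieval step behind automated handsearch:
# list every work a journal published in a date window via the public
# Crossref REST API. Illustrative only; not Paperfetcher's own interface.
import requests

def fetch_journal_works(issn: str, from_date: str, until_date: str,
                        rows: int = 100) -> list[dict]:
    """Return Crossref work records for one journal and date range."""
    works, cursor = [], "*"
    while True:
        resp = requests.get(
            "https://api.crossref.org/works",
            params={
                "filter": f"issn:{issn},from-pub-date:{from_date},"
                          f"until-pub-date:{until_date}",
                "rows": rows,
                "cursor": cursor,  # deep paging through all results
            },
            timeout=30,
        )
        resp.raise_for_status()
        msg = resp.json()["message"]
        works.extend(msg["items"])
        if not msg["items"]:
            return works
        cursor = msg["next-cursor"]

# Placeholder ISSN and dates: one journal, one year of issues
records = fetch_journal_works("1234-5678", "2020-01-01", "2020-12-31")
print(len(records), "records found")
```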

Applications of Unsupervised Deep Transfer Learning to Intelligent Fault Diagnosis: A Survey and Comparative Study

1 code implementation • 28 Dec 2019 • Zhibin Zhao, Qiyang Zhang, Xiaolei Yu, Chuang Sun, Shibin Wang, Ruqiang Yan, Xuefeng Chen

Moreover, the newly collected test data in the target domain are usually unlabeled, leading to the unsupervised deep transfer learning based (UDTL-based) intelligent fault diagnosis (IFD) problem; one representative UDTL ingredient is sketched after the task tags below.

Representation Learning • Transfer Learning
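To make the UDTL setting concrete, one standard ingredient in this literature is a maximum mean discrepancy (MMD) penalty that pulls features of labeled source data and unlabeled target data toward the same distribution. The sketch below is a generic Gaussian-kernel MMD loss, representative of the family of methods such surveys compare rather than this paper's specific code; the batch sizes, feature dimension, and sigma are arbitrary choices.

```python
# Generic sketch of one standard UDTL ingredient: a Gaussian-kernel MMD
# penalty aligning labeled source features with unlabeled target features.
# Representative of methods compared in such surveys, not this paper's code.
import torch

def gaussian_mmd(source: torch.Tensor, target: torch.Tensor,
                 sigma: float = 1.0) -> torch.Tensor:
    """Squared MMD (biased estimate) between feature batches, RBF kernel."""
    def kernel(a: torch.Tensor, b: torch.Tensor) -> torch.Tensor:
        d2 = torch.cdist(a, b).pow(2)  # pairwise squared distances
        return torch.exp(-d2 / (2 * sigma ** 2))
    return (kernel(source, source).mean()
            + kernel(target, target).mean()
            - 2 * kernel(source, target).mean())

# In training, the objective would be:
#   supervised source loss + lambda * MMD(source features, target features)
src_feat = torch.randn(32, 128, requires_grad=True)  # labeled source batch
tgt_feat = torch.randn(32, 128)                      # unlabeled target batch
loss = gaussian_mmd(src_feat, tgt_feat)
loss.backward()
print(float(loss))
```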
