Search Results for author: Zhiqi Lin

Found 6 papers, 1 paper with code

Natural Language Fine-Tuning

1 code implementation · 29 Dec 2024 · Jia Liu, Yue Wang, Zhiqi Lin, Min Chen, Yixue Hao, Long Hu

Compared to SFT, NLFT does not increase the algorithmic complexity, maintaining O(n).

GSM8K · Large Language Model

Fully Bayesian Differential Gaussian Processes through Stochastic Differential Equations

no code implementations · 12 Aug 2024 · Jian Xu, Zhiqi Lin, Min Chen, Junmei Yang, Delu Zeng, John Paisley

Traditional deep Gaussian processes model the data evolution using a discrete hierarchy, whereas differential Gaussian processes (DIFFGPs) represent the evolution as an infinitely deep Gaussian process.

Bayesian Inference · Gaussian Processes

Flexible Bayesian Last Layer Models Using Implicit Priors and Diffusion Posterior Sampling

no code implementations · 7 Aug 2024 · Jian Xu, Zhiqi Lin, Shigui Li, Min Chen, Junmei Yang, Delu Zeng, John Paisley

Bayesian Last Layer (BLL) models focus solely on uncertainty in the output layer of neural networks, demonstrating comparable performance to more complex Bayesian models.

Computational Efficiency · Out-of-Distribution Detection · +1

Tessel: Boosting Distributed Execution of Large DNN Models via Flexible Schedule Search

no code implementations · 26 Nov 2023 · Zhiqi Lin, Youshan Miao, Guanbin Xu, Cheng Li, Olli Saarikivi, Saeed Maleki, Fan Yang

This paper presents Tessel, an automated system that searches for efficient schedules for distributed DNN training and inference under diverse operator placement strategies.

PaGraph: Scaling GNN Training on Large Graphs via Computation-aware Caching and Partitioning

no code implementations · Proceedings of the 11th ACM Symposium on Cloud Computing 2020 · Zhiqi Lin, Cheng Li, Youshan Miao, Yunxin Liu, Yinlong Xu

Emerging graph neural networks (GNNs) have extended the successes of deep learning techniques on datasets such as images and texts to more complex graph-structured data.
