no code implementations • 21 Jun 2023 • Weihao Gao, Zhuo Deng, Zhiyuan Niu, Fuju Rong, Chucheng Chen, Zheng Gong, Wenze Zhang, Daimin Xiao, Fang Li, Zhenjie Cao, Zhaoyi Ma, Wenbin Wei, Lan Ma
We introduce visual capabilities into a large language model to build the ophthalmic large language and vision assistant (OphGLM).
1 code implementation • 5 Jun 2023 • Alexander Bukharin, Tianyi Liu, Shengjie Wang, Simiao Zuo, Weihao Gao, Wen Yan, Tuo Zhao
To address this issue, we propose a multi-stage computational framework -- ASTEROID, which lowers the data cost of MLFFs by leveraging a combination of cheap inaccurate data and expensive accurate data.
no code implementations • 23 Nov 2022 • Xiang Gao, Weihao Gao, Wenzhi Xiao, Zhirui Wang, Chong Wang, Liang Xiang
To model the complex nonlinearity of molecular property prediction in a more end-to-end fashion, we propose to encode the positional quantities with a learnable embedding that is continuous and differentiable.
no code implementations • 23 Nov 2022 • Xiang Gao, Weihao Gao, Wenzhi Xiao, Zhirui Wang, Chong Wang, Liang Xiang
Experiments show that, compared to training from scratch, fine-tuning the pretrained model can significantly improve the performance for seven molecular property prediction tasks and two force field tasks.
no code implementations • NeurIPS Workshop AI4Science 2021 • Ce Yang, Weihao Gao, Di Wu, Chong Wang
Simulation of the dynamics of physical systems is essential to the development of both science and engineering.
no code implementations • NeurIPS Workshop AI4Science 2021 • Tianze Zheng, Weihao Gao, Chong Wang
Molecular dynamics (MD) simulation predicts the trajectory of atoms by solving Newton's equation of motion with a numeric integrator.
no code implementations • 21 Jul 2021 • Jiankai Sun, Yuanshun Yao, Weihao Gao, Junyuan Xie, Chong Wang
Recently, researchers have studied input leakage problems in Federated Learning (FL), where a malicious party can reconstruct sensitive training inputs provided by users from shared gradients.
no code implementations • 10 Jun 2021 • Jiankai Sun, Xin Yang, Yuanshun Yao, Aonan Zhang, Weihao Gao, Junyuan Xie, Chong Wang
In this paper, we propose a vFL framework based on Private Set Union (PSU) that allows each party to keep sensitive membership information to itself.
no code implementations • 27 Apr 2021 • Chaosheng Dong, Xiaojie Jin, Weihao Gao, Yijia Wang, Hongyi Zhang, Xiang Wu, Jianchao Yang, Xiaobing Liu
Deep learning models in large-scale machine learning systems are often continuously trained with enormous data from production environments.
2 code implementations • ICLR 2022 • Oscar Li, Jiankai Sun, Xin Yang, Weihao Gao, Hongyi Zhang, Junyuan Xie, Virginia Smith, Chong Wang
Two-party split learning is a popular technique for learning a model across feature-partitioned data.
1 code implementation • 1 Jan 2021 • Weihao Gao, Xiangjun Fan, Jiankai Sun, Kai Jia, Wenzhi Xiao, Chong Wang, Xiaobing Liu
With the model learnt, a beam search over the latent codes is performed to retrieve the top candidates.
1 code implementation • 12 Jul 2020 • Weihao Gao, Xiangjun Fan, Chong Wang, Jiankai Sun, Kai Jia, Wenzhi Xiao, Ruofan Ding, Xingyan Bin, Hui Yang, Xiaobing Liu
With the model learnt, a beam search over the structure is performed to retrieve the top candidates for reranking.
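A generic beam search of the kind described above can be sketched as follows (a minimal illustration, not the paper's implementation — the scoring function and vocabulary are hypothetical stand-ins for the learned model's structure scores):

```python
def beam_search(score, vocab, max_len, beam_width):
    """Keep the beam_width highest-scoring prefixes at every step,
    extending each surviving prefix by every vocabulary item."""
    beams = [((), 0.0)]  # (prefix, cumulative score)
    for _ in range(max_len):
        candidates = [
            (prefix + (tok,), s + score(prefix, tok))
            for prefix, s in beams
            for tok in vocab
        ]
        candidates.sort(key=lambda c: c[1], reverse=True)
        beams = candidates[:beam_width]  # prune to the best beam_width
    return beams

# Toy scorer that always prefers token 1 (purely illustrative).
top = beam_search(lambda prefix, tok: 1.0 if tok == 1 else 0.0,
                  vocab=[0, 1], max_len=3, beam_width=2)
# top[0] is the best (sequence, score) pair: ((1, 1, 1), 3.0)
```

The beam width trades retrieval quality against latency: width 1 degenerates to greedy decoding, while a larger width explores more of the structure before the top candidates are passed to reranking.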
1 code implementation • 27 Jan 2019 • Yuheng Bu, Weihao Gao, Shaofeng Zou, Venugopal V. Veeravalli
We show that model compression can improve the population risk of a pre-trained model by studying the tradeoff between the decrease in the generalization error and the increase in the empirical risk under compression.
no code implementations • 9 Oct 2018 • Weihao Gao, Yu-Han Liu, Chong Wang, Sewoong Oh
Theoretically, we prove that the proposed scheme is optimal for compressing one-hidden-layer ReLU neural networks.
no code implementations • 9 Oct 2018 • Weihao Gao, Ashok Vardhan Makkuva, Sewoong Oh, Pramod Viswanath
Significant advances have been made recently on training neural networks, where the main challenge is in solving an optimization problem with abundant critical points.
no code implementations • NeurIPS 2018 • Jiantao Jiao, Weihao Gao, Yanjun Han
We analyze the Kozachenko--Leonenko (KL) nearest neighbor estimator for the differential entropy.
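The Kozachenko--Leonenko estimator computes differential entropy from the distances of each sample to its k-th nearest neighbor. The sketch below is a minimal brute-force implementation of the standard estimator for illustration (the O(n²) distance computation and the Uniform[0,1] test case are choices made here, not taken from the paper):

```python
import math
import numpy as np

EULER_GAMMA = 0.5772156649015329

def digamma_int(m):
    """Digamma at a positive integer: psi(m) = -gamma + sum_{j<m} 1/j."""
    return -EULER_GAMMA + sum(1.0 / j for j in range(1, m))

def kl_entropy(x, k=3):
    """Kozachenko-Leonenko k-NN estimate of differential entropy (in nats)."""
    x = np.asarray(x, dtype=float)
    if x.ndim == 1:
        x = x[:, None]
    n, d = x.shape
    # Brute-force pairwise Euclidean distances (O(n^2) memory).
    diff = x[:, None, :] - x[None, :, :]
    dist = np.sqrt((diff ** 2).sum(axis=-1))
    np.fill_diagonal(dist, np.inf)          # exclude self-distances
    eps = np.sort(dist, axis=1)[:, k - 1]   # distance to the k-th neighbor
    # Log-volume of the d-dimensional unit ball.
    log_cd = (d / 2) * np.log(np.pi) - math.lgamma(d / 2 + 1)
    return digamma_int(n) - digamma_int(k) + log_cd + d * np.log(eps).mean()

rng = np.random.default_rng(0)
h = kl_entropy(rng.random(2000))  # Uniform[0,1] has true entropy 0 nats
```

On 2000 Uniform[0,1] samples the estimate lands close to the true value of 0 nats; the paper's analysis concerns exactly how the bias of this estimator behaves as a function of n and k.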
1 code implementation • NeurIPS 2017 • Weihao Gao, Sreeram Kannan, Sewoong Oh, Pramod Viswanath
We provide numerical experiments suggesting that the proposed estimator outperforms two common heuristics: adding small continuous noise to all the samples and applying standard estimators tailored for purely continuous variables, or quantizing the samples and applying standard estimators tailored for purely discrete variables.
no code implementations • NeurIPS 2017 • Hyeji Kim, Weihao Gao, Sreeram Kannan, Sewoong Oh, Pramod Viswanath
Discovering a correlation from one variable to another variable is of fundamental scientific and practical interest.
no code implementations • NeurIPS 2016 • Weihao Gao, Sewoong Oh, Pramod Viswanath
In this paper, we combine both of these approaches to design new estimators of entropy and mutual information that outperform state-of-the-art methods.
1 code implementation • 11 Apr 2016 • Weihao Gao, Sewoong Oh, Pramod Viswanath
In this paper, we demonstrate that the estimator is consistent and identify an upper bound on the rate of convergence of the bias as a function of the number of samples.
no code implementations • 10 Feb 2016 • Weihao Gao, Sreeram Kannan, Sewoong Oh, Pramod Viswanath
We conduct an axiomatic study of the problem of estimating the strength of a known causal relationship between a pair of variables.