1 code implementation • 1 Mar 2024 • Ye Chen, Igor Couto, Wei Cai, Cong Fu, Bruno Dorneles
We introduce SoftTiger, a clinical large language model (CLaM) designed as a foundation model for healthcare workflows.
no code implementations • 13 Jan 2024 • Xinru Hua, Rasool Ahmad, Jose Blanchet, Wei Cai
In particular, we approximate the variance-free bias potential function with DNNs, which are trained to maximize the probability of rare-event transitions under the importance potential function.
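The entry above concerns DNN-learned bias potentials for rare-event simulation. As a minimal illustration of the underlying importance-sampling principle (not the paper's method), the sketch below estimates a Gaussian tail probability by sampling from an exponentially tilted proposal and reweighting; all names and the tilt choice are illustrative assumptions.

```python
import math
import random

def tail_prob_importance_sampling(a=4.0, n=100_000, seed=0):
    """Estimate P(X > a) for X ~ N(0, 1) by sampling from the
    exponentially tilted proposal N(a, 1) and reweighting, so the
    "rare" event becomes typical under the biased distribution."""
    rng = random.Random(seed)
    theta = a  # tilt parameter; shifting the mean to a is a standard choice
    total = 0.0
    for _ in range(n):
        y = rng.gauss(theta, 1.0)  # sample from the biased distribution
        if y > a:                  # rare-event indicator
            # likelihood ratio dN(0,1)/dN(theta,1) at y
            total += math.exp(-theta * y + 0.5 * theta * theta)
    return total / n

estimate = tail_prob_importance_sampling()
```

Naive Monte Carlo would need on the order of 10^5 samples just to see one event at this threshold; the tilted estimator concentrates samples where the event occurs.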
1 code implementation • 14 Dec 2023 • Ye Chen, Wei Cai, Liangmin Wu, Xiaowei Li, Zhanxuan Xin, Cong Fu
We release and introduce the TigerBot family of large language models (LLMs), consisting of base and chat models sized at 7, 13, 70, and 180 billion parameters.
1 code implementation • 30 Oct 2023 • Jaehong Chung, Rasool Ahmad, WaiChing Sun, Wei Cai, Tapan Mukerji
This study presents a Graph Neural Network (GNN)-based approach for predicting the effective elastic moduli of rocks from their digital CT-scan images.
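The essential GNN ingredients named above, neighbor aggregation followed by a graph-level readout to a scalar property, can be sketched in a few lines. This is a generic one-layer message-passing toy, not the paper's architecture; the graph, features, and weights are random placeholders.

```python
import numpy as np

def gnn_layer(X, A, W):
    """One message-passing step: average each node's neighbor
    features, then apply a learned linear map with ReLU."""
    deg = A.sum(axis=1, keepdims=True)
    H = (A @ X) / np.maximum(deg, 1.0)  # mean aggregation over neighbors
    return np.maximum(H @ W, 0.0)       # ReLU nonlinearity

rng = np.random.default_rng(0)
X = rng.standard_normal((5, 3))            # 5 nodes, 3 features each
A = np.array([[0, 1, 1, 0, 0],             # undirected toy adjacency
              [1, 0, 1, 0, 0],
              [1, 1, 0, 1, 0],
              [0, 0, 1, 0, 1],
              [0, 0, 0, 1, 0]], dtype=float)
W1 = rng.standard_normal((3, 8))
w_out = rng.standard_normal(8)

H = gnn_layer(X, A, W1)
modulus_pred = float(H.mean(axis=0) @ w_out)  # graph-level scalar readout
```

In the rock-physics setting one would presumably build the graph from segmented CT regions and train the weights against measured or simulated moduli.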
1 code implementation • 10 Jan 2023 • Shaswat Mohanty, Sanghyuk Yoo, Keonwook Kang, Wei Cai
Machine-learned force fields have generated significant interest in recent years as a tool for molecular dynamics (MD) simulations, with the aim of developing accurate and efficient models that can replace classical interatomic potentials.
1 code implementation • 22 Jul 2020 • Ziqi Liu, Wei Cai, Zhi-Qin John Xu
In this paper, we propose multi-scale deep neural networks (MscaleDNNs) using the idea of radial scaling in the frequency domain together with activation functions of compact support.
no code implementations • 25 Oct 2019 • Wei Cai, Zhi-Qin John Xu
In this paper, we propose the idea of radial scaling in the frequency domain, combined with activation functions of compact support, to produce a multi-scale DNN (MscaleDNN) with the multi-scale capability of approximating high-frequency and high-dimensional functions and of speeding up the solution of high-dimensional PDEs.
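The two ingredients in the MscaleDNN entries above, input scaling across frequency bands and a compact-support activation, can be sketched as a forward pass. This is an untrained toy with assumed names and random weights; the compact-support activation `srelu` here is one common choice, not necessarily the paper's.

```python
import numpy as np

def srelu(x):
    """Compact-support activation: nonzero only on (0, 1)."""
    return np.maximum(x, 0.0) * np.maximum(1.0 - x, 0.0)

def mscale_forward(x, params, scales):
    """Sum of parallel subnetworks; subnetwork i sees the input
    multiplied by scales[i], so a low-frequency subnetwork is
    repurposed to represent content near that frequency band."""
    out = 0.0
    for (W1, b1, w2), s in zip(params, scales):
        h = srelu(s * x @ W1 + b1)  # radially scaled input
        out = out + h @ w2
    return out

rng = np.random.default_rng(0)
scales = [1.0, 2.0, 4.0, 8.0]  # e.g. powers of 2 covering frequency bands
params = [(rng.standard_normal((1, 16)),
           rng.standard_normal(16),
           rng.standard_normal(16)) for _ in scales]

x = np.linspace(0.0, 1.0, 32).reshape(-1, 1)
y = mscale_forward(x, params, scales)
```

Training would fit all subnetworks jointly, with each scale handling the part of the target spectrum it maps down to low frequency.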
no code implementations • 23 Sep 2019 • Wei Cai, Xiaoguang Li, Lizuo Liu
In this paper, we propose a phase shift deep neural network (PhaseDNN), which provides uniform wideband convergence in approximating high-frequency functions and solutions of wave equations.
no code implementations • 3 May 2019 • Wei Cai, Xiaoguang Li, Lizuo Liu
Due to the phase shift, each DNN achieves the convergence speed typical of the low-frequency range.
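The phase-shift idea in the two PhaseDNN entries above can be checked numerically: multiplying a signal by e^{-i omega x} moves its frequency band at omega down to zero frequency, where DNN training converges fast. The sketch below demodulates a single cosine this way; it illustrates the shift itself, not the paper's network.

```python
import cmath
import math

omega = 50.0  # target has a single high-frequency component at omega
n = 10_000
xs = [2.0 * math.pi * k / n for k in range(n)]

# Phase shift: multiplying by exp(-i*omega*x) maps the band at omega
# down to frequency 0. Since cos(omega*x)*exp(-i*omega*x)
# = 0.5 + 0.5*exp(-2i*omega*x), the baseband content is 0.5.
shifted = [math.cos(omega * x) * cmath.exp(-1j * omega * x) for x in xs]

# Averaging acts as a crude low-pass filter isolating the baseband term.
baseband = sum(shifted) / n
```

In PhaseDNN, a separate low-frequency DNN is trained on each such shifted band, and the bands are recombined to cover the whole spectrum.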