no code implementations • 12 Nov 2023 • Vasilisa Bashlovkina, Zhaobin Kuang, Riley Matthews, Edward Clifford, Yennie Jun, William W. Cohen, Simon Baumgartner
Large language models (LLMs) are trained on web-scale corpora that inevitably include contradictory factual information from sources of varying reliability.
no code implementations • 30 Jun 2023 • Vasilisa Bashlovkina, Riley Matthews, Zhaobin Kuang, Simon Baumgartner, Michael Bendersky
We study the ability of transformer-based language models (LMs) to understand social media language.
no code implementations • Proceedings of the 25th International Conference on Artificial Intelligence and Statistics 2022 • Zhaobin Kuang, Chidubem Arachie, Bangyong Liang, Pradyumna Narayana, Giulia Desalvo, Michael Quinn, Bert Huang, Geoffrey Downs, Yang Yang
In particular, Firebolt learns the class balance and class-specific accuracy of LFs jointly from unlabeled data.
no code implementations • 12 May 2020 • Sinong Geng, Zhaobin Kuang, Jie Liu, Stephen Wright, David Page
We study the $L_1$-regularized maximum likelihood estimation (MLE) problem for discrete Markov random fields (MRFs), where efficient and scalable learning requires both sparse regularization and approximate inference.
no code implementations • ICML 2018 • Sinong Geng, Zhaobin Kuang, Peggy Peissig, David Page
We propose temporal Poisson square root graphical models (TPSQRs), a generalization of Poisson square root graphical models (PSQRs) specifically designed for modeling longitudinal event data.
no code implementations • 11 Apr 2020 • Zhaobin Kuang, Frederic Sala, Nimit Sohoni, Sen Wu, Aldo Córdova-Palomera, Jared Dunnmon, James Priest, Christopher Ré
To relax these assumptions, we propose Ivy, a new method to combine IV candidates that can handle correlated and invalid IV candidates in a robust manner.
no code implementations • 3 Jul 2019 • Ross S. Kleiman, Paul S. Bennett, Peggy L. Peissig, Richard L. Berg, Zhaobin Kuang, Scott J. Hebbring, Michael D. Caldwell, David Page
For the first time, we can obtain a much more complete picture of how well risks can be predicted for thousands of distinct diagnosis codes.
no code implementations • 12 Jun 2019 • Finn Kuusisto, John Steill, Zhaobin Kuang, James Thomson, David Page, Ron Stewart
We present a simple text mining method that is easy to implement, requires minimal data collection and preparation, and is easy to use for proposing ranked associations between a list of target terms and a key phrase.
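The core idea of ranking associations between target terms and a key phrase can be illustrated with a toy co-occurrence counter. This sketch is an assumption-laden simplification, not the paper's method; the function name, scoring rule, and data are all illustrative:

```python
def rank_associations(documents, targets, key_phrase):
    """Toy illustration (not the paper's method): rank target terms by
    how often they co-occur with a key phrase in the same document."""
    counts = {t: 0 for t in targets}
    for doc in documents:
        text = doc.lower()
        if key_phrase.lower() in text:
            for t in targets:
                if t.lower() in text:
                    counts[t] += 1
    # Higher co-occurrence count means a stronger proposed association.
    return sorted(targets, key=lambda t: counts[t], reverse=True)

# Toy corpus: "aspirin" co-occurs with "stroke" twice, "ibuprofen" never.
docs = [
    "aspirin reduces stroke risk",
    "aspirin and stroke study",
    "ibuprofen for pain relief",
]
print(rank_associations(docs, ["aspirin", "ibuprofen"], "stroke"))
```

A real pipeline would replace raw substring matching with tokenization and a statistical association score, but the ranking structure stays the same.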
no code implementations • NeurIPS 2017 • Zhaobin Kuang, Sinong Geng, David Page
We discover a screening rule for $L_1$-regularized Ising model estimation.
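The general flavor of such a screening rule can be sketched as follows: pairwise empirical statistics are compared against the regularization parameter, and edges that fall below the threshold are safely discarded before any optimization runs. The exact threshold condition here is an illustrative assumption, not the paper's stated necessary-and-sufficient rule:

```python
def screen_edges(samples, lam):
    """Hedged sketch of a screening rule for l1-regularized Ising
    estimation: keep only edges (i, j) whose absolute empirical
    pairwise moment exceeds the regularization parameter `lam`.
    `samples` is a list of {-1, +1} spin vectors.  The threshold
    form is illustrative, not the paper's exact condition."""
    n = len(samples)
    p = len(samples[0])
    edges = []
    for i in range(p):
        for j in range(i + 1, p):
            moment = sum(x[i] * x[j] for x in samples) / n
            if abs(moment) > lam:
                edges.append((i, j))
    return edges

# Toy usage: spins 0 and 1 always agree; spin 2 is independent.
data = [(1, 1, -1), (-1, -1, 1), (1, 1, 1), (-1, -1, -1)]
print(screen_edges(data, lam=0.5))  # only edge (0, 1) survives screening
```

The practical payoff is that the subsequent sparse estimation problem only needs to be solved over the surviving edges, which can shrink the problem dramatically at large regularization levels.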
no code implementations • 27 Feb 2017 • Sinong Geng, Zhaobin Kuang, David Page
In this way, many insights and optimization procedures for sparse logistic regression can be applied to the learning of discrete Markov networks.
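The connection to sparse logistic regression can be sketched with node-wise regression: each spin is regressed on the remaining spins with an $L_1$-penalized logistic loss, and nonzero weights indicate candidate edges. This is a minimal ISTA-style sketch under assumed hyperparameters, not the paper's algorithm:

```python
import math

def fit_node_logistic(samples, target, lr=0.1, steps=500, lam=0.05):
    """Regress spin `target` (in {-1, +1}) on the remaining spins with
    an l1-penalized logistic loss via gradient steps followed by
    soft-thresholding.  Nonzero weights mark candidate edges incident
    to `target`.  Names and hyperparameters are illustrative."""
    n = len(samples)
    others = [k for k in range(len(samples[0])) if k != target]
    w = [0.0] * len(others)
    for _ in range(steps):
        grad = [0.0] * len(others)
        for x in samples:
            z = sum(w[a] * x[k] for a, k in enumerate(others))
            y = x[target]
            # gradient of the logistic loss log(1 + exp(-y * z))
            g = -y / (1.0 + math.exp(y * z))
            for a, k in enumerate(others):
                grad[a] += g * x[k] / n
        for a in range(len(others)):
            w[a] -= lr * grad[a]
            # soft-threshold step for the l1 penalty
            w[a] = math.copysign(max(abs(w[a]) - lr * lam, 0.0), w[a])
    return dict(zip(others, w))

# Spins 0 and 1 always agree; spin 2 is independent noise.
data = [(1, 1, -1), (-1, -1, 1), (1, 1, 1), (-1, -1, -1)]
weights = fit_node_logistic(data, target=0)
# weights[1] is large (edge 0-1); weights[2] is driven to zero.
```

Running this regression for every node and combining the resulting supports (e.g., by intersection or union) recovers a Markov network structure, which is exactly how sparse logistic regression machinery transfers to discrete Markov network learning.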
no code implementations • 20 Apr 2016 • Zhaobin Kuang, James Thomson, Michael Caldwell, Peggy Peissig, Ron Stewart, David Page
Computational Drug Repositioning (CDR) is the task of discovering potential new indications for existing drugs by mining large-scale heterogeneous drug-related data sources.