no code implementations • 18 Aug 2024 • Xiawei Wang, James Sharpnack, Thomas C. M. Lee
Additionally, we propose to combine a mini-batched loss with binary cross-entropy to predict both lung cancer occurrence and the risk of mortality.
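A minimal sketch of how such a combined objective might look, not the paper's implementation: the `alpha` weight, the squared-error surrogate for the risk term, and the two prediction heads are all assumptions.

```python
import numpy as np

def bce(logits, labels, eps=1e-12):
    """Binary cross-entropy on sigmoid outputs."""
    p = 1.0 / (1.0 + np.exp(-logits))
    return -np.mean(labels * np.log(p + eps) + (1 - labels) * np.log(1 - p + eps))

def combined_minibatch_loss(occ_logits, occ_labels, risk_scores, risk_labels, alpha=0.5):
    """Hypothetical weighted sum over a mini-batch: BCE for cancer occurrence
    plus a squared-error surrogate for mortality risk (placeholder choice)."""
    occurrence_term = bce(occ_logits, occ_labels)
    risk_term = np.mean((risk_scores - risk_labels) ** 2)
    return alpha * occurrence_term + (1.0 - alpha) * risk_term
```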
no code implementations • 26 Apr 2024 • Yue Kang, Cho-Jui Hsieh, Thomas C. M. Lee
By truncating the observed payoffs and employing dynamic exploration, we propose a novel algorithm called LOTUS attaining a regret bound of order $\tilde{O}(d^{3/2} r^{1/2} T^{1/(1+\delta)}/\tilde{D}_{rr})$ without knowing $T$, which matches the state-of-the-art regret bound under sub-Gaussian noise~\citep{lu2021low, kang2022efficient} when $\delta = 1$.
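To illustrate the payoff-truncation idea for heavy-tailed rewards, here is a generic clipping schedule of order $t^{1/(1+\delta)}$; the constant `c` and the exact schedule are assumptions, not necessarily those used by LOTUS.

```python
import numpy as np

def truncate_payoff(y, t, delta, c=1.0):
    """Clip the observed payoff at a threshold growing with the round index t.
    If the (1 + delta)-th moment of the noise is bounded, a threshold of order
    t^{1/(1+delta)} trades off the bias and variance of the truncated estimate."""
    b = c * (t + 1) ** (1.0 / (1.0 + delta))
    return np.clip(y, -b, b)
```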
no code implementations • 14 Jan 2024 • Yue Kang, Cho-Jui Hsieh, Thomas C. M. Lee
In the stochastic contextual low-rank matrix bandit problem, the expected reward of an action is given by the inner product between the action's feature matrix and some fixed but initially unknown $d_1 \times d_2$ matrix $\Theta^*$ with rank $r \ll \min\{d_1, d_2\}$, and an agent sequentially takes actions based on past experience to maximize the cumulative reward.
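For concreteness, the expected reward of an arm with feature matrix $X$ is the trace inner product $\langle X, \Theta^* \rangle = \mathrm{tr}(X^\top \Theta^*)$; a toy sketch with a synthetic rank-$r$ parameter matrix (all dimensions below are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
d1, d2, r = 8, 6, 2
# Synthetic rank-r parameter matrix Theta* (unknown to the agent in the bandit setting).
Theta = rng.standard_normal((d1, r)) @ rng.standard_normal((r, d2))

def expected_reward(X, Theta):
    """Trace inner product <X, Theta> = trace(X^T Theta)."""
    return np.sum(X * Theta)

X = rng.standard_normal((d1, d2))   # one arm's feature matrix
print(expected_reward(X, Theta))
```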
no code implementations • 18 Feb 2023 • Yue Kang, Cho-Jui Hsieh, Thomas C. M. Lee
In stochastic contextual bandits, an agent sequentially selects actions from a time-dependent action set, using past experience to minimize the cumulative regret.
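A schematic of the interaction protocol under a linear reward model; the epsilon-greedy rule below is only a placeholder policy, not the algorithm studied in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
d, T, eps = 5, 1000, 0.1
theta_star = rng.standard_normal(d)           # unknown reward parameter
theta_hat = np.zeros(d)                       # running ridge/least-squares estimate
A, b = np.eye(d), np.zeros(d)
regret = 0.0

for t in range(T):
    arms = rng.standard_normal((10, d))       # time-dependent action set
    if rng.random() < eps:                    # placeholder epsilon-greedy policy
        a = rng.integers(len(arms))
    else:
        a = int(np.argmax(arms @ theta_hat))
    x = arms[a]
    reward = x @ theta_star + 0.1 * rng.standard_normal()
    regret += np.max(arms @ theta_star) - x @ theta_star
    A += np.outer(x, x); b += reward * x      # update sufficient statistics
    theta_hat = np.linalg.solve(A, b)

print("cumulative regret:", regret)
```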
no code implementations • 26 Jan 2022 • Zhenyu Wei, Raymond K. W. Wong, Thomas C. M. Lee
In the signal processing and statistics literature, the minimum description length (MDL) principle is a popular tool for choosing model complexity.
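As a toy illustration of the MDL idea (a crude two-part code length, not the criterion developed in the paper), one can compare polynomial fits of different orders and pick the order with the shortest description:

```python
import numpy as np

def mdl_score(x, y, degree):
    """Two-part MDL surrogate: residual code length (n/2 * log sigma^2) plus a
    parameter cost of 0.5 * log(n) per fitted coefficient."""
    n = len(y)
    coef = np.polyfit(x, y, degree)
    resid = y - np.polyval(coef, x)
    sigma2 = max(np.mean(resid ** 2), 1e-12)
    return 0.5 * n * np.log(sigma2) + 0.5 * (degree + 1) * np.log(n)

rng = np.random.default_rng(2)
x = np.linspace(-1, 1, 200)
y = 1 - 2 * x + 3 * x ** 2 + 0.2 * rng.standard_normal(x.size)
best = min(range(1, 8), key=lambda k: mdl_score(x, y, k))
print("degree selected by MDL:", best)   # typically 2 for this quadratic signal
```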
1 code implementation • 18 Nov 2021 • Yao Li, Minhao Cheng, Cho-Jui Hsieh, Thomas C. M. Lee
Despite the efficiency and scalability of machine learning systems, recent studies have demonstrated that many classification methods, especially deep neural networks (DNNs), are vulnerable to adversarial examples, i.e., examples that are carefully crafted to fool a well-trained classification model while remaining indistinguishable from natural data to humans.
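For intuition, a fast-gradient-sign style perturbation (a generic construction for crafting adversarial examples, unrelated to the detector proposed in this paper) moves each input coordinate slightly in the direction that increases the model's loss:

```python
import torch
import torch.nn.functional as F

def fgsm_example(model, x, y, eps=0.03):
    """One-step fast-gradient-sign perturbation: shift each pixel by +/- eps
    along the sign of the loss gradient, then clamp back to the valid range."""
    x_adv = x.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(x_adv), y)
    loss.backward()
    return (x_adv + eps * x_adv.grad.sign()).clamp(0, 1).detach()
```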
no code implementations • 5 Jun 2021 • Qin Ding, Yue Kang, Yi-Wei Liu, Thomas C. M. Lee, Cho-Jui Hsieh, James Sharpnack
To tackle this problem, we first propose a two-layer bandit structure for automatically tuning the exploration parameter, and further generalize it to the Syndicated Bandits framework, which can learn multiple hyper-parameters dynamically in contextual bandit environments.
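A stripped-down sketch of the two-layer idea: an outer adversarial bandit (EXP3 is used here purely as an illustrative choice) picks an exploration parameter from a candidate set each round, and the inner contextual bandit runs with the chosen value and feeds its reward back to the outer layer.

```python
import numpy as np

rng = np.random.default_rng(3)
candidates = [0.01, 0.1, 0.5, 1.0]      # candidate exploration parameters
weights = np.ones(len(candidates))      # EXP3 weights for the outer layer
gamma = 0.1

def exp3_choose():
    p = (1 - gamma) * weights / weights.sum() + gamma / len(weights)
    i = rng.choice(len(weights), p=p)
    return i, p[i]

def exp3_update(i, p_i, reward):        # reward assumed rescaled to [0, 1]
    weights[i] *= np.exp(gamma * reward / (p_i * len(weights)))

for t in range(1000):
    i, p_i = exp3_choose()
    alpha = candidates[i]
    # ... run one step of the inner contextual bandit (e.g., LinUCB) with
    # exploration parameter `alpha` and observe its reward ...
    reward = rng.random()               # placeholder for the observed reward
    exp3_update(i, p_i, reward)
```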
1 code implementation • 18 May 2021 • Yao Li, Tongyi Tang, Cho-Jui Hsieh, Thomas C. M. Lee
In this paper, we propose a new framework to detect adversarial examples, motivated by the observation that random components can improve the smoothness of predictors and make it easier to simulate the output distribution of a deep neural network.
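A hedged sketch of the underlying intuition only: repeatedly perturb the input with small random noise, simulate the network's output distribution, and flag inputs whose predictions are unstable. The statistic and the noise scale below are assumptions, not the paper's detector.

```python
import torch

def prediction_instability(model, x, n_draws=20, sigma=0.05):
    """Fraction of noisy copies whose predicted class differs from the clean
    prediction; large values suggest the input may be adversarial."""
    with torch.no_grad():
        clean = model(x).argmax(dim=1)
        flips = 0.0
        for _ in range(n_draws):
            noisy = model(x + sigma * torch.randn_like(x)).argmax(dim=1)
            flips += (noisy != clean).float().mean().item()
    return flips / n_draws
```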
no code implementations • 14 Nov 2019 • Suofei Wu, Jan Hannig, Thomas C. M. Lee
The main contribution is a new method that quantifies the uncertainties of the estimates and predictions produced by honest random forests.
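As background, an "honest" tree uses one subsample to choose the splits and a disjoint subsample to fill the leaf estimates; a minimal scikit-learn sketch of that construction is below (the uncertainty quantification itself is the paper's contribution and is not reproduced here).

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def honest_tree_predict(X_split, y_split, X_est, y_est, X_test):
    """Grow the tree structure on one subsample, then set each leaf value to the
    mean response of a disjoint 'estimation' subsample (a single honest tree)."""
    tree = DecisionTreeRegressor(min_samples_leaf=5).fit(X_split, y_split)
    leaves_est = tree.apply(X_est)
    leaf_mean = {leaf: y_est[leaves_est == leaf].mean()
                 for leaf in np.unique(leaves_est)}
    fallback = y_est.mean()            # for test leaves with no estimation points
    return np.array([leaf_mean.get(leaf, fallback) for leaf in tree.apply(X_test)])
```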
no code implementations • 4 Aug 2019 • Miles E. Lopes, Suofei Wu, Thomas C. M. Lee
When randomized ensemble methods such as bagging and random forests are implemented, a basic question arises: Is the ensemble large enough?
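One simple way to probe this question empirically (not the algorithmic error estimator developed in the paper) is to track how much the ensemble's predictions still change as trees are added; the dataset and sizes below are arbitrary.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.datasets import make_regression

X, y = make_regression(n_samples=500, n_features=10, noise=1.0, random_state=0)
prev = None
for n_trees in [25, 50, 100, 200, 400]:
    rf = RandomForestRegressor(n_estimators=n_trees, random_state=0).fit(X, y)
    pred = rf.predict(X)
    if prev is not None:
        # Mean absolute change in predictions when the ensemble size doubles;
        # once this is negligible, the ensemble is "large enough" in practice.
        print(n_trees, np.mean(np.abs(pred - prev)))
    prev = pred
```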
no code implementations • 19 Nov 2018 • Yao Li, Martin Renqiang Min, Wenchao Yu, Cho-Jui Hsieh, Thomas C. M. Lee, Erik Kruus
Recent studies have demonstrated the vulnerability of deep convolutional neural networks to adversarial examples.
no code implementations • 4 Nov 2018 • Yuefeng Liang, Cho-Jui Hsieh, Thomas C. M. Lee
Extreme multi-label classification aims to learn a classifier that annotates an instance with a relevant subset of labels from an extremely large label set.
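In its simplest (and, at extreme scale, impractical) form, the task reduces to one binary classifier per label over a sparse label matrix; a toy one-vs-rest sketch, with all sizes far smaller than a real extreme-scale label set:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from scipy.sparse import random as sparse_random

rng = np.random.default_rng(4)
n, d, L = 200, 20, 50                       # instances, features, labels (tiny here)
X = rng.standard_normal((n, d))
Y = (sparse_random(n, L, density=0.05, random_state=0).toarray() > 0).astype(int)

models = {}
for j in range(L):                          # one-vs-rest: one classifier per label
    if 0 < Y[:, j].sum() < n:               # skip labels with a single class
        models[j] = LogisticRegression(max_iter=200).fit(X, Y[:, j])

def predict_labels(x, top_k=5):
    """Return the top-k labels by predicted relevance for one instance."""
    scores = {j: m.predict_proba(x.reshape(1, -1))[0, 1] for j, m in models.items()}
    return sorted(scores, key=scores.get, reverse=True)[:top_k]
```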
1 code implementation • 28 Aug 2015 • Raymond K. W. Wong, Vinay L. Kashyap, Thomas C. M. Lee, David A. van Dyk
We embed change points into a marked Poisson process, where photon wavelengths are regarded as marks and both the Poisson intensity parameter and the distribution of the marks are allowed to change.
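A hedged simulation sketch of this model class with a single change point: the Poisson intensity and the mark (wavelength) distribution both change at time `tau`. All numerical values and the Gaussian mark distribution are purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(5)
T, tau = 100.0, 60.0                  # observation window and change point
lam = (2.0, 5.0)                      # Poisson intensity before / after tau
mark_mean = (400.0, 650.0)            # mean photon wavelength before / after (nm)

def simulate_segment(t0, t1, rate, mu):
    """Homogeneous marked Poisson process on [t0, t1): arrival times plus marks."""
    n = rng.poisson(rate * (t1 - t0))
    times = np.sort(rng.uniform(t0, t1, n))
    marks = rng.normal(mu, 20.0, n)   # illustrative mark (wavelength) distribution
    return times, marks

t1, m1 = simulate_segment(0.0, tau, lam[0], mark_mean[0])
t2, m2 = simulate_segment(tau, T, lam[1], mark_mean[1])
arrival_times = np.concatenate([t1, t2])
wavelengths = np.concatenate([m1, m2])
```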
Applications • Instrumentation and Methods for Astrophysics
no code implementations • 1 Mar 2015 • Raymond K. W. Wong, Thomas C. M. Lee
This paper considers the problem of matrix completion when the observed entries are noisy and contain outliers.
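A minimal soft-impute style sketch with residual clipping to blunt the influence of outliers; this is a generic robustification under assumed inputs (zero-filled observed matrix plus a 0/1 mask), not the estimator proposed in the paper.

```python
import numpy as np

def robust_soft_impute(M, mask, lam=1.0, clip=3.0, n_iter=100):
    """Iterative singular-value soft-thresholding on the observed entries.
    M: observed matrix with unobserved entries zero-filled; mask: 1 where observed.
    Residuals are clipped at `clip` so grossly corrupted entries have limited pull."""
    X = np.zeros_like(M, dtype=float)
    for _ in range(n_iter):
        resid = np.clip(M - X, -clip, clip)          # robustified residual
        Y = X + mask * resid                         # impute unobserved entries with X
        U, s, Vt = np.linalg.svd(Y, full_matrices=False)
        X = (U * np.maximum(s - lam, 0.0)) @ Vt      # singular-value soft-threshold
    return X
```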