Search Results for author: Yuqing Kong

Found 13 papers, 3 papers with code

Robust Decision Aggregation with Adversarial Experts

no code implementations 13 Mar 2024 Yongkang Guo, Yuqing Kong

The decision maker does not know the specific information structure, which is a joint distribution of signals, states, and strategies of adversarial experts.

Ensemble Learning

Algorithmic Robust Forecast Aggregation

no code implementations 31 Jan 2024 Yongkang Guo, Jason D. Hartline, Zhihuan Huang, Yuqing Kong, Anant Shah, Fang-Yi Yu

Given a family of information structures, robust forecast aggregation aims to find the aggregator with minimal worst-case regret compared to the omniscient aggregator.
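
As a reading aid, the worst-case regret objective described here can be written as a minimax problem. The display below is a sketch under assumed notation (family of information structures $\Theta$, experts' forecasts $x_1,\dots,x_n$, realized state $\omega$, loss $L$, and omniscient aggregator $f^{*}_{\theta}$ that knows $\theta$); the paper's exact formulation may differ.

    \[
    \hat{f} \;\in\; \arg\min_{f} \; \max_{\theta \in \Theta} \;
    \Big( \mathbb{E}_{\theta}\big[ L\big(f(x_1,\dots,x_n), \omega\big) \big]
    \;-\; \mathbb{E}_{\theta}\big[ L\big(f^{*}_{\theta}(x_1,\dots,x_n), \omega\big) \big] \Big)
    \]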

Robust Decision Aggregation with Second-order Information

no code implementations 23 Nov 2023 Yuqi Pan, Zhaohua Chen, Yuqing Kong

When the aggregator is deterministic, we present a robust aggregator that leverages second-order information, which can significantly outperform counterparts without it.

Near-Optimal Experimental Design Under the Budget Constraint in Online Platforms

no code implementations 10 Feb 2023 Yongkang Guo, Yuan Yuan, Jinshan Zhang, Yuqing Kong, Zhihua Zhu, Zheng Cai

A/B testing, or controlled experiments, is the gold standard approach to causally compare the performance of algorithms on online platforms.

Experimental Design

Information Elicitation Meets Clustering

no code implementations 3 Oct 2021 Yuqing Kong

In the setting where a large number of people are asked to answer a small number of multiple-choice questions (multi-task, large group), we propose an information aggregation method that is robust to people's strategies.

Clustering

Survey Equivalence: A Procedure for Measuring Classifier Accuracy Against Human Labels

1 code implementation 2 Jun 2021 Paul Resnick, Yuqing Kong, Grant Schoenebeck, Tim Weninger

We refer to such tasks as survey settings because the ground truth is defined through a survey of one or more human raters.

Modularity and Mutual Information in Networks: Two Sides of the Same Coin

no code implementations 3 Mar 2021 Yongkang Guo, Zhihuan Huang, Yuqing Kong, Qian Wang

At a high level, we show that the significance of community structure is equivalent to the amount of information contained in the network.
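
For background, the partition quality that "modularity" usually refers to is the Newman–Girvan measure below (standard notation, given here as context rather than quoted from the paper); the paper's result ties the significance of community structure under such a measure to the information content of the network.

    \[
    Q \;=\; \frac{1}{2m} \sum_{i,j} \Big( A_{ij} - \frac{k_i k_j}{2m} \Big)\, \delta(c_i, c_j),
    \]

where $A$ is the adjacency matrix, $k_i$ the degree of node $i$, $m$ the number of edges, and $\delta(c_i, c_j) = 1$ when nodes $i$ and $j$ are placed in the same community.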

Community Detection · Social and Information Networks

Equal Affection or Random Selection: the Quality of Subjective Feedback from a Group Perspective

no code implementations 24 Feb 2021 Jiale Chen, Yuqing Kong, Yuxuan Lu

With this assumption, we propose a new definition of uninformative feedback and correspondingly design a family of evaluation metrics for group-level feedback, called f-variety, which can (1) distinguish informative feedback from uninformative feedback (separation) even when their statistics are both uniform, and (2) decrease as the ratio of uninformative respondents increases (monotonicity).

Computer Science and Game Theory

TCGM: An Information-Theoretic Framework for Semi-Supervised Multi-Modality Learning

no code implementations ECCV 2020 Xinwei Sun, Yilun Xu, Peng Cao, Yuqing Kong, Lingjing Hu, Shanghang Zhang, Yizhou Wang

In this paper, we propose a novel information-theoretic approach, Total Correlation Gain Maximization (TCGM), for semi-supervised multi-modal learning, which is endowed with promising properties: (i) it can effectively utilize the information across different modalities of unlabeled data points to facilitate training classifiers for each modality; (ii) it has a theoretical guarantee to identify Bayesian classifiers, i.e., the ground-truth posteriors of all modalities.
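
As background for the name of the objective, total correlation is the standard multivariate generalization of mutual information; the definition below uses generic notation for the modalities and is not quoted from the paper, whose gain term may differ in form.

    \[
    \mathrm{TC}(X_1,\dots,X_k) \;=\; \sum_{i=1}^{k} H(X_i) \;-\; H(X_1,\dots,X_k)
    \]

It equals zero exactly when the modalities are mutually independent and grows as they share more information.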

Disease Prediction · Emotion Recognition · +1

L_DMI: A Novel Information-theoretic Loss Function for Training Deep Nets Robust to Label Noise

no code implementations NeurIPS 2019 Yilun Xu, Peng Cao, Yuqing Kong, Yizhou Wang

To the best of our knowledge, L_DMI is the first loss function that is provably robust to instance-independent label noise, regardless of noise pattern, and it can be applied straightforwardly to any existing classification neural network without any auxiliary information.

Ranked #36 on Image Classification on Clothing1M (using extra training data)

Learning with noisy labels

L_DMI: An Information-theoretic Noise-robust Loss Function

2 code implementations 8 Sep 2019 Yilun Xu, Peng Cao, Yuqing Kong, Yizhou Wang

To the best of our knowledge, L_DMI is the first loss function that is provably robust to instance-independent label noise, regardless of noise pattern, and it can be applied straightforwardly to any existing classification neural network without any auxiliary information.
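
Since this entry ships with public code, the released implementations are the authoritative reference; the snippet below is only a minimal PyTorch sketch of a determinant-based mutual information loss matching the description above. The joint-distribution matrix between softmax outputs and observed labels, and the small epsilon for numerical stability, are modeling choices of this sketch rather than details quoted from the paper.

    import torch
    import torch.nn.functional as F

    def dmi_loss(logits, noisy_labels, num_classes):
        # logits: (N, C) raw model outputs; noisy_labels: (N,) integer class ids.
        probs = F.softmax(logits, dim=1)                       # predicted class distributions
        onehot = F.one_hot(noisy_labels, num_classes).float()  # observed (possibly noisy) labels
        # Empirical joint-distribution matrix between predictions and labels, shape (C, C).
        joint = probs.t().mm(onehot) / probs.size(0)
        # Negative log |det| of the joint matrix; the epsilon guards against a vanishing determinant.
        return -torch.log(torch.abs(torch.det(joint)) + 1e-6)

Such a loss would be used in place of cross-entropy as the per-batch training criterion.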

Learning with noisy labels

Max-MIG: an Information Theoretic Approach for Joint Learning from Crowds

1 code implementation ICLR 2019 Peng Cao, Yilun Xu, Yuqing Kong, Yizhou Wang

Furthermore, we devise an accurate data-crowds forecaster that employs both the data and the crowdsourced labels to forecast the ground truth.

Water from Two Rocks: Maximizing the Mutual Information

no code implementations 24 Feb 2018 Yuqing Kong, Grant Schoenebeck

In co-training/multiview learning, the goal is to aggregate two views of data into a prediction for a latent label.

Multiview Learning · Vocal Bursts Valence Prediction
