Search Results for author: Xu Guo

Found 6 papers, 2 papers with code

Latent-Optimized Adversarial Neural Transfer for Sarcasm Detection

1 code implementation • NAACL 2021 • Xu Guo, Boyang Li, Han Yu, Chunyan Miao

The existence of multiple datasets for sarcasm detection prompts us to apply transfer learning to exploit their commonality.

Meta-Learning • Sarcasm Detection • +1

Federated Learning for Personalized Humor Recognition

no code implementations • 3 Dec 2020 • Xu Guo, Han Yu, Boyang Li, Hao Wang, Pengwei Xing, Siwei Feng, Zaiqing Nie, Chunyan Miao

In this paper, we propose the FedHumor approach for the recognition of humorous content in a personalized manner through Federated Learning (FL).

Federated Learning • Language Modelling
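FedHumor's own pipeline is not shown on this page, but the federated learning loop it builds on can be sketched. The following is a minimal federated-averaging example on a toy logistic-regression model; all names, data, and hyperparameters are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def local_update(w, X, y, lr=0.1, epochs=5):
    """One client's local gradient steps on a logistic-regression model.
    The raw (X, y) data never leaves the client; only weights do."""
    for _ in range(epochs):
        p = 1 / (1 + np.exp(-X @ w))          # predicted probabilities
        w = w - lr * X.T @ (p - y) / len(y)   # gradient of log-loss
    return w

def fed_avg(w, clients):
    """Server step: average the clients' updated weights,
    weighted by each client's dataset size (FedAvg)."""
    sizes = np.array([len(y) for _, y in clients])
    updates = np.stack([local_update(w, X, y) for X, y in clients])
    return (sizes[:, None] * updates).sum(axis=0) / sizes.sum()

# Synthetic data: four clients, labels from a shared linear rule.
rng = np.random.default_rng(1)
d = 5
w_true = rng.normal(size=d)
clients = []
for _ in range(4):
    X = rng.normal(size=(50, d))
    y = (X @ w_true > 0).astype(float)
    clients.append((X, y))

w = np.zeros(d)
for _ in range(20):          # communication rounds
    w = fed_avg(w, clients)
```

After a few rounds the shared model classifies the pooled client data well, even though the server never sees any raw examples. A personalized variant (as FedHumor targets) would additionally adapt the shared model per client.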

DIFER: Differentiable Automated Feature Engineering

no code implementations • 17 Oct 2020 • Guanghui Zhu, Zhuoer Xu, Xu Guo, Chunfeng Yuan, Yihua Huang

Extensive experiments on classification and regression datasets demonstrate that DIFER can significantly improve the performance of various machine learning algorithms and outperform current state-of-the-art AutoFE methods in terms of both efficiency and performance.

Automated Feature Engineering • Feature Engineering

False Discovery Rate Control Under General Dependence By Symmetrized Data Aggregation

1 code implementation • 27 Feb 2020 • Lilun Du, Xu Guo, Wenguang Sun, Changliang Zou

We develop a new class of distribution-free multiple testing rules for false discovery rate (FDR) control under general dependence.

Methodology • Statistics Theory
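To illustrate the sign-symmetry idea behind this style of FDR rule: if symmetrized statistics are roughly sign-symmetric under the null, the left tail can serve as an estimate of false discoveries in the right tail. The sketch below is a simplified thresholding rule in that spirit, not the paper's actual SDA procedure; the statistics and alpha level are synthetic assumptions:

```python
import numpy as np

def sign_symmetric_threshold(W, alpha=0.1):
    """Smallest threshold t with estimated false discovery proportion
    #{W_j <= -t} / max(#{W_j >= t}, 1) <= alpha. Relies on the nulls
    being approximately sign-symmetric around zero."""
    for t in np.sort(np.abs(W)):
        fdp_hat = np.sum(W <= -t) / max(np.sum(W >= t), 1)
        if fdp_hat <= alpha:
            return t
    return np.inf  # no threshold achieves the target level

# Toy data: 900 sign-symmetric nulls, 100 strong positive signals.
rng = np.random.default_rng(0)
W = np.concatenate([rng.normal(0, 1, 900), rng.normal(4, 1, 100)])
t = sign_symmetric_threshold(W, alpha=0.1)
rejected = np.sum(W >= t)   # hypotheses declared significant
```

The data-driven threshold rejects roughly the planted signals while keeping the estimated FDR at or below alpha; the paper's contribution is making this kind of guarantee hold under general dependence.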

Deep Learning Inversion of Electrical Resistivity Data

no code implementations • 10 Apr 2019 • Bin Liu, Qian Guo, Shucai Li, Benchao Liu, Yuxiao Ren, Yonghao Pang, Xu Guo, Lanbo Liu, Peng Jiang

According to comprehensive qualitative analysis and quantitative comparison, ERSInvNet with the tier feature map, smoothness constraints, and depth weighting function together achieves the best performance.

Model Selection
