Search Results for author: Guoxian Yu

Found 27 papers, 0 papers with code

Calibration-compatible Listwise Distillation of Privileged Features for CTR Prediction

no code implementations14 Dec 2023 Xiaoqiang Gui, Yueyao Cheng, Xiang-Rong Sheng, Yunfeng Zhao, Guoxian Yu, Shuguang Han, Yuning Jiang, Jian Xu, Bo Zheng

A typical practice is privileged features distillation (PFD): train a teacher model on all features (including the privileged ones), then distill its knowledge into a student model that excludes the privileged features and is employed for online serving.

Click-Through Rate Prediction
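The PFD recipe described in the snippet can be sketched in a few lines. This is a minimal, hypothetical pointwise version with toy logistic models, not the paper's calibration-compatible listwise variant: the teacher sees a privileged feature (e.g. post-click dwell time) that is unavailable at serving time, and the student is fit to the teacher's soft predictions using only the serving-time features.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy CTR data: 2 regular features plus 1 privileged feature, which is
# available at training time but cannot be used for online serving.
n = 2000
X_reg = rng.normal(size=(n, 2))
x_priv = (X_reg[:, 0] + rng.normal(scale=0.3, size=n)).reshape(-1, 1)
logits_true = 1.5 * X_reg[:, 0] - X_reg[:, 1] + 2.0 * x_priv[:, 0]
y = (rng.random(n) < 1 / (1 + np.exp(-logits_true))).astype(float)

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def fit_logreg(X, targets, epochs=300, lr=0.5):
    """Fit weights by gradient descent on cross-entropy vs. (soft) targets."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        p = sigmoid(X @ w)
        w -= lr * X.T @ (p - targets) / len(targets)
    return w

# Teacher: trained on ALL features against the hard labels.
X_teacher = np.hstack([X_reg, x_priv])
w_teacher = fit_logreg(X_teacher, y)
teacher_probs = sigmoid(X_teacher @ w_teacher)

# Student: regular features only, distilled by fitting the teacher's
# soft predictions instead of the hard labels.
w_student = fit_logreg(X_reg, teacher_probs)

# At serving time the student scores with regular features alone.
student_probs = sigmoid(X_reg @ w_student)
```

The design point the snippet makes is visible here: the privileged signal influences the student only indirectly, through the teacher's soft targets, so nothing privileged is needed online.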

Multi-dimensional Fair Federated Learning

no code implementations9 Dec 2023 Cong Su, Guoxian Yu, Jun Wang, Hui Li, Qingzhong Li, Han Yu

Federated learning (FL) has emerged as a promising collaborative and secure paradigm for training a model from decentralized data without compromising privacy.

Fairness Federated Learning

Federated Causality Learning with Explainable Adaptive Optimization

no code implementations9 Dec 2023 Dezhi Yang, Xintong He, Jun Wang, Guoxian Yu, Carlotta Domeniconi, Jinglin Zhang

We design a global optimization formula to naturally aggregate the causal graphs from client data and constrain the acyclicity of the global graph without exposing local data.

Causal Discovery
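Constraining the acyclicity of an aggregated global graph is usually done through a differentiable characterization of DAG-ness. Below is a sketch of one standard polynomial form (the DAG-GNN-style h(W); the paper's exact formulation may differ):

```python
import numpy as np

def acyclicity(W):
    """h(W) = tr((I + W*W/d)^d) - d, which is zero exactly when the
    weighted adjacency matrix W describes a DAG (no directed cycles)."""
    d = W.shape[0]
    M = np.eye(d) + (W * W) / d   # element-wise square removes edge signs
    return np.trace(np.linalg.matrix_power(M, d)) - d

# A DAG (0 -> 1 -> 2) scores zero; a 2-cycle scores strictly positive.
W_dag = np.array([[0., 1., 0.],
                  [0., 0., 1.],
                  [0., 0., 0.]])
W_cyc = np.array([[0., 1.],
                  [1., 0.]])
```

A federated server could average client adjacency estimates and add a penalty term lambda * h(W) to its aggregation objective, keeping the global graph acyclic without ever seeing local data.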

Multi-granularity Causal Structure Learning

no code implementations9 Dec 2023 Jiaxuan Liang, Jun Wang, Guoxian Yu, Shuyin Xia, Guoyin Wang

Unveiling, modeling, and comprehending the causal mechanisms underpinning natural phenomena stand as fundamental endeavors across myriad scientific disciplines.

Entire Space Cascade Delayed Feedback Modeling for Effective Conversion Rate Prediction

no code implementations9 Aug 2023 Yunfeng Zhao, Xu Yan, Xiaoqiang Gui, Shuguang Han, Xiang-Rong Sheng, Guoxian Yu, Jufeng Chen, Zhao Xu, Bo Zheng

Furthermore, there is delayed feedback in both conversion and refund events and they are sequentially dependent, named cascade delayed feedback (CDF), which significantly harms data freshness for model training.

Recommendation Systems Selection bias

Long-tail Cross Modal Hashing

no code implementations28 Nov 2022 Zijun Gao, Jun Wang, Guoxian Yu, Zhongmin Yan, Carlotta Domeniconi, Jinglin Zhang

LtCMH firstly adopts auto-encoders to mine the individuality and commonality of different modalities by minimizing the dependency between the individuality of respective modalities and by enhancing the commonality of these modalities.

Reinforcement Causal Structure Learning on Order Graph

no code implementations22 Nov 2022 Dezhi Yang, Guoxian Yu, Jun Wang, Zhengtian Wu, Maozu Guo

In this paper, we propose {Reinforcement Causal Structure Learning on Order Graph} (RCL-OG) that uses order graph instead of MCMC to model different DAG topological orderings and to reduce the problem size.

Causal Discovery Q-Learning

Open-Set Crowdsourcing using Multiple-Source Transfer Learning

no code implementations7 Nov 2021 Guangyang Han, Guoxian Yu, Lei Liu, Lizhen Cui, Carlotta Domeniconi, Xiangliang Zhang

First, OSCrowd integrates crowd theme related datasets into a large source domain to facilitate partial transfer learning to approximate the label space inference of these tasks.

Transfer Learning

Crowdsourcing with Meta-Workers: A New Way to Save the Budget

no code implementations7 Nov 2021 Guangyang Han, Guoxian Yu, Lizhen Cui, Carlotta Domeniconi, Xiangliang Zhang

Due to the unreliability of Internet workers, it's difficult to complete a crowdsourcing project satisfactorily, especially when there are many tasks and the budget is limited.

Few-Shot Learning Image Classification

Meta Cross-Modal Hashing on Long-Tailed Data

no code implementations7 Nov 2021 Runmin Wang, Guoxian Yu, Carlotta Domeniconi, Xiangliang Zhang

Due to the lack of training samples in the tail classes, MetaCMH first learns direct features from data in different modalities, and then introduces an associative memory module to learn the memory features of samples of the tail classes.

Meta-Learning

MetaMIML: Meta Multi-Instance Multi-Label Learning

no code implementations7 Nov 2021 Yuanlin Yang, Guoxian Yu, Jun Wang, Lei Liu, Carlotta Domeniconi, Maozu Guo

Multi-Instance Multi-Label learning (MIML) models complex objects (bags), each of which is associated with a set of interrelated labels and composed with a set of instances.

Meta-Learning Multi-Label Learning +1

Cross-modal Zero-shot Hashing by Label Attributes Embedding

no code implementations7 Nov 2021 Runmin Wang, Guoxian Yu, Lei Liu, Lizhen Cui, Carlotta Domeniconi, Xiangliang Zhang

Cross-modal hashing (CMH) is one of the most promising methods in cross-modal approximate nearest neighbor search.

Attribute
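As a concrete picture of what cross-modal hashing computes: both modalities are mapped into one shared Hamming space, and retrieval is a Hamming-distance scan over compact binary codes. The random projections below are hypothetical stand-ins for the learned, similarity-preserving hash functions such methods train:

```python
import numpy as np

rng = np.random.default_rng(0)

# One hash function per modality, mapping into the SAME n_bits Hamming space.
d_img, d_txt, n_bits = 8, 5, 16
P_img = rng.normal(size=(d_img, n_bits))
P_txt = rng.normal(size=(d_txt, n_bits))

def hash_codes(X, P):
    """Sign of a linear projection -> {0, 1} binary codes."""
    return (X @ P > 0).astype(np.uint8)

def hamming_search(query_code, db_codes, k=3):
    """Indices of the k database codes nearest in Hamming distance."""
    dists = (db_codes != query_code).sum(1)
    return np.argsort(dists)[:k]

# Query an image database with a text query, entirely in Hamming space.
imgs = rng.normal(size=(100, d_img))
txts = rng.normal(size=(10, d_txt))
img_db = hash_codes(imgs, P_img)
query = hash_codes(txts[:1], P_txt)[0]
top = hamming_search(query, img_db)
```

The efficiency argument for CMH lives in the last two lines: once codes are computed, cross-modal search is bitwise comparison, independent of the original feature dimensionalities.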

Few-Shot Partial-Label Learning

no code implementations2 Jun 2021 Yunfeng Zhao, Guoxian Yu, Lei Liu, Zhongmin Yan, Lizhen Cui, Carlotta Domeniconi

Partial-label learning (PLL) generally focuses on inducing a noise-tolerant multi-class classifier by training on overly-annotated samples, each of which is annotated with a set of labels, but only one is the valid label.

Few-Shot Learning Metric Learning +2
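The candidate-set setup in the snippet lends itself to a compact illustration: a toy prototype-based disambiguation loop (a generic PLL baseline, not this paper's few-shot method) that alternates between estimating class prototypes from current label confidences and reassigning each sample to its nearest candidate class.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy PLL data: 3 classes; each sample's candidate set contains the true
# label plus one random distractor (the true label is hidden at training).
centers = np.array([[0., 0.], [4., 0.], [0., 4.]])
y_true = rng.integers(0, 3, size=150)
X = centers[y_true] + rng.normal(scale=0.5, size=(150, 2))
candidates = np.zeros((150, 3), dtype=bool)
candidates[np.arange(150), y_true] = True
candidates[np.arange(150), rng.integers(0, 3, size=150)] = True

# Start with uniform confidence over each candidate set, then alternate:
# (1) class prototypes from current confidences,
# (2) move each sample's confidence to its nearest candidate prototype.
conf = candidates / candidates.sum(1, keepdims=True)
for _ in range(10):
    protos = (conf.T @ X) / (conf.sum(0)[:, None] + 1e-12)
    dists = ((X[:, None, :] - protos[None]) ** 2).sum(-1)
    dists[~candidates] = np.inf        # only candidate labels are eligible
    pick = dists.argmin(1)
    conf = np.eye(3)[pick]

pred = conf.argmax(1)
accuracy = (pred == y_true).mean()
```

The noise tolerance comes from the constraint in the loop: confidence can only move within each sample's candidate set, so the valid label is never discarded during disambiguation.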

Multi-typed Objects Multi-view Multi-instance Multi-label Learning

no code implementations6 Oct 2020 Yuanlin Yang, Guoxian Yu, Jun Wang, Carlotta Domeniconi, Xiangliang Zhang

Multi-typed objects Multi-view Multi-instance Multi-label Learning (M4L) deals with interconnected multi-typed objects (or bags) that are made of diverse instances, represented with heterogeneous feature views and annotated with a set of non-exclusive but semantically related labels.

Multi-Label Learning

Deep Incomplete Multi-View Multiple Clusterings

no code implementations2 Oct 2020 Shaowei Wei, Jun Wang, Guoxian Yu, Carlotta Domeniconi, Xiangliang Zhang

Multi-view clustering aims at exploiting information from multiple heterogeneous views to promote clustering.

Clustering

Partial Multi-label Learning with Label and Feature Collaboration

no code implementations17 Mar 2020 Tingting Yu, Guoxian Yu, Jun Wang, Maozu Guo

Partial multi-label learning (PML) models the scenario where each training instance is annotated with a set of candidate labels, and only some of the labels are relevant.

Multi-Label Learning

Attention-Aware Answers of the Crowd

no code implementations24 Dec 2019 Jingzheng Tu, Guoxian Yu, Jun Wang, Carlotta Domeniconi, Xiangliang Zhang

However, they all assume that workers' label quality is stable over time (always at the same level whenever they conduct the tasks).

Bayesian Inference

Multi-View Multiple Clusterings using Deep Matrix Factorization

no code implementations26 Nov 2019 Shaowei Wei, Jun Wang, Guoxian Yu, Carlotta Domeniconi, Xiangliang Zhang

Multi-view clustering aims at integrating complementary information from multiple heterogeneous views to improve clustering results.

Clustering

Prototypical Networks for Multi-Label Learning

no code implementations17 Nov 2019 Zhuo Yang, Yufei Han, Guoxian Yu, Qiang Yang, Xiangliang Zhang

We propose to formulate multi-label learning as an estimation of class distributions in a non-linear embedding space: for each label, its positive data embeddings and negative data embeddings distribute compactly to form a positive component and a negative component, respectively, while the two components are pushed away from each other.

Multi-Label Classification Multi-Label Learning
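The per-label positive/negative components described above can be made concrete with a tiny numeric sketch: toy embeddings, component means as prototypes, and a plain distance margin as the label score (not the paper's learned non-linear embedding).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy embedding space with 2 labels; label l is "on" when coordinate l > 0,
# so positives and negatives of each label occupy separable regions.
Z = rng.normal(size=(200, 4))
Y = (Z[:, :2] > 0).astype(int)

def label_prototypes(Z, Y, l):
    """Positive and negative component means for label l."""
    pos = Z[Y[:, l] == 1].mean(0)
    neg = Z[Y[:, l] == 0].mean(0)
    return pos, neg

def score(z, pos, neg):
    """Sigmoid of the distance margin: nearer the positive prototype -> higher."""
    d_pos = ((z - pos) ** 2).sum()
    d_neg = ((z - neg) ** 2).sum()
    return 1 / (1 + np.exp(d_pos - d_neg))

# Predict label 0 for every sample from its two prototype distances.
pos, neg = label_prototypes(Z, Y, 0)
preds = np.array([score(z, pos, neg) > 0.5 for z in Z])
accuracy = (preds == (Y[:, 0] == 1)).mean()
```

Training the embedding (which this sketch skips) is what makes the components compact and well separated; the decision rule itself stays this simple.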

Active Multi-Label Crowd Consensus

no code implementations7 Nov 2019 Jinzheng Tu, Guoxian Yu, Carlotta Domeniconi, Jun Wang, Xiangliang Zhang

AMCC accounts for the commonality and individuality of workers, and assumes that workers can be organized into different groups.

Cross-modal Zero-shot Hashing

no code implementations19 Aug 2019 Xuanwu Liu, Zhao Li, Jun Wang, Guoxian Yu, Carlotta Domeniconi, Xiangliang Zhang

It then defines an objective function to achieve deep feature learning compatible with the composite similarity preserving, category attribute space learning, and hashing coding function learning.

Attribute Retrieval

Weakly-paired Cross-Modal Hashing

no code implementations29 May 2019 Xuanwu Liu, Jun Wang, Guoxian Yu, Carlotta Domeniconi, Xiangliang Zhang

FlexCMH first introduces a clustering-based matching strategy to explore the local structure of each cluster, and thus to find the potential correspondence between clusters (and samples therein) across modalities.

Clustering Retrieval
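The clustering-based matching idea can be illustrated with two toy modalities that share latent clusters but have no known sample pairing: cluster each modality independently, then match clusters across modalities by centroid proximity. This is a simplified stand-in for FlexCMH's matching strategy, with a fixed offset playing the role of a different feature space.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two modalities sharing three latent clusters; sample order differs and
# no cross-modal pairs are given.
centers = np.array([[0., 0.], [6., 0.], [0., 6.]])
A = centers[rng.integers(0, 3, 90)] + rng.normal(scale=0.4, size=(90, 2))
B = centers[rng.integers(0, 3, 90)] + 1.0 + rng.normal(scale=0.4, size=(90, 2))

def kmeans(X, k, iters=20):
    """Lloyd's k-means with farthest-point initialization."""
    C = [X[0]]
    for _ in range(k - 1):
        d = np.min([((X - c) ** 2).sum(1) for c in C], axis=0)
        C.append(X[d.argmax()])
    C = np.array(C)
    for _ in range(iters):
        lab = ((X[:, None] - C[None]) ** 2).sum(-1).argmin(1)
        C = np.array([X[lab == j].mean(0) for j in range(k)])
    return C, lab

# Cluster each modality on its own, then match clusters across modalities
# by nearest centroid -- the potential correspondence the method exploits.
Ca, la = kmeans(A, 3)
Cb, lb = kmeans(B, 3)
match = ((Ca[:, None] - Cb[None]) ** 2).sum(-1).argmin(1)
```

Once clusters are matched, samples inside corresponding clusters can serve as weak cross-modal pairs for learning the hash functions, which is what makes the weakly-paired setting workable.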

ActiveHNE: Active Heterogeneous Network Embedding

no code implementations14 May 2019 Xia Chen, Guoxian Yu, Jun Wang, Carlotta Domeniconi, Zhao Li, Xiangliang Zhang

To maximize the profit of utilizing the rare and valuable supervised information in HNEs, we develop a novel Active Heterogeneous Network Embedding (ActiveHNE) framework, which includes two components: Discriminative Heterogeneous Network Embedding (DHNE) and Active Query in Heterogeneous Networks (AQHN).

Network Embedding

Multi-View Multi-Instance Multi-Label Learning based on Collaborative Matrix Factorization

no code implementations13 May 2019 Yuying Xing, Guoxian Yu, Carlotta Domeniconi, Jun Wang, Zili Zhang, Maozu Guo

To preserve the intrinsic structure of the data matrices, M3Lcmf collaboratively factorizes them into low-rank matrices, explores the latent relationships between bags, instances, and labels, and selectively merges the data matrices.

Multi-Label Learning

Multi-View Multiple Clustering

no code implementations13 May 2019 Shixing Yao, Guoxian Yu, Jun Wang, Carlotta Domeniconi, Xiangliang Zhang

It then uses matrix factorization on the individual matrices, along with the shared matrix, to generate diverse clusterings of high-quality.

Clustering Representation Learning

Ranking-based Deep Cross-modal Hashing

no code implementations11 May 2019 Xuanwu Liu, Guoxian Yu, Carlotta Domeniconi, Jun Wang, Yazhou Ren, Maozu Guo

Next, to expand the semantic representation power of hand-crafted features, RDCMH integrates the semantic ranking information into deep cross-modal hashing and jointly optimizes the compatible parameters of deep feature representations and of hashing functions.

Cross-Modal Retrieval Retrieval

Multiple Independent Subspace Clusterings

no code implementations10 May 2019 Xing Wang, Jun Wang, Carlotta Domeniconi, Guoxian Yu, Guo-Qiang Xiao, Maozu Guo

To ease this process, we consider diverse clusterings embedded in different subspaces, and analyze the embedding subspaces to shed light into the structure of each clustering.

Clustering
