Search Results for author: Tao Qi

Found 47 papers, 9 papers with code

Named Entity Recognition with Context-Aware Dictionary Knowledge

no code implementations CCL 2020 Chuhan Wu, Fangzhao Wu, Tao Qi, Yongfeng Huang

In addition, we propose an auxiliary term classification task to predict the types of the matched entity names, and jointly train it with the NER model to fuse both contexts and dictionary knowledge into NER.
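A minimal sketch of such a joint training setup, assuming a simple BiGRU encoder, hypothetical label counts, and a loss weight `alpha` that are not taken from the paper:

```python
import torch
import torch.nn as nn

class JointNERModel(nn.Module):
    """Toy joint model: token-level NER tagging plus an auxiliary head that
    predicts the type of a matched dictionary term (span)."""
    def __init__(self, hidden=256, vocab=30000, ner_labels=9, term_types=5):
        super().__init__()
        self.embed = nn.Embedding(vocab, hidden)
        self.encoder = nn.GRU(hidden, hidden, batch_first=True, bidirectional=True)
        self.ner_head = nn.Linear(2 * hidden, ner_labels)    # per-token tags
        self.term_head = nn.Linear(2 * hidden, term_types)   # per-span term type

    def forward(self, tokens, span_mask):
        h, _ = self.encoder(self.embed(tokens))               # [B, L, 2H]
        ner_logits = self.ner_head(h)                         # [B, L, ner_labels]
        # mean-pool the hidden states of the matched dictionary span
        span = (h * span_mask.unsqueeze(-1)).sum(1) / span_mask.sum(1, keepdim=True)
        term_logits = self.term_head(span)                    # [B, term_types]
        return ner_logits, term_logits

def joint_loss(ner_logits, ner_labels, term_logits, term_labels, alpha=0.5):
    ce = nn.CrossEntropyLoss()
    ner = ce(ner_logits.flatten(0, 1), ner_labels.flatten())
    term = ce(term_logits, term_labels)
    return ner + alpha * term   # jointly optimize NER and term classification
```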

Named Entity Recognition +1

FedSampling: A Better Sampling Strategy for Federated Learning

no code implementations25 Jun 2023 Tao Qi, Fangzhao Wu, Lingjuan Lyu, Yongfeng Huang, Xing Xie

In this paper, instead of client uniform sampling, we propose a novel data uniform sampling strategy for federated learning (FedSampling), which can effectively improve the performance of federated learning especially when client data size distribution is highly imbalanced across clients.
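A toy sketch of the data-uniform idea, where each local example is selected with equal probability so larger clients contribute proportionally more per round (the function name is hypothetical, and the privacy-preserving estimation of the total data size used in the actual paper is omitted):

```python
import numpy as np

def data_uniform_client_sample(client_sizes, budget, rng=None):
    """Each local example is selected independently with probability
    budget / total_size, so clients with more data contribute
    proportionally more examples per round."""
    rng = rng or np.random.default_rng()
    total = sum(client_sizes)
    p = min(1.0, budget / total)
    # number of examples each client contributes this round
    return [rng.binomial(n, p) for n in client_sizes]

# example: 4 clients with highly imbalanced data sizes
print(data_uniform_client_sample([1000, 50, 10, 2000], budget=300))
```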

Federated Learning Privacy Preserving

FairVFL: A Fair Vertical Federated Learning Framework with Contrastive Adversarial Learning

1 code implementation7 Jun 2022 Tao Qi, Fangzhao Wu, Chuhan Wu, Lingjuan Lyu, Tong Xu, Zhongliang Yang, Yongfeng Huang, Xing Xie

In order to learn a fair unified representation, we send it to each platform storing fairness-sensitive features and apply adversarial learning to remove bias from the unified representation inherited from the biased data.
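Adversarial debiasing of a shared representation is commonly implemented with a gradient reversal layer; the sketch below shows that generic pattern (whether FairVFL uses exactly this operator is an assumption):

```python
import torch
import torch.nn as nn

class GradReverse(torch.autograd.Function):
    """Gradient reversal: identity in the forward pass, negated gradient in the
    backward pass, so the encoder learns to *hide* the sensitive attribute."""
    @staticmethod
    def forward(ctx, x):
        return x.view_as(x)
    @staticmethod
    def backward(ctx, grad):
        return -grad

adversary = nn.Linear(128, 2)                       # predicts the sensitive attribute
unified_repr = torch.randn(32, 128, requires_grad=True)
logits = adversary(GradReverse.apply(unified_repr))
# training the adversary on these logits pushes unified_repr toward fairness
```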

Fairness Privacy Preserving +1

Robust Quantity-Aware Aggregation for Federated Learning

no code implementations22 May 2022 Jingwei Yi, Fangzhao Wu, Huishuai Zhang, Bin Zhu, Tao Qi, Guangzhong Sun, Xing Xie

Federated learning (FL) enables multiple clients to collaboratively train models without sharing their local data, and has become an important privacy-preserving machine learning framework.

Federated Learning Privacy Preserving

FedCL: Federated Contrastive Learning for Privacy-Preserving Recommendation

no code implementations21 Apr 2022 Chuhan Wu, Fangzhao Wu, Tao Qi, Yongfeng Huang, Xing Xie

In this paper, we propose a federated contrastive learning method named FedCL for privacy-preserving recommendation, which can exploit high-quality negative samples for effective model training with privacy well protected.

Contrastive Learning Privacy Preserving

News Recommendation with Candidate-aware User Modeling

no code implementations10 Apr 2022 Tao Qi, Fangzhao Wu, Chuhan Wu, Yongfeng Huang

Existing methods for news recommendation usually model user interest from historically clicked news without considering the candidate news.

News Recommendation

FUM: Fine-grained and Fast User Modeling for News Recommendation

no code implementations10 Apr 2022 Tao Qi, Fangzhao Wu, Chuhan Wu, Yongfeng Huang

The core idea of FUM is to concatenate the clicked news into a long document and transform user modeling into a document modeling task with both intra-news and inter-news word-level interactions.
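A hypothetical illustration of the concatenation step (the boundary tokens and whitespace tokenization are assumptions, not FUM's exact preprocessing):

```python
# Concatenate clicked news titles into one long token sequence (with boundary
# tokens) so a single long-text model can capture both intra-news and
# inter-news word-level interactions.
def build_user_document(clicked_titles, sep="[NSEP]", cls="[CLS]"):
    tokens = [cls]
    for title in clicked_titles:
        tokens.extend(title.split())
        tokens.append(sep)   # marks the boundary between news articles
    return tokens

doc = build_user_document(["stocks rally on earnings", "new phone released"])
print(doc)  # a single "user document" to feed an efficient long-text model
```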

News Recommendation

ProFairRec: Provider Fairness-aware News Recommendation

1 code implementation10 Apr 2022 Tao Qi, Fangzhao Wu, Chuhan Wu, Peijie Sun, Le Wu, Xiting Wang, Yongfeng Huang, Xing Xie

To learn provider-fair representations from biased data, we employ provider-biased representations to inherit provider bias from data.

Fairness News Recommendation

Unified and Effective Ensemble Knowledge Distillation

no code implementations1 Apr 2022 Chuhan Wu, Fangzhao Wu, Tao Qi, Yongfeng Huang

In addition, we weight the distillation loss based on the overall prediction correctness of the teacher ensemble to distill high-quality knowledge.
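A sketch of a correctness-weighted distillation loss in this spirit (the exact weighting scheme and the temperature are assumptions, not the paper's formulation):

```python
import torch
import torch.nn.functional as F

def weighted_ensemble_distill_loss(student_logits, teacher_logits_list, labels, T=2.0):
    """Distill from the averaged teacher-ensemble distribution, weighting each
    example by whether the ensemble predicts it correctly."""
    teacher_probs = torch.stack(
        [F.softmax(t / T, dim=-1) for t in teacher_logits_list]).mean(0)
    weights = (teacher_probs.argmax(-1) == labels).float()   # 1 if ensemble is correct
    kd = F.kl_div(F.log_softmax(student_logits / T, dim=-1), teacher_probs,
                  reduction="none").sum(-1)                   # per-example KL
    return (weights * kd).mean() * T * T
```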

Knowledge Distillation Transfer Learning

Semi-FairVAE: Semi-supervised Fair Representation Learning with Adversarial Variational Autoencoder

no code implementations1 Apr 2022 Chuhan Wu, Fangzhao Wu, Tao Qi, Yongfeng Huang

In this paper, we propose a semi-supervised fair representation learning approach based on adversarial variational autoencoder, which can reduce the dependency of adversarial fair models on data with labeled sensitive attributes.

Attribute Fairness +1

End-to-end Learnable Diversity-aware News Recommendation

no code implementations1 Apr 2022 Chuhan Wu, Fangzhao Wu, Tao Qi, Yongfeng Huang

Different from existing news recommendation methods that are usually based on point- or pair-wise ranking, in LeaDivRec we propose a more effective list-wise news recommendation model.

News Recommendation

FairRank: Fairness-aware Single-tower Ranking Framework for News Recommendation

no code implementations1 Apr 2022 Chuhan Wu, Fangzhao Wu, Tao Qi, Yongfeng Huang

Since candidate news selection can be biased, we propose to use a shared candidate-aware user model to match user interest with a real displayed candidate news and a random news, respectively, to learn a candidate-aware user embedding that reflects user interest in candidate news and a candidate-invariant user embedding that indicates intrinsic user interest.

Attribute Fairness +1

Are Big Recommendation Models Fair to Cold Users?

no code implementations28 Feb 2022 Chuhan Wu, Fangzhao Wu, Tao Qi, Yongfeng Huang

They are usually learned on historical user behavior data to infer user interest and predict future user behaviors (e.g., clicks).

Fairness Recommendation Systems

Quality-aware News Recommendation

no code implementations28 Feb 2022 Chuhan Wu, Fangzhao Wu, Tao Qi, Yongfeng Huang

In this paper, we propose a quality-aware news recommendation method named QualityRec that can effectively improve the quality of recommended news.

News Recommendation

NoisyTune: A Little Noise Can Help You Finetune Pretrained Language Models Better

no code implementations ACL 2022 Chuhan Wu, Fangzhao Wu, Tao Qi, Yongfeng Huang, Xing Xie

In this paper, we propose a very simple yet effective method named NoisyTune to help better finetune PLMs on downstream tasks by adding some noise to the parameters of PLMs before fine-tuning.
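A minimal sketch of the noise-injection step, assuming uniform noise scaled by each parameter matrix's standard deviation (treat the exact formula and the scale value as assumptions):

```python
import torch

def add_noise_to_plm(model, scale=0.15):
    """Add small uniform noise to every parameter before fine-tuning, with the
    noise range scaled by each parameter matrix's standard deviation."""
    with torch.no_grad():
        for p in model.parameters():
            std = p.std() if p.numel() > 1 else p.abs()
            noise = (torch.rand_like(p) - 0.5) * 2 * scale * std
            p.add_(noise)

# usage: add_noise_to_plm(pretrained_model); then fine-tune on the downstream task
```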

FedAttack: Effective and Covert Poisoning Attack on Federated Recommendation via Hard Sampling

no code implementations10 Feb 2022 Chuhan Wu, Fangzhao Wu, Tao Qi, Yongfeng Huang, Xing Xie

However, existing general FL poisoning methods for degrading model performance are either ineffective or insufficiently covert when used to poison federated recommender systems.

Federated Learning Recommendation Systems

Game of Privacy: Towards Better Federated Platform Collaboration under Privacy Restriction

no code implementations10 Feb 2022 Chuhan Wu, Fangzhao Wu, Tao Qi, Yanlin Wang, Yuqing Yang, Yongfeng Huang, Xing Xie

To solve the game, we propose a platform negotiation method that simulates the bargaining among platforms and locally optimizes their policies via gradient descent.

Vertical Federated Learning

Uni-FedRec: A Unified Privacy-Preserving News Recommendation Framework for Model Training and Online Serving

no code implementations Findings (EMNLP) 2021 Tao Qi, Fangzhao Wu, Chuhan Wu, Yongfeng Huang, Xing Xie

In this paper, we propose a unified news recommendation framework, which can utilize user data locally stored in user clients to train models and serve users in a privacy-preserving way.

News Generation News Recommendation +2

UserBERT: Contrastive User Model Pre-training

no code implementations3 Sep 2021 Chuhan Wu, Fangzhao Wu, Yang Yu, Tao Qi, Yongfeng Huang, Xing Xie

Two self-supervision tasks are incorporated in UserBERT for user model pre-training on unlabeled user behavior data to empower user modeling.

Is News Recommendation a Sequential Recommendation Task?

no code implementations20 Aug 2021 Chuhan Wu, Fangzhao Wu, Tao Qi, Yongfeng Huang

News recommendation is often modeled as a sequential recommendation task, which assumes that there are rich short-term dependencies over historical clicked news.

News Recommendation Sequential Recommendation

Smart Bird: Learnable Sparse Attention for Efficient and Effective Transformer

no code implementations20 Aug 2021 Chuhan Wu, Fangzhao Wu, Tao Qi, Binxing Jiao, Daxin Jiang, Yongfeng Huang, Xing Xie

We then sample token pairs based on their probability scores derived from the sketched attention matrix to generate different sparse attention index matrices for different attention heads.
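A toy version of this sampling step, turning a sketched attention matrix into per-head sparse index matrices (shapes and normalization are illustrative assumptions):

```python
import torch

def sample_sparse_attention_indices(sketch_attn, num_pairs, num_heads):
    """Sample (query, key) token pairs per head with probability proportional
    to the scores in a sketched [L, L] attention matrix."""
    L = sketch_attn.size(0)
    probs = torch.softmax(sketch_attn.flatten(), dim=0)            # [L*L]
    heads = []
    for _ in range(num_heads):
        idx = torch.multinomial(probs, num_pairs, replacement=False)
        rows = idx.div(L, rounding_mode="floor")
        cols = idx % L
        heads.append(torch.stack([rows, cols], dim=-1))            # [num_pairs, 2]
    return heads  # one sparse index matrix per attention head

pairs = sample_sparse_attention_indices(torch.randn(16, 16), num_pairs=32, num_heads=4)
```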

Fastformer: Additive Attention Can Be All You Need

9 code implementations20 Aug 2021 Chuhan Wu, Fangzhao Wu, Tao Qi, Yongfeng Huang, Xing Xie

In this way, Fastformer can achieve effective context modeling with linear complexity.
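The linear-complexity ingredient is additive attention, which summarizes a sequence with a single learned scoring vector; a minimal sketch of that pooling step (not the full Fastformer layer):

```python
import torch
import torch.nn as nn

class AdditiveAttentionPool(nn.Module):
    """Summarize a sequence into one global vector in O(L), instead of
    O(L^2) pairwise self-attention."""
    def __init__(self, dim):
        super().__init__()
        self.score = nn.Linear(dim, 1)

    def forward(self, x):                              # x: [B, L, D]
        alpha = torch.softmax(self.score(x), dim=1)    # [B, L, 1]
        return (alpha * x).sum(dim=1)                  # [B, D] global summary

# In a Fastformer-like layer, such a global query (and then a global key) is
# combined element-wise with per-token keys/values; this sketch shows only the
# linear-complexity pooling step.
```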

Ranked #1 on News Recommendation on MIND (using extra training data)

News Recommendation Text Classification +1

HieRec: Hierarchical User Interest Modeling for Personalized News Recommendation

no code implementations ACL 2021 Tao Qi, Fangzhao Wu, Chuhan Wu, Peiru Yang, Yang Yu, Xing Xie, Yongfeng Huang

Instead of a single user embedding, in our method each user is represented in a hierarchical interest tree to better capture their diverse and multi-grained interest in news.

News Recommendation

Personalized News Recommendation with Knowledge-aware Interactive Matching

1 code implementation20 Apr 2021 Tao Qi, Fangzhao Wu, Chuhan Wu, Yongfeng Huang

Our method interactively models candidate news and user interest to facilitate their accurate matching.

Knowledge Graphs News Recommendation

MM-Rec: Multimodal News Recommendation

no code implementations15 Apr 2021 Chuhan Wu, Fangzhao Wu, Tao Qi, Yongfeng Huang

Most existing news representation methods learn news representations only from news texts while ignoring visual information in news, such as images.

News Recommendation Object Detection +1

Empowering News Recommendation with Pre-trained Language Models

1 code implementation15 Apr 2021 Chuhan Wu, Fangzhao Wu, Tao Qi, Yongfeng Huang

Our PLM-empowered news recommendation models have been deployed to the Microsoft News platform, and achieved significant gains in terms of both click and pageview in both English-speaking and global markets.

Natural Language Understanding News Recommendation

FeedRec: News Feed Recommendation with Various User Feedbacks

no code implementations9 Feb 2021 Chuhan Wu, Fangzhao Wu, Tao Qi, Yongfeng Huang

Besides, the feed recommendation models trained solely on click behaviors cannot optimize other objectives such as user engagement.

News Recommendation

NewsBERT: Distilling Pre-trained Language Model for Intelligent News Application

no code implementations Findings (EMNLP) 2021 Chuhan Wu, Fangzhao Wu, Yang Yu, Tao Qi, Yongfeng Huang, Qi Liu

However, existing language models are pre-trained and distilled on general corpora such as Wikipedia, which have gaps with the news domain and may be suboptimal for news intelligence.

Knowledge Distillation Language Modelling +2

SentiRec: Sentiment Diversity-aware Neural News Recommendation

no code implementations AACL 2020 Chuhan Wu, Fangzhao Wu, Tao Qi, Yongfeng Huang

We learn user representations from browsed news representations, and compute click scores based on user and candidate news representations.
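A common way to implement this matching step is an inner product between the user vector and each candidate news vector; a small sketch (the paper's exact scorer may differ):

```python
import torch

def click_score(user_vec, candidate_news_vecs):
    """Click score as the inner product between the user representation and
    each candidate news representation."""
    return candidate_news_vecs @ user_vec        # [num_candidates]

user = torch.randn(256)
candidates = torch.randn(10, 256)
print(click_score(user, candidates).shape)       # torch.Size([10])
```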

News Recommendation

Improving Attention Mechanism with Query-Value Interaction

no code implementations8 Oct 2020 Chuhan Wu, Fangzhao Wu, Tao Qi, Yongfeng Huang

We propose a query-value interaction function which can learn query-aware attention values, and combine them with the original values and attention weights to form the final output.
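A rough sketch of one possible query-value interaction; the specific gating/mixing form below is an assumption for illustration:

```python
import torch
import torch.nn as nn

class QueryValueInteraction(nn.Module):
    """Learn query-aware values and mix them with the original values before
    the attention-weighted sum."""
    def __init__(self, dim):
        super().__init__()
        self.interact = nn.Linear(2 * dim, dim)

    def forward(self, query, values, attn_weights):
        # query: [B, D], values: [B, L, D], attn_weights: [B, L]
        q = query.unsqueeze(1).expand_as(values)
        query_aware = torch.tanh(self.interact(torch.cat([q, values], dim=-1)))
        mixed = values + query_aware                              # keep original values
        return (attn_weights.unsqueeze(-1) * mixed).sum(dim=1)    # [B, D]
```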

PTUM: Pre-training User Model from Unlabeled User Behaviors via Self-supervision

1 code implementation Findings (EMNLP) 2020 Chuhan Wu, Fangzhao Wu, Tao Qi, Jianxun Lian, Yongfeng Huang, Xing Xie

Motivated by pre-trained language models, which are pre-trained on large-scale unlabeled corpora to empower many downstream tasks, in this paper we propose to pre-train user models from large-scale unlabeled user behavior data.

Attentive Pooling with Learnable Norms for Text Representation

no code implementations ACL 2020 Chuhan Wu, Fangzhao Wu, Tao Qi, Xiaohui Cui, Yongfeng Huang

Different from existing pooling methods that use a fixed pooling norm, we propose to learn the norm in an end-to-end manner to automatically find the optimal norms for text representation in different tasks.
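A minimal sketch of pooling with a norm parameter p optimized end-to-end with the rest of the model (the exact parameterization is an assumption):

```python
import torch
import torch.nn as nn

class LearnableNormPooling(nn.Module):
    """Generalized p-norm pooling where p is trainable, instead of being fixed
    (e.g. p=1 mean pooling, large p approaching max pooling)."""
    def __init__(self, init_p=2.0):
        super().__init__()
        self.p = nn.Parameter(torch.tensor(init_p))

    def forward(self, x, eps=1e-6):               # x: [B, L, D], assumed non-negative
        p = torch.clamp(self.p, min=1.0)          # keep the norm well defined
        return ((x.clamp(min=eps) ** p).mean(dim=1)) ** (1.0 / p)
```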

Graph Enhanced Representation Learning for News Recommendation

no code implementations31 Mar 2020 Suyu Ge, Chuhan Wu, Fangzhao Wu, Tao Qi, Yongfeng Huang

Existing news recommendation methods achieve personalization by building accurate news representations from news content and user representations from their direct interactions with news (e.g., clicks), while ignoring the high-order relatedness between users and news.

Graph Attention News Recommendation +1

FedNER: Privacy-preserving Medical Named Entity Recognition with Federated Learning

no code implementations20 Mar 2020 Suyu Ge, Fangzhao Wu, Chuhan Wu, Tao Qi, Yongfeng Huang, Xing Xie

Since the labeled data in different platforms usually has some differences in entity type and annotation criteria, instead of constraining different platforms to share the same model, we decompose the medical NER model in each platform into a shared module and a private module.

Federated Learning Medical Named Entity Recognition +4

Reviews Meet Graphs: Enhancing User and Item Representations for Recommendation with Hierarchical Attentive Graph Neural Network

no code implementations IJCNLP 2019 Chuhan Wu, Fangzhao Wu, Tao Qi, Suyu Ge, Yongfeng Huang, Xing Xie

In the review content-view, we propose to use a hierarchical model to first learn sentence representations from words, then learn review representations from sentences, and finally learn user/item representations from reviews.
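A small sketch of this hierarchical encoder, reusing the same attention pooling at each level (dimensions and encoders here are illustrative, not the paper's exact ones):

```python
import torch
import torch.nn as nn

class AttnPool(nn.Module):
    """Attention pooling used at each level of the hierarchy."""
    def __init__(self, dim):
        super().__init__()
        self.score = nn.Linear(dim, 1)
    def forward(self, x):                                # [B, L, D] -> [B, D]
        a = torch.softmax(self.score(x), dim=1)
        return (a * x).sum(1)

# word embeddings -> sentence vectors -> review vectors -> user/item vector
word_pool, sent_pool, review_pool = AttnPool(128), AttnPool(128), AttnPool(128)
words = torch.randn(4 * 6, 20, 128)            # (reviews*sentences, words, dim)
sents = word_pool(words).view(4, 6, 128)       # sentence vectors per review
reviews = sent_pool(sents).unsqueeze(0)        # [1, num_reviews, dim]
user_vec = review_pool(reviews)                # [1, dim]
```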

Multi-View Learning Representation Learning +1

Neural News Recommendation with Heterogeneous User Behavior

no code implementations IJCNLP 2019 Chuhan Wu, Fangzhao Wu, Mingxiao An, Tao Qi, Jianqiang Huang, Yongfeng Huang, Xing Xie

In the user representation module, we propose an attentive multi-view learning framework to learn unified representations of users from their heterogeneous behaviors such as search queries, clicked news and browsed webpages.

Multi-View Learning News Recommendation

Detecting and Extracting of Adverse Drug Reaction Mentioning Tweets with Multi-Head Self Attention

no code implementations WS 2019 Suyu Ge, Tao Qi, Chuhan Wu, Yongfeng Huang

This paper describes our system for the first and second shared tasks of the fourth Social Media Mining for Health Applications (SMM4H) workshop.

Language Modelling Task 2 +1

THU_NGN at SemEval-2019 Task 3: Dialog Emotion Classification using Attentional LSTM-CNN

no code implementations SEMEVAL 2019 Suyu Ge, Tao Qi, Chuhan Wu, Yongfeng Huang

With the development of the Internet, dialog systems are widely used in online platforms to provide personalized services for their users.

Emotion Classification Emotion Recognition +2

THU_NGN at SemEval-2019 Task 12: Toponym Detection and Disambiguation on Scientific Papers

no code implementations SEMEVAL 2019 Tao Qi, Suyu Ge, Chuhan Wu, Yubo Chen, Yongfeng Huang

Toponym resolution is an important and challenging task in the natural language processing field, and has wide applications such as emergency response and social media geographical event analysis.

POS Toponym Resolution +1
