Search Results for author: Qingkai Zeng

Found 17 papers, 7 papers with code

ChatEL: Entity Linking with Chatbots

1 code implementation · 20 Feb 2024 · Yifan Ding, Qingkai Zeng, Tim Weninger

Fortunately, Large Language Models (LLMs) like GPT offer a highly advanced solution to the problems inherent in EL models, but naive prompting of LLMs alone does not work well.

Entity Linking · Sentence

EntGPT: Linking Generative Large Language Models with Knowledge Bases

no code implementations · 9 Feb 2024 · Yifan Ding, Amrit Poudel, Qingkai Zeng, Tim Weninger, Balaji Veeramani, Sanmitra Bhattacharya

Overall, the prompting method improves the micro-F1 score of the original vanilla models by a large margin, in some cases by 36% or more, and achieves performance comparable to existing SFT-based methods across 10 datasets.

Entity Disambiguation · Fact Checking +2

Democratizing Large Language Models via Personalized Parameter-Efficient Fine-tuning

no code implementations · 6 Feb 2024 · Zhaoxuan Tan, Qingkai Zeng, Yijun Tian, Zheyuan Liu, Bing Yin, Meng Jiang

OPPU integrates parametric user knowledge in the personal PEFT parameters with non-parametric knowledge acquired through retrieval and the user profile.

Retrieval

When eBPF Meets Machine Learning: On-the-fly OS Kernel Compartmentalization

no code implementations · 11 Jan 2024 · Zicheng Wang, Tiejin Chen, Qinrun Dai, Yueqi Chen, Hua Wei, Qingkai Zeng

Compartmentalization effectively prevents initial corruption from turning into a successful attack.

Auto-Instruct: Automatic Instruction Generation and Ranking for Black-Box Language Models

no code implementations · 19 Oct 2023 · Zhihan Zhang, Shuohang Wang, Wenhao Yu, Yichong Xu, Dan Iter, Qingkai Zeng, Yang Liu, Chenguang Zhu, Meng Jiang

Large language models (LLMs) can perform a wide range of tasks by following natural language instructions, without the necessity of task-specific fine-tuning.

Automatic Controllable Product Copywriting for E-Commerce

1 code implementation · 21 Jun 2022 · Xiaojie Guo, Qingkai Zeng, Meng Jiang, Yun Xiao, Bo Long, Lingfei Wu

Automatic product description generation for e-commerce has witnessed significant advancement in the past decade.

Aspect Extraction · Language Modelling +2

Traceability Transformed: Generating More Accurate Links with Pre-Trained BERT Models

1 code implementation · 8 Feb 2021 · Jinfeng Lin, Yalin Liu, Qingkai Zeng, Meng Jiang, Jane Cleland-Huang

In this study, we propose a novel framework called Trace BERT (T-BERT) to generate trace links between source code and natural language artifacts.

Transfer Learning · Software Engineering

Tri-Train: Automatic Pre-Fine Tuning between Pre-Training and Fine-Tuning for SciNER

no code implementations · Findings of the Association for Computational Linguistics 2020 · Qingkai Zeng, Wenhao Yu, Mengxia Yu, Tianwen Jiang, Tim Weninger, Meng Jiang

The training process of scientific NER models is commonly performed in two steps: i) pre-training a language model with self-supervised tasks on huge amounts of data, and ii) fine-tuning with small labelled data.

Language Modelling · NER

Technical Question Answering across Tasks and Domains

1 code implementation · NAACL 2021 · Wenhao Yu, Lingfei Wu, Yu Deng, Qingkai Zeng, Ruchi Mahindru, Sinem Guven, Meng Jiang

In this paper, we propose a novel framework of deep transfer learning to effectively address technical QA across tasks and domains.

Question Answering · Reading Comprehension +2

Crossing Variational Autoencoders for Answer Retrieval

no code implementations · ACL 2020 · Wenhao Yu, Lingfei Wu, Qingkai Zeng, Shu Tao, Yu Deng, Meng Jiang

Existing methods learn semantic representations with dual encoders or dual variational autoencoders.

Retrieval

Faceted Hierarchy: A New Graph Type to Organize Scientific Concepts and a Construction Method

no code implementations · WS 2019 · Qingkai Zeng, Mengxia Yu, Wenhao Yu, JinJun Xiong, Yiyu Shi, Meng Jiang

In a scientific concept hierarchy, a parent concept may have a few attributes, each of which takes multiple values, where each value corresponds to a group of child concepts.

Face Recognition
