Search Results for author: Yuqing Xie

Found 15 papers, 6 papers with code

Equivariant Symmetry Breaking Sets

no code implementations · 5 Feb 2024 · Yuqing Xie, Tess Smidt

Minimizing the size of these symmetry breaking sets equates to data efficiency.

LLM-Powered Hierarchical Language Agent for Real-time Human-AI Coordination

1 code implementation · 23 Dec 2023 · Jijia Liu, Chao Yu, Jiaxuan Gao, Yuqing Xie, Qingmin Liao, Yi Wu, Yu Wang

AI agents powered by Large Language Models (LLMs) have made significant advances, enabling them to assist humans in diverse complex tasks and leading to a revolution in human-AI coordination.

Code Generation

Approximating Human-Like Few-shot Learning with GPT-based Compression

no code implementations · 14 Aug 2023 · Cynthia Huang, Yuqing Xie, Zhiying Jiang, Jimmy Lin, Ming Li

Leveraging the approximated information distance, our method allows the direct application of GPT models in quantitative text similarity measurements.
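The approximated information distance mentioned above is, in standard formulations, the normalized compression distance (NCD). A minimal sketch using gzip as a stand-in compressor (the paper's contribution is using GPT-based compression instead of a generic compressor; gzip here is illustrative only):

```python
import gzip

def ncd(x: str, y: str) -> float:
    """Normalized compression distance: approximates the (uncomputable)
    information distance by substituting a real compressor's output length."""
    cx = len(gzip.compress(x.encode()))
    cy = len(gzip.compress(y.encode()))
    cxy = len(gzip.compress((x + y).encode()))
    return (cxy - min(cx, cy)) / max(cx, cy)

a = "the cat sat on the mat"
b = "completely different words appear in this sentence"
# A string compresses far better alongside itself than alongside
# unrelated text, so ncd(a, a) is much smaller than ncd(a, b).
print(ncd(a, a), ncd(a, b))
```

Swapping gzip for a neural compressor changes only how the compressed lengths are obtained; the distance formula stays the same.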

Data Compression · Few-Shot Learning · +6

Gated Recurrent Unit Network Based on Hilbert-Schmidt Independence Criterion for State-of-Health Estimation

no code implementations · 16 Mar 2023 · Ziyue Huang, Lujuan Dang, Yuqing Xie, Wentao Ma, Badong Chen

State-of-health (SOH) estimation is a key step in ensuring the safe and reliable operation of batteries.
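For context, the Hilbert-Schmidt Independence Criterion named in the title has a standard empirical estimator, commonly given as follows (the paper's exact formulation may differ):

```latex
\operatorname{HSIC}(X, Y) = \frac{1}{(n-1)^2} \operatorname{tr}(KHLH),
\qquad H = I_n - \tfrac{1}{n}\mathbf{1}\mathbf{1}^{\top},
```

where $K_{ij} = k(x_i, x_j)$ and $L_{ij} = l(y_i, y_j)$ are kernel Gram matrices over the $n$ samples, and $H$ is the centering matrix.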

Improving Prediction Backward-Compatibility in NLP Model Upgrade with Gated Fusion

no code implementations · 4 Feb 2023 · Yi-An Lai, Elman Mansimov, Yuqing Xie, Yi Zhang

When upgrading neural models to a newer version, new errors that were not encountered in the legacy version can be introduced, known as regression errors.


An Embedding-Based Grocery Search Model at Instacart

no code implementations · 12 Sep 2022 · Yuqing Xie, Taesik Na, Xiao Xiao, Saurav Manchanda, Young Rao, Zhihong Xu, Guanghua Shu, Esther Vasiete, Tejaswi Tenneti, Haixun Wang

To train the model efficiently on noisy data, we propose a self-adversarial learning method and a cascade training method.
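The snippet above does not spell out the loss, but self-adversarial schemes commonly reweight negative examples by the model's own scores so that harder negatives contribute more. A hypothetical sketch of such softmax weighting (the function name and `temperature` parameter are illustrative, not from the paper):

```python
import math

def self_adversarial_weights(neg_scores, temperature=1.0):
    """Softmax over negative-example scores: negatives the model already
    ranks highly (hard negatives) receive larger loss weights.
    This is a common formulation, not necessarily the paper's exact one."""
    exps = [math.exp(s / temperature) for s in neg_scores]
    z = sum(exps)
    return [e / z for e in exps]

# The highest-scoring (hardest) negative gets the largest weight.
w = self_adversarial_weights([2.0, 0.5, -1.0])
print(w)
```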

Evaluating Token-Level and Passage-Level Dense Retrieval Models for Math Information Retrieval

1 code implementation · 21 Mar 2022 · Wei Zhong, Jheng-Hong Yang, Yuqing Xie, Jimmy Lin

With the recent success of dense retrieval methods based on bi-encoders, studies have applied this approach to various interesting downstream retrieval tasks with good efficiency and in-domain effectiveness.
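A bi-encoder embeds queries and passages independently, then ranks passages by a similarity score. A toy sketch with hand-picked 3-d vectors (dot-product scoring is one common choice; the actual models use learned high-dimensional embeddings):

```python
def dot(u, v):
    """Dot-product similarity between two equal-length vectors."""
    return sum(a * b for a, b in zip(u, v))

# Query and passage vectors are produced by separate encoder passes,
# so passage embeddings can be precomputed and indexed offline.
query_vec = [0.2, 0.9, 0.1]
passage_vecs = {
    "p1": [0.1, 0.8, 0.0],  # topically close to the query
    "p2": [0.9, 0.1, 0.3],  # topically distant
}
ranked = sorted(passage_vecs, key=lambda p: dot(query_vec, passage_vecs[p]), reverse=True)
print(ranked)  # ['p1', 'p2']
```

Precomputing the passage side is what gives bi-encoders their retrieval efficiency relative to cross-encoders.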

Ranked #1 on Math Information Retrieval on ARQMath (using extra training data)

Information Retrieval · Math · +2

Segatron: Segment-Aware Transformer for Language Modeling and Understanding

1 code implementation · 30 Apr 2020 · He Bai, Peng Shi, Jimmy Lin, Yuqing Xie, Luchen Tan, Kun Xiong, Wen Gao, Ming Li

To verify this, we propose a segment-aware Transformer (Segatron), by replacing the original token position encoding with a combined position encoding of paragraph, sentence, and token.
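A minimal sketch of such a combined position encoding, assuming the three embeddings are summed elementwise (table sizes, random weights, and the summation are illustrative assumptions):

```python
import random

random.seed(0)
d_model, max_pos = 8, 32

def make_table(n, d):
    """Toy embedding table: n positions, each a d-dimensional vector."""
    return [[random.gauss(0, 1) for _ in range(d)] for _ in range(n)]

# One table per granularity: paragraph index, sentence index within the
# paragraph, and token index within the sentence.
para_emb = make_table(max_pos, d_model)
sent_emb = make_table(max_pos, d_model)
tok_emb = make_table(max_pos, d_model)

def segment_aware_position(p, s, t):
    """Segment-aware position encoding: elementwise sum of the paragraph,
    sentence, and token position embeddings."""
    return [a + b + c for a, b, c in zip(para_emb[p], sent_emb[s], tok_emb[t])]

vec = segment_aware_position(0, 2, 5)
print(len(vec))  # 8
```

This replaces the single flat token-position table of a vanilla Transformer while keeping the output dimensionality unchanged.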

Language Modelling · Masked Language Modeling · +3

Rapid Adaptation of BERT for Information Extraction on Domain-Specific Business Documents

1 code implementation · 5 Feb 2020 · Ruixue Zhang, Wei Yang, Luyun Lin, Zhengkai Tu, Yuqing Xie, Zihang Fu, Yuhao Xie, Luchen Tan, Kun Xiong, Jimmy Lin

Techniques for automatically extracting important content elements from business documents such as contracts, statements, and filings have the potential to make business operations more efficient.

Asymmetric Correntropy for Robust Adaptive Filtering

no code implementations · 21 Nov 2019 · Badong Chen, Yuqing Xie, Zhuang Li, Yingsong Li, Pengju Ren

Correntropy is generally defined as the expectation of a Gaussian kernel between two random variables.
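In standard form this definition reads (the Gaussian kernel sometimes carries a normalizing factor $1/(\sqrt{2\pi}\sigma)$, which does not affect the arguments that follow):

```latex
V_\sigma(X, Y) = \mathbb{E}\!\left[\kappa_\sigma(X - Y)\right],
\qquad \kappa_\sigma(e) = \exp\!\left(-\frac{e^2}{2\sigma^2}\right),
```

where $\sigma > 0$ is the kernel bandwidth and the expectation is taken over the joint distribution of $X$ and $Y$.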

Multi-Kernel Correntropy for Robust Learning

no code implementations · 24 May 2019 · Badong Chen, Yuqing Xie, Xin Wang, Zejian Yuan, Pengju Ren, Jing Qin

In a recent work, the concept of mixture correntropy (MC) was proposed to improve the learning performance, where the kernel function is a mixture Gaussian kernel, namely a linear combination of several zero-mean Gaussian kernels with different widths.
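As commonly written, the mixture correntropy described above replaces the single Gaussian kernel with a convex combination of $m$ zero-mean Gaussian kernels of different widths:

```latex
V_M(X, Y) = \mathbb{E}\!\left[\sum_{i=1}^{m} \alpha_i \,
\kappa_{\sigma_i}(X - Y)\right],
\qquad \alpha_i \ge 0, \quad \sum_{i=1}^{m} \alpha_i = 1,
```

where $\kappa_{\sigma_i}$ is a Gaussian kernel with bandwidth $\sigma_i$ and the $\alpha_i$ are the mixture coefficients.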

Data Augmentation for BERT Fine-Tuning in Open-Domain Question Answering

no code implementations · 14 Apr 2019 · Wei Yang, Yuqing Xie, Luchen Tan, Kun Xiong, Ming Li, Jimmy Lin

Recently, a simple combination of passage retrieval using off-the-shelf IR techniques and a BERT reader was found to be very effective for question answering directly on Wikipedia, yielding a large improvement over the previous state of the art on a standard benchmark dataset.

Data Augmentation · Open-Domain Question Answering · +2
