Search Results for author: Haochen Li

Found 11 papers, 2 papers with code

KiPT: Knowledge-injected Prompt Tuning for Event Detection

no code implementations • COLING 2022 • Haochen Li, Tong Mo, Hongcheng Fan, Jingkun Wang, Jiaxi Wang, Fuhao Zhang, Weiping Li

Knowledge-injected prompts are constructed using external knowledge bases, and a prompt tuning strategy is used to optimize them (see the sketch below).

Event Detection
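The snippet above reads as a soft-prompt setup with knowledge-based initialization. A minimal sketch of that reading in PyTorch (not the authors' code; `plm`, `concept_embeddings`, and the HuggingFace-style `inputs_embeds` argument are assumptions):

```python
# Minimal sketch of knowledge-injected prompt tuning, assuming soft prompts
# initialized from KB concept embeddings and a frozen backbone encoder.
import torch
import torch.nn as nn

class KnowledgeInjectedPrompt(nn.Module):
    def __init__(self, plm, concept_embeddings):   # concept_embeddings: (P, H)
        super().__init__()
        self.plm = plm
        for p in self.plm.parameters():            # freeze the backbone
            p.requires_grad = False
        # Soft prompts start from external-KB concept embeddings, stay trainable.
        self.prompts = nn.Parameter(concept_embeddings.clone())

    def forward(self, input_embeds):               # input_embeds: (B, T, H)
        batch = input_embeds.size(0)
        prompts = self.prompts.unsqueeze(0).expand(batch, -1, -1)
        # Prepend prompts to token embeddings; only self.prompts get gradients.
        # Assumes a HuggingFace-style encoder that accepts inputs_embeds.
        return self.plm(inputs_embeds=torch.cat([prompts, input_embeds], dim=1))
```

Only the prompt vectors are updated during tuning, which is what makes this style of approach cheap relative to full fine-tuning.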

Prompt-based Graph Model for Joint Liberal Event Extraction and Event Schema Induction

no code implementations • 19 Mar 2024 • Haochen Li, Di Geng

The researchers therefore propose Liberal Event Extraction (LEE), which aims to extract events and discover event schemas simultaneously.

Event Extraction

GraphERE: Jointly Multiple Event-Event Relation Extraction via Graph-Enhanced Event Embeddings

no code implementations • 19 Mar 2024 • Haochen Li, Di Geng

First, we enrich the event embeddings with event-argument and structure features using static AMR graphs and IE graphs; then, to jointly extract multiple event relations, we use a Node Transformer and construct task-specific Dynamic Event Graphs for each relation type (a rough sketch follows below).

Event Relation Extraction • Multi-Task Learning • +2
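Taking the pipeline above literally, a rough sketch of the joint extraction step (my reading, not the released GraphERE code; the module sizes and the per-relation bilinear scorers are assumptions):

```python
# Sketch: fuse event embeddings with graph-derived features, encode all event
# nodes jointly with a transformer, then score event pairs per relation type.
import torch
import torch.nn as nn

class GraphERESketch(nn.Module):
    def __init__(self, hidden=256, n_relation_types=4, n_heads=4):
        super().__init__()
        self.fuse = nn.Linear(2 * hidden, hidden)  # event + AMR/IE graph features
        layer = nn.TransformerEncoderLayer(hidden, n_heads, batch_first=True)
        self.node_transformer = nn.TransformerEncoder(layer, num_layers=2)
        # One scorer per relation type (e.g. temporal, causal, subevent, coref).
        self.scorers = nn.ModuleList(
            [nn.Bilinear(hidden, hidden, 1) for _ in range(n_relation_types)]
        )

    def forward(self, event_emb, graph_feat):      # both (B, N, H)
        x = self.fuse(torch.cat([event_emb, graph_feat], dim=-1))
        x = self.node_transformer(x)               # joint encoding of event nodes
        B, N, H = x.shape
        heads = x.unsqueeze(2).expand(B, N, N, H).reshape(-1, H)
        tails = x.unsqueeze(1).expand(B, N, N, H).reshape(-1, H)
        # One (B, N, N) score matrix per relation type.
        return [s(heads, tails).view(B, N, N) for s in self.scorers]
```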

Towards Goal-oriented Large Language Model Prompting: A Survey

no code implementations • 25 Jan 2024 • Haochen Li, Jonathan Leung, Zhiqi Shen

Large Language Models (LLMs) have shown prominent performance on various downstream tasks, and prompt engineering plays a pivotal role in optimizing that performance.

Language Modelling • Large Language Model • +1

Rewriting the Code: A Simple Method for Large Language Model Augmented Code Search

no code implementations • 9 Jan 2024 • Haochen Li, Xin Zhou, Zhiqi Shen

In code search, the Generation-Augmented Retrieval (GAR) framework generates exemplar code snippets to augment queries. Particularly given the demonstrated code generation capabilities of Large Language Models (LLMs), it has emerged as a promising strategy for addressing the principal challenge of modality misalignment between code snippets and natural language queries (see the sketch below).

Code Generation • Code Search • +4
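A hedged sketch of the GAR idea described above, assuming dense retrieval with cosine similarity; `generate_code` and `embed` are hypothetical stand-ins for an LLM call and a shared encoder:

```python
# Sketch of Generation-Augmented Retrieval for code search: the LLM writes an
# exemplar snippet for the query, and retrieval runs on the augmented query.
import numpy as np

def gar_search(query: str, corpus_vecs: np.ndarray, generate_code, embed, k=10):
    exemplar = generate_code(query)          # LLM-generated candidate code
    augmented = query + "\n" + exemplar      # one simple way to augment the query
    q = embed(augmented)                     # encode into the retrieval space
    sims = corpus_vecs @ q / (
        np.linalg.norm(corpus_vecs, axis=1) * np.linalg.norm(q) + 1e-8
    )
    return np.argsort(-sims)[:k]             # indices of the top-k code snippets
```

The exemplar moves the query toward the code modality before retrieval, which is how GAR attacks the modality-misalignment problem named above.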

Rethinking Negative Pairs in Code Search

1 code implementation • 12 Oct 2023 • Haochen Li, Xin Zhou, Luu Anh Tuan, Chunyan Miao

In our proposed loss function, we apply three methods to estimate the weights of negative pairs and show that the vanilla InfoNCE loss is a special case of Soft-InfoNCE (a minimal sketch follows below).

Code Search • Contrastive Learning • +2
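From the abstract alone, a weighted InfoNCE of the following shape fits the description. This is a minimal sketch, not the paper's released loss; the one property taken from the text is that uniform unit weights recover vanilla InfoNCE:

```python
# Soft-InfoNCE sketch: negative pairs carry per-pair weights; with all weights
# equal to 1 the expression collapses to the vanilla InfoNCE loss.
import torch

def soft_info_nce(pos_sim, neg_sims, neg_weights, temperature=0.07):
    """pos_sim: (B,); neg_sims and neg_weights: (B, K)."""
    pos = torch.exp(pos_sim / temperature)
    neg = (neg_weights * torch.exp(neg_sims / temperature)).sum(dim=1)
    return -torch.log(pos / (pos + neg)).mean()
```

How the weights are estimated is the paper's contribution (it describes three estimators); `neg_weights = torch.ones_like(neg_sims)` gives back the special case noted above.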

An Empirical Analysis on Financial Markets: Insights from the Application of Statistical Physics

no code implementations • 28 Aug 2023 • Haochen Li, Yi Cao, Maria Polukarov, Carmine Ventre

In this study, we introduce a physical model inspired by statistical physics for predicting price volatility and expected returns by leveraging Level 3 order book data.

Detecting Financial Market Manipulation with Statistical Physics Tools

no code implementations • 16 Aug 2023 • Haochen Li, Maria Polukarov, Carmine Ventre

We take inspiration from statistical physics to develop a novel conceptual framework for the analysis of financial markets.

Anomaly Detection

Joint Event Extraction via Structural Semantic Matching

no code implementations • 6 Jun 2023 • Haochen Li, Tianhao Gao, Jingkun Wang, Weiping Li

Event Extraction (EE), one of the essential tasks in information extraction, aims to detect event mentions in text and identify their corresponding argument roles.

Event Detection • Event Extraction

Exploring Representation-Level Augmentation for Code Search

1 code implementation • 21 Oct 2022 • Haochen Li, Chunyan Miao, Cyril Leung, Yanxian Huang, Yuan Huang, Hongyu Zhang, Yanlin Wang

In this paper, we explore augmentation methods that augment data (both code and query) at the representation level, which requires no additional data processing or training; on this basis, we propose a general format of representation-level augmentation that unifies existing methods (sketched below).

Code Search • Contrastive Learning • +1
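One plausible instance of that general format (a sketch only; the specific operations and coefficients here, interpolation with an in-batch neighbour plus stochastic masking, are illustrative assumptions):

```python
# Representation-level augmentation sketch: transform encoded vectors directly,
# so no extra data processing or training passes are needed.
import torch

def augment_representation(r, r_other, alpha=0.9, drop_p=0.1):
    """r, r_other: (B, H) representations of queries/code from the same batch."""
    mixed = alpha * r + (1.0 - alpha) * r_other       # linear interpolation
    mask = (torch.rand_like(mixed) > drop_p).float()  # stochastic masking
    return mixed * mask                               # augmented view of r
```

Because the transform acts on already-encoded vectors, augmented positives and negatives for contrastive learning come essentially for free.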

Sparse Winograd Convolutional neural networks on small-scale systolic arrays

no code implementations • 3 Oct 2018 • Feng Shi, Haochen Li, Yuhe Gao, Benjamin Kuschner, Song-Chun Zhu

The reconfigurability, energy efficiency, and massive parallelism of FPGAs make them one of the best choices for implementing efficient deep learning accelerators.

Layout Design
