Search Results for author: Zeping Yu

Found 7 papers, 4 papers with code

How do Large Language Models Learn In-Context? Query and Key Matrices of In-Context Heads are Two Towers for Metric Learning

no code implementations · 5 Feb 2024 · Zeping Yu, Sophia Ananiadou

In shallow layers, the features of demonstrations are merged into their corresponding labels, and the features of the input text are aggregated into the last token.

In-Context Learning · Metric Learning
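The "two towers" claim above can be illustrated with a toy sketch: the query projection of the last token and the key projections of the label tokens act as two separate towers, and the attention score between them behaves like a learned similarity metric, so the most similar label receives the most attention mass. All names, dimensions, and random vectors below are hypothetical illustrations, not the paper's actual setup.

```python
import numpy as np

rng = np.random.default_rng(0)
d_model, d_head = 16, 8

# Hypothetical projections: W_q maps the last token's residual stream to a
# query, W_k maps each label token's residual stream to a key (two towers).
W_q = rng.normal(size=(d_model, d_head))
W_k = rng.normal(size=(d_model, d_head))

last_token = rng.normal(size=d_model)    # stand-in for aggregated input-text features
labels = rng.normal(size=(3, d_model))   # stand-ins for 3 label tokens

q = last_token @ W_q
K = labels @ W_k

# Scaled dot-product attention: scores act as a similarity metric
# between the query tower's output and each key tower's output.
scores = K @ q / np.sqrt(d_head)
attn = np.exp(scores - scores.max())
attn = attn / attn.sum()

# The label whose key is "closest" to the query gets the most attention.
predicted = int(np.argmax(attn))
print(predicted, attn.round(3))
```

This mirrors retrieval-style metric learning: ranking candidates by an inner product between two separately projected representations.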

EmoLLMs: A Series of Emotional Large Language Models and Annotation Tools for Comprehensive Affective Analysis

1 code implementation · 16 Jan 2024 · Zhiwei Liu, Kailai Yang, Tianlin Zhang, Qianqian Xie, Zeping Yu, Sophia Ananiadou

In this paper, we propose EmoLLMs, the first series of open-sourced instruction-following LLMs for comprehensive affective analysis, built by fine-tuning various LLMs with instruction data; the first multi-task affective analysis instruction dataset (AAID), with 234K data samples drawn from various classification and regression tasks, to support LLM instruction tuning; and a comprehensive affective evaluation benchmark (AEB), with 14 tasks from various sources and domains, to test the generalization ability of LLMs.

Instruction Following · Regression +1

Emotion Detection for Misinformation: A Review

no code implementations · 1 Nov 2023 · Zhiwei Liu, Tianlin Zhang, Kailai Yang, Paul Thompson, Zeping Yu, Sophia Ananiadou

The emotions and sentiments of netizens, as expressed in social media posts and news, constitute important factors that can help to distinguish fake news from genuine news and to understand the spread of rumors.

Fake News Detection · Misinformation

CodeCMR: Cross-Modal Retrieval For Function-Level Binary Source Code Matching

1 code implementation · NeurIPS 2020 · Zeping Yu, Wenxin Zheng, Jiaqi Wang, Qiyi Tang, Sen Nie, Shi Wu

We adopt a Deep Pyramid Convolutional Neural Network (DPCNN) for source code feature extraction and a Graph Neural Network (GNN) for binary code feature extraction.

Computer Security · Cross-Modal Retrieval +2
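Once the two encoders above have mapped source and binary functions into a shared space, retrieval reduces to ranking by similarity. The sketch below is a minimal illustration of that retrieval step only, with random vectors standing in for the DPCNN and GNN embeddings; it is not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(1)
n_funcs, dim = 5, 32

# Stand-ins for encoder outputs: in CodeCMR a DPCNN embeds source code and a
# GNN embeds binary code. Here each binary embedding is a noisy copy of its
# matching source embedding, simulating a well-trained shared space.
src_emb = rng.normal(size=(n_funcs, dim))
bin_emb = src_emb + 0.1 * rng.normal(size=(n_funcs, dim))

def l2_normalize(x):
    return x / np.linalg.norm(x, axis=-1, keepdims=True)

src_n, bin_n = l2_normalize(src_emb), l2_normalize(bin_emb)

# Cross-modal retrieval: for each binary function, rank all source
# functions by cosine similarity in the shared embedding space.
sim = bin_n @ src_n.T                  # shape (n_binary, n_source)
ranks = np.argsort(-sim, axis=1)

top1 = ranks[:, 0]
recall_at_1 = float(np.mean(top1 == np.arange(n_funcs)))
print(top1, recall_at_1)
```

In practice the two encoders are trained jointly (e.g. with a ranking or triplet objective) so that matched source/binary pairs end up nearby, which is what the noisy-copy assumption simulates here.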

Sliced Recurrent Neural Networks

3 code implementations · COLING 2018 · Zeping Yu, Gongshen Liu

In this paper, we introduce sliced recurrent neural networks (SRNNs), which can be parallelized by slicing the sequences into many subsequences.

Sentiment Analysis
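The slicing idea can be sketched in a few lines: cut the sequence into subsequences, run an RNN over each independently (those runs have no mutual dependency, so they can execute in parallel), then run a higher-level RNN over the per-slice states. This is a simplified one-level illustration with shared weights, not the paper's multi-level architecture.

```python
import numpy as np

rng = np.random.default_rng(2)
seq_len, n_slices, d_in, d_h = 16, 4, 8, 8   # d_in == d_h so weights can be reused

x = rng.normal(size=(seq_len, d_in))
W_x = 0.1 * rng.normal(size=(d_in, d_h))
W_h = 0.1 * rng.normal(size=(d_h, d_h))

def rnn(inputs):
    """Minimal tanh RNN; returns the final hidden state."""
    h = np.zeros(d_h)
    for t in inputs:
        h = np.tanh(t @ W_x + h @ W_h)
    return h

# Slice the sequence into equal subsequences; each slice's RNN has no
# dependency on the others, so these calls could run in parallel.
slices = x.reshape(n_slices, seq_len // n_slices, d_in)
slice_states = np.stack([rnn(s) for s in slices])   # (n_slices, d_h)

# A higher-level RNN combines the per-slice states, so the serial path is
# n_slices + slice_length steps rather than seq_len steps.
# (Weights are shared with the lower level here purely for brevity.)
final_state = rnn(slice_states)
print(final_state.shape)
```

The payoff is the shorter serial dependency chain: with one level of slicing, roughly `seq_len / n_slices + n_slices` sequential steps replace `seq_len`.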
