Search Results for author: Ninglu Shao

Found 4 papers, 3 papers with code

Extensible Embedding: A Flexible Multiplier For LLM's Context Length

no code implementations • 18 Feb 2024 • Ninglu Shao, Shitao Xiao, Zheng Liu, Peitian Zhang

2) Strong sample efficiency of training, which enables the embedding model to be learned in a cost-effective way.

Language Modelling

Flexibly Scaling Large Language Models Contexts Through Extensible Tokenization

1 code implementation • 15 Jan 2024 • Ninglu Shao, Shitao Xiao, Zheng Liu, Peitian Zhang

Extensible Tokenization stands as a midware between the tokenized context and the LLM, transforming the raw token embeddings into extensible embeddings (a hedged sketch of this idea follows this entry).

Few-Shot Learning • Language Modelling
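To make the "midware" description above concrete, here is a minimal, hypothetical sketch of the general idea: raw token embeddings are compressed into a smaller set of "extensible embeddings" at a chosen scaling factor before being handed to the LLM. The class name, pooling strategy, sizes, and projection layer are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch only: compress raw token embeddings into fewer
# "extensible embeddings" before the LLM attends over them. All names
# and design choices here are assumptions for illustration.
import torch
import torch.nn as nn


class ExtensibleTokenizerSketch(nn.Module):
    def __init__(self, hidden_size: int = 4096, scaling_factor: int = 8):
        super().__init__()
        self.scaling_factor = scaling_factor
        # Stand-in for whatever learned transformation maps pooled raw
        # embeddings to extensible embeddings.
        self.proj = nn.Linear(hidden_size, hidden_size)

    def forward(self, token_embeddings: torch.Tensor) -> torch.Tensor:
        # token_embeddings: [batch, seq_len, hidden_size]
        b, t, h = token_embeddings.shape
        # Pad so the sequence length divides evenly by the scaling factor.
        pad = (-t) % self.scaling_factor
        if pad:
            token_embeddings = nn.functional.pad(token_embeddings, (0, 0, 0, pad))
        # Group every `scaling_factor` raw embeddings and mean-pool them,
        # shrinking the sequence the LLM has to process.
        grouped = token_embeddings.view(b, -1, self.scaling_factor, h)
        compressed = grouped.mean(dim=2)
        return self.proj(compressed)  # [batch, ceil(t / scaling_factor), hidden]


# Usage: 4096 raw token embeddings become 512 extensible embeddings.
emb = torch.randn(1, 4096, 4096)
out = ExtensibleTokenizerSketch()(emb)
print(out.shape)  # torch.Size([1, 512, 4096])
```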

Soaring from 4K to 400K: Extending LLM's Context with Activation Beacon

1 code implementation • 7 Jan 2024 • Peitian Zhang, Zheng Liu, Shitao Xiao, Ninglu Shao, Qiwei Ye, Zhicheng Dou

Although the context window can be extended through fine-tuning, doing so incurs a considerable cost at both training and inference time and exerts an unfavorable impact on the LLM's original capabilities.

Language Modelling

Uncovering ChatGPT's Capabilities in Recommender Systems

1 code implementation • 3 May 2023 • Sunhao Dai, Ninglu Shao, Haiyuan Zhao, Weijie Yu, Zihua Si, Chen Xu, Zhongxiang Sun, Xiao Zhang, Jun Xu

The debut of ChatGPT has recently attracted the attention of the natural language processing (NLP) community and beyond.

Explainable Recommendation • Information Retrieval • +2
