Search Results for author: Changran Hu

Found 6 papers, 2 papers with code

SambaLingo: Teaching Large Language Models New Languages

no code implementations • 8 Apr 2024 • Zoltan Csaki, Bo Li, Jonathan Li, Qiantong Xu, Pian Pawakapan, Leon Zhang, Yun Du, Hengyu Zhao, Changran Hu, Urmish Thakker

In this paper, we present a comprehensive investigation into the adaptation of LLMs to new languages.

On the Tool Manipulation Capability of Open-source Large Language Models

1 code implementation • 25 May 2023 • Qiantong Xu, Fenglu Hong, Bo Li, Changran Hu, Zhengyu Chen, Jian Zhang

In this paper, we ask whether we can enhance open-source LLMs to be competitive with leading closed LLM APIs in tool manipulation, with a practical amount of human supervision.

Programming Language Agnostic Mining of Code and Language Pairs with Sequence Labeling Based Question Answering

no code implementations • 21 Mar 2022 • Changran Hu, Akshara Reddi Methukupalli, Yutong Zhou, Chen Wu, Yubo Chen

In particular, we propose to apply the BIO tagging scheme instead of the conventional binary scheme to mine code solutions, which are often composed of multiple blocks within a post.
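The advantage of BIO over a binary in/out scheme is that the B tag marks where each span begins, so two adjacent code blocks in the same post remain separable. A minimal, self-contained illustration of decoding BIO tags (not the paper's model; token and tag values are made up for the example):

```python
# BIO tagging: B = beginning of a span, I = inside a span, O = outside.
def decode_bio(tokens, tags):
    """Recover labeled spans from parallel token/tag sequences."""
    spans, current = [], []
    for token, tag in zip(tokens, tags):
        if tag == "B":                    # a new span starts here
            if current:
                spans.append(current)
            current = [token]
        elif tag == "I" and current:      # continue the open span
            current.append(token)
        else:                             # O: close any open span
            if current:
                spans.append(current)
            current = []
    if current:
        spans.append(current)
    return spans

tokens = ["Use", "pd.read_csv", "(", "path", ")", "then", "df.head", "(", ")"]
tags   = ["O",   "B",           "I", "I",    "I", "O",    "B",       "I", "I"]
print(decode_bio(tokens, tags))
# → [['pd.read_csv', '(', 'path', ')'], ['df.head', '(', ')']]
```

Under a binary scheme, a B immediately following an I would be indistinguishable from a continuation, and the two spans above could merge into one.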

Question Answering

Jointly Extracting Explicit and Implicit Relational Triples with Reasoning Pattern Enhanced Binary Pointer Network

no code implementations • NAACL 2021 • Yubo Chen, Yunqi Zhang, Changran Hu, Yongfeng Huang

To explore entity pairs that may be implicitly connected by relations, we propose a binary pointer network to extract overlapping relational triples relevant to each word sequentially and retain the information of previously extracted triples in an external memory.

graph construction • Implicit Relations • +5

Generating Code with the Help of Retrieved Template Functions and Stack Overflow Answers

no code implementations • 12 Apr 2021 • Dawn Drain, Changran Hu, Chen Wu, Mikhail Breslav, Neel Sundaresan

To demonstrate the effectiveness of our model designs, we perform extensive experiments with CodeSearchNet which contains template functions and CoNaLa which contains Stack Overflow intent-snippet pairs.

Code Search • Retrieval

A Hierarchical Recurrent Neural Network for Symbolic Melody Generation

2 code implementations • 14 Dec 2017 • Jian Wu, Changran Hu, Yulong Wang, Xiaolin Hu, Jun Zhu

In this paper, we present a hierarchical recurrent neural network for melody generation, which consists of three Long Short-Term Memory (LSTM) subnetworks working in a coarse-to-fine manner along time.
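The coarse-to-fine idea can be sketched structurally: a coarse level sets a per-bar profile, a middle level fills in a rhythm conditioned on that profile, and the finest level emits pitches. The stand-in generators below are random choices, not the paper's trained LSTM subnetworks, and all names and parameters are illustrative:

```python
import random

random.seed(0)

def generate_bar_profile(n_bars):
    # Coarsest level: one intensity value per bar.
    return [random.choice(["low", "high"]) for _ in range(n_bars)]

def generate_beat_profile(bar, beats_per_bar=4):
    # Middle level: rhythmic on/off pattern conditioned on the bar profile.
    density = 0.8 if bar == "high" else 0.4
    return [random.random() < density for _ in range(beats_per_bar)]

def generate_notes(beats, scale=(60, 62, 64, 65, 67, 69, 71)):
    # Finest level: a MIDI pitch (C-major scale) per sounding beat, rest otherwise.
    return [random.choice(scale) if on else None for on in beats]

def generate_melody(n_bars=4):
    # Each finer level is conditioned on the output of the coarser one.
    melody = []
    for bar in generate_bar_profile(n_bars):
        beats = generate_beat_profile(bar)
        melody.extend(generate_notes(beats))
    return melody

print(generate_melody())
```

The point of the hierarchy is that long-range structure (which bars are busy) is decided before local detail (which notes sound), rather than asking a single flat sequence model to track both scales at once.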

Sound • Multimedia
