ACM-CR: A Manually Annotated Test Collection for Citation Recommendation

17 Aug 2021 · Florian Boudin

Citation recommendation is intended to assist researchers in the process of searching for relevant papers to cite by recommending appropriate citations for a given input text. Existing test collections for this task are noisy and unreliable since they are built automatically from parsed PDF papers. In this paper, we present our ongoing effort at creating a publicly available, manually annotated test collection for citation recommendation. We also conduct a series of experiments to evaluate the effectiveness of content-based baseline models on the test collection, providing results for future work to improve upon. Our test collection and code to replicate experiments are available at https://github.com/boudinfl/acm-cr.
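To make the task concrete, below is a minimal sketch of one possible content-based baseline: BM25 retrieval over candidate papers' text, given a citing sentence as the query. This is an illustrative example, not the authors' exact setup; it assumes the `rank_bm25` package and uses a small hypothetical candidate pool and naive tokenization.

```python
from rank_bm25 import BM25Okapi

# Hypothetical candidate pool: each entry is a paper id plus its
# title/abstract text (the "content" in a content-based baseline).
candidates = [
    {"id": "P1", "text": "Neural citation recommendation with context encoders"},
    {"id": "P2", "text": "Probabilistic models for ad hoc information retrieval"},
    {"id": "P3", "text": "A survey of automatic keyphrase extraction methods"},
]

# Naive lowercase whitespace tokenization; a real system would use a
# proper tokenizer and stopword removal.
def tokenize(text):
    return text.lower().split()

bm25 = BM25Okapi([tokenize(c["text"]) for c in candidates])

def recommend(citation_context, k=2):
    """Rank candidate papers by BM25 score against the citing sentence."""
    scores = bm25.get_scores(tokenize(citation_context))
    ranked = sorted(zip(candidates, scores), key=lambda pair: pair[1], reverse=True)
    return [(c["id"], score) for c, score in ranked[:k]]

if __name__ == "__main__":
    print(recommend("We build on prior work in context-aware citation recommendation."))
```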
