Prompt Agnostic Essay Scorer: A Domain Generalization Approach to Cross-prompt Automated Essay Scoring

4 Aug 2020  ·  Robert Ridley, Liang He, Xin-yu Dai, Shu-Jian Huang, Jia-Jun Chen

Cross-prompt automated essay scoring (AES) requires a system to score essays written for a target prompt using only essays from other, non-target prompts. Since obtaining a large quantity of pre-graded essays for a particular prompt is often difficult and unrealistic, cross-prompt AES is vital for the development of real-world AES systems, yet it remains an under-explored area of research. Models designed for prompt-specific AES rely heavily on prompt-specific knowledge and perform poorly in the cross-prompt setting, whereas current approaches to cross-prompt AES either require a certain quantity of labelled target-prompt essays or require a large quantity of unlabelled target-prompt essays for multi-step transfer learning. To address these issues, we introduce the Prompt Agnostic Essay Scorer (PAES) for cross-prompt AES. Our method requires no access to labelled or unlabelled target-prompt data during training and is a single-stage approach. PAES is easy to apply in practice and achieves state-of-the-art performance on the Automated Student Assessment Prize (ASAP) dataset.
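
As a rough illustration of the cross-prompt setting described above, the sketch below runs a leave-one-prompt-out evaluation over an ASAP-style table of essays: for each target prompt, a model is trained only on essays from the other prompts and then evaluated on the held-out prompt, so no target-prompt data is seen during training. The file name, column names, feature set, and regressor are illustrative assumptions rather than the paper's PAES model; quadratic weighted kappa is used because it is the standard metric on ASAP.

```python
# Minimal sketch of leave-one-prompt-out (cross-prompt) evaluation.
# The CSV, column names, and model below are hypothetical placeholders,
# not the PAES architecture from the paper.
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import cohen_kappa_score

# Assumed ASAP-style table: one row per essay with a prompt id,
# prompt-agnostic numeric features, and a human-assigned score.
essays = pd.read_csv("asap_features.csv")
feature_cols = [c for c in essays.columns if c.startswith("feat_")]

kappas = {}
for target_prompt in sorted(essays["prompt_id"].unique()):
    # Train only on essays from non-target prompts; the target prompt
    # contributes neither labelled nor unlabelled data at training time.
    train = essays[essays["prompt_id"] != target_prompt]
    test = essays[essays["prompt_id"] == target_prompt]

    model = GradientBoostingRegressor()
    model.fit(train[feature_cols], train["score"])

    # Round regression outputs to integer scores before computing
    # quadratic weighted kappa on the held-out prompt.
    preds = model.predict(test[feature_cols]).round().astype(int)
    kappas[target_prompt] = cohen_kappa_score(
        test["score"], preds, weights="quadratic"
    )

print(kappas)
```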

Datasets

Automated Student Assessment Prize (ASAP)
