Automated Essay Scoring

5 papers with code • 0 benchmarks • 0 datasets

Automated Essay Scoring (AES) is the task of assigning a score to an essay, usually to assess the writing ability of a language learner. Essay quality is commonly judged along four primary dimensions: topic relevance, organization and coherence, word usage and sentence complexity, and grammar and mechanics.

Source: A Joint Model for Multimodal Document Quality Assessment
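To make the task concrete, here is a minimal sketch of a feature-based essay scorer. It is not taken from any of the papers listed below; the toy essays, the human-assigned scores, and the choice of TF-IDF features with ridge regression are all illustrative assumptions, standing in for the richer features or neural models used in practice.

```python
# Minimal illustrative AES sketch: TF-IDF features + ridge regression.
# Data and feature choices are hypothetical placeholders.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import Ridge
from sklearn.pipeline import make_pipeline

train_essays = [
    "The experiment shows that plants grow faster with more light.",
    "plant grow light more faster because sun",
]
train_scores = [5.0, 2.0]  # hypothetical human-assigned scores

model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),  # crude proxy for word usage
    Ridge(alpha=1.0),                     # regression onto the score scale
)
model.fit(train_essays, train_scores)

print(model.predict(["More light makes the plants grow faster."]))
```

A real system would replace the two toy essays with a scored corpus (e.g. prompt-specific training essays) and the linear model with features or networks that also capture coherence and grammar.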

Greatest papers with code

Co-Attention Based Neural Network for Source-Dependent Essay Scoring

Rokeer/co-attention WS 2018

This paper presents an investigation of using a co-attention based neural network for source-dependent essay scoring.

Automated Essay Scoring

Automated Essay Scoring based on Two-Stage Learning

ustcljw/fupugec-score 23 Jan 2019

Current state-of-the-art feature-engineered and end-to-end Automated Essay Scoring (AES) methods have been shown to be unable to detect adversarial samples, e.g. essays composed of permuted sentences and prompt-irrelevant essays.
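A short sketch of the adversarial check described above: shuffle an essay's sentences and compare the scores before and after. The `score_essay` function here is a hypothetical stand-in for any trained AES model, not the method from this paper.

```python
# Sketch: build a permuted-sentence adversarial sample and compare scores.
import random

def permute_sentences(essay: str) -> str:
    """Return the essay with its sentences randomly shuffled."""
    sentences = [s.strip() for s in essay.split(".") if s.strip()]
    random.shuffle(sentences)
    return ". ".join(sentences) + "."

def score_essay(essay: str) -> float:
    # Placeholder scorer (word count / 10); a real AES model goes here.
    return len(essay.split()) / 10.0

original = "First we state the claim. Then we give evidence. Finally we conclude."
shuffled = permute_sentences(original)

# A robust scorer should penalize the incoherent, shuffled version;
# this length-based placeholder (like many real models) does not.
print(score_essay(original), score_essay(shuffled))
```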

Automated Essay Scoring

EXPATS: A Toolkit for Explainable Automated Text Scoring

octanove/expats 7 Apr 2021

Automated text scoring (ATS) tasks, such as automated essay scoring and readability assessment, are important educational applications of natural language processing.

Automated Essay Scoring

Neural Automated Essay Scoring and Coherence Modeling for Adversarially Crafted Input

Youmna-H/Coherence_AES NAACL 2018

We demonstrate that current state-of-the-art approaches to Automated Essay Scoring (AES) are not well-suited to capturing adversarially crafted input of grammatical but incoherent sequences of sentences.

Automated Essay Scoring