Point to the Expression: Solving Algebraic Word Problems using the Expression-Pointer Transformer Model

Solving algebraic word problems has recently emerged as an important natural language processing task. To solve algebraic word problems, recent studies have suggested neural models that generate solution equations using 'Op' (operator/operand) tokens as the unit of input/output. However, such neural models suffer from two issues: expression fragmentation and operand-context separation. To address these two issues, we propose a pure neural model, the Expression-Pointer Transformer (EPT), which uses (1) 'Expression' tokens and (2) operand-context pointers when generating solution equations. The performance of the EPT model is tested on three datasets: ALG514, DRAW-1K, and MAWPS. Compared to the state-of-the-art (SoTA) models, the EPT model achieved comparable accuracy on each of the three datasets: 81.3% on ALG514, 59.5% on DRAW-1K, and 84.5% on MAWPS. The contribution of this paper is two-fold: (1) we propose a pure neural model, EPT, which addresses both expression fragmentation and operand-context separation, and (2) the fully automatic EPT model, which does not use hand-crafted features, yields performance comparable to existing models that do use hand-crafted features, and outperforms existing pure neural models by up to 40%.
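To make the two ideas concrete, below is a minimal sketch of what an 'Expression' token with operand-context pointers could look like: each token bundles an operator with operands that point either back into the problem text (a quantity mentioned in context) or at the result of an earlier Expression. This is an illustrative reconstruction, not the paper's actual implementation; the names NumberRef, ExprRef, Expression, and evaluate are all hypothetical.

```python
from dataclasses import dataclass
from typing import List, Union

@dataclass(frozen=True)
class NumberRef:
    """Pointer to a quantity appearing in the problem text (operand-context pointer)."""
    index: int  # position among the quantities extracted from the problem

@dataclass(frozen=True)
class ExprRef:
    """Pointer to the result of a previously generated Expression token."""
    index: int  # index into the already-decoded Expression sequence

Operand = Union[NumberRef, ExprRef]

@dataclass(frozen=True)
class Expression:
    """One 'Expression' token: an operator together with its operands,
    so the operator and its operands are never fragmented apart."""
    op: str                  # e.g. '+', '-', '*', '/'
    operands: List[Operand]  # pointers, never literal copies of values

def evaluate(expressions: List[Expression], numbers: List[float]) -> float:
    """Evaluate a sequence of Expression tokens against the problem's numbers."""
    ops = {'+': lambda a, b: a + b, '-': lambda a, b: a - b,
           '*': lambda a, b: a * b, '/': lambda a, b: a / b}
    results: List[float] = []
    for expr in expressions:
        args = [numbers[o.index] if isinstance(o, NumberRef) else results[o.index]
                for o in expr.operands]
        results.append(ops[expr.op](*args))
    return results[-1]

# "Tom has 3 apples and buys 2 bags of 4 apples each. How many apples?"
# Quantities extracted from context: [3, 2, 4]
numbers = [3.0, 2.0, 4.0]
program = [
    Expression('*', [NumberRef(1), NumberRef(2)]),  # 2 * 4
    Expression('+', [NumberRef(0), ExprRef(0)]),    # 3 + (2 * 4)
]
print(evaluate(program, numbers))  # 11.0
```

In this framing, expression fragmentation is avoided because an operator and its operands travel together in one token, and operand-context separation is avoided because operands point back to where the quantity occurs in the problem text rather than being detached symbols.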


Results from the Paper

Task                        Dataset   Model   Metric         Value   Global Rank
Math Word Problem Solving   ALG514    EPT     Accuracy (%)   81.31   # 3
Math Word Problem Solving   DRAW-1K   EPT     Accuracy (%)   59.5    # 3
Math Word Problem Solving   MAWPS     EPT     Accuracy (%)   84.51   # 13
