
Finding the Answers with Definition Models

Inspired by a previous attempt to answer crossword questions using neural networks (Hill, Cho, Korhonen, & Bengio, 2015), this dissertation implements extensions to that existing definition model in order to improve its performance on the task of answering crossword questions. A discussion and evaluation of the original implementation identifies several ways in which the recurrent neural model could be extended. Insights from the related fields of neural language modelling and neural machine translation provide the justification and the means for these extensions. Two extensions are applied to the LSTM encoder: first, averaging the LSTM states across the sequence, and second, using a bidirectional LSTM. Both extensions improve model performance on the definitions and crossword test sets. To improve performance on crossword questions specifically, the training data is augmented with crossword questions and answers, which improves results on definitions as well as crossword questions. The final experiments use sub-word unit segmentation, first on the source side; preliminary experiments then extend this to character-level output. Initially, an exact reproduction of the baseline results proves unsuccessful. Despite this, the extensions improve performance, allowing the definition model to surpass the recurrent neural network variants of the previous work (Hill et al., 2015).
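
The two encoder extensions mentioned in the abstract could look roughly like the following sketch, assuming a PyTorch implementation; the class name, dimensions, and cosine-similarity ranking step are illustrative and are not the dissertation's actual code.

```python
# Minimal sketch of the two encoder extensions described in the abstract:
# a bidirectional LSTM over the clue/definition whose hidden states are
# averaged across the sequence, then projected into the answer-embedding
# space and ranked against candidate answer embeddings.
# All names and hyperparameters here are assumptions, not the original code.
import torch
import torch.nn as nn


class DefinitionEncoder(nn.Module):
    def __init__(self, vocab_size, emb_dim=256, hidden_dim=512, answer_dim=300):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Extension 2: bidirectional LSTM instead of a unidirectional encoder.
        self.lstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True,
                            bidirectional=True)
        # Map the pooled state into the same space as the answer embeddings.
        self.proj = nn.Linear(2 * hidden_dim, answer_dim)

    def forward(self, tokens):                      # tokens: (batch, seq_len)
        states, _ = self.lstm(self.embed(tokens))   # (batch, seq_len, 2*hidden)
        # Extension 1: average the LSTM states across the sequence
        # rather than using only the final state.
        pooled = states.mean(dim=1)
        return self.proj(pooled)                    # (batch, answer_dim)


def rank_answers(clue_vec, answer_embeddings):
    """Rank candidate answers by cosine similarity to the encoded clue."""
    scores = nn.functional.cosine_similarity(
        clue_vec.unsqueeze(1), answer_embeddings.unsqueeze(0), dim=-1)
    return scores.argsort(dim=-1, descending=True)
```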
