NeuroLogic A*esque Decoding: Constrained Text Generation with Lookahead Heuristics

The dominant paradigm for neural text generation is left-to-right decoding from autoregressive language models. Constrained or controllable generation under complex lexical constraints, however, requires foresight to plan ahead for feasible future paths. Drawing inspiration from the A* search algorithm, we propose NeuroLogic A*esque, a decoding algorithm that incorporates heuristic estimates of future cost. We develop lookahead heuristics that are efficient for large-scale language models, making our method a drop-in replacement for common techniques such as beam search and top-k sampling. To enable constrained generation, we build on NeuroLogic decoding (Lu et al., 2021), combining its flexibility in incorporating logical constraints with A*esque estimates of future constraint satisfaction. Our approach outperforms competitive baselines on five generation tasks, and achieves new state-of-the-art performance on table-to-text generation, constrained machine translation, and keyword-constrained generation. The improvements are particularly notable on tasks that require complex constraint satisfaction or in few-shot or zero-shot settings. NeuroLogic A*esque illustrates the power of decoding for improving and enabling new capabilities of large-scale language models.
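The core idea above can be sketched in code: when choosing the next token, combine the model's log-probability with a lookahead bonus estimating whether a cheap greedy rollout from that token can still satisfy a lexical constraint. The sketch below is illustrative only, using a hand-built toy bigram model; the names (`TOY_LM`, `astaresque_step`, `decode`) and the specific bonus scheme are assumptions, not the paper's released implementation.

```python
import math

# Toy bigram "language model": P(next token | previous token).
# Purely illustrative; a real system would use a neural LM's logits.
TOY_LM = {
    "<s>":  {"the": 0.6, "a": 0.4},
    "the":  {"cat": 0.5, "dog": 0.5},
    "a":    {"dog": 0.7, "bird": 0.3},
    "cat":  {"sat": 0.8, "ran": 0.2},
    "dog":  {"ran": 0.9, "sat": 0.1},
    "bird": {"flew": 1.0},
    "sat":  {"</s>": 1.0},
    "ran":  {"</s>": 1.0},
    "flew": {"</s>": 1.0},
}

def greedy_rollout(token, depth):
    """Greedily continue from `token` for up to `depth` steps (the cheap lookahead)."""
    path = []
    for _ in range(depth):
        if token not in TOY_LM:
            break
        token = max(TOY_LM[token], key=TOY_LM[token].get)
        path.append(token)
    return path

def astaresque_step(prev, constraint, depth=3, lam=1.0):
    """Score each candidate token by log P(token | prev) plus a lookahead
    heuristic: a bonus of `lam` if a greedy rollout reaches the constraint word.
    This mirrors A*: actual cost so far plus an estimate of future cost."""
    scored = []
    for tok, p in TOY_LM.get(prev, {}).items():
        future = [tok] + greedy_rollout(tok, depth)
        bonus = lam if constraint in future else 0.0
        scored.append((math.log(p) + bonus, tok))
    return max(scored)  # (best score, best token)

def decode(constraint, max_len=5):
    """Greedy decoding where each step is guided by the lookahead heuristic."""
    out, prev = [], "<s>"
    for _ in range(max_len):
        _, prev = astaresque_step(prev, constraint)
        if prev == "</s>":
            break
        out.append(prev)
    return out
```

For example, `decode("dog")` steers toward "a dog ran" even though "the" is the locally higher-probability first token, because the lookahead sees that only the "a" branch greedily reaches the constraint word. A real implementation would amortize the rollouts over a beam and batch them through the LM.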

PDF | Abstract | NAACL 2022

Results from the Paper

Task: Text Generation — Dataset: ROCStories

Model                            Perplexity (rank)   BLEU-1 (rank)
Beam search                      2.24 (# 4)          33.7 (# 4)
Beam search + A*esque (greedy)   2.11 (# 1)          34.3 (# 3)
Beam search + A*esque (beam)     2.14 (# 2)          34.4 (# 1)
Beam search + A*esque (sample)   2.16 (# 3)          34.4 (# 1)
