Unsupervised Abstractive Meeting Summarization with Multi-Sentence Compression and Budgeted Submodular Maximization

We introduce a novel graph-based framework for abstractive meeting speech summarization that is fully unsupervised and does not rely on any annotations. Our work combines the strengths of multiple recent approaches while addressing their weaknesses. Moreover, we leverage recent advances in word embeddings and graph degeneracy applied to NLP to take external semantic knowledge into account, and to design custom diversity and informativeness measures. Experiments on the AMI and ICSI corpora show that our system improves on the state of the art. Code and data are publicly available, and our system can be interactively tested.
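The final summary is built by selecting candidate sentences (generated via multi-sentence compression) under a word budget so as to maximize a submodular objective combining informativeness and diversity. The sketch below is not the authors' implementation; it illustrates the standard cost-scaled greedy algorithm for budgeted submodular maximization, with a placeholder scoring function and hypothetical names.

```python
# Minimal sketch (assumption, not the paper's code): greedy selection for
# budgeted submodular maximization. The scoring function f is a stand-in for
# the paper's custom informativeness/diversity objective.

def greedy_budgeted_submodular(candidates, cost, f, budget, r=1.0):
    """Cost-scaled greedy selection.

    candidates: list of candidate summary sentences
    cost:       dict mapping a sentence to its word cost
    f:          monotone submodular set function scoring a summary (list of sentences)
    budget:     maximum total cost of the summary
    r:          scaling exponent on the cost in the greedy ratio

    Note: the full algorithm also compares the greedy solution against the best
    single affordable candidate; that step is omitted here for brevity.
    """
    selected, remaining, total_cost = [], list(candidates), 0
    while remaining:
        # pick the remaining candidate with the best marginal-gain-to-cost ratio
        best, best_ratio = None, float("-inf")
        for s in remaining:
            gain = f(selected + [s]) - f(selected)
            ratio = gain / (cost[s] ** r)
            if ratio > best_ratio:
                best, best_ratio = s, ratio
        remaining.remove(best)
        # add it only if it fits the budget and does not hurt the objective
        if total_cost + cost[best] <= budget and f(selected + [best]) >= f(selected):
            selected.append(best)
            total_cost += cost[best]
    return selected


if __name__ == "__main__":
    cands = ["the committee approved the budget",
             "the budget was approved after a long debate",
             "lunch was served at noon"]
    costs = {s: len(s.split()) for s in cands}
    # toy objective: number of distinct words covered by the summary
    f = lambda S: len({w for s in S for w in s.split()})
    print(greedy_budgeted_submodular(cands, costs, f, budget=10))
```

With the toy coverage objective above, the greedy procedure favors sentences that add many new words per unit of cost, which mirrors how a budgeted summarizer trades off informativeness against summary length.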

Task                   Dataset               Model  Metric      Metric Value  Global Rank
Meeting Summarization  AMI Meeting Corpus    UNS    ROUGE-1 F1  37.53         1
Meeting Summarization  ICSI Meeting Corpus   UNS    ROUGE-1 F1  34.11         1
