Search Results for author: Tobias Falke

Found 12 papers, 3 papers with code

Fast Concept Mention Grouping for Concept Map-based Multi-Document Summarization

1 code implementation · NAACL 2019 · Tobias Falke, Iryna Gurevych

Concept map-based multi-document summarization has recently been proposed as a variant of the traditional summarization task with graph-structured summaries.

Clustering · Document Summarization +1

GraphDocExplore: A Framework for the Experimental Comparison of Graph-based Document Exploration Techniques

no code implementations · EMNLP 2017 · Tobias Falke, Iryna Gurevych

Many techniques have been suggested to automatically extract different types of graphs, showing, for example, entities or concepts and the relationships between them.

Navigate

Data-Efficient Paraphrase Generation to Bootstrap Intent Classification and Slot Labeling for New Features in Task-Oriented Dialog Systems

no code implementations · COLING 2020 · Shailza Jolly, Tobias Falke, Caglar Tirkaz, Daniil Sorokin

Recent progress through advanced neural models has pushed the performance of task-oriented dialog systems to almost perfect accuracy on existing benchmark datasets for intent classification and slot labeling.

Intent Classification +1

Leveraging User Paraphrasing Behavior In Dialog Systems To Automatically Collect Annotations For Long-Tail Utterances

no code implementations · COLING 2020 · Tobias Falke, Markus Boese, Daniil Sorokin, Caglar Tirkaz, Patrick Lehnen

In large-scale commercial dialog systems, users express the same request in a wide variety of ways, with a long tail of less frequent alternatives.

Feedback Attribution for Counterfactual Bandit Learning in Multi-Domain Spoken Language Understanding

no code implementations · EMNLP 2021 · Tobias Falke, Patrick Lehnen

With counterfactual bandit learning, models can be trained on the positive and negative feedback received for historical predictions, with no labeled data needed (see the sketch after this entry's tags).

counterfactual · Multi-agent Reinforcement Learning +2
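
The paper's specific contribution is feedback attribution across domains, which the snippet above only hints at. As background on the general setup it builds on, here is a minimal PyTorch sketch of the standard inverse-propensity-scored (IPS) objective for counterfactual bandit learning from logged feedback; the function name, shapes, and clipping constants are illustrative assumptions, not code from the paper.

```python
import torch
import torch.nn.functional as F

def ips_loss(logits, actions, rewards, propensities):
    """Reward-weighted, propensity-corrected log-likelihood: a standard IPS-style
    surrogate for counterfactual bandit learning (hypothetical sketch, not the
    paper's feedback-attribution method).

    logits:       (batch, num_actions) scores of the current model
    actions:      (batch,) long tensor, action chosen by the logging policy
    rewards:      (batch,) observed feedback for that action (e.g. 1.0 / 0.0)
    propensities: (batch,) probability the logging policy assigned to the action
    """
    log_probs = F.log_softmax(logits, dim=-1)
    # Log-probability the current model assigns to the historically chosen action.
    chosen = log_probs.gather(1, actions.unsqueeze(1)).squeeze(1)
    # Weight each logged example by reward / propensity; clip to control variance.
    weights = (rewards / propensities.clamp(min=1e-3)).clamp(max=10.0)
    return -(weights * chosen).mean()
```

Positive feedback pulls probability mass toward the logged action, zero feedback leaves it untouched, and the propensity correction de-biases the estimate with respect to the logging policy; clipping the weights is a common variance-control heuristic.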

Recipes for Sequential Pre-training of Multilingual Encoder and Seq2Seq Models

no code implementations · 14 Jun 2023 · Saleh Soltan, Andy Rosenbaum, Tobias Falke, Qin Lu, Anna Rumshisky, Wael Hamza

(2) Conversely, using an encoder to warm-start seq2seq training, we show that by unfreezing the encoder partway through training we can match the task performance of a from-scratch seq2seq model (see the sketch below).

Language Modelling · Masked Language Modeling
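
The warm-start recipe in the snippet above (freeze the pretrained encoder, then unfreeze it partway through training) can be sketched in a few lines of Python. This assumes a HuggingFace-style interface where `model.encoder` exposes the encoder and `model(**batch)` returns an output with a `.loss` field; the attribute names and the unfreeze step are illustrative assumptions, not taken from the paper.

```python
def train_with_warm_start(model, train_loader, optimizer, unfreeze_step=10_000):
    """Warm-start seq2seq training from a pretrained encoder, unfreezing the
    encoder partway through training (hypothetical sketch of the recipe)."""
    # Phase 1: keep the pretrained encoder frozen so only the decoder adapts.
    for p in model.encoder.parameters():
        p.requires_grad = False

    for step, batch in enumerate(train_loader):
        if step == unfreeze_step:
            # Phase 2: unfreeze the encoder for the remainder of training.
            for p in model.encoder.parameters():
                p.requires_grad = True
        loss = model(**batch).loss
        loss.backward()
        optimizer.step()
        optimizer.zero_grad()
```

An optimizer constructed over all parameters handles both phases: frozen parameters simply receive no gradients until they are unfrozen.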
