Search Results for author: Arash Eshghi

Found 21 papers, 4 papers with code

A Study of Automatic Metrics for the Evaluation of Natural Language Explanations

1 code implementation EACL 2021 Miruna Clinciu, Arash Eshghi, Helen Hastie

As transparency becomes key for robotics and AI, it will be necessary to evaluate the methods through which transparency is provided, including automatically generated natural language (NL) explanations.

Text Generation

A Comprehensive Evaluation of Incremental Speech Recognition and Diarization for Conversational AI

2 code implementations COLING 2020 Angus Addlesee, Yanchao Yu, Arash Eshghi

Automatic Speech Recognition (ASR) systems are increasingly powerful and accurate, but also more numerous, with several options currently available as a service (e.g. Google, IBM, and Microsoft).

Automatic Speech Recognition Speaker Diarization +1

Benchmarking Natural Language Understanding Services for building Conversational Agents

4 code implementations 13 Mar 2019 Xingkun Liu, Arash Eshghi, Pawel Swietojanski, Verena Rieser

We have recently seen the emergence of several publicly available Natural Language Understanding (NLU) toolkits, which map user utterances to structured, but more abstract, Dialogue Act (DA) or Intent specifications, while making this process accessible to the lay developer.

General Classification Intent Classification +1

Multi-Task Learning for Domain-General Spoken Disfluency Detection in Dialogue Systems

no code implementations 8 Oct 2018 Igor Shalyminov, Arash Eshghi, Oliver Lemon

To test the model's generalisation potential, we evaluate the same model on the bAbI+ dataset, without any additional training.

Multi-Task Learning

The BURCHAK corpus: a Challenge Data Set for Interactive Learning of Visually Grounded Word Meanings

no code implementations WS 2017 Yanchao Yu, Arash Eshghi, Gregory Mills, Oliver Joseph Lemon

We motivate and describe a new freely available human-human dialogue dataset for interactive learning of visually grounded word meanings through ostensive definition by a tutor to a learner.

Learning how to learn: an adaptive dialogue agent for incrementally learning visually grounded word meanings

no code implementations WS 2017 Yanchao Yu, Arash Eshghi, Oliver Lemon

We present an optimised multi-modal dialogue agent for interactive learning of visually grounded word meanings from a human tutor, trained on real human-human tutoring data.

Training an adaptive dialogue policy for interactive learning of visually grounded word meanings

no code implementations WS 2016 Yanchao Yu, Arash Eshghi, Oliver Lemon

We present a multi-modal dialogue system for interactive learning of perceptually grounded word meanings from a human tutor.

Semantic Parsing

Challenging Neural Dialogue Models with Natural Data: Memory Networks Fail on Incremental Phenomena

1 code implementation 22 Sep 2017 Igor Shalyminov, Arash Eshghi, Oliver Lemon

Results show that the semantic accuracy of the MemN2N model drops drastically; and that although it is in principle able to learn to process the constructions in bAbI+, it needs an impractical amount of training data to do so.

VOILA: An Optimised Dialogue System for Interactively Learning Visually-Grounded Word Meanings (Demonstration System)

no code implementations WS 2017 Yanchao Yu, Arash Eshghi, Oliver Lemon

We present VOILA: an optimised, multi-modal dialogue agent for interactive learning of visually grounded word meanings from a human user.

Active Learning

Bootstrapping incremental dialogue systems: using linguistic knowledge to learn from minimal data

no code implementations 1 Dec 2016 Dimitrios Kalatzis, Arash Eshghi, Oliver Lemon

We present a method for inducing new dialogue systems from very small amounts of unannotated dialogue data, showing how word-level exploration using Reinforcement Learning (RL), combined with an incremental, semantic grammar, Dynamic Syntax (DS), allows systems to discover, generate, and understand many new dialogue variants.

Dialogue Management Text Generation
