Search Results for author: Sam Thomson

Found 19 papers, 11 papers with code

Online Semantic Parsing for Latency Reduction in Task-Oriented Dialogue

no code implementations • ACL 2022 • Jiawei Zhou, Jason Eisner, Michael Newman, Emmanouil Antonios Platanios, Sam Thomson

Standard conversational semantic parsing maps a complete user utterance into an executable program, after which the program is executed to respond to the user.
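
For context, a minimal sketch of that standard parse-then-execute pipeline (the callables `parse` and `execute` are placeholders for this illustration, not the paper's API), showing why latency is paid only after the full utterance has been parsed:

```python
# Minimal sketch of the offline pipeline described above; `parse` and `execute`
# are placeholder callables, not functions from the paper.

def respond(utterance, parse, execute):
    program = parse(utterance)   # parsing starts only once the utterance is complete
    return execute(program)      # execution follows, so the user waits for
                                 # parse time plus execution time
```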

Machine Translation • Semantic Parsing • +1

Value-Agnostic Conversational Semantic Parsing

no code implementations • ACL 2021 • Emmanouil Antonios Platanios, Adam Pauls, Subhro Roy, Yuchen Zhang, Alexander Kyte, Alan Guo, Sam Thomson, Jayant Krishnamurthy, Jason Wolfe, Jacob Andreas, Dan Klein

Conversational semantic parsers map user utterances to executable programs given dialogue histories composed of previous utterances, programs, and system responses.
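
As an illustration of the input structure described above, here is a small, hypothetical representation of a dialogue history (the class and function names are assumptions made for this sketch, not the paper's data format):

```python
from dataclasses import dataclass

@dataclass
class Turn:
    utterance: str   # what the user said
    program: str     # executable program predicted for that utterance
    response: str    # the system's reply after executing the program

def parser_context(history: list[Turn], new_utterance: str) -> str:
    """Flatten previous utterances, programs, and responses into the parser's input."""
    past = " ".join(f"{t.utterance} {t.program} {t.response}" for t in history)
    return f"{past} {new_utterance}".strip()
```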

Semantic Parsing

Rational Recurrences

1 code implementation • EMNLP 2018 • Hao Peng, Roy Schwartz, Sam Thomson, Noah A. Smith

We characterize this connection formally, defining rational recurrences to be recurrent hidden state update functions that can be written as the Forward calculation of a finite set of WFSAs.
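
A minimal NumPy sketch of that idea, assuming a simple input-dependent transition parameterization (not the paper's exact model): running the Forward algorithm of a WFSA left to right is itself a recurrent hidden-state update, with the vector of forward scores over automaton states playing the role of the hidden state.

```python
import numpy as np

def wfsa_forward(tokens, embed, W, start, final):
    """embed(token) -> d-dim vector; W: (d, k, k) input-dependent transition weights
    over k automaton states; start, final: (k,) start and final weights."""
    h = start.copy()                                # forward scores over states = hidden state
    for tok in tokens:
        T = np.einsum("d,dij->ij", embed(tok), W)   # token-dependent transition matrix
        h = h @ T                                   # one Forward step == one recurrent update
    return h @ final                                # total weight of all accepting paths
```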

Language Modelling • Text Classification

Bridging CNNs, RNNs, and Weighted Finite-State Machines

no code implementations • ACL 2018 • Roy Schwartz, Sam Thomson, Noah A. Smith

Recurrent and convolutional neural networks comprise two distinct families of models that have proven to be useful for encoding natural language utterances.

General Classification • Representation Learning • +2

Toward Abstractive Summarization Using Semantic Representations

1 code implementation • HLT 2015 • Fei Liu, Jeffrey Flanigan, Sam Thomson, Norman Sadeh, Noah A. Smith

We present a novel abstractive summarization framework that draws on the recent development of a treebank for the Abstract Meaning Representation (AMR).

Abstractive Text Summarization

SoPa: Bridging CNNs, RNNs, and Weighted Finite-State Machines

2 code implementations • 15 May 2018 • Roy Schwartz, Sam Thomson, Noah A. Smith

Recurrent and convolutional neural networks comprise two distinct families of models that have proven to be useful for encoding natural language utterances.

Explainable artificial intelligence • General Classification • +2

Backpropagating through Structured Argmax using a SPIGOT

1 code implementation • ACL 2018 • Hao Peng, Sam Thomson, Noah A. Smith

We introduce the structured projection of intermediate gradients optimization technique (SPIGOT), a new method for backpropagating through neural networks that include hard-decision structured predictions (e.g., parsing) in intermediate layers.
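
A simplified illustration of the general recipe, assuming an unstructured argmax over the probability simplex (the paper projects onto the polytope of the structured prediction, e.g. parse trees, and this sketch is not the authors' implementation): the forward pass makes a hard decision, and the backward pass replaces the zero gradient of argmax with the difference between that decision and a projected gradient step.

```python
import torch

def project_simplex(v):
    """Euclidean projection of a vector onto the probability simplex."""
    u, _ = torch.sort(v, descending=True)
    css = torch.cumsum(u, dim=0)
    k = torch.arange(1, v.numel() + 1, dtype=v.dtype)
    rho = int((u + (1.0 - css) / k > 0).nonzero().max())
    tau = (1.0 - css[rho]) / (rho + 1)
    return torch.clamp(v + tau, min=0.0)

class SpigotStyleArgmax(torch.autograd.Function):
    @staticmethod
    def forward(ctx, scores, eta=1.0):
        z = torch.zeros_like(scores)
        z[scores.argmax()] = 1.0          # hard decision used by downstream layers
        ctx.save_for_backward(z)
        ctx.eta = eta
        return z

    @staticmethod
    def backward(ctx, grad_output):
        (z,) = ctx.saved_tensors
        p = project_simplex(z - ctx.eta * grad_output)  # projected gradient step
        return z - p, None                # surrogate gradient passed to the scores
```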

Dependency Parsing • reinforcement-learning • +2

Learning Joint Semantic Parsers from Disjoint Data

2 code implementations • NAACL 2018 • Hao Peng, Sam Thomson, Swabha Swayamdipta, Noah A. Smith

We present a new approach to learning semantic parsers from multiple datasets, even when the target semantic formalisms are drastically different, and the underlying corpora do not overlap.
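
One way to realize that setup, sketched under assumed interfaces (the module, head, and loader names are placeholders, not the authors' code): a shared encoder, one output head per formalism, and training that alternates minibatches drawn from corpora that share no sentences.

```python
import random
import torch.nn as nn

class JointParser(nn.Module):
    def __init__(self, encoder, heads):
        super().__init__()
        self.encoder = encoder              # shared by every formalism
        self.heads = nn.ModuleDict(heads)   # e.g. {"dm": dm_head, "framenet": fn_head}

    def loss(self, formalism, batch):
        states = self.encoder(batch["tokens"])
        return self.heads[formalism].loss(states, batch["graph"])

def train_step(model, optimizer, loaders):
    """loaders: dict mapping formalism name to an (infinite) batch iterator."""
    formalism = random.choice(list(loaders))   # corpora are disjoint, so each batch
    batch = next(loaders[formalism])           # carries annotations for one formalism
    optimizer.zero_grad()
    loss = model.loss(formalism, batch)
    loss.backward()      # gradients reach the shared encoder and the chosen head only
    optimizer.step()
    return loss.item()
```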

Dependency Parsing • Frame • +1

Neural Motifs: Scene Graph Parsing with Global Context

6 code implementations • CVPR 2018 • Rowan Zellers, Mark Yatskar, Sam Thomson, Yejin Choi

We then introduce Stacked Motif Networks, a new architecture designed to capture higher order motifs in scene graphs that further improves over our strong baseline by an average 7.1% relative gain.

Frame-Semantic Parsing with Softmax-Margin Segmental RNNs and a Syntactic Scaffold

9 code implementations • 29 Jun 2017 • Swabha Swayamdipta, Sam Thomson, Chris Dyer, Noah A. Smith

We present a new, efficient frame-semantic parser that labels semantic arguments to FrameNet predicates.
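
A pure-Python sketch of the segmental (semi-Markov) decoding idea behind such a parser, under assumed inputs (`span_scores` and the NULL label are illustrative; the paper's span scorer and softmax-margin training are not shown): every candidate span receives a score per argument label, and a Viterbi-style dynamic program selects the best set of non-overlapping argument spans.

```python
NULL = 0  # label index reserved for "this span is not an argument" (an assumption here)

def semimarkov_decode(span_scores, n, max_width):
    """span_scores[(i, j)] is a list of per-label scores for tokens i..j-1, j - i <= max_width."""
    best = [float("-inf")] * (n + 1)
    back = [None] * (n + 1)
    best[0] = 0.0
    for j in range(1, n + 1):
        for i in range(max(0, j - max_width), j):
            scores = span_scores[(i, j)]
            label = max(range(len(scores)), key=scores.__getitem__)
            cand = best[i] + scores[label]
            if cand > best[j]:
                best[j], back[j] = cand, (i, label)
    spans, j = [], n                 # follow back-pointers to recover the segmentation
    while j > 0:
        i, label = back[j]
        if label != NULL:
            spans.append((i, j, label))
        j = i
    return best[n], list(reversed(spans))
```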

Frame Semantic Parsing

Deep Multitask Learning for Semantic Dependency Parsing

1 code implementation • ACL 2017 • Hao Peng, Sam Thomson, Noah A. Smith

We present a deep neural architecture that parses sentences into three semantic dependency graph formalisms.
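
A hedged sketch of how such a multitask parser could look (the encoder, scorer, and dimensions here are assumptions for illustration, not the paper's exact architecture): one shared sentence encoder feeds three formalism-specific heads, each scoring every ordered word pair as a potential dependency edge (label scoring omitted for brevity).

```python
import torch.nn as nn

FORMALISMS = ("dm", "pas", "psd")   # the three semantic dependency graph banks

class MultitaskSDP(nn.Module):
    def __init__(self, vocab_size, dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        self.encoder = nn.LSTM(dim, dim, batch_first=True, bidirectional=True)
        # one bilinear edge scorer per formalism, sharing the encoder underneath
        self.edge_scorers = nn.ModuleDict(
            {f: nn.Bilinear(2 * dim, 2 * dim, 1) for f in FORMALISMS}
        )

    def forward(self, token_ids):                            # token_ids: (batch, seq)
        states, _ = self.encoder(self.embed(token_ids))
        n = states.size(1)
        heads = states.unsqueeze(2).expand(-1, -1, n, -1)    # candidate head word
        deps = states.unsqueeze(1).expand(-1, n, -1, -1)     # candidate dependent
        return {f: self.edge_scorers[f](heads.contiguous(),
                                        deps.contiguous()).squeeze(-1)
                for f in FORMALISMS}                          # (batch, n, n) edge scores
```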

Dependency Parsing • Semantic Dependency Parsing
