Search Results for author: David Weiss

Found 17 papers, 6 papers with code

Learning Cross-Context Entity Representations from Text

no code implementations 11 Jan 2020 Jeffrey Ling, Nicholas FitzGerald, Zifei Shan, Livio Baldini Soares, Thibault Févry, David Weiss, Tom Kwiatkowski

Language modeling tasks, in which words, or word-pieces, are predicted on the basis of a local context, have been very effective for learning word embeddings and context dependent representations of phrases.

Entity Linking Language Modelling +1

Learning Entity Representations for Few-Shot Reconstruction of Wikipedia Categories

no code implementations ICLR Workshop LLD 2019 Jeffrey Ling, Nicholas FitzGerald, Livio Baldini Soares, David Weiss, Tom Kwiatkowski

Language modeling tasks, in which words are predicted on the basis of a local context, have been very effective for learning word embeddings and context dependent representations of phrases.

Entity Typing Language Modelling +1

A Fast, Compact, Accurate Model for Language Identification of Codemixed Text

no code implementations EMNLP 2018 Yuan Zhang, Jason Riesa, Daniel Gillick, Anton Bakalov, Jason Baldridge, David Weiss

We address fine-grained multilingual language identification: providing a language code for every token in a sentence, including codemixed text containing multiple languages.

Language Identification
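The abstract above defines the task shape: one language code per token, even in codemixed sentences. A minimal sketch of that input/output contract, using a naive lexicon lookup as a stand-in for the paper's classifier (the lexicons, the `und` fallback code, and the function name are illustrative assumptions, not the paper's method):

```python
def tag_tokens(tokens, lexicons, default="und"):
    """Assign a language code to every token via lexicon lookup.

    A naive stand-in for a trained per-token classifier, shown only to
    illustrate the task's input/output shape. "und" marks tokens found
    in no lexicon or in several, where a real model would use context.
    """
    tags = []
    for tok in tokens:
        hits = [lang for lang, vocab in lexicons.items() if tok.lower() in vocab]
        tags.append(hits[0] if len(hits) == 1 else default)
    return tags

# Toy English/Spanish lexicons for a codemixed example:
lexicons = {"en": {"i", "love", "so", "much"}, "es": {"te", "quiero", "mucho"}}
# tag_tokens(["I", "love", "you", "mucho"], lexicons)
# → ["en", "en", "und", "es"]
```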

State-of-the-art Chinese Word Segmentation with Bi-LSTMs

1 code implementation EMNLP 2018 Ji Ma, Kuzman Ganchev, David Weiss

A wide variety of neural-network architectures have been proposed for the task of Chinese word segmentation.

Chinese Word Segmentation
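Neural segmenters of this kind are commonly framed as per-character tagging with a BMES scheme, with the bi-LSTM predicting one tag per character. As a sketch of the decoding half of that pipeline (the tagger itself is not shown, and the BMES framing is a common convention rather than a detail confirmed by this snippet):

```python
def bmes_to_words(chars, tags):
    """Decode a BMES tag sequence into segmented words.

    B = begin of a multi-char word, M = middle, E = end, S = single-char
    word. The per-character tagger (a bi-LSTM in the paper) would
    produce `tags`; here they are given.
    """
    words, buf = [], []
    for ch, tag in zip(chars, tags):
        buf.append(ch)
        if tag in ("E", "S"):       # a word boundary closes here
            words.append("".join(buf))
            buf = []
    if buf:                          # flush any dangling partial word
        words.append("".join(buf))
    return words

# bmes_to_words(list("北京大学生"), ["B", "E", "B", "M", "E"])
# → ["北京", "大学生"]
```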

Linguistically-Informed Self-Attention for Semantic Role Labeling

1 code implementation EMNLP 2018 Emma Strubell, Patrick Verga, Daniel Andor, David Weiss, Andrew McCallum

Unlike previous models which require significant pre-processing to prepare linguistic features, LISA can incorporate syntax using merely raw tokens as input, encoding the sequence only once to simultaneously perform parsing, predicate detection and role labeling for all predicates.

Dependency Parsing Multi-Task Learning +4
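The key claim in the snippet is architectural: the sequence is encoded once, and parsing, predicate detection, and role labeling all read from that shared encoding. A toy sketch of that single-pass, multi-head structure (the encoder and all three "heads" here are placeholder arithmetic, not LISA's self-attention model):

```python
def make_counting_encoder():
    """A stand-in encoder that records how often it runs.

    The real encoder is a stack of self-attention layers, with one
    attention head supervised to attend to each token's syntactic head.
    """
    calls = {"n": 0}
    def encode(tokens):
        calls["n"] += 1
        return [sum(map(ord, t)) % 97 for t in tokens]  # toy "contextual" vectors
    return encode, calls

def lisa_style_forward(tokens, encode):
    """Run all three tasks off one shared encoding pass."""
    reps = encode(tokens)                      # single shared pass
    parse = [r % 5 for r in reps]              # toy parsing head
    predicates = [r % 2 == 0 for r in reps]    # toy predicate-detection head
    roles = [r % 7 for r in reps]              # toy role-labeling head
    return parse, predicates, roles
```

The point of the counter is to make the "encoding the sequence only once" property checkable: three task outputs, one encoder call.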

Natural Language Processing with Small Feed-Forward Networks

1 code implementation EMNLP 2017 Jan A. Botha, Emily Pitler, Ji Ma, Anton Bakalov, Alex Salcianu, David Weiss, Ryan McDonald, Slav Petrov

We show that small and shallow feed-forward neural networks can achieve near state-of-the-art results on a range of unstructured and structured language processing tasks while being considerably cheaper in memory and computational requirements than deep recurrent models.

Natural Language Processing
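A major part of the memory savings in small models of this kind comes from feature hashing: character n-grams are mapped into a small fixed number of embedding buckets instead of a full vocabulary. A sketch of that featurization step (the bucket count, n-gram orders, and hash choice are illustrative, not the paper's settings):

```python
import hashlib

def hashed_ngram_features(token, n_values=(2, 3), n_buckets=1 << 10):
    """Map a token's character n-grams to small-vocabulary bucket ids.

    Hashing bounds the embedding table at `n_buckets` rows regardless of
    vocabulary size, which is the main memory saving in tiny feed-forward
    models. Collisions are accepted as noise.
    """
    feats = []
    for n in n_values:
        padded = "^" + token + "$"           # mark word boundaries
        for i in range(len(padded) - n + 1):
            ngram = padded[i:i + n]
            digest = hashlib.md5(ngram.encode("utf-8")).hexdigest()
            feats.append(int(digest, 16) % n_buckets)
    return feats

# "cat" padded to "^cat$" yields 4 bigrams + 3 trigrams = 7 feature ids,
# each in [0, 1024).
```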

Stack-propagation: Improved Representation Learning for Syntax

no code implementations ACL 2016 Yuan Zhang, David Weiss

Traditional syntax models typically leverage part-of-speech (POS) information by constructing features from hand-tuned templates.

Dependency Parsing POS +1

Globally Normalized Transition-Based Neural Networks

1 code implementation ACL 2016 Daniel Andor, Chris Alberti, David Weiss, Aliaksei Severyn, Alessandro Presta, Kuzman Ganchev, Slav Petrov, Michael Collins

Our model is a simple feed-forward neural network that operates on a task-specific transition system, yet achieves comparable or better accuracies than recurrent models.

Dependency Parsing Part-Of-Speech Tagging +1
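The "task-specific transition system" for dependency parsing is typically something like arc-standard: SHIFT, LEFT-ARC, and RIGHT-ARC actions over a stack and buffer. A sketch of applying such an action sequence (the scoring model, a globally normalized feed-forward net in the paper, would choose each action; here the sequence is given, and arc-standard is a standard choice rather than a detail confirmed by this snippet):

```python
def run_arc_standard(n_words, actions):
    """Apply a transition sequence; return head[i] for each word (-1 = root).

    SHIFT moves the next buffer word onto the stack; LEFT_ARC attaches
    the second-from-top stack item to the top; RIGHT_ARC attaches the
    top to the second-from-top.
    """
    stack, buffer, heads = [], list(range(n_words)), [-1] * n_words
    for act in actions:
        if act == "SHIFT":
            stack.append(buffer.pop(0))
        elif act == "LEFT_ARC":          # second-from-top gets top as head
            dep = stack.pop(-2)
            heads[dep] = stack[-1]
        elif act == "RIGHT_ARC":         # top gets second-from-top as head
            dep = stack.pop()
            heads[dep] = stack[-1]
    return heads

# "I saw her" (0 I, 1 saw, 2 her):
# run_arc_standard(3, ["SHIFT", "SHIFT", "LEFT_ARC", "SHIFT", "RIGHT_ARC"])
# → [1, -1, 1]   (both "I" and "her" attach to "saw", which stays root)
```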

Understanding Objects in Detail with Fine-Grained Attributes

no code implementations CVPR 2014 Andrea Vedaldi, Siddharth Mahendran, Stavros Tsogkas, Subhransu Maji, Ross Girshick, Juho Kannala, Esa Rahtu, Iasonas Kokkinos, Matthew B. Blaschko, David Weiss, Ben Taskar, Karen Simonyan, Naomi Saphra, Sammy Mohamed

We show that the collected data can be used to study the relation between part detection and attribute prediction by diagnosing the performance of classifiers that pool information from different parts of an object.

Object Detection

SCALPEL: Segmentation Cascades with Localized Priors and Efficient Learning

no code implementations CVPR 2013 David Weiss, Ben Taskar

We propose SCALPEL, a flexible method for object segmentation that integrates rich region-merging cues with mid- and high-level information about object layout, class, and scale into the segmentation process.

Re-Ranking Semantic Segmentation

Sidestepping Intractable Inference with Structured Ensemble Cascades

no code implementations NeurIPS 2010 David Weiss, Benjamin Sapp, Ben Taskar

For many structured prediction problems, complex models often require adopting approximate inference techniques such as variational methods or sampling, which generally provide no satisfactory accuracy guarantees.

Pose Estimation Structured Prediction
