Search Results for author: Christopher Potts

Found 53 papers, 28 papers with code

PLAID: An Efficient Engine for Late Interaction Retrieval

1 code implementation 19 May 2022 Keshav Santhanam, Omar Khattab, Christopher Potts, Matei Zaharia

PLAID uses centroid interaction as well as centroid pruning, a mechanism for sparsifying the bag of centroids, within a highly-optimized engine to reduce late interaction search latency by up to 7× on a GPU and 45× on a CPU against vanilla ColBERTv2, while continuing to deliver state-of-the-art retrieval quality.

Information Retrieval
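The centroid interaction and centroid pruning steps described in the snippet can be sketched as follows; the sizes, the 0.3 pruning threshold, and all variable names are illustrative assumptions, not PLAID's actual implementation.

```python
import numpy as np

# Illustrative sketch: score a document against a query using its tokens'
# *centroid* embeddings rather than full token embeddings, after pruning
# centroids that no query token scores highly against.
rng = np.random.default_rng(0)
centroids = rng.normal(size=(16, 8))           # codebook of centroid embeddings
centroids /= np.linalg.norm(centroids, axis=1, keepdims=True)
query = rng.normal(size=(4, 8))                # 4 query token embeddings
query /= np.linalg.norm(query, axis=1, keepdims=True)
doc_centroid_ids = [2, 5, 5, 9, 11]            # doc tokens mapped to centroid ids

sims = query @ centroids.T                     # (4, 16) query-centroid similarities

# Centroid pruning: sparsify the bag of centroids by dropping centroids
# whose best query similarity falls below a threshold.
keep = sims.max(axis=0) >= 0.3
kept_ids = [c for c in doc_centroid_ids if keep[c]]

# Centroid interaction: approximate the MaxSim score (per query token,
# max over the document's surviving centroids; then sum).
score = float(sims[:, kept_ids].max(axis=1).sum()) if kept_ids else 0.0
print(score)
```

The point of the sketch is that both steps operate only on the small centroid codebook, which is what lets the engine avoid touching full token representations for most candidates.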

Color Overmodification Emerges from Data-Driven Learning and Pragmatic Reasoning

no code implementations 18 May 2022 Fei Fang, Kunal Sinha, Noah D. Goodman, Christopher Potts, Elisa Kreiss

It seems likely that these patterns are shaped in complex ways by the environment a speaker is exposed to.

Language Acquisition

Causal Distillation for Language Models

1 code implementation 5 Dec 2021 Zhengxuan Wu, Atticus Geiger, Josh Rozner, Elisa Kreiss, Hanson Lu, Thomas Icard, Christopher Potts, Noah D. Goodman

Distillation efforts have led to language models that are more compact and efficient without serious drops in performance.

Language Modelling · Masked Language Modeling · +3

Inducing Causal Structure for Interpretable Neural Networks

2 code implementations 1 Dec 2021 Atticus Geiger, Zhengxuan Wu, Hanson Lu, Josh Rozner, Elisa Kreiss, Thomas Icard, Noah D. Goodman, Christopher Potts

In IIT, we (1) align variables in the causal model with representations in the neural model and (2) train a neural model to match the counterfactual behavior of the causal model on a base input when aligned representations in both models are set to be the value they would be for a second source input.

Data Augmentation
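The interchange intervention at the heart of IIT can be illustrated on a toy network; the two-layer model and the choice of hidden unit 0 as the "aligned" representation are assumptions made for this sketch, not the paper's setup.

```python
import numpy as np

# Toy sketch of an interchange intervention: run the model on a base input,
# but overwrite an aligned hidden representation with the value it takes
# on a second, source input.
rng = np.random.default_rng(0)
W1 = rng.normal(size=(3, 4))
W2 = rng.normal(size=(4, 1))

def forward(x, patch=None):
    """Run the network; optionally overwrite hidden unit 0 with `patch`."""
    h = np.tanh(x @ W1)
    if patch is not None:
        h = h.copy()
        h[0] = patch          # the intervention on the aligned unit
    return (h @ W2).item()

base = np.array([1.0, 0.0, 0.0])
source = np.array([0.0, 1.0, 1.0])

# Record the aligned representation on the source input...
h_source = np.tanh(source @ W1)[0]
# ...and plug it into the base run. IIT trains the network so that this
# output matches the causal model's counterfactual prediction.
counterfactual_out = forward(base, patch=h_source)
print(counterfactual_out)
```

Patching a unit with its own base value is a no-op, which is a quick sanity check that the intervention machinery only changes what it is supposed to.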

Hindsight: Posterior-guided training of retrievers for improved open-ended generation

no code implementations ICLR 2022 Ashwin Paranjape, Omar Khattab, Christopher Potts, Matei Zaharia, Christopher D. Manning

Many text generation systems benefit from using a retriever to retrieve passages from a textual knowledge corpus (e.g., Wikipedia) which are then provided as additional context to the generator.

Text Generation

ReaSCAN: Compositional Reasoning in Language Grounding

2 code implementations 18 Sep 2021 Zhengxuan Wu, Elisa Kreiss, Desmond C. Ong, Christopher Potts

The ability to compositionally map language to referents, relations, and actions is an essential component of language understanding.

On the Opportunities and Risks of Foundation Models

no code implementations 16 Aug 2021 Rishi Bommasani, Drew A. Hudson, Ehsan Adeli, Russ Altman, Simran Arora, Sydney von Arx, Michael S. Bernstein, Jeannette Bohg, Antoine Bosselut, Emma Brunskill, Erik Brynjolfsson, Shyamal Buch, Dallas Card, Rodrigo Castellon, Niladri Chatterji, Annie Chen, Kathleen Creel, Jared Quincy Davis, Dora Demszky, Chris Donahue, Moussa Doumbouya, Esin Durmus, Stefano Ermon, John Etchemendy, Kawin Ethayarajh, Li Fei-Fei, Chelsea Finn, Trevor Gale, Lauren Gillespie, Karan Goel, Noah Goodman, Shelby Grossman, Neel Guha, Tatsunori Hashimoto, Peter Henderson, John Hewitt, Daniel E. Ho, Jenny Hong, Kyle Hsu, Jing Huang, Thomas Icard, Saahil Jain, Dan Jurafsky, Pratyusha Kalluri, Siddharth Karamcheti, Geoff Keeling, Fereshte Khani, Omar Khattab, Pang Wei Koh, Mark Krass, Ranjay Krishna, Rohith Kuditipudi, Ananya Kumar, Faisal Ladhak, Mina Lee, Tony Lee, Jure Leskovec, Isabelle Levent, Xiang Lisa Li, Xuechen Li, Tengyu Ma, Ali Malik, Christopher D. Manning, Suvir Mirchandani, Eric Mitchell, Zanele Munyikwa, Suraj Nair, Avanika Narayan, Deepak Narayanan, Ben Newman, Allen Nie, Juan Carlos Niebles, Hamed Nilforoshan, Julian Nyarko, Giray Ogut, Laurel Orr, Isabel Papadimitriou, Joon Sung Park, Chris Piech, Eva Portelance, Christopher Potts, Aditi Raghunathan, Rob Reich, Hongyu Ren, Frieda Rong, Yusuf Roohani, Camilo Ruiz, Jack Ryan, Christopher Ré, Dorsa Sadigh, Shiori Sagawa, Keshav Santhanam, Andy Shih, Krishnan Srinivasan, Alex Tamkin, Rohan Taori, Armin W. Thomas, Florian Tramèr, Rose E. Wang, William Wang, Bohan Wu, Jiajun Wu, Yuhuai Wu, Sang Michael Xie, Michihiro Yasunaga, Jiaxuan You, Matei Zaharia, Michael Zhang, Tianyi Zhang, Xikun Zhang, Yuhui Zhang, Lucia Zheng, Kaitlyn Zhou, Percy Liang

AI is undergoing a paradigm shift with the rise of models (e.g., BERT, DALL-E, GPT-3) that are trained on broad data at scale and are adaptable to a wide range of downstream tasks.

Transfer Learning

Causal Abstractions of Neural Networks

no code implementations NeurIPS 2021 Atticus Geiger, Hanson Lu, Thomas Icard, Christopher Potts

Structural analysis methods (e.g., probing and feature attribution) are increasingly important tools for neural network analysis.

Natural Language Inference

Dynaboard: An Evaluation-As-A-Service Platform for Holistic Next-Generation Benchmarking

no code implementations NeurIPS 2021 Zhiyi Ma, Kawin Ethayarajh, Tristan Thrush, Somya Jain, Ledell Wu, Robin Jia, Christopher Potts, Adina Williams, Douwe Kiela

We introduce Dynaboard, an evaluation-as-a-service framework for hosting benchmarks and conducting holistic model comparison, integrated with the Dynabench platform.

Decrypting Cryptic Crosswords: Semantically Complex Wordplay Puzzles as a Target for NLP

1 code implementation NeurIPS 2021 Joshua Rozner, Christopher Potts, Kyle Mahowald

Cryptic crosswords, the dominant crossword variety in the UK, are a promising target for advancing NLP systems that seek to process semantically complex, highly compositional language.

Language Modelling

Identifying the Limits of Cross-Domain Knowledge Transfer for Pretrained Models

1 code implementation RepL4NLP (ACL) 2022 Zhengxuan Wu, Nelson F. Liu, Christopher Potts

There is growing evidence that pretrained language models improve task-specific fine-tuning not just for the languages seen in pretraining, but also for new languages and even non-linguistic data.

Pretrained Language Models · Transfer Learning

Baleen: Robust Multi-Hop Reasoning at Scale via Condensed Retrieval

1 code implementation NeurIPS 2021 Omar Khattab, Christopher Potts, Matei Zaharia

Multi-hop reasoning (i.e., reasoning across two or more documents) is a key ingredient for NLP models that leverage large corpora to exhibit broad knowledge.

Question Answering

DynaSent: A Dynamic Benchmark for Sentiment Analysis

1 code implementation ACL 2021 Christopher Potts, Zhengxuan Wu, Atticus Geiger, Douwe Kiela

We introduce DynaSent ('Dynamic Sentiment'), a new English-language benchmark task for ternary (positive/negative/neutral) sentiment analysis.

Sentiment Analysis

Relevance-guided Supervision for OpenQA with ColBERT

3 code implementations 1 Jul 2020 Omar Khattab, Christopher Potts, Matei Zaharia

In much recent work, the retriever is a learned component that uses coarse-grained vector representations of questions and passages.

Open-Domain Question Answering

Modeling Subjective Assessments of Guilt in Newspaper Crime Narratives

1 code implementation CoNLL 2020 Elisa Kreiss, Zijian Wang, Christopher Potts

Crime reporting is a prevalent form of journalism with the power to shape public perceptions and social policies.

Relational reasoning and generalization using non-symbolic neural networks

1 code implementation 14 Jun 2020 Atticus Geiger, Alexandra Carstensen, Michael C. Frank, Christopher Potts

In the two latter cases, our models perform tasks proposed in previous work to demarcate human-unique symbolic abilities.

Relational Reasoning

Neural Natural Language Inference Models Partially Embed Theories of Lexical Entailment and Negation

1 code implementation EMNLP (BlackboxNLP) 2020 Atticus Geiger, Kyle Richardson, Christopher Potts

We address whether neural models for Natural Language Inference (NLI) can learn the compositional interactions between lexical entailment and negation, using four methods: the behavioral evaluation methods of (1) challenge test sets and (2) systematic generalization tasks, and the structural evaluation methods of (3) probes and (4) interventions.

Lexical Entailment · Natural Language Inference · +1

Pragmatic Issue-Sensitive Image Captioning

1 code implementation Findings of the Association for Computational Linguistics 2020 Allen Nie, Reuben Cohn-Gordon, Christopher Potts

Image captioning systems have recently improved dramatically, but they still tend to produce captions that are insensitive to the communicative goals that captions should meet.

Image Captioning · Question Answering · +1

TalkDown: A Corpus for Condescension Detection in Context

1 code implementation IJCNLP 2019 Zijian Wang, Christopher Potts

Condescending language use is caustic; it can bring dialogues to an end and bifurcate communities.

Communication-based Evaluation for Natural Language Generation

1 code implementation SCiL 2020 Benjamin Newman, Reuben Cohn-Gordon, Christopher Potts

Natural language generation (NLG) systems are commonly evaluated using n-gram overlap measures (e.g., BLEU, ROUGE).

Text Generation
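The n-gram overlap measures the snippet refers to reduce to clipped n-gram precision, as in BLEU; this minimal sketch is for illustration, not a full BLEU implementation (no brevity penalty or multi-reference handling).

```python
from collections import Counter

# Modified n-gram precision: the fraction of candidate n-grams that also
# appear in the reference, with counts clipped to the reference counts.
def ngram_precision(candidate, reference, n=2):
    cand = candidate.split()
    ref = reference.split()
    cand_ngrams = Counter(tuple(cand[i:i + n]) for i in range(len(cand) - n + 1))
    ref_ngrams = Counter(tuple(ref[i:i + n]) for i in range(len(ref) - n + 1))
    overlap = sum(min(c, ref_ngrams[g]) for g, c in cand_ngrams.items())
    total = sum(cand_ngrams.values())
    return overlap / total if total else 0.0

print(ngram_precision("the cat sat on the mat", "the cat is on the mat"))  # 0.6
```

A candidate can score highly here while failing to communicate, which is exactly the gap that motivates communication-based evaluation.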

Modeling Drug-Disease Relations with Linguistic and Knowledge Graph Constraints

no code implementations 31 Mar 2019 Bruno Godefroy, Christopher Potts

FDA drug labels are rich sources of information about drugs and drug-disease relations, but their complexity makes them challenging texts to analyze in isolation.

Knowledge Graphs

Effective Feature Representation for Clinical Text Concept Extraction

no code implementations WS 2019 Yifeng Tao, Bruno Godefroy, Guillaume Genthial, Christopher Potts

Crucial information about the practice of healthcare is recorded only in free-form text, which creates an enormous opportunity for high-impact NLP.

Stress-Testing Neural Models of Natural Language Inference with Multiply-Quantified Sentences

no code implementations 30 Oct 2018 Atticus Geiger, Ignacio Cases, Lauri Karttunen, Christopher Potts

Standard evaluations of deep learning models for semantics using naturalistic corpora are limited in what they can tell us about the fidelity of the learned representations, because the corpora rarely come with good measures of semantic complexity.

Natural Language Inference

An Incremental Iterated Response Model of Pragmatics

no code implementations WS 2019 Reuben Cohn-Gordon, Noah D. Goodman, Christopher Potts

Recent Iterated Response (IR) models of pragmatics conceptualize language use as a recursive process in which agents reason about each other to increase communicative efficiency.

Referring Expression · Referring Expression Generation

A case for deep learning in semantics

no code implementations10 Sep 2018 Christopher Potts

Pater's target article builds a persuasive case for establishing stronger ties between theoretical linguistics and connectionism (deep learning).

Representing Social Media Users for Sarcasm Detection

1 code implementation EMNLP 2018 Y. Alex Kolchinski, Christopher Potts

We explore two methods for representing authors in the context of textual sarcasm detection: a Bayesian approach that directly represents authors' propensities to be sarcastic, and a dense embedding approach that can learn interactions between the author and the text.

Sarcasm Detection
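The Bayesian author representation can be sketched as a smoothed propensity estimate from an author's posting history; the Beta(1, 1) prior is an assumption made for illustration, not necessarily the paper's choice.

```python
# Estimate an author's propensity to be sarcastic with Beta smoothing:
# the posterior mean of a Beta(a, b) prior after observing the author's
# labeled history of sarcastic vs. total posts.
def sarcasm_propensity(n_sarcastic, n_total, a=1.0, b=1.0):
    return (n_sarcastic + a) / (n_total + a + b)

print(sarcasm_propensity(8, 10))   # frequently sarcastic author -> 0.75
print(sarcasm_propensity(0, 0))    # unseen author falls back to the prior, 0.5
```

The dense-embedding alternative mentioned in the abstract replaces this scalar with a learned author vector that can interact with the text representation.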

Pragmatically Informative Image Captioning with Character-Level Inference

no code implementations NAACL 2018 Reuben Cohn-Gordon, Noah Goodman, Christopher Potts

We combine a neural image captioner with a Rational Speech Acts (RSA) model to make a system that is pragmatically informative: its objective is to produce captions that are not merely true but also distinguish their inputs from similar images.

Image Captioning

Mittens: An Extension of GloVe for Learning Domain-Specialized Representations

1 code implementation NAACL 2018 Nicholas Dingwall, Christopher Potts

We present a simple extension of the GloVe representation learning model that begins with general-purpose representations and updates them based on data from a specialized domain.

Representation Learning
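The update-from-pretrained idea can be sketched as a penalized GloVe objective; this is a simplified single-matrix version with assumed constants (x_max = 100, α = 0.75, the penalty weight mu), not the paper's exact formulation.

```python
import numpy as np

# Schematic Mittens-style loss: the weighted GloVe least-squares objective
# plus a penalty keeping the specialized vectors W close to pretrained
# general-purpose vectors R.
def mittens_loss(W, C, b, c, X, R, mu=0.1):
    logX = np.log(np.maximum(X, 1e-12))
    f = np.minimum(X / 100.0, 1.0) ** 0.75        # GloVe weighting function
    diff = W @ C.T + b[:, None] + c[None, :] - logX
    glove = np.sum(f * (X > 0) * diff ** 2)       # only nonzero co-occurrences
    penalty = mu * np.sum((W - R) ** 2)           # stay near pretrained R
    return glove + penalty
```

Setting mu = 0 recovers plain GloVe training on the domain data; larger mu pulls the specialized vectors back toward the general-purpose ones.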

Generating Bilingual Pragmatic Color References

1 code implementation NAACL 2018 Will Monroe, Jennifer Hu, Andrew Jong, Christopher Potts

Contextual influences on language often exhibit substantial cross-lingual regularities; for example, we are more verbose in situations that require finer distinctions.

On the Effective Use of Pretraining for Natural Language Inference

no code implementations 5 Oct 2017 Ignacio Cases, Minh-Thang Luong, Christopher Potts

Neural networks have excelled at many NLP tasks, but there remain open questions about the performance of pretrained distributed word representations and their interaction with weight initialization and other hyperparameters.

Natural Language Inference

Retrofitting Distributional Embeddings to Knowledge Graphs with Functional Relations

1 code implementation COLING 2018 Benjamin J. Lengerich, Andrew L. Maas, Christopher Potts

Knowledge graphs are a versatile framework to encode richly structured data relationships, but it can be challenging to combine these graphs with unstructured data.

Knowledge Graph Completion

Colors in Context: A Pragmatic Neural Model for Grounded Language Understanding

1 code implementation TACL 2017 Will Monroe, Robert X. D. Hawkins, Noah D. Goodman, Christopher Potts

We present a model of pragmatic referring expression interpretation in a grounded communication task (identifying colors from descriptions) that draws upon predictions from two recurrent neural network classifiers, a speaker and a listener, unified by a recursive pragmatic reasoning framework.

Referring Expression

Learning in the Rational Speech Acts Model

no code implementations 23 Oct 2015 Will Monroe, Christopher Potts

The Rational Speech Acts (RSA) model treats language use as a recursive process in which probabilistic speaker and listener agents reason about each other's intentions to enrich the literal semantics of their language along broadly Gricean lines.

Text Generation
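The recursive speaker-listener reasoning the abstract describes can be sketched with a tiny two-referent reference game; the lexicon and the uniform priors are illustrative assumptions.

```python
import numpy as np

# Toy RSA recursion: a literal listener, a pragmatic speaker that reasons
# about that listener, and a pragmatic listener that reasons about the speaker.
# Rows = utterances, columns = referents; entries = literal truth values.
lexicon = np.array([
    [1.0, 1.0],   # "hat":     literally true of both referents
    [0.0, 1.0],   # "glasses": literally true only of referent 1
])

def normalize(m, axis):
    return m / m.sum(axis=axis, keepdims=True)

L0 = normalize(lexicon, axis=1)  # literal listener: P(referent | utterance)
S1 = normalize(L0, axis=0)       # pragmatic speaker: P(utterance | referent)
L1 = normalize(S1, axis=1)       # pragmatic listener: P(referent | utterance)

print(L1)  # hearing "hat" now favors referent 0: [[0.75, 0.25], [0., 1.]]
```

Even this one step of recursion strengthens the literal semantics along Gricean lines: "hat" is literally true of both referents, but the pragmatic listener infers that a speaker meaning referent 1 would have said "glasses".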

A large annotated corpus for learning natural language inference

1 code implementation EMNLP 2015 Samuel R. Bowman, Gabor Angeli, Christopher Potts, Christopher D. Manning

Understanding entailment and contradiction is fundamental to understanding natural language, and inference about entailment and contradiction is a valuable testing ground for the development of semantic representations.

Image Captioning · Natural Language Inference

Tree-structured composition in neural networks without tree-structured architectures

1 code implementation 16 Jun 2015 Samuel R. Bowman, Christopher D. Manning, Christopher Potts

We hypothesize that neural sequence models like LSTMs are in fact able to discover and implicitly use recursive compositional structure, at least for tasks with clear cues to that structure in the data.

Text to 3D Scene Generation with Rich Lexical Grounding

no code implementations IJCNLP 2015 Angel Chang, Will Monroe, Manolis Savva, Christopher Potts, Christopher D. Manning

The ability to map descriptions of scenes to 3D geometric representations has many applications in areas such as art, education, and robotics.

Scene Generation

Learning Distributed Word Representations for Natural Logic Reasoning

no code implementations 15 Oct 2014 Samuel R. Bowman, Christopher Potts, Christopher D. Manning

Natural logic offers a powerful relational conception of meaning that is a natural counterpart to distributed semantic representations, which have proven valuable in a wide range of sophisticated language tasks.

Tensor Networks

Exploiting Social Network Structure for Person-to-Person Sentiment Analysis

no code implementations TACL 2014 Robert West, Hristo S. Paskov, Jure Leskovec, Christopher Potts

Person-to-person evaluations are prevalent in all kinds of discourse and important for establishing reputations, building social bonds, and shaping public opinion.

Decision Making · Sentiment Analysis

Recursive Neural Networks Can Learn Logical Semantics

no code implementations WS 2015 Samuel R. Bowman, Christopher Potts, Christopher D. Manning

Tree-structured recursive neural networks (TreeRNNs) for sentence meaning have been successful for many applications, but it remains an open question whether the fixed-length representations that they learn can support tasks as demanding as logical deduction.

Relational Reasoning · Tensor Networks
