Search Results for author: Yury Zemlyanskiy

Found 13 papers, 4 papers with code

MEMORY-VQ: Compression for Tractable Internet-Scale Memory

no code implementations • 28 Aug 2023 • Yury Zemlyanskiy, Michiel de Jong, Luke Vilnis, Santiago Ontañón, William W. Cohen, Sumit Sanghai, Joshua Ainslie

Retrieval augmentation is a powerful but expensive method to make language models more knowledgeable about the world.

Quantization • Retrieval
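The title and the Quantization tag point at the core mechanism: compressing stored memories with vector quantization, so each vector is kept as a codebook index rather than a full float array. As a hedged illustration of that general idea (not the paper's VQ-VAE-based implementation; the sizes, names, and random codebook below are made up), a minimal numpy sketch:

```python
import numpy as np

def vq_compress(memory, codebook):
    """Map each d-dim memory vector to the index of its nearest codebook
    entry, shrinking storage from d floats to a single integer code."""
    # Squared distance between every memory vector and every codebook entry.
    d2 = ((memory[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)
    return d2.argmin(axis=1)

def vq_decompress(codes, codebook):
    """Recover approximate memories by codebook lookup."""
    return codebook[codes]

# Illustrative only: 1,000 memory vectors, a 256-entry "learned" codebook.
rng = np.random.default_rng(0)
memory = rng.normal(size=(1000, 64)).astype(np.float32)
codebook = rng.normal(size=(256, 64)).astype(np.float32)
codes = vq_compress(memory, codebook)      # 1,000 ints instead of 64,000 floats
approx = vq_decompress(codes, codebook)
```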

GLIMMER: generalized late-interaction memory reranker

no code implementations • 17 Jun 2023 • Michiel de Jong, Yury Zemlyanskiy, Nicholas FitzGerald, Sumit Sanghai, William W. Cohen, Joshua Ainslie

Memory-augmentation is a powerful approach for efficiently incorporating external information into language models, but leads to reduced performance relative to retrieving text.

Retrieval
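The "late interaction" in the title refers to a reranking style in which query and candidate token vectors are compared only at scoring time. A minimal sketch of that scoring pattern, assuming the common sum-of-max-dot-products form (as popularized by ColBERT) rather than GLIMMER's exact architecture:

```python
import numpy as np

def late_interaction_score(query_vecs, cand_vecs):
    """score(Q, D) = sum over query tokens of the best-matching
    candidate token: sum_i max_j <q_i, d_j>."""
    sims = query_vecs @ cand_vecs.T        # (n_query_tokens, n_cand_tokens)
    return sims.max(axis=1).sum()

def rerank(query_vecs, candidates):
    """Order candidate entries by descending late-interaction score."""
    scores = np.array([late_interaction_score(query_vecs, c) for c in candidates])
    return np.argsort(-scores)

rng = np.random.default_rng(0)
query = rng.normal(size=(5, 64))                    # 5 query token vectors
candidates = [rng.normal(size=(20, 64)) for _ in range(3)]
print(rerank(query, candidates))
```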

CoLT5: Faster Long-Range Transformers with Conditional Computation

no code implementations • 17 Mar 2023 • Joshua Ainslie, Tao Lei, Michiel de Jong, Santiago Ontañón, Siddhartha Brahma, Yury Zemlyanskiy, David Uthus, Mandy Guo, James Lee-Thorp, Yi Tay, Yun-Hsuan Sung, Sumit Sanghai

Many natural language processing tasks benefit from long inputs, but processing long documents with Transformers is expensive, not only because of quadratic attention complexity but also because feedforward and projection layers are applied to every token.

Long-range modeling
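CoLT5's answer to that per-token cost is conditional computation: every token gets a cheap branch, and only a routed subset also gets the expensive branch. The numpy sketch below illustrates that routing pattern; the layer shapes, ReLU feedforwards, and sigmoid gating are illustrative assumptions, not the paper's exact design:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def conditional_ffn(x, w_light, w_heavy, w_route, k):
    """Every token passes through a cheap feedforward; only the k tokens
    with the highest routing scores also pass through the expensive one."""
    out = np.maximum(x @ w_light[0], 0.0) @ w_light[1]  # light branch, all tokens
    scores = x @ w_route                                # (n_tokens,) routing scores
    top = np.argsort(scores)[-k:]                       # routed token indices
    heavy = np.maximum(x[top] @ w_heavy[0], 0.0) @ w_heavy[1]
    out[top] += sigmoid(scores[top])[:, None] * heavy   # gated heavy contribution
    return out

rng = np.random.default_rng(0)
d, n = 16, 10
w_light = (0.1 * rng.normal(size=(d, 32)), 0.1 * rng.normal(size=(32, d)))
w_heavy = (0.1 * rng.normal(size=(d, 128)), 0.1 * rng.normal(size=(128, d)))
w_route = rng.normal(size=d)
y = conditional_ffn(rng.normal(size=(n, d)), w_light, w_heavy, w_route, k=2)
```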

Arithmetic Sampling: Parallel Diverse Decoding for Large Language Models

1 code implementation • 18 Oct 2022 • Luke Vilnis, Yury Zemlyanskiy, Patrick Murray, Alexandre Passos, Sumit Sanghai

Decoding methods for large language models often trade off between diversity of outputs and parallelism of computation.

Diversity • Language Modelling • +2
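Arithmetic sampling sidesteps that trade-off by viewing sampling through the lens of arithmetic coding: each sample corresponds to a codepoint in [0, 1), so evenly spaced codepoints yield diverse sequences that can each be decoded independently in parallel. A minimal sketch of that mapping, with a hypothetical toy model standing in for a real LM:

```python
import numpy as np

def arithmetic_decode(next_token_probs, code, max_len=10, eos=0):
    """Map a codepoint in [0, 1) to a token sequence by walking the
    arithmetic-coding interval: pick the token whose CDF bucket contains
    the code, then rescale the code into that bucket and repeat."""
    tokens = []
    for _ in range(max_len):
        probs = next_token_probs(tokens)          # next-token distribution
        cdf = np.cumsum(probs)
        tok = int(np.searchsorted(cdf, code, side="right"))
        lo = cdf[tok - 1] if tok > 0 else 0.0
        code = (code - lo) / probs[tok]           # rescale into the bucket
        tokens.append(tok)
        if tok == eos:
            break
    return tokens

def toy_model(prefix, vocab=4):
    """Hypothetical stand-in for a language model's next-token distribution."""
    rng = np.random.default_rng(len(prefix))      # deterministic toy probs
    p = rng.random(vocab) + 0.1
    return p / p.sum()

# Evenly spaced codepoints give diverse samples, each decodable in parallel.
samples = [arithmetic_decode(toy_model, (i + 0.5) / 4) for i in range(4)]
print(samples)
```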

ReadTwice: Reading Very Large Documents with Memories

no code implementations • NAACL 2021 • Yury Zemlyanskiy, Joshua Ainslie, Michiel de Jong, Philip Pham, Ilya Eckstein, Fei Sha

Knowledge-intensive tasks such as question answering often require assimilating information from different sections of large inputs such as books or article collections.

Question Answering

DOCENT: Learning Self-Supervised Entity Representations from Large Document Collections

no code implementations • EACL 2021 • Yury Zemlyanskiy, Sudeep Gandhe, Ruining He, Bhargav Kanagal, Anirudh Ravula, Juraj Gottweis, Fei Sha, Ilya Eckstein

This approach enables a new class of powerful, high-capacity representations that can ultimately distill much of the useful information about an entity from multiple text sources, without any human supervision.

Knowledge Base Completion • Natural Language Queries • +4

Self-Attentive, Multi-Context One-Class Classification for Unsupervised Anomaly Detection on Text

1 code implementation • ACL 2019 • Lukas Ruff, Yury Zemlyanskiy, Robert Vandermeulen, Thomas Schnake, Marius Kloft

There exist few text-specific methods for unsupervised anomaly detection, and of those that do exist, none utilizes pre-trained models for distributed vector representations of words.

Contextual Anomaly Detection • General Classification • +3
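To make the one-class setting concrete: the simplest baseline of this kind embeds a document with pre-trained word vectors and scores it by its distance from the center of "normal" training documents. The sketch below shows only that generic baseline, not the paper's self-attentive, multi-context method; the tiny word-vector table is hypothetical:

```python
import numpy as np

def embed(tokens, word_vecs):
    """Mean of pre-trained word vectors as a simple document embedding."""
    return np.mean([word_vecs[t] for t in tokens], axis=0)

def fit_center(train_docs, word_vecs):
    """One-class view: summarize 'normal' documents by the center of
    their embeddings (a crude stand-in for SVDD-style objectives)."""
    return np.mean([embed(doc, word_vecs) for doc in train_docs], axis=0)

def anomaly_score(doc, center, word_vecs):
    """Distance from the normal-document center serves as anomaly score."""
    return np.linalg.norm(embed(doc, word_vecs) - center)

# Hypothetical 3-dim "pre-trained" vectors, purely for illustration.
word_vecs = {"cat": np.array([1.0, 0.0, 0.0]),
             "dog": np.array([0.9, 0.1, 0.0]),
             "gpu": np.array([0.0, 0.0, 1.0])}
center = fit_center([["cat", "dog"], ["dog", "cat"]], word_vecs)
print(anomaly_score(["gpu"], center, word_vecs))
```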

Aiming to Know You Better Perhaps Makes Me a More Engaging Dialogue Partner

no code implementations • CoNLL 2018 • Yury Zemlyanskiy, Fei Sha

There have been several attempts to define a plausible motivation for a chit-chat dialogue agent that can lead to engaging conversations.
