In-Context Learning

856 papers with code • 0 benchmarks • 0 datasets

In-context learning (ICL) is the ability of a model, most prominently a large language model, to perform a new task from a handful of input-output demonstrations supplied in its prompt, without any updates to its parameters.
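For a concrete sense of the setting, here is a minimal sketch of few-shot prompting with a causal language model; gpt2 is used purely as a small stand-in, and the sentiment labels are illustrative. The demonstrations live entirely in the prompt, and no weights are updated.

```python
# A minimal sketch of in-context learning via few-shot prompting.
# gpt2 is a stand-in; any causal LM works the same way.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

# The demonstrations are given purely in the prompt; no parameters change.
prompt = (
    "Review: The plot was dull and predictable. Sentiment: negative\n"
    "Review: A stunning, heartfelt film. Sentiment: positive\n"
    "Review: I loved every minute of it. Sentiment:"
)

inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=2, do_sample=False)
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:]))
```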

Most implemented papers

Neural Codec Language Models are Zero-Shot Text to Speech Synthesizers

microsoft/unilm 5 Jan 2023

In addition, we find VALL-E can preserve the speaker's emotion and the acoustic environment of the acoustic prompt in synthesis.

TabPFN: A Transformer That Solves Small Tabular Classification Problems in a Second

automl/tabpfn 5 Jul 2022

We present TabPFN, a trained Transformer that performs supervised classification on small tabular datasets in less than a second, needs no hyperparameter tuning, and is competitive with state-of-the-art classification methods.
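As a usage sketch, TabPFN exposes a scikit-learn-style interface in the automl/tabpfn repository; the exact constructor arguments may vary across versions, so treat this as illustrative rather than canonical.

```python
# Illustrative sketch of TabPFN's scikit-learn-style interface
# (automl/tabpfn); constructor arguments may differ across versions.
from sklearn.datasets import load_breast_cancer
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from tabpfn import TabPFNClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = TabPFNClassifier()      # no hyperparameter tuning required
clf.fit(X_train, y_train)     # the training set becomes the model's context
preds = clf.predict(X_test)   # a single forward pass yields predictions
print(accuracy_score(y_test, preds))
```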

From system models to class models: An in-context learning paradigm

forgi86/sysid-neural-transformers 25 Aug 2023

Is it possible to understand the intricacies of a dynamical system not solely from its input/output pattern, but also by observing the behavior of other systems within the same class?

PanGu-α: Large-scale Autoregressive Pretrained Chinese Language Models with Auto-parallel Computation

mindspore-ai/models 26 Apr 2021

To enhance the generalization ability of PanGu-α, we collect 1.1 TB of high-quality Chinese data from a wide range of domains to pretrain the model.

Data Distributional Properties Drive Emergent In-Context Learning in Transformers

deepmind/emergent_in_context_learning 22 Apr 2022

In further experiments, we found that naturalistic data distributions elicited in-context learning in transformers but not in recurrent models.

Large Language Models Are Human-Level Prompt Engineers

keirp/automatic_prompt_engineer 3 Nov 2022

By conditioning on natural language instructions, large language models (LLMs) have displayed impressive capabilities as general-purpose computers.
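The underlying search procedure is easy to sketch: an LLM proposes candidate instructions, each candidate is scored on a small labelled set, and the highest-scoring instruction wins. The helpers `propose_instructions` and `llm_answer` below are hypothetical placeholders, not the keirp/automatic_prompt_engineer API.

```python
# Illustrative APE-style search loop; `propose_instructions` and
# `llm_answer` are hypothetical stand-ins for LLM calls.

def score(instruction, eval_set, llm_answer):
    """Fraction of eval examples the instruction answers correctly."""
    hits = sum(
        llm_answer(f"{instruction}\nInput: {x}\nOutput:").strip() == y
        for x, y in eval_set
    )
    return hits / len(eval_set)

def ape_search(propose_instructions, eval_set, llm_answer, n_candidates=50):
    candidates = propose_instructions(n_candidates)  # LLM-generated instructions
    return max(candidates, key=lambda inst: score(inst, eval_set, llm_answer))
```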

HyenaDNA: Long-Range Genomic Sequence Modeling at Single Nucleotide Resolution

HazyResearch/hyena-dna NeurIPS 2023

Leveraging Hyena's new long-range capabilities, we present HyenaDNA, a genomic foundation model pretrained on the human reference genome with context lengths of up to 1 million tokens at the single-nucleotide level - an up to 500x increase over previous dense attention-based models.
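The "single nucleotide" part is simple to illustrate: each base is its own token, so context length equals sequence length. The vocabulary below is an assumption for illustration, not HyenaDNA's actual tokenizer.

```python
# Sketch of single-nucleotide (character-level) tokenization, the input
# format HyenaDNA operates on; this vocabulary is illustrative only.
VOCAB = {ch: i for i, ch in enumerate("ACGTN")}  # N = unknown base

def tokenize(sequence: str) -> list[int]:
    """Map each base to its own token, so a 1M-base sequence becomes a
    1M-token context rather than being merged by a subword vocabulary."""
    return [VOCAB.get(base, VOCAB["N"]) for base in sequence.upper()]

print(tokenize("ACGTTGCA"))  # [0, 1, 2, 3, 3, 2, 1, 0]
```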

OpenICL: An Open-Source Framework for In-context Learning

shark-nlp/openicl 6 Mar 2023

However, implementing ICL is complicated by the diverse retrieval and inference methods involved, as well as by the varying preprocessing requirements of different models, datasets, and tasks.
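An OpenICL pipeline composes a dataset reader, a prompt template, a demonstration retriever, and an inferencer. The sketch below follows the interface shown in the OpenICL paper; class names and arguments are an assumption and may differ in current versions of shark-nlp/openicl.

```python
# Sketch of an OpenICL pipeline, following the paper's example; treat
# class names and arguments as assumptions that may differ by version.
from datasets import load_dataset
from openicl import DatasetReader, PromptTemplate, TopkRetriever, PPLInferencer

data = DatasetReader(load_dataset("gpt3mix/sst2"),
                     input_columns=["text"], output_column="label")

# </E> marks where retrieved demonstrations go, </text> the query input.
template = PromptTemplate({0: "</E>Positive Movie Review: </text>",
                           1: "</E>Negative Movie Review: </text>"},
                          {"text": "</text>"}, ice_token="</E>")

retriever = TopkRetriever(data, ice_num=8)           # top-k demonstration retrieval
inferencer = PPLInferencer(model_name="distilgpt2")  # perplexity-based inference
predictions = inferencer.inference(retriever, ice_template=template)
```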

SegGPT: Segmenting Everything In Context

baaivision/painter 6 Apr 2023

We unify various segmentation tasks into a generalist in-context learning framework that accommodates different kinds of segmentation data by transforming them into a common image format.
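One way to picture that shared format (an illustrative guess at the layout, not the baaivision/painter code): render every mask as an RGB image and stitch the example input, example output, and query input into one canvas whose missing quadrant the model inpaints as its prediction.

```python
import numpy as np

# Illustrative in-context canvas; the actual SegGPT layout may differ.
H = W = 224

def make_canvas(prompt_img, prompt_mask, query_img):
    """All inputs are (H, W, 3) arrays; masks are rendered as RGB images
    so every segmentation task shares one output space."""
    canvas = np.zeros((2 * H, 2 * W, 3), dtype=prompt_img.dtype)
    canvas[:H, :W] = prompt_img    # top-left: in-context example input
    canvas[:H, W:] = prompt_mask   # top-right: in-context example output
    canvas[H:, :W] = query_img     # bottom-left: query input
    # bottom-right stays blank: the model predicts this region
    return canvas
```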

Label Words are Anchors: An Information Flow Perspective for Understanding In-Context Learning

lancopku/label-words-are-anchors 23 May 2023

In-context learning (ICL) has emerged as a promising capability of large language models (LLMs), which can perform diverse tasks when provided with demonstration examples.
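The anchoring claim suggests a simple diagnostic: measure how much attention the final position places on the demonstrations' label words, layer by layer. The sketch below uses gpt2 and a toy prompt purely for illustration; the repository's analysis is considerably more thorough.

```python
# Sketch of the measurement idea: attention mass from the final position
# onto the demonstration label words. Model and prompt are illustrative.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tok = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

prompt = ("Review: boring. Sentiment: negative\n"
          "Review: wonderful. Sentiment: positive\n"
          "Review: superb. Sentiment:")
ids = tok(prompt, return_tensors="pt")

with torch.no_grad():
    out = model(**ids, output_attentions=True)

# Locate the label words ("negative", "positive") in the token stream.
tokens = tok.convert_ids_to_tokens(ids["input_ids"][0])
label_pos = [i for i, t in enumerate(tokens)
             if t.strip("Ġ") in ("negative", "positive")]

# Head-averaged attention from the last position to the label words, per layer.
for layer, attn in enumerate(out.attentions):
    flow = attn[0, :, -1, label_pos].mean().item()
    print(f"layer {layer:2d}: attention mass on label words = {flow:.4f}")
```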