CoLA

28 papers with code • 1 benchmark • 2 datasets

CoLA (the Corpus of Linguistic Acceptability) is a task of judging whether an English sentence is grammatically acceptable. The corpus is included in the widely used GLUE benchmark.

Most implemented papers

General Cross-Architecture Distillation of Pretrained Language Models into Matrix Embeddings

lgalke/cross-architecture-distillation 17 Sep 2021

We match or exceed the scores of ELMo for all tasks of the GLUE benchmark except for the sentiment analysis task SST-2 and the linguistic acceptability task CoLA.

Monolingual and Cross-Lingual Acceptability Judgments with the Italian CoLA corpus

dhfbk/itacola-dataset Findings (EMNLP) 2021

The development of automated approaches to linguistic acceptability has been greatly fostered by the availability of the English CoLA corpus, which has also been included in the widely used GLUE benchmark.
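
Because the English CoLA corpus ships as part of GLUE, it can be loaded in a few lines. A minimal sketch using the Hugging Face `datasets` library (the tooling is an assumption, not something either paper prescribes):

```python
# Load the CoLA split of GLUE (pip install datasets).
from datasets import load_dataset

cola = load_dataset("glue", "cola")   # train / validation / test splits
example = cola["train"][0]
print(example["sentence"])            # raw English sentence
print(example["label"])               # 1 = acceptable, 0 = unacceptable
```

Note that CoLA is conventionally scored with the Matthews correlation coefficient rather than plain accuracy.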

COLA: COarse LAbel pre-training for 3D semantic segmentation of sparse LiDAR datasets

julessanchez/coarselabel 14 Feb 2022

Transfer learning is a proven technique in 2D computer vision for leveraging the large amounts of data available and achieving high performance on datasets whose size is limited by acquisition or annotation costs.

COLA: Consistent Learning with Opponent-Learning Awareness

aidandos/cola 8 Mar 2022

Finally, in an empirical evaluation on a set of general-sum games, we find that COLA finds prosocial solutions and that it converges under a wider range of learning rates than HOLA and LOLA.
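
The snippet contrasts COLA with LOLA and HOLA, which all differentiate through an anticipated opponent update. Below is a minimal LOLA-style lookahead step on a toy bilinear game; it illustrates the opponent-shaping family, not COLA's specific formulation (COLA instead learns update functions satisfying a mutual-consistency criterion):

```python
# LOLA-style opponent-aware gradient step on a toy two-player game.
# Player 1 minimizes x*y, player 2 minimizes -x*y (zero-sum bilinear game).
import torch

x = torch.tensor(0.5, requires_grad=True)  # player 1's parameter
y = torch.tensor(0.5, requires_grad=True)  # player 2's parameter
alpha, eta = 0.1, 0.3  # own step size, opponent's assumed step size

loss1, loss2 = x * y, -x * y

g1_naive = torch.autograd.grad(loss1, x, retain_graph=True)[0]  # ignores opponent learning

# Anticipate the opponent's gradient step: y' = y - eta * dL2/dy.
g2 = torch.autograd.grad(loss2, y, create_graph=True)[0]
y_lookahead = y - eta * g2

# Differentiate player 1's loss at the opponent's anticipated parameters,
# including how that anticipated step itself depends on x.
g1_lola = torch.autograd.grad(x * y_lookahead, x)[0]

with torch.no_grad():
    x -= alpha * g1_lola
print(g1_naive.item(), g1_lola.item())  # 0.5 vs 0.8: the shaping term differs
```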

Acceptability Judgements via Examining the Topology of Attention Maps

danchern97/tda4la 19 May 2022

The role of the attention mechanism in encoding linguistic knowledge has received special interest in NLP.
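
The paper builds graphs from transformer attention maps and feeds their topological features to an acceptability classifier. A toy sketch of the first stage of such a pipeline, thresholding a BERT attention head into a graph and counting edges (a simple stand-in for the paper's actual topological features):

```python
# Threshold BERT attention maps into graphs and compute a simple statistic.
# pip install transformers torch
import torch
from transformers import AutoModel, AutoTokenizer

tok = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased", output_attentions=True)

inputs = tok("The boy was laughed.", return_tensors="pt")  # CoLA-style example
with torch.no_grad():
    out = model(**inputs)

attn = out.attentions[-1][0]   # last layer: (num_heads, seq_len, seq_len)
threshold = 0.1
for head, A in enumerate(attn):
    edges = (A > threshold).sum().item()  # edge count of the thresholded graph
    print(f"head {head}: {edges} edges above {threshold}")
```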

Linear Connectivity Reveals Generalization Strategies

anonwhymoos/connectivity 24 May 2022

It is widely accepted in the mode connectivity literature that when two neural networks are trained similarly on the same data, they are connected by a path through parameter space over which test set accuracy is maintained.
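
The standard probe behind this claim is direct: interpolate linearly between two sets of trained weights and measure test accuracy along the path. A minimal sketch, assuming two PyTorch state dicts from similarly trained models and a standard evaluation loader:

```python
# Evaluate accuracy along the linear path (1 - t) * A + t * B in weight space.
import torch

def interpolate_state(sd_a, sd_b, t):
    # Assumes floating-point parameters with matching keys and shapes.
    return {k: (1 - t) * sd_a[k] + t * sd_b[k] for k in sd_a}

def accuracy_along_path(model, sd_a, sd_b, loader, steps=11):
    accs = []
    for t in torch.linspace(0, 1, steps).tolist():
        model.load_state_dict(interpolate_state(sd_a, sd_b, t))
        model.eval()
        correct = total = 0
        with torch.no_grad():
            for x, y in loader:
                correct += (model(x).argmax(dim=-1) == y).sum().item()
                total += y.numel()
        accs.append(correct / total)
    return accs  # a flat curve means the two models are linearly mode-connected
```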

Field Level Neural Network Emulator for Cosmological N-body Simulations

dsjamieson/map2map_emu 9 Jun 2022

We build a field level emulator for cosmic structure formation that is accurate in the nonlinear regime.

Simple lessons from complex learning: what a neural network model learns about cosmic structure formation

dsjamieson/map2map_fid 9 Jun 2022

We find our model generalizes well to these well-understood scenarios, demonstrating that the networks have inferred general physical principles and learned the nonlinear mode couplings from the complex, random Gaussian training data.

RPN: A Word Vector Level Data Augmentation Algorithm in Deep Learning for Language Understanding

DLYuanGod/RPN 12 Dec 2022

However, existing data augmentation techniques in natural language understanding (NLU) may not fully capture the complexity of natural language variations, and they can be challenging to apply to large datasets.
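
As a rough illustration of what "word vector level" augmentation means, the sketch below perturbs a random subset of token embeddings with Gaussian noise. This is a generic stand-in, not RPN's actual replacement rule, which is defined in the paper:

```python
# Toy word-vector-level augmentation: jitter a random 15% of token embeddings.
import torch

def augment_embeddings(emb, frac=0.15, sigma=0.01):
    """emb: (batch, seq_len, dim) tensor of word vectors."""
    mask = torch.rand(emb.shape[:2], device=emb.device) < frac  # tokens to perturb
    noise = sigma * torch.randn_like(emb)
    return torch.where(mask.unsqueeze(-1), emb + noise, emb)

emb = torch.randn(2, 8, 32)                      # fake batch of word vectors
aug = augment_embeddings(emb)
print((aug != emb).any(dim=-1).float().mean())   # fraction of tokens perturbed
```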

Multi-Source Contrastive Learning from Musical Audio

cgaroufis/mscol_smc23 14 Feb 2023

Contrastive learning is an emerging branch of self-supervised learning that leverages large amounts of unlabeled data by learning a latent space in which pairs of views of the same sample are associated.
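
The association of paired views is usually enforced with a contrastive objective such as InfoNCE/NT-Xent. A minimal one-directional sketch (the paper's multi-source setup and the full symmetric NT-Xent add detail beyond this):

```python
# Minimal InfoNCE-style contrastive loss: matching views sit on the diagonal.
import torch
import torch.nn.functional as F

def info_nce(z1, z2, temperature=0.1):
    """z1, z2: (batch, dim) embeddings of two views of the same samples."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / temperature        # pairwise cosine similarities
    targets = torch.arange(z1.size(0))        # positives: i-th view pairs with i-th
    return F.cross_entropy(logits, targets)

z1, z2 = torch.randn(16, 128), torch.randn(16, 128)
print(info_nce(z1, z2))
```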