Search Results

Transformers: State-of-the-Art Natural Language Processing

3 code implementations · EMNLP 2020

Transformer architectures have facilitated building higher-capacity models, and pretraining has made it possible to effectively utilize this capacity for a wide variety of tasks.

Image Classification · Object Recognition · +1

HuggingFace's Transformers: State-of-the-art Natural Language Processing

9 code implementations · 9 Oct 2019

Transformer architectures have facilitated building higher-capacity models, and pretraining has made it possible to effectively utilize this capacity for a wide variety of tasks.

Text Generation · Transfer Learning
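
For a feel for the library the two entries above describe, here is a minimal sketch using its pipeline API; the "gpt2" checkpoint and the prompt are illustrative choices, not part of the paper.

```python
# Minimal sketch of the transformers pipeline API described above.
# Assumes `pip install transformers torch`; the "gpt2" checkpoint is
# chosen here purely for illustration.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")
result = generator("Transformer architectures have", max_new_tokens=20)
print(result[0]["generated_text"])
```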

Datasets: A Community Library for Natural Language Processing

1 code implementation · EMNLP (ACL) 2021

The scale, variety, and quantity of publicly available NLP datasets have grown rapidly as researchers propose new tasks, larger models, and novel benchmarks.

Image Classification · Object Recognition · +2
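
A minimal sketch of the library's core entry point, load_dataset; the "imdb" corpus is chosen here purely as an example of a hub-hosted dataset.

```python
# Minimal sketch of the Datasets library described above.
# Assumes `pip install datasets`; "imdb" is an example hub dataset.
from datasets import load_dataset

dataset = load_dataset("imdb", split="train")
print(len(dataset))             # number of training examples
print(dataset[0]["text"][:80])  # first 80 characters of one review
```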

AllenNLP: A Deep Semantic Natural Language Processing Platform

2 code implementations · WS 2018

This paper describes AllenNLP, a platform for research on deep learning methods in natural language understanding.

Natural Language Understanding · Reading Comprehension · +1
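
A hedged sketch of AllenNLP's Predictor interface applied to the reading-comprehension task tagged above; the model archive URL is an assumption based on AllenNLP's publicly released models.

```python
# Hedged sketch of AllenNLP's Predictor API for reading comprehension.
# Assumes `pip install allennlp allennlp-models`; the archive URL below
# is an assumption (one of AllenNLP's public BiDAF releases).
from allennlp.predictors.predictor import Predictor

predictor = Predictor.from_path(
    "https://storage.googleapis.com/allennlp-public-models/bidaf-model-2020.03.19.tar.gz"
)
result = predictor.predict(
    passage="AllenNLP is a platform for research on deep learning "
            "methods in natural language understanding.",
    question="What is AllenNLP?",
)
print(result["best_span_str"])  # extracted answer span from the passage
```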

Revisiting Pre-Trained Models for Chinese Natural Language Processing

6 code implementations · Findings of the Association for Computational Linguistics 2020

Bidirectional Encoder Representations from Transformers (BERT) has shown marked improvements across various NLP tasks, and successive variants have been proposed to further improve the performance of pre-trained language models.

Language Modeling · Language Modelling · +1
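
The checkpoints released alongside this line of work load through the standard transformers interface; a hedged sketch, where the "hfl/chinese-macbert-base" model ID is an assumption (the authors' MacBERT release on the Hugging Face Hub):

```python
# Hedged sketch: loading a Chinese pre-trained model for masked-token
# prediction. The "hfl/chinese-macbert-base" checkpoint ID is an
# assumption; any BERT-style Chinese checkpoint loads the same way.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="hfl/chinese-macbert-base")
# Predict the masked character in "use a language model to predict
# the [MASK] of the next word".
for pred in fill_mask("使用语言模型来预测下一个词的[MASK]率。")[:3]:
    print(pred["token_str"], round(pred["score"], 3))
```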

Computational Job Market Analysis with Natural Language Processing

1 code implementation · 29 Apr 2024

[Abridged Abstract] Recent technological advances are reshaping labor market dynamics, with significant consequences for employment prospects and a growing volume of job vacancy data across platforms and languages.

Active Learning · De-identification · +1

ByT5: Towards a token-free future with pre-trained byte-to-byte models

5 code implementations · 28 May 2021

Most widely-used pre-trained language models operate on sequences of tokens corresponding to word or subword units.

Cross-Lingual Natural Language Inference · Cross-Lingual NER · +3
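
The token-free idea is simple enough to sketch without the model: ByT5 operates directly on UTF-8 bytes, with a small ID offset (3 in the Hugging Face implementation) reserved for special tokens such as padding and end-of-sequence.

```python
# Minimal sketch of byte-level ("token-free") encoding as used by ByT5:
# text maps one-to-one onto UTF-8 bytes, with an offset of 3 reserving
# IDs for pad/eos/unk, so no subword vocabulary is needed.
OFFSET = 3

def byte_encode(text: str) -> list[int]:
    return [b + OFFSET for b in text.encode("utf-8")]

def byte_decode(ids: list[int]) -> str:
    return bytes(i - OFFSET for i in ids).decode("utf-8")

ids = byte_encode("token-free")
print(ids)               # one ID per byte
print(byte_decode(ids))  # round-trips back to "token-free"
```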

Pre-trained Models for Natural Language Processing: A Survey

2 code implementations · 18 Mar 2020

The recent emergence of pre-trained models (PTMs) has brought natural language processing (NLP) into a new era.

Representation Learning · Survey