Fill Mask

5 papers with code • 1 benchmark • 3 datasets

Fill Mask is the masked language modeling task: one or more tokens in a piece of text are replaced by a mask token, and the model predicts the most likely tokens for the masked positions. It is the pre-training objective used by BERT-style models.
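As a quick illustration of the task, the sketch below queries an off-the-shelf masked language model through the Hugging Face transformers fill-mask pipeline; the choice of bert-base-uncased and the example sentence are arbitrary.

```python
# Minimal fill-mask sketch: ask a masked language model to complete a sentence.
# The model name below is an arbitrary example; any masked LM checkpoint works.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# The pipeline returns the top-scoring candidates for the [MASK] position.
for candidate in fill_mask("Paris is the [MASK] of France."):
    print(f"{candidate['token_str']:>12}  score={candidate['score']:.3f}")
```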

Most implemented papers

Zero-Shot Video Question Answering via Frozen Bidirectional Language Models

antoyang/FrozenBiLM 16 Jun 2022

However, manual annotation of questions and answers for videos is tedious and prohibits scalability.

Prompt Tuning or Fine-Tuning - Investigating Relational Knowledge in Pre-Trained Language Models

leandrafichtel/bertriple AKBC 2021

In this work, we propose a completely different approach: instead of spending resources on training an additional model, we simply perform adaptive fine-tuning of the pre-trained language model on the standard fill-mask task, using a small training dataset of existing facts from a knowledge graph.
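To make the idea concrete, the following sketch fine-tunes a masked language model on knowledge-graph triples verbalized as cloze statements. The triples, templates, model choice, and learning rate are illustrative assumptions rather than the BERTriple training setup, and single-subword objects are assumed for simplicity.

```python
# Hedged sketch: adaptive fine-tuning of a masked LM on cloze-style facts.
# Triples, templates, model, and hyperparameters are illustrative stand-ins,
# not the actual BERTriple recipe.
import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-cased")
optimizer = torch.optim.AdamW(model.parameters(), lr=3e-5)

# Hypothetical (subject, template, object) facts; the object fills the mask slot.
facts = [
    ("Paris", "{} is the capital of [MASK].", "France"),
    ("Mozart", "{} was born in [MASK].", "Salzburg"),
]

model.train()
for subject, template, obj in facts:
    enc = tokenizer(template.format(subject), return_tensors="pt")
    # Supervise only the masked position with the object's (first) subword id;
    # all other positions are ignored via the -100 label convention.
    labels = torch.full_like(enc["input_ids"], -100)
    obj_id = tokenizer.convert_tokens_to_ids(tokenizer.tokenize(obj)[0])
    labels[enc["input_ids"] == tokenizer.mask_token_id] = obj_id
    loss = model(**enc, labels=labels).loss
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
```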

MAX: Masked Autoencoder for X-ray Fluorescence in Geological Investigation

dispink/xpt 16 Oct 2024

Pre-training foundation models has become the de facto procedure in deep learning, yet its application remains limited in geological studies, which need model transferability to overcome data scarcity.

CNMBert: A Model for Hanyu Pinyin Abbreviation to Character Conversion Task

igarashiakatuki/cnmbert 18 Nov 2024

The task of converting Hanyu Pinyin abbreviations to Chinese characters is a significant branch of Chinese Spelling Correction (CSC).