Text Compression
5 papers with code • 0 benchmarks • 0 datasets
Most implemented papers
Syntactically Informed Text Compression with Recurrent Neural Networks
We present a self-contained system for constructing natural language models for use in text compression.
Authorship Verification based on Compression-Models
The method requires only three key components: a compression algorithm, a dissimilarity measure, and a threshold for accepting or rejecting the authorship of the questioned document.
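A minimal sketch of this kind of pipeline, assuming a zlib compressor, the Compression-based Dissimilarity Measure (CDM) as the distance, and a purely illustrative threshold; the paper itself may use different compressors, measures, and calibration.

```python
import zlib

def compressed_size(text: str) -> int:
    """Length in bytes of the zlib-compressed UTF-8 encoding of `text`."""
    return len(zlib.compress(text.encode("utf-8")))

def cdm(a: str, b: str) -> float:
    """Compression-based Dissimilarity Measure: C(a+b) / (C(a) + C(b)).
    Values closer to 0.5 suggest the texts share many regularities
    (e.g. the same author); values near 1.0 suggest little in common."""
    return compressed_size(a + b) / (compressed_size(a) + compressed_size(b))

def same_author(known: str, questioned: str, threshold: float = 0.95) -> bool:
    """Accept the authorship claim if the dissimilarity falls below a threshold.
    The threshold value here is illustrative, not the paper's."""
    return cdm(known, questioned) < threshold
```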
A Batch Noise Contrastive Estimation Approach for Training Large Vocabulary Language Models
Training large vocabulary Neural Network Language Models (NNLMs) is a difficult task due to the explicit requirement of the output layer normalization, which typically involves the evaluation of the full softmax function over the complete vocabulary.
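A minimal sketch of why the full softmax is the bottleneck and how a noise-contrastive loss avoids it: the normalizer touches every vocabulary row, while NCE scores only the target plus a handful of sampled noise words. This is generic NCE with uniform noise, not the batch-level variant the paper proposes; all names and sizes are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
V, d = 50_000, 256                        # vocabulary size, hidden size
W = rng.normal(scale=0.01, size=(V, d))   # output word embeddings
h = rng.normal(size=d)                    # context vector from the NNLM
target = 123                              # index of the true next word

# Full softmax: the normalizer touches all V output rows, O(V * d) per step.
logits = W @ h
log_z = logits.max() + np.log(np.exp(logits - logits.max()).sum())
full_softmax_loss = log_z - logits[target]

# NCE: score only the target plus k noise samples, O((k + 1) * d) per step,
# by casting next-word prediction as a data-vs-noise binary classification.
k = 16
noise = rng.integers(0, V, size=k)        # illustrative: uniform noise distribution
log_kq = np.log(k / V)                    # log(k * q(w)) for uniform q(w) = 1/V

def log_sigmoid(x):
    return -np.logaddexp(0.0, -x)

s_target = W[target] @ h
s_noise = W[noise] @ h
nce_loss = -(log_sigmoid(s_target - log_kq)
             + log_sigmoid(-(s_noise - log_kq)).sum())

print(f"full softmax loss: {full_softmax_loss:.3f}, NCE loss: {nce_loss:.3f}")
```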
Data-efficient Neural Text Compression with Interactive Learning
Neural sequence-to-sequence models have been successfully applied to text compression.
Contextualized Semantic Distance between Highly Overlapped Texts
Overlap frequently occurs between paired texts in natural language processing tasks such as text editing and semantic similarity evaluation.