Text Similarity

67 papers with code • 0 benchmarks • 3 datasets


Most implemented papers

Stacked Cross Attention for Image-Text Matching

kuanghuei/SCAN ECCV 2018

Prior work either simply aggregates the similarity of all possible pairs of regions and words without attending differentially to more and less important words or regions, or uses a multi-step attentional process to capture a limited number of semantic alignments, which is less interpretable.

CAT-Seg: Cost Aggregation for Open-Vocabulary Semantic Segmentation

KU-CVLAB/CAT-Seg 21 Mar 2023

However, transferring these capabilities learned from image-level supervision to the pixel-level task of segmentation, while also handling arbitrary unseen categories at inference, makes this task challenging.

Query-based Attention CNN for Text Similarity Map

chun5212021202/ACM-Net 15 Sep 2017

This network is composed of a compare mechanism, a two-stage CNN architecture with an attention mechanism, and a prediction layer.

Matching Images and Text with Multi-modal Tensor Fusion and Re-ranking

Wangt-CN/MTFN-RR-PyTorch-Code 12 Aug 2019

We propose a novel framework that achieves remarkable matching performance with acceptable model complexity.

Effective Crowd-Annotation of Participants, Interventions, and Outcomes in the Text of Clinical Trial Reports

Markus-Zlabinger/pico-annotation Findings of the Association for Computational Linguistics 2020

Obtaining such a corpus from crowdworkers, however, has been shown to be ineffective, since (i) workers usually lack the domain-specific expertise to conduct the task with sufficient quality, and (ii) the standard approach of annotating entire abstracts of trial reports as one task instance (i.e., HIT) leads to an uneven distribution of task effort.

ESimCSE: Enhanced Sample Building Method for Contrastive Learning of Unsupervised Sentence Embedding

caskcsg/sentemb COLING 2022

Unsup-SimCSE takes dropout as a minimal data augmentation method, and passes the same input sentence to a pre-trained Transformer encoder (with dropout turned on) twice to obtain the two corresponding embeddings to build a positive pair.
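The dropout-as-augmentation idea can be sketched with a toy encoder: encoding the same input twice with dropout enabled yields two slightly different embeddings, which form the positive pair. This is a minimal illustration, not the paper's pre-trained Transformer; the encoder here is just a fixed linear projection, and all names and dimensions are made up.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a pre-trained Transformer encoder: a fixed linear
# projection followed by dropout (hypothetical, for illustration only).
W = rng.standard_normal((16, 8))

def encode(x, dropout_p=0.1):
    """Encode with dropout *enabled*, as Unsup-SimCSE does at training time."""
    h = x @ W
    mask = rng.random(h.shape) >= dropout_p   # a fresh dropout mask per forward pass
    return (h * mask) / (1.0 - dropout_p)     # inverted-dropout scaling

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# The SAME input sentence is encoded twice; the two dropout masks differ,
# so the two embeddings differ slightly -- together they form the positive pair.
sentence = rng.standard_normal(16)
z1 = encode(sentence)
z2 = encode(sentence)

other = encode(rng.standard_normal(16))       # an unrelated sentence (in-batch negative)

print(cosine(z1, z2), cosine(z1, other))
```

Because the two passes share the same underlying representation and differ only in the dropout mask, the positive pair stays much closer in cosine similarity than a pair of unrelated sentences, which is what the contrastive objective then exploits.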

Smoothed Contrastive Learning for Unsupervised Sentence Embedding

caskcsg/sentemb COLING 2022

Contrastive learning has gradually been applied to learning high-quality unsupervised sentence embeddings.

InfoCSE: Information-aggregated Contrastive Learning of Sentence Embeddings

caskcsg/sentemb 8 Oct 2022

Contrastive learning has been extensively studied in sentence embedding learning, which assumes that the embeddings of different views of the same sentence are closer to each other than to embeddings of other sentences.

RETSim: Resilient and Efficient Text Similarity

google/unisim 28 Nov 2023

This paper introduces RETSim (Resilient and Efficient Text Similarity), a lightweight, multilingual deep learning model trained to produce robust metric embeddings for near-duplicate text retrieval, clustering, and dataset deduplication tasks.
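The near-duplicate retrieval use case can be sketched with synthetic embeddings: once texts are mapped into a metric space (as RETSim does), deduplication reduces to finding pairs whose similarity exceeds a threshold. The vectors and the 0.95 threshold below are illustrative assumptions, not values from the paper or the google/unisim library.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic embeddings standing in for a text-embedding model's output:
# three distinct "documents", plus a lightly perturbed copy of the first
# (simulating a near-duplicate such as a typo'd or re-encoded text).
base = rng.standard_normal((3, 8))
docs = np.vstack([base, base[0] + 0.01 * rng.standard_normal(8)])

def cosine_sim(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def near_duplicates(embs, threshold=0.95):
    """Return index pairs whose embedding cosine similarity exceeds the threshold."""
    pairs = []
    for i in range(len(embs)):
        for j in range(i + 1, len(embs)):
            if cosine_sim(embs[i], embs[j]) >= threshold:
                pairs.append((i, j))
    return pairs

print(near_duplicates(docs))
```

The pairwise scan here is O(n²) and only suitable for small collections; at dataset-deduplication scale an approximate nearest-neighbor index over the embeddings would replace the double loop.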