Search Results for author: Shuang Chen

Found 8 papers, 3 papers with code

Fact: Teaching MLLMs with Faithful, Concise and Transferable Rationales

no code implementations • 17 Apr 2024 • Minghe Gao, Shuang Chen, Liang Pang, Yuan YAO, Jisheng Dang, Wenqiao Zhang, Juncheng Li, Siliang Tang, Yueting Zhuang, Tat-Seng Chua

Their ability to execute intricate compositional reasoning tasks is also constrained, culminating in a stagnation of learning progression for these models.

Hallucination

HINT: High-quality INPainting Transformer with Mask-Aware Encoding and Enhanced Attention

1 code implementation • 22 Feb 2024 • Shuang Chen, Amir Atapour-Abarghouei, Hubert P. H. Shum

In this paper, we propose an end-to-end High-quality INpainting Transformer, abbreviated as HINT, which consists of a novel mask-aware pixel-shuffle downsampling module (MPD) to preserve the visible information extracted from the corrupted image while maintaining the integrity of the information available for high-level inferences made within the model.

Image Inpainting • Speech Recognition +1
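
The HINT abstract above centres on a mask-aware pixel-shuffle downsampling module (MPD) that reduces resolution without discarding information from visible pixels. The sketch below illustrates the general idea only: pixel-unshuffle rearranges spatial positions into channels and the mask is rearranged alongside the features. Class and parameter names are assumptions for illustration, not the paper's released implementation.

```python
# Hypothetical sketch of mask-aware pixel-shuffle downsampling (illustrative only).
import torch
import torch.nn as nn

class MaskAwarePixelShuffleDown(nn.Module):
    """Downsample without averaging away visible pixels: pixel-unshuffle moves
    spatial positions into channels, and the mask is rearranged the same way so
    later layers know which channels carry valid content."""
    def __init__(self, in_channels: int, out_channels: int, scale: int = 2):
        super().__init__()
        self.unshuffle = nn.PixelUnshuffle(scale)  # lossless spatial-to-channel rearrangement
        self.proj = nn.Conv2d(in_channels * scale * scale + scale * scale,
                              out_channels, kernel_size=1)  # fuse features with the rearranged mask

    def forward(self, x: torch.Tensor, mask: torch.Tensor):
        # x: (B, C, H, W) corrupted-image features; mask: (B, 1, H, W), 1 = visible, 0 = hole
        x_down = self.unshuffle(x)     # (B, C*s*s, H/s, W/s)
        m_down = self.unshuffle(mask)  # (B, s*s,   H/s, W/s)
        out = self.proj(torch.cat([x_down, m_down], dim=1))
        return out, m_down

if __name__ == "__main__":
    x = torch.randn(1, 32, 64, 64)
    mask = (torch.rand(1, 1, 64, 64) > 0.3).float()
    y, m = MaskAwarePixelShuffleDown(32, 64)(x, mask)
    print(y.shape, m.shape)  # torch.Size([1, 64, 32, 32]) torch.Size([1, 4, 32, 32])
```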

INCLG: Inpainting for Non-Cleft Lip Generation with a Multi-Task Image Processing Network

no code implementations • 17 May 2023 • Shuang Chen, Amir Atapour-Abarghouei, Edmond S. L. Ho, Hubert P. H. Shum

We present software that predicts non-cleft facial images for patients with cleft lip, thereby facilitating the understanding, awareness and discussion of cleft lip surgeries.

Image Inpainting

Rows from Many Sources: Enriching row completions from Wikidata with a pre-trained Language Model

no code implementations • 14 Apr 2022 • Carina Negreanu, Alperen Karaoglu, Jack Williams, Shuang Chen, Daniel Fabian, Andrew Gordon, Chin-Yew Lin

The task divides into two steps: subject suggestion, the task of populating the main column; and gap filling, the task of populating the remaining columns.

Language Modelling • Text Generation
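
The abstract above splits row completion into two steps: subject suggestion and gap filling. The sketch below shows only that two-step interface; all function names and data structures are assumptions, with Wikidata retrieval and the language model stubbed out.

```python
# Illustrative two-step row-completion pipeline (not the paper's actual API).
from typing import Dict, List

def suggest_subjects(seed_rows: List[Dict[str, str]], k: int = 10) -> List[str]:
    """Step 1: propose new values for the main (subject) column,
    e.g. by ranking knowledge-base entities similar to the seeds."""
    seeds = [row["subject"] for row in seed_rows]
    # Stub: a real system would query and rank Wikidata candidates.
    return [f"candidate_related_to_{s}" for s in seeds][:k]

def fill_gaps(subject: str, columns: List[str]) -> Dict[str, str]:
    """Step 2: populate the remaining columns for one suggested subject,
    e.g. from knowledge-base properties or a pre-trained language model."""
    # Stub: a real system would combine structured lookups with LM predictions.
    return {col: f"<predicted {col} for {subject}>" for col in columns}

def complete_rows(seed_rows: List[Dict[str, str]], columns: List[str]) -> List[Dict[str, str]]:
    completed = []
    for subject in suggest_subjects(seed_rows):
        row = {"subject": subject}
        row.update(fill_gaps(subject, columns))
        completed.append(row)
    return completed
```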

Improving Entity Linking by Modeling Latent Entity Type Information

no code implementations • 6 Jan 2020 • Shuang Chen, Jinpeng Wang, Feng Jiang, Chin-Yew Lin

Existing state-of-the-art neural entity linking models employ an attention-based bag-of-words context model and pre-trained entity embeddings bootstrapped from word embeddings to assess topic-level context compatibility.

Ranked #2 on Entity Disambiguation on AIDA-CoNLL (Micro-F1 metric)

Entity Disambiguation • Entity Embeddings +3
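
The snippet above describes scoring topic-level compatibility between an attention-weighted bag-of-words context representation and pre-trained entity embeddings. The sketch below is a minimal, hypothetical version of that scoring step; shapes, parameter names, and the bilinear attention form are assumptions, not the paper's implementation.

```python
# Hypothetical context-entity compatibility scoring for entity linking.
import torch
import torch.nn.functional as F

def context_entity_scores(word_embs: torch.Tensor,
                          entity_embs: torch.Tensor,
                          attn_proj: torch.Tensor) -> torch.Tensor:
    """word_embs:   (num_words, d)  embeddings of context words around the mention
       entity_embs: (num_cands, d)  pre-trained embeddings of candidate entities
       attn_proj:   (d, d)          learned bilinear attention parameters"""
    # Each candidate entity attends over the context words (bag of words, order-free).
    attn_logits = entity_embs @ attn_proj @ word_embs.T   # (num_cands, num_words)
    attn = F.softmax(attn_logits, dim=-1)
    context_vecs = attn @ word_embs                        # (num_cands, d) attention-weighted context
    # Topic-level compatibility: dot product between context vector and entity embedding.
    return (context_vecs * entity_embs).sum(dim=-1)        # (num_cands,)

if __name__ == "__main__":
    d, num_words, num_cands = 64, 20, 5
    scores = context_entity_scores(torch.randn(num_words, d),
                                   torch.randn(num_cands, d),
                                   torch.randn(d, d))
    print(scores.shape)  # torch.Size([5])
```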
