Search Results for author: Yi Liao

Found 14 papers, 6 papers with code

Feature Activation Map: Visual Explanation of Deep Learning Models for Image Classification

no code implementations 11 Jul 2023 Yi Liao, Yongsheng Gao, Weichuan Zhang

However, all CAM-based methods (e.g., CAM, Grad-CAM, and Relevance-CAM) can only be used to interpret CNN models that use fully-connected (FC) layers as the classifier (a minimal CAM sketch follows this entry).

Classification, Contrastive Learning +4
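
As a hedged illustration only: the entry above contrasts its Feature Activation Map with classic CAM-style explanations, so the sketch below shows how a classic class activation map is computed from the final convolutional features and the FC classifier weights. The tensors here are random placeholders standing in for real network outputs; this is not the paper's method.

```python
import numpy as np

def class_activation_map(feature_maps, fc_weights, class_idx):
    """Classic CAM: weight the final conv feature maps by the FC classifier
    weights of the target class and sum over channels."""
    # feature_maps: (C, H, W) activations from the last conv layer
    # fc_weights:   (num_classes, C) weights of the FC classifier
    cam = np.tensordot(fc_weights[class_idx], feature_maps, axes=1)   # (H, W)
    cam = np.maximum(cam, 0)                                          # keep positive evidence
    return (cam - cam.min()) / (cam.max() - cam.min() + 1e-8)         # normalise to [0, 1]

# Random placeholders standing in for real CNN outputs (assumption).
features = np.random.rand(512, 7, 7)   # e.g. final conv block of a ResNet-18
weights = np.random.rand(1000, 512)    # e.g. a 1000-class FC classifier
heatmap = class_activation_map(features, weights, class_idx=281)
print(heatmap.shape)  # (7, 7); upsample to the input resolution to visualise
```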

A resource-efficient deep learning framework for low-dose brain PET image reconstruction and analysis

no code implementations 14 Feb 2022 Yu Fu, Shunjie Dong, Yi Liao, Le Xue, Yuanfan Xu, Feng Li, Qianqian Yang, Tianbai Yu, Mei Tian, Cheng Zhuo

18F-fluorodeoxyglucose (18F-FDG) Positron Emission Tomography (PET) imaging usually needs a full-dose radioactive tracer to obtain satisfactory diagnostic results, which raises concerns about the potential health risks of radiation exposure, especially for pediatric patients.

Generative Adversarial Network, Image Reconstruction

TGEA: An Error-Annotated Dataset and Benchmark Tasks for Text Generation from Pretrained Language Models

no code implementations ACL 2021 Jie He, Bo Peng, Yi Liao, Qun Liu, Deyi Xiong

Each error is manually labeled with comprehensive annotations, including the span of the error, the associated span, a minimal correction, the error type, and the rationale behind the error (an illustrative record layout follows this entry).

Common Sense Reasoning, Text Generation
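
The annotation fields listed in the entry above map naturally onto a small record structure. The sketch below is purely illustrative: the field names and the example sentence are assumptions, not TGEA's released schema or data.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

# Hypothetical record mirroring the annotation fields named in the snippet
# above; field names are illustrative, not TGEA's actual schema.
@dataclass
class ErrorAnnotation:
    sentence: str                               # machine-generated sentence
    error_span: Tuple[int, int]                 # character offsets of the erroneous span
    associated_span: Optional[Tuple[int, int]]  # related span linked to the error
    minimal_correction: str                     # smallest edit that fixes the error
    error_type: str                             # category label for the error
    rationale: str                              # free-text explanation of the error

example = ErrorAnnotation(
    sentence="He put the ice cream in the oven to keep it cold.",
    error_span=(28, 32),            # "oven"
    associated_span=(44, 48),       # "cold"
    minimal_correction="freezer",
    error_type="commonsense error",
    rationale="Ovens heat food, so they cannot keep ice cream cold.",
)
print(example.error_type)
```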

Zero-Shot Paraphrase Generation with Multilingual Language Models

no code implementations 9 Nov 2019 Yinpeng Guo, Yi Liao, Xin Jiang, Qing Zhang, Yibo Zhang, Qun Liu

Leveraging multilingual parallel texts to automatically generate paraphrases has drawn much attention, as the size of high-quality paraphrase corpora is limited (a related round-trip baseline is sketched after this entry).

Denoising, Machine Translation +3
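
The paper itself performs zero-shot paraphrasing within a single multilingual model; as a point of reference only, the sketch below shows the simpler round-trip (pivot) translation baseline often used when no paraphrase corpus is available. The `translate` callable is a hypothetical stand-in for any machine-translation backend, and the dummy implementation exists only so the example runs.

```python
# Round-trip (pivot) paraphrasing baseline; NOT the paper's zero-shot method.
def pivot_paraphrase(sentence, translate, pivots=("fr", "de", "zh")):
    """Translate into each pivot language and back to collect paraphrase candidates."""
    candidates = []
    for pivot in pivots:
        forward = translate(sentence, "en", pivot)   # en -> pivot
        back = translate(forward, pivot, "en")       # pivot -> en
        if back.strip().lower() != sentence.strip().lower():
            candidates.append(back)
    return candidates

# Toy backend so the sketch runs end to end; a real system would call an MT model.
def dummy_translate(text, src, tgt):
    return f"[{src}->{tgt}] {text}"

print(pivot_paraphrase("High-quality paraphrase corpora are limited in size.",
                       dummy_translate))
```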

NEZHA: Neural Contextualized Representation for Chinese Language Understanding

10 code implementations 31 Aug 2019 Junqiu Wei, Xiaozhe Ren, Xiaoguang Li, Wenyong Huang, Yi Liao, Yasheng Wang, Jiashu Lin, Xin Jiang, Xiao Chen, Qun Liu

Pre-trained language models have achieved great success in various natural language understanding (NLU) tasks due to their capacity to capture deep contextualized information in text by pre-training on large-scale corpora (a short embedding-extraction sketch follows this entry).

Named Entity Recognition +6
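
To make the idea of "deep contextualized information" concrete, the sketch below extracts contextual token representations from a pre-trained Chinese language model with the Hugging Face `transformers` library. `bert-base-chinese` is used purely as a stand-in checkpoint; a released NEZHA checkpoint would be loaded the same way where available. This is an assumption-laden usage example, not the NEZHA training recipe.

```python
import torch
from transformers import AutoModel, AutoTokenizer

# `bert-base-chinese` is a stand-in checkpoint (assumption); swap in a NEZHA
# checkpoint if one is available in your environment.
tokenizer = AutoTokenizer.from_pretrained("bert-base-chinese")
model = AutoModel.from_pretrained("bert-base-chinese")
model.eval()

sentence = "预训练语言模型能够捕捉文本中的深层上下文信息。"
inputs = tokenizer(sentence, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# One contextual vector per (sub)token; NLU task heads are built on top of these.
print(outputs.last_hidden_state.shape)  # e.g. torch.Size([1, seq_len, 768])
```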

GPT-based Generation for Classical Chinese Poetry

2 code implementations 29 Jun 2019 Yi Liao, Yasheng Wang, Qun Liu, Xin Jiang

We present a simple yet effective method for generating high-quality classical Chinese poetry with a Generative Pre-trained Language Model (GPT); a decoding-step sketch follows this entry.

Language Modelling
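
A GPT-style poetry generator produces a poem one token at a time by repeatedly sampling from the model's next-token distribution. The sketch below shows only that decoding step (temperature plus top-k sampling); the logits are random placeholders, not the output of the paper's model.

```python
import numpy as np

def sample_next_token(logits, k=20, temperature=0.8, rng=None):
    """Temperature + top-k sampling over next-token logits, i.e. the step a
    GPT-style generator repeats to emit a poem character by character."""
    rng = rng or np.random.default_rng()
    logits = np.asarray(logits, dtype=np.float64) / temperature
    top_idx = np.argsort(logits)[-k:]               # keep the k most likely tokens
    top_logits = logits[top_idx]
    probs = np.exp(top_logits - top_logits.max())   # softmax over the top-k only
    probs /= probs.sum()
    return int(rng.choice(top_idx, p=probs))

# Random placeholder logits standing in for a real model's output (assumption).
fake_logits = np.random.randn(21128)   # 21128 ~ a typical Chinese vocabulary size
print(sample_next_token(fake_logits))  # index of the sampled next character
```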

QuaSE: Sequence Editing under Quantifiable Guidance

1 code implementation EMNLP 2018 Yi Liao, Lidong Bing, Piji Li, Shuming Shi, Wai Lam, Tong Zhang

For example, an input sequence could be a word sequence, such as a review sentence or advertisement text.

Disentanglement, Sentence +1

Abstractive Multi-Document Summarization via Phrase Selection and Merging

no code implementations IJCNLP 2015 Lidong Bing, Piji Li, Yi Liao, Wai Lam, Weiwei Guo, Rebecca J. Passonneau

We propose an abstraction-based multi-document summarization framework that can construct new sentences by exploring more fine-grained syntactic units than sentences, namely noun/verb phrases (a phrase-extraction sketch follows this entry).

Document Summarization, Multi-Document Summarization +1
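
The fine-grained units the entry above refers to are noun and verb phrases. As a loose illustration of the kind of preprocessing such a framework builds on, the sketch below pulls noun phrases out of candidate sentences with spaCy; it is not the authors' phrase selection and merging pipeline.

```python
import spacy

# Requires: pip install spacy && python -m spacy download en_core_web_sm
nlp = spacy.load("en_core_web_sm")

sentences = [
    "The committee approved the new summarization framework on Monday.",
    "The framework constructs new sentences from noun and verb phrases.",
]

for sent in sentences:
    doc = nlp(sent)
    print([chunk.text for chunk in doc.noun_chunks])   # noun phrases in this sentence

# A phrase-merging step would then recombine phrases drawn from different
# source sentences into new, grammatical summary sentences.
```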
