Search Results for author: Yuanyuan Wu

Found 7 papers, 3 papers with code

Hierarchical Skip Decoding for Efficient Autoregressive Text Generation

no code implementations • 22 Mar 2024 • Yunqi Zhu, Xuebing Yang, Yuanyuan Wu, Wensheng Zhang

Autoregressive decoding is a commonly used strategy for text generation with pre-trained language models, and early exiting is an effective approach to speeding up the inference stage.

Text Generation
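
Below is a minimal, self-contained sketch of the early-exiting idea mentioned in the entry above (not the paper's Hierarchical Skip Decoding): a per-layer exit head checks its confidence after each layer and stops running further layers once a threshold is cleared. The layers, exit head, vocabulary, and threshold here are all hypothetical stand-ins.

```python
# Toy early-exit decoding step. Everything below is a placeholder for a real
# transformer: the point is only the control flow that skips remaining layers
# once a per-layer head is confident enough.
import math
import random

NUM_LAYERS = 6
VOCAB = ["the", "cat", "sat", "<eos>"]

def toy_layer(hidden, layer_idx):
    """Stand-in for one transformer layer: perturbs the hidden state."""
    return [h + 0.1 * layer_idx * random.random() for h in hidden]

def toy_exit_head(hidden):
    """Stand-in for a per-layer LM head: returns a softmax over VOCAB."""
    logits = [sum(hidden) * (i + 1) % 3.0 for i in range(len(VOCAB))]
    exps = [math.exp(l) for l in logits]
    z = sum(exps)
    return [e / z for e in exps]

def early_exit_step(hidden, threshold=0.6):
    """Run layers one by one and exit once the head is confident enough."""
    for layer_idx in range(NUM_LAYERS):
        hidden = toy_layer(hidden, layer_idx)
        probs = toy_exit_head(hidden)
        confidence = max(probs)
        if confidence >= threshold:        # early exit: skip remaining layers
            return VOCAB[probs.index(confidence)], layer_idx + 1
    return VOCAB[probs.index(max(probs))], NUM_LAYERS  # used all layers

token, layers_used = early_exit_step(hidden=[0.2, -0.1, 0.4])
print(f"emitted {token!r} after {layers_used} of {NUM_LAYERS} layers")
```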

Parameter-Efficient Fine-Tuning with Layer Pruning on Free-Text Sequence-to-Sequence Modeling

1 code implementation • 15 May 2023 • Yunqi Zhu, Xuebing Yang, Yuanyuan Wu, Wensheng Zhang

The increasing size of language models has raised great research interest in parameter-efficient fine-tuning methods such as LoRA, which freeze the pre-trained model and inject a small number of trainable parameters for multiple downstream tasks (e.g., summarization, question answering, and translation).

Dialogue Generation • Question Answering
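
As a rough illustration of the LoRA idea described in the entry above (not the paper's layer-pruning variant), the sketch below freezes a linear projection and adds a trainable low-rank update; it assumes PyTorch is installed, and the rank and scaling values are arbitrary.

```python
# Minimal LoRA-style adapter: the frozen weight W is augmented with a
# low-rank update B @ A, and only A and B receive gradients.
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    def __init__(self, in_features, out_features, rank=8, alpha=16.0):
        super().__init__()
        # Frozen pre-trained projection.
        self.base = nn.Linear(in_features, out_features, bias=False)
        self.base.weight.requires_grad_(False)
        # Small trainable low-rank factors.
        self.lora_A = nn.Parameter(torch.randn(rank, in_features) * 0.01)
        self.lora_B = nn.Parameter(torch.zeros(out_features, rank))
        self.scaling = alpha / rank

    def forward(self, x):
        # y = x W^T + scaling * x A^T B^T  (only the low-rank path is trainable)
        return self.base(x) + self.scaling * (x @ self.lora_A.T @ self.lora_B.T)

layer = LoRALinear(768, 768)
trainable = sum(p.numel() for p in layer.parameters() if p.requires_grad)
print(f"trainable parameters: {trainable}")  # far fewer than the frozen 768*768
```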

Leveraging Summary Guidance on Medical Report Summarization

no code implementations • 8 Feb 2023 • Yunqi Zhu, Xuebing Yang, Yuanyuan Wu, Wensheng Zhang

This study presents three deidentified large medical text datasets, named DISCHARGE, ECHO, and RADIOLOGY, which contain 50K, 16K, and 378K report-summary pairs derived from MIMIC-III, respectively.

16k • Abstractive Text Summarization • +1

Rapid Phase Ambiguity Elimination Methods for DOA Estimator via Hybrid Massive MIMO Receive Array

no code implementations • 27 Apr 2022 • Xichao Zhan, YiWen Chen, Feng Shu, Xin Cheng, Yuanyuan Wu, Qi Zhang, Yifang Li, Peng Zhang

In the proposed Max-RP-QI, a quadratic interpolation scheme is adopted to interpolate the three DOA values corresponding to the three largest receive powers of Max-RP.
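
The sketch below illustrates the general parabolic peak-refinement idea behind such a quadratic interpolation step, not the paper's exact Max-RP-QI algorithm: fit a parabola through the three candidate angles with the largest receive powers and take its vertex as the refined DOA estimate. The angles and powers in the example are hypothetical.

```python
# Generic parabolic (quadratic) peak interpolation over three grid samples.
import numpy as np

def refine_doa(angles_deg, powers):
    """angles_deg, powers: the three angles (degrees) with the largest
    receive powers, and those powers. Returns the parabola-vertex angle."""
    a = np.asarray(angles_deg, dtype=float)
    p = np.asarray(powers, dtype=float)
    # Fit p ~ c2*a^2 + c1*a + c0 exactly through the three samples.
    c2, c1, _ = np.polyfit(a, p, deg=2)
    if c2 >= 0:                      # no interior maximum; keep the best sample
        return a[np.argmax(p)]
    return -c1 / (2.0 * c2)          # vertex of the fitted parabola

# Hypothetical example: a coarse grid found its top peaks at 29, 30, 31 degrees.
print(refine_doa([29.0, 30.0, 31.0], [0.82, 0.95, 0.88]))
```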

Differentiable N-gram Objective on Abstractive Summarization

1 code implementation • 8 Feb 2022 • Yunqi Zhu, Xuebing Yang, Yuanyuan Wu, Mingjin Zhu, Wensheng Zhang

ROUGE is a standard automatic evaluation metric based on n-grams for sequence-to-sequence tasks, while cross-entropy loss is an essential objective of neural language models that optimizes only at the unigram level.

Abstractive Text Summarization • Language Modelling
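
For context, the sketch below computes plain ROUGE-N recall by clipped n-gram counting; this is the standard, non-differentiable metric referred to in the entry above, not the paper's differentiable objective, and it simply makes the n-gram versus unigram distinction concrete.

```python
# ROUGE-N recall via clipped n-gram overlap counts.
from collections import Counter

def ngrams(tokens, n):
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def rouge_n_recall(candidate, reference, n=2):
    """Fraction of reference n-grams that also appear in the candidate."""
    cand, ref = ngrams(candidate, n), ngrams(reference, n)
    if not ref:
        return 0.0
    overlap = sum(min(count, cand[gram]) for gram, count in ref.items())
    return overlap / sum(ref.values())

ref = "the cat sat on the mat".split()
cand = "the cat lay on the mat".split()
print(rouge_n_recall(cand, ref, n=1))  # unigram overlap
print(rouge_n_recall(cand, ref, n=2))  # bigram overlap is stricter
```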

Real-World Single Image Super-Resolution: A Brief Review

1 code implementation • 3 Mar 2021 • Honggang Chen, Xiaohai He, Linbo Qing, Yuanyuan Wu, Chao Ren, Ce Zhu

More specifically, this review covers the critical publicly available datasets and assessment metrics for RSISR, as well as four major categories of RSISR methods, namely degradation modeling-based, image pairs-based, domain translation-based, and self-learning-based RSISR.

Computational Efficiency • Image Super-Resolution • +2
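
As a toy illustration of the setup degradation modeling-based RSISR methods typically start from, the sketch below synthesizes a low-resolution image as a blurred, downsampled, noisy version of a high-resolution one. The kernel width, scale factor, and noise level are arbitrary illustrative choices, and NumPy/SciPy are assumed to be available.

```python
# Classical degradation model: LR = (HR convolved with a blur kernel),
# downsampled by a scale factor, plus additive noise.
import numpy as np
from scipy.ndimage import gaussian_filter

def degrade(hr, scale=4, blur_sigma=1.5, noise_std=0.01, seed=0):
    """Synthesize a low-resolution image from a high-resolution one."""
    rng = np.random.default_rng(seed)
    blurred = gaussian_filter(hr, sigma=blur_sigma)  # HR * blur kernel k
    lr = blurred[::scale, ::scale]                   # downsample by stride s
    lr = lr + rng.normal(0.0, noise_std, lr.shape)   # additive noise n
    return np.clip(lr, 0.0, 1.0)

hr_image = np.random.default_rng(1).random((128, 128))  # stand-in HR image
lr_image = degrade(hr_image)
print(hr_image.shape, "->", lr_image.shape)              # (128, 128) -> (32, 32)
```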

Long-Range Motion Trajectories Extraction of Articulated Human Using Mesh Evolution

no code implementations • 30 Jun 2015 • Yuanyuan Wu, Xiaohai He, Byeongkeun Kang, Haiying Song, Truong Q. Nguyen

This letter presents a novel approach to extracting reliable dense and long-range motion trajectories of an articulated human in a video sequence.

Motion Estimation
