Search Results for author: Yiyun Zhao

Found 10 papers, 4 papers with code

Inferring missing metadata from environmental policy texts

no code implementations WS 2019 Steven Bethard, Egoitz Laparra, Sophia Wang, Yiyun Zhao, Ragheb Al-Ghezi, Aaron Lien, Laura López-Hoffman

The National Environmental Policy Act (NEPA) provides a trove of data on how environmental policy decisions have been made in the United States over the last 50 years.

Pyramid Real Image Denoising Network

4 code implementations 1 Aug 2019 Yiyun Zhao, Zhuqing Jiang, Aidong Men, Guodong Ju

Second, at the multi-scale denoising stage, pyramid pooling is utilized to extract multi-scale features.

Image Denoising · Noise Estimation
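The snippet above mentions pyramid pooling for multi-scale feature extraction. As a rough illustration of the general idea (not the paper's implementation), here is a minimal numpy sketch: the feature map is average-pooled at several window sizes, upsampled back to the input resolution, and stacked with the original. Function names and the scale set (1, 2, 4) are illustrative assumptions.

```python
import numpy as np

def avg_pool(x, k):
    """Average-pool a (H, W) map over non-overlapping k x k windows."""
    h, w = x.shape
    return x[:h - h % k, :w - w % k].reshape(h // k, k, w // k, k).mean(axis=(1, 3))

def upsample(x, shape):
    """Nearest-neighbour upsample back to the target (H, W)."""
    h, w = shape
    rows = np.linspace(0, x.shape[0], h, endpoint=False).astype(int)
    cols = np.linspace(0, x.shape[1], w, endpoint=False).astype(int)
    return x[np.ix_(rows, cols)]

def pyramid_pool(feat, scales=(1, 2, 4)):
    """Stack the input with pooled-and-upsampled copies at each coarser scale."""
    maps = [feat] + [upsample(avg_pool(feat, s), feat.shape) for s in scales if s > 1]
    return np.stack(maps)  # (num_scales, H, W) multi-scale feature stack
```

Each stacked channel summarizes the input at a different receptive field, which is what lets a subsequent denoising stage reason about both local texture and broader context.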

How does BERT's attention change when you fine-tune? An analysis methodology and a case study in negation scope

no code implementations ACL 2020 Yiyun Zhao, Steven Bethard

We apply this methodology to test BERT and RoBERTa on a hypothesis that some attention heads will consistently attend from a word in negation scope to the negation cue.

Negation

The University of Arizona at SemEval-2021 Task 10: Applying Self-training, Active Learning and Data Augmentation to Source-free Domain Adaptation

no code implementations SEMEVAL 2021 Xin Su, Yiyun Zhao, Steven Bethard

This paper describes our systems for negation detection and time expression recognition in SemEval 2021 Task 10, Source-Free Domain Adaptation for Semantic Processing.

Active Learning · Data Augmentation +3

Importance of Synthesizing High-quality Data for Text-to-SQL Parsing

no code implementations 17 Dec 2022 Yiyun Zhao, Jiarong Jiang, Yiqun Hu, Wuwei Lan, Henry Zhu, Anuj Chauhan, Alexander Li, Lin Pan, Jun Wang, Chung-Wei Hang, Sheng Zhang, Marvin Dong, Joe Lilien, Patrick Ng, Zhiguo Wang, Vittorio Castelli, Bing Xiang

In this paper, we first examined the existing synthesized datasets and discovered that state-of-the-art text-to-SQL algorithms did not further improve on popular benchmarks when trained with augmented synthetic data.

SQL Parsing · SQL-to-Text +2

A Comparison of Strategies for Source-Free Domain Adaptation

1 code implementation ACL 2022 Xin Su, Yiyun Zhao, Steven Bethard

Data sharing restrictions are common in NLP, especially in the clinical domain, but there is limited research on adapting models to new domains without access to the original training data, a setting known as source-free domain adaptation.

Active Learning · Data Augmentation +1

Do pretrained transformers infer telicity like humans?

no code implementations CoNLL (EMNLP) 2021 Yiyun Zhao, Jian Gang Ngui, Lucy Hall Hartley, Steven Bethard

Pretrained transformer-based language models achieve state-of-the-art performance in many NLP tasks, but it is an open question whether the knowledge acquired by the models during pretraining resembles the linguistic knowledge of humans.

Open-Ended Question Answering
