Search Results for author: Shih-ting Lin

Found 7 papers, 3 papers with code

LG-LSQ: Learned Gradient Linear Symmetric Quantization

no code implementations 18 Feb 2022 Shih-ting Lin, Zhaofang Li, Yu-Hsiang Cheng, Hao-Wen Kuo, Chih-Cheng Lu, Kea-Tiong Tang

The main challenge associated with the quantization algorithm is maintaining accuracy at low bit-widths.

Quantization
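The basic operation behind entries like this can be sketched in a few lines. The following is a generic linear symmetric quantizer, not the paper's LG-LSQ method (which additionally learns the gradient of the scale factor); the function names and the 4-bit example are illustrative assumptions.

```python
import numpy as np

def symmetric_quantize(x, bits=8):
    """Linearly quantize x to signed `bits`-bit integers with a symmetric range."""
    qmax = 2 ** (bits - 1) - 1          # e.g. 127 for 8 bits, 7 for 4 bits
    scale = np.max(np.abs(x)) / qmax    # one scale shared by the whole tensor
    q = np.clip(np.round(x / scale), -qmax, qmax)
    return q.astype(np.int32), scale

def dequantize(q, scale):
    """Map integer codes back to approximate real values."""
    return q * scale

x = np.array([-1.0, -0.3, 0.0, 0.3, 1.0])
q, scale = symmetric_quantize(x, bits=4)   # q → [-7, -2, 0, 2, 7]
x_hat = dequantize(q, scale)               # reconstruction error grows as bits shrink
```

The gap between `x` and `x_hat` is the accuracy loss the abstract refers to; at low bit-widths the integer grid is coarse, which is why learned scales (as in LG-LSQ) help.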

Conditional Generation of Temporally-ordered Event Sequences

no code implementations ACL 2021 Shih-ting Lin, Nathanael Chambers, Greg Durrett

We propose a single model that addresses both temporal ordering, sorting given events into the order they occurred, and event infilling, predicting new events which fit into an existing temporally-ordered sequence.

Denoising Story Completion

ReadOnce Transformers: Reusable Representations of Text for Transformers

no code implementations ACL 2021 Shih-ting Lin, Ashish Sabharwal, Tushar Khot

We present ReadOnce Transformers, an approach to convert a transformer-based model into one that can build an information-capturing, task-independent, and compressed representation of text.

Document Summarization

Effective Distant Supervision for Temporal Relation Extraction

2 code implementations EACL (AdaptNLP) 2021 Xinyu Zhao, Shih-ting Lin, Greg Durrett

A principal barrier to training temporal relation extraction models in new domains is the lack of varied, high quality examples and the challenge of collecting more.

Temporal Relation Extraction

Tradeoffs in Sentence Selection Techniques for Open-Domain Question Answering

no code implementations 18 Sep 2020 Shih-ting Lin, Greg Durrett

Current methods in open-domain question answering (QA) usually employ a pipeline of first retrieving relevant documents, then applying strong reading comprehension (RC) models to that retrieved text.

Open-Domain Question Answering, Reading Comprehension +1
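The retrieve-then-read pipeline described above can be sketched minimally. This toy version scores passages by token overlap; it stands in for the real retriever and RC model, and all names here are illustrative, not from the paper.

```python
def retrieve(question, docs, k=2):
    """Rank documents by token overlap with the question (stand-in for a real retriever)."""
    q_tokens = set(question.lower().split())
    scored = sorted(docs, key=lambda d: len(q_tokens & set(d.lower().split())), reverse=True)
    return scored[:k]

def read(question, passages):
    """Toy reader: return the best-matching passage.
    A real RC model would extract an answer span instead."""
    q_tokens = set(question.lower().split())
    return max(passages, key=lambda p: len(q_tokens & set(p.lower().split())))

docs = ["Paris is the capital of France",
        "The Nile is a river in Africa",
        "France borders Spain"]
question = "what is the capital of France"
answer_passage = read(question, retrieve(question, docs, k=2))
```

The paper's tradeoff question lives in the `k` parameter and the retriever's quality: passing more sentences to the reader raises recall but also cost and noise.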

Multi-hop Question Answering via Reasoning Chains

3 code implementations 7 Oct 2019 Jifan Chen, Shih-ting Lin, Greg Durrett

Our analysis shows the properties of chains that are crucial for high performance: in particular, modeling extraction sequentially is important, as is dealing with each candidate sentence in a context-aware way.

Coreference Resolution, Multi-hop Question Answering +3
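The two properties the abstract highlights, sequential extraction and context-aware scoring of each candidate sentence, can be illustrated with a greedy sketch. This is not the paper's model (which uses learned extractors), just a minimal word-overlap approximation with made-up names.

```python
def extract_chain(question, sentences, steps=2):
    """Greedily build a reasoning chain: each step scores the remaining sentences
    against the question PLUS the sentences already chosen, so a bridging
    sentence that shares no words with the question can still be picked later."""
    chain = []
    context = set(question.lower().split())
    remaining = list(sentences)
    for _ in range(steps):
        if not remaining:
            break
        best = max(remaining, key=lambda s: len(context & set(s.lower().split())))
        chain.append(best)
        context |= set(best.lower().split())   # sequential: the chain grows the context
        remaining.remove(best)
    return chain

question = "where was the author of hamlet born"
sents = ["Hamlet was written by William Shakespeare",
         "William Shakespeare was born in Stratford",
         "The sky is blue"]
chain = extract_chain(question, sents, steps=2)
```

Here the second hop only scores highly once "William Shakespeare" is in the context, which is exactly why sequential, context-aware extraction matters for multi-hop questions.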

DATC RDF: An Open Design Flow from Logic Synthesis to Detailed Routing

2 code implementations 2 Oct 2018 Jinwook Jung, Iris Hui-Ru Jiang, Jianli Chen, Shih-ting Lin, Yih-Lang Li, Victor N. Kravets, Gi-Joon Nam

In this paper, we present DATC Robust Design Flow (RDF) from logic synthesis to detailed routing.

Other Computer Science
