Search Results for author: Chris Tar

Found 7 papers, 3 papers with code

FreshLLMs: Refreshing Large Language Models with Search Engine Augmentation

1 code implementation • 5 Oct 2023 • Tu Vu, Mohit Iyyer, Xuezhi Wang, Noah Constant, Jerry Wei, Jason Wei, Chris Tar, Yun-Hsuan Sung, Denny Zhou, Quoc Le, Thang Luong

Specifically, we introduce FreshQA, a novel dynamic QA benchmark encompassing a diverse range of question and answer types, including questions that require fast-changing world knowledge as well as questions with false premises that need to be debunked.

Hallucination • World Knowledge
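
Below is a minimal sketch of the search-engine-augmentation idea behind FreshLLMs: retrieve current evidence for a question and condition the model's answer on it. The functions `search_web` and `ask_llm` and the prompt format are illustrative placeholders, not the paper's FreshPrompt implementation.

```python
# Hedged sketch of search-augmented question answering.
# `search_web` and `ask_llm` are placeholders for a real search API and LLM client.

def search_web(query: str, k: int = 5) -> list[dict]:
    """Return up to k search results as {'title', 'snippet', 'date'} dicts."""
    raise NotImplementedError("plug in a real search API here")

def ask_llm(prompt: str) -> str:
    """Send the prompt to an LLM and return its answer."""
    raise NotImplementedError("plug in a real LLM client here")

def answer_fresh_question(question: str) -> str:
    # Retrieve current evidence so the model is not limited to stale
    # pretraining knowledge; also instruct it to flag false premises.
    results = search_web(question)
    evidence = "\n".join(
        f"- {r['title']} ({r['date']}): {r['snippet']}" for r in results
    )
    prompt = (
        "Answer the question using the search results below. "
        "If the question rests on a false premise, say so.\n\n"
        f"Search results:\n{evidence}\n\nQuestion: {question}\nAnswer:"
    )
    return ask_llm(prompt)
```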

Predicting Annotation Difficulty to Improve Task Routing and Model Performance for Biomedical Information Extraction

no code implementations • NAACL 2019 • Yinfei Yang, Oshin Agarwal, Chris Tar, Byron C. Wallace, Ani Nenkova

Experiments on a complex biomedical information extraction task using expert and lay annotators show that: (i) simply excluding from the training data instances predicted to be difficult yields a small boost in performance; (ii) using difficulty scores to weight instances during training provides further, consistent gains; (iii) assigning instances predicted to be difficult to domain experts is an effective strategy for task routing.
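Strategy (ii) amounts to weighting training instances by predicted annotation difficulty. A minimal sketch of that idea follows, using synthetic data and scikit-learn's `sample_weight` argument; the features and the 1 − difficulty weighting scheme are illustrative assumptions, not the paper's exact setup.

```python
# Hedged sketch of difficulty-aware training: down-weight instances
# predicted to be hard to annotate. Data and weighting scheme are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))          # stand-in features
y = (X[:, 0] > 0).astype(int)           # stand-in labels
difficulty = rng.uniform(size=200)      # predicted annotation difficulty in [0, 1]

# Easier instances (lower predicted difficulty) receive larger weights.
weights = 1.0 - difficulty

clf = LogisticRegression().fit(X, y, sample_weight=weights)
print(clf.score(X, y))
```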

Universal Sentence Encoder

23 code implementations • 29 Mar 2018 • Daniel Cer, Yinfei Yang, Sheng-yi Kong, Nan Hua, Nicole Limtiaco, Rhomni St. John, Noah Constant, Mario Guajardo-Cespedes, Steve Yuan, Chris Tar, Yun-Hsuan Sung, Brian Strope, Ray Kurzweil

For both variants, we investigate and report the relationship between model complexity, resource consumption, the availability of transfer task training data, and task performance.

Conversational Response Selection • Semantic Textual Similarity • +7
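
As a usage sketch, the released Universal Sentence Encoder models can be loaded from TensorFlow Hub and applied directly to raw sentences. This assumes the `tensorflow` and `tensorflow_hub` packages and the published TF Hub handle for the v4 module.

```python
# Load the Universal Sentence Encoder from TensorFlow Hub and embed sentences.
# Each sentence is mapped to a 512-dimensional embedding vector.
import tensorflow_hub as hub

embed = hub.load("https://tfhub.dev/google/universal-sentence-encoder/4")
sentences = [
    "The quick brown fox jumps over the lazy dog.",
    "Sentence embeddings can be reused across transfer tasks.",
]
embeddings = embed(sentences)   # shape: (2, 512)
print(embeddings.shape)
```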

A Growing Long-term Episodic & Semantic Memory

no code implementations • 20 Oct 2016 • Marc Pickett, Rami Al-Rfou, Louis Shao, Chris Tar

The long-term memory of most connectionist systems lies entirely in the weights of the system.

Transfer Learning
