no code implementations • 8 Oct 2024 • Yi Liang, You Wu, Honglei Zhuang, Li Chen, Jiaming Shen, Yiling Jia, Zhen Qin, Sumit Sanghai, Xuanhui Wang, Carl Yang, Michael Bendersky
To overcome the scarcity of training data for these intermediate steps, we leverage LLMs to generate synthetic intermediate writing data, such as outlines, key information, and summaries, from existing full articles.
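The data-construction idea described above can be sketched as a small pipeline. This is a minimal, hypothetical illustration, not the paper's implementation: `fake_llm` is a stand-in for a real LLM call, and the prompts and field names are assumptions.

```python
# Sketch of synthesizing intermediate-step training data from full articles.
# `fake_llm` is a hypothetical stand-in for a real LLM call; here a crude
# heuristic (the article's first sentence) keeps the pipeline runnable.

def fake_llm(prompt: str, article: str) -> str:
    # A real system would send `prompt` and `article` to an LLM.
    return article.split(". ")[0] + "."

def build_intermediate_data(articles):
    """For each full article, synthesize intermediate writing artifacts
    (outline, key information, summary) to supervise intermediate steps."""
    examples = []
    for article in articles:
        examples.append({
            "article": article,
            "outline": fake_llm("Write an outline for this article:", article),
            "key_info": fake_llm("List the key information:", article),
            "summary": fake_llm("Summarize this article:", article),
        })
    return examples

corpus = ["Long-form writing is hard. Models benefit from planning first."]
data = build_intermediate_data(corpus)
print(data[0]["summary"])  # prints "Long-form writing is hard."
```

Each (intermediate artifact, full article) pair could then serve as supervision for an intermediate writing step, sidestepping the lack of human-annotated outlines and summaries.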
1 code implementation • 8 Oct 2023 • Lu Yin, You Wu, Zhenyu Zhang, Cheng-Yu Hsieh, Yaqing Wang, Yiling Jia, Gen Li, Ajay Jaiswal, Mykola Pechenizkiy, Yi Liang, Michael Bendersky, Zhangyang Wang, Shiwei Liu
Large Language Models (LLMs), renowned for their remarkable performance across diverse domains, pose a challenge for practical deployment due to their colossal model size.
no code implementations • 13 Jun 2022 • Yiling Jia, Hongning Wang
Deep neural networks (DNNs) offer significant advantages for improving ranking performance in retrieval tasks.
no code implementations • ICLR 2022 • Yiling Jia, Weitong Zhang, Dongruo Zhou, Quanquan Gu, Hongning Wang
Thanks to the power of representation learning, neural contextual bandit algorithms demonstrate remarkable performance improvements over their classical counterparts.
no code implementations • 17 Jan 2022 • Yiling Jia, Hongning Wang
Existing online learning to rank (OL2R) solutions are limited to linear models, which cannot capture possible non-linear relations between queries and documents.
no code implementations • 1 Nov 2021 • Yiling Jia, Hongning Wang
Online learning to rank (OL2R) has attracted considerable research interest in recent years, thanks to its advantage of avoiding the expensive relevance labeling required in offline supervised ranking model learning.
1 code implementation • 28 Feb 2021 • Yiling Jia, Huazheng Wang, Stephen Guo, Hongning Wang
Online Learning to Rank (OL2R) eliminates the need for explicit relevance annotation by directly optimizing rankers from their interactions with users.
1 code implementation • 2 Sep 2019 • Yiling Jia, Nipun Batra, Hongning Wang, Kamin Whitehouse
However, very few homes worldwide have installed sub-meters (sensors that measure individual appliance energy use), and the cost of retrofitting a home with extensive sub-metering eats into the funds available for energy-saving retrofits.
no code implementations • 3 Jun 2019 • Yiyi Tao, Yiling Jia, Nan Wang, Hongning Wang
In this work, we integrate regression trees to guide the learning of latent factor models for recommendation, and use the learned tree structure to explain the resulting latent factors.
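The tree-guided idea can be illustrated with a deliberately tiny sketch. This is not the paper's algorithm: a single-split regression "stump" on one assumed user feature (age) partitions users, each leaf's mean rating serves as an interpretable prior for that group's latent factor, and the split condition itself becomes the explanation.

```python
# Minimal, illustrative sketch (not the paper's method): a one-split
# regression "tree" partitions users on a feature, and each leaf's mean
# rating acts as an interpretable prior for that group's latent factor.

def fit_stump(features, ratings, threshold):
    # Partition users by one feature split; record each leaf's mean rating.
    left = [r for f, r in zip(features, ratings) if f <= threshold]
    right = [r for f, r in zip(features, ratings) if f > threshold]
    return {
        "threshold": threshold,
        "left_prior": sum(left) / len(left),
        "right_prior": sum(right) / len(right),
    }

def explain_user(tree, feature):
    # The tree path doubles as the explanation for the latent prior.
    if feature <= tree["threshold"]:
        return tree["left_prior"], f"age <= {tree['threshold']}"
    return tree["right_prior"], f"age > {tree['threshold']}"

ages = [18, 22, 45, 60]
avg_ratings = [4.0, 4.2, 3.0, 2.8]
tree = fit_stump(ages, avg_ratings, threshold=30)
prior, reason = explain_user(tree, feature=25)
print(prior, reason)  # prior ~4.1, explained by the path "age <= 30"
```

In a fuller model, such leaf-level priors would regularize the latent factors learned by matrix factorization, so each factor can be traced back to a human-readable tree path.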
1 code implementation • 10 Jun 2018 • Nan Wang, Hongning Wang, Yiling Jia, Yue Yin
Explaining automatically generated recommendations allows users to make more informed and accurate decisions about which results to utilize, and therefore improves their satisfaction.