no code implementations • 20 Aug 2023 • Kai Sun, Yifan Ethan Xu, Hanwen Zha, Yue Liu, Xin Luna Dong
Since the recent prosperity of Large Language Models (LLMs), there have been interleaved discussions about how to reduce hallucinations in LLM responses, how to increase the factuality of LLMs, and whether Knowledge Graphs (KGs), which store world knowledge in a symbolic form, will be replaced by LLMs.
no code implementations • 13 Jun 2023 • Xiao Yang, Ahmed K. Mohamed, Shashank Jain, Stanislav Peshterliev, Debojeet Chatterjee, Hanwen Zha, Nikita Bhalla, Gagan Aneja, Pranab Mohanty
Importantly, LEDO is computationally efficient compared to methods that require changing the loss function, and cost-effective because the resulting data can be used in the same continuous training pipeline for production.
1 code implementation • 12 Mar 2021 • Hanwen Zha, Zhiyu Chen, Xifeng Yan
Relation prediction in knowledge graphs is dominated by embedding-based methods, which mainly focus on the transductive setting.
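To illustrate what "embedding-based relation prediction" means here, below is a minimal sketch using a TransE-style score, where a triple (head, relation, tail) is plausible when head + relation ≈ tail. The 2-d vectors are hand-picked toy values, not learned embeddings, and the entity/relation names are illustrative assumptions, not from the paper.

```python
import math

# Toy entity and relation embeddings (hand-picked, not learned).
entities = {
    "paris":  (1.0, 0.0),
    "france": (1.0, 1.0),
    "tokyo":  (0.0, 0.0),
    "japan":  (0.0, 1.0),
}
relations = {"capital_of": (0.0, 1.0)}

def transe_score(head, rel, tail):
    """Negative Euclidean distance ||h + r - t||; larger means more plausible."""
    h, r, t = entities[head], relations[rel], entities[tail]
    return -math.dist([hi + ri for hi, ri in zip(h, r)], t)

# Rank candidate tails for the query (paris, capital_of, ?).
best = max(entities, key=lambda t: transe_score("paris", "capital_of", t))
print(best)  # → france
```

This is transductive in the sense the snippet describes: scores only exist for entities seen at training time, which is the limitation inductive approaches try to lift.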
1 code implementation • Findings of the Association for Computational Linguistics 2020 • Zhiyu Chen, Wenhu Chen, Hanwen Zha, Xiyou Zhou, Yunkai Zhang, Sairam Sundaresan, William Yang Wang
Given only the table, it is hard for existing models to produce controllable, high-fidelity logical generations.
2 code implementations • Findings of the Association for Computational Linguistics 2020 • Wenhu Chen, Hanwen Zha, Zhiyu Chen, Wenhan Xiong, Hong Wang, William Wang
3) a hybrid model that combines heterogeneous information to find the answer.
Ranked #4 on Question Answering on HybridQA
1 code implementation • ACL 2019 • Zhiyu Chen, Hanwen Zha, Honglei Liu, Wenhu Chen, Xifeng Yan, Yu Su
Pre-trained embeddings such as word embeddings and sentence embeddings are fundamental tools facilitating a wide range of downstream NLP tasks.