no code implementations • 13 Dec 2023 • Jinta Weng, Jiarui Zhang, Yue Hu, Daidong Fa, Xiaofeng Xu, Heyan Huang
When interacting with large language models, embedding more task-related information into prompts makes it easier to elicit the knowledge stored in those models.
no code implementations • 8 Nov 2022 • Jinta Weng, Yifan Deng, Donghao Li, Hao You, Yue Hu, Heyan Huang
The prompt has become an effective linguistic tool for utilizing pre-trained language models.
no code implementations • 29 Oct 2022 • Jinta Weng, Yue Hu, Jing Qiu, Heyan Huang
The effectiveness of prompt learning has been demonstrated across different pre-trained language models.
no code implementations • 18 Oct 2020 • Jinta Weng, Ying Gao, Jing Qiu, Guozhu Ding, Huanqin Zheng
By combining a crowdsourced knowledge graph with a teaching system, this work studies methods for generating knowledge graphs and their applications.