no code implementations • CCL 2020 • Dinghe Xiao, Nannan Wang, Jiangang Yu, Chunhong Zhang, Jiaqi Wu
We develop two pipelines of processing methods, one for semi-structured data and one for unstructured data.
no code implementations • 20 Oct 2023 • Jiarun Liu, Wentao Hu, Chunhong Zhang
Large Language Models (LLMs) have emerged as promising agents for web navigation tasks, interpreting objectives and interacting with web pages.
1 code implementation • 3 Sep 2023 • Moyu Zhang, Xinning Zhu, Chunhong Zhang, Feng Pan, Wenchen Qian, Hui Zhao
The Knowledge Tracing (KT) task plays a crucial role in personalized learning; its purpose is to predict students' responses based on their historical practice behavior sequences.
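The KT setup described above can be illustrated with a deliberately simple baseline — a running per-concept accuracy with smoothing. This is a sketch of the task's input/output shape only, not the authors' model; all names are illustrative.

```python
def predict_next_correct(history, next_concept, prior=0.5):
    """Minimal knowledge-tracing baseline.

    history: list of (concept_id, correct) pairs, correct in {0, 1},
             i.e. the student's historical practice behavior sequence.
    Returns the estimated probability of a correct response on the
    next question about `next_concept`; falls back to `prior` for
    concepts the student has never practiced.
    """
    attempts = [c for concept, c in history if concept == next_concept]
    if not attempts:
        return prior
    # Laplace smoothing: one wrong answer should not drive the estimate to 0
    return (sum(attempts) + 1) / (len(attempts) + 2)


history = [("algebra", 1), ("algebra", 0), ("geometry", 1), ("algebra", 1)]
print(predict_next_correct(history, "algebra"))   # 2 of 3 correct -> 0.6
print(predict_next_correct(history, "calculus"))  # unseen concept -> 0.5
```

Real KT models replace this counting rule with a learned sequence model over the same (question, response) history, but the prediction target is identical.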
1 code implementation • 7 Aug 2023 • Moyu Zhang, Xinning Zhu, Chunhong Zhang, Feng Pan, Wenchen Qian, Hui Zhao
Knowledge tracing (KT) aims to predict students' responses to practices based on their historical question-answering behaviors.
1 code implementation • 7 Aug 2023 • Moyu Zhang, Xinning Zhu, Chunhong Zhang, Wenchen Qian, Feng Pan, Hui Zhao
Since students' mastery of knowledge concepts is often unlabeled, existing KT methods rely on the implicit paradigm of inferring knowledge-concept mastery from historical practice and then predicting students' responses from that mastery, to address the challenge of unlabeled concept mastery.
no code implementations • 8 May 2023 • Zeju Li, Linya Cheng, Chunhong Zhang, Xinning Zhu, Hui Zhao
The field of education has undergone a significant transformation due to the rapid advancements in Artificial Intelligence (AI).
1 code implementation • 12 May 2022 • Changhong Yu, Chunhong Zhang, Qi Sun
The goal of building intelligent dialogue systems has largely been pursued separately under two motives: task-oriented dialogue (TOD) systems and open-domain chit-chat (CC) systems.
1 code implementation • 10 Aug 2021 • Moyu Zhang, Xinning Zhu, Chunhong Zhang, Yang Ji, Feng Pan, Changchuan Yin
In this paper, we propose the Multi-Factors Aware Dual-Attentional model (MF-DAKT), which enriches question representations and utilizes multiple factors to model students' learning progress based on a dual-attentional mechanism.
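MF-DAKT's dual-attentional mechanism itself is not reproduced here, but the core attention operation such models build on can be sketched as generic scaled dot-product attention: past interactions are weighted by their relevance to the current question. All names below are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def attention(query, keys, values):
    """Scaled dot-product attention over a set of key/value pairs.

    query: shape (d,)   -- e.g. the current question representation
    keys:  shape (n, d) -- e.g. representations of past interactions
    values: shape (n, d_v)
    Returns a convex combination of the values, weighted by how well
    each key matches the query.
    """
    scores = keys @ query / np.sqrt(query.shape[0])
    weights = np.exp(scores - scores.max())  # numerically stable softmax
    weights = weights / weights.sum()
    return weights @ values


query = np.array([1.0, 0.0])
keys = np.array([[1.0, 0.0], [0.0, 1.0]])
values = np.array([[1.0, 0.0], [0.0, 1.0]])
out = attention(query, keys, values)  # first value weighted more heavily
```

A "dual" attentional design, as the abstract suggests, applies such attention along more than one view of the input (e.g. question-level and factor-level) and combines the results.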