no code implementations • 12 Sep 2020 • Song Cheng, Lei Wang, Pan Zhang
Tensor networks, models that originated in quantum physics, have in recent years been gradually generalized into efficient models for machine learning.
no code implementations • 14 Jan 2020 • Song Cheng, Qi Liu, Enhong Chen
We refer to this problem as domain adaptation for knowledge tracing (DAKT), which contains two aspects: (1) how to achieve good knowledge tracing performance in each domain.
1 code implementation • 27 May 2019 • Song Cheng, Qi Liu
However, traditional IRT ignores the rich information in question texts, cannot diagnose knowledge concept proficiency, and yields inaccurate parameter estimates for questions that appear only a few times.
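To make the criticism concrete, here is a minimal sketch of the classical two-parameter logistic (2PL) IRT model being referred to; the function name and parameter values are illustrative, not from the paper. The difficulty `b` and discrimination `a` are exactly the per-question parameters that are hard to estimate when a question appears only a few times.

```python
import math

def irt_2pl(theta, a, b):
    """Two-parameter logistic IRT model.

    Probability that a student with ability `theta` answers correctly
    a question with discrimination `a` and difficulty `b`.
    """
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# A student whose ability equals the question's difficulty has a 50% chance.
p = irt_2pl(theta=0.0, a=1.0, b=0.0)  # 0.5
```

Note that the model sees only these scalar parameters per question: the question text itself never enters, which is the gap the paper targets.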
1 code implementation • 11 Apr 2019 • Ze-Feng Gao, Song Cheng, Rong-Qiang He, Z. Y. Xie, Hui-Hai Zhao, Zhong-Yi Lu, Tao Xiang
A deep neural network is a parametrization of a multilayer mapping of signals in terms of many alternately arranged linear and nonlinear transformations.
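The "alternating linear and nonlinear transformations" view can be sketched in a few lines; the layer sizes and ReLU nonlinearity below are illustrative choices, not details from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

# Illustrative layer widths: input 4, two hidden layers of 8, output 2.
sizes = [4, 8, 8, 2]
weights = [rng.standard_normal((m, n)) * 0.1
           for n, m in zip(sizes[:-1], sizes[1:])]
biases = [np.zeros(m) for m in sizes[1:]]

def forward(x):
    """Alternate linear maps (W @ x + b) with nonlinearities (ReLU)."""
    for i, (W, b) in enumerate(zip(weights, biases)):
        x = W @ x + b              # linear transformation
        if i < len(weights) - 1:
            x = relu(x)            # nonlinear transformation
    return x

y = forward(rng.standard_normal(4))  # output vector of length 2
```

It is precisely the weight matrices `W` in this parametrization that tensor-network factorizations (such as the matrix product operators of this paper) aim to compress.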
no code implementations • 8 Jan 2019 • Song Cheng, Lei Wang, Tao Xiang, Pan Zhang
Matrix product states (MPS), a tensor network designed for one-dimensional quantum systems, have recently been proposed for generative modeling of natural data (such as images) in terms of a `Born machine'.
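The Born machine construction can be sketched as follows: the MPS contraction gives an amplitude psi(x) for each configuration x, and the model probability is p(x) = |psi(x)|^2 / Z. The chain length, physical dimension, and bond dimension below are tiny illustrative values, and Z is summed by brute force only because the chain is small.

```python
from itertools import product
import numpy as np

rng = np.random.default_rng(1)
N, d, D = 4, 2, 3  # sites, physical dim (binary pixels), bond dim

# Random MPS tensors A[i] of shape (D_left, d, D_right); boundary dims are 1.
dims = [1] + [D] * (N - 1) + [1]
mps = [rng.standard_normal((dims[i], d, dims[i + 1])) for i in range(N)]

def amplitude(bits):
    """Contract the MPS along the chain to get psi(x) for configuration x."""
    v = np.ones((1,))
    for A, s in zip(mps, bits):
        v = v @ A[:, s, :]   # select physical index s, multiply bond matrices
    return v.item()

def normalization():
    """Z = sum over all configurations of |psi(x)|^2 (brute force, tiny N)."""
    return sum(amplitude(x) ** 2 for x in product(range(d), repeat=N))

def born_probability(bits):
    """Born rule: p(x) = |psi(x)|^2 / Z."""
    return amplitude(bits) ** 2 / normalization()
```

In practice Z is computed efficiently by contracting the MPS with itself rather than by enumeration; the brute-force sum here is only to keep the sketch self-contained.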
no code implementations • 12 Dec 2017 • Song Cheng, Jing Chen, Lei Wang
We compare and contrast the statistical physics and quantum physics inspired approaches for unsupervised generative modeling of classical data.
1 code implementation • 17 Jan 2017 • Jing Chen, Song Cheng, Haidong Xie, Lei Wang, Tao Xiang
Conversely, we give necessary and sufficient conditions for determining whether a TNS can be transformed into an RBM of a given architecture.