no code implementations • 13 Apr 2022 • Yaojie Hu, Xingjian Shi, Qiang Zhou, Lee Pike
We introduce NSEdit (neural-symbolic edit), a novel Transformer-based code repair method.
1 code implementation • 4 Nov 2021 • Xingjian Shi, Jonas Mueller, Nick Erickson, Mu Li, Alexander J. Smola
We consider the use of automated supervised learning systems for data tables that contain not only numeric/categorical columns but also one or more text fields.
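One way to try this setting in practice is AutoGluon's TabularPredictor, which accepts tables whose columns mix numeric, categorical, and free-form text. The sketch below is a minimal, hedged example; the CSV paths and the "label" column name are placeholders, not details taken from the paper.

```python
# Minimal sketch: automated supervised learning on a table that mixes numeric,
# categorical, and text columns, using AutoGluon's TabularPredictor.
# File paths and the "label" column name are placeholders.
from autogluon.tabular import TabularDataset, TabularPredictor

train_data = TabularDataset("train.csv")   # e.g., columns: price, category, description, label
test_data = TabularDataset("test.csv")

predictor = TabularPredictor(label="label").fit(train_data)
predictions = predictor.predict(test_data)
print(predictor.leaderboard(test_data))
```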
no code implementations • EMNLP (sustainlp) 2021 • Haoyu He, Xingjian Shi, Jonas Mueller, Sheng Zha, Mu Li, George Karypis
We aim to identify how different components of the KD (knowledge distillation) pipeline, such as the data augmentation policy, the loss function, and the intermediate representation used to transfer knowledge from teacher to student, affect the resulting performance, and how much the optimal KD pipeline varies across datasets/tasks.
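As a concrete reference point for the loss-function component, a common choice mixes a temperature-softened KL term against the teacher with the ordinary cross-entropy on gold labels. The sketch below is a generic distillation loss in PyTorch, not the specific objective studied in the paper; the temperature and mixing weight are arbitrary.

```python
# Illustrative knowledge-distillation loss: KL divergence between temperature-softened
# teacher and student distributions, mixed with cross-entropy on the gold labels.
# Temperature T and weight alpha are arbitrary choices, not values from the paper.
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)                      # rescale so gradients match the hard-label term
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```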
2 code implementations • ICML Workshop AutoML 2021 • Xingjian Shi, Jonas Mueller, Nick Erickson, Mu Li, Alex Smola
We design automated supervised learning systems for data tables that contain not only numeric/categorical columns but also text fields.
no code implementations • 25 Sep 2019 • Mufei Li, Hao Zhang, Xingjian Shi, Minjie Wang, Yixing Guan, Zheng Zhang
Does attention matter and, if so, when and how?
4 code implementations • 9 Jul 2019 • Jian Guo, He He, Tong He, Leonard Lausen, Mu Li, Haibin Lin, Xingjian Shi, Chenguang Wang, Junyuan Xie, Sheng Zha, Aston Zhang, Hang Zhang, Zhi Zhang, Zhongyue Zhang, Shuai Zheng, Yi Zhu
We present GluonCV and GluonNLP, the deep learning toolkits for computer vision and natural language processing based on Apache MXNet (incubating).
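For flavor, the snippet below pulls pretrained models from each toolkit's model zoo; the model names are examples, and the exact APIs may differ across toolkit releases.

```python
# Load pretrained models from the GluonCV and GluonNLP model zoos.
# The specific model names are examples; check the docs of your installed release.
from gluoncv import model_zoo as cv_zoo
import gluonnlp as nlp

resnet = cv_zoo.get_model("resnet50_v1b", pretrained=True)       # image classification backbone
bert, vocab = nlp.model.get_model(
    "bert_12_768_12",
    dataset_name="book_corpus_wiki_en_uncased",
    pretrained=True,
)                                                                 # BERT-Base encoder + vocabulary
```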
no code implementations • 27 May 2019 • Jiani Zhang, Xingjian Shi, Shenglin Zhao, Irwin King
We propose STAcked and Reconstructed Graph Convolutional Networks (STAR-GCN), a new architecture that learns node representations to boost recommender-system performance, especially in the cold-start scenario.
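The masked-node reconstruction idea can be illustrated in a few lines. The block below is a toy, single-block sketch with a dense normalized adjacency matrix and a plain MSE reconstruction loss; it is not the paper's stacked encoder-decoder design.

```python
# Toy sketch of masked-node reconstruction: a GCN encoder embeds nodes whose input
# embeddings may be masked to zero, and a decoder learns to reconstruct the masked
# embeddings. Dense adjacency and a single block are simplifications for illustration.
import torch
import torch.nn as nn

class GCNLayer(nn.Module):
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, x, adj_norm):
        # adj_norm: symmetrically normalized adjacency with self-loops, shape (N, N)
        return torch.relu(self.lin(adj_norm @ x))

class MaskedReconBlock(nn.Module):
    def __init__(self, emb_dim, hid_dim):
        super().__init__()
        self.encoder = GCNLayer(emb_dim, hid_dim)
        self.decoder = nn.Linear(hid_dim, emb_dim)   # reconstructs the input embedding

    def forward(self, node_emb, adj_norm, mask):
        # mask: (N,) boolean, True for nodes whose input embedding is hidden
        x = node_emb.masked_fill(mask.unsqueeze(-1), 0.0)
        h = self.encoder(x, adj_norm)
        recon = self.decoder(h)
        recon_loss = ((recon[mask] - node_emb[mask]) ** 2).mean()  # assumes mask is non-empty
        return h, recon_loss
```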
no code implementations • 21 Aug 2018 • Xingjian Shi, Dit-yan Yeung
Forecasting the multi-step future of spatiotemporal systems from past observations, known as Spatiotemporal Sequence Forecasting (STSF), is a significant and challenging problem.
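In the usual formulation, STSF amounts to finding the most likely length-K future sequence given the preceding J observations; the symbols below are the conventional ones, not notation quoted from this survey.

```latex
\tilde{\mathcal{X}}_{t+1},\ldots,\tilde{\mathcal{X}}_{t+K}
  = \operatorname*{arg\,max}_{\mathcal{X}_{t+1},\ldots,\mathcal{X}_{t+K}}
    p\!\left(\mathcal{X}_{t+1},\ldots,\mathcal{X}_{t+K}
             \mid \mathcal{X}_{t-J+1},\ldots,\mathcal{X}_{t}\right)
```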
1 code implementation • 20 Mar 2018 • Jiani Zhang, Xingjian Shi, Junyuan Xie, Hao Ma, Irwin King, Dit-yan Yeung
We propose Gated Attention Networks (GaAN), a new network architecture for learning on graphs; a simplified sketch of a gated multi-head attention aggregator follows the ranking note below.
Ranked #1 on Node Property Prediction on ogbn-proteins
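The sketch below is a much-simplified gated multi-head attention aggregator in plain PyTorch: dense adjacency, a single layer, and a max-pooled neighborhood gate are illustrative simplifications rather than the paper's exact design, and self-loops are assumed so every node has at least one neighbor.

```python
# Simplified gated multi-head attention aggregator in the spirit of GaAN.
# Dense adjacency, one layer, and a max-pooled neighborhood gate are simplifications.
# Assumes adj includes self-loops (every node has at least one neighbor).
import torch
import torch.nn as nn

class GatedAttentionAggregator(nn.Module):
    def __init__(self, in_dim, head_dim, num_heads, out_dim):
        super().__init__()
        self.h, self.d = num_heads, head_dim
        self.q = nn.Linear(in_dim, num_heads * head_dim)
        self.k = nn.Linear(in_dim, num_heads * head_dim)
        self.v = nn.Linear(in_dim, num_heads * head_dim)
        self.gate = nn.Linear(2 * in_dim, num_heads)     # one scalar gate per head
        self.out = nn.Linear(in_dim + num_heads * head_dim, out_dim)

    def forward(self, x, adj):
        # x: (N, in_dim) node features; adj: (N, N) 0/1 adjacency with self-loops
        n, h, d = x.size(0), self.h, self.d
        q = self.q(x).view(n, h, d)
        k = self.k(x).view(n, h, d)
        v = self.v(x).view(n, h, d)
        logits = torch.einsum("ihd,jhd->hij", q, k) / d ** 0.5
        logits = logits.masked_fill(adj.unsqueeze(0) == 0, float("-inf"))
        attn = torch.softmax(logits, dim=-1)                      # (h, N, N)
        agg = torch.einsum("hij,jhd->ihd", attn, v)               # (N, h, d)
        # Gate each head using the center node plus a max-pooled neighborhood summary.
        neg_inf = torch.full_like(x, float("-inf")).unsqueeze(0)  # (1, N, in_dim)
        neigh_max = torch.where(adj.unsqueeze(-1) > 0, x.unsqueeze(0), neg_inf).max(dim=1).values
        gates = torch.sigmoid(self.gate(torch.cat([x, neigh_max], dim=-1)))  # (N, h)
        gated = (gates.unsqueeze(-1) * agg).reshape(n, h * d)
        return self.out(torch.cat([x, gated], dim=-1))
```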
no code implementations • ICCV 2017 • Feng Xiong, Xingjian Shi, Dit-yan Yeung
To exploit the temporal information in video sequences that purely image-based methods leave unused, we propose a variant of the convolutional LSTM (ConvLSTM) model for crowd counting.
4 code implementations • NeurIPS 2017 • Xingjian Shi, Zhihan Gao, Leonard Lausen, Hao Wang, Dit-yan Yeung, Wai-kin Wong, Wang-chun Woo
To address the location-invariance limitation of existing convolutional recurrent models and the lack of clear evaluation protocols, we propose both a new model and a benchmark for precipitation nowcasting.
Ranked #1 on Video Prediction on KTH (Cond metric)
1 code implementation • 24 Nov 2016 • Jiani Zhang, Xingjian Shi, Irwin King, Dit-yan Yeung
Knowledge Tracing (KT) is the task of tracing the evolving knowledge state of students with respect to one or more concepts as they engage in a sequence of learning activities.
no code implementations • NeurIPS 2016 • Hao Wang, Xingjian Shi, Dit-yan Yeung
To address this problem, we develop the collaborative recurrent autoencoder (CRAE), a denoising recurrent autoencoder (DRAE) that models the generation of content sequences in the collaborative filtering (CF) setting.
1 code implementation • NeurIPS 2016 • Hao Wang, Xingjian Shi, Dit-yan Yeung
Another shortcoming of neural networks (NNs) is the lack of flexibility to customize different distributions for the weights and neurons according to the data, as is often done in probabilistic graphical models.
12 code implementations • NeurIPS 2015 • Xingjian Shi, Zhourong Chen, Hao Wang, Dit-yan Yeung, Wai-kin Wong, Wang-chun Woo
The goal of precipitation nowcasting is to predict the future rainfall intensity in a local region over a relatively short period of time; a compact sketch of the convolutional LSTM (ConvLSTM) cell proposed for this task follows the ranking note below.
Ranked #1 on Video Prediction on KTH (Cond metric)
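The core change relative to a standard LSTM is that the gate computations become convolutions, so hidden and cell states keep their 2-D spatial layout. The cell below is a compact sketch in PyTorch; peephole terms from the original formulation are omitted and the kernel size is an arbitrary choice.

```python
# Compact ConvLSTM cell: LSTM gates computed with convolutions so the hidden and
# cell states retain their 2-D spatial structure. Peephole connections from the
# original formulation are omitted for brevity; kernel size is an arbitrary choice.
import torch
import torch.nn as nn

class ConvLSTMCell(nn.Module):
    def __init__(self, in_channels, hidden_channels, kernel_size=3):
        super().__init__()
        self.hidden_channels = hidden_channels
        self.conv = nn.Conv2d(in_channels + hidden_channels,
                              4 * hidden_channels, kernel_size,
                              padding=kernel_size // 2)

    def forward(self, x, state):
        # x: (B, C_in, H, W); state: (h, c), each of shape (B, C_hidden, H, W)
        h, c = state
        gates = self.conv(torch.cat([x, h], dim=1))
        i, f, o, g = torch.chunk(gates, 4, dim=1)
        i, f, o, g = torch.sigmoid(i), torch.sigmoid(f), torch.sigmoid(o), torch.tanh(g)
        c_next = f * c + i * g
        h_next = o * torch.tanh(c_next)
        return h_next, c_next
```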