no code implementations • 3 Jan 2023 • Ke Jiang, Jiayu Yao, Xiaoyang Tan
In this paper, we propose Contextual Conservative Q-Learning (C-CQL) to learn a robust and reliable policy through contextual information captured via an inverse dynamics model.
no code implementations • 14 Nov 2022 • Baoshun Shi, Ke Jiang, Shaolei Zhang, Qiusheng Lian, Yanwei Qin
Recent deep learning-based methods have achieved promising performance for computed tomography metal artifact reduction (CTMAR).
no code implementations • 30 Apr 2019 • Sudipto Mukherjee, Ke Jiang
Email has remained a principal form of communication among people, both in enterprise and social settings.
no code implementations • 7 Apr 2016 • Ke Jiang, Suvrit Sra, Brian Kulis
Topic models have emerged as fundamental tools in unsupervised machine learning.
no code implementations • CVPR 2015 • Ke Jiang, Qichao Que, Brian Kulis
We present a simple but powerful reinterpretation of kernelized locality-sensitive hashing (KLSH), a general and popular method developed in the vision community for performing approximate nearest-neighbor searches in an arbitrary reproducing kernel Hilbert space (RKHS).
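As a hedged illustration (not the paper's algorithm), the Euclidean special case that KLSH generalizes is random-hyperplane LSH: each bit of a hash code records which side of a random hyperplane a point falls on, so nearby points share most bits. The function names below are hypothetical, for illustration only; KLSH extends this idea to an arbitrary RKHS by constructing the random directions from kernel evaluations alone.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_hasher(dim, n_bits, rng):
    # One random Gaussian hyperplane normal per hash bit (sign-random-projection LSH).
    W = rng.standard_normal((dim, n_bits))
    return lambda X: (np.atleast_2d(X) @ W > 0).astype(np.uint8)

h = make_hasher(16, 64, rng)
X = rng.standard_normal((200, 16))   # database points
codes = h(X)                         # 64-bit codes, one row per point

q = X[7] + 0.05 * rng.standard_normal(16)  # a near-duplicate of item 7
qcode = h(q)[0]
ham = (codes != qcode).sum(axis=1)         # Hamming distances to the query
print(ham.argmin())                        # small Hamming distance tracks small angle
```

Because the probability two points disagree on a bit is proportional to the angle between them, Hamming distance in code space approximates angular distance, which is what makes sublinear approximate nearest-neighbor search possible.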
no code implementations • NeurIPS 2013 • Anirban Roychowdhury, Ke Jiang, Brian Kulis
Starting with the standard HMM, we first derive a “hard” inference algorithm analogous to k-means that arises when particular variances in the model tend to zero.
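The "hard" algorithm described above can be sketched as a segmental-k-means-style procedure: in the small-variance limit, Gaussian emission log-probabilities reduce to squared distances, and inference alternates Viterbi decoding with k-means-style mean updates. This is a minimal sketch under those assumptions, not the paper's exact derivation; the function names and the fixed transition penalty are illustrative.

```python
import numpy as np

def viterbi_sq(X, mu, trans_penalty=0.5):
    # Viterbi decoding with squared-distance emission costs and a
    # constant penalty for switching states (the small-variance limit).
    N, K = len(X), len(mu)
    cost = (X[:, None] - mu[None, :]) ** 2
    dp = cost[0].copy()
    back = np.zeros((N, K), dtype=int)
    for t in range(1, N):
        switch = dp[None, :] + trans_penalty * (1 - np.eye(K))
        back[t] = switch.argmin(axis=1)
        dp = cost[t] + switch.min(axis=1)
    path = np.zeros(N, dtype=int)
    path[-1] = dp.argmin()
    for t in range(N - 1, 0, -1):
        path[t - 1] = back[t, path[t]]
    return path

def hard_hmm(X, K, iters=10):
    mu = np.linspace(X.min(), X.max(), K)  # simple initialization
    for _ in range(iters):
        z = viterbi_sq(X, mu)              # "hard" E-step: Viterbi path
        for k in range(K):                 # k-means-style M-step per state
            if (z == k).any():
                mu[k] = X[z == k].mean()
    return z, mu

X = np.array([0.0, 0.1, -0.1, 5.0, 5.2, 4.9])
z, mu = hard_hmm(X, K=2)
print(z)  # → [0 0 0 1 1 1]
```

The alternation mirrors k-means exactly, except that the assignment step respects the chain's transition structure via dynamic programming rather than assigning each point independently.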
no code implementations • NeurIPS 2012 • Ke Jiang, Brian Kulis, Michael I. Jordan
Links between probabilistic and non-probabilistic learning algorithms can arise by performing small-variance asymptotics, i.e., letting the variance of particular distributions in a graphical model go to zero.
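The canonical instance of this connection is the Gaussian mixture model: as a shared variance σ² tends to zero, EM's soft responsibilities collapse to hard argmin assignments, recovering the k-means assignment step. The sketch below is illustrative only, not the paper's (Dirichlet-process) setting.

```python
import numpy as np

def soft_assign(X, mu, sigma2):
    # EM responsibilities r[n, k] ∝ exp(-||x_n - mu_k||^2 / (2 sigma^2))
    d2 = ((X[:, None, :] - mu[None, :, :]) ** 2).sum(-1)
    logits = -d2 / (2.0 * sigma2)
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    r = np.exp(logits)
    return r / r.sum(axis=1, keepdims=True)

X = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 5.0]])
mu = np.array([[0.0, 0.0], [5.0, 5.0]])

# k-means hard assignment vs. the small-variance limit of EM:
hard = ((X[:, None, :] - mu[None, :, :]) ** 2).sum(-1).argmin(axis=1)
soft = soft_assign(X, mu, sigma2=1e-6).argmax(axis=1)
print(np.array_equal(hard, soft))  # → True
```

As σ² shrinks, the nearest component's responsibility approaches 1 exponentially fast, so the soft and hard assignments coincide; the paper pushes the same limiting argument through richer (nonparametric) graphical models.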