Search Results for author: Zhi-Qiang Liu

Found 9 papers, 3 papers with code

Deep Transferring Quantization

1 code implementation • ECCV 2020 • Zheng Xie, Zhiquan Wen, Jing Liu, Zhi-Qiang Liu, Xixian Wu, Mingkui Tan

Specifically, we propose a method named deep transferring quantization (DTQ) to effectively exploit the knowledge in a pre-trained full-precision model.

Face Recognition • Image Classification • +2

Scene Learning: Deep Convolutional Networks For Wind Power Prediction by Embedding Turbines into Grid Space

no code implementations • 16 Jul 2018 • Ruiguo Yu, Zhi-Qiang Liu, Xuewei Li, Wenhuan Lu, Mei Yu, Jianrong Wang, Bin Li

There has been much research based on time series of wind power or speed, but in fact these time series cannot express the temporal and spatial changes of wind, which fundamentally hinders the advance of wind power prediction.

Time Series • Time Series Analysis

Heterogeneous Multi-task Learning for Human Pose Estimation with Deep Convolutional Neural Network

no code implementations • 13 Jun 2014 • Sijin Li, Zhi-Qiang Liu, Antoni B. Chan

We propose a heterogeneous multi-task learning framework for human pose estimation from monocular images with a deep convolutional neural network.

Multi-Task Learning • Pose Estimation

ESSP: An Efficient Approach to Minimizing Dense and Nonsubmodular Energy Functions

no code implementations • 19 May 2014 • Wei Feng, Jiaya Jia, Zhi-Qiang Liu

From our study, we make some reasonable recommendations of combining existing methods that perform the best in different situations for this challenging problem.

Towards Big Topic Modeling

no code implementations • 17 Nov 2013 • Jian-Feng Yan, Jia Zeng, Zhi-Qiang Liu, Yang Gao

Although parallel LDA algorithms on multi-processor architectures have low time and space complexities, their communication costs among processors often scale linearly with the vocabulary size and the number of topics, leading to a serious scalability problem.
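To make that scaling concrete, here is a back-of-the-envelope estimate of the per-synchronisation traffic if each processor exchanged a full word-topic count matrix. All sizes here are hypothetical and not taken from the paper; this only illustrates the O(V × K) communication cost the abstract describes.

```python
# Hypothetical sizes, for illustration only (not from the paper).
vocab_size = 100_000      # V: vocabulary size
num_topics = 1_000        # K: number of topics
bytes_per_count = 8       # one 64-bit count per (word, topic) entry

# Communication per processor per synchronisation scales as O(V * K):
matrix_bytes = vocab_size * num_topics * bytes_per_count
print(f"{matrix_bytes / 1e9:.1f} GB per sync")  # → 0.8 GB per sync
```

Doubling either the vocabulary or the topic count doubles the traffic, which is why this linear scaling becomes a bottleneck at large V and K.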

Fast Online EM for Big Topic Modeling

no code implementations • 8 Oct 2012 • Jia Zeng, Zhi-Qiang Liu, Xiao-Qin Cao

The expectation-maximization (EM) algorithm can compute the maximum-likelihood (ML) or maximum a posteriori (MAP) point estimates of mixture models or latent variable models such as latent Dirichlet allocation (LDA), which has been one of the most popular probabilistic topic modeling methods in the past decade.
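As a generic illustration of the EM iteration the abstract refers to, the sketch below fits a two-component 1-D Gaussian mixture by alternating E- and M-steps. This is not the authors' LDA-specific online EM; the data, parameters, and variable names are all hypothetical, and the variances are held fixed for brevity.

```python
# Minimal EM sketch for a two-component 1-D Gaussian mixture (illustrative only).
import math
import random

random.seed(0)
# Synthetic data drawn from two Gaussians centred at 0 and 5
data = [random.gauss(0.0, 1.0) for _ in range(200)] + \
       [random.gauss(5.0, 1.0) for _ in range(200)]

def gauss_pdf(x, mu, sigma):
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

# Initial guesses for mixing weight and component means (variances fixed at 1)
pi, mu1, mu2, s1, s2 = 0.5, -1.0, 6.0, 1.0, 1.0
for _ in range(50):
    # E-step: posterior responsibility of component 1 for each point
    r = [pi * gauss_pdf(x, mu1, s1) /
         (pi * gauss_pdf(x, mu1, s1) + (1 - pi) * gauss_pdf(x, mu2, s2))
         for x in data]
    # M-step: re-estimate the mixing weight and the two means
    n1 = sum(r)
    pi = n1 / len(data)
    mu1 = sum(ri * x for ri, x in zip(r, data)) / n1
    mu2 = sum((1 - ri) * x for ri, x in zip(r, data)) / (len(data) - n1)
```

After convergence, `mu1` and `mu2` land near the true centres 0 and 5, and `pi` near 0.5. The paper's contribution is an *online* variant of this E/M alternation that processes the data in a streaming fashion rather than in full batches.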

Scheduling

A New Approach to Speeding Up Topic Modeling

no code implementations • 1 Apr 2012 • Jia Zeng, Zhi-Qiang Liu, Xiao-Qin Cao

To accelerate training, ABP actively scans a subset of the corpus and searches a subset of the topic space for topic modeling, thereby saving substantial training time in each iteration.
