Search Results for author: Michael Lyu

Found 17 papers, 9 papers with code

Discrete Auto-regressive Variational Attention Models for Text Modeling

1 code implementation • 16 Jun 2021 • Xianghong Fang, Haoli Bai, Jian Li, Zenglin Xu, Michael Lyu, Irwin King

We further design a discrete latent space for the variational attention and mathematically show that our model is free from posterior collapse.

Language Modelling
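
The snippet above describes attention whose weights come from a discrete latent distribution. As a rough illustration of that idea (not the authors' code; the module and parameter names below are assumptions), a categorical attention latent can be relaxed with Gumbel-softmax so that sampling stays differentiable:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DiscreteVariationalAttention(nn.Module):
    """Illustrative sketch only: attention weights sampled from a
    categorical (discrete) latent, relaxed with Gumbel-softmax so the
    sample remains differentiable during training."""
    def __init__(self, hidden_dim, temperature=1.0):
        super().__init__()
        self.score = nn.Linear(hidden_dim * 2, 1)  # additive attention score
        self.temperature = temperature

    def forward(self, query, keys):
        # query: (batch, hidden); keys: (batch, seq_len, hidden)
        q = query.unsqueeze(1).expand_as(keys)
        logits = self.score(torch.cat([keys, q], dim=-1)).squeeze(-1)
        # Sample a relaxed one-hot attention vector instead of a soft average.
        attn = F.gumbel_softmax(logits, tau=self.temperature)
        context = torch.bmm(attn.unsqueeze(1), keys).squeeze(1)
        return context, attn
```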

Multi-Task Learning with Shared Encoder for Non-Autoregressive Machine Translation

1 code implementation • NAACL 2021 • Yongchang Hao, Shilin He, Wenxiang Jiao, Zhaopeng Tu, Michael Lyu, Xing Wang

In addition, experimental results demonstrate that our Multi-Task NAT is complementary to knowledge distillation, the standard knowledge transfer method for NAT.

Knowledge Distillation • Machine Translation +1
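
The shared-encoder idea in this paper can be pictured as one encoder whose output feeds both the non-autoregressive decoder and an auxiliary task head, so the encoder receives gradients from every task. A minimal sketch, with hypothetical module names rather than the paper's code:

```python
import torch.nn as nn

class MultiTaskNAT(nn.Module):
    """Sketch of multi-task NAT with a shared encoder: the encoder is
    trained by both the translation objective and an auxiliary objective.
    Module names here are assumptions."""
    def __init__(self, encoder, nat_decoder, aux_head, aux_weight=0.5):
        super().__init__()
        self.encoder = encoder          # shared across all tasks
        self.nat_decoder = nat_decoder  # predicts target tokens in parallel
        self.aux_head = aux_head        # auxiliary supervision on the encoder
        self.aux_weight = aux_weight

    def forward(self, src_tokens):
        enc_out = self.encoder(src_tokens)
        return self.nat_decoder(enc_out), self.aux_head(enc_out)
```

Training would then minimize the translation loss plus aux_weight times the auxiliary loss, so both tasks shape the shared encoder.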

Learning 3D Face Reconstruction with a Pose Guidance Network

no code implementations • 9 Oct 2020 • Pengpeng Liu, Xintong Han, Michael Lyu, Irwin King, Jia Xu

We present a self-supervised learning approach to learning monocular 3D face reconstruction with a pose guidance network (PGN).

3D Face Reconstruction • Pose Estimation +1

Discrete Variational Attention Models for Language Generation

no code implementations • 21 Apr 2020 • Xianghong Fang, Haoli Bai, Zenglin Xu, Michael Lyu, Irwin King

Variational autoencoders have been widely applied to natural language generation; however, two long-standing problems remain: information under-representation and posterior collapse.

Language Modelling • Text Generation

Few Shot Network Compression via Cross Distillation

1 code implementation • 21 Nov 2019 • Haoli Bai, Jiaxiang Wu, Irwin King, Michael Lyu

The core challenge of few shot network compression lies in the high estimation errors from the original network during inference, since the compressed network can easily over-fit on the few training instances.

Knowledge Distillation • Model Compression
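
The failure mode described above is error accumulation: each compressed layer sees inputs already corrupted by earlier layers. One way to picture a cross-distillation style remedy (a hedged sketch under my own reading, with an assumed mixing weight `mu`, not the paper's exact formulation) is to train each student layer both on its own running input and on the teacher's:

```python
import torch.nn.functional as F

def cross_distillation_loss(teacher_layers, student_layers, x, mu=0.5):
    """Layer-wise sketch: the student imitates the teacher on the
    student's own input path (where errors accumulate) and on the
    teacher's clean path (which corrects them). `mu` is an assumed
    hyper-parameter balancing the two terms."""
    h_t, h_s = x, x
    loss = 0.0
    for f_t, f_s in zip(teacher_layers, student_layers):
        target = f_t(h_t).detach()   # teacher's next (clean) activation
        imitation = f_s(h_s)         # student fed its own previous output
        correction = f_s(h_t)        # student fed the teacher's output
        loss = loss + mu * F.mse_loss(imitation, target) \
                    + (1 - mu) * F.mse_loss(correction, target)
        h_t, h_s = target, imitation
    return loss
```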

Detecting Deep Neural Network Defects with Data Flow Analysis

no code implementations • 5 Sep 2019 • Jiazhen Gu, Huanlin Xu, Yangfan Zhou, Xin Wang, Hui Xu, Michael Lyu

Deep neural networks (DNNs) have shown promise in many challenging artificial intelligence tasks.

Object Recognition

Learning to Rank Using Localized Geometric Mean Metrics

1 code implementation • 22 May 2017 • Yuxin Su, Irwin King, Michael Lyu

First, we design a concept called the "ideal candidate document" to introduce a metric learning algorithm into the query-independent model.

Learning-To-Rank • Metric Learning
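
To make the "ideal candidate document" idea concrete (a minimal sketch under assumed notation, not the paper's algorithm): represent the query by an ideal document vector and rank real documents by a learned Mahalanobis distance to it.

```python
import numpy as np

def rank_by_learned_metric(docs, ideal, M):
    """Rank documents by squared Mahalanobis distance to the ideal
    candidate document: d(x) = (x - ideal)^T M (x - ideal), where M is
    a learned positive semi-definite matrix (here assumed given)."""
    diffs = docs - ideal                        # (n_docs, dim)
    dists = np.einsum('nd,de,ne->n', diffs, M, diffs)
    return np.argsort(dists)                    # closest first = top-ranked
```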

Simple and Efficient Parallelization for Probabilistic Temporal Tensor Factorization

no code implementations • 11 Nov 2016 • Guangxi Li, Zenglin Xu, Linnan Wang, Jinmian Ye, Irwin King, Michael Lyu

Probabilistic Temporal Tensor Factorization (PTTF) is an effective algorithm for modeling temporal tensor data.
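
As a rough picture of what a temporal tensor factorization optimizes (a non-probabilistic sketch with assumed regularization weights, not the PTTF model itself): a CP-style reconstruction term plus a penalty tying consecutive time factors together.

```python
import numpy as np

def temporal_tensor_loss(R, mask, U, V, T, lam=0.1, lam_t=0.1):
    """Sketch of a temporal factorization objective:
    R[i, j, t] ~ sum_k U[i, k] * V[j, k] * T[t, k], with a smoothness
    penalty between T[t] and T[t-1] standing in for the temporal prior.
    The weights lam and lam_t are illustrative assumptions."""
    pred = np.einsum('ik,jk,tk->ijt', U, V, T)
    fit = np.sum(mask * (R - pred) ** 2)            # observed entries only
    reg = lam * (np.sum(U ** 2) + np.sum(V ** 2))   # standard shrinkage
    smooth = lam_t * np.sum((T[1:] - T[:-1]) ** 2)  # temporal smoothness
    return fit + reg + smooth
```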

Adaptive Regularization for Transductive Support Vector Machine

no code implementations • NeurIPS 2009 • Zenglin Xu, Rong Jin, Jianke Zhu, Irwin King, Michael Lyu, Zhirong Yang

In this framework, SVM and TSVM can be regarded as a learning machine without regularization and one with full regularization from the unlabeled data, respectively.
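
Read as an objective (my paraphrase in assumed notation, not the paper's exact formulation), the spectrum between those two extremes looks like:

```latex
% Hedged paraphrase: l labeled and u unlabeled points, loss \ell, and an
% unlabeled-data regularizer \Omega. Setting \lambda = 0 recovers the
% plain SVM; a large \lambda enforces full regularization from the
% unlabeled data, as in the TSVM. Adaptive regularization tunes \lambda.
\min_{f \in \mathcal{H}} \ \sum_{i=1}^{l} \ell\big(y_i, f(x_i)\big)
  + \lambda \, \Omega\big(f;\ x_{l+1}, \dots, x_{l+u}\big)
```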

Learning with Consistency between Inductive Functions and Kernels

no code implementations • NeurIPS 2008 • Haixuan Yang, Irwin King, Michael Lyu

Regularized Least Squares (RLS) algorithms have the ability to avoid over-fitting problems and to express solutions as kernel expansions.
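
The "solutions as kernel expansions" property is the standard representer-theorem form f(x) = sum_i alpha_i k(x_i, x). A minimal regularized least squares fit showing it (generic textbook form, not this paper's extension):

```python
import numpy as np

def rls_fit_predict(K, y, K_test, lam=1.0):
    """Regularized least squares with a kernel: the solution is the
    kernel expansion f(x) = sum_i alpha_i k(x_i, x), with
    alpha = (K + lam * I)^{-1} y. The ridge term lam controls
    over-fitting."""
    n = K.shape[0]
    alpha = np.linalg.solve(K + lam * np.eye(n), y)
    return K_test @ alpha   # K_test[m, i] = k(x_test_m, x_train_i)
```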

An Extended Level Method for Efficient Multiple Kernel Learning

no code implementations • NeurIPS 2008 • Zenglin Xu, Rong Jin, Irwin King, Michael Lyu

We consider the problem of multiple kernel learning (MKL), which can be formulated as a convex-concave problem.
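
In MKL the learned kernel is a convex combination of base kernels, and the convex-concave structure comes from minimizing over the combination weights while maximizing the SVM dual. A sketch of the combination step only (the level-method solver itself is not shown):

```python
import numpy as np

def combined_kernel(kernels, beta):
    """MKL sketch: the learned kernel is K = sum_m beta_m * K_m with
    beta on the probability simplex. The full problem
    min_beta max_alpha J(beta, alpha) alternates this combination step
    with a standard SVM solve over alpha."""
    beta = np.asarray(beta)
    assert np.all(beta >= 0) and abs(beta.sum() - 1.0) < 1e-8
    return sum(b * K for b, K in zip(beta, kernels))
```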

Efficient Convex Relaxation for Transductive Support Vector Machine

no code implementations • NeurIPS 2007 • Zenglin Xu, Rong Jin, Jianke Zhu, Irwin King, Michael Lyu

We consider the problem of Support Vector Machine transduction, which involves a combinatorial problem with exponential computational complexity in the number of unlabeled examples.
