Search Results for author: Jianping Gou

Found 6 papers, 1 paper with code

Federated Distillation: A Survey

no code implementations 2 Apr 2024 Lin Li, Jianping Gou, Baosheng Yu, Lan Du, Zhang Yi, Dacheng Tao

Federated Learning (FL) seeks to train a model collaboratively without sharing private training data from individual clients.

Federated Learning Knowledge Distillation +1
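The abstract describes federated distillation: clients keep their private data and share only model outputs, which are aggregated and distilled into each local model. A minimal numpy sketch of that exchange, assuming a shared public set on which clients expose logits (the aggregation rule and loss here are a common choice, not necessarily the survey's taxonomy):

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax over the last axis."""
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def aggregate_logits(client_logits):
    """Server step: average the clients' logits on the shared public set
    (no raw data or weights leave a client)."""
    return np.mean(client_logits, axis=0)

def distillation_loss(student_logits, teacher_logits, T=2.0):
    """KL divergence between temperature-softened aggregated targets
    and a client's own predictions."""
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return float(np.mean(np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12)), axis=-1)))

# Two hypothetical clients' logits on a 4-example, 3-class public set.
rng = np.random.default_rng(0)
client_logits = rng.normal(size=(2, 4, 3))
teacher = aggregate_logits(client_logits)
loss = distillation_loss(rng.normal(size=(4, 3)), teacher)
```

Each client would minimize this loss locally, so only the logit tensors ever cross the network.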

Deep Dictionary Learning with An Intra-class Constraint

no code implementations 14 Jul 2022 Xia Yuan, Jianping Gou, Baosheng Yu, Jiali Yu, Zhang Yi

Specifically, we design the intra-class compactness constraint on the intermediate representation at different levels to encourage the intra-class representations to be closer to each other, and eventually the learned representation becomes more discriminative. Unlike the traditional DDL methods, during the classification stage, our DDLIC performs a layer-wise greedy optimization in a similar way to the training stage.

Dictionary Learning Representation Learning
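The intra-class compactness constraint described above penalizes the spread of same-class representations at a layer. A minimal sketch of one plausible form of such a penalty (distance to class centroids; the paper's exact formulation may differ):

```python
import numpy as np

def intra_class_compactness(reps, labels):
    """Mean squared distance of each representation to its class centroid.
    Adding this term to a layer's objective pulls same-class codes together,
    making the learned representation more discriminative."""
    reps = np.asarray(reps, dtype=float)
    loss = 0.0
    for c in np.unique(labels):
        idx = labels == c
        centroid = reps[idx].mean(axis=0)
        loss += np.sum((reps[idx] - centroid) ** 2)
    return loss / len(reps)

# Tight same-class clusters score lower than scattered ones.
labels = np.array([0, 0, 1, 1])
tight = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 5.0]])
loose = np.array([[0.0, 0.0], [3.0, 0.0], [5.0, 5.0], [8.0, 5.0]])
```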

Learning Canonical F-Correlation Projection for Compact Multiview Representation

no code implementations CVPR 2022 Yun-Hao Yuan, Jin Li, Yun Li, Jipeng Qiang, Yi Zhu, Xiaobo Shen, Jianping Gou

With this framework as a tool, we propose a correlative covariation projection (CCP) method by using an explicit nonlinear mapping.

Representation Learning

Collaborative Teacher-Student Learning via Multiple Knowledge Transfer

no code implementations 21 Jan 2021 Liyuan Sun, Jianping Gou, Baosheng Yu, Lan Du, Dacheng Tao

However, most of the existing knowledge distillation methods consider only one type of knowledge learned from either instance features or instance relations via a specific distillation strategy in teacher-student learning.

Knowledge Distillation Model Compression +2
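The abstract contrasts single-knowledge distillation with transferring both instance-feature and instance-relation knowledge. A sketch of how the two signals can be blended into one loss, assuming MSE feature matching and a pairwise-distance relation term in the spirit of relational KD (illustrative, not the paper's exact objective):

```python
import numpy as np

def feature_kd(student_feats, teacher_feats):
    """Instance-feature knowledge: MSE between per-example features."""
    return float(np.mean((student_feats - teacher_feats) ** 2))

def pairwise_dist(feats):
    """Euclidean distances between all pairs of examples in a batch."""
    sq = np.sum(feats ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * feats @ feats.T
    return np.sqrt(np.maximum(d2, 0.0))

def relation_kd(student_feats, teacher_feats):
    """Instance-relation knowledge: match the pairwise-distance structure
    of the batch rather than the features themselves."""
    return float(np.mean((pairwise_dist(student_feats) - pairwise_dist(teacher_feats)) ** 2))

def multi_knowledge_loss(student_feats, teacher_feats, alpha=0.5):
    """Blend both knowledge types instead of committing to one."""
    return alpha * feature_kd(student_feats, teacher_feats) \
        + (1 - alpha) * relation_kd(student_feats, teacher_feats)
```

Note that the relation term is translation-invariant, so it captures structure the feature term misses and vice versa.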

Knowledge Distillation: A Survey

no code implementations 9 Jun 2020 Jianping Gou, Baosheng Yu, Stephen John Maybank, Dacheng Tao

To this end, a variety of model compression and acceleration techniques have been developed.

Knowledge Distillation Model Compression +2

Collaboratively Weighting Deep and Classic Representation via L2 Regularization for Image Classification

1 code implementation 21 Feb 2018 Shaoning Zeng, Bob Zhang, Yanghao Zhang, Jianping Gou

We propose a deep collaborative weight-based classification (DeepCWC) method to resolve this problem, by providing a novel option to fully take advantage of deep features in classic machine learning.

Classification General Classification +3
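The DeepCWC abstract couples deep features with classic L2-regularized collaborative representation. A minimal sketch of the classic half — collaborative-representation classification with a ridge penalty, deciding by class-wise reconstruction residual — applied to hypothetical fused features (the fusion weighting and exact DeepCWC formulation are not reproduced here):

```python
import numpy as np

def crc_classify(X_train, y_train, x, lam=0.1):
    """Code x over all training samples with an L2 (ridge) penalty:
        w = argmin ||x - A w||^2 + lam ||w||^2  (closed form below),
    then pick the class whose own samples reconstruct x best."""
    A = X_train.T                                   # columns are training samples
    n = A.shape[1]
    w = np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ x)
    residuals = {}
    for c in np.unique(y_train):
        mask = (y_train == c)
        residuals[c] = np.linalg.norm(x - A[:, mask] @ w[mask])
    return min(residuals, key=residuals.get)

# Toy "fused" features for two well-separated classes.
X = np.array([[1.0, 0.0, 0.0], [0.9, 0.1, 0.0],
              [0.0, 0.0, 1.0], [0.0, 0.1, 0.9]])
y = np.array([0, 0, 1, 1])
pred = crc_classify(X, y, np.array([1.0, 0.05, 0.0]))  # → 0
```

The L2 penalty keeps the coding well-posed even when training samples are highly correlated, which is why collaborative representation pairs naturally with dense deep features.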
