Search Results for author: Ping Huang

Found 6 papers, 2 papers with code

PAEDID: Patch Autoencoder Based Deep Image Decomposition For Pixel-level Defective Region Segmentation

no code implementations • 28 Mar 2022 • Shancong Mou, Meng Cao, Haoping Bai, Ping Huang, Jianjun Shi, Jiulong Shan

To combine the best of both worlds, we present an unsupervised patch autoencoder based deep image decomposition (PAEDID) method for defective region segmentation.

Anomaly Detection
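The decomposition idea behind PAEDID — model the normal background, then flag pixels where the residual is large — can be sketched as follows. This is a minimal illustration, not the paper's method: the `median_background` stand-in replaces the trained patch autoencoder, and the threshold value is an arbitrary choice.

```python
import numpy as np

def segment_defects(image, reconstruct, threshold=0.3):
    """Decompose an image into a 'normal' background (the reconstruction)
    plus a residual, and flag pixels whose residual exceeds a threshold."""
    background = reconstruct(image)
    residual = np.abs(image - background)
    return residual > threshold  # boolean defect mask

def median_background(image):
    # Stand-in for the patch autoencoder, which would be trained to
    # reconstruct defect-free appearance; here we just use a flat median.
    return np.full_like(image, np.median(image))

# Toy example: a flat surface with one bright defective region.
img = np.full((8, 8), 0.5)
img[2:4, 2:4] = 1.0
mask = segment_defects(img, median_background)  # True only on the defect
```

With a learned reconstruction the residual isolates defects of arbitrary shape, which is what makes the approach unsupervised at the pixel level.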

Synthetic Defect Generation for Display Front-of-Screen Quality Inspection: A Survey

no code implementations • 3 Mar 2022 • Shancong Mou, Meng Cao, Zhendong Hong, Ping Huang, Jiulong Shan, Jianjun Shi

Display front-of-screen (FOS) quality inspection is essential for the mass production of displays in the manufacturing process.

Synthetic Data Generation

Information Gain Propagation: a new way to Graph Active Learning with Soft Labels

1 code implementation • ICLR 2022 • Wentao Zhang, Yexin Wang, Zhenbang You, Meng Cao, Ping Huang, Jiulong Shan, Zhi Yang, Bin Cui

Graph Neural Networks (GNNs) have achieved great success in various tasks, but their performance highly relies on a large number of labeled nodes, which typically requires considerable human effort.

Active Learning
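A common baseline for graph active learning with soft labels is to query the nodes whose predicted class distribution is most uncertain. The sketch below uses predictive entropy as that uncertainty score; it is a simplified proxy, not the paper's information gain propagation, which additionally propagates the expected gain of a label through the graph.

```python
import numpy as np

def entropy(p):
    # Shannon entropy of each row of class probabilities (soft labels).
    p = np.clip(p, 1e-12, 1.0)
    return -(p * np.log(p)).sum(axis=-1)

def select_batch(soft_labels, budget):
    """Pick the `budget` nodes with the highest predictive entropy,
    i.e. the ones whose soft labels are least decisive."""
    scores = entropy(soft_labels)
    return np.argsort(-scores)[:budget]

probs = np.array([[0.9, 0.1],   # confident
                  [0.5, 0.5],   # maximally uncertain
                  [0.7, 0.3]])
picked = select_batch(probs, budget=1)  # selects node 1
```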

Self-supervised Semi-supervised Learning for Data Labeling and Quality Evaluation

no code implementations • 22 Nov 2021 • Haoping Bai, Meng Cao, Ping Huang, Jiulong Shan

On the active learning task, our method achieves 97.0% Top-1 Accuracy on CIFAR10 with 0.1% annotated data, and 83.9% Top-1 Accuracy on CIFAR100 with 10% annotated data.

Active Learning
Representation Learning

RIM: Reliable Influence-based Active Learning on Graphs

1 code implementation • NeurIPS 2021 • Wentao Zhang, Yexin Wang, Zhenbang You, Meng Cao, Ping Huang, Jiulong Shan, Zhi Yang, Bin Cui

Message passing is the core of most graph models such as Graph Convolutional Networks (GCN) and Label Propagation (LP), which usually require a large amount of clean labeled data to smooth out the neighborhood over the graph.

Active Learning
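The message-passing mechanism the abstract refers to can be illustrated with plain label propagation: each step, every node averages its neighbors' label distributions, while labeled nodes stay clamped. This is a generic LP sketch under a toy graph, not the RIM algorithm itself (which additionally weights labels by reliability).

```python
import numpy as np

def propagate(adj, labels, mask, steps=10):
    """Label propagation: repeatedly average neighbors' label
    distributions (one message-passing round per step), clamping
    the nodes whose true labels are known."""
    deg = adj.sum(axis=1, keepdims=True)
    P = adj / np.maximum(deg, 1)          # row-normalized adjacency
    y = labels.copy()
    for _ in range(steps):
        y = P @ y                          # aggregate neighbor messages
        y[mask] = labels[mask]             # clamp known labels
    return y

# Path graph 0-1-2; node 0 labeled class A, node 2 labeled class B.
adj = np.array([[0, 1, 0],
                [1, 0, 1],
                [0, 1, 0]], dtype=float)
labels = np.array([[1, 0], [0, 0], [0, 1]], dtype=float)
mask = np.array([True, False, True])
y = propagate(adj, labels, mask)           # node 1 ends up at [0.5, 0.5]
```

Because the unlabeled node sits between two different labels, propagation leaves it evenly split — a noisy or wrong clamped label would be smoothed into the neighborhood the same way, which is the reliability problem RIM targets.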

BatchQuant: Quantized-for-all Architecture Search with Robust Quantizer

no code implementations • NeurIPS 2021 • Haoping Bai, Meng Cao, Ping Huang, Jiulong Shan

While single-shot quantized neural architecture search enjoys flexibility in both model architecture and quantization policy, the combined search space comes with many challenges, including instability when training the weight-sharing supernet and difficulty in navigating the exponentially growing search space.

Neural Architecture Search
Quantization
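The "quantization policy" being searched over typically boils down to a per-layer uniform quantizer like the one below. This is a generic fake-quantization sketch, not BatchQuant's robust quantizer, which (per the abstract's framing) must additionally cope with the shifting statistics of a weight-sharing supernet.

```python
import numpy as np

def fake_quantize(w, bits=4):
    """Uniform symmetric quantization: snap weights to a signed
    2^bits-level integer grid, then map back to floats.
    The scale here is derived from the max absolute weight."""
    qmax = 2 ** (bits - 1) - 1
    wmax = np.abs(w).max()
    scale = wmax / qmax if wmax > 0 else 1.0
    q = np.clip(np.round(w / scale), -qmax - 1, qmax)
    return q * scale

w = np.array([-1.0, -0.3, 0.05, 0.7, 1.0])
wq = fake_quantize(w, bits=4)   # each weight snaps to a 16-level grid
```

Searching jointly over `bits` per layer and the architecture is what makes the combined space grow exponentially, as the abstract notes.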
