Search Results for author: Chunnan Wang

Found 12 papers, 2 papers with code

AutoTS: Automatic Time Series Forecasting Model Design Based on Two-Stage Pruning

no code implementations • 26 Mar 2022 • Chunnan Wang, Xingyu Chen, Chengyue Wu, Hongzhi Wang

We enable the effective combination of design experience from different sources, so as to create a search space that contains a variety of TSF models and supports different TSF tasks.

Neural Architecture Search, Time Series, +1
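The AutoTS excerpt above describes merging design experience from different sources into a single search space of TSF models. A minimal Python sketch of that idea follows; the component names, sources, and design dimensions are illustrative assumptions, not the paper's actual search space or pruning procedure.

# Minimal sketch (not the AutoTS implementation): merge design experience
# from several sources into one search space and enumerate candidate models.
from itertools import product

# Design experience collected from different sources (hypothetical options).
source_a = {"encoder": ["LSTM", "GRU"], "decoder": ["MLP"]}
source_b = {"encoder": ["Transformer"], "decoder": ["MLP", "Attention"]}

# Combine the options per design dimension to form one merged search space.
search_space = {}
for source in (source_a, source_b):
    for dimension, options in source.items():
        search_space.setdefault(dimension, set()).update(options)

# Enumerate candidate TSF model configurations from the merged space.
candidates = [dict(zip(search_space, values))
              for values in product(*search_space.values())]
print(len(candidates), "candidate configurations, e.g.", candidates[0])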

AutoMC: Automated Model Compression based on Domain Knowledge and Progressive search strategy

1 code implementation • 24 Jan 2022 • Chunnan Wang, Hongzhi Wang, Xiangyu Shi

Model compression methods can reduce model complexity on the premise of maintaining acceptable performance, and thus promote the application of deep neural networks in resource-constrained environments.

Model Compression

TPAD: Identifying Effective Trajectory Predictions Under the Guidance of Trajectory Anomaly Detection Model

no code implementations • 9 Jan 2022 • Chunnan Wang, Chen Liang, Xiang Chen, Hongzhi Wang

They lack self-evaluation ability, that is, the ability to examine the rationality of their prediction results, and thus fail to guide users in identifying high-quality ones among their candidate results.

Anomaly Detection, AutoML, +1
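The TPAD excerpt above is about ranking candidate trajectory predictions by how plausible they are. The sketch below illustrates only that selection idea; the hand-written scoring rule (comparing predicted displacements to observed ones) is a stand-in assumption for the paper's learned trajectory anomaly detection model.

# Minimal sketch (not TPAD itself): use an anomaly score to rank candidate
# trajectory predictions and keep the most plausible one.
import numpy as np

def anomaly_score(observed: np.ndarray, predicted: np.ndarray) -> float:
    """Higher score = less consistent with the observed motion pattern."""
    obs_steps = np.diff(observed, axis=0)                       # past displacements
    pred_steps = np.diff(np.vstack([observed[-1:], predicted]), axis=0)
    return float(np.linalg.norm(pred_steps.mean(axis=0) - obs_steps.mean(axis=0)))

observed = np.array([[0.0, 0.0], [1.0, 0.0], [2.0, 0.1]])
candidates = [
    np.array([[3.0, 0.1], [4.0, 0.2]]),   # continues the observed motion
    np.array([[2.0, 5.0], [1.0, 9.0]]),   # abrupt, implausible jump
]
best = min(candidates, key=lambda c: anomaly_score(observed, c))
print("selected candidate:\n", best)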

ATPFL: Automatic Trajectory Prediction Model Design Under Federated Learning Framework

no code implementations • CVPR 2022 • Chunnan Wang, Xiang Chen, Junzhe Wang, Hongzhi Wang

Although the Trajectory Prediction (TP) model has achieved great success in the computer vision and robotics fields, its architecture and training scheme design rely on heavy manual work and domain knowledge, which is not friendly to ordinary users.

Federated Learning, Trajectory Prediction

Search For Deep Graph Neural Networks

no code implementations • 21 Sep 2021 • Guosheng Feng, Chunnan Wang, Hongzhi Wang

Current GNN-oriented NAS methods focus on searching for different layer aggregation components within shallow and simple architectures, which are limited by the 'over-smoothing' problem.

Diversity, Q-Learning
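The 'over-smoothing' problem mentioned in the excerpt above refers to node representations becoming indistinguishable as plain neighbourhood aggregation is stacked deeper. The short sketch below only illustrates that effect on a made-up toy graph; it is not the paper's deep GNN search method.

# Minimal sketch of over-smoothing: repeated row-normalised averaging over
# neighbours makes node features converge, so depth alone loses information.
import numpy as np

adj = np.array([[0, 1, 1, 0],
                [1, 0, 1, 0],
                [1, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
adj_hat = adj + np.eye(4)                              # add self-loops
norm = adj_hat / adj_hat.sum(axis=1, keepdims=True)    # averaging operator

features = np.random.default_rng(0).normal(size=(4, 3))
for depth in (1, 2, 8, 32):
    smoothed = np.linalg.matrix_power(norm, depth) @ features
    spread = smoothed.std(axis=0).mean()               # how distinct nodes remain
    print(f"depth={depth:2d}  node-feature spread={spread:.4f}")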

FL-AGCNS: Federated Learning Framework for Automatic Graph Convolutional Network Search

no code implementations • 9 Apr 2021 • Chunnan Wang, Bozhou Chen, Geng Li, Hongzhi Wang

Recently, several Neural Architecture Search (NAS) techniques have been proposed for the automatic design of Graph Convolutional Network (GCN) architectures.

Federated Learning, Neural Architecture Search

Auto-STGCN: Autonomous Spatial-Temporal Graph Convolutional Network Search Based on Reinforcement Learning and Existing Research Results

no code implementations • 15 Oct 2020 • Chunnan Wang, Kaixin Zhang, Hongzhi Wang, Bozhou Chen

In recent years, many spatial-temporal graph convolutional network (STGCN) models have been proposed to deal with the spatial-temporal network data forecasting problem.

Auto-CASH: Autonomous Classification Algorithm Selection with Deep Q-Network

no code implementations • 7 Jul 2020 • Tianyu Mu, Hongzhi Wang, Chunnan Wang, Zheng Liang

In our work, we present Auto-CASH, a pre-trained model based on meta-learning, to solve the CASH problem more efficiently.

BIG-bench Machine Learning, General Classification, +2
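For context on the Auto-CASH entry above: CASH (Combined Algorithm Selection and Hyperparameter optimization) means jointly choosing a learning algorithm and its hyperparameters for a task. The brute-force loop below only illustrates the problem setting on a toy dataset; it is not the paper's meta-learning / Deep Q-Network approach, and the candidate configurations are arbitrary assumptions.

# Minimal sketch of the CASH problem: evaluate a few (algorithm, hyperparameter)
# configurations by cross-validation and keep the best-scoring one.
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = load_iris(return_X_y=True)
configs = [
    ("logreg", LogisticRegression(C=c, max_iter=500)) for c in (0.1, 1.0, 10.0)
] + [
    ("forest", RandomForestClassifier(n_estimators=n, random_state=0)) for n in (10, 100)
]

best = max(configs, key=lambda cfg: cross_val_score(cfg[1], X, y, cv=3).mean())
print("selected configuration:", best[0], best[1].get_params())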

Multi-Objective Neural Architecture Search Based on Diverse Structures and Adaptive Recommendation

1 code implementation • 6 Jul 2020 • Chunnan Wang, Hongzhi Wang, Guosheng Feng, Fei Geng

To reduce search cost, most NAS algorithms use a fixed outer network-level structure and search only the repeatable cell structure.

Neural Architecture Search
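The excerpt above describes the common cell-based NAS setup: the macro skeleton is fixed and only the operations inside a repeated cell are searched. The sketch below illustrates that setup only; the operation names, skeleton, and cell size are illustrative assumptions, not this paper's search space.

# Minimal sketch of a fixed-outer-structure, searchable-cell space.
from itertools import product

CELL_OPS = ["conv3x3", "conv5x5", "skip", "max_pool"]
EDGES_PER_CELL = 3           # number of op slots in one cell (assumed)
OUTER_SKELETON = ["stem", "cell", "reduce", "cell", "reduce", "cell", "head"]

def build_architecture(cell: tuple) -> list:
    """Plug the same searched cell into every 'cell' slot of the fixed skeleton."""
    return [list(cell) if block == "cell" else block for block in OUTER_SKELETON]

cell_search_space = list(product(CELL_OPS, repeat=EDGES_PER_CELL))
print(len(cell_search_space), "candidate cells")           # 4**3 = 64
print("example architecture:", build_architecture(cell_search_space[0]))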

ExperienceThinking: Constrained Hyperparameter Optimization based on Knowledge and Pruning

no code implementations • 2 Dec 2019 • Chunnan Wang, Hongzhi Wang, Chang Zhou, Hanxiao Chen

Motivated by this, we propose the ExperienceThinking algorithm to quickly find the best possible hyperparameter configuration of a machine learning algorithm within a few configuration evaluations.

BIG-bench Machine Learning, Hyperparameter Optimization, +1
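The ExperienceThinking excerpt above concerns hyperparameter optimization under a small evaluation budget. The sketch below shows only that setting with plain random search, not the paper's knowledge-and-pruning strategy; the search space and objective function are made-up stand-ins for training and validating a real model.

# Minimal sketch of budget-limited hyperparameter search.
import random

SPACE = {"learning_rate": (1e-4, 1e-1), "dropout": (0.0, 0.5)}
BUDGET = 8                                   # only a few configuration evaluations

def evaluate(config: dict) -> float:
    """Stand-in validation score; replace with real model training."""
    return -(config["learning_rate"] - 0.01) ** 2 - (config["dropout"] - 0.2) ** 2

rng = random.Random(0)
best_config, best_score = None, float("-inf")
for _ in range(BUDGET):
    config = {name: rng.uniform(lo, hi) for name, (lo, hi) in SPACE.items()}
    score = evaluate(config)
    if score > best_score:
        best_config, best_score = config, score

print("best configuration found:", best_config, "score:", round(best_score, 4))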

Auto-Model: Utilizing Research Papers and HPO Techniques to Deal with the CASH problem

no code implementations • 24 Oct 2019 • Chunnan Wang, Hongzhi Wang, Tianyu Mu, Jianzhong Li, Hong Gao

In many fields, a large number of algorithms with completely different hyperparameters have been developed to address the same type of problem.

Hyperparameter Optimization
