no code implementations • 4 Nov 2020 • Kakeru Mitsuno, Yuichiro Nomura, Takio Kurita
Our planting approach can search for an optimal network architecture with a smaller number of parameters, improving performance by incrementally augmenting the layers of the initial network with additional channels while keeping the earlier trained parameters fixed.
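The planting step described above can be illustrated with a minimal NumPy sketch (names and the freezing-via-mask mechanism are illustrative assumptions, not the paper's implementation): new output channels are appended to a trained layer, and only those new channels are marked trainable.

```python
import numpy as np

def plant_channels(weight, n_new, rng):
    """Toy 'planting' step: append n_new output channels (rows) to a dense
    layer's weight matrix while keeping the previously trained rows fixed.
    Returns the grown weight and a mask marking the newly planted channels."""
    in_dim = weight.shape[1]
    new_rows = rng.normal(scale=0.01, size=(n_new, in_dim))  # fresh channels
    grown = np.vstack([weight, new_rows])
    trainable = np.zeros(grown.shape[0], dtype=bool)
    trainable[weight.shape[0]:] = True  # only the planted channels get updates
    return grown, trainable

rng = np.random.default_rng(0)
w = rng.normal(size=(4, 8))            # pretend these 4 channels are trained
w2, mask = plant_channels(w, 2, rng)   # plant 2 additional channels
print(w2.shape)                        # (6, 8)
print(mask.sum())                      # 2 -> only new channels are trainable
```

During subsequent training, gradients would be applied only where the mask is True, so the earlier parameters stay exactly as trained.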
no code implementations • 4 Nov 2020 • Kakeru Mitsuno, Takio Kurita
It is shown that the proposed method can remove more than 50% of the parameters of ResNet on CIFAR-10 with only a 0.3% decrease in test accuracy.
1 code implementation • 9 Apr 2020 • Kakeru Mitsuno, Junichi Miyao, Takio Kurita
As a result, the weights can be pruned more adequately depending on the structure of the network and the number of channels, while maintaining high performance.
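A minimal sketch of structured (channel-wise) pruning of this kind, assuming a simple magnitude criterion rather than the paper's group sparse regularization: each output channel is scored by its L2 norm, and the weakest channels are zeroed out as whole groups.

```python
import numpy as np

def prune_channels(weight, keep_ratio):
    """Structured pruning sketch: zero the output channels (rows) with the
    smallest L2 norm, keeping a keep_ratio fraction of the channels."""
    norms = np.linalg.norm(weight, axis=1)          # one score per channel
    k = max(1, int(round(keep_ratio * weight.shape[0])))
    keep = np.argsort(norms)[-k:]                   # strongest k channels
    mask = np.zeros(weight.shape[0], dtype=bool)
    mask[keep] = True
    pruned = weight * mask[:, None]                 # zero whole channels
    return pruned, mask

rng = np.random.default_rng(1)
w = rng.normal(size=(8, 16))
pruned, mask = prune_channels(w, 0.5)
print(mask.sum())                                   # 4 channels kept
print(int((np.abs(pruned).sum(axis=1) == 0).sum()))  # 4 channels fully zeroed
```

Because entire channels are removed rather than scattered weights, the pruned layer can be physically shrunk, which is what makes structured pruning attractive for actual speedups.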