no code implementations • 18 Aug 2021 • Baozhou Zhu, Peter Hofstee, Jinho Lee, Zaid Al-Ars
To solve the two problems together, we first propose an attention module for convolutional neural networks by developing an AW-convolution, in which the shape of the attention maps matches that of the weights rather than that of the activations.
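A minimal single-channel sketch of the idea, not the authors' implementation: the attention map has the same shape as the convolution kernel, so it rescales the kernel before the convolution is applied. Function names and shapes are illustrative.

```python
import numpy as np

def conv2d(x, w):
    """Naive valid 2D convolution: x is (H, W), w is (k, k)."""
    H, W = x.shape
    k = w.shape[0]
    out = np.zeros((H - k + 1, W - k + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + k, j:j + k] * w)
    return out

def aw_conv2d(x, w, attention):
    """AW-convolution sketch: the attention map matches the weight
    shape, so it modulates the kernel rather than the activations."""
    assert attention.shape == w.shape
    return conv2d(x, attention * w)
```

With an all-ones attention map the result reduces to an ordinary convolution, which makes the role of the weight-shaped attention easy to check.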
1 code implementation • 25 May 2021 • Baozhou Zhu, Peter Hofstee, Johan Peltenburg, Jinho Lee, Zaid Al-Ars
Thus, a common approach is to compute a reconstructed training dataset before compression.
no code implementations • 11 Sep 2020 • Baozhou Zhu, Peter Hofstee, Jinho Lee, Zaid Al-Ars
Inspired by the shortcuts and fractal architectures, we propose two Shortcut-based Fractal Architectures (SoFAr) specifically designed for BCNNs: 1. residual connection-based fractal architectures for binary ResNet, and 2. dense connection-based fractal architectures for binary DenseNet.
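An illustrative sketch of the residual-connection-based variant, under the assumption that a fractal block averages a shallow path with two stacked shallower blocks and adds a shortcut around the whole block; the layer is simplified to a dense map with sign-binarized weights, and all names are hypothetical rather than the paper's SoFAr definition.

```python
import numpy as np

def binarize(w):
    # 1-bit weights: keep only the sign, as in binary CNNs (BCNNs)
    return np.sign(w)

def binary_layer(x, w):
    # Stand-in for a binary conv layer: a linear map with sign-binarized
    # weights followed by a bounded nonlinearity (1-D shapes for brevity)
    return np.tanh(binarize(w) @ x)

def fractal_residual(x, weights, depth):
    """Residual-connection-based fractal block (illustrative sketch).

    depth 1: one binary layer plus a shortcut;
    depth d: average of the shallow path and two stacked depth-(d-1)
    blocks, with a residual shortcut around the whole block.
    """
    if depth == 1:
        return x + binary_layer(x, weights[0])
    shallow = binary_layer(x, weights[0])
    deep = fractal_residual(
        fractal_residual(x, weights[1:], depth - 1),
        weights[1:], depth - 1)
    return x + 0.5 * (shallow + deep)
```

The shortcut keeps an identity path through every level of the fractal, which is the property that makes residual connections attractive for binarized networks, where each layer loses precision.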
no code implementations • 8 Aug 2020 • Baozhou Zhu, Zaid Al-Ars, Peter Hofstee
In this paper, we propose a strategy, named NASB, which adopts Neural Architecture Search (NAS) to find an optimal architecture for the binarization of CNNs.