2 code implementations • 23 Jun 2019 • Mohit Goyal, Rajan Goyal, Brejesh Lall
Since the optimization of SLNNs remains a challenge, we show that using SLAFs alongside standard activations (such as ReLU) can yield performance improvements with only a small increase in the number of parameters.
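The idea of combining a learnable activation with a fixed one can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes the SLAF is a learnable linear combination of polynomial basis functions (the coefficients would be trained along with the network weights), added to a standard ReLU path. All names here (`slaf`, `hybrid_activation`, the degree-2 basis) are hypothetical choices for the sketch.

```python
import numpy as np

def slaf(x, coeffs):
    # Hypothetical SLAF: a learnable linear combination of polynomial
    # basis functions {1, x, x^2, ...}; coeffs are the learnable
    # parameters (only len(coeffs) extra parameters per activation).
    return sum(c * x**k for k, c in enumerate(coeffs))

def relu(x):
    return np.maximum(0.0, x)

def hybrid_activation(x, coeffs):
    # Combine a standard activation (ReLU) with the learnable SLAF,
    # as the abstract suggests; here simply summed for illustration.
    return relu(x) + slaf(x, coeffs)

x = np.array([-1.0, 0.0, 2.0])
coeffs = np.array([0.1, 0.5, 0.2])  # SLAF(x) = 0.1 + 0.5x + 0.2x^2
print(hybrid_activation(x, coeffs))
```

The parameter overhead is just the basis coefficients (three per activation in this degree-2 sketch), which matches the abstract's claim of only a small increase in parameter count.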