1 code implementation • 21 Jun 2019 • Huiping Zhuang, Yi Wang, Qinglai Liu, Shuai Zhang, Zhiping Lin
Training neural networks with back-propagation (BP) requires sequentially passing activations forward and gradients backward, which forces the network modules to operate in a synchronous, lock-step fashion.
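The sequential dependency can be made concrete with a minimal sketch. The two-module scalar network below is hypothetical (the function names `forward`/`backward` and all values are illustrative, not from the paper); it shows that module 1's weight gradient cannot be computed until module 2's backward pass has produced `grad_a1`, which is exactly the synchronization constraint the abstract describes.

```python
def forward(x, w1, w2):
    # Forward pass: activations must flow module 1 -> module 2 in order.
    a1 = w1 * x    # module 1's activation
    a2 = w2 * a1   # module 2 needs a1 before it can run
    return a1, a2

def backward(x, w2, a1, grad_out):
    # Backward pass: gradients must flow module 2 -> module 1 in order.
    grad_w2 = grad_out * a1   # module 2's weight gradient
    grad_a1 = grad_out * w2   # gradient handed back to module 1
    grad_w1 = grad_a1 * x     # module 1 must WAIT for grad_a1
    return grad_w1, grad_w2

x, w1, w2 = 2.0, 3.0, 0.5
a1, a2 = forward(x, w1, w2)
# Take the loss to be L = a2, so dL/da2 = 1.0.
grad_w1, grad_w2 = backward(x, w2, a1, 1.0)
print(grad_w1, grad_w2)  # 1.0 6.0
```

Because each module both waits for its upstream neighbor in the forward pass and for its downstream neighbor in the backward pass, no module can update independently — the coupling that decoupled-learning methods (such as training with delayed gradients) aim to break.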