Distributed Methods

ByteScheduler

ByteScheduler is a generic communication scheduler that accelerates distributed DNN training. It is based on the analysis that partitioning and rearranging tensor transmissions yields theoretically optimal results and good real-world performance, even after accounting for scheduling overhead.
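
The core idea can be sketched roughly as follows (the names, partition size, and API below are illustrative assumptions, not the actual ByteScheduler implementation): each gradient tensor is split into fixed-size chunks, and chunks are transmitted in priority order so that tensors needed earliest in the next forward pass are sent first, overlapping communication with computation.

```python
import heapq
from dataclasses import dataclass, field
from typing import Callable, List

# Hypothetical sketch of priority-based tensor partitioning and scheduling.
PARTITION_BYTES = 4 * 1024 * 1024  # assumed chunk size (~4 MB)

@dataclass(order=True)
class Chunk:
    priority: int                          # lower value = needed earlier next iteration
    tensor_name: str = field(compare=False)
    offset: int = field(compare=False)
    size: int = field(compare=False)

def partition(tensor_name: str, nbytes: int, priority: int) -> List[Chunk]:
    """Split one gradient tensor into equally prioritized chunks."""
    return [
        Chunk(priority, tensor_name, off, min(PARTITION_BYTES, nbytes - off))
        for off in range(0, nbytes, PARTITION_BYTES)
    ]

class Scheduler:
    """Rearranges chunk transmissions so front-layer tensors go out first."""
    def __init__(self, send: Callable[[Chunk], None]):
        self.queue: List[Chunk] = []
        self.send = send

    def enqueue(self, tensor_name: str, nbytes: int, layer_index: int) -> None:
        # Gradients arrive in backward order (last layer first), but the next
        # forward pass consumes parameters in forward order, so the layer index
        # serves as the transmission priority.
        for chunk in partition(tensor_name, nbytes, priority=layer_index):
            heapq.heappush(self.queue, chunk)

    def drain(self) -> None:
        # Transmit the highest-priority (front-most) chunks first.
        while self.queue:
            self.send(heapq.heappop(self.queue))

if __name__ == "__main__":
    sched = Scheduler(send=lambda c: print(f"send {c.tensor_name}[{c.offset}:{c.offset + c.size}]"))
    sched.enqueue("layer3.grad", 10 * 1024 * 1024, layer_index=3)  # produced first by backprop
    sched.enqueue("layer0.grad", 6 * 1024 * 1024, layer_index=0)   # needed first next iteration
    sched.drain()  # layer0 chunks are sent before layer3 chunks
```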

Tasks


Task                  | Papers | Share
Bayesian Optimization | 1      | 100.00%

Categories

Distributed Methods