TedNet: A Pytorch Toolkit for Tensor Decomposition Networks

11 Apr 2021  ·  Yu Pan, Maolin Wang, Zenglin Xu

Tensor Decomposition Networks (TDNs) are widely used for their inherently compact architectures. To give more researchers a flexible way to exploit TDNs, we present a PyTorch toolkit named TedNet. TedNet implements five kinds of tensor decomposition (i.e., CANDECOMP/PARAFAC (CP), Block-Term Tucker (BTT), Tucker-2, Tensor Train (TT), and Tensor Ring (TR)) on two traditional deep neural layers: the convolutional layer and the fully-connected layer. With these basic layers, it is simple to construct a variety of TDNs. TedNet is available at https://github.com/tnbar/tednet.
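To illustrate what a tensor-decomposed layer looks like in practice, the sketch below implements one of the five decompositions the abstract lists, Tucker-2, applied to a convolutional layer in plain PyTorch. This is a minimal, generic illustration of the technique, not TedNet's actual API; the class name `Tucker2Conv2d` and the rank parameters `r_in`/`r_out` are assumed for the example.

```python
import torch
import torch.nn as nn


class Tucker2Conv2d(nn.Module):
    """Minimal Tucker-2 decomposed convolution (illustrative, not TedNet's API).

    The dense C_out x C_in x K x K kernel is replaced by three smaller convs:
    a 1x1 channel-compression conv, a K x K core conv on the reduced channels,
    and a 1x1 channel-restoration conv. The ranks r_in and r_out control the
    compression ratio.
    """

    def __init__(self, in_channels, out_channels, kernel_size,
                 r_in, r_out, stride=1, padding=0):
        super().__init__()
        self.compress = nn.Conv2d(in_channels, r_in, kernel_size=1, bias=False)
        self.core = nn.Conv2d(r_in, r_out, kernel_size,
                              stride=stride, padding=padding, bias=False)
        self.restore = nn.Conv2d(r_out, out_channels, kernel_size=1, bias=True)

    def forward(self, x):
        return self.restore(self.core(self.compress(x)))


if __name__ == "__main__":
    # Replace a 64 -> 128 channel 3x3 conv with a rank-(16, 32) Tucker-2 layer.
    layer = Tucker2Conv2d(64, 128, kernel_size=3, r_in=16, r_out=32, padding=1)
    x = torch.randn(8, 64, 32, 32)
    print(layer(x).shape)  # torch.Size([8, 128, 32, 32])
```

Such a module can be dropped in wherever an `nn.Conv2d` would normally appear, which is the same usage pattern the toolkit aims for with its decomposed convolutional and fully-connected layers.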
